The History of Analysts: Part Two

by Nick Freund - January 31, 2024

This is the second in a series of posts investigating the evolution of analysts, their toolset, and the common challenges uniting them throughout the course of history. 

The London Whale

Let me be direct: the London Whale was the stupidest and most embarrassing situation I have ever been a part of. 

— Jamie Dimon, JP Morgan 2012 letter to shareholders

2012 should have been a banner year for Ina Drew. One of the highest-ranking female executives on Wall Street, she ran the division responsible for investing JP Morgan’s $350 billion excess deposit balance. The prior year, CEO Jamie Dimon publicly praised her as a “bold” figure to be emulated.

Yet in May 2012, she became the most senior figure at the bank to resign as a result of a $6 billion trading loss. The story behind the trade is instructive.

The loss was the outcome of an outsized bet taken by trader Bruno Iksil that the European credit markets would improve. By March 2012, Iksil’s synthetic credit portfolio had ballooned to $157 billion, right as the European credit crisis took hold. As his massive bet turned against him, the hedge funds on the other side nicknamed him the “London Whale.”

How did a firm as well run as JP Morgan, which so expertly navigated the financial crisis a few years earlier, incur such a loss? Drew oversaw the firm’s risk management team, which maintained a Value-at-Risk model. The model tracked the bank’s positions and measured how much it could reasonably expect to lose on any given day.
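
To make the concept concrete, here is a minimal sketch of how a one-day value-at-risk figure can be estimated from a history of daily profit-and-loss numbers. The figures are simulated and purely illustrative; they are not the bank's actual positions or methodology.

```python
# A minimal, illustrative value-at-risk estimate. The daily P&L series is
# simulated here; a real model would use the bank's actual positions.
import numpy as np

rng = np.random.default_rng(0)
daily_pnl = rng.normal(loc=0.0, scale=25_000_000, size=252)  # one year of daily P&L in USD

# 1-day 95% VaR: the loss that should be exceeded on only ~5% of days.
var_95 = -np.percentile(daily_pnl, 5)
print(f"1-day 95% VaR: ${var_95:,.0f}")
```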

In the aftermath, an audit by the bank’s model-review group revealed that the model operated by manually copying and pasting data across a series of Excel spreadsheets. The group also found that it calculated a critical figure by dividing by a sum rather than an average, which lowered the assessed value-at-risk by “a factor of two.”
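
The arithmetic behind that “factor of two” is easy to reproduce. In the sketch below (hypothetical rate values, not the actual spreadsheet), dividing a change in rates by the sum of the old and new values, rather than by their average, yields a figure half as large as it should be.

```python
# Illustrative only: made-up rate values, not JP Morgan's actual model.
old_rate, new_rate = 0.0150, 0.0165

# Intended calculation: divide the change by the AVERAGE of the two rates.
correct = (new_rate - old_rate) / ((old_rate + new_rate) / 2)

# The error described in the audit: dividing by the SUM instead.
buggy = (new_rate - old_rate) / (old_rate + new_rate)

print(f"correct: {correct:.4%}")  # 9.5238%
print(f"buggy:   {buggy:.4%}")    # 4.7619%, half the correct value
```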

The firm’s control mechanism failed to properly capture data, ensure its integrity, and process it. These basic problems in executing quantitative analysis have plagued analysts throughout history, and continue to bedevil them today.

The Evolution of Quantitative Analysis

What unites analysts is how they deliver on the jobs they are hired to do: they unveil and communicate insights to inform critical decisions. 

For the first task — unveiling insights — analysts across history have leveraged an ever-evolving and more powerful set of quantitative analysis tools.

Traders of the Silk Road

Wandering the many market towns that dotted the ancient Silk Road, one would quickly run into a myriad of traders buying, selling, and bartering their wares. 

Enter the abacus, a quantitative analysis tool developed to assist with quick calculations. Variants ranged from the simple counting boards of western merchants to the sophisticated Suanpan used by Chinese traders, which was capable of handling decimals.

A Chinese Suanpan


Without the Silk Road’s analysts, trade would have been impossible and the course of early history would have been completely different. The development of the Chinese, Korean, and Japanese civilizations, as well as those across Europe and Arabia, would have been stunted.

Whiz Kids of the Second World War

Explanations of the Allied victory in the Second World War often cite superior tactics, better technology, indefatigable morale, and stronger weaponry. Less frequently mentioned are Robert McNamara and his team of analysts.

Throughout the war, his team of “Whiz Kids” within the Army Air Force applied modern quantitative techniques and the latest in data-processing technology (including punched card tabulators and IBM’s largest teletype network) to unveil insights. 

The original punched card tabulator, first used in the 1890 US Census

Those insights included why 20% of Royal Air Force bombers aborted before reaching their targets (answer: fear, driven by the 4% loss rate on missions) and how to increase the rate at which bombers destroyed their targets (answer: reduce flight altitudes from 20,000 to 7,000 feet).

Andrew Cuomo and COVID-19

After New York became the pandemic’s global epicenter in the spring of 2020, Governor Andrew Cuomo turned to data analysis to accomplish the most challenging of tasks: convincing millions of hard-headed New Yorkers that the only way to stay safe was to shelter in their tiny apartments.

Cuomo’s team leveraged the modern data stack to unveil insights. Data was ingested, normalized, and used to populate interactive visualizations. Cuomo’s daily briefings included easily digestible charts containing vital information such as testing positivity rates, fatality rates, and the latest hotspots.

New York's COVID Dashboard
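
As a rough sketch of what such a pipeline looks like in practice, the flow from raw counts to a briefing chart can be expressed in a few lines of Python. The file name and column names below are hypothetical, not the state's actual data feeds.

```python
# Hypothetical ingest -> normalize -> visualize flow; names are illustrative.
import pandas as pd
import matplotlib.pyplot as plt

# Ingest: load raw daily counts (date, tests, positives, deaths).
raw = pd.read_csv("daily_covid_counts.csv", parse_dates=["date"])

# Normalize: sort by date and derive the rates shown in the briefings.
daily = raw.sort_values("date").assign(
    positivity_rate=lambda df: df["positives"] / df["tests"],
    fatality_rate=lambda df: df["deaths"].cumsum() / df["positives"].cumsum(),
)

# Visualize: a simple chart of test positivity over time.
daily.plot(x="date", y="positivity_rate", title="Test positivity rate")
plt.show()
```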


Without analysts, Cuomo would have had inadequate information to make decisions, and he would have been hard pressed to convince New Yorkers. The staggering and horrific death toll would have been much higher.

Increasing Quantitative Power Over the Ages

The pace at which we create data is growing exponentially: we now process more data each year than was processed in the entire prior history of civilization combined.

This exponential growth rate is not a modern phenomenon. It spans ages and technology revolutions.

Compared to their immediate predecessors, the analysts of each age have wielded a toolset an order of magnitude more powerful for unveiling insights. Each evolution empowered analysts to capture, validate, and process the new quantity of data available.

The Polynesian navigator Tupaia’s eyes could only focus on a dozen stars at any one time (data capture). He was further limited by his brain’s capacity to ensure he was looking at the right ones (data integrity) and to intuit his ship’s vector (processing).

On the Silk Road, the quantity of goods in any single trade was far greater: hundreds or thousands of data points in a single deal. The power of the abacus allowed analysts to accurately count them (data capture and integrity) and to apply values and assign prices (processing).

The Whiz Kids leveraged their punched card tabulators to collect tens of thousands of data points about factories and the success and failure of missions, and to apply statistical analysis to them. But as revolutionary as their methods were, this quantitative power was unbelievably primitive compared to the capabilities of the modern analyst.

The Risk of Information Age Tools in the Data Age

The $6 billion lost in the London Whale incident was not merely the result of Iksil’s outsized bet. More broadly, the loss was the result of applying an information age quantitative analysis tool, the Excel model, to a data age problem: unveiling insights about portfolio risk.

Manually collecting market and trading data, copying and pasting it across spreadsheets, and building Excel formulas is a paradigm of the information age. But the quantity of data is now orders of magnitude too large for that approach, and the speed at which we need to make decisions is too fast.

Like McNamara and Governor Cuomo, organizations today must leverage the modern data stack to capture, process, and analyze big data sets. Doing so is no longer an aspirational goal; it is a base-level requirement for competently running one’s organization, much less competing in the marketplace. In the past decade, billions of dollars have been spent to develop and modernize analytics capabilities across every industry.

But where there has been no innovation, and relatively little investment, is in the tools that allow analysts to deal with the speed at which they must now communicate insights and inform critical decisions. Those tools are still very much a vestige of the information age, and it is time for the next phase of innovation and transformation.

Enter Workstream.

Please stay tuned for the final post in our series, in which we will examine the past, present and future of tools used by analysts to communicate insights and inform critical decisions.
