When JP Morgan Chase announced in May 2012 that it had lost more than 2 billion dollars in the capital markets, many pointed to the actions of rogue trader Bruno Iksil as the cause. But was the "London Whale" (the nickname other traders gave him for his outsized positions) the victim not of hubris, but of a simple spreadsheet error?
James Kwak, associate professor at the University of Connecticut School of Law and co-founder of the Baseline Scenario blog, noted some interesting facts in JP Morgan Chase's post-mortem investigation of the losses. Specifically, that the Value at Risk (VaR) model that underpinned the hedging strategy
"operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another", and "that it should be automated" but never was.
This is a surprisingly common practice: through accretion and incremental advancements, an important statistical calculation somehow ends up being implemented as a convoluted series of Excel worksheets, connected by hundreds (or even thousands) of cell-reference formulas, all driven by a series of input parameters that must be entered manually. Not only does this introduce the risk of errors when cutting and pasting the inputs, it also makes the workbook extremely fragile. As anyone who's built a budget in Excel knows, it's very easy when editing the spreadsheet to find that formulas no longer extend to their expected ranges (ever missed the bottom row from a formula when adding new data?), or point to the wrong data entirely. And then there's the possibility of errors in the formulas themselves, which seems to have been an issue here as well:
“After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”
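To see why dividing by the sum rather than the average mutes changes by exactly a factor of two, consider a toy calculation. (The rate values below are made up purely for illustration; they are not from JP Morgan's model, which the quote describes only in outline.)

```python
# Hypothetical old and new rates, chosen only to make the arithmetic visible.
old_rate, new_rate = 0.025, 0.030
diff = new_rate - old_rate

intended = diff / ((old_rate + new_rate) / 2)  # divide by the average, as the modeler meant
buggy = diff / (old_rate + new_rate)           # the reported error: divide by the sum

# The sum is always twice the average, so the buggy rate change
# is exactly half the intended one -- muting volatility by a factor of two.
print(intended, buggy)
```

Since the sum of two numbers is twice their average, the bug scales every computed change by one half, systematically understating volatility and hence lowering the VaR.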
Excel is an excellent tool for many applications, but the intertwined cross-references of formulas make errors like this hard to detect, and hard to correct even when discovered. That's why a programming language designed for data analysis, such as R, is a better platform for building the computational engines behind VaR models and other financial systems. Not only can it automate the process of importing data and inputs from other systems (and thus eliminate cut-and-paste errors), it also provides a structured, maintainable environment for the computational logic, within a framework that promotes code review and unit testing to detect errors. Excel may still be the preferred vehicle for delivering the results, but use R to generate the analytics computations (or embed real-time R computations in Excel) instead of risking an implementation in Excel formulas.
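As a sketch of what "structured, testable computational logic" looks like in code, here is a minimal one-day historical-simulation VaR (shown in Python for brevity; the equivalent in R is just as short). The function name, the toy return series, and the quantile convention (the k-th worst observation, k = ⌈(1 − confidence)·n⌉) are illustrative choices, not the bank's actual model:

```python
import math

def historical_var(returns, confidence=0.99):
    """One-day historical-simulation VaR: the loss at the (1 - confidence)
    empirical quantile, reported as a positive number."""
    if not returns:
        raise ValueError("need at least one return observation")
    ordered = sorted(returns)                             # worst return first
    k = max(math.ceil((1 - confidence) * len(ordered)), 1)
    return -ordered[k - 1]                                # k-th worst return

# A unit test pins the expected answer down -- something a web of
# spreadsheet cell references cannot easily express or re-run on demand.
sample = [-0.05, -0.02, 0.01, 0.00, 0.03, -0.01, 0.02, 0.04, -0.03, 0.01]
assert historical_var(sample, confidence=0.90) == 0.05
```

The point is not the ten lines themselves but that they can live in version control, be reviewed, and be re-tested automatically after every change — none of which a manually copy-and-pasted workbook supports.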
Read also: How Validus Re uses Revolution R Enterprise for risk management
The Baseline Scenario: The Importance of Excel (via Ben Lorica)
B.D. McCullough of Drexel specializes in the reliability of statistical software. He first challenged Excel's math in 1999, and has published a total of ten articles on the subject in professional journals such as Computational Statistics and Data Analysis. There is a list of publications on his home page at http://www.pages.drexel.edu/~bdm25/
Excel is not the only software with questionable math. When IBM acquired SPSS, they tested it with the NIST datasets; one-way ANOVA, GLM and MEANS all failed. IBM, to its credit, owned up to the issue. http://www-01.ibm.com/support/docview.wss?uid=swg21485458
Posted by: Thomas W Dinsmore | February 11, 2013 at 19:22
I used to work at one of the biggest banks a decade ago, in a back-office team doing risk management. All we used was Excel!
Posted by: Ben Nicholson | February 12, 2013 at 13:49
Whether developed with the Excel tool or some other application, Value-at-Risk is not a coherent measure, and not a sound basis for assessing the risk of an investment.
If Excel is indeed an unsound estimation tool, then the "Whale" was the victim of a two-fer: an unsound risk measure calculated using an unreliable estimating tool.
Posted by: Desi Erasmus | March 04, 2013 at 09:18
Great post,
We're seeing more and more in the news about companies losing big bucks due to poor Excel skills. It's a trend that I see continuing until proper investment is made in simple training courses.
Although JP Morgan's Excel docs would have been more complicated than most, there were still a number of techniques that they could have used in order to avoid these glaring mistakes. I've written about them in my most recent post for Microsoft Training dot net.
http://www.microsofttraining.net/b/exceltraining/2013/04/get-back-control-of-your-excel-spreadsheets/
Posted by: Terence Noah Craven | April 16, 2013 at 02:11
A great shame. Excel is good for some things, but it does seem appropriate in some cases to teach people in the world of finance more software engineering skills, even if only to get them to write Excel macros that do suitable validation...
Posted by: Martin Harvey | July 24, 2013 at 15:19