Against the Gods: The Remarkable Story of Risk (1996) by Peter L. Bernstein

This isn’t a book; it’s an education. And I can guarantee that if you’re a serious, committed, thoughtful reader the purchase price is but a tiny fraction of the value that Bernstein will deliver to you with “Against the Gods.” The following review may seem overly detailed and specific, wordy beyond measure, but that’s mainly because I learned so much and wanted to capture all that I could while it was still fresh.

In the first section, “To 1200: The Beginnings,” Bernstein explains why modern understandings of probability and risk were so slow to develop. The Greeks were brilliant mathematicians and even more brilliant philosophers, but several handicaps prevented them from discovering risk, according to the author. First, their number system was kludgy and ill-adapted for even the most rudimentary calculation (just try multiplying XVII by XIV!). Second, the great Greek thinkers were forever in single-minded pursuit of truth, not “the likeness of truth,” which is the rough translation of probability in Greek. Finally, life in the ancient world moved slowly. Livelihoods and prices and ways of working changed hardly at all from one generation to the next. Moreover, whatever the future might hold was consigned to the will of the gods or, later on, to God. Thus, for centuries people lacked both the tools and the motivation to better understand the odds and risks we face every day. And then along came the Renaissance and Reformation.

In section two, “1200-1700: A Thousand Outstanding Facts,” Bernstein plots the slow but steady discovery of basic probability theory. Two changes were fundamental. First, the Hindu-Arabic numeral system made its way West and slowly went mainstream (its first recorded use in Europe came only in 1134), especially after the introduction of the printing press overcame one of its chief weaknesses – the relative ease of doctoring numbers (e.g., turning a 1 into a 7 or a 0 into a 9). Suddenly there was a readily understandable system of digits that could be calculated with relative ease (the “times tables” a typical third grader memorizes were unknown to even the most accomplished mathematicians of antiquity). Indeed, the invention of “zero” may be as remarkable and influential as that of fire. Second, for the first time learned men began to ponder what their futures held, untethered from an all-knowing or at least highly influential supernatural being. God or fate no longer dictated whether you gained or lost money; proper planning and consideration of probabilities based on past events might.

The wheel of modern risk theory began to turn because of a brainteaser centered on gambling, a vice that dates back at least as far as the so-called oldest profession. In 1202, a 27-year-old Italian named Fibonacci published a handwritten book called “Liber Abaci” that demonstrated in a very practical sense how Arabic numerals could help people solve everyday commercial questions involving profit margins, interest rates, and the like. Fibonacci in turn influenced another Italian centuries later, Paccioli, who in 1494 published a book on mathematics and accounting (“Summa de Arithmetica…”) that, among other things, posed a problem that became the touchstone for modern risk theory: “A and B are playing a fair game of balla. They agree to continue until one has won six rounds. The game actually stops when A has won five and B three. How should the stakes be divided?” (To put it in more comprehensible terms for modern Americans, Bernstein writes that this question is like asking: Imagine that the Giants and the Red Sox are in the World Series and are perfectly matched, but the Giants won game one. What are the odds that the Red Sox will win the Series?) Nearly a century and a half later, in 1654, this challenge was picked up by the brilliant and eccentric Frenchman Blaise Pascal at the behest of an amateur mathematician and wealthy gambler, the Chevalier de Méré. In November 1654, Pascal had a religious revelation and consequently swore off mathematics forever, but not before bequeathing his mechanism for solving the Paccioli puzzle, his famous “Pascal’s Triangle,” which shows how the probabilities of future outcomes build on the outcomes that precede them. (It was the Triangle that would inspire Francis Galton’s “quincunx” machine of 1874, which demonstrated the tendency toward normal distribution – the Bell Curve.)
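
The logic behind dividing the stakes can be sketched in a few lines of code. The recursion below is my own illustration, not from the book: it works backward from the unfinished score, splitting each hypothetical next round fifty-fifty, exactly the style of reasoning Pascal applied to the Paccioli puzzle.

```python
from fractions import Fraction

def share(a_needs, b_needs, p=Fraction(1, 2)):
    """Probability that A wins the match when A needs `a_needs`
    more rounds and B needs `b_needs`, each round being fair."""
    if a_needs == 0:
        return Fraction(1)   # A has already won
    if b_needs == 0:
        return Fraction(0)   # B has already won
    # Next round: A wins it with probability p, B with 1 - p
    return p * share(a_needs - 1, b_needs) + (1 - p) * share(a_needs, b_needs - 1)

# Paccioli's puzzle: A needs one more win, B needs three.
print(share(1, 3))  # 7/8
```

B’s only path to victory is three straight wins, a 1-in-8 chance, so the stakes should be split 7:1 in A’s favor.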

Then, in 1662, a 42-year-old British button merchant, John Graunt, began digging into the birth and death records of the City of London and became the first person to use statistical sampling to estimate the expected age of death, an approach further refined by a young Edmond Halley (of Halley’s comet fame) in the 1690s. This innovation in statistical analysis found its way to Lloyd’s coffee house on the London wharves, where denizens began to offer insurance on the various ships that sailed from the city…and Lloyd’s of London was born.
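
Graunt’s method amounted to tallying how many people died in each age band and reading off an expectation. A toy version of the calculation, with invented numbers (loosely echoing his finding that a large share of Londoners died young, not taken from the book):

```python
# Invented toy mortality table: midpoint age of each band -> deaths per 100 born
deaths = {5: 36, 15: 24, 25: 15, 35: 9, 45: 6, 55: 4, 65: 3, 75: 2, 85: 1}

total = sum(deaths.values())  # all 100 people accounted for
# Expected age of death: average age weighted by how many die at each age
expected_age = sum(age * n for age, n in deaths.items()) / total

print(expected_age)  # 21.5
```

The same weighted-average arithmetic, applied to real parish records, is what let Graunt and Halley price annuities and, eventually, marine insurance.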

The third section, “1700-1900: Measurement Unlimited,” is the densest, and in it a remarkable family of Swiss mathematicians looms large: the Bernoullis. In 1705, Jacob Bernoulli observed that under similar conditions the occurrence of an event in the future will follow the pattern observed in the past; then, in 1731, his nephew Daniel Bernoulli presented a paper to the St Petersburg Academy of Sciences that introduced the economic concept of utility – the idea that people behave differently depending on the utility they derive from a decision and its consequences, or “to each his own.”

But for all that Bernstein makes of the Bernoullis (and he constantly refers back to them), it was the discovery of the normal distribution and the associated instruments of standard deviation and regression to the mean that was most profound. First recognized by the French mathematician Abraham de Moivre in the 1730s and then refined by the legendary German mathematician Carl Gauss in the early 19th century, the normal distribution proved enormously powerful. But the discovery of regression to the mean by the Englishman Francis Galton, privileged cousin of Charles Darwin and founder of modern eugenics, was pure “dynamite,” according to Bernstein. No longer was probability a static matter of randomness and large numbers, but rather a dynamic process in which outliers are predestined to rejoin the crowd in the middle. The driving force of the world would always be toward the mean, the average, the mediocre. His experiments with sweet peas and then a height study in London demonstrated precisely the opposite of what he had hoped: eminence does NOT beget even greater eminence; it begets something less brilliant, more normal. Large peas and tall couples produce offspring slightly smaller than themselves, and vice versa.
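
Galton’s quincunx is easy to simulate: each ball takes a random left-or-right bounce at every row of pins, and the pile below traces a bell curve. A minimal sketch (row and ball counts are my own choices):

```python
import random
from collections import Counter

random.seed(0)  # reproducible run

def quincunx(rows=12, balls=10_000):
    """Drop `balls` balls through `rows` rows of pins; each pin knocks
    the ball left (0) or right (1) with equal probability. A ball's
    final bin is its count of rightward bounces."""
    return Counter(
        sum(random.randint(0, 1) for _ in range(rows))
        for _ in range(balls)
    )

bins = quincunx()
# The middle bins collect far more balls than the edges, tracing out
# the binomial distribution that approximates the normal curve.
```

With 12 rows the distribution is binomial(12, 1/2), so the center bin should hold roughly 23% of the balls while the extreme bins hold almost none.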

The fourth section, “1900-1960: Clouds of Vagueness and Demands for Precision,” deals with the disillusionment of the Victorian Age, of those who believed that the world was steadily getting better and that the answers to life’s mysteries could be discovered if we could just measure more. The First World War destroyed all of this, Bernstein writes. Two interwar economists represent the counter-trend well: the American University of Chicago economist Frank Knight, who first fully distinguished between risk (I know my odds and I’m taking a chance) and uncertainty (I have no idea what my true odds are), and the legendary British economist John Maynard Keynes, who argued that the future was inherently unknowable and always subject to human caprice (or “animal spirits,” as he called them). Their common theme (even though Knight evidently detested Keynes) is the basic question: why should past performance dictate our view of future performance?

Building on Knight’s and Keynes’s disbelief in rationality, Bernstein uses this section to introduce two final critical contributions to risk theory in the pre-1960 period. First, “game theory,” as developed by John von Neumann and Oskar Morgenstern, demonstrated that the true source of our uncertainty in life lies not in the odds of some static probability model but in the likely reactions of others. Second, Harry Markowitz showed in 1952 that the safest way to maximize reward (i.e., profit) and minimize loss was to diversify our portfolios and thereby minimize volatility – an idea that was radical at a time when inflation, exchange rates, commodity prices, and the bond and stock markets were largely predictable. But beginning in the early 1970s, when everyone’s sense of normal was changing rapidly, the brilliance of Markowitz’s formulae was finally appreciated.
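
The arithmetic behind Markowitz’s insight is simple: as long as two assets are not perfectly correlated, holding both is less volatile than holding either alone. A sketch using the standard two-asset portfolio variance formula, with invented numbers (the volatilities and correlation below are assumptions for illustration, not figures from the book):

```python
import math

sigma_a = sigma_b = 0.20   # assumed volatility (std deviation) of each asset
rho = 0.3                  # assumed correlation between the two assets
w = 0.5                    # equal weights in the portfolio

# Two-asset portfolio variance: w_a^2*s_a^2 + w_b^2*s_b^2 + 2*w_a*w_b*rho*s_a*s_b
port_var = (w * sigma_a) ** 2 + (w * sigma_b) ** 2 \
           + 2 * w * w * rho * sigma_a * sigma_b
port_sigma = math.sqrt(port_var)

print(round(port_sigma, 4))  # 0.1612
```

A portfolio volatility of about 16% versus 20% for each asset on its own: the diversification benefit grows as the correlation falls, and vanishes only when rho is exactly 1.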

Finally, Bernstein ends with “Degrees of Belief: Exploring Uncertainty,” which essentially closes in the mid-1990s. With the benefit of a decade’s hindsight (a very short span considering the sweep of this book), it seems that Bernstein just missed the next great wave of discovery in probability – behavioral economics, which has lately become something of a cottage industry, with notable (and highly recommended) books like Dan Ariely’s “Predictably Irrational” (or even “Freakonomics”) that demonstrate in fuller relief the hunches of Knight, Keynes, and others.

In closing, this is one of the best, most influential books that I’ve read. I found it at a church rummage sale and bought it for $0.50. I can’t even begin to compute the return on investment from that purchase. If you like books like Heilbroner’s “The Worldly Philosophers,” and/or if you’re a practitioner in today’s world of hedges, puts, and calls, this masterpiece absolutely cannot be missed.