## Monday, 1 May 2017

### BOOK REVIEW: Uncertainty (2017)

From The Philosopher, Volume CV No. 1, Spring 2017

UNCERTAINTY
Review article
By Thomas Scarborough

It would be helpful to begin at the beginning. Probability, though known to the ancients, found its first serious application in the 16th century, through games of chance. In a game, it may be a fairly simple matter to predict an outcome. A coin toss, for instance, will yield either heads or tails with a probability of 0.5 – an equal chance of either – if the situation is theoretically perfect. Another example is the throw of a die: the chance that it will turn up any given number is one in six.

Now consider, rather, two dice. Things become more complex now – complex enough that one could do with the help of a simple mathematical formula. For instance, to calculate the probability that the sum of two dice will be 5, one divides the number of favourable outcomes (four) by the total number of possible outcomes (thirty-six), giving 1/9. This might lead us to believe that all probability is much like game probability – but this is deceptive. The real world is seldom as simple as a game – even the most complex of games. Chess and bridge, for instance, may be simplicity itself in comparison with grasping the spread of an epidemic, or predicting the outcome of a vote – as we so well know. To deal with more complex uncertainties, one begins to depend heavily on analysis.
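The two-dice calculation can be checked by brute enumeration. The short sketch below (illustrative only, not from the book) counts the favourable outcomes directly:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of throwing two dice.
outcomes = list(product(range(1, 7), repeat=2))

# Count the outcomes whose sum is 5: (1,4), (2,3), (3,2), (4,1).
favourable = [o for o in outcomes if sum(o) == 5]

# Probability = favourable / total = 4/36 = 1/9.
probability = Fraction(len(favourable), len(outcomes))
print(probability)  # 1/9
```

Enumeration of this kind works only because a game of dice is a closed, fully specified system – precisely the property, as the review goes on to argue, that real-world uncertainties lack.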

But how then does one establish just what it is that is uncertain in a given analysis? And how does one factor this into one’s thinking? This, writes the author, is not answered by grasping for equations, let alone models. It requires ‘slow, maturing thought’. It is more a matter of philosophy than of mathematics. Yet people shun the effort. Instead, they grasp at pre-packaged probability theory, which is far too easily applied without further thought. In fact, the author sketches a situation of crisis proportions. There is altogether too much that we get wrong.

> How, then, does one establish what it is that is uncertain in a given analysis? And how does one factor this into one’s thinking? It requires ‘slow, maturing thought’.

In principle, the science of uncertainty would seem to be simple. In science one has, on the face of it, certainty. This is encapsulated by scientific laws, for example a = F/m. Apply a certain force F to mass m, and the acceleration of m is a. To recast this in terms of probability, the results of such laws have a probability of 1. On the other hand, there may be complete impossibility, which too represents a kind of certainty. This has a probability of 0, because it is certain that the event will never happen. The chances are nil. In both cases, one knows perfectly – or imagines that one does – what one is dealing with, and what one should anticipate.

However, any figure between 0 and 1 introduces an interesting situation – not merely in practice, but often enough in principle. Assume that the probability of something happening is 0.7. In such a situation, one has neither complete certainty nor complete uncertainty, and the reason for this is that there are uncertain influences on our analysis of a situation, beyond our knowledge or control – or, alternatively, too complex to contemplate. More importantly, one cannot pin these factors down precisely, or one would be dealing with certainty, not uncertainty. This pinning down of uncertain factors, contends the author, is where far more mistakes are made than is generally understood.

The publisher describes this work as a textbook. It begins with what one might call a componential analysis of probability. It carefully examines such concepts as truth, induction, chance – and many besides. Then it applies these observations to the field of modelling. While the mathematics are complicated, this is compensated for by the author’s gift of explanation.

The book really brightens up when one reaches worked examples of what can and does go wrong, and how probability calculations for the self-same situations may easily turn out to be quite different. The examples are generalised, too, so as to be meaningful beyond specific contexts. Some particularly illuminating sections of the book include a series of graphs and equations in which the quantification of GPAs, the probabilities of developing cancer, or how one might validate homophobia, are discussed.

I have one demurral to make. In places, the style seems unnecessarily to get in the way of the content. In particular, outbursts such as ‘Die, p-value, die, die, die!’ or ‘p-values, God rot them!’, while they are certainly memorable, do not seem to serve the book well as the serious academic work that it is.

All in all, if the author is right, then our world has strayed down a path which is dangerously simplistic – and this tendency towards simplistic thinking has much to do with how we think about uncertainty. One might go so far as to say that we have misapplied, and continue to misapply, theory which bears on things of critical importance, including the very future of humanity.

The Philosopher’s verdict: Useful warnings about the complexities of simplistic thinking.

Uncertainty: The Soul of Modeling, Probability & Statistics
By William Briggs
Springer International Publishing, 2016, 258pp
ISBN: 978-3-319-39755-9 (Hardcover, £42.00)
ISBN: 978-3-319-39756-6 (eBook, £27.94)

#### 1 comment:

1. William Briggs’s book sounds intriguing, and an important read; there’s so much to the story of probability and chaos. Most people are familiar with the frequently recounted story involving the mathematician-cum-meteorologist Edward Lorenz, who used computer models to predict weather — and in the process serendipitously contributed to the development of chaos theory, and how scientists look at such exquisitely nonlinear systems as the weather. Back in the early Sixties, he decided to rerun one of his weather simulations. However, not thinking it mattered, Lorenz decided to begin the simulation in the middle, using numbers (for the ‘initial conditions’) from the first run. Much to his astonishment, the new virtual weather pattern, which he expected to follow the first run of the model, dramatically deviated from it. What he subsequently realized is that whereas the computer had stored in its memory the first run’s results to six decimal places, the printout, from which he reentered the numbers, had truncated the numbers to just three decimal places, to save space. As for predictions, modeling, nonlinearity, probability, initial conditions, uncertainty, controls, chaos, and outcomes, the rest is history. I look forward to reading this book.

Our authors very much value feedback from readers. Unfortunately, there is so much spam on the internet that we now have to moderate posts on the older articles. Please accept our apologies for any extra time this may require of you.