The Great British Bake Off: a lesson in fairness and manners

This year’s Great British Bake Off ended last night. It was an immensely enjoyable show, and also very popular with the general public. Drawing wider lessons from such apparent trivia is a tricky business – but I was particularly concerned by this article by one of the contestants (Ruby Tandoh) in today’s Guardian. There has, apparently, been a lot of abusive comment in the press and on social media. What does this say about the state of British manners?

First, the good news. The manners displayed in the contest itself were quite beautiful. In spite of the highly pressured atmosphere and the obviously competitive nature of the activity, the contestants behaved wonderfully towards each other. They actually seemed to like being together – bonding in response to the common task. This applied even to the comments the contestants made when the others were not there. That is not always the case with such game shows (though we do not watch many of them), where rather silly competitive stuff often gets said. Nobody forgot that this was just a pointless contest about cakes and bread. This is all part of the charm of the programme, and clearly it helps make excellent viewing. It is welcome relief from the apparent conventional wisdom that bad manners make good viewing.

It is also worth pointing out that the judging was inevitably hard, but scrupulously fair. One of the judges, Mary Berry, was a particularly fine exemplar of good manners while at the same time passing difficult judgements. The other, Paul Hollywood, was less tactful, but never rude and always fair. This huge effort to be fair about something so subjective also made for very good viewing. There are important social lessons there in a cynical world.

So what was the fuss about? Well, I know about it mostly from Ms Tandoh’s article. My Facebook friends hardly talked about it, still less said anything inappropriate. I did, though, pick up an article in the Daily Mail while I was on holiday last week, claiming that Ms Tandoh should have been knocked out that week, but wasn’t because her tendency to burst into tears had affected the judges. We watched a recorded version of the show, which showed this accusation to be clearly nonsense. Apparently there was a lot more of this rubbish around. Ms Tandoh’s view is that it was largely misogynistic – a response to the fact that the final five contestants were all women.

What this clearly shows is that bad manners are rife on social media. That Britain’s awful press pick up on this and stir it up further is entirely unsurprising. But people buy these papers and clearly like reading this sort of thing. I can’t say for sure whether this means that standards of social behaviour are slipping, or whether social media is simply exposing behaviour that was previously concealed. I suspect the latter.

Regardless, it shows that the British public has a lot to learn about manners on social media. But it is rather wonderful to have a TV programme like the Bake Off to show how good manners can be done in a thoroughly modern way, and that it brings a feel-good factor with it. Ms Tandoh’s article is a model of good manners itself. She has put her critics to shame.

Lies and statistics

This week The Economist has an interesting article, Unreliable research: trouble at the lab, on the worrying level of poor quality scientific research, and the weak mechanisms for correcting mistakes. Recently a drug company, Amgen, tried to reproduce 53 key studies in cancer research, but could reproduce the original results in only six. This does not appear to be untypical of attempts to reproduce research findings. The Economist points to a number of aspects of this problem, such as the way in which scientific research is published. But of particular interest is how poorly the logic of statistics is understood, not only in the world at large, but in the scientific community. This, of course, applies particularly to the economic and social science research so beloved of political policy think tanks.

One particular aspect of this is the significance of a concept generally known as “prior probability”, or just “prior” for short, in interpreting statistical results. This is how inherently likely or unlikely a hypothesis is considered to be, absent any new evidence. The article includes an illustrative example. Hypotheses are usually tested to a 95% confidence level (a can of worms in itself, but let’s leave that to one side). Common sense might suggest that this means there is only a 5% chance of a false positive result – i.e. that the hypothesis is incorrect in spite of experimental validation. But the lower the prior (i.e. the less inherently probable the hypothesis), the higher the chance of a false positive. At the extreme, if the prior is zero, no positive experimental result would convince you, as any positive result would be false – the product of random effects. If the prior is 10% (and, as the figures imply, the test has an 80% chance of detecting a real effect – its “statistical power”), there is a 4.5% probability of a false positive, compared to an 8% chance of a true positive. So there is a 36% chance that any positive result is false (and, for completeness, a 97% chance that a negative result is truly negative). Very few people seem to grasp this.
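The arithmetic behind these figures is simple enough to sketch in a few lines of Python. This is my own illustration, not from the article, and it assumes (as the 8% figure implies) a test with 80% statistical power:

```python
def positive_is_false(prior, alpha=0.05, power=0.8):
    """Probability that a 'significant' positive result is actually false.

    prior: inherent probability that the hypothesis is true
    alpha: false-positive rate of the test (5% significance level)
    power: chance of detecting a real effect (assumed 80%)
    """
    false_pos = (1 - prior) * alpha   # true nulls wrongly rejected
    true_pos = prior * power          # real effects correctly detected
    return false_pos / (false_pos + true_pos)

# With a 10% prior: 4.5% false positives vs 8% true positives,
# so about a 36% chance that a positive result is false.
chance_false = positive_is_false(0.10)

# With a zero prior, every positive result is false, as the text says.
always_false = positive_is_false(0.0)
```

Running this confirms the article’s numbers: `positive_is_false(0.10)` comes out at 0.36, and with a zero prior it is exactly 1.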

The problem is this: an alternative description of “low prior” is “interesting”. Most of the attention goes to results with low priors. So most of the experimental results people talk about are much less reliable than many people assume – even before other weaknesses in statistical method (such as false assumptions of data independence) are taken into account. There is, in fact, a much better statistical method for dealing with the priors problem: Bayesian inference. This explicitly recognises the prior, and uses the experimental data to update it to a “posterior”. A positive experimental result raises the prior – from 10% to 64% on the figures in the example above – while a negative one lowers it. The posterior would then form the basis for the next experiment.
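The updating step itself is just Bayes’ rule. Again this is my own sketch, using the same assumed 5% significance level and 80% power as the example:

```python
def update_prior(prior, positive, alpha=0.05, power=0.8):
    """Apply Bayes' rule: turn a prior into a posterior given one result.

    alpha: false-positive rate (5% significance level)
    power: chance of detecting a real effect (assumed 80%)
    """
    if positive:
        p_data_if_true, p_data_if_false = power, alpha
    else:
        p_data_if_true, p_data_if_false = 1 - power, 1 - alpha
    evidence = prior * p_data_if_true + (1 - prior) * p_data_if_false
    return prior * p_data_if_true / evidence

p = 0.10                             # "interesting" hypothesis: low prior
p = update_prior(p, positive=True)   # one positive result lifts it to 0.64
p = update_prior(p, positive=True)   # a second positive lifts it to ~0.97
```

Each experiment’s posterior becomes the next experiment’s prior, which is exactly the “basis for the next experiment” idea: evidence accumulates, and the subjectivity of the starting prior matters less and less.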

But the prior is an inherently subjective concept, albeit one that becomes less subjective as the evidence mounts. The scientific establishment hates to make such subjective elements so explicit, so it is much happier to go through the logical contortions required by the standard statistical method (accepting or rejecting a null hypothesis at a given confidence level). This method has now become holy writ, in spite of its manifest logical flaws. And, as the article makes clear, few people using the method actually seem to understand it, so errors of both method and interpretation are rife.

One example of the scope for mischief is instructive. The UN’s Intergovernmental Panel on Climate Change (IPCC) presented its conclusion recently in a Bayesian format. It said that the probability that global warming is induced by human activity had been raised from 90% to 95% (from memory). This is, of course, the most sensible way of presenting such a conclusion. The day it was announced, the BBC’s World at One radio news programme gave high prominence to somebody from a sceptical think tank. His first line of attack was that the conclusion was invalid because the standard statistical presentation had not been used. In fact, if the standard statistical presentation is ever appropriate, it is for the presentation of a single set of experimental results, and even then it conceals much about the thinness or otherwise of the conclusion. But the waters had been muddied; neither the interviewer nor anybody else was able to challenge this flawed line of argument.

Currently I am reading a book on UK educational policy (I’ll blog about it when I’m finished). I am struck by how much emphasis is being put on a very thin base of statistical evidence – and indeed how statistical analysis is being applied to inappropriate questions. This seems par for the course in political policy research.

Philosophy and statistics should be part of every physical and social sciences curriculum, and politicians and journalists should bone up too. Better still, scientists should bring subjectivity out into the open by adopting Bayesian statistical techniques.

One cheer for Ed Miliband

The focus of British politics is now clear. The prospects of the different parties at the General Election due in May 2015 dominate everything. No doubt it is with relief that the political elite and their coterie of journalists and commentators focus on this question, rather than the much more difficult one of how to make this country a better place for its citizens. The defining event of this year’s conference season so far has been Labour leader Ed Miliband’s speech. For good or ill, he seems to have set the political agenda. But does this tell us anything about whether Labour really would do a good job of running the country?

To read the commentaries, Mr Miliband has made a decisive shift to the left. He did this by focusing his ire on big business, and threatening them with government action. There were two signature policies. One was forcing energy companies to freeze their prices while the government reconfigured the regulatory regime to squeeze them more permanently. The second was forcing private sector developers to “use or lose” their land banks to build houses. All this in an attempt to reverse the decline in living standards that the bulk of the population has suffered since the economic downturn started in 2008.

Commentators of the right and centre, such as the Economist’s Bagehot column, interpret this as Labour vacating the decisive “centre ground” of politics, where elections are won and lost. This is the ground which the Liberal Democrats’ Nick Clegg is trying to push his party into, in spite of grumbles by older activists. The Conservatives’ David Cameron is likely to stake his claim there too, and ignore the voices of his party urging him to adopt right wing populism to fend off the threat from Ukip. They will attack Labour’s policies as unworkable, and part of a failed socialist outlook. Mr Cameron will offer tax cuts as a surer route to improving living standards; Mr Clegg will offer tax cuts too, though rather narrower ones, and something about tackling barriers to social mobility (affordable child care, better schools, etc.).

Commentators of the left, like the Guardian’s Jonathan Freedland, do not deny Mr Miliband’s leftward lurch and seem quite content. His new policies are popular with the public, and have lifted Labour’s poll ratings; there may be many more populist left wing policies, bashing bankers and big business, that will go down well electorally. With a large chunk of the previous Lib Dem vote going to Labour, and the Conservatives struggling with Ukip, the next election is Labour’s to lose.

But rather than take these calculations at face value, let’s pause and step back for a minute. All this looks like the narcissism of small differences. All three parties remain firmly embedded in much the same policy space. Mr Miliband would have made a decisive step to the left if he had outlined a policy of increased public spending, funded by extra taxes. In this way he might be able to halt, and even reverse, the relentless squeeze on benefits and public services. But he has decided not to; instead his party will have to make something in the region of £26 billion of cuts in the next parliament (a number I have culled from this perceptive article by the Resolution Foundation’s Gavin Kelly in the paywalled FT). He has said nothing about where these cuts, or increased taxes, will fall. The Conservatives would be making a radical shift to the right if they were proposing deeper cuts to the state, which they can only deliver by cutting the NHS (or rather making the public pay for more of its services) and old age pensions. There is no sign of that.

And there is something else. None of the parties is embracing more than gradual or token decentralisation of power from Westminster. Instead they argue over this, that or the other centralised tax, subsidy or regulatory regime. This can be seen in the season’s signature policy ideas. Mr Clegg has announced free school meals for English school infants; all of them, everywhere, because it looks like a clever idea based on a couple of pilot schemes. Mr Miliband wants to bash energy companies through central regulation: but how does this help the country along its slow path to reducing carbon emissions? And what on earth is the point of the Conservatives’ tax break for married couples?

But Mr Miliband deserves credit for one thing. He alone amongst the three party leaders, as far as I can see, has pointed out one of the two central challenges of the British political economy: that the benefits of economic growth are bypassing most people. This is nothing new, but economic stagnation is making it hard to gloss over. It arises primarily from technological change, and its effect on manufacturing industry and office work, helped along by the rise of globalisation. The problem isn’t that our tax and benefit system is failing to redistribute wealth; it is that, increasingly, wages are too low in the first place.

But there Mr Miliband’s insight seems to end. He seems content to blame big “predator” corporations, and offer the hope that better regulation will help. He didn’t even mention the second great challenge, which is that real terms funding for public services and benefits will fall rather than rise in the years ahead. He offers palliatives rather than solutions. Britain’s right and centre are no better.

What is the solution? In my view there are three interrelated elements. Improve the education system so that skills better match where the jobs will be in future. Redesign public services and benefits so that they can be tailored to individual and community needs. Strengthen local networks to counterbalance the rise of centralised, winner-takes-all networks. These three require a radical decentralisation of power. How long will it take before our political classes start to realise this?