The death of inflation

RIP, RPI? Not quite, but the UK’s best-known measure of inflation may have to change if it is to survive. Philip Ryland reports
June 13, 2019

Readers of Investors Chronicle may not spend much time thinking about strappy tops. Their teenage daughters may well do, but not our oh-so-sensible readers. Perhaps they should – especially if they own index-linked gilt-edged stock. Unlikely though it seems, changes to the prices of strappy tops have contributed a windfall of perhaps £1.2bn a year since 2010 to holders of index-linked gilts. But now that windfall is under threat.

The odd connection between changes in the price of skimpy fashion clothing and dividends on so-called ‘linkers’ lies in the way that the Office for National Statistics (ONS), the data processing arm of the state-controlled UK Statistics Authority, gathers data for, and then calculates, the UK’s inflation data, especially for its oldest and best-known measure of inflation, the Retail Price Index (RPI).

Much of it is arcane stuff. Even so, the methods used to collect the data and to calculate the various measures of UK inflation have big effects on readers as investors, as taxpayers and as consumers.

●  As investors, they pocket a windfall if they own index-linked gilts, but they are about to lose an equivalent bonus on their holdings of NS&I index-linked savings certificates,

●  As taxpayers, they lose out because personal allowances against income tax don’t rise as much as they might under alternative ways of measuring inflation,

●  As consumers, they face much the same problem – prices for, say, alcohol, tobacco and rail fares rise more than they might because the government chooses to index these to a measure of inflation that consistently produces high figures.

Behind the various oddities and inconsistencies in the way that the UK’s inflation rates are calculated lies this question: which is the best measure of inflation? The answer is partly subjective but, however it is tackled, there are three major influences:

●  The way in which the basic data is collected,

●  The mathematical methods used to aggregate the basic data into bigger categories of goods or services,

●  Whether the major costs of owning a home are included in the inflation rate and, if so, how?

The way in which the basic price data is collected gets us back to those strappy tops. Back in 2009 or thereabouts, the statisticians at the ONS reckoned that inflation in women’s clothing was oddly low. According to their results, between 1987 and 2009 prices for women’s outerwear fell, on average, by 2.5 per cent a year.

If correct, that would have been an astonishing drop. The stats were saying that, while price levels on average in the UK had risen by about 50 per cent, prices for women’s outerwear had fallen by 42 per cent; while the price of a basket of goods costing £20 had risen to £30, the price of, say, a £20 top had fallen to £12. True, outsourcing much production to sweat-shops in the third world was a factor, but the statisticians intuitively knew that the fall was also connected to the difficulties of making comparisons between items that changed by the season and changed according to, well, fashion. So they widened the sample of clothing from which they gathered price data and they relaxed rules about comparing items of clothing, which is where those strappy tops came in.

The results were, to put it mildly, surprising. From that negative inflation rate, prices swung madly. Between 2010 and 2017, the average inflation rate for women’s outerwear soared to 11.1 per cent a year, a swing of 13.6 percentage points from its average level of 1987 to 2009. Meanwhile, the price of that £12 top was up to £25.
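For readers who want to check the compounding, here is a minimal sketch using the average rates quoted above; the £20 starting price is purely illustrative and the exact results depend on which years are counted:

```python
# Compound the quoted average annual changes in women's outerwear prices.
# The £20 starting price is illustrative, not an ONS figure.
start_price = 20.0
after_fall = start_price * (1 - 0.025) ** 22   # 1987-2009 at -2.5% a year
after_rise = after_fall * (1 + 0.111) ** 7     # 2010-2017 at +11.1% a year
print(f"After the fall: £{after_fall:.2f}")    # about £11.50
print(f"After the rise: £{after_rise:.2f}")    # about £24
```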

If price changes were being understated before, now they were being exaggerated. But no one knew quite why and how. As Jill Leyland, a member of the National Statistician’s advisory panel on consumer prices, told an inquiry into measuring inflation by the House of Lords: “A relatively minor technical matter was not discussed by any advisory committee and, disastrously, was not tested before implementation.”

Errors in data collection were then compounded by fundamental design faults in the mathematics behind measuring inflation, especially in the maths used to calculate RPI.

To explain, we need – mentally at least – to go back to school and recall how averages can be calculated. Broadly speaking, there are two ways: the arithmetic mean and the geometric mean. The arithmetic version is what the layman thinks of as an average: take a series of numbers – say, five of them – add them together and divide the sum by five. The resulting figure is the average. Geometric means are a bit more complicated. Take the same five numbers, multiply them all together, then take the nth root of the product – in this case, the fifth root. That gives you the geometric mean. But the key point is that, given a string of data, the arithmetic mean will always generate a higher value than the geometric variety, assuming, that is, that the numbers in the string are not all the same (see the box below, I should cocoa, for more explanation).
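To see the gap in practice, here is a minimal sketch; the five price relatives (new price divided by old price) are invented purely for illustration:

```python
import math

# Five invented price relatives (new price / old price) for varieties of one item
relatives = [1.10, 0.95, 1.20, 1.00, 0.80]

arithmetic = sum(relatives) / len(relatives)               # add them up, divide by five
geometric = math.prod(relatives) ** (1 / len(relatives))   # multiply them, take the fifth root

print(f"Arithmetic mean: {arithmetic:.4f}")   # 1.0100 -> implies +1.0% inflation
print(f"Geometric mean:  {geometric:.4f}")    # 1.0006 -> implies +0.06% inflation
```

However the five numbers are chosen, the arithmetic figure never comes out below the geometric one.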

 

This has important implications for calculating inflation. The manipulation of core price data in the RPI relies heavily on using arithmetic averages. In contrast, the newer – and more favoured – Consumer Prices Index (CPI) assembles its base data, where prices for varieties of a single product are aggregated into one, mostly using the geometric mean.

The result is that, other factors being equal, RPI inflation will always be higher than the CPI version. And how. Table 1 shows the average annual inflation rates – arithmetic mean, before you ask – for the 30 years 1989 to 2018. On average, the gap between RPI and CPI is 0.7 of a percentage point per year – 3.3 per cent against 2.6 per cent. In other words, RPI produces an inflation rate that has been fully a quarter higher than CPI.

The underlying difference in the maths behind RPI and CPI is responsible for most of this. However, the drama of the strappy tops has added perhaps 0.3 of a percentage point to RPI since 2010, and Chart 1 shows clearly how the RPI index has accelerated away from CPI since then. This is where that £1.2bn a year windfall for holders of index-linked gilts comes from. With about £400bn of linkers in issue, an estimated extra 0.3 percentage point in the rate of RPI would result in an extra £1.2bn a year of dividends being paid out.
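The arithmetic behind that estimate is simple enough; as a one-line check using the figures quoted above:

```python
# ~£400bn of index-linked gilts, ~0.3 extra percentage points of RPI (as quoted above)
print(f"£{400e9 * 0.003 / 1e9:.1f}bn a year")   # £1.2bn a year
```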

The House of Lords Economic Affairs Committee was deeply unimpressed by this. In a report, Measuring Inflation, published in January, it said: “we do not see why a windfall is acceptable, but a loss (to holders of gilts) is not”.

Then again, the Lords committee was unimpressed with much of the government’s ‘inflation shopping’ (see Table 2). It singled out the decision, which takes effect from May, to switch inflation-linking on NS&I index-linked savings certificates from RPI to CPI, a move that will cut interest payable by about £120m a year. “This appears to be a further example of a switch motivated by its favourability to the government rather than a principled approach to uprating,” said the committee.

True, the Lords can take this stance partly because most of them are no longer elected politicians. But it’s also because the matter is not that simple.

It is much easier to see the shortcomings in the way that RPI is calculated than it is to fix them. In response to the shock of the strappy tops, the UK Statistics Authority commissioned an expensive report from a Canadian economist and expert in index-number theory, who came up with what, even then, was a well-worn suggestion: that arithmetic means should be dropped from RPI’s methodology.

Following further consultation, Dame Jil Matheson, then the National Statistician and head of the ONS, decided to do nothing except strip RPI of its status as a ‘national statistic’ because its formulation did not meet acceptable standards.

Yet it is debatable whether RPI is as poor as its harshest critics suggest. It has been known for almost 100 years that problems can be caused by putting arithmetic averages of price changes at the core of an index calculation (the so-called ‘Carli method’, see the box). In particular, this method suffers from an upward bias whereby it can produce inflation where common sense would say no inflation exists.
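To illustrate that upward bias, here is a stylised sketch of a chained index in which every price ends exactly where it started; the figures are invented and this is a textbook demonstration, not a reconstruction of the ONS’s actual calculation:

```python
import math

# Two items: one doubles then falls back (100 -> 200 -> 100),
# the other halves then recovers (100 -> 50 -> 100).
period_relatives = [
    [2.0, 0.5],   # price relatives in period 1
    [0.5, 2.0],   # price relatives in period 2
]

carli = jevons = 1.0
for rels in period_relatives:
    carli *= sum(rels) / len(rels)                  # arithmetic mean of relatives (Carli)
    jevons *= math.prod(rels) ** (1 / len(rels))    # geometric mean of relatives (Jevons)

print(f"Carli:  {carli:.4f}")    # 1.5625 -> 56% 'inflation' despite unchanged prices
print(f"Jevons: {jevons:.4f}")   # 1.0000 -> no inflation, as common sense suggests
```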

This bias prompted an economics professor, Robert Hill, from the University of Graz, to tell the Lords committee that using the Carli formula in calculating RPI was “indefensible”.

Not so, according to the UK’s Royal Statistical Society. It told the Lords that it was a “fundamental misjudgement” to blame the Carli formula for the oddly high numbers that RPI was splurging out during the strappy tops affair. No statistical formula could cope with such variation in prices, the society suggested.

Just as it was never clear how much of the blame for the rogue inflation figures should fall on the way that RPI is calculated, it remains uncertain what to do about the problem. The present National Statistician, John Pullinger, told the Lords inquiry that if the issue of the clothing data were tackled, that would only open the door to having to deal with other shortcomings in the index.

Yet clearly this matters. On the one hand, the ONS has little time for RPI – the chairman of the UK Statistics Authority, Sir David Norgrove, best known as private secretary to Margaret Thatcher in the 1980s and as a director of Marks and Spencer (MKS) in the 1990s, told the Lords: “It is not a good measure of inflation, does not have the potential to become one and we strongly discourage its use.”

On the other hand, RPI is still so widely used as to be almost woven into the nation’s economic fabric. In addition to being the benchmark for setting the levels of many taxes (see Table 2), it is also used in private-sector contracts, especially to set levels of rent. United Utilities (UU.), for example, told the Lords inquiry that about half of the £3.5bn of debt owed by its water utility was linked to RPI.

And then there are those index-linked gilts. The good news for holders of linkers is that a three-way pull between the statistics authority, the Bank of England and the Chancellor of the Exchequer makes it unlikely that RPI will be changed to become a measure of inflation that, in effect, consistently notches up a lower rate.

That’s because making such a change – to use the key phrase from the 2007 Statistics and Registration Service Act – would be “materially detrimental” to the interests of those holding linkers. In those circumstances, the Bank of England would be obliged to tell the statistics authority to, first, get the go-ahead from the Chancellor of the Exchequer. Yet Sir David Norgrove told the Lords inquiry that it would be pointless to go through the “stately dance” of being directed to the chancellor by the bank only to be told “no”. So it hasn’t bothered – a fact that left the Lords deeply unimpressed; so much so that its inquiry concluded: “In publishing an index which it admits is flawed but refuses to fix, the authority could be accused of failing in its statutory duties”.

Yet ‘unlikely’ does not mean ‘impossible’ when it comes to changes to RPI index-linking. Of the 30 index-linked gilt issues outstanding, only three have payments specifically tied to RPI, and the longest-dated of those expires in 2030. The other 27 have payments tied to an index that the chancellor considers to be “an officially recognised index measuring changes in the level of UK prices”.

That does not have to be RPI. It could be CPI or the newer and much-favoured CPIH (more about that in a moment), both of which generate consistently lower inflation rates than RPI. But for that to happen the chancellor would have to face down the shrieks of protest from City institutions that suddenly see their future income dwindling, their annuity contracts unprofitable, their pensioner customers impoverished and so on. The institutions might have a point. The actuarial arm of consultant KPMG estimates that, of £2,000bn-worth of defined-benefit pension-fund liabilities in the UK, the pay-outs on about £1,100bn are contractually linked to changes in RPI. Yet those funds would be heavily invested in index-linked gilts so, to the extent that their assets matched their liabilities, equal changes to the obligations and the asset values on each side of their balance sheets would cancel out.

But before any changes are seriously contemplated, there is the need to sort out one more snag in the inflation calculations – how to account for housing costs. This major item of household spending almost slips into the calculations as an afterthought.

RPI uses mortgage costs (chiefly interest payments) and changes in the value of homes as its chief housing costs. This makes for a volatile index, as was demonstrated in 2009 when, in the vice of the global financial crisis, RPI briefly showed negative inflation (see Chart 2). That was because interest rates plummeted, causing mortgage interest payments to drop. House prices fell, too, which also depressed RPI since the index uses changes in house prices as a proxy for the major maintenance costs of owning a home. Usually that works because house prices almost always rise. But when house prices fell, then – irrationally – maintenance costs dropped, too.

If that’s silly, then – arguably – even sillier is the absence of any major housing costs from the construction of CPI. That’s for a rather ridiculous reason. When CPI was first formulated in the 1990s as the Harmonised Index of Consumer Prices across the European Union, the eurocrats involved could not agree on how housing costs should be included. The solution was simple – the costs were excluded. And, implausible though it sounds, working housing costs into CPI and its EU-wide equivalents “remains controversial”, according to a report from the European Commission in 2018.

Hence the pressing need to improve on CPI, which has now been superseded by CPIH as the government’s and the Bank of England’s preferred measure of inflation. Essentially, this is CPI with housing costs tagged on, which may be why it does not even have a proper name.

Basically, CPIH uses ‘rental equivalence’ to capture housing costs: the rent that would be paid for an equivalent home in the private sector is used as a proxy for the cost of owning and living in one’s own home. CPIH was introduced in 2013, ran into teething problems and was dropped as a national statistic in 2014, then regained its ‘national’ status in 2017 following improvements.

Not everyone is convinced by rental equivalence. The Lords inquiry noted that “the private rental sector is subject to its own distortions and may not provide a good proxy for owner-occupier housing costs”. Quite possibly, yet CPIH is favoured by the ONS, the government and the Bank of England as the best measure, even though they acknowledge that its ‘new-kid’ status means it needs a longer run of real-world data to prove itself.

Perhaps that will give RPI the time to make a comeback. That’s not out of the question. In 2015, Paul Johnson, the director of the Institute for Fiscal Studies, a think tank, reviewed RPI and concluded that it was not worth the effort to improve it; better to phase it out. Now, however, the continuing need – sometimes a legal obligation – for RPI means he has changed his mind. He suggests its shortcomings – especially in the collection of data – should be sorted out. That would enhance its credibility. More radically, Dr Ben Broadbent, a deputy governor of the Bank of England, wonders if RPI could be transformed into CPIH. That would simplify matters. If it also meant that CPIH-style rates of inflation were fused into RPI, that would have holders of index-linked gilts hopping mad.

Meanwhile, the Economic Affairs Committee of the House of Lords wants the UK Statistics Authority to recommend to the government a single preferred measure of inflation, which should be adopted within five years. Some hope. RIP, RPI? Not quite.