New evidence on nudging

Nudge theory states that people can be ‘nudged’ into making better decisions. New research casts doubt on its effectiveness
  • Nudging has become an increasingly mainstream economic concept
  • But a new study suggests that nudges may not be an effective tool for behavioural change

In 2008, Richard H Thaler and Cass R Sunstein published the hugely influential Nudge. It argued that, being only human, people are prone to blunders and biases, which lead to bad decision making. If policymakers make it easier for people to choose what is best for them, they can be nudged into making good decisions – without restricting freedom of choice.

Nudge theory has broad appeal. The synopsis of the book presents its insights as being ‘from neither the left nor the right’, and it soon became a mainstream idea. The UK government set up a Nudge Unit in 2010, and nudging is even examined as part of the A Level Economics syllabus.

Nudge theory has also found its way into the world of investment. Pimco and the Chicago Booth Center for Decision Research list a series of behavioural finance tips for investors. These include nudging yourself to save and invest by using precommitment strategies and reminding yourself of your long-term goals before investing. The list also warns investors about the likes of confirmation bias – the tendency to overvalue new information that aligns with our beliefs and to ignore data that contradicts them.

But a recent letter published in the official journal of the US National Academy of Sciences (NAS) has caused a stir: the authors’ research found that there is no evidence that nudges are effective as tools for behavioural change. This is not the first time that nudge theory has faced criticism: a 2017 paper by Queen Mary University’s Lin, Osman and Ashcroft raised questions about its theoretical and ethical underpinnings.

Nudge theory builds on the idea that decision making stems from two cognitive systems. System 1 is intuitive, biased and automatic, whereas System 2 is slower, more rule-based and analytical. Lin et al critiqued a lack of clarity on how the two ‘systems’ interact and what precisely distinguishes them. They also discussed the ethical considerations of nudging – something especially significant when tools are used to ‘covertly’ change behaviour.

Establishing a secure evidence base has also proved difficult. Nudging research is typically undertaken via individual trials whose results can be compiled as a meta-analysis. This combines the results of multiple scientific studies to build a clearer picture of the overall effectiveness of nudging. 

In late 2021, a paper by the University of Geneva’s Mertens, Herberz, Hahnel and Brosch reviewed 212 publications, and found that "nudging was an effective and widely applicable behaviour change tool". But crucially, this paper also found evidence of something called ‘publication bias’. Publication bias arises because ‘negative’ studies yielding insignificant results are more likely to be abandoned, and less likely to be published. This means that a meta-analysis is more likely to include successful trials than unsuccessful ones, distorting results.

New research suggests that this might be even more of a problem than we originally thought. Experimental psychologists Maximilian Maier and colleagues used a new technique – robust Bayesian meta-analysis – to correct for publication bias. Their analysis was submitted to the NAS journal as a letter and is relatively brief. Yet its findings were meaningful. Once the effects of publication bias had been stripped out, they found that "no evidence remains that nudges are effective as tools for behaviour change".

This won’t be the end of nudge theory. The pandemic and its accompanying public health exhortations reignited interest in nudging as a policy tool, and its effectiveness will continue to be debated. It is also likely that nudging works in certain settings. Even the study by Maier et al found evidence of heterogeneity: although their analysis found no evidence of an average effect, it implied that some nudges might work. But behavioural economists must take note. After all, overlooking evidence because it challenges their beliefs would be just the kind of confirmation bias they urge against.