Determinants of prosocial behaviour: Lessons from an experiment with referees at the Journal of Public Economics

Raj Chetty, Emmanuel Saez, László Sándor 11 August 2014


Many organisations rely on prosocial behaviours – choices that benefit others but carry a personal cost – to achieve their objectives. For instance, foundations rely on charitable contributions for funding, governments rely partly on voluntary compliance for tax revenue, and employers rely on voluntary referrals for hiring. Because prosocial behaviours by definition generate positive externalities, increasing them can improve welfare. What are the most effective policies to encourage prosocial behaviour? While there is a large body of laboratory evidence on the determinants of prosocial behaviour and altruism (e.g. Ledyard 1995, Vesterlund 2014), evidence from the field remains more limited.

In Chetty et al. (2014), we study this question by focusing on a setting familiar to academic researchers – the peer review process. Peer review is a classic example of prosocial behaviour – the personal rewards from submitting a high-quality referee report quickly are typically small, but the gains to the authors of the paper and to society from the knowledge produced are potentially large (Ellison 2002).

We evaluate the impacts of economic and social incentives on peer review using an experiment with 1,500 referees at the Journal of Public Economics. We randomly assign referees to four groups: a control group with a six-week (45-day) deadline to submit a referee report, a group with a four-week (28-day) deadline, a ‘cash incentive’ group rewarded with $100 for meeting a four-week deadline, and a ‘social incentive’ group in which referees were told that their turnaround times would be publicly posted.
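As an illustration of this kind of design, random assignment into four equal-sized treatment arms can be sketched as follows. This is a minimal sketch, not the study's actual procedure; the arm labels, referee IDs, and seed are illustrative assumptions.

```python
import random
from collections import Counter

# Illustrative arm labels mirroring the four experimental groups.
ARMS = ["control-6wk", "deadline-4wk", "cash-4wk", "social"]

def assign_arms(referee_ids, seed=0):
    """Shuffle referees, then deal them round-robin into the four arms,
    yielding equal-probability (and here equal-sized) treatment groups."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(referee_ids)
    rng.shuffle(ids)
    return {ref: ARMS[i % len(ARMS)] for i, ref in enumerate(ids)}

# Hypothetical referee IDs 0..1499, matching the experiment's sample size.
assignment = assign_arms(range(1500))
counts = Counter(assignment.values())  # each arm receives 1500 / 4 = 375 referees
```

Shuffling before dealing round-robin guarantees balanced group sizes while keeping which referee lands in which arm random.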

Figure 1. Review times by treatment group

Notes: This figure shows the distribution of review times by treatment group during the experimental period. Each survival curve plots the percentage of reports still pending vs. the number of days elapsed since the referee received the invitation. The solid vertical lines depict the six-week deadline (45 days) and the four-week deadline (28 days). The dashed vertical lines depict the reminders sent one week before each deadline.

Figure 1 shows the impacts of the treatments on review times. It plots ‘survival curves’ by treatment group – that is, the fraction of reports still pending (not yet submitted) as of the day shown on the horizontal axis. Based on the evidence in this figure, as well as other related analyses of referee performance, we obtain four sets of results.
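For readers unfamiliar with survival curves, the construction is straightforward: for each day, count the share of reports whose turnaround time exceeds that day. A minimal Python sketch, using made-up turnaround times rather than the experiment's data:

```python
def survival_curve(review_times, horizon):
    """For each day 0..horizon, return the fraction of reports still pending,
    i.e. whose turnaround time (days from invitation to submission) exceeds that day."""
    n = len(review_times)
    return [sum(1 for t in review_times if t > day) / n for day in range(horizon + 1)]

# Illustrative (hypothetical) turnaround times for two treatment groups.
control = [50, 44, 47, 60, 35, 48]         # six-week deadline group
short_deadline = [30, 36, 28, 40, 33, 25]  # four-week deadline group

curve_control = survival_curve(control, 60)
curve_short = survival_curve(short_deadline, 60)
# curve_control[36] vs. curve_short[36]: the share of reports still pending
# at day 36 is far larger in the control group in this toy example.
```

A lower curve therefore means faster refereeing, which is why the treatment curves in Figure 1 lie below the control curve.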

First, shortening the deadline from six weeks to four weeks reduces median review times from 48 days to 36 days. Because missing the deadline has no direct consequence, we believe the shorter deadline acts as a ‘nudge’ (Thaler and Sunstein 2008) that changes the default date at which referees submit reports. Consistent with this interpretation, most of the increase in referees’ speed occurs immediately after they receive an email reminding them that their report is due in one week.

Second, providing a $100 cash incentive for submitting a report within four weeks reduces median review times by an additional eight days. Prior work has debated whether extrinsic incentives such as cash payments are effective in increasing prosocial behaviour because they may crowd out intrinsic motivation (Titmuss 1971). The fact that monetary incentives work implies that the positive effect of the price subsidy predicted by traditional economic models dominates any crowd-out of intrinsic motivation in the context of peer review. We further assess whether cash incentives crowd out intrinsic motivation by testing whether referees who received cash incentives become slower than those in the four-week deadline group after the cash incentives end. We find no such evidence – again indicating that crowd-out of intrinsic motivation is not large in this context.

Third, we find that the social incentive treatment reduces median review times by approximately 2.5 days. This effect is much smaller than the impacts of the other treatments, but the degree of social pressure applied here is relatively light. The fact that even this treatment has an effect suggests that more direct forms of social pressure – such as personalised emails from editors – may have powerful impacts on referee behaviour.1 Moreover, social incentives complement the other interventions by influencing individuals who are less responsive to other incentives. In particular, we find that tenured professors are less sensitive to deadlines and cash incentives than untenured referees, perhaps because they are busier or wealthier. Social incentives, in contrast, have much larger effects on tenured professors, as shown in Figure 2.

Figure 2. Impacts of treatments on tenured vs. untenured referees

Notes: This figure shows the distribution of review times for tenured referees (Panel A) and untenured referees (Panel B). Each survival curve plots the percentage of reports still pending vs. the number of days elapsed since the referee received the invitation.

Finally, we evaluate whether the treatments have an impact on other outcomes besides review time.2 Economic models of multi-tasking (e.g. Holmstrom and Milgrom 1991) predict that referees will prioritise the incentivised task (i.e. submitting a report quickly) at the expense of other aspects of performance (e.g. the quality of reviews). We find that the shorter deadline has no effect on the quality of the reports that referees submit, as measured by whether the editor follows their recommendation or the length of referee reports, as shown in Figure 3. The cash and social incentives induce referees to write slightly shorter referee reports, but do not affect the probability that the editor follows the referee’s advice. We also find little evidence of negative spillovers across journals – the treatments have no detectable effects on referees’ willingness to review manuscripts and review times at other Elsevier journals.

Figure 3. Impact of treatments on review quality

Notes: This figure shows the effects of the treatments on review quality, as measured by the percentage of cases in which the editor's decision to accept or reject the manuscript matches the referee's recommendation.

Lessons for the peer review process

Our findings offer three lessons for improving the peer review process.

1. Shorter deadlines are extremely effective in improving the speed of the review process. Moreover, shorter deadlines generate little adverse effect on referees’ agreement rates, the quality of referee reports, or performance at other journals. Indeed, based on the results of the experiment, the Journal of Public Economics now uses a four-week deadline for all referees.

2. Cash incentives can generate significant improvements in review times and also increase referees’ willingness to submit reviews. However, it is important to pair cash incentives with reminders shortly before the deadline. Some journals, such as the American Economic Review, have been offering cash incentives without providing referees reminders about the incentives. In this situation, sending reminders would improve referee performance at little additional cost.

3. Social incentives can also improve referee performance, especially among subgroups such as tenured professors who are less responsive to deadlines and cash payments. Light social incentives, such as the Journal of Financial Economics’ policy of posting referee times by referee name, have small effects on review times. Stronger forms of social pressure – such as active management by editors during the review process in the form of personalised letters and reminders – could potentially be highly effective in improving efficiency.

More generally, our results reject the view that the review process in economics is much slower than in other fields, such as the natural sciences, purely because economics papers are more complex or difficult to review. Instead, our findings show that small changes in journals’ policies can substantially improve the peer review process at little cost.

Lessons for increasing prosocial behaviour

Beyond the peer review process, our results also offer some insights into the determinants of prosocial behaviour more broadly.

1. Attention matters – reminders and deadlines have significant impacts on behaviour. Nudges that bring the behaviour of interest to the top of individuals’ minds are a low-cost way to increase prosocial behaviour, consistent with a large literature in behavioural economics (Thaler and Sunstein 2008).

2. Monetary incentives can be effective in increasing some forms of prosocial behaviour. We find no evidence that intrinsic motivation is crowded out by financial incentives in the case of peer review, mirroring the results of Lacetera et al. (2013) in the case of blood donations. While crowd-out of intrinsic motivation could be larger in other settings, these results show that one should not dismiss corrective taxes or subsidies as a policy instrument simply because the behaviour one seeks to change has an important prosocial element.

3. Finally, social incentives can be effective even when other policy instruments are ineffective. This result echoes findings in other settings – such as voting (Gerber et al. 2008), campaign contributions (Perez-Truglia and Cruces 2013), and energy conservation (Allcott 2011) – and suggests that social incentives are a useful complement to price incentives and behavioural nudges.

References

Allcott, Hunt (2011), “Social Norms and Energy Conservation”, Journal of Public Economics, 95(9–10): 1082–1095.

Chetty, Raj, Emmanuel Saez, and László Sándor (2014), “How Can We Increase Prosocial Behavior? An Experiment with Referees at the Journal of Public Economics”, Journal of Economic Perspectives, 28(3).

Cruces, Guillermo, Ricardo Perez-Truglia, and Martin Tetaz (2013), “Biased perceptions of income distribution and preferences for redistribution: Evidence from a survey experiment”, Journal of Public Economics, 98: 100–112.

DellaVigna, Stefano, John A List, and Ulrike Malmendier (2012), “Testing for Altruism and Social Pressure in Charitable Giving”, Quarterly Journal of Economics, 127(1): 1–56.

Ellison, Glenn (2002), “The Slowdown of the Economics Publishing Process”, Journal of Political Economy, 110(5): 947–993.

Gerber, Alan S, Donald P Green, and Christopher W Larimer (2008), “Social Pressure and Voter Turnout: Evidence from a Large-scale Field Experiment”, American Political Science Review, 102(1): 33–48.

Holmstrom, Bengt and Paul Milgrom (1991), “Multitask Principal-agent Analyses: Incentive Contracts, Asset Ownership, and Job Design”, Journal of Law, Economics and Organization, 7: 24–52.

Lacetera, Nicola, Mario Macis, and Robert Slonim (2013), “Economic Rewards to Motivate Blood Donations”, Science, 340(6135): 927–928.

Ledyard, John O (1995), “Public Goods: A Survey of Experimental Research”, in John H Kagel and Alvin E Roth (eds.), Handbook of Experimental Economics, Princeton: Princeton University Press: 111–194.

Thaler, Richard H and Cass R Sunstein (2008), Nudge: Improving Decisions about Health, Wealth, and Happiness, New Haven: Yale University Press.

Titmuss, Richard M (1971), The Gift Relationship, London: George Allen and Unwin.

Vesterlund, Lise (2014), “Voluntary Giving to Public Goods: Moving Beyond the Linear VCM”, in John H Kagel and Alvin E Roth (eds.), Handbook of Experimental Economics, Vol 2, Princeton: Princeton University Press.

Footnotes

1 Similar social pressure interventions also have significant impacts on voting and charitable contributions (Gerber et al. 2008, DellaVigna et al. 2012).

2 The cash incentive increases the fraction of referees who agree to review a manuscript. The social incentive reduces agreement rates, while the shorter deadline has no impact. We find that the selection effects induced by these changes in agreement rates are modest and are unlikely to explain the observed changes in review times.



Raj Chetty
Professor of Economics, Harvard University

Emmanuel Saez
Professor of Economics and Director of the Center for Equitable Growth, University of California Berkeley

László Sándor
PhD Candidate in Economics, Harvard University