In most countries, job seekers are told they should search 'broadly' for jobs, meaning they should not restrict themselves to the types of jobs they have held before. Yet little is known about how they should broaden their search – what other jobs they should consider, how one might advise them in the process, and whether broader searching is in fact a good idea. After all, broader searching might dilute effort and, as a consequence, fail to improve job prospects overall. This raises the question of whether one can improve job seekers' search outcomes by providing them with information on alternatives, or whether this would backfire. Such information provision is becoming an increasingly realistic option because of the low cost of transmitting information through digital media.
The value of interventions in job searches
Such an advice-based intervention would constitute a soft intervention: individuals voluntarily change their behaviour in response to new information. This contrasts with hard interventions – policies based on monitoring and sanctions that require particular types of behaviour – under which individuals face some coercion. Currently, we know little about the value of soft interventions relative to hard ones, because most existing research studies settings that "combine both work search verification and a system designed to teach workers how to search for jobs" (Ashenfelter et al. 2005), so it is not clear which element generates the documented success. For example, one of the best-known and largest meta-studies on the evaluation of active labour market policies, by Card et al. (2010), merges "job search assistance or sanctions for failing to search" into one category. Despite the obvious advantage of soft interventions in respecting individuals' choices, only a handful of experimental studies focus on them, usually through labour-intensive and costly counselling on multiple aspects of job search (e.g. Bennemarker et al. 2009, Crepon et al. 2013, Krug and Stephan 2013, Behaghel et al. 2014). These studies find positive effects, but it is not clear that the effects outweigh the substantial cost of provision. A low-cost but impersonal information brochure on various aspects of the labour market, sent by Altmann et al. (2015), showed no effect overall, but some effect among those at risk of long-term unemployment.
Overview of the study
The aim of our study (Belot et al. 2015) is to investigate the effects of personalised, but easily replicable occupational advice in a low-cost environment. Instead of a broad set of advice on many labour market dimensions, we designed and tested a tool focused on broadening the occupational outlook of job seekers by providing relevant suggestions of potential alternative jobs. Low-cost provision is achieved by incorporating the tool directly in a job search engine. Job seekers first need to specify their preferred occupation and, based on that, they are then offered a list of suggestions of alternative occupations they could include in their search. These suggestions are based on an algorithm we designed using real labour market statistics on transitions between occupations that could easily be replicated in other studies.
We find positive effects both on search inputs and on the number of job interviews (resulting from search on our platform as well as through other channels), based on a randomised controlled trial with 300 job seekers living in the area of Edinburgh, Scotland. These job seekers were invited to our computer facilities for 12 consecutive weeks and asked to search using search engines that we had commissioned for this purpose. They searched over jobs made available to us by the largest UK online job search site (Universal Jobmatch), whose vacancy database covers more than 80% of the official UK vacancy count. All participants were first asked to use a search engine similar to standard online search engines, in which they themselves input relevant keywords for their search. After four weeks, half of the sample was offered our 'alternative' search engine, which gave suggestions. The suggestions on the alternative platform are tailored to each participant based on the preferred occupation they specify: using representative household survey data, the platform displays the occupations that are the most prominent targets of transitions by surveyed workers who initially held a job in the occupation the participant specified. Additionally, we provided information based on skill transferability. In each case, the tool displays not only the suggested occupations but also the associated vacancies in our database.
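The core of such a transition-based recommendation can be sketched in a few lines. The occupations and transition records below are invented purely for illustration; the actual algorithm in Belot et al. (2015) works with representative household survey data and additionally incorporates skill transferability, which this sketch omits.

```python
from collections import Counter

# Hypothetical occupation-to-occupation transition records, as might be
# derived from a household survey: (origin occupation, destination occupation).
observed_transitions = [
    ("chef", "catering manager"),
    ("chef", "baker"),
    ("chef", "catering manager"),
    ("chef", "food retail assistant"),
    ("nurse", "care worker"),
]

def suggest_alternatives(preferred_occupation, transitions, top_n=3):
    """Rank alternative occupations by how often surveyed workers moved
    from the preferred occupation into them."""
    counts = Counter(dest for origin, dest in transitions
                     if origin == preferred_occupation)
    return [occupation for occupation, _ in counts.most_common(top_n)]

# 'catering manager' ranks first for a chef: two observed transitions.
print(suggest_alternatives("chef", observed_transitions))
```

In the experiment, each suggested occupation would then be shown together with the matching vacancies in the database, so that acting on a suggestion requires no extra effort from the job seeker.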
The randomised controlled experiment allows for a straightforward comparison of those that did and did not receive occupational suggestions.
- We find that the suggestions alter users' job search strategies. Those who are offered the alternative search interface consider a set of vacancies that is broader in terms of the diversity of occupations, and they receive 30% more job interviews.
- These effects are largest for job seekers who searched occupationally narrowly in the first three weeks of the study.
- When they are exposed to the treatment from week four onwards, they increase their job applications by 30% and experience a 50% increase in job interviews (compared to similarly narrow searchers in the control group).
The increase in interviews that resulted from search on our job search tool is significant, and it does not crowd out interviews obtained through other job search channels. On the contrary, job offers accruing from other job search activities also increase significantly, indicating that some effects of our information intervention spill over into other job search activities. Among this group of job seekers, the effects are concentrated among those with above-median unemployment duration (2.5 months), for whom job interviews increase even more (by 70%). For job seekers who already searched broadly before our intervention, we find a reduction in the diversity of occupations they consider, but little effect on their job offers. Overall, we take this as an indication that increasing the breadth of search improves job prospects, and that targeted job search assistance can be beneficial. We focus on job interviews because our sample is too small to allow reliable statistical inference on the number of jobs found – finding a job is a much rarer event than getting an interview.
The fact that some populations react more to our advice than others can be rationalised along the following lines, which we formalise in Belot et al. (2015). After losing their job, individuals might initially search narrowly because jobs in their previous occupation appear particularly promising. If the perceived difference with other occupations is large, our endorsement of some alternative occupations does not make up for the gap. After a few months, unsuccessful individuals learn that their chances in their preferred occupation are lower than expected, and the perceived difference with other occupations shrinks. Now alternative suggestions can render the endorsed occupations attractive enough to be considered. Our intervention then induces search over a larger set of occupations and increases the number of interviews. One can contrast this with the impact on individuals who already search broadly because they find many occupations roughly equally attractive. They can rationally infer that the occupations that we do not endorse are less suitable, and they stop applying there to conserve search effort. Their broadness declines, but effects on job interviews are theoretically ambiguous because search effort decreases but is better targeted.
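This mechanism can be summarised in simple terms. The notation below is our illustrative shorthand, not necessarily that of the formal model in Belot et al. (2015):

```latex
% Let $p_t$ be the perceived offer arrival rate in the preferred occupation
% at time $t$, $a$ the perceived rate in an alternative occupation, and
% $e > 0$ the premium that our endorsement adds to an alternative.
% The job seeker includes the endorsed alternative in the search set iff
\[
  a + e \;\geq\; p_t .
\]
% Early in the unemployment spell $p_t$ is high, so the condition fails and
% search stays narrow. Each unsuccessful period lowers the belief about the
% preferred occupation, $p_{t+1} < p_t$, until the condition holds and the
% endorsed occupations enter the search set. For a job seeker who already
% searches broadly ($a \approx p_t$), endorsements instead help screen out
% the non-endorsed alternatives.
```

This is why the intervention binds mainly for narrow searchers with longer unemployment spells, while broad searchers respond by narrowing toward the endorsed set.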
We use a 'field-in-the-lab' experimental approach to investigate whether occupational advice can improve job seekers' chances in the labour market. It exploits the now-widespread use of the internet in the job search process (Kuhn and Mansour 2014) to provide this advice at negligible marginal cost, in a format that can easily be replicated. We design the search process for our treatment group to include advice based on other workers' choices and on skill proximity. A novel dimension of our design is the ability to collect not only outcome data but also data on how people actually change their search. The effects are very promising. Obvious avenues for future work include larger studies combining online and administrative data, to achieve more statistical power on job finding and to analyse general equilibrium implications.
Altmann, S, A Falk, S Jager, and F Zimmermann (2015), "Learning about job search: A field experiment with job seekers in Germany", IZA Working Paper No 9040.
Ashenfelter, O, D Ashmore, and O Deschenes (2005), "Do unemployment insurance recipients actively seek work? Evidence from randomized trials in four U.S. states", Journal of Econometrics, 125(1-2): 53–75.
Behaghel, L, B Crepon, and M Gurgand (2014), "Private and public provision of counseling to job-seekers: Evidence from a large controlled experiment", American Economic Journal: Applied Economics, 6(4): 142–174.
Belot, M, P Kircher, and P Muller (2015), "Providing Advice to Job Seekers at Low Cost: An Experimental Study on On-Line Advice", CEPR Discussion Paper No 10967.
Bennemarker, H, E Gronqvist, and B Ockert (2009), "Effects of outsourcing employment services: Evidence from a randomized experiment", IFAU – Institute for Evaluation of Labour Market and Education Policy Working Paper Series 2009:23.
Card, D, J Kluve, and A Weber (2009), "Active labour market policy evaluations: A meta-analysis", IZA Discussion Paper No 4002.
Card, D, J Kluve, and A Weber (2010), "Active labour market policy evaluations: A meta-analysis", The Economic Journal, 120: 452–477.
Crepon, B, E Duflo, M Gurgand, R Rathelot, and P Zamora (2013), "Do labour market policies have displacement effects? Evidence from a clustered randomized experiment", Quarterly Journal of Economics, 128(2): 531–580.
Krug, G and G Stephan (2013), "Is contracting-out intensified placement services more effective than provision by the PES? Evidence from a randomized field experiment", IZA Discussion Paper No 7403.
Kuhn, P and H Mansour (2014), "Is internet job search still ineffective?", The Economic Journal, 124(581): 1213–1233.