Posts Tagged ‘wonk’

Evidence in a post-experts world

October 11, 2016

(c) The Thick of It / BBC

Something I wrote for the What Works Centre that I thought would be good here too.

*

The What Works Centre for Local Economic Growth is three years old. So what have we learnt?

Two weeks ago we were in Manchester to discuss the Centre’s progress (the week before we held a similar session in Bristol). These sessions are an opportunity to reflect on what the Centre has been up to, but also to think more broadly about the role of evidence in policy in a post-experts world. In Bristol we asked Nick Pearce, who ran the No 10 Policy Unit under Gordon Brown, to share his thoughts. In Manchester we were lucky to be joined by Diane Coyle, who spoke alongside Henry and Andrew on the platform. Here are my notes from the Manchester event.

*

Evidence-based policy is more important than ever, Diane pointed out. For cash-strapped local government, evidence helps direct resources into the most effective uses. As devolution rolls on, adopting an evidence-based approach also helps local areas build credibility with central government departments, some of whom remain sceptical about handing over power.

Greater Manchester’s current devolution deal is, in part, the product of a long term project to build an evidence base and develop new ways of working around it.

A lack of good local data exacerbates the problem, as highlighted in the Bean Review. The Review has, happily, triggered legislation currently going through the House of Commons to allow the ONS better access to administrative data. Diane was hopeful that this will start to give a clearer picture of what is going on in local economies in a timely fashion, so the feedback can be used to influence the development of programmes in something closer to real time.

Diane also highlighted the potential of new data sources — information from the web and from social media platforms, for example — to inform city management and to help understand local economies and communities better. We think this is important too; I’ve written about this here and here.

*

So what have we done to help? Like all of the What Works Centres, we’ve had three big tasks since inception: to systematically review evaluation evidence, to translate those findings into usable policy lessons, and to work with local partners to embed those in everyday practice. (In our case, we’ve also had to start generating new, better evidence, through a series of local demonstrator projects.)

Good quality impact evaluations need to give us some idea about whether the policy in question had the effects we wanted (or had any negative impacts we didn’t want). In practice, we also need process evaluation — which tells us about policy rollout, management and user experience — but with limited budgets, WWCs tend to focus on impact evaluations.

In putting together our evidence reviews, we’ve developed a minimum standard for the evidence that we consider. Impact evaluations need to be able to look at outcomes before and after a policy is implemented, both for the target group and for a comparison group. That feels simple enough, but we’ve found the vast majority of local economic growth evaluations don’t meet this standard.
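To make that minimum standard concrete, here’s a toy sketch of the difference-in-differences logic it points towards — compare the before/after change for the group the policy targets against the same change for a comparison group. All the numbers are invented for illustration:

```python
# Toy difference-in-differences sketch (all numbers invented).
# Employment rates before and after a hypothetical training programme.
treated_before, treated_after = 0.60, 0.68
comparison_before, comparison_after = 0.61, 0.64

# A naive before/after comparison for the treated group alone: +8 points.
naive_effect = treated_after - treated_before

# Netting out the background trend picked up by the comparison group
# leaves a +5 point estimate: (+8) - (+3).
did_effect = (treated_after - treated_before) - (comparison_after - comparison_before)

print(f"naive estimate: {naive_effect:+.2f}")
print(f"diff-in-diff estimate: {did_effect:+.2f}")
```

The gap between the two numbers is the point: without the comparison group, the estimate bundles in whatever would have happened anyway.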

However, we do have enough studies in play to draw conclusions about more or less effective policies.

The chart above summarises the evidence for employment effects: one of the key economic success measures for LEPs and for local economies.

First, we can see straight away that success rates vary. Active labour market programmes and apprenticeships tend to be pretty effective at raising employment (and at cutting time spent unemployed). By contrast, firm-focused interventions (business advice or access to finance measures) don’t tend to work so well at raising workforce jobs.

Second, some programmes are better at meeting some objectives than others. This matters, since local economic development interventions often have multiple objectives.

For example, the firm-focused policies I mentioned earlier turn out to be much better at raising firm sales and profits than at raising workforce head count. That *might* feed through to more money flowing in the local economy — but if employment is the priority, resources might be better spent elsewhere.

We can also see that complex interventions like estate renewal don’t tend to deliver job gains. However, they work better at delivering other important objectives — not least, improved housing and local environments.

Third, some policies will work best when carefully targeted. Improving broadband access is a good example: SMEs benefit more than larger firms; so do firms with a lot of skilled workers; so do people and firms in urban areas. That gives us some clear steers about where economic development budgets need to be focused.

*

Fourth, it turns out that some programmes don’t have a strong economic rationale — but then, wider welfare considerations can come into play. For example, if you think of the internet as a basic social right, then we need universal access, not just targeting around economic gains.

This point also applies particularly to area-based interventions such as sports and cultural events and facilities, and to estate renewal. The evidence shows that the net employment, wage and productivity effects of these programmes tend to be very small (although house price effects may be bigger). There are many other good reasons to spend public money on these programmes, just not from the economic development budget.

*

Back at the event, the Q&A covered both future plans and bigger challenges. In its second phase, the Centre will be producing further policy toolkits (building on the training, business advice and transport kits already published). We’ll also be doing further capacity-building work and — we hope — further pilot projects with local partners.

At the same time, we’ll continue to push for more transparency in evaluation. BEIS is now publishing all its commissioned reports, including comments by reviewers; we’d like to see other departments follow suit.

At the Centre, we’d also like to see wider use of randomised controlled trials in evaluation. Often this will need to involve ‘what works better’ settings where we test variations of a policy against each other — especially when the existing evidence doesn’t give strong priors. For example, Growth Hubs present an excellent opportunity to do this, at scale, across a large part of the country.

That kind of exercise is difficult for LEPs to organise on their own. So central government will still need to be the co-ordinator — despite devolution. Similarly, Whitehall has important brokering, convening and info-sharing roles, alongside the What Works Centres and others.

Incentives also need to change. We think LEPs should be rewarded not just for running successful programmes — but for running high-quality evaluations, whether or not the programmes turn out to work.

Finally, we and other Centres need to keep pushing the importance of evidence, and to as wide a set of audiences as we can manage. Devolution, especially when new Mayors are involved, should enrich local democracy and the local public conversation. At the same time, the Brexit debate has shown widespread distrust of experts, and the ineffectiveness of much expert language and communication. The long term goal of the Centres — to embed evidence into decision-making — has certainly got harder. But the community of potential evidence users is getting bigger all the time.


Twenty commandments

September 12, 2016

Not mine; Dani Rodrik’s. Ten for economists, ten for non-economists.

Take a look below. Read Diane’s very positive review, and this more critical one by Unlearning Economics.

Then buy the book.

 

*

Ten commandments for economists

1/ Economics is a collection of models; cherish their diversity.

2/ It’s a model, not the model.

3/ Make your model simple enough to isolate specific causes and how they work, but not so simple that it leaves out key interactions among causes.

4/ Unrealistic assumptions are OK; unrealistic critical assumptions are not OK.

5/ The world is (almost) always second best.

6/ To map a model to the real world you need explicit empirical diagnostics, which is more craft than science.

7/ Do not confuse agreement among economists for certainty about how the world works.

8/ It’s OK to say ‘I don’t know’ when asked about the economy or policy.

9/ Efficiency is not everything.

10/ Substituting your values for the public’s is an abuse of your expertise.

Ten commandments for non-economists

1/ Economics is a collection of models with no predetermined conclusions; reject any arguments otherwise.

2/ Do not criticise an economist’s model because of its assumptions; ask how the results would change if certain problematic assumptions were more realistic.

3/ Analysis requires simplicity; beware of incoherence that passes itself off as complexity.

4/ Do not let maths scare you; economists use maths not because they’re smart, but because they’re not smart enough.

5/ When an economist makes a recommendation, ask what makes him/her sure the underlying model applies to the case at hand.

6/ When an economist uses the term ‘economic welfare’, ask what s/he means by it.

7/ Beware that an economist may speak differently in public than in the seminar room.

8/ Economists don’t (all) worship markets, but they know better how markets work than you do.

9/ If you think all economists think alike, attend one of their seminars.

10/ If you think economists are especially rude to non-economists, attend one of their seminars.

 

Big data and local growth policy

March 11, 2016

Twitter.

I’ve written a couple of posts for the What Works Centre on how to use new data sources, and data science techniques, in designing and evaluating local growth programmes.

In parts of the interweb ‘Big Data’ is now such a cliché that dedicated Twitter bots will dice up offending content – see above. But in local economic development, and urban policy more broadly, researchers and policymakers are only beginning to exploit these resources.

The first post lays out the terrain, concepts and resources. The second post is more focused on evaluation, research design and delivery.

Happy reading!

 

New job

September 1, 2015

New St Signal Box (c) 2012 Max Nathan

Some news! I’m starting a new job at Birmingham University in October, based in the Business School.

I’m going to be a Senior Birmingham Fellow in Regional Economic Development. This is basically an Assistant Professor position, but focused on research until 2020 after which it converts to a regular tenured slot. So it’s a lovely thing to have.

I’ll be working on a bunch of projects looking at technology ecosystems and clusters across the UK and further afield, as well as helping out with the new City-REDI initiative. More details on those soon. I’m also planning to continue research on diversity and migration, with new colleagues at the Institute for Research into Superdiversity. And I’m looking forward to kicking off new ideas with colleagues in Geography, Economics and elsewhere on campus.

In true academic fashion I’m keeping my LSE affiliation, and will continue at the What Works Centre there. I’m also becoming a visiting fellow at NIESR.

 

What I did in New Zealand

August 4, 2015

Matiu / Somes Island. (c) 2015 Max Nathan

Am back from New Zealand and just about over the jetlag. Thanks again to Motu and the Caddanz team for hosting me. I’m already plotting a return trip …

Here’s my talk from the Pathways conference. This is on the economics of migration and diversity, and brings together various projects from the past few years.

Here are slides and audio from my public policy talk at Motu. This looks at the What Works agenda in the UK, particularly the work of the What Works Centre for Local Economic Growth, and some of the opportunities and challenges these institutions face.

Experimenting on yourself

August 29, 2014

A recent post for the What Works Centre that I thought would be good here too.

*

At the What Works Centre we’re keen on experiments. As we explain here, when it comes to impact evaluation, experimental and ‘quasi-experimental’ techniques generally stand the best chance of identifying the causal effect of a policy.

Researchers are also keen to experiment on themselves (or their colleagues). Here’s a great example from the Journal of Economic Perspectives, where the editors have conducted a randomised controlled trial on the academics who peer-review journal submissions.

Journal editors rely on these anonymous referees, who give their time for free, knowing that others will do the same when they submit their own papers. (For younger academics, being chosen to review papers for a top journal also looks good on your CV.)

Of course, this social contract sometimes breaks down. Reviewers are often late or drop out late in the process, but anonymity means that such bad behaviour rarely leaks out. To deal with this, some journals have started paying reviewers. But is that the most effective solution? To find out, Raj Chetty and colleagues conducted a field experiment on 1,500 reviewers at the Journal of Public Economics (where Chetty is an editor). Here’s the abstract:

We evaluate policies to increase prosocial behavior using a field experiment with 1,500 referees at the Journal of Public Economics. We randomly assign referees to four groups: a control group with a six-week deadline to submit a referee report; a group with a four-week deadline; a cash incentive group rewarded with $100 for meeting the four-week deadline; and a social incentive group in which referees were told that their turnaround times would be publicly posted. We obtain four sets of results.

First, shorter deadlines reduce the time referees take to submit reports substantially. Second, cash incentives significantly improve speed, especially in the week before the deadline. Cash payments do not crowd out intrinsic motivation: after the cash treatment ends, referees who received cash incentives are no slower than those in the four-week deadline group. Third, social incentives have smaller but significant effects on review times and are especially effective among tenured professors, who are less sensitive to deadlines and cash incentives. Fourth, all the treatments have little or no effect on rates of agreement to review, quality of reports, or review times at other journals. We conclude that small changes in journals’ policies could substantially expedite peer review at little cost. More generally, price incentives, nudges, and social pressure are effective and complementary methods of increasing pro-social behavior.
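The mechanics of that four-arm assignment are simple to sketch. Here’s a toy version — the referee IDs, arm names and fixed seed are all my own invention for illustration, not anything from the actual study:

```python
import random

# Toy sketch of randomly assigning 1,500 referees to the four arms
# described in the abstract (all identifiers invented for illustration).
arms = ["control_6wk", "deadline_4wk", "cash_4wk", "social_posted"]
referees = [f"referee_{i}" for i in range(1, 1501)]

rng = random.Random(42)          # fixed seed so the split is reproducible
shuffled = referees[:]
rng.shuffle(shuffled)

# Deal the shuffled list round-robin into four equal groups of 375.
assignment = {arm: shuffled[i::4] for i, arm in enumerate(arms)}
for arm, group in assignment.items():
    print(arm, len(group))
```

Randomising at the individual level like this is what makes the four groups comparable, so differences in turnaround times can be pinned on the treatments.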

*

What can we take from this?

First, academics respond well to cash incentives. No surprise there, especially as these referees are all economists.

Second, academics respond well to tight deadlines – this may surprise you. One explanation is that many academics overload themselves and find it hard to prioritise. For such an overworked individual, tightening the deadline may do the prioritisation for them.

Third, the threat of public shame also works – especially for better-paid, more senior people with a reputation to protect (and less need to impress journal editors).

Fourth, this experiment highlights some bigger issues in evaluation generally. One is that understanding the logic chain behind your results is just as important as getting the result in the first place. Rather than resorting to conjecture, it’s important to design your experiment so you can work out what is driving the result. In many cases, researchers can use mixed methods – interviews or participant observation – to help do this. Another is that context matters. I suspect that some of these results are driven by the power of the journal in question: for economists the JPubE is a top international journal, and many researchers would jump at the chance to help out the editor. A less prestigious publication might have more trouble getting these tools to work. It’s also possible that academics in other fields would respond differently to these treatments. In the jargon, we need to think carefully about the ‘external validity’ of this trial. In this case, further experiments – on sociologists or biochemists, say – would build our understanding of what’s most effective where.

 

A version of this post originally appeared on the What Works Centre for Local Economic Growth blog.

Spaces of Evidence seminar, 26 September

June 27, 2014

(c) richard serra / max nathan

I’m speaking at Goldsmiths in September, at one of the ESRC Spaces of Evidence seminars which will look at different types of economic evidence, their characteristics and limitations, and their uses in policy-making.

Will Davies, the organiser, has put together a nice lineup including Angus Deaton (Princeton), Suzy Moat (Warwick), Martin Giraudeau (LSE), Tiago Mata (UCL), Zsuzsanna Vargha (Leicester) and Vera Ehrenstein (Goldsmiths).

Here’s the blurb:

Economics and economists have a long history of providing a scientific basis or justification for public policy decisions. Concepts derived from welfare economics, such as ‘market failure’, have provided a language through which politicians and government officials can understand where and why the state might (and might not) intervene in market processes. The efficiency of potential regulation can be tested through the use of models, based on neo-classical assumptions.

However, events such as the financial crisis have thrown a renewed scepticism upon the capacity of orthodox economic theories to adequately model situations. At the same time, a new empiricism has emerged, which makes a bold appeal to data and field trials, which are purportedly less cluttered by normative assumptions about causality and probability. ‘Big Data’ and randomised controlled trials are at the forefront of new efforts to probe economic activity, in search of policies which ‘work’. The distinction between ‘model’ and ‘reality’ is abandoned, and the economy becomes treated as a zone of experimentation and data-mining, such that behavioural patterns can be discerned.

The seminar will explore the implications of these new directions in economic evidence, and ask what they mean for the authority of public policy, how they reconfigure expertise, and what types of epistemological and political assumptions they conceal.

It’s open to all, but you’ll need to register. Full details are here.

Same Difference?

June 25, 2014


I have a new article out in the Journal of Economic Geography. Originally one of my PhD papers, it looks at the demography of innovation, particularly the roles of what I call ‘minority ethnic inventors’.

Here’s the abstract:

Minority ethnic inventors play important roles in US innovation, especially in high-tech regions such as Silicon Valley. Do ‘ethnicity–innovation’ channels exist elsewhere? Ethnicity could influence innovation via production complementarities from diverse inventor communities, co-ethnic network externalities or individual ‘stars’. I explore these issues using new UK patents microdata and a novel name-classification system. UK minority ethnic inventors are spatially concentrated, as in the USA, but have different characteristics reflecting UK-specific geography and history. I find that the diversity of inventor communities helps raise individual patenting, with suggestive influence of East Asian-origin stars. Majority inventors may benefit from multiplier effects.

The full paper is here. You can also read the working paper version, though bear in mind there are some differences to the final edit. I’ll update this post at some point with a proper pre-print.

 

My new book

May 23, 2014

I have a book out: Urban Economics and Urban Policy, written with Henry Overman and Paul Cheshire, and published by Edward Elgar.

In a nutshell, it’s ‘economic urbanism’. We bring together the last two decades of work by economists and economic geographers on urban issues, and distil some high-level lessons for policymakers. We look at trends in city growth and change, spatial disparities and urban housing/labour markets, as well as evaluating a range of urban policies.

The focus is on the UK, and especially work done at LSE’s Spatial Economics Research Centre since 2008. You can read the first chapter here.

*

The book began as a kind of greatest hits compilation for SERC, but has morphed into a broader attempt to show what economists (and economic geographers like me) can bring to cities and urban analysis.

Economics’ influence on urban policy has historically been very limited: urban thinking has been dominated by architects, planners and governance types.

In part, this is because economists haven’t been very interested in space until recently. As Paul pointed out at the book’s launch, economics 101 classes mention the three factors of production – capital, labour and land – after which land is rarely (if ever) discussed again. That has only really begun to change in the last decade or so, with the very obvious death of ‘death of distance’ arguments, and people like Paul Krugman and Ed Glaeser making their influence felt in the profession. (Ed kindly wrote the foreword for our book.)

It’s also because spatial economic concepts and techniques are fiddly and difficult to explain. Dealing with spatial autocorrelation is rarely as glamorous or compelling as iconic buildings or big political personalities. Evan Davis did economic geographers everywhere a great service with the Mind the Gap series, which did a bravura job of distilling agglomeration, knowledge spillovers and path-dependence into everyday language.

And of course lessons from spatial economics aren’t always ones policymakers want to hear. Urban systems tend to build in spatial differences, and these inequalities are self-reinforcing and hard for policy to reverse. Many urban policies are effective, but many popular ones – such as Enterprise Zones or cluster programmes – often don’t have much impact.

*

In turn, that highlights both the advantages and limitations in the economic urbanist’s approach. City leaders should take economic ideas and analysis seriously, especially when making decisions about housing, planning or development. The book is an attempt to put economic thinking back in the room. But we can’t reduce cities to purely economic processes: as objects or systems, they are too complex and chaotic for that. And as Max Weber says:

… The explanation of everything by economic causes alone is never exhaustive in any sense whatsoever, not even in the … economic sphere itself. In principle, a banking history of a nation which adduces only economic motives for explanatory purposes is naturally just as unacceptable as an explanation of the Sistine Madonna as a consequence of the social-economic basis of the culture of the epoch in which it was created.

That logic also applies to policy choices. In practice we often have to trade off economic, social and environmental goals – when planning new roads or houses, for instance. Citizens’ welfare is rather wider than economic welfare, and we should avoid collapsing the first into the second.

Given those complexities, economists need to be mindful of real-world priorities and politics when giving policy advice. (As do others – Richard Rogers’ reductionist readings of Jane Jacobs have not been very helpful in the UK, for example.) The What Works Centre for Local Economic Growth, which I’m helping to run, is one attempt to translate quantitative academic analysis from a range of fields into feasible, pragmatic policy ideas.

For an economic geographer, co-authoring a book with two economists proper is a rewarding experience – and a challenging one. The three of us didn’t agree on everything: as you can imagine, my views on regeneration, brownfield development and place-based policies are more optimistic than those of some of my co-authors. In the book we carefully flag who led on each chapter, and which work is genuinely joint.

*

I hope all that’s encouraged you to take a further look. The hardback is painfully expensive, as academic books always are. The ebook edition is quite a lot cheaper. Either way, order it from the EE website and use the code CHES35 to get yourselves a 35% discount. Happy reading!

Migrant entrepreneurs: fuzzy numbers and real impacts

March 24, 2014

The Centre for Entrepreneurs think tank recently made waves with this report on migrant entrepreneurship. The headlines are striking: 450,000 migrants set up 1 in 7 UK companies, at almost twice the rate of the UK population (17% vs 10%).

Here are some reactions. Overall, this is a welcome piece of work. However, I have some reservations about the numbers. And the report – understandably – doesn’t address some of the big issues where we still need answers.

*

First, praise where it’s due. This is completely new analysis which reflects serious effort. The actual analysis was done by DueDil, who crunched around 3m raw observations to get these numbers. (At NIESR we’ve been working with similar data: it’s a lot of work.)

The analysis has also shone some much-needed light on the role of migrants in entrepreneurship. As CFE point out, policymakers are only just starting to take this stuff seriously. Canada and Chile already have proper start-up visas. The US seems stuck in endless discussions. The UK is still getting there. And the underlying evidence base is under-developed.

*

Second, the data. It’s clear there’s a story here, and this is why the CFE work is welcome. But push a bit and the numbers are less convincing. It’s hard to tell whether the true numbers are higher or lower: the results are fuzzy.

One big issue is that there’s no correction for corporate structure – the raw data is legal entities, not businesses, and the true number of firms could be a lot smaller than the count of corporations. The relative contribution of non-UK entrepreneurs could then be higher or lower than reported.

Another is that companies with migrant and UK-born founders are counted as migrant-founded. That makes sense. But without knowing what share of ‘migrant-founded’ companies are co-founded it’s hard to be clear on the true migrant contribution.

A third issue is that the data provides the nationality of company directors, not birth country. As CFE point out, this likely underestimates the count of migrant entrepreneurs (since many won’t take UK citizenship). But since there are also many more migrants than non-UK nationals in the wider workforce (see here), this likely reduces the share of migrant entrepreneurship. The report shows that 17% of non-UK nationals set up companies in the UK. Based on LFS figures, for that figure to hold for migrants the data would have to uncover a further 290,000-odd migrant entrepreneurs on top of the 450,000-odd non-UK nationals already identified. I’m not sure that’s really plausible.
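For what it’s worth, here is the back-of-envelope version of that check. The workforce totals below are my own rough, illustrative assumptions — not the actual LFS figures — chosen only to show how the arithmetic works:

```python
# Back-of-envelope check on the 17% figure.
# All inputs are rough, illustrative assumptions, not the actual LFS numbers.
founders_non_uk_nationals = 450_000   # identified in the report
founder_rate = 0.17                   # reported rate for non-UK nationals

# Implied population of non-UK nationals behind the 17% figure: ~2.6m.
non_uk_nationals = founders_non_uk_nationals / founder_rate

# Suppose the foreign-born (migrant) workforce is larger - say ~4.35m
# (an assumed figure for illustration).
migrants = 4_350_000

# For the 17% rate to hold for migrants too, you'd need ~740k founders...
required_founders = founder_rate * migrants

# ...i.e. roughly 290k migrant entrepreneurs on top of those already found.
extra_needed = required_founders - founders_non_uk_nationals
print(f"extra migrant founders needed: {extra_needed:,.0f}")
```

The bigger the gap between the migrant and non-UK-national populations, the more extra entrepreneurs the data would have to contain for the headline rate to survive the change of definition.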

Full disclosure: We’ve asked DueDil for the raw data so we can see how they did the analysis. We’ve also put our questions to them: they haven’t responded yet, but I’ll update this post when they do. 

*

More broadly, this is descriptive analysis, which understandably doesn’t try to look at impacts (exploring these is a project in itself). CFE are upfront about this, although they suggest those impacts are likely to be large and positive.

I think there are (at least) four big questions for further research.

1/ What is the relative economic impact of migrant-founded companies versus co-founded and native-founded ones? Do they tend to do better or worse in terms of sales, productivity, attracting finance or coming up with new ideas?

2/ What explains this? Is it a ‘migrant x-factor’, something about the company, an industry-level effect, or location (in a city, say)? Or some combination of these?

3/ What is the additional impact of migrant top team members? There’s no counterfactual here, but one possible workaround is to look at companies where a migrant director joined, and compare the change in their performance against similar companies where this didn’t happen. The data in the CFE report would provide a great basis to do this.

4/ What is the distributional impact of migrant (co-)founded companies? Do new migrant firms tend to complement or displace UK-founded companies? The evidence suggests that more competition in an industry means some firms innovate out of trouble, while others exit.

Working out who the winners and losers are is politically crucial. As consumers, we may just want the best and cheapest goods and not care who makes them. But government needs to decide whether it is interested in competitive markets per se, or in the competitive position of UK businesses. Industrial policies often get into trouble for this reason. But the same issues likely apply to immigration policy too.