Archive for June, 2017

Doing broadband better

June 30, 2017

A What Works Centre post I thought would be good here too.

Our new broadband toolkit takes a look at how policymakers (nationally and locally) can increase broadband takeup.

Why do we care about this? From a social point of view, it’s increasingly clear that decent internet access is a citizen right (especially as many public services are being shifted online). From an economic angle, evidence from our broadband review shows that household broadband takeup can have positive effects on house prices, female labour market participation, employment, firm growth, and economic growth. (Household adoption is also strongly linked to firm adoption.)

With that in mind, the toolkit looks at three different areas where public policy can step into the market.

*

The first toolkit looks at direct public and public-private provision. EU State Aid rules limit what member governments can do here, but even so there is room to provide networks in rural areas. And if Brexit means these rules no longer apply, the UK can (potentially) make big investments.

We find that direct public provision raises broadband takeup, even in countries like the US where the private sector is already active. We also found that an Italian programme successfully raised takeup in rural areas. Crucially, though, the US evidence suggests that provision on its own is only half as effective as provision combined with info/education on broadband’s benefits.

Frustratingly, it’s hard for us to say much on cost-effectiveness on the basis of the available evidence, although it’s clear that there are huge variations in programme costs across countries. Some of this reflects physical ruggedness (Norway is tougher to pipe than the Netherlands), but there may also be potential to deliver schemes more smartly than the UK currently does.

*

The second toolkit looks at ‘local loop unbundling’ – essentially, opening up the last mile of broadband networks to all-comers. Currently only BT is obliged to do this in the UK, but in theory, national government could bring cable providers into the scope of the legislation. There’s also the question of access fees and other arrangements.

The argument against LLU is a simple one: firm X has invested in a very expensive network, so why should others get the benefit, and why should X invest more in future if they are forced to open up access? (In BT’s case this is complicated by the company’s history as a state monopoly.) The counter-argument is also simple: the wider benefits to society (via higher broadband takeup) outweigh losses to a single firm, which can in any case be compensated through charging for access.

The available evidence shows a pretty clear impact of LLU on household broadband takeup (it’s less clear for firms). Across the EU, between 2000 and 2010, LLU raised household broadband adoption by 15%. Perhaps surprisingly, the majority of studies also find no evidence that LLU crowds out future investment by the network owner, and one study finds it may even lead to network upgrading. This seems partly explained by relatively high access charges in most countries.

We’d urge some caution here though. For the UK, the positive effects of LLU have decreased over time, as broadband rollout covers more and more of the population. That might change if future technology gets a lot better, but this seems unlikely any time soon. Setting lower access charges would bump up the adoption effect, but risks discouraging future investment.

*

The third toolkit looks across other tools that governments have used to nudge providers and users: incentives (loans, subsidies, tax breaks), information campaigns, and demand-side measures such as buying clubs and bulk purchasing.

Sometimes policy works directly with providers (ISPs); at other times the nudges are applied to users (firms and households). Does it matter which? Not from the point of view of takeup: we find evidence that both kinds of measure can be effective. Specifically, loans to ISPs and demand aggregation measures like buying clubs raise downstream takeup. For direct-to-user policies, subsidies, training and providing computers are all effective.

For providers, a cross-OECD study finds that a bundle of producer incentives raises fibre uptake by 10% (from slower copper networks). On the user side, a US study shows that a partial loans scheme for farms raised broadband takeup by 13-14 percentage points.

Again, it’s not easy to figure out cost-effectiveness, but we estimate that typical household programmes cost £1.1-1.3k per extra connection, with the farms scheme above costing around £3-4k per farm connected.
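Those per-connection figures are essentially programme spend divided by incremental connections (takeup with the scheme minus estimated takeup without it). A back-of-envelope sketch, where every number is hypothetical rather than drawn from the toolkit’s underlying studies:

```python
# Back-of-envelope cost-per-extra-connection arithmetic.
# All figures below are hypothetical, chosen only for illustration.

programme_cost = 2_600_000   # total spend in £ (hypothetical)
households_reached = 10_000  # households in the treated area
takeup_with = 0.55           # adoption rate after the programme
takeup_without = 0.35        # estimated counterfactual adoption rate

# Extra connections attributable to the programme.
extra_connections = households_reached * (takeup_with - takeup_without)

# Cost per incremental connection.
cost_per_extra = programme_cost / extra_connections
print(f"£{cost_per_extra:,.0f} per extra connection")
```

The hard part in practice is the counterfactual takeup rate, which is exactly what a proper evaluation would pin down.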

The UK has strongly pushed programmes like this, notably the last government’s SME broadband voucher scheme. Sadly, it has never been properly evaluated (something we’d like to see change for future schemes) – although a user survey was published which (perhaps not surprisingly) found that voucher recipients were very happy with £3k off the cost of a fast broadband line …

*

Originally posted here on 23 June 2017.

YouGov called the election right. How?

June 30, 2017

This General Election has been full of surprises. So I’ve been digging into the YouGov MRP voting model, pretty much the only one that got the 2017 Election result correct.

Given all the current humble pie and book eating by pundits who didn’t spot the result coming, this seems worth doing. I also think there are some useful takeaways for cities, especially as devolution rolls on.

*

YouGov’s MRP (multilevel regression and post-stratification) method not only got the national result right, but correctly predicted results in 93% of seats. Compare this to most other polls [£, and chart below]. The model somehow also predicted the Canterbury result, where Labour won for the first time since 1918.

It turns out that earlier versions of an MRP model also spotted the Leave vote in 2016, and did pretty well in the US 2016 Presidential Election, though this version seems to have worked better, for reasons I’ll come back to.

Remember, this is a predictive method — how might people vote in the future? — that did about as well as the main exit poll — which asked people *how they just voted*. (John Curtice has more on how UK exit polling is done here.)

*

So how does the MRP model work? Here’s an overview: this gives us the main features but, not surprisingly, doesn’t reveal all the datasets or the functional form(s). YouGov describe this as a Big Data approach; it seems to involve bespoke data and data science methods, but also lots of public datasets, aka ‘administrative Big Data’. The key steps seem to be:

1/ YouGov have weekly individual-level data on voting intention and detailed characteristics (including past voting). They run around 50k online interviews per week, and anyone can sign up;

2/ They use this to build a typology of voter types;

3/ For each voter type, they then fit a model that predicts voting intention;

4/ For each constituency, they then estimate how these types are spread (using public resources like the British Election Study and other ONS resources, perhaps these);

5/ They work out how the vote should go in each constituency.
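Steps 2–5 above can be sketched in miniature. In the toy version below, the voter types, parties and every proportion are invented for illustration; this is not YouGov’s actual model or data:

```python
# A toy post-stratification step. All names and numbers are made up.

# Steps 2-3: estimated P(vote | voter type), which in the real model
# comes from a multilevel regression fitted to the large online panel.
vote_model = {
    "young_renter": {"Lab": 0.60, "Con": 0.25, "Other": 0.15},
    "older_owner":  {"Lab": 0.30, "Con": 0.55, "Other": 0.15},
    "middle_swing": {"Lab": 0.45, "Con": 0.45, "Other": 0.10},
}

# Step 4: how those types are spread in one hypothetical constituency,
# e.g. estimated from census / British Election Study data.
constituency_mix = {
    "young_renter": 0.35, "older_owner": 0.40, "middle_swing": 0.25,
}

def predict_constituency(mix, model):
    """Step 5: weight each type's predicted vote by its local share."""
    shares = {}
    for vtype, local_share in mix.items():
        for party, prob in model[vtype].items():
            shares[party] = shares.get(party, 0.0) + local_share * prob
    return shares

print(predict_constituency(constituency_mix, vote_model))
```

The modelling effort all sits in estimating the two inputs; the post-stratification itself is just this weighted sum, repeated for each of the 650 constituencies.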

By contrast, traditional polls tend to sample about 1,000 people, then project direct from respondents to the whole UK, using weights to compensate for demographics, voting intention and so on.
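That weighting step can be sketched too. A toy version with a single demographic variable and invented numbers (real polls weight on many variables at once, usually via raking):

```python
# Toy conventional-poll weighting: rescale respondents so the sample's
# demographics match known population shares. All numbers are invented.

population_share = {"young": 0.30, "old": 0.70}  # known totals (hypothetical)

respondents = [  # (age_group, stated_vote) - toy data, not a real poll
    ("young", "Lab"), ("young", "Lab"),
    ("old", "Con"), ("old", "Lab"), ("old", "Con"),
]

# Weight = population share / sample share for each group.
sample_share = {
    group: sum(1 for g, _ in respondents if g == group) / len(respondents)
    for group in population_share
}
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Weighted vote-share estimate for the whole population.
vote_totals = {}
for group, vote in respondents:
    vote_totals[vote] = vote_totals.get(vote, 0.0) + weights[group]
total_weight = sum(weights[g] for g, _ in respondents)
estimate = {party: t / total_weight for party, t in vote_totals.items()}
print(estimate)
```

The whole sample carries every constituency, which is why a 1,000-person poll can say little about any individual seat.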

This helps us see why an MRP approach might work better than conventional methods.

First, MRP has a much bigger starting sample. More observations = sharper results.

Second, MRP is micro-to-macro: it models each constituency individually, so stands a better chance of picking up local issues (such as the hospital closure crisis which helped drive the Canterbury result).

Third, MRP is both fine-grained and high-frequency. The only pundits to pick up on the reality of #GE2017 got out there on the ground. Given the complexity of UK politics right now, we also need methods to get at this complexity in a structured way.

Fourth, MRP methods should get better over time: you end up with loads of high-frequency training data, which progressively improves the model.

*

This doesn’t mean that conventional polling has had its day – e.g. Survation were also on the money. But it’s notable that most conventional polls fell over this time, just as they did in the 2015 General Election.

I suspect that these four factors helped YouGov pick up higher turnout for younger voters faster than most pollsters (and many mainstream journalists), as well as shifts in other age groups. This post by Ben Lauderdale, one of their chief modellers, seems to support that. (Note that we won’t know turnout by age for sure until the next BES in a few months. If modelling can get us to a decent understanding faster, that’s very useful.)

As Sam Freedman points out, it also helps show precisely, and in close to real time, the huge damage the Conservative manifesto did to the party’s chances.

*

Micro-to-macro techniques like MRP could be useful for Mayoral elections and city politics. With a 50k sample, could you train the model on a city-region like the West Midlands using public data? If so, this feels much more useful and adaptable than one-off traditional polling.

YouGov say their model works at local authority level, so some version of this could probably be done now. However, I suspect that even a big national sample might be too sparse for very local analysis, say at neighbourhood level. In that case, you could imagine building a richer, locally-specific model for a whole conurbation — like the West Midlands or Greater Manchester — using a big base of local respondents.

This would be expensive — but for a local university, or a group of them, it would be a super interesting (and public-spirited) long-term investment.

Birmingham University’s city-regional lab City-REDI will be exploring this further in the coming months.
