Immigration is a vast and complex phenomenon; no one person can possibly apprehend it on their own, even someone who knows many migrants, or is or has been a migrant herself. All of us have views based on some combination of personal experience, second-hand reports, and information we’ve received from various forms of ‘media’ – including newspapers, TV news, and even entertainment – that might expose us to ideas about and examples of migration. It is very hard to figure out where people get their information, and how they combine it over a period of many years to form their beliefs and impressions.
For several years, my research has focused on both immigration attitudes and coverage of migration in the British press. My collaborators and I have highlighted the ways in which both public perceptions of immigrants and media coverage of migration can diverge from “reality” (as measured by our best sources of data, which of course are imperfect as well). “Thinking Behind the Numbers” (and its further development in “Imagined Immigration”) showed that people in Britain are much more likely to have asylum seekers in mind than students when they think about immigration, even though students have made up a much larger share of immigration in-flows to Britain for many years. In “Migration in the News”, Will Allen and I found that ‘illegal’ is by far the most prevalent descriptor of ‘immigrants’ in British newspapers, even though the vast majority of Britain’s migrant population does have legal status.
In a new project, co-author Anne-Marie Jeannet and I begin to bring these pieces together. We report on a series of survey experiments – conducted with a representative sample of the British population through the web-based polling firm YouGov – to see whether exposure to actually-occurring media portrayals of immigrants in Britain can affect attitudes toward and perceptions of immigrants among members of the British public.
A survey experiment is a very useful tool for examining the causes of opinion change and formation, although it has its limits as well. It takes place in the context of the normal sort of survey, questionnaire or poll that is commonly used to measure public attitudes toward political issues, or any other topic that a researcher or pollster might ask about. To embed an experiment in a survey, however, the experimenter randomly divides the sample of respondents into two or more groups and gives each group a version of the questionnaire that differs in one key respect, while keeping everything else the same. Random assignment is what makes the groups comparable. Often one group is designated as the ‘control group’, while the other groups receive a ‘treatment’.
This language – and logic – is perhaps more familiar in the context of medical studies in which one group is given a placebo and one or more other groups receive an actual medication, to test whether the treatment actually works. The idea is to compare outcomes for the treatment group(s) with the control or placebo group, to isolate the true impact of the treatment. Experiments are very good at testing causal relationships, as opposed to mere correlation.
In the social science survey context, a similar logic applies. In our survey, we presented groups with a short news item before they answered a series of questions about migration and other topics. The ‘treatments’ in this case were not medicinal but informational. We created multiple versions of the news item, each of which presented a slightly different message about immigration figures. These treatments were designed to represent some of the findings from the “Migration in the News” study of actually-occurring portrayals of migrants by contemporary British newspapers.
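The assignment logic described above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors’ actual procedure: the condition labels and the sample size of 1,200 are assumptions made for the example.

```python
import random

# Hypothetical condition labels standing in for the news-item versions
# described in the text (one control plus several informational treatments).
CONDITIONS = ["control", "numbers", "flood", "illegal",
              "eastern_europe", "high_skill"]

def assign_conditions(respondent_ids, seed=42):
    """Randomly assign each respondent to one condition.

    Random assignment is the key step: it makes the groups comparable
    on average, so differences in later answers can be attributed to
    the version of the news item each group read.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    return {rid: rng.choice(CONDITIONS) for rid in respondent_ids}

# Assumed sample of 1,200 respondents, purely for illustration.
assignment = assign_conditions(range(1200))
counts = {c: sum(1 for g in assignment.values() if g == c)
          for c in CONDITIONS}
```

With independent random draws, each condition ends up with roughly 1200/6 = 200 respondents, though the exact counts vary by chance.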
For example, the control version of our news story simply reported that official statistics showed continued migration to Britain in 2010. Treatments changed this line in several ways: one group got this message with a ‘numbers frame’ that included official statistics on migration flows; another got a story that spoke metaphorically of migration as a ‘flood’. Other versions mentioned migrants residing ‘illegally’ in Britain, migrants coming to Britain from Eastern Europe, and contributions to Britain from highly-skilled migrants, respectively. The idea was to see whether any of these ways of portraying migrants, given in one small but concentrated dose, had any measurable impact on the attitudes toward or perceptions of migrants that people report on surveys (and which form the evidence base on public opinion that informs policy-makers and media reports).
In the end, we find support for the notion that even subtle shifts in a single sentence in a news story can shift public conceptions of immigration. Although many of the treatments showed no measurable impact on the responses, the high-skill migration frame produced statistically significant changes on several key measures. In particular, the group exposed to the high-skill migration treatment showed changed perceptions of who immigrants are. When asked which groups they normally thought of when thinking of immigrants, respondents who had read the high-skill treatment were less likely than the control group to report thinking of migrants as asylum seekers, more likely to think of migrants as workers, and marginally less likely to think of migrants as ‘illegal’. The high-skill frame also led to reduced assessments of the proportion of the UK’s migrant population that is illegally resident in the UK. This treatment group gave an average guess of 15%, compared with 20% by the control group.
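The comparison behind the 15% versus 20% finding is a simple difference in group means. The sketch below uses simulated data only: the guesses are drawn from hypothetical distributions whose averages mirror the reported figures, and the group size of 200 is an assumption for the example, not the study’s actual cell size.

```python
import random
import statistics

# Simulated "what share of the UK's migrants are illegally resident?"
# guesses (0-100 scale). Means are chosen to echo the reported averages:
# roughly 20 for the control group and 15 for the high-skill treatment.
rng = random.Random(0)

def simulate_guesses(mean, n=200, sd=10):
    """Draw n hypothetical percentage guesses, clipped to [0, 100]."""
    return [min(max(rng.gauss(mean, sd), 0.0), 100.0) for _ in range(n)]

control = simulate_guesses(mean=20)
treated = simulate_guesses(mean=15)

# The estimated treatment effect is simply the difference in means;
# a negative value means the high-skill frame lowered the average guess.
effect = statistics.mean(treated) - statistics.mean(control)
```

In the actual study the analogous comparison would be run with a significance test to check that the roughly five-point gap is unlikely to arise by chance alone.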
Finally, both the high-skill treatment and the Eastern European migration treatment had an impact on people’s preferred immigration policies. In particular, those exposed to the Eastern European migration frame were a bit less likely to prefer reduced immigration numbers, compared with the control group.
It is important not to get carried away with ‘media effects’. Our study found mostly null effects. And when we did find statistically significant effects, they were small in size. Perhaps most people have established views about migration that are not necessarily easy to influence; although media coverage may have helped those views to form and to solidify, this long-term development is extremely difficult to isolate and measure, and is not amenable to study by controlled experimentation.
We suspect that the high-skill frame had the most consistent impact because it is the least common of the frames we chose. Most of the others were based on some of the most widely-used language in the British press. It might not have made much difference to hear about large numbers of migrants, or ‘illegal’ immigration, for the umpteenth time, if that is the way most of our respondents are normally thinking about migrants anyway. On the other hand, mentioning high-skilled migration might have added some new information or momentarily shifted the way our respondents were imagining immigration.
However, those small changes may nonetheless point to substantively important effects. If changing one sentence in the lead-up to a survey question can produce even small changes in the perceptions and attitudes that respondents express, then repeated exposure to similar messages over the course of many years may well add up to larger effects. Also, while it is important not to overestimate the direct role of the media, it is equally important to keep in mind its indirect role. Classic theories of political communication postulated a ‘two-step flow’, in which more attentive members of the public read and listened to news media and then spread (selectively) what they had learned to others through casual conversation. On a topic such as immigration, which is highly salient and much-discussed, the media’s influence may not be limited to direct impact on attitudes, but may also extend to informing the conversations that British people have in their daily lives, which in turn shape their thoughts, impressions, and beliefs.
For the full study, download the conference paper.