
What does a ‘post-facts’ world mean for migration research and evidence?

Published 1 November 2016 / By William Allen


The British public, as was famously said during the EU referendum campaign, ‘has had enough of experts’. Meanwhile, the number of false statements made by US presidential candidates in the 2016 campaign exceeds that of the previous two elections, with Donald Trump making about four times as many as Hillary Clinton by some counts. It would be understandable to conclude that facts, and the ‘experts’ who trade in them, have taken a rubbishing this year.

A 2013 Migration Observatory commentary seems particularly relevant to the current state of affairs:

…[I]t is important to recognise that evidence—in the form of data and analysis about the scale and impacts of migration—is only one side of what drives decisions on migration policy. The other side is perhaps less tangible, but no less consequential: values.

These intangible views about what is fair, or what kind of society people want to live in, are also important factors in what people think about migration. So, where does this leave social scientists and their ‘evidence’? I’d argue that researchers shouldn’t retreat or give up. Instead, they should engage more deeply with their assumptions and their established ways of communicating with the public. Part of this will involve actively listening to the groups they work with.

Research evidence in civil society

One such group that migration researchers often involve in their projects is civil society, comprising organisations—large and small—that sit outside the public and private sectors. These might include non-governmental organisations, charities, or voluntary groups. When I asked staff from civil society organisations around the UK how they viewed research ‘evidence’, they mentioned at least three different meanings. In some cases, it can refer to knowledge that fills a gap or identifies new problems. This is perhaps closest to an academic point of view. But in other cases, organisations use evidence to illustrate a pre-determined problem, or to support an established approach. Finally, staff described how ‘evidence’ conveyed authority, or demonstrated impact, to their funders and users.

Why did these staff members perceive and use ‘evidence’ differently? Part of the answer involved their intended audiences: policy makers have different needs than service providers, for example. Another part lay in differing levels of skill and confidence in handling or interpreting research. The ever-changing issue areas these organisations worked on—as well as public opinion towards them—also influenced which kinds of research were valued over others. ‘So, sometimes’, a senior manager told me, ‘evidence-based research is things that are done quickly and fast just to keep the momentum of an issue’.

Doing ‘relevant’ research

These kinds of reasons—timeliness, relevance, accessibility—echo what others have found in media and policy settings. (In fact, there is a whole journal dedicated to studying how ‘evidence’ is used across many disciplines and spheres.) In short, different contexts have different demands, norms, and values. This makes some research content or presentation styles more palatable than others.

Some would argue that social scientists need to do more ‘policy-irrelevant research’, rather than try to tailor their questions to pre-existing agendas. And they have a point: when studies are designed with particular ends or categories already in mind, they veer dangerously towards the second view of evidence described earlier. But seeking ‘relevance’ is also about understanding the different contexts in which users, whoever they may be, encounter and engage with research evidence. What I’m arguing for is revisiting how academics plan and practise their public communication.

First, researchers should make room for discussions with stakeholders about values and organisational missions as they design their projects. Second, they should consider how their plans can build participants’ skills and self-confidence in handling research outputs in the course of their work. Third, they should appreciate how the values, skills, and needs already discussed create practical constraints and opportunities. And fourth, since all of these activities depend on the unique mix present in each project, researchers should build in time at the start to develop shared goals and intended outcomes.

When I asked participants what they would say to academics who wanted to work with the public, one researcher who worked with a community-level charity said: ‘it would be about building a relationship. Something should come out of that research which also supports the organisation in terms of providing service or of giving them credit in their involvement’.

So, in a ‘post-facts’ world, is there room for social scientific ‘experts’? I would say yes. But the ways that we communicate and engage with others outside of university settings need to be more patient, appreciative of values, and—perhaps more than ever—empathic to the needs and contexts of those with whom we hope to engage.

This blog post is based on the article ‘Factors that impact how civil society intermediaries perceive evidence’, which is freely available until 31 December 2016. The findings are also available in a briefing. The research was funded by the Toyota Foundation-Japan, through the project ‘Big Data, Big Visions: Challenges and Opportunities for British Civil Society Engagement with Data-Driven Research’.

Further Reading

Boswell, Christina (2009) The Political Uses of Expert Knowledge: Immigration Policy and Social Research, Cambridge: Cambridge University Press

Collins, Harry (2014) Are We All Scientific Experts Now? Cambridge: Polity Press

Davies, William (2016) The Age of Post-Truth Politics, New York Times, 24 August

Oliver, Kathryn, Simon Innvar, Theo Lorenc, Jenny Woodman, and James Thomas (2014) A Systematic Review of Barriers to and Facilitators of the Use of Evidence by Policymakers, BMC Health Services Research 14 (1): 1–12