Post-truth was the Oxford Dictionaries word of the year in 2016, and many seem to believe that it accurately describes the current state of the world. I—and we at Sense about Science—do not share that pessimism.

Our daily experience shows that a vast range of citizens, no matter their day jobs, care about evidence and want to see it taken into account by policymakers. Transparency about the evidence used during policy development is a first and necessary step towards improving that use, allowing for assessments of the quality of the evidence and the merits of policies.

We have heard a lot over the past couple of years about how we now live in a post-truth society, in which, according to Oxford Dictionaries, “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. Commentators have suggested that people have lost faith in politicians, institutions and experts. This notion has caused anxiety. But is it warranted?

At Sense about Science we are not so anxious about the public and their relationship with expertise, facts and evidence. We think the real risk is that politicians, institutions and experts might internalise this caricature of the public, and even that some less scrupulous actors may see it as an opportunity to abandon the principles of truthfulness, openness and meaningful transparency.

The public care about evidence

In June 2016, UK government minister Michael Gove was widely quoted as saying that “people in this country have had enough of experts.” (He clarified this statement some time later.) Polling for the Institute for Government found the opposite: 85% of people wanted politicians to consult professionals and experts when making difficult decisions, and 83% wanted government to make decisions based on objective evidence. Trust in experts and confidence in government had both increased since a similar poll two years earlier, and people who voted Leave and Remain in the UK’s referendum on EU membership shared much the same view.

At Sense about Science we hear from lots of people who care about evidence. That’s why we organised events in the UK and European parliaments where citizens told their MPs, MEPs, ministers and officials that evidence matters. At each event over a hundred people came with us, and 15 of them spoke for a minute each about why evidence matters to them. Collectively they told policymakers that they expect: government (or the commission) to use evidence when making policy, ministers (commissioners) to explain their reasoning, and parliament to seek and to scrutinise the reasoning behind policies.

Yet, without transparency about the evidence and reasoning behind policies, the public cannot understand or question policy proposals, and researchers cannot evaluate the evidence the government is using, or help improve on it.

Show your workings

In November 2016 we published the results of a preliminary review of the UK government’s transparency about the evidence behind its policies. The review, the first of its kind, highlighted good and bad practice across government.

Produced in collaboration with the Institute for Government and the Alliance for Useful Evidence, the resulting report set out a draft transparency framework: an approach to testing evidence transparency that could be applied rapidly, did not require subject expertise, gave meaningful results and allowed comparison between different policy areas.

Put simply, the question we asked of 12 government departments was: could someone outside government work out what you’re proposing to do, and why? The evidence transparency framework looks at this across the following areas:

  • Diagnosis – the issue that will be addressed.
  • Proposal – the government’s chosen intervention.
  • Implementation – how the intervention will be introduced and run.
  • Testing and evaluation – how we will know if the policy has worked or, in the case of consultations and further investigations, how the information gathered will be used.

Under these headings, scorers were asked to consider: Can you tell what evidence has been used? Can you tell how the government has assessed or used this evidence?

None of this work aimed to examine the quality of evidence used, or the merits of any particular policy. The sole objective was to quantify the level of transparency around policy evidence, design, implementation and evaluation.

In Show your Workings, the Institute for Government, Sense about Science and the Alliance for Useful Evidence established that in order to evaluate policy evidence and the effectiveness of initiatives to improve it, government’s use of evidence needs to be more transparent.

In other words, transparency is “a first and necessary step in enabling the quality of a department’s evidence-based decision-making to be judged”. A well-founded policy and a poorly-founded policy may both score well for transparency; a transparent evidence base enables an informed conversation to determine which is which.

In our reports we defined a policy as a specific intervention to change the status quo – what the public would intuitively think of as a policy, and the usual way that policies are presented in announcements. We typically assessed only the documents available at the point when the government first set out a policy publicly. This is the point at which the public, researchers, parliament and the media first have the chance to assess a new proposal, and when it is important that the government exposes the evidence behind its initial thinking in order to promote informed engagement.

The volunteer scorers for the spot check came from all walks of life – many of them were early career researchers. Their experience with policy ranged from some to none at all.

How transparent are government departments?

We have since spot checked the same 12 departments, to find out how they were faring one year on. In January, we published Transparency of evidence: a spot check of government policy proposals July 2016 to July 2017, including the scores for 94 randomly selected policies proposed by the 12 departments (6-8 policies per department).

The good news is that clarity about the evidence behind policy appears to be improving. We have seen some real gains in evidence transparency in the year since our preliminary review.

We found the best and most consistently high scoring examples of sharing policy evidence in the Department for Transport, the Department for Business, Energy and Industrial Strategy, the Department of Health (now Department of Health and Social Care), the Department for Environment, Food and Rural Affairs and the Department for Work and Pensions.

The highest scoring policy was the Department for Transport’s ‘Cutting down on noise from night flights’, which scored top marks against every section of the framework and was commended by scorers for a range of innovative ways of presenting the government’s thinking and how the department had arrived at conclusions.

The lowest scoring policy was the Department for Education’s ‘Modern foreign languages A and AS level content’, which proposed removing the speaking assessments from some examinations. This was given 0 for every section of the framework. Scorers could form no idea of what the policy was based on. Equally low-scoring was the Department for Digital, Culture, Media and Sport’s ‘Public services incubator for small charities’, which was set out in a press release.

Room for improvement

People should be able to follow the thinking between Diagnosis, Proposal, Implementation and Testing and Evaluation. The most transparent proposals demonstrated a clear chain of reasoning as to what the problem was and why the policy was the chosen response, and included discussion about the limitations of the evidence.

It would be a mistake to imagine that a document crowded with references or extensive extracts is better grounded or more transparent. References should be meaningful and useful, to enable the reader to understand how the source is relevant and to enable them to assess that source for themselves if they wish. Scorers gave some of the most straightforward proposals, such as the Home Office’s ‘New fees for firearms licences’, some of the highest marks for their clarity and simplicity.

The most alarming finding in 2016 was the waste of evidence that departments had gathered but never published. Encouragingly, the sharing of completed work was the strongest improvement we observed in the spot check. Referencing also seemed to be more common. However, many of the policies that scored poorly did so because of vague referencing that gave no indication of which point a department was referring to, or of its relevance to the policy proposal.

Scores for the Testing and Evaluation section of the framework were the most disappointing, and showed little improvement since 2016. There is a lot of scope to improve descriptions of plans for testing and evaluation and, in the case of consultations, of how departments will use the responses they gather.

A balancing act

Policymaking involves a balance of politics, values and pragmatism. Governments can and do propose policies that are predominantly expressions of values, such as fairness. In these cases, there may be less of a role for evidence. We considered policies to be transparent when they were clear about being based on values, and when any testable claims were supported by evidence. Compare “Hosting the Olympics will be an exciting and prestigious thing for our city to do” with “Hosting the Olympics will encourage more people to take up sport and get more exercise, and there will be a net financial benefit from increased tourism and investment”: only the second is a testable claim.

Sometimes, governments must act when the evidence base is weak or absent. Policymakers may not have the luxury of waiting until those gaps have been filled before introducing proposals. The most transparent policy documents were those that acknowledged this and explained how the department would fill the gap or evaluate the policy at a later point.

A case for transparency

Without clarity about what evidence the government has looked at, it is very difficult for citizens to understand the motivations for policy, decide whether they agree with it, participate, or consider whether it is working. Researchers and specialist contributors can’t see what they could add; and government is less able to build on its own past work, let alone determine whether initiatives such as the What Works centres are improving the evidence base for policy. A transparent chain of reasoning is vital to all of this.

Despite commitments to publish the evidence behind policy, it is still a negotiated issue. During this review, departments variously told us that a policy was not typical because it: was at consultation stage, was dropped, became the focus of public debate, was not a focus of public debate, was developed jointly with other departments, derived from manifesto commitments, was announced in the Budget, was low priority, concerned a specific group of specialists, had to be done in a rush, was inherited from a previous government...

Perhaps further improvement now depends on acknowledging that there just are no ‘normal’ circumstances for policymaking, and that showing the workings—being clear about the chain of reasoning behind proposals—applies to all situations. Certainly improved publication of evidence depends on greater trust: trust among colleagues in government that they know what to publish, and greater trust in the public’s ability to handle the fact that policy evidence is rarely complete and definitive.

Note: The report “Transparency of evidence: a spot check of government policy proposals July 2016 to July 2017” was produced by Sense about Science in 2017–2018 in partnership with the Institute for Government. The research was supported by a grant from the Nuffield Foundation and additional funding from the Alliance for Useful Evidence, but the views expressed are those of the authors and not necessarily those of the Foundation or the Alliance. Sense about Science has final responsibility for the content.

Note: All articles on the ANGLE website are published under a CC BY-ND license.