Capabilities in Academic Policy Engagement (CAPE) is a research project exploring ways to improve engagement between academics and policymakers in England. In close partnership with the Chief Scientific Adviser’s (CSA) Office at the Department for Levelling Up, Housing and Communities (DLUHC), Nesta has launched an immersive, 12-month pilot project that aims to improve the way that evidence and academic expertise are used when policies are created and implemented.
“The ability to make a tight, evidence-rich, fact-based, argument which doesn’t waste words or evade hard choices is critical to effective Government.” – Michael Gove
Over the last decade, and increasingly throughout the Covid-19 pandemic, policymakers have come under scrutiny for the way research and data are gathered, analysed and used when addressing complex problems at both local and national levels. Despite increasing demands for the consideration of evidence to help navigate and tackle complex policy questions, it can be challenging to identify the “right” evidence, or what counts as good or good enough evidence in a particular context. As Geoff Mulgan writes, “there remains a striking imbalance between the science advice available and the capacity to make sense of it”.
We believe that finding better ways to utilise the breadth and depth of evidence and research expertise across England can help reduce uncertainties in decision-making, and ensure that outcomes of policy are more effective in generating a meaningful social impact. Through sustained collaboration between academics and policy professionals, government services can be informed by relevant, timely and authoritative data sources.
Building on the Transforming Evidence Hub’s exploration of academic-policy engagement initiatives, we are looking to understand what makes for effective skills development that can equip policy professionals with the knowledge to overcome the challenge of integrating academic insight into decision-making.
As part of the CAPE pilot, Nesta will be working closely with two policy teams from DLUHC, with the aim of exploring how tools and approaches to improving evidence use, scrutiny and academic engagement can support decision-making at the varying stages of the policymaking process. The pilot uses the Treasury’s ROAMEF policy cycle (Rationale, Objectives, Appraisal, Monitoring, Evaluation, Feedback) as a framework to situate learning objectives – from informing an understanding of the rationale and objectives behind a policy goal, to appraising and monitoring various sources of evidence and progress towards impact, to supporting the evaluation and feedback of policy successes and failures.
What we will do: delivering training
The two participating policy teams will attend five online workshops, each one aligned with different stages of the ROAMEF policy cycle described above. The workshops will be collaborative, inviting representatives from both policy teams to help shape content, learning objectives and activity designs to ensure their applicability and suitability to the complex and ever-evolving nature of policy work. Over the course of the pilot, we’ll explore the tools, processes and techniques that could help to foster stronger relationships and a healthy environment for knowledge exchange.
Over the 12 months, we will look to develop a deeper understanding of the participants’ individual and team-level needs and motivations towards using academic evidence in their work. Each workshop will build on the insights identified in the last, reinforcing learning by layering learning outcomes to strengthen knowledge, skills and attitudes (a technique drawn from Benjamin Bloom’s 1956 taxonomy of educational objectives). Using a combination of traditional classroom methods and simulation-based training, the policy teams will apply the learning directly to their own real-life policy problems, ensuring that the facts, concepts, processes, insights and confidence they acquire are underpinned by their own experience of working in a complex policy environment.
We want to understand whether our training can improve evidence-informed decision-making at the individual, team and organisational level. We hope to build a baseline understanding of how varying evidence sources and modes of academic engagement can be used to inform decision-making across the policy cycle in a way that feels realistic given the complexity of issues facing policymakers.
By exploring different ways that academics and policymakers could communicate – whether it’s about how problems are framed, how evidence is summarised, or the most effective ways to engage – we hope to help teams generate and engage with the myriad evidence sources more confidently. We’re also committed to ensuring that academic engagement moves beyond the “usual suspects” and embeds core equality, diversity and inclusion (EDI) principles throughout.
“Learning is the purpose around which the system is to grow” – Bela Banathy
Through this pilot we are also interested in understanding how sustained, formalised learning can help develop and embed systemic processes for applying research in policy and practice. As A. Best and B.J. Holmes argue in their article “Systems Thinking, Knowledge and Action: Towards Better Models and Methods” (Evidence & Policy: A Journal of Research, Debate and Practice), it is vital to ensure that innovation systems are informed by the best available evidence.
We will be capturing feedback from participants, facilitators and the embedded evaluation team to understand whether course content, materials and activities can motivate policymakers to engage with academia and use evidence when making decisions, and also foster a wider culture of working more collaboratively. By the end of the pilot, we will produce a learning toolkit to share insights on what we’ve seen work to shape the environment for improved academic engagement and evidence use within and across the public sector.