Top Ten lessons learned from the Forum’s efforts to support evidence informed policymaking

– [John] So, thanks very much
everybody for joining us. So this is a presentation
by myself that, of course, reflects the insights from our fantastic team at The Forum as well
as insights that we’ve garnered working in
collaboration with many partners around the world over the years. We added this webinar
to the series because we realised that while I and colleagues like Ewan Moats and others have given this or similar talks in a number of different settings, we've never actually done it at The Forum. So we wanted to take advantage
of the webinar series to share the lessons learned. If I move on to the next slide, I'm gonna come back
to this summary slide after I’ve walked through the 10 lessons. But these are the 10 lessons that we’ve picked up over the years. I think of these as being
the substantive lessons that we’ve learned from our
own work and collaborating and watching closely the
work of our partners. We’ve also, I think, learned how to better communicate our work, and I have one slide about that after the 10 lessons. We’ve also learned a lot
about how to function as an organisation from
a more operational basis as a group that tries
to be very responsive to the needs emerging
from the system and to work on very tight timelines. I’m not gonna talk about those operational lessons learned today,
but we do have a new initiative starting with
partners in 14 other countries where we're documenting those. We're up to nine lessons learned right now, but we anticipate that
we’ll enrich those lessons and I’m sure we’ll end
up at 10 very quickly. The other thing that I’m
gonna do when I finish the 10 lessons is talk
about four new directions that we’re heading. So these are more emergent
areas and we haven’t yet derived lessons from
them, but we hope that if we were to give this
webinar in a few years they will have lessons emerging from those four directions that
we are focusing on now. So onto the lessons learned. I typically have one, at most two, slides for each of these. The first is be clear about the goal. And in a few past webinars,
we’ve talked about this goal. But I want to go back
to it for those of you even who’ve seen it before
because of how we have come to realise that to achieve this goal, we really need analytical
capacity in three areas, which you’ll see at the
bottom of the slide. We need it in policy analysis, which is the top part that
I’ll come to in a minute. We need it in the area
of political analysis, which is the bottom part,
the second bullet point. And we need to be able
to do systems analysis, which comes in primarily
in the options workup. So that realisation has
come to us relatively recently that in fact when
we think about this goal, it means we need three
very distinct types of analytical capacity. But to come back to the
goal, what we think that we're trying to support when we talk about evidence informed policymaking is using the best available data and research evidence, systematically and transparently
in the time available. Sometimes that will be minutes and hours, other times days and weeks,
other times months and years. And we need to do it in four areas. Better understanding
problems or feeding into agenda setting, better
helping people to choose among options, which is
the policy or programme development phase,
ensuring that the chosen option makes an optimal
impact at acceptable cost. That’s feeding into implementation. And then looking at
monitoring and evaluation in terms of what needs to be done. So that we consider to be
policy analysis and our goal is to try to get best
evidence, citizen values, and stakeholder insights into
that type of an analysis. The second part of the
definition recognises that we’re trying to influence
policy, which in most countries means that we’re
subjected to democratic forces where different
groups are elected at different times to make tough decisions on our collective behalf. And those elected
politicians are making tough decisions under an array of
institutional constraints facing interest group
pressure, considering citizen values, and many
other sources of ideas. So in other words, at the end of the day, policymaking is inherently political. And we think that it’s very
important to be able to do political analyses as
an adjunct to the policy analysis so that you can
better identify windows of opportunity, what might
make sense at particular times, and try to therefore
have a greater impact. And then I mentioned
already that the systems analysis comes in particularly helpful in that options workup
by helping us better understand what’s currently
functioning or not in given systems, the
governance, financial, and delivery arrangements, and how we can intervene differently
with those arrangements to improve systems. So that’s the first lesson learned, that we need to be clear about the goal and for us that we need
analytical capacities in those three very distinct areas. The second lesson learned
is that we need to learn and use a systematic
approach to analysing priority issues. I failed to mention at the
beginning that while I'm looking at 10 lessons, I'm also singling out five of the methods with which we've got the greatest traction. And one of them here is
workshops for policymakers on how to fit evidence
into their traditional approach to analysing priority issues. So we find it helpful to unpack problems and their causes, options,
and implementation, each separately. We also find it helpful
when trying to understand problems to think about
risk factors or conditions separate from whether
we’re currently using the right programme services or products and those two things in turn separate from how their system is currently organised in terms of existing
governance, financial, and delivery arrangements. Where the literature exists,
how it’s talked about, and so on varies according
to which of those you’re dealing with, and
we find it helpful to dissect out problems and
their causes in those ways. When we come to options,
you can either be thinking about adding, dropping,
or changing programmes, services, or products
or changing governance, financial, and delivery arrangements. The Forum is particularly
focused on that second bullet. There are groups that are
out there in health systems around the world that
do an excellent job of helping to choose among
clinical programmes and services or clinical products like drugs. There are other groups
that are more focused on public health programmes and services. Our focus is much more
how do we help systems to organise themselves to
get the right programmes, services, and products to people. And then finally, on the
implementation front, we find it helpful to
think about once a course of action is chosen, how
do we diagnose the current drivers of what’s happening
in the system that will influence whether
that option gets taken up and then how would one design and deliver an implementation
strategy to make sure that that option makes a
difference on the ground and affects the people whose lives we're trying to influence. So that's our second lesson learned. Learn and use a systematic approach. And we primarily operationalise
that through workshops. Third is look for the right
types of research evidence. I’m not gonna go into detail about this. But the research literature
is very fragmented. You have people that built
entire careers on answering particular types of
questions that might give us insight into some aspects
of policy problems or they might give us insight into how we think about particular features of options or they might be particularly helpful on the implementation front. So what I’ve done in this
slide is to give you some of the language that we
think about when we’re trying to understand problems,
options, or implementation and then the types of
jargon that’s used in the research world when people
are trying to answer those types of questions. And our view is that good
evidence informed policymaking is drawing on all of these
types of research evidence. If I now move on to the next slide, still on lesson three, just a continuation. Some additional points here in terms of looking for the right types of evidence. Our view over the years
has been that systematic reviews should be where we look to first for answers to those questions. And I define what I mean
here by systematic review. We think it’s very
important that policymakers be helped to rapidly
identify the quality of the review to understand
where the included studies were conducted to help them
assess the applicability of the evidence and be
able to rapidly pull out the key messages. And we think that in order
to do that, and this leads into the fourth lesson
that I’m gonna talk about, we think it’s very important
that policymakers know the right one stop shop
to find pre-appraised evidence and evidence
that flags this type of decision relevant information so they can make use of it very quickly. So these one stop shops
can give them what they need quickly and comprehensively. But also these one stop
shops, if they go to them preferentially, they will be reassured that there's a very good chance they're not missing evidence that might be sitting in other databases, because typically these one stop shops are searching all of the usual suspects in terms of sources of research evidence, so they have done the work for you. If I now talk about
looking in the right places for research evidence. I’m gonna talk in particular
about another one of our methods, Health Systems Evidence, but first I’ll mention
that sometimes the Forum addresses what we know
about clinical topics. Which are the right programmes,
services, and products? When we need to do that type of work, our first go to resource is ACCESSSS. It provides only high quality studies. So it looks at the
literature being published in high profile journals. It extracts only those that
meet minimum quality standards. And that’s very helpful as
a first point of contact with the literature about clinical topics. When we think about public
health, we go to the one source of pre-appraised, synthesised research evidence in that
space, Health Evidence. That helps us with
questions about benefits and harms in public programmes and services. But when we’re in our
space that we work in 90% or 95% of the time,
which has to do with how we organise ourselves to get the right programmes, services,
and products to people, we go to Health Systems Evidence, which includes quality
rated, systematic reviews. It allows for filters
that allow you to search the literature in a number of ways. It provides links to
user friendly summaries. You can save documents
as well as searches. You can sign up for
monthly evidence services and it’s available for searching in five different languages. So lesson learned number four,
look in the right places. When we run workshops for
policymakers and we ask them to be honest and tell
us where they currently go, the answer is almost uniformly Google. Our view is that that is making your
life exceptionally difficult and if you’re trying
to answer the questions in the slide about lesson learned three, you’ll see that you need
to look for many types of evidence, and these
one stop shops like HSE are very helpful. Continuing on this fourth lesson learned, we’ve increasingly found
ourselves working at the intersection between
health and other sectors. And when we began working in this space, we were frustrated by the
lack of a one stop shop. So one of our recent big
areas of focus has been to develop a new one stop shop called Social Systems Evidence,
which contains quality rated reviews in 16
government, programme, and service areas. In the box on the right,
you’ll see the spectrum of areas that we currently cover. For those of you who know
the sustainable development goals, we’re currently
covering all of them except for parts of the one related to health, which is covered by the databases on the slide that I just went through, part of seven, which is about energy, and the three environmental goals about climate, land, and water. But we're fairly close
and hope to be successful in reaching an agreement
very soon with a partner that will allow us to
fill those remaining gaps in the evidence, in which
case we’ll soon be at a point where we have a
one stop shop for all of the pre-appraised
synthesised research evidence about the SDGs except in the health space, because those are so well covered by the three databases that I just mentioned. This database will function very, very similarly to Health Systems Evidence,
but we here won’t just cover the governance, financial,
and delivery arrangements. We’ll also cover the programme
services and products. So if I now move on to
lesson five from the Forum's now almost 10 years' experience in supporting evidence informed policymaking, this one is that we need to package the
best available evidence in the right format and
on the right timeline. And here I’ve singled out
one of the four examples, rapid syntheses, as a
method that is getting significant traction. At the bottom, you’ll
see that I’ve defined best available as being
the highest quality and most locally applicable evidence. But here for packaging,
I give you four examples. User friendly summaries of the research. To our knowledge, there are
eight groups in the world that are producing summaries
of systematic reviews. All of those groups, or as many as have produced a summary, are linked to
from Health Systems Evidence and Social Systems Evidence. Rapid syntheses are
something that we started at the Forum three or four years ago. Originally we provided them in three, 10, or 30 business days. We now provide even more
comprehensive syntheses on timelines of 60 and 90 days. Another lesson learned
within this lesson five is that we are increasingly being asked to summarise not just the
best available evidence but also to conduct what we sometimes call jurisdictional scans to better understand who else is doing what
in other jurisdictions. We also, though, look for evidence, because it could be other
jurisdictions are doing very interesting things,
but there's absolutely no evidence that those other things are achieving better outcomes at lower cost. So those are our rapid syntheses. We also produce evidence
briefs, but they are for us, not an end in themselves. They are input to a stakeholder dialogue, which I’ll talk about under
the next lesson learned. And these evidence briefs
try to address that full range of questions
that I mentioned when I was talking about lesson three. So they provide the
health and the political system context. They describe a problem and its causes. They present what's known about three options to address the problem, although what we've found over the years is that increasingly, we're presenting not three
mutually exclusive options, but instead three
elements of a potentially comprehensive approach that could be given more or less emphasis,
that could be sequenced in different ways, so
there are still significant choices to be made even
with an elements approach rather than an options approach. And then finally, key
implementation considerations. So these evidence briefs
are the full summary of the broad array of questions
or answers to the broad array of questions that
need to be answered. And more recently, we’ve
added citizen briefs that fulfil a very similar function. But here the target
audience is citizens rather than stakeholders in the
system that might have more technical background in the area and more facility with the
language that’s used in the sector, so this
is an effort to explain in more plain language what the issues are and what's known on a topic. Here too, they're not an end in themselves. They're an input to a citizen panel, which I'll come to next. Lesson learned six, which
is another one of our more recently learned lessons. So it’s only been a few
years that we have had a citizen panels programme at the Forum. But we came to realise
with our stakeholder dialogues that, try as hard as we could, we were not successful in bringing diverse citizen voices to the table through our stakeholder dialogue programme. We always have citizens in the room, but no matter how much
preparatory work that we put into it, no matter how much support we provide to the individuals,
it is extremely difficult as a citizen
to sit around a table with top government
officials, the heads of nursing and medical
associations, the heads of a variety of institutions like hospitals, and hold your own in that
conversation and add value. So while we still involve
citizens in our dialogues, we now have standalone
citizen panels that have features that are in
many ways similar to the dialogues that we’ve run
for now almost 10 years. Some of the ways that
they’re different from our dialogues are attentiveness to diversity. So we have a set of six
criteria for which we always seek diversity. So those are gender,
ethno-cultural background, and socio-economic status,
to give three examples. But there are also
criteria or characteristics that are typically unique
to the topic at hand. We’re typically looking
for people who have lived experience as an
individual or a service recipient, as a family member,
or as a community member with the topic at hand. The other ways that our
panels are different from our dialogues are that our
focus is much more on values. What are the values that citizens believe should guide decision
making on the topic at hand? So with our stakeholder
dialogues, when we’re dealing with groups that have a lot
more background knowledge about how the health
or social system works and much more of a
technical facility with the issues at hand, we can get
into much more specific insights about what needs
to be changed and how. With citizens, though, what is so powerful is hearing from them what values they believe should drive decision making. So that's lesson learned
six, a more recent lesson for us about the power of citizen voices in influencing policy processes. Lesson seven is one that
we learned in the very early days of the Forum,
and in fact, it in some ways was a lesson learned
before the Forum was created and contributed to the
founding of the Forum. This was our first signature programme, our stakeholder dialogues programme. So I mentioned before a
number of shared features with citizen panels. Some differences are we
have a separate deliberation at the end of the dialogue,
which we don’t have with our panels, where
we discuss who could do what differently. That’s the first of the asterisked points in the left hand column. Here it’s informed by an evidence brief rather than a citizen brief. Here the discussion is
informed by a discussion of all of the factors that
will influence decision making. So the politics of the
situation are very much a part of the discussion. We convene both those who will be involved in decision making and
those who will be affected by those decisions. I mentioned before that includes citizens, but it also includes
heads of organisations, not-for-profit agencies,
regional health authority directors, and so on. Many of the other attributes
though are similar. So that’s our seventh lesson learned. One other comment about
this lesson is we have over time come to find kind
of a natural time frame for these, which is
roughly in the 20, 22 week time period from a call to a decision. The fastest we have ever
been able to do this is seven weeks from a
call from a top government official to, in this
case, a cabinet decision. But if we’re not under
that kind of tremendous time pressure, we find
that a natural time frame that we can work on very
easily without in quotation marks “breaking a sweat” would
be in a 20 to 22 week range. So this is in contrast to the one stop shops, like Health Systems Evidence
or Social Systems Evidence, where you get your answer
in seconds or minutes. The rapid response service,
which provides answers in three, 10, 30, or now 60 and 90 days, and the stakeholder
dialogues, on the other hand, typically in the 20 to 22 week range. And if we add in one
or more citizen panels, that adds in about three or
four weeks to the timeline. So if I now move on to lesson eight, lesson eight is about
using the resulting story to drive change. So in the early days of
the Forum, we were very focused on evidence and
all of our communication was about the importance of evidence. We still think evidence
is absolutely essential, and evidence informed deliberations about values, which is what we try and do through our citizen panel programme, and evidence and values informed insights from stakeholders,
which is what we try and elicit from our stakeholder dialogues, are both absolutely key. But we find that combination
of the research evidence from the evidence brief
and the citizen brief, citizen values from the citizen panels, and stakeholders' experiential
and tacit knowledge hugely helpful to policymakers. And it provides a compelling
story that can often drive change at multiple
levels, from high level policymakers, where
they need to be convinced there’s a compelling problem. There are thousands of
problems out there they could focus on. Is this one compelling? That there’s a workable policy option and that the politics are right for action at this particular moment in time. But that combination of
evidence, values, and insights are also helpful for
people who are commissioning services, leading organisations,
working in different types of associations as well as for client and citizen groups. So that’s lesson eight. Lesson nine is moving
from more the supply side, us outside government, to the demand side. And we give a number of examples here. The one that I remain
the most intrigued by is the research evidence
checklist that we’ve seen used in some jurisdictions
which needs to be completed before briefing
materials are submitted to a minister or to cabinet. Hugely helpful in causing
people to stand back and say, can we succinctly
describe how we found and used evidence to
inform our understanding of a problem, of options,
and of implementation. And then can we be clear
about which databases we searched and what type
of evidence, for example, systematic reviews of effects,
we found to inform this work. And if, for example, someone were asked to review the literature or to submit a briefing to a minister about a particular health system arrangement and had searched for research evidence in an ad hoc way, it would be a good
opportunity to push it back and request that they go to a source that people understand to be comprehensive to make sure that nothing was missed. I won’t go through the other examples, but you’ll see other
examples of how groups are trying to institutionalise
the use of evidence in policy environments. Lesson 10 is about evaluating innovations and making adjustments as needed. So we can point to many
examples of direct impacts on the policy process,
including, as I mentioned before, seven weeks from a call
to a cabinet decision. We have huge amounts of
evidence about how briefs and dialogues lead to
strong intentions to act. We also have large amounts
of evidence from the Forum as well as from our
partners in many other countries about high rankings
of the design features for things like our briefs and dialogues as well as for now our citizen panels. And one of the interesting
things is we see a virtuous cycle of more
evidence informed policymaking, leading to more evidence
informed interest group pressure and more policy relevant research. So those are our 10 lessons
learned at the Forum. Some have stood the test
of time for almost a decade, such as lesson number seven. Others for us are newer. Lessons like lesson six,
the importance of eliciting citizen values to drive
the policy process. Also about the need for
rapid syntheses to kind of come up the middle between
the self-serve one stop shop, like Health Systems Evidence, and the more full-serve stakeholder
dialogue, typically informed by citizen panels. And others we’ve come to
realise that we need to talk about things in different ways, like lesson eight, talking about the importance of best evidence, citizen values, and stakeholder insights. So those are our lessons learned. As I mentioned at the
beginning of the webinar, these are really the substantive
lessons that we’ve learned. We’ve also come to realise
because of work that we’re now doing with 14 partners
in other countries that we have not done as good a job in documenting lessons learned on the operational side. These are things that we have learned the hard way over time that need to be done if you're trying to be an organisation that is responsive on a wide variety of timelines, that constantly has to make sure that it has the resources to keep a well-trained staff
in place and the list goes on. So we currently have a list
of nine lessons learned, but we’re going to be
continuing to work on that list with our partners to
better document the lessons learned on the operational side. Another thing that I said
at the beginning of the webinar that I was just
going to quickly touch on, lessons learned about how
we communicate what we do. And so years ago at the
Forum we went through a process to re-conceptualise
how we talk about what we do. And we now, if you were
to go to our website, have five gateways into what we do. Learn how, which is where things
like our workshops fit in. Find evidence, which is
where our one stop shops, our rapid syntheses, and our
two types of briefs fit in. Spark insights, which is
where our citizen panels and our stakeholder dialogues come in. Then I’m gonna jump down
one to embed support. So that’s another area
that we’re trying to give much greater attention
to, to help organisations institutionalise the use
of research evidence. And the fifth area is
evaluate innovations. You’ll see two other things on this slide. One is the numbers of
different types of products or workshops that we’ve done. And the other thing that
I was gonna point out was how this links to the lessons learned. So the only lesson learned
that we don’t have a distinct gateway for is
use the resulting story to drive change, that fourth bullet point. And the reason for that
is it’s so embedded in the other points that
we haven’t singled it out. But that’s a lesson learned
on the communications front about how we now
talk about our work. And it’s been a very
long time since you would have heard us use the jargon
of knowledge translation. So what we are instead
focusing on are these very tangible areas that we’re trying to provide assistance in. So the final comments
that I’ll make before I open it up for questions
are directions we’re moving in now. So in a way, these are emergent lessons. Areas where we recognise
we need to do better and in some cases that we’re taking steps towards them. In other cases, they’re planned steps. So the first direction,
which we’ve actually been moving on for some time
now, but we’re formalising in a more explicit way,
is engaging citizens. So you’ll see in blue
lower down, citizen briefs and citizen panels, which
I’ve spoken about before. But we’re now also starting to develop and provide workshops for
citizens about how to find and use evidence
and how to understand and strengthen health and social systems. So we’ve increasingly
come to the view that if we’re trying to
really make sure that we get the right programmes,
services, and products to the people who need
them, we desperately need to have citizens pushing
for those changes themselves where we are in a
supporting role and they are in the lead role. And these workshops are, for us, the beginning of collaborations with different citizens to try and work out how we can best support them in developing and sharing the capacity to do this. Another area that we focused
on, this time in partnership with four other groups at
McMaster, is in developing and supporting the use
of the McMaster Optimal Ageing Portal, which
provides citizen targeted best evidence. And there are four types
of content in the portal, all targeted at citizens,
all helping them either make better informed decisions
about their own health, self management, or treatment choices, or helping them understand
what they should be or could be pushing
for at the policy level to make sure that the
right programmes, services, and products are available
for people like themselves. So that’s direction one,
a direction we’ve been moving in for some time
and plan to keep up. A second direction, which is newer for us, and it's work that we're very keen to pursue in Canada, but we're also working with partners in a number of different countries to learn from them about how they are doing this, is joining up the ecosystems of policy supporting organisations. So here what I've done
in the left column is provide labels for groups. Not all groups would
recognise themselves from those labels, but we think
of there being at least seven groups out there
that are aiming to support policy in some form or another. Data analytics groups, guideline groups, technology assessment
groups, modellers, and so on. Each of them typically
focuses on one or two phases of the policymaking
process, even though they sometimes don't talk
about it in that way. And they more often focus
on programmes, services, and products than they
do health and social system arrangements. But our view is that if
you’re sitting in government and committed to supporting
evidence informed policymaking, these
are the types of people who you either should be reaching out to or are now knocking on your
door with offers of help. But none of them have the solutions alone. And we have come to
the realisation that we, like these other groups,
are part of a solution and that the future
needs to include far more effort to join up these
ecosystems within a given province or state
or country so that people are playing to their
comparative advantages but also involving other people who bring additional comparative
advantages to the table. So this is a more
emergent direction for us. A third direction that
we’re moving in now is adopting, piloting, and
iteratively revising these methods that I’ve
talked about before as we move from our
historic focus on the health sector to now a broader focus on health and social systems. So we now have an initiative
at the Forum called Forum+ for our work outside of health. And we’re now working
in partnership with a network of 14 country
teams that are taking these approaches, adapting them, piloting them, and iteratively revising
them as we get better experience with them in
the education sector, in the justice sector, in
the environmental sector. So this is for us a very
exciting area to try to understand how we need to
adapt, pilot, and revise these approaches as we try to use them in other sectors. So this is a very emergent
area, but another lesson learned is that we simply
can’t take the methods we’ve been using and apply
them without adaptation in other sectors. The final direction
that we’re heading in is also relatively new. We were asked by a provincial
government in Canada to do some work, a rapid
synthesis on rapid learning health systems. This essentially turned
on a light for many of us. We realised that this is
perhaps the most exciting framework that we’ve
encountered that supports the type of bridge building
that we talked about before and that constantly asks how we can prioritise moving the dial in ways that make a tangible difference for citizens and clients. And use all of these
insights about evidence, about data, about aligned
governance, financial, and delivery arrangements. So what you see on this
slide is the definition that we use, and in the
numbers are the seven characteristics of a rapid learning health or social system. So we’re increasingly using
this frame in all of our work. And it could be that within a few years, we’ve moved away entirely
from discussions of supporting evidence informed policymaking to language around rapid learning systems. For now we’re using both sets of language in a transitional phase,
but we’re finding this framework to be hugely
helpful to nest our work within a bigger strategy
of making a tangible difference to clients or
patients at the front line. So this for us is a much
more recent lesson learned about new directions. So those are the 10 lessons learned. To summarise again, they’re
the more substantive 10 lessons that we’ve
learned over the years. Some have been around for a long time. Some are more recent. I mentioned we’ve also
been starting to document operational lessons
learned, and perhaps in a few years we’ll have
a webinar specifically focused on that. I had one slide about
communication lessons learned. So we now talk about the
five areas where we do work, our five gateways. And the final thing that I touched on were our lessons learned about where we need to move in the future, and
I’ve given four examples of paths that we’re either
currently on and have been on for a year or two or that we’re embarking on right now. This final slide is resources. If you wanna see any more
information or find any information about what
I’ve been talking about, you can go to some of these sites. I’ll just point out two
things that are in blue. Three things. One is our resources. We have a number of one
or two page summary sheets that cover a lot of key
insights from our work that you can access through our website. Second, the rapid synthesis we did. We’re currently in the
process of doing a national review on this topic, covering every Canadian provincial and territorial health system. But for now what’s available
is the Ontario work. And the final thing is my Twitter handle at the bottom, @ForumHSS, HSS standing for Health and Social Systems. So I think that that’s it. I’ll skip this extra slide. It’s just there in case I needed it. So I’ll turn it back over to Steve for moderating the questions. – [Steve] Great, thanks very much, John, for your insights and
sharing those lessons learned from our work
in supporting evidence informed policymaking. We do have some questions
that have come in, but I would certainly
encourage anybody else if you have anything
that you’d like to share or other questions to
please use the chat box to ask those. So the first set of
questions were actually from Olivia Dearman. She asked whether we
think these lessons apply in the context of low and
middle income countries. And then secondly also,
what types of research methods could be used more to investigate evidence informed policymaking
and the lessons learned? – [John] So great, so hi Olivia. Good to see your name and
thank you for the questions. So the lessons learned
absolutely, I said they’re from the Forum, but of course
the Forum has learned a huge amount from our
partnerships with evidence informed policy networks in a
range of different countries and we have with our
collaborators talked about these lessons many, many times. We have a workshop coming
up on Saturday and Sunday in Johannesburg where we’ll
be with our 14 partners. And this set of lessons
is one of the first slides to check in with them
again that this reflects their experiences. To be honest, the lessons that emerged more from them, and that we then acted on, include lesson five, rapid syntheses. So we were, of course,
aware of the importance of timely syntheses,
but it was the practical experience of our colleagues
in Uganda, Cameroon, and Burkina Faso and seeing
how policymakers in those three African countries
embraced rapid syntheses as being so useful to
their work that led us to undertake a similar programme. And maybe the one that
has had less opportunity for experimentation in
low and middle income countries is citizen panels. So we only at this point
know of one partner, the policy centre at the
American University of Beirut, who is using these in
the ways that we mean citizen panels here. So lesson five perhaps came more from low and middle income countries, and we have embraced it and run with it. Lesson six, perhaps, came more from us. But our three year partnership with the 14 country teams that I mentioned before will allow us to experiment with this approach to citizen panels in a broader way with low and middle income countries. So I’ll have to get back to you about six. But the others
certainly resonate with our partners in low
and middle income countries. Then in terms of your next
question, which Steve, can you just remind me
it was about what types of methods should we
be using in evaluating whether we’re making a
difference in evidence informed policy? – [Steve] What types of
research methods should be used more to investigate
evidence informed policymaking and the lessons learned? – [John] Yeah, so tough one. I mean, the difficulty to
me with evaluation in this space is that the gold
standard for evaluating impact is remarkably labour
intensive case studies of policymaking processes
where one can have the key informant interviews,
the media analysis, the documentary analysis
that provides you with a very nuanced understanding
of whether and how the insights from a variety
of types of evidence influence the policy
process and also how a particular dialogue or a particular panel shaped thinking. And part of that is because
we know that evidence can change how people think about problems, which is a more conceptual use of evidence; it can inform very narrowly defined decisions, which would be a more instrumental use. And other times evidence
can be used after the fact to justify decisions
made for other reasons, which we can call a
political use of evidence. And without that gold standard approach, it’s very hard to capture all of those. So where we have come
to is when we are doing more evaluative research to try to find intermediate outcomes,
and the one that we’ve given the most attention
to has been in the area of the theory of planned behaviour, people’s intentions to
act on what was learned, which correlates quite
well with actual behaviour, but also we collect
information on the three conditions under which
intentions typically translate into actions. So we have a lot of different
methods that we use. But what’s the most
difficult is finding easy to measure outcomes that can be looked at in both an intervention
and a control group. And so we have some
lovely recently published examples, like the SPIRIT trial out of Australia, where people have tried to do this in a randomised trial context.
labour intensive process that they went through
with their outcome measures and hard to imagine
reproducing that at scale across contexts and issues, which is where we think the future needs to go: understanding in what contexts and for what issues particular approaches or methods get traction. So I wish I had great answers,
but it’s a complex field. It’s always gonna end
up being multi method. And it’s gonna require
us to make a variety of different types of trade offs. But that’s where we are in our thinking and what we continue to draw on as we, for example, embark on this
new 14 country partnership. – [Steve] Great, thanks John. We have a number of other
questions to get through here. – [John] Okay, I’ll be faster. – [Steve] Yeah, no, that’s good. It’s good to have lots of questions. So there’s a couple that
have come through about the citizen panel programme. So they’re similar but a
slightly different take. So one asks how we might
respond when evidence and citizen values are at odds. And in a similar kind of
vein, somebody asked about how to manage when the
values are very diverse and polarised. – [John] Absolutely, so great questions. I suppose an easy thing
about not being the elected politician is
that I can say at the end of the day, it’s
those individuals that have to make these difficult trade offs. They’re the ones that are elected. We see our role as being very systematic in our approach to synthesising the evidence and to eliciting the citizen values and eliciting
the stakeholder insights and to present them in a way
that is very transparent. But at no point do we
give recommendations about how to balance those. And certainly not when they are competing. So we just try to be very transparent, put it in front of people, and then at the end of the day, they’re the ones that were elected and have to make
the difficult decisions. So in a way, you could
argue that I’m punting it, that I’m not taking
responsibility, but I also feel that our responsibility
and our role as what we hope are credible
intermediaries between the world of evidence and policymaking is to bring to the
table insights that come from the use of systematic
methods applied and reported in ways that
are very transparent. And that means not adding
our own advice about how to balance those inputs
when they’re conflicting. On the polarisation front,
I mean, the interesting thing, I suppose, is virtually
all of our programmes operate on a purely responsive basis. We typically are not the
ones that are picking topics. And we have not been
asked to take on issues that are highly polarising. And that probably simply
reflects the incredible experience of the many people who otherwise come to us, or who come to us for other topics, who recognise that when issues are highly polarised, as with abortion and some other issues, the probability of insights emerging from evidence that would help people come to more of a shared view about where we need to go is extremely low. So we have typically
not been invited in to address issues where
there exists already a tremendous degree of polarisation. We of course encounter
polarisation sometimes, though not extreme; we encounter some degree of polarisation at citizen panels and stakeholder dialogues. And that’s why one of
the features of both is we don’t aim for consensus. We are not trying to put
square pegs in round holes and make people agree to
things and make compromises. We’re trying to capture areas of agreement where they exist and to
document areas of divergence and if possible the
underlying rationales for that divergence when that is the case. And again, from a politician’s
perspective, that’s helpful. They need to know, are
people on the same page about the problem and its causes? Do they agree about the viability
of particular solutions? And then for them are the politics right? And sometimes, and we’re
seeing this increasingly in the United States and
in Ontario, politicians seem to be seeking out
policy issues that are polarising, but typically
that’s not their preference. But at the end of the
day, it’s up to them to figure out how to do that. So that’s how we deal. That’s our experience with polarisation, that we’re often not pulled in and when we see it,
we’re trying to document and report it systematically. But we don’t think our methods are gonna be the perfect solution for highly polarised situations; where nothing else has worked, it’s unlikely these would get much traction either.
about the citizen panels about whether the outcomes
are made available to decision makers. So I would just say that
all of our products, including the citizen
briefs as well as the citizen panel summaries
and our dialogue products are all available on our website. So under the find
evidence and our products section of the website. So they’re all made available there. And then in terms of the
panels, when we’re doing those, as part of the feedback that John mentioned, we certainly provide the citizen summary of the discussions to those involved.
of resources that was asked, John, is whether
the research evidence checklist or an example
is available somewhere that you could point people to. – [John] Yeah, so the one,
I can’t point people to it off the top of my head, but
we’ll figure out some way to flag this for people after the webinar. Steve, you’ll have to help me think about how we do that. But there is a lovely
document, for example, from the United Kingdom
with work that was done to support cabinet decision making there. The Ontario example, as far as I know, is not in the public domain. We have a modified
version of that checklist that we’re going to be
putting in front of our 14 partners in the coming months and then work with them to
collaboratively adapt it. And once that work is
done, then we’re very happy to put that in the public domain. So the one thing that I
know is in the public domain is the very well documented UK example. And within a few months,
we will have the example that we’re gonna be piloting in the field with our partners in a variety of low and middle income countries. – [Steve] Great so what
I would suggest, John, is that when we have
that, maybe what we can do is post that as a link on our learn how webinar page alongside this webinar, because we will be posting
the video there as well. – [John] Fantastic. – [Steve] So we have a few more questions. I’m not sure if we’re gonna
get through all of them, but we’ll try. So in terms of the public education side, this question was that
public education has struggled to use evidence
in decision making. And have you encountered
publicly funded education systems or tools that you
would point to as exemplars? – [John] Sorry, this is
about using these ideas in the education sector, is that right? – [Steve] Yes. – [John] Okay, so well, I
think that two of the 14 partners that we’re starting
this new programme of work with are the EPPI-Centre at
University College London. And they are part of
an institute called the Institute of Education
that has been working in the education space for a long time and working in partnership
with, I believe, two What Works Centres in the UK focused on education, one more on kindergarten to grade 12 and the other on higher education. So those three entities
from an outsider perspective seem to be doing excellent
work in this space. And then another partner is
Africa Centre for Evidence that has been similarly
doing some very, very interesting work in the education space, I believe in South Africa. I can’t recall if that has
also been in the sub region. So those are two examples. But part of our focus
going forward with this 14 member coalition is to
be actively seeking out excellent examples in
other sectors and applying these methods in these other sectors. So in the coming few years,
we hope to be doing a lot more work in education and in other sectors and documenting whether and how the approaches need to be adapted for those sectors. And also documenting
excellent examples of where people in those sectors are
embedding or institutionalising these types of mechanisms. So in the short run, I
would point you towards those groups, and in
the longer run, if you sign up for our Forum
newsletter or check our website periodically,
we’ll be increasingly sharing examples from
education and other sectors. – [Steve] Great, so we
have two minutes left and two more questions to get through. So the first one is just
about our communities of practice, how they
work, how the topics are chosen, and what the
intended outcomes are. – [John] Yeah, so
communities of practice are a new area for us. We have a colleague,
Terry Waddell, who’s been heading up that area, so
in the interest of time, maybe what I can do is point you to her and we can make available
her email address. I think it’s waddellk,
W-A-D-D-E-L-L-K at And we’re currently experimenting
with them in Ontario, starting with the home and community care sector. And this is for us an experiment about how to do it. We’re also setting up a community of practice among these 14 country teams spread around the world. So communities of practice
for us is a new area. Terry’s done excellent work
summarising the lessons learned from that literature
and using that to guide very practical advice that we’re trying on as we experiment for the
first time with formally managing or supporting
communities of practice. So she would be the best
resource person on that front. – [Steve] Great, and
you can also email us at [email protected] as well
and we can pass those inquiries on. So the last question is
about, more generally, what the mission of the health Forum is and how it ties in with the
growth of translational medicine as of late. The person there said that it appears that the health Forum emphasises synthesis and accessibility of evidence
based health and social systems knowledge through
systematic reviews and meta analyses in a setting of bench to bedside and community where
the goal of accessibility and translation is emphasised. Do you see this as possibly
adding to the burden of unappraised knowledge
or a sector that is necessary in an ever changing
dynamic field of medicine? It’s a big question, but if you wanna. – [John] What was the very last phrase? Is it what? – [Steve] Do you see this
as possibly adding to the burden of unappraised
knowledge or a sector that is necessary in an ever changing
dynamic field of medicine? – [John] Yeah, I don’t know
why it would add to the pool of unappraised knowledge. I mean, because virtually
everything we do is about presenting appraised evidence. And I guess I just put
up a slide that was here as an extra one, which
is distinguishing policy about clinical, public health,
and health system arrangements. I mentioned before we work
more in that blue circle. And we sometimes work in the intersection with the green and the purple circle, but we never work in the yellow circle. And for me that’s where
the translational medicine space is between the
biomedical and clinical. So to be honest, it’s
just not an area that we work in and that I can
speak knowledgeably about. So we’re very much policy
about health systems, sometimes policy about
clinical and public health, but never that translational
medicine interface between biomedical and clinical. There’s many, many groups
that would be light years ahead of us in terms of knowledge
and capacity in that area. – [Steve] Great, well thanks again, John, for taking your time today
to share your insights and lessons learned from
our work in supporting evidence informed policymaking. If anybody would like to
watch this webinar again or share it with their
colleagues, we will be making the video recording
available on our website. And you can certainly go
there as well to check out other past top 10 webinars. And we will be posting
some more detail shortly about upcoming webinars this fall. So thanks again, John,
and thanks to everybody for joining us today,
and I hope you’ll join us again soon. – [John] Thank you, bye bye. – [Steve] Bye.
