YouTube Transcript: Impact Evaluations: Strategic directions, challenges and innovations
Core Theme
This webinar discusses the strategic directions, challenges, and innovations in impact evaluation within UN agencies, specifically highlighting the efforts of the World Food Programme (WFP) and UNICEF to strengthen their capacity and application of rigorous evidence generation for programmatic learning and accountability.
Good morning, good afternoon, good evening, everybody. Welcome to the webinar on impact evaluation: strategic directions, challenges and innovations. We're just waiting a couple of minutes for other participants to join in; currently there are 65 and counting. Thank you, Zlata, I think we're good. Over to you.
Thank you, Celeste, and hello everyone. A very warm welcome to this Evaluation Practice Exchange seminar of the UNEG. I'm Zlata Bruckauf, Senior Evaluation Specialist at the Evaluation Office, UNICEF headquarters, and I will be co-hosting with my counterpart and colleague Jonas Heirman, who leads the impact evaluation unit at the Office of Evaluation at WFP. We will also be joined later, in about half an hour, by our guest from GIZ, senior social protection advisor Stefan Pierre, who we hope will give his perspective on this topic from the programmatic side.

The reason we chose to propose this topic for an EPE session is that we see a lot of interest and demand, externally and internally, from donors, from our partners, and from executive boards, to produce more rigorous evaluative evidence on impact and outcomes. Yet this is an area which is still underdeveloped among UN agencies. The field of impact evaluation has grown exponentially in the last decade, and you probably all know about the Nobel prizes awarded in recent years for work on social and natural experiments; but again, this remains a relatively underdeveloped area of evaluation for UN agencies. At the same time, we see shifts in our environment and challenges that will shape our work and shape the trends: big data, accelerating climate change, and an increasing focus on the humanitarian-development nexus, which we are all trying to grasp, although these are very different contextual factors.

So we do need to help each other: to exchange our practices, to build on each other's learning, and to build a community of practice in this area of work among the UN agencies. That was our reasoning, and that was our idea. But now, without further ado, I would like to invite Jonas to share with us what WFP is doing in this area. Over to you, Jonas.
Okay, great. Thank you, Zlata, and thank you so much everyone for joining us today. Over the next 15-20 minutes or so I'm going to give an overview of the journey we've started here at WFP: what we're trying to achieve, what we've learned, and some of the innovations we're trying out going forward.

I'm not sure, do I have control over the slides? I have to say "next slide, please"? Okay, next, thanks; I was clicking a button that didn't do anything. Sorry, change of plans. Okay, can you see the next slide? Yes, I see it now, the takeaways.
Thank you. So before I go into the content, I want to start with a few messages that I think resonate particularly with evaluation offices and evaluation communities. For WFP, we see impact evaluation very much as a tool designed to support learning, and it does that by testing program theories and understanding what works. It's something we see as having high internal validity and limited external validity, so evidence coming from impact evaluations is something that needs to be built up over time, across contexts and across different programs. We also think an important part of impact evaluation is that it gives us a real glimpse of what is operationally working and effective in terms of programming. To have that, you really need a strong partnership built on trust, with the program teams but also with the donors you're working with; it cannot be imposed. We also see that to make impact evaluation happen in any context, M&E is still crucial: program M&E is a critical basic requirement for good impact evaluation work. And finally, impact evaluations are, for WFP, something that complements other types of evaluations. We don't see them as replacing other types of evidence, or sitting in any form of hierarchy above them; they're one form that complements a wider evidence base about WFP programs. Next slide, please.
In the next few minutes I'm going to go briefly over the impact evaluation strategy at WFP and how it informed the later policy, describe how we developed multi-country impact evaluation windows in key priority areas, and then reflect on some of the lessons learned over the last five years and some of the innovations we're trying now.
Before WFP actually embedded impact evaluation into its evaluation policy, we had an impact evaluation strategy, launched in 2019. The strategy came after about a year of consultation with WFP's donors, program teams, and other external experts, and it focused on two main areas: one, WFP sees impact evaluation as a tool for optimizing interventions, so program learning; and two, it should do so in a way that the evidence generated provides thought leadership globally in the areas where WFP operates. So it's not something we see as only meeting internal needs, but something where we always ask: is this credible to an external academic and donor audience?
After two years of piloting that impact evaluation strategy, and in that period developing the portfolio I'm going to talk about, impact evaluation was finally embedded into WFP's corporate evaluation policy last year. This was based on recognition of the demand and the amount of attention and support going into impact evaluation, and also on its unique characteristics, which made it very difficult for country offices to manage on their own (which is how we run decentralized evaluations) and which required significant specialist skills that were not already available in the Office of Evaluation (which runs centralized evaluations).

To put this in place as a third type of evaluation, we defined it quite narrowly: impact evaluations are evaluations that measure changes in development outcomes of interest that can be attributed to a specific program or policy through a credible counterfactual. Those are very loaded words for evaluation audiences. We're looking at changes in development outcomes, so immediately we're not using a results-framework definition of the word impact, or the OECD definition; it's really about effectiveness. We're also looking for attribution, so it's very much a causal design, and for us that causal design needs a credible counterfactual. That points us towards a limited set of methods, though we're complementing them as we go forward.
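To make the "attribution through a credible counterfactual" idea concrete, here is a minimal sketch of the estimation step, with simulated data standing in for a real endline survey; the outcome, group sizes, and effect are hypothetical illustrations, not WFP figures.

```python
import numpy as np
from scipy import stats

# Simulated endline outcomes (e.g., a consumption score). Random
# assignment is what makes the comparison group a credible
# counterfactual for the treated group.
rng = np.random.default_rng(seed=42)
treated = rng.normal(loc=52.0, scale=10.0, size=400)  # received the program
control = rng.normal(loc=48.0, scale=10.0, size=400)  # counterfactual group

# Average treatment effect: difference in mean outcomes.
ate = treated.mean() - control.mean()

# Welch's t-test for the difference, without assuming equal variances.
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)
print(f"Estimated impact: {ate:.2f} points (p = {p_value:.4f})")
```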
The other reason for narrowing it down was that it was not meant to replace existing work. Decentralized evaluations of WFP operations already do a lot of mixed-methods and qualitative evaluation work, and centralized evaluations follow a very standardized, performance-based and process-evaluation approach. Those two things continue, and the reason we do impact evaluations is that they add something different that was not already there.

The other thing is that impact evaluations understood this way take place very much during program implementation. This helps us make sure we have the quality data we need before programs start implementing, so it's really about putting baselines in place and then tracking progress over time. And they are not required to be presented to our executive board, which distinguishes them from centralized evaluations, which do have to be reported to the board. The reason, again, is that impact evaluation is seen as a learning tool, something that needs to build up over time. We also decided to have no coverage norm, so there is never a requirement for any WFP office to do an impact evaluation.
So in that context, the strategy focuses on demand-led impact evaluation and has four overarching objectives: contributing to global goals; delivering operationally relevant evidence; maximizing responsiveness, so the unit we have in HQ is really there to make sure we're responding to country office contexts and needs as we identify them, and then seeing if they fit this definition of WFP impact evaluation; and, to the extent possible, harnessing new technologies and data sources.
Early on when we started this work we did not have an impact evaluation unit; that came last year, as part of the WFP policy. So we really relied on technical partnerships. For WFP, the first big technical partnership was with the World Bank's DIME (Development Impact Evaluation) department. That was chosen for two reasons: one, DIME had a long track record of doing impact evaluations in the development space; and two, WFP and the World Bank had signed a global strategy in 2018, the year before we launched our impact evaluation strategy, which set out a vision for working in this nexus space and tackling humanitarian and development challenges together. So DIME became an obvious choice for a partner at the beginning.

In addition, all of the work we do at WFP is supported by amazing and generous support from our donor community; at the moment that includes BMZ and KfW, Quaker, and USAID, and we're just discussing with Norad about working together on homegrown school feeding in Malawi. They really make this possible. The cost of impact evaluations varies greatly, but it does tend to be higher than other types of evaluation, largely due to the time they run, over three or four years, and the amount of primary data collected.

We also work closely with other UN partners. In Sudan and South Sudan, where we're working on impact evaluations funded by BMZ, we're actually evaluating joint programs, and those impact evaluations are developed jointly with the program teams from both sides, to make sure the questions we ask and the tools we use answer both UNICEF and WFP priority questions. And then we work with a large range of external networks and academics. Thanks.
No, it doesn't advance for some reason. Okay, well, the next one should be the windows, I think. Yeah, okay, so go forward one more then. Okay.
So in this context, where we're trying to support demand-led impact evaluations and there's no coverage norm, we had to think a lot about why and when to use impact evaluation. We've focused on organizing our impact evaluations around corporate priority areas for WFP, and to do that we've developed what we call impact evaluation windows. It's a fairly common term in the impact evaluation space, and in other research, but we have a couple of twists on how we do it. One is that these windows are developed very much in partnership with program divisions, so they are involved in identifying what they see as the strategic priorities for evidence: looking forward, where do we see WFP spending more money and focusing more attention? We also have no need to close windows as priorities shift. Instead, the windows evolve over time, updating their priorities as and when WFP updates its policies or strategies; we would only think of closing a window if, for example, WFP were to stop doing cash transfers or stop focusing on one of the key priority areas. These priority areas were selected to align with WFP's program divisions. Across the windows we also have a humanitarian workstream, which is really about the contexts we operate in: humanitarian contexts still focus on the same outcomes, we still want to achieve climate adaptation and resilience objectives, but there's a different way of working and a different need for thinking about data sources and tools. So that workstream cuts across all windows.
When we set out to develop a window, after, or actually in parallel with, the consultation process with WFP HQ, country offices, and regional bureaux, we also conduct a literature review. The literature review focuses on the same areas where we see priority in terms of spending, and it looks at, in most cases, the last 10 years of rigorous impact evaluation evidence for the type of interventions being supported. That helps us choose, from a longer list of questions, which intervention types or questions are not well supported by rigorous impact evaluation evidence, and so narrow down where we should focus our impact evaluations. We then put out a call for expressions of interest to all country offices, informing them of the priority area and the opportunity to work with us on impact evaluations. They're asked to volunteer their future programming: to think about whether, in the next six months or the next year, they will be delivering new programs that would fit into these windows, and whether we can work with them on designing a rigorous impact evaluation of that program. We then conduct feasibility assessments of the programs that volunteer, and finally the country selection is proposed back to a steering committee here in HQ that also includes program colleagues, just to confirm that the selection is relevant to the operational questions and priorities here at WFP.
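One quantitative ingredient of a feasibility assessment like this is usually a statistical power check. Below is a minimal sketch using statsmodels; the effect size and thresholds are illustrative assumptions rather than WFP parameters, and a community-randomized design would further inflate the required sample by a design effect for clustering.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical feasibility check: households needed per arm to detect
# a modest effect at conventional thresholds.
n_per_arm = TTestIndPower().solve_power(
    effect_size=0.2,  # minimum detectable effect, in standard deviations
    alpha=0.05,       # significance level
    power=0.8,        # chance of detecting a true effect of that size
    ratio=1.0,        # equal-sized treatment and control groups
)
print(f"Roughly {n_per_arm:.0f} households per arm")  # ~394 in this sketch
```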
Here's an example of one of these impact evaluations that we have ongoing right now; this is a design being rolled out in three countries: El Salvador, Rwanda, and Kenya. The first window we opened was on cash and gender, and the literature review showed that cash does have positive, or potentially positive, impacts on women through different mixes, but it doesn't tend to give them directly more authority or more decision-making power. We also saw studies, and qualitative work done by WFP, suggesting that women have more control when the income they have is seen as earned income, income assigned to them because they contributed time for it. For WFP, food assistance for assets is a very common programming approach: it's a way of helping communities develop assets that will have longer-term benefits, but during the process it requires communities to assign people to work on those assets, at both the community and the household level. In a lot of cases there's no clear gender designation, but men tend to be the ones assigned to work. So we've created the opportunity, in a handful of communities in each country, to offer an option which is a women-only food assistance for assets program. Households can choose to participate by volunteering women, or women can choose to participate by volunteering their own time to work on those assets. What we're trying to understand is: compared to groups doing food assistance for assets as business as usual, does offering women the explicit option, and a safe space to work, change the control over women's income, their decision-making power, and the perceptions of women's work, in addition to all the normal outcomes we look at, such as consumption?
That was just one example. As you can see, across the three windows that we already have open (the fourth one, on nutrition, is going to open in 2023), we have 15 confirmed and about 18 ongoing impact evaluations. They all have some form of experimental design like the one I just described: some are RCTs that include a counterfactual in the form of a pure control group, and most are RCTs that include multiple treatment arms.
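As a sketch of the assignment step behind multi-arm designs like these, the snippet below randomizes a hypothetical list of communities across a control group and two treatment arms. The arm names echo the women-only FFA example but are invented, and real designs typically stratify by country or region first.

```python
import pandas as pd

# Hypothetical community list for a three-arm design: control,
# standard FFA, and a women-only FFA option, as in the example above.
communities = pd.DataFrame({"community_id": range(90)})
arms = ["control", "ffa_standard", "ffa_women_only"]

# Shuffle reproducibly, then deal communities into arms round-robin
# so the arms end up equally sized.
shuffled = communities.sample(frac=1.0, random_state=7).reset_index(drop=True)
shuffled["arm"] = [arms[i % len(arms)] for i in range(len(shuffled))]
print(shuffled["arm"].value_counts())  # 30 communities per arm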
Okay, so now I'm going to briefly reflect on some lessons learned over the last five years since we started this work. The first set of lessons comes out of an independent review at the end of the pilot phase. Between 2019 and 2021, those two years were considered a pilot phase for the WFP impact evaluation strategy, for two reasons: first, because we were not sure what kind of demand there would be, or whether it would be feasible in WFP, so it was a way to learn what works; and second, it aligned with the new policy, which we knew was going to be published in 2022. Overall, the review found positive feedback: most of the country offices involved with impact evaluation saw benefit. There were struggles for sure, there were issues early on, but most offices were able to deliver an impact evaluation design they saw value in, and were overall happy with the idea of doing more impact evaluations. Actually, one of the big complaints was that there was more demand from country offices than what we, as WFP's Office of Evaluation, were currently able to meet.
The other thing the review recommended is that we start to think a bit more about how we deliver impact evaluations. It recommended having more capacity in the Office of Evaluation to support impact evaluations; again, we didn't have a unit at this point, just a very small team of a couple of people working in Rome. So the review recommended more investment in in-house capacity to make the process of engaging in impact evaluations as smooth as possible. It also suggested focusing more on capacity building, and linking more with country offices and with the global academic communities in the countries where we work. It pointed us towards broadening the methods we were using: we were doing mostly RCTs, and at that point we weren't doing much on top of the RCT work in terms of qualitative or other evidence. The reviewers also said we should do more events, like the one we're doing now, to improve awareness of the strategy and make sure people are aware of the opportunities. And finally, there was a recommendation to think about how to institutionalize impact evaluation in WFP, so that it becomes business as usual when appropriate.
We responded to that immediately. We just finalized an impact evaluation in El Salvador, which will have its endline workshop in another month or so, and following the endline data collection last year we worked with a qualitative PI on a qualitative study that looked at differences already visible in the endline data from the quantitative work. What you see is that, yes, we see improvements, a positive impact in quantitative measures of household consumption, on the left-hand side. But on the right-hand side you see one of the quotes from the qualitative work, which used a qualitative sampling strategy designed to unpack different experiences. It says that at the time my children got sick, I had to buy medicine, and I had the money I needed to actually do that. It's a really nice way to paint a much richer picture than what would normally be just a little graph on the left.
The other thing we're doing, in countries where we have high-frequency data, which is our resilience window, is that we've developed high-frequency data dashboards. Every time we have new data from those high-frequency surveys, which run every two months, these allow country offices to see, within a short period, the changes over time in the different outcomes, but also in things like coping mechanisms and other areas where we collect data.
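A dashboard like this is, at its core, a re-aggregation of each new survey round. Below is a minimal sketch with a tiny hypothetical extract; the column names and values are invented, not WFP data.

```python
import pandas as pd

# Tiny hypothetical extract: one row per household per bimonthly round.
df = pd.DataFrame({
    "round_date": pd.to_datetime(
        ["2022-01-15", "2022-01-15", "2022-03-15", "2022-03-15"]),
    "arm": ["treatment", "control", "treatment", "control"],
    "food_consumption_score": [46.0, 45.5, 51.2, 47.1],
    "used_coping_strategy": [1, 1, 0, 1],
})

# The dashboard view is essentially this group-by, recomputed whenever
# a new survey round arrives: mean outcome and coping rate per arm.
trend = (df.groupby(["round_date", "arm"])
           .agg(mean_fcs=("food_consumption_score", "mean"),
                coping_rate=("used_coping_strategy", "mean"))
           .reset_index())
print(trend)
```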
And then finally, as mentioned, the humanitarian workstream is moving away from traditional RCT methods and doing what we call A/B testing. That's based on the recognition that in an emergency setting people need support; the question is not really whether they need support, it's about when or how they get it. Here's an example from forecast-based financing, where we have a little bit of money up front so we can actually support households before shocks happen, and then two versions of responses. Now, you could say it's always great to respond early, but there are good arguments to question whether that's true, particularly in flood responses: if markets are completely destroyed, or there's a change in expected prices, or in the purchasing power of the transfer, within a short period of time, there are good reasons to think about whether the timing is right for different households. Similarly, not every household will be affected the same way, so there are also gains in understanding who is most vulnerable, or who needs the most support in reconstruction.
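Here is a minimal sketch of what an A/B comparison like this estimates, using simulated data; note there is no untreated control group, only two response designs. All numbers and variable names are hypothetical.

```python
import numpy as np
from scipy import stats

# Simulated A/B test of forecast-based financing: every household gets
# a transfer, and randomization is only over *when* it arrives.
rng = np.random.default_rng(seed=3)
early = rng.normal(loc=0.62, scale=0.20, size=500)  # transfer before the shock
later = rng.normal(loc=0.58, scale=0.20, size=500)  # transfer after the shock

# No untreated control group: the estimand is the difference between
# the two response designs, not a program-vs-nothing effect.
diff = early.mean() - later.mean()
t_stat, p_value = stats.ttest_ind(early, later, equal_var=False)
print(f"Early-vs-later difference: {diff:.3f} (p = {p_value:.4f})")
```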
Okay, so that's me; that's everything I was going to say today. This QR code, if you get a chance, you can scan it, and maybe we can pull it up again later. It will let you join our mailing list, so we can make sure you get our future newsletters. Okay, over to you, Zlata.

Thank you very much. At the end of my presentation I'll return to this slide and leave it up for people to scan. So let's move on to UNICEF's strategy and what we are doing. The WFP example is very inspiring, at least to us, and I often say we are three years behind WFP in terms of developing this area, but we are definitely making our first steps. The first step was also developing a strategy; we call it the evaluation of impact strategy, reflecting a broader definition, or broader perspective, on this area of work. We have also gone through very long and extensive consultations, across the evaluation function of UNICEF but also externally, and we are now at the design and layout stage, so hopefully the document will be available very soon.
But before I share with you some key pillars of this strategy, I would like to echo the key messages, the key takeaways, presented at the very beginning, which I fully agree with, but also to say that, when designed and planned well, impact evaluation, high-quality rigorous impact evidence, is a very powerful tool to make a difference. And that is not an assumption; it's actually a fact.

Just to give you a few very recent examples (there are definitely more) of UNICEF evaluations that were commissioned recently and have already had a very positive influence on government decisions to scale up, as well as on programmatic changes and programmatic learning. In Mozambique, a rigorous impact evaluation of social protection, conducted alongside a process evaluation, demonstrated very positive results for the Cash Plus model (cash transfer plus case management plus a social and behavioral communication package), and this enabled the government to scale up from a pilot phase, which reached around 15,000 children, to a decision to reach 250,000 families with children within the next two years. In India, the evaluation of the economic and financial impacts of the Swachh Bharat Mission, the national sanitation project, led to a cabinet decision on additional government funding of 18 billion to expand the programming within the next four years.
I chose the Nigeria example because of its actual influence on programmatic changes and adjustments. The impact evaluation of the Volunteer Community Mobilizers network on polio eradication helped the UNICEF country office advocate for the retention of 17,000 volunteer community mobilizers from the polio eradication campaign, but it also helped the country office make adjustments and changes to the health strategy and to social and behavior change communication programming.
There are, of course, institutional rationales for UNICEF to strengthen its work at the outcome and impact levels. First of all, the new Strategic Plan 2022-2025 has, for the first time, an explicit focus on outcomes, and we have a mandate to support that as a change strategy. This relates to accountability, and another aspect of accountability is the increased pressure from donors and the executive board to demonstrate the effectiveness of UNICEF investments. Learning is critical, and as was said previously, learning is probably the core element of this work. UNICEF programs are becoming more innovative and more integrated, and we have to test the interventions being rolled out, simply to know what works and what doesn't, and not to waste money and effort. And secondly, our humanitarian programming is rapidly expanding, and the investment in humanitarian programming is also rising, so we need to find new, innovative ways to show results in that area.

Our vision is both outward looking and inward looking. Outward looking, we think that better impact evidence will help and support national systems and policies, by facilitating UNICEF advocacy and supporting national partners in their decisions to scale up child-focused policies and programs. Inward looking, it simply contributes to improved organizational effectiveness, by helping to allocate limited resources to what works.
I have to say that our approach is slightly different from WFP's, in that we recognize that UNICEF's areas of work are very broad, and UNICEF works upstream as well as downstream; work on advocacy, governance, and public finance for children, for instance, is as important to UNICEF as intervention-type programming. Therefore, recognizing the internal and external discourse in this area, we see that programmatic parameters define the evaluative purpose: we need to look at the nature of the intervention, the nature of the outcome we are looking at, the nature of the program, and what kind of questions we ask, whether they are causal questions or not. Then we see which track we take. If causal attribution to outcomes through a credible counterfactual is possible and feasible, then we go for impact evaluations, for which specific quantitative methods are best positioned to fit this requirement. If the program or interventions are more suited to causal contribution, then we apply theory-based, non-experimental methods, which are also available, and also very credible, for specific research and evaluation questions.
We also did some diagnostics of where we are at the beginning of this journey. UNICEF has done quite a few impact evaluations, and some of them are very well known, specifically thanks to the Transfer Project in social protection. But we identified that, overall, 30 UNICEF offices commissioned or conducted 36 impact evaluations over a five-year period, which is about six percent of the total number of evaluative products produced. We see substantial thematic and geographic disparity, and this is one of the challenges: social protection is very well covered, but other areas like ECD, nutrition, adolescent programming, and even child protection have very little credible, rigorous evidence available. In terms of methods, UNICEF evaluations to date have included RCTs and experimental designs, but a lot of them, 21, were done using quasi-experimental approaches.
We also identified a number of challenges, including through an online survey of our staff. I'm not going to read or elaborate on all of them; you're all very familiar with them, including high cost and resource needs, and low awareness and capacity on the ground among staff and partners to understand the feasibility and parameter requirements of impact evaluations.

Based on that, we developed three strategic pillars, and I will just say a few words about each of them and what we are currently doing.
The first pillar is to increase impact evaluation coverage and requirements across UNICEF. As I said, UNICEF has been doing impact evaluations for a long time, but what is different now is that we are trying to develop, similar to WFP, a more strategic and holistic approach to generating this evidence: identifying the areas of high priority within the current strategic plan, the areas with the least coverage, and trying to stimulate demand in those thematic and geographic areas. One way to do this, we don't call them thematic windows, but they are basically priority areas, is through an impact evaluation catalyst fund that will provide matching contributions to country offices to initiate rigorous impact evaluation evidence.
The first step, again a slightly different approach from WFP's, is that we start from a multi-country impact evaluation feasibility assessment. We have just completed one on child marriage, jointly with UNFPA, and we have started one on mental health and psychosocial support; I will say very briefly what this entails in terms of methodology. Second, we will very soon start a comprehensive impact evaluation package for adaptive social protection, in partnership with BMZ, which includes four impact evaluations in fragile contexts. The interesting model here is that it's a comprehensive evidence project, so it's not only impact evaluations but also operational research and data innovation. And finally, we are working on the supply side: we are trying to establish partnerships with academic institutions and other actors, and we have also worked on long-term agreements for impact evaluations.
Just to give you a sense of the multi-country impact evaluation feasibility assessment, which we consider a systematic, expert-driven, and strategic approach to understanding what is feasible, and what the opportunities and limitations are for conducting impact evaluation portfolios: it basically consists of four steps, starting with stocktaking of the literature and mapping interventions to identify global gaps against UNICEF interventions, then a programmatic deep dive into a selected list of countries, and then final recommendations. The methodology includes developing very specific and clear criteria for selecting country cases, including political interest or demand for rigorous evidence at the country level (at the end of the day, we want to scale up, we want this evidence to be used) and operational feasibility, meaning the existence or availability of strong data collection companies and national partners that could support us, and so on.
Pillar 2 focuses on diversification of methods and innovation, and this work is very similar to what Jonas presented. We are already looking at better utilization of secondary data sources, administrative data, and household surveys, and at using quasi-experimental designs. Right now we are focusing on the humanitarian portfolio, because this is where the evidence is really lacking, not only for UNICEF but globally. For that work we really focus on outcomes, recognizing the short and intermediate time horizons that usually apply in humanitarian settings.
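As an illustration of one common quasi-experimental design that such secondary and administrative data can support, here is a difference-in-differences sketch; the districts, outcome, and rollout pattern are invented for illustration, and the design rests on the usual parallel-trends assumption.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented panel from administrative records: an outcome observed
# before and after a program rolls out in some districts (C, D) but
# not others (A, B).
df = pd.DataFrame({
    "district": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "post":     [0, 1, 0, 1, 0, 1, 0, 1],
    "enrolled": [0, 0, 0, 0, 1, 1, 1, 1],
    "outcome":  [10.0, 11.0, 9.5, 10.4, 10.2, 13.1, 9.8, 12.6],
})

# Difference-in-differences: the interaction coefficient estimates the
# program effect, assuming treated and comparison districts would have
# followed parallel trends without the program.
model = smf.ols("outcome ~ enrolled * post", data=df).fit()
print(model.params["enrolled:post"])
```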
With credit to our ECARO regional office, which initiated and launched them, we also have so-called digital RCTs: low-cost RCTs of digital services or digital applications, basically using big data. The results will be available at the end of June, and we hope to replicate these models in other regions. Generally, in terms of methods, we promote mixed approaches: very close integration of process-related evaluation questions with impact evaluation questions, using so-called nested approaches that help respond to the short-term learning needs of the program implementers while also contributing to global learning over the longer term.
The third pillar is basically capacity building and learning, again addressing some of the challenges of low awareness and low understanding, among program staff and evaluation staff in the field, of the requirements for doing rigorous and credible impact evaluations. At the same time, as I said before, we do promote, and we do try to support, our regions and country offices with other, non-experimental methods: contribution analysis, the qualitative impact protocol, process tracing. We are developing a methods guide series that will provide user-friendly versions of those methodologies to support summative evaluations that want to ask causal questions and look at contribution. We are also interested in developing national capacity for impact evaluation, and some of our ideas are to develop a network of academic institutions of the Global South, and young evaluator fellowships that would support and give opportunities to young evaluators and young researchers to work on impact evaluations.
So our immediate priorities are: institutionalization of impact evaluations in the new evaluation policy, the revision of which is ongoing right now; testing more cost-efficient methods (I saw the question in the Q&A about costing, probably the first question, and a very relevant one for this area of work), so the A/B testing that Jonas mentioned is something we are going to try as well; and we are planning this year to launch multi-country initiatives, specifically in adaptive social protection, child protection, and nutrition.

Celeste, I don't see Stefan on the panel. Stefan? No, I did not see him. So while we wait for him, because as I said at the beginning our guest from GIZ was to say a few words, we can start addressing some of the questions. Jonas, maybe you want to answer the question about costing?
I mean, it ranges over a very large amount. In WFP, the cost of our impact evaluations comes down to the country context, the number of survey rounds, and the timeline the impact evaluation runs over. At the low end we see impact evaluations that are, I guess, what we call more lean, between two and three hundred thousand dollars; those are often ones that harness monitoring data, are more about the A/B testing designs, and don't run for many years. At the high end it can go way above that: if we're running surveys over three to five years and providing constant support, it can be multiple times that. Just for context, that's actually not very high compared to a lot of WFP's centralized or decentralized evaluations, so you can do impact evaluations for about the same cost as any evaluation, but you can also spend a lot more if you have a reason to, and there are, I guess, important reasons to collect data over time.
Thank you. And I see Stefan has joined us. Hi, Stefan. I introduced you at the beginning, but I will say it again: Stefan Pierre, social protection advisor at GIZ, our partner. We have been working with Stefan quite intensely over the last year, developing the portfolio on impact evaluation for adaptive social protection, and we hope Stefan will give some of his thoughts, from the programmatic side, on the potential of this area of work.
Thanks, Zlata, and hi everyone. Sorry, I was connected but in spectator mode, so that's why you probably couldn't see me. Let me maybe say a few things about my current position, and then I can situate the kind of evidence that UNICEF is generating within that: how it comes in very handy, how useful that kind of work is, and some other reflections related to that. I hope that goes in the intended direction; otherwise, feel free to steer me in a different direction.

Currently, the sector initiative, or sector program, I work in means that I'm primarily tasked with advising BMZ's sector unit on social protection, which means we very often have to react very rapidly to all sorts of social protection related questions. These can sometimes be quite political, but sometimes also very concrete operational questions, when it comes to quickly responding to something we hear from operational programs we're implementing, or whatever it may be.
Therefore, in that work, it is absolutely vital that something like what the Transfer Project, for example, has done over many years exists, because without it my job would be so much harder. It's really invaluable, and I think very often policy makers don't see that side so much, because in a way we're the mediators between researchers and evaluators on the one hand and policy makers on the other. For us, in this intermediate position, that kind of rigorous evidence, combined with operational insights from all the different kinds of evaluation approaches that Jonas and Zlata also highlighted, bringing that together, clearly teasing out the gist, and showing how solid it is and what we know for which context, having done that in such a comprehensive way for social protection, as in the example of the Transfer Project, is absolutely invaluable. So I'm very happy.
And leading on to what Zlata also indicated: through the BMZ-UNICEF partnership it very much looks like there is going to be an ASP, adaptive social protection, evaluation partnership. My hope is very much that what has been achieved before for one area will now be done for another, so that going forward my work, which is currently focused a lot on adaptive social protection, will also become much, much easier, because we can stand on that firmly established knowledge and don't always have to go back to the drawing board; we can basically stand on those established facts. This is something I personally feel is maybe underappreciated sometimes, so I just wanted to flag it to everyone working in evaluation. I've worked in that area before, and I know it can sometimes be a bit frustrating, and maybe one doesn't always quite know how the work filters into something, but I just want to say it actually is incredibly valuable, even though you might sometimes not see it. And therefore I'm very happy that, to the extent my current position allows, I can still facilitate between those two worlds and keep at least one foot in that area. I'll leave it at that, but feel free, if you had any discussions earlier where I wasn't there yet, to pull me into them. Thank you.
I will just reflect by saying that it's a very common argument that evidence from impact evaluation is very contextualized: only applicable to one specific context, and not replicable or generalizable. This is true, unless the impact evaluation evidence is generated at scale. The Transfer Project showed us an example of evidence scaled up: we reach a saturation level where we basically know that this works in context A, in context B, in context C, and then it's likely to work in context F as well. That's the whole point of taking a strategic approach to planning and developing impact evaluation portfolios in specific thematic windows, priority thematic areas: to build the evidence base we're talking about, which allows us to replicate the most successful and most transformative interventions. But let's move on; thank you so much for your intervention, and let's move to the questions. Jonas, do you see the questions?
Hi, sorry, this is Malika from UNICEF. I think I can moderate some of the questions, considering we just have five minutes left. The first question is for Jonas: the audience members wanted to know how you integrate the different components, so literature review, qualitative and quantitative, to make sense of puzzling results; and, related to that, could you speak a little bit about the feasibility assessments used to decide on the impact evaluations in WFP?

Yeah, sure.
Very much like what Zlata presented, we see the literature review as a key starting point. When we do the literature review we're looking not only at which interventions were effective, but also at the outcome measures they used, so we also look at the actual survey modules used and the data that's available. For the cash and gender window, that included doing a meta-analysis of some 30 impact evaluations done prior to our window, to understand the average impact of a transfer across many, many studies, and whether it varies by transfer size or by recipient. That very much informs the pre-analysis plan for the window; the windows keep going, but we do have multi-country pre-analysis plans. So the design where we varied the targeting towards women was informed by that literature, and then the design was registered; we register all our designs, in that case with the American Economic Association registry.
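For readers curious what the pooling step of such a meta-analysis looks like, here is a minimal fixed-effect (inverse-variance) sketch; the effect sizes and standard errors are invented, not figures from the 30 studies mentioned.

```python
import numpy as np

# Invented effect sizes and standard errors, one pair per prior study,
# standing in for the literature-review inputs.
effects = np.array([0.15, 0.22, 0.05, 0.18, 0.10])
ses = np.array([0.06, 0.09, 0.04, 0.07, 0.05])

# Fixed-effect pooling: weight each study by the inverse of its
# variance, so more precise studies count for more.
weights = 1.0 / ses**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"Pooled effect: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```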
In terms of the qualitative work, this is something we've been experimenting with. In past lives, as both academic and bureaucrat, I have used qualitative work to inform design, so I've personally been engaged in impact evaluations where ethnographic work was used to select the interventions to test. In the cases so far, because we're really focusing on WFP interventions that are high value and need to be tested, we've actually used the qualitative work to unpack anomalies or subgroups that we see in the endline data. For El Salvador, it was looking at participation rates, household characteristics, different things that were not obviously answerable using the survey data we had, and then sampling from those groups to understand the differences in more detail through the qualitative work. But we're very flexible; we're constantly learning as we go and thinking about how to integrate those two pieces, and the intention is always to feed back into the program.
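To make the sampling step concrete, here is a minimal sketch of the kind of extreme-case sampling described, picking qualitative interviewees from the tails of hypothetical endline results; the variables and values are illustrative only, not El Salvador data.

```python
import pandas as pd

# Invented endline extract: choose qualitative interviewees from the
# tails of the quantitative results to unpack surprising subgroups.
endline = pd.DataFrame({
    "household_id": range(1, 11),
    "participated": [1, 1, 1, 0, 1, 1, 0, 1, 1, 1],
    "consumption_change": [5.1, -2.3, 8.7, 0.4, -4.0,
                           12.2, 1.1, 0.2, 9.9, -1.5],
})

participants = endline[endline["participated"] == 1]
# Extreme-case sampling: biggest gains and biggest losses.
interview_sample = pd.concat([
    participants.nlargest(2, "consumption_change"),
    participants.nsmallest(2, "consumption_change"),
])
print(interview_sample[["household_id", "consumption_change"]])
```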
Thank you. The next questions are for Zlata: how many of UNICEF's 36 impact evaluations were demanded by donors? And could you elaborate on the second strategic pillar with some examples?

It is difficult to say for sure. The Transfer Project, which constitutes the bulk of the impact evaluations in the social protection window, so to say, was funded by donors; it was, and still is, a partnership between the University of North Carolina at Chapel Hill, UNICEF, and FAO, and my understanding is that it was funded by DFID and other donors. Beyond that it's very difficult to say: sometimes the impact evaluation is part of a bigger program proposal to the EU or another donor and is already included in the donor-funded project. So I would say probably a substantial share of impact evaluations are part of donor-funded work or funded by donors.
I think they're also curious whether there are any practical examples of mixed methods. I think it's now a general approach for us, for all impact evaluations, to combine quantitative and qualitative work; it would be very difficult to find an impact evaluation commissioned by UNICEF without a substantive qualitative component, so it's more or less common practice. The Mozambique example that I showed at the beginning was done in parallel with a process evaluation, but the two were pretty much integrated, and information was triangulated between the results of both evaluations. And for the non-experimental methods guide that we are preparing, we specifically searched for and found a number of concrete examples, which will be included in the guide.
Thank you, Zlata, and thank you everyone. In terms of time, I think we are close to wrapping up. I want to thank everyone for the questions; I'm scrolling through the Q&A now, and we will try to answer them after this session, both Jonas and I, so bear with us. Really, thank you for joining us today, and do not hesitate to reach out with any follow-up questions; some of them will probably need to be followed up directly. Thank you very much everyone, Jonas and Stefan particularly, and Malika and Celeste, thank you for the support. Have a great day, everyone.