0:49 Good morning, good afternoon, and good evening, everybody. Welcome to the webinar on Impact Evaluation: Strategic Directions, Challenges and Innovations. We're just waiting a couple of minutes for other participants to join in; currently there are 65 and counting.
1:58 Zlata, I think we're good. Over to you.
2:01 Thank you, thank you, Celeste, and hello everyone. A very warm welcome to this Evaluation Practice Exchange seminar of the UNEG group.
2:17 I'm Zlata Bruckauf, Senior Evaluation Specialist, Evaluation Office, UNICEF headquarters, and I will be co-hosting with my counterpart and colleague Jonas Heirman, who leads the impact evaluation unit at the Office of Evaluation at WFP.
2:44 We will also be joined, in about half an hour, by our guest from GIZ, their Senior Social Protection Advisor, Stephan, who we hope will give his perspective on this topic from the programmatic side.
3:07 The reason we chose to propose this topic for an EPE session is that we see a lot of interest and demand, externally and internally (from donors, from our partners, from executive boards), to produce more rigorous evaluative evidence on impact and outcomes. But generally this is an area which is still underdeveloped among our UN agencies. The field of impact evaluation has grown exponentially in the last decade, and you all probably know about the Nobel prizes given in recent years for work on social experiments and natural experiments. But again, this is a relatively underdeveloped area of evaluation for UN agencies.
4:06 At the same time, we see shifts in our environment, and challenges that will shape our work and shape the trends: big data, climate change acceleration, and an increasing focus on the humanitarian-development nexus. We are all trying to grasp these very different contextual factors.
4:38 Therefore we do need to help each other: to exchange our practices, to build on each other's learning, and to build a community of practice in this area of work among the UN agencies. That was our reasoning, and that was our idea.
5:02 But now, without further ado, I would like to invite Jonas to share with us what WFP is doing in this area. Over to you, Jonas.
5:17 Okay, great. Thank you, Zlata, and thank you so much, everyone, for joining us today. Over the next 15-20 minutes or so I'm just going to give an overview of the journey that we've started here at WFP: what we're trying to achieve, what we've learned, and some of the innovations that we're trying out going forward.
5:37 I'm not sure, do I have control over the slides?
5:40 You have to say "next slide, please".
5:43 Okay, next, thanks. I was clicking a button that didn't do anything. Celeste will... oh, sorry, change in plans, let me do this.
6:06 Sorry, okay, can you see the next slide?
6:08 Yeah, I see it now, the takeaways, yes.
6:10 Thank you. Okay, so before I go into the content, I want to start off with a few messages that I think resonate particularly with evaluation offices and evaluation communities. For WFP, we see impact evaluation very much as a tool designed to support learning, and it does that by testing program theories and understanding what works. It's definitely something that we see as having high internal validity and limited external validity, so we see evidence coming from impact evaluations as something that needs to be built up over time, across contexts and different programs.
6:44 We also think an important part of impact evaluation is that it gives us a real glimpse at what is working and effective in terms of programming, operationally. To have that, you really need a strong partnership built on trust, with the program teams but also with the donors that you're working with; it cannot be imposed.
7:06 We also see that to make impact evaluation happen in any context, M&E is still a crucial part: program M&E is a critical basic requirement for good impact evaluation work. And finally, for WFP, impact evaluations are definitely something that complement other types of evaluations. We don't see them as replacing other types of evidence, or as standing in any form of hierarchy against them; they're one form that complements a wider evidence base about WFP programs.
7:36 Next.
7:42 So in the next few minutes I'm going to go briefly over the impact evaluation strategy at WFP and how that has informed the later policy. I'm going to describe a bit about how we developed multi-country impact evaluation windows in key priority areas, and then I'm going to reflect on some of the lessons learned over the last five years and some of the innovations that we're trying now. Next.
8:04 So before... next, please.
8:10 So before WFP actually embedded impact evaluation into its evaluation policy, we had an impact evaluation strategy that we launched in 2019. The strategy came after about a year of consultation with WFP's donors, program teams, and other external experts, and it really focused on two main areas. One, WFP sees impact evaluation as a tool for optimizing interventions, so program learning. And two, it should do so in a way that the evidence generated provides thought leadership globally in the areas where WFP is operating. So it's not something that we see as only meeting internal needs, but something where we always reflect: is this credible to an external academic and donor audience? Next.
8:59 After two years of piloting that impact evaluation strategy, and in that period developing the portfolio that I'm going to talk about, impact evaluation was finally embedded into WFP's corporate evaluation policy last year. This was based on recognition of the demand and the amount of attention and support going into impact evaluation, and also its unique characteristics, which made it very difficult for country offices to manage on their own (which is how decentralized evaluations are run). It also required significant specialist skills that were not already available in the Office of Evaluation (which runs centralized evaluations).
9:39 To put this in place as a third type of evaluation, we defined it quite narrowly: evaluations that measure changes in development outcomes of interest that can be attributed to a specific program or policy through a credible counterfactual. Those are very loaded words for evaluation audiences. Obviously we're looking at changes in development outcomes, so immediately we're not using a results-framework definition of the word "impact", or the OECD definition; it's really about effectiveness. We're also looking for attribution, so it's very much thinking about a causal design, and for us that causal design needs to have a credible counterfactual. That points us towards a limited set of methods, though we're complementing them as we go forward.
10:26 The other reason for narrowing it down a bit was that it was not seen as something that needed to replace already existing work. Decentralized evaluations of operations in WFP do a lot of different mixed-methods and qualitative evaluation work; centralized evaluations are a very standardized kind of performance-based and process evaluation. Those two things are continuing, and the reason we do impact evaluations is that they do something additional or different that was not already there.
10:58 The other thing is that impact evaluations in this sense take place very much during program implementation. This helps us make sure that we have the quality data we need before programs start implementing, so it's really about putting baselines in place and then tracking progress over time. And they're not officially required to be presented to our executive board, which distinguishes them from centralized evaluations, which do have to be reported to the board. The reason, again, is that impact evaluation is seen as a learning tool, something that needs to build up over time, and we have decided to have no coverage norm: there's never a requirement for any WFP office to do an impact evaluation.
11:41 So in that context, the strategy tries to focus on demand-led impact evaluation and has four overarching objectives. One is, again, contributing to global goals and delivering operational learning. But to make sure it meets demand, we had to maximize responsiveness, so the unit that we have in HQ really is just there to make sure that we're responding to country office contexts and needs as we identify them, and then seeing if they fit this definition of WFP impact evaluation. It's also focused, to the extent possible, on harnessing new data and technology.
12:18 Early on, when we started this work, we did not have an impact evaluation unit; that came last year, after impact evaluation was made part of the WFP policy. So we really relied on technical partnerships. For WFP, the first big technical partnership we had was with the World Bank's DIME (Development Impact Evaluation) department. It was chosen for two reasons. One, DIME had a long track record of doing impact evaluations in the development space. Two, WFP and the World Bank had signed a global strategy in 2018, the year before we launched our impact evaluation strategy, which set out a vision for working in this nexus space and tackling humanitarian and development challenges together. So DIME became an obvious choice for a partner at the beginning of this work.
13:10 In addition, all of the work that we do at WFP benefits from amazing and great support from our donor community. At the moment that includes BMZ and KfW, Quaker, and USAID, and we're just now discussing with NORAD about working together on homegrown school feeding in Malawi. They really make this possible. The cost of impact evaluations varies greatly, but it does tend to be higher than other types of evaluation, largely due to the time they run (over three or four years) and the amount of primary data collected.
13:40 We also work closely with other UN partners. In Sudan and South Sudan, where we're working on impact evaluations funded by BMZ, we're actually evaluating joint programs, and those impact evaluations are developed jointly with the program teams from both sides, to make sure that the questions we ask and the tools we use answer UNICEF and WFP priority questions. And then we work with a large range of external networks and academics. Thanks.
14:17 No, it doesn't advance for some reason. Okay, well, the next one should be the windows, I think. Yeah, okay, so go forward one further then. Yeah, okay. Okay.
14:42 So in this context, where we're trying to support demand-led impact evaluations and there's no coverage norm, we had to think a lot about why and when to use impact evaluation. We've focused on organizing our impact evaluations around corporate priority areas for WFP, and to do that we've developed what we call impact evaluation windows. It's a fairly standard term in the impact evaluation space, also for other research, but we have a couple of twists on how we do it. One is that these windows are developed very much in partnership with program divisions, so they are involved in identifying what they see as the strategic priorities for evidence: looking forward, where do we see WFP spending more money and focusing more attention? We also have no need to actually close the windows as priorities shift; instead, these windows evolve over time, by updating the priorities as and when WFP updates its policies or strategies. We would only think of closing a window if, for example, WFP were to stop doing cash transfers, or stop focusing on one of the key priority areas. And these priority areas were selected to align with WFP's program divisions.
15:53 Across the windows we also have a humanitarian workstream, which is really about the context we operate in. In humanitarian contexts we still focus on the same outcomes (we still want to achieve climate adaptation and resilience objectives, for example), but there's a different way of working and a different need for thinking about data sources and tools, so that workstream cuts across all the windows.
16:23 So when we set out to develop a window, after, or actually in parallel with, the consultation process with WFP HQ and also country offices and regional bureaus, we conduct a literature review. The literature review is focused on the same areas where we see a priority in terms of spending, and it looks, in most cases, at the last 10 years of rigorous impact evaluation evidence for the type of interventions being supported. That helps us choose, from a longer list of questions, which intervention types or questions are not well supported with rigorous impact evaluation evidence, and so narrow down where we should focus our impact evaluations.
17:06 We then put out a call for expressions of interest to all country offices, informing them of the priority area and the opportunity to work with us on impact evaluations. They're being asked to volunteer their future programming: to think about whether, in the next six months or the next year, they will be delivering new programs that would fit into these windows, and whether we can work with them on designing a rigorous impact evaluation of that program. We then conduct feasibility assessments of the programs that volunteer, and finally the country selection is proposed back to a steering committee here in HQ that also includes program colleagues, just to confirm that the country selection is relevant to the operational questions and priorities here at WFP. Next.
17:53 An example of one of these impact evaluations that we have ongoing right now: this is a design that's actually being rolled out in three countries, El Salvador, Rwanda, and Kenya. Early on, the first window that we opened was on cash and gender, and we did the literature review, and the literature showed that cash does have potential positive impacts on women through different mechanisms, but it doesn't tend to give them directly more authority or more decision-making power. We also saw studies, and qualitative work done by WFP, finding that women have more control when the income they have is seen as earned income, income assigned to them because they were contributing some amount of time for that income.
18:36 For WFP, food assistance for assets is a very common programming approach. It's a way of helping communities develop assets that will have longer-term benefits, but during the process it requires communities to assign people to work on those assets, which can be both community-level and household-level. In a lot of cases there's no clear gender designation, but men tend to be the ones who are assigned to work. So we've created an opportunity by saying that in a handful of communities in each country we will offer an option which is a women-only food assistance for assets program; then households can choose to participate by volunteering women, or women can choose to participate by volunteering their own time to work on those assets.
19:27 So what we're trying to understand, compared to groups doing food assistance for assets business as usual, is: does offering women the explicit option, and giving women a safe space to work, change the control over women's income, but also their decision-making power and the perceptions of women's work, in addition to all the normal outcomes that we're looking at in terms of consumption and food security?
20:01 That was just an example. As you can see, across the three windows that we already have open (a fourth one, on nutrition, is going to open in 2023), we have 15 confirmed and about 18 ongoing impact evaluations. They all have some form of experimental design, like the one I just described: some are RCTs that include a counterfactual that is a pure control group, and most of them are RCTs that include multiple treatment arms. Next.
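As a rough illustration of the multi-arm RCT design just described, here is a minimal sketch of randomized assignment and a difference-in-means comparison. This is not WFP's actual code; the community names, arm labels, and outcome numbers are all invented for the example.

```python
import random
import statistics

def assign_arms(units, arms, seed=42):
    """Randomly assign each unit (e.g. a community) to one arm.

    Shuffles the unit list, then deals units round-robin so arm
    sizes differ by at most one.
    """
    rng = random.Random(seed)
    shuffled = units[:]
    rng.shuffle(shuffled)
    return {u: arms[i % len(arms)] for i, u in enumerate(shuffled)}

def mean_outcome_by_arm(assignment, outcomes):
    """Average an outcome measure within each arm."""
    by_arm = {}
    for unit, arm in assignment.items():
        by_arm.setdefault(arm, []).append(outcomes[unit])
    return {arm: statistics.mean(vals) for arm, vals in by_arm.items()}

# Hypothetical example: 90 communities, one control and two treatment arms.
communities = [f"community_{i}" for i in range(90)]
arms = ["control", "ffa_standard", "ffa_women_only"]
assignment = assign_arms(communities, arms)

# Simulated endline outcome (e.g. a consumption score), purely illustrative.
rng = random.Random(0)
effects = {"control": 0.0, "ffa_standard": 0.4, "ffa_women_only": 0.6}
outcomes = {u: rng.gauss(2.0 + effects[a], 0.5) for u, a in assignment.items()}

means = mean_outcome_by_arm(assignment, outcomes)
# Each arm's estimated effect is its mean minus the control mean.
for arm in arms[1:]:
    print(arm, round(means[arm] - means["control"], 2))
```

Because assignment is random, the control arm's mean is a credible counterfactual for what the treated communities would have looked like without the program, which is the core of the design described in the talk.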
20:34 Okay, so now I'm going to briefly reflect on some lessons learned over the last five years since we started this work. The first set of lessons comes out of an independent review at the end of the pilot phase. Between 2019 and 2021, those two years were considered a pilot phase for the WFP impact evaluation strategy, for two reasons: first, because we were not sure what kind of demand there would be, or whether it would be feasible in WFP, so it was a way to learn what works; and second, it aligned with the new policy, which we knew was going to be published in 2022.
21:05 Overall, the review found that there was positive feedback. Most of the country offices involved with impact evaluation saw benefit. There were struggles, for sure, and issues early on, but most of them were able to deliver an impact evaluation design that they saw value in, and were overall happy with the idea of doing more impact evaluations. Actually, one of the big complaints was that there was more demand from country offices than what we, as WFP's Office of Evaluation, were currently meeting. Next.
21:35 The other thing the review recommended is that we start to think a bit more about how we deliver impact evaluations. It recommended having more capacity in the Office of Evaluation to support impact evaluations; again, we didn't have a unit at this point, it was really just a very small team of a couple of people working in Rome. So it recommended more investment in in-house capacity, to make sure that the process of engaging in impact evaluations is as smooth as possible. It also suggested focusing more on capacity building, and linking more with country offices and the global academic communities in the countries where we work. It also pointed us towards broadening the methods we were using: we were doing mostly RCTs, but at that point we weren't doing much on top of the RCT work in terms of qualitative or other evidence. It also said we should do more events, like the one we're doing now, to improve awareness of the strategy and make sure that people are aware of the opportunities. And finally there was a recommendation to think about how we institutionalize impact evaluation in WFP, so that it just becomes business as usual when appropriate. Next.
22:43 We immediately responded to that. We just finalized an impact evaluation in El Salvador, which is going to have its endline workshop here in another month or so. Following the endline data collection last year, we worked with a qualitative PI (principal investigator) on a qualitative study that looked at differences already visible in the endline data from the quantitative work. And what you see is, yes, we see improvements, a positive impact in terms of quantitative measures of household consumption, on the left-hand side. But on the right-hand side you see one of the quotes that came from the qualitative work, which used a qualitative sampling strategy that tried to unpack different experiences. There it says that at the time my children got sick, I had to buy medicine, and they had the money they needed to actually do that. So it's a really nice way to paint a much richer picture than what would normally be just a little graph on the left. Next.
23:36 The other thing we're doing, in countries where we have high-frequency data, which is our resilience window, is that we've developed high-frequency data dashboards. These allow country offices, every time we have new data from those high-frequency surveys, which run every two months, to see within a short period the changes over time in the different outcomes, but also in things like coping mechanisms and other things on which we collect data. Next.
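A dashboard like the one described here essentially collapses household-level survey rounds into per-outcome trend lines. The sketch below shows that aggregation step under invented assumptions (the field names `round`, `fcs`, and `coping` and the numbers are hypothetical, not WFP's actual data model):

```python
from collections import defaultdict
from statistics import mean

def outcome_trends(records):
    """Collapse household-level survey records into per-round averages.

    Each record is a dict with a survey 'round' label and one or more
    numeric outcome fields; returns {outcome: [(round, mean), ...]}
    sorted by round, which is what a trend chart would plot.
    """
    buckets = defaultdict(lambda: defaultdict(list))
    for rec in records:
        rnd = rec["round"]
        for field, value in rec.items():
            if field != "round":
                buckets[field][rnd].append(value)
    return {
        field: sorted((rnd, mean(vals)) for rnd, vals in rounds.items())
        for field, rounds in buckets.items()
    }

# Hypothetical bi-monthly rounds with a food consumption score and a
# coping strategies index per household.
records = [
    {"round": "2022-01", "fcs": 42.0, "coping": 8.0},
    {"round": "2022-01", "fcs": 38.0, "coping": 10.0},
    {"round": "2022-03", "fcs": 45.0, "coping": 7.0},
    {"round": "2022-03", "fcs": 47.0, "coping": 5.0},
]
trends = outcome_trends(records)
print(trends["fcs"])     # per-round average food consumption score
print(trends["coping"])  # per-round average coping index
```

The point of the bi-monthly cadence is that country offices see these averages move between rounds, rather than waiting years for an endline report.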
24:05 And then finally, as mentioned, the humanitarian workstream is moving away from, I guess, traditional RCT methods and doing what we call A/B testing. That's based on the recognition that in an emergency setting, people need support; the question is not really whether they need support, it's about when or how they get it. So here's an example of forecast-based financing, where we have a little bit of money up front with which we can actually support households before shocks happen, and then two versions of responses. Now you could say it's always great to do it early, but there are good arguments to question whether that's true, particularly for flood responses: if the markets are completely destroyed, or there's a change in prices or in the purchasing power of the transfer within a short period of time, there are good reasons to think about whether the timing is right for different households. Similarly, not every household will be affected the same, so there are also gains in understanding who's most vulnerable, or who needs the most support in reconstruction. Next.
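In spirit, such an A/B test compares two operational variants where everyone receives support and only the delivery (here, the timing) differs. A toy sketch with invented numbers, assuming we compare a mean post-flood recovery score between an "early transfer" and a "post-shock transfer" group:

```python
from statistics import mean, stdev
from math import sqrt

def ab_compare(group_a, group_b):
    """Difference in means between two variants, with a rough
    Welch-style standard error to gauge how precise the estimate is."""
    diff = mean(group_a) - mean(group_b)
    se = sqrt(stdev(group_a) ** 2 / len(group_a) +
              stdev(group_b) ** 2 / len(group_b))
    return diff, se

# Invented outcome data: e.g. a post-flood food security score per household.
early = [61, 58, 64, 66, 59, 63, 60, 65]       # transfer before the shock
post_shock = [55, 57, 52, 60, 54, 56, 58, 53]  # transfer after the shock

diff, se = ab_compare(early, post_shock)
print(f"early minus post-shock: {diff:.2f} (SE {se:.2f})")
```

Unlike a classic RCT, there is no untreated control group here, which matches the ethical point made in the talk: in an emergency, the comparison is between ways of delivering assistance, never between assistance and nothing.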
25:07 Okay, so that's me; that's everything I was going to say today. This QR code, if you get a chance, you can scan it, and maybe we could pull it up again later. It will allow you to join our mailing list, and then we can make sure that you get our future newsletters. Okay, over to you, Zlata.
25:24 Thank you very much. At the end of my presentation I'll return to this slide and leave it up for people to scan. So we'll move on to UNICEF's strategy and what we are doing.
25:43 The WFP example is very inspiring, at least to us, and I often say that we are three years behind WFP in terms of developing this area, but we are definitely making our first steps.
26:01 The first step was also developing a strategy. We call it the Evaluation of Impact Strategy, reflecting the broader definition, or the broader perspective, on this area of work. We have also gone through very long, extensive consultations, across the evaluation function of UNICEF but also externally, and now we are in the design and layout stage, and hopefully the document will be available very soon.
26:32 But before I start and share with you some key pillars of this strategy, I would just like to echo the key messages, the key takeaways, that were presented at the very beginning, with which I fully agree, but also say that, when designed and planned well, impact evaluation, or high-quality rigorous impact evidence, is a very powerful tool to make a difference. And it is not an assumption; it's actually a fact.
27:06 Just to give you a very few recent examples (there are definitely more) of UNICEF evaluations that were commissioned very recently and already had some very positive influence on government decisions to scale up, but also on programmatic changes and programmatic learning.
27:27 Mozambique, social protection: the rigorous impact evaluation, which was conducted alongside a process evaluation, demonstrated very positive results of the Cash Plus model (cash transfers plus case management and a social and behavioral communication package), and this enabled the government to scale up from the pilot phase, which reached around 15,000 children, to a decision to reach 250,000 families with children within the next two years.
28:05 India, the national sanitation project: the evaluation of the economic and financial impacts of the Swachh Bharat Mission led to the sanitation cabinet's decision to provide additional government funding of 18 billion to increase the programming within the next four years.
28:31 Nigeria: I chose this example because of its actual influence on programmatic changes and adjustments. The impact evaluation of the volunteer community mobilizers network on polio eradication helped the UNICEF country office advocate for the retention of 17,000 volunteer community mobilizers from the polio eradication campaign, but also helped the country office make adjustments and changes in the health strategy and social and behavior change communication programming.
29:16 So there is of course an institutional rationale for UNICEF to strengthen work at the outcome and impact levels. First of all, the new Strategic Plan 2022-2025 has, for the first time, an explicit focus on outcomes, and we have a mandate to support that as a change strategy. This relates to accountability, and another aspect of accountability is the increased pressure from donors and the executive board to demonstrate the effectiveness of UNICEF investments.
29:58 Learning is critical, and as was said previously, learning is probably the core element of this work. UNICEF programs are becoming more innovative and more integrated, and we have to test interventions as they are being implemented and rolled out, simply to know what works and what doesn't, and not to waste money and effort. Secondly, our humanitarian programming is rapidly expanding, and investment in humanitarian programming is also rising, so we need to find new, innovative ways to show results in that area.
30:44 So our vision is both outward-looking and inward-looking. Outward-looking, we think that better impact evidence will help and support national systems and policies, by facilitating UNICEF advocacy and supporting national partners in their decisions to scale up child-focused policies and programs. Inward-looking, it simply contributes to improved organizational effectiveness, through better allocation of limited public resources.
31:28 I have to say that our approach is slightly different from WFP's, in that we recognize that UNICEF's areas of work are very broad, and UNICEF works upstream as well as downstream. The work on advocacy, governance, and public finance for children, for instance, is as important to UNICEF as our intervention-type programming. Therefore, recognizing the internal and external discourse in this area, we see that programmatic parameters define the evaluative purpose: we need to look at the nature of the intervention, the nature of the outcome we are looking at, the nature of the program, and the kind of questions we ask (are they causal questions or not?), and then we see which track we take.
32:35 If causal attribution to outcomes through a credible counterfactual is possible and feasible, then we go for impact evaluations, in which specific quantitative methods are best positioned to fit this requirement.
32:59 If the program or interventions are more suited to causal contribution, then we need to apply theory-based, non-experimental methods, which are also available and also very credible for specific research and evaluation questions.
33:20 so we also did some
33:23 diagnostics on where we are at the
33:26 beginning of this journey and UNICEF
33:28 has done quite
33:30 a few evaluations and some of them
33:34 are very well known specifically
33:37 thanks to the Transfer Project in
33:40 social protection but we identified
33:44 overall that UNICEF
33:47 commissioned or conducted 36 impact evaluations over
33:50 a five-year period this is about
33:53 six percent of the total number of
33:56 evaluative products produced
33:59 we see a substantial thematic and
34:03 geographic disparity and this is one of
34:05 the challenges social protection is
34:08 very well covered but other
34:12 areas like ECD nutrition adolescent
34:15 programming and even child protection
34:17 have very very little
34:20 credible rigorous evidence available
34:24 in terms of methods UNICEF
34:28 evaluations to date have
34:31 included both RCTs and experimental
34:35 designs but also 21 were
34:37 done using quasi-experimental approaches
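One of the quasi-experimental approaches mentioned here, difference-in-differences, compares the change in an outcome for a treated group against the change for an untreated comparison group over the same period. A minimal sketch with simulated data (all numbers hypothetical, not from any UNICEF evaluation):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000
# Row 0: comparison group, row 1: treated group, at baseline.
baseline = rng.normal(50, 5, size=(2, n))
trend = 2.0    # common time trend affecting both groups
effect = 3.0   # true (simulated) program effect

# Endline outcomes: both groups drift with the trend; only the
# treated group additionally receives the program effect.
endline = np.vstack([
    baseline[0] + trend + rng.normal(0, 1, n),           # comparison
    baseline[1] + trend + effect + rng.normal(0, 1, n),  # treated
])

# Difference-in-differences: subtracting the comparison group's
# change removes the shared trend, leaving the program effect.
did = (endline[1].mean() - baseline[1].mean()) \
    - (endline[0].mean() - baseline[0].mean())
print(f"difference-in-differences estimate: {did:.2f}")
```

Under the parallel-trends assumption baked into the simulation, the estimate lands close to the true effect of 3.0.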
34:42 so we identified a number of challenges
34:45 also through an online survey of
34:48 our staff I'm not going
34:52 to read or elaborate on those
34:54 you're all very familiar with them
34:57 including high cost and resources and
35:00 low awareness and capacity on the ground
35:04 of staff and partners to
35:05 understand the feasibility
35:07 parameters and requirements of impact evaluations
35:12 so based on that we developed three
35:14 strategic pillars
35:16 and I will just say a few words about
35:19 each of them and what we are
35:20 currently doing
35:23 the first pillar is to increase
35:26 initiation coverage and requirements for
35:29 impact evaluations across UNICEF again
35:32 as I said UNICEF has been doing impact
35:34 evaluations for a long time but what is
35:37 different now is that we are trying to
35:40 develop similar to WFP a more strategic
35:44 holistic approach to generating this
35:47 evidence identifying the areas of high priority
35:52 within the current strategic plan
35:56 the areas with the least coverage and
35:59 trying to stimulate demand and
36:02 initiation in those thematic and
36:04 geographic areas
36:08 so one way to do it is to explore
36:10 we don't call them thematic windows but
36:12 basically it is some priority areas
36:18 we would like to use the impact
36:20 evaluation Catalyst Fund
36:23 that will basically provide
36:26 matching contributions to
36:30 country offices to initiate
36:32 rigorous impact
36:34 evaluation evidence
36:36 the first step is again a slightly
36:39 different approach from WFP we start
36:42 from a multi-country impact evaluation
36:45 feasibility assessment and we have
36:48 completed one on child marriage just
36:52 now jointly with UNFPA and we have
36:55 started one on mental health
36:57 and psychosocial support and I will say
37:00 very briefly what it entails in
37:02 terms of methodology
37:06 second we will start very soon a very
37:08 comprehensive impact evaluation package
37:11 for adaptive social protection in
37:15 partnership with BMZ which includes four
37:18 impact evaluations in fragile contexts
37:22 here the interesting model is that
37:25 it's a comprehensive evidence project
37:28 so it's not only impact evaluations but
37:30 also operational research and data innovation
37:35 and finally we work on the supply side
37:38 so we are trying to establish
37:40 partnerships with academic institutions
37:44 and other actors and we also worked on
37:46 long-term agreements for impact
37:48 evaluations
37:55 just to give you a sense of the
37:57 multi-country impact feasibility
37:59 assessment which we consider
38:04 a systematic expert-driven and
38:07 strategic approach to understand what is
38:09 feasible what are the opportunities and
38:12 limitations to conduct
38:16 impact evaluation portfolios it
38:18 basically consists of four steps
38:19 starting from
38:21 stocktaking of the literature
38:25 mapping interventions to
38:28 identify global gaps against UNICEF
38:30 interventions then a programmatic deep
38:34 dive into a selected list of countries
38:36 and then final recommendations
38:41 the methodology includes developing
38:44 very specific and clear criteria
38:49 for selecting country cases including
38:52 political interest or demand for
38:54 rigorous evidence at the country level
38:57 at the end of the day we want to scale
39:00 up we want this evidence to be used and
39:03 operational feasibility meaning the
39:07 existence or availability of strong
39:09 data collection companies and national
39:12 partners that could support us policies
39:14 and so on
39:18 Pillar 2 focuses on diversification
39:20 of methods and innovation and
39:23 this work is very similar in a
39:28 way to what Jonas presented we also
39:29 will look at
39:32 we're already looking at
39:34 better utilization of secondary
39:39 data sources administrative data and
39:42 household surveys to utilize
39:46 quasi-experimental designs right now we
39:49 are focusing on the humanitarian portfolio
39:51 because this is where the evidence is
39:54 really lacking not only for UNICEF
39:59 but globally and for that work we really
40:03 focus on outcomes we
40:05 recognize the short and
40:08 intermediate-term outcomes that are usually
40:11 applied in humanitarian settings
40:15 this is with credit to our
40:19 regional ECAR office which initiated
40:23 and launched the so-called
40:26 digital RCTs low-cost
40:30 RCTs of digital services or
40:33 digital applications basically using big data
40:38 the results will be available
40:40 at the end of June and we hope to
40:43 replicate these models for other regions
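As a sketch of what a low-cost digital RCT can look like (the arm names, engagement rates, and user ids below are hypothetical, not from the regional pilots described here): users of a digital service are deterministically randomized into two arms and their engagement rates compared.

```python
import hashlib
import random

def assign(user_id: str) -> str:
    """Deterministically assign a user to arm A or B by hashing the id,
    so each user always lands in the same arm across sessions."""
    return "B" if hashlib.sha256(user_id.encode()).digest()[0] % 2 else "A"

# Simulate engagement with the digital service (rates are made up):
random.seed(1)
true_rate = {"A": 0.10, "B": 0.14}
counts = {"A": [0, 0], "B": [0, 0]}  # [engaged, total] per arm

for i in range(20_000):
    arm = assign(f"user-{i}")
    counts[arm][1] += 1
    counts[arm][0] += random.random() < true_rate[arm]

rate = {arm: engaged / total for arm, (engaged, total) in counts.items()}
lift = rate["B"] - rate["A"]
print(f"A: {rate['A']:.3f}  B: {rate['B']:.3f}  lift: {lift:.3f}")
```

Because assignment and outcome logging can ride on the service's own telemetry, this style of design avoids the survey costs that dominate conventional impact evaluations.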
40:47 and generally in terms of methods we
40:53 promote mixed approaches with
40:55 very close integration of process
40:58 related evaluation questions with impact
41:01 evaluation questions
41:03 using so-called nested approaches
41:06 that would help respond to the
41:10 short-term learning needs of the program
41:13 implementers but also contribute to
41:17 global learning in the longer term
41:23 and the third pillar is basically
41:25 capacity building and learning again
41:27 addressing some of the challenges of low
41:31 awareness and low understanding among
41:34 program staff and evaluation staff in
41:37 the field of what the requirements are for
41:40 doing rigorous and credible impact
41:43 evaluations
41:46 but at the same time as I said
41:50 before we do promote and we do try to
41:51 support our
41:55 regions and country offices with other
41:57 non-experimental
42:01 methods contribution analysis
42:04 qualitative impact protocol and
42:06 process tracing
42:08 we are developing a methods guide
42:11 series that will provide
42:13 user-friendly versions of those
42:17 methodologies to support summative
42:20 evaluations that would want to ask
42:23 causal questions
42:26 and look at contribution
42:28 we are interested in developing national
42:31 capacity for impact evaluation and some of
42:34 our ideas are to develop a network of
42:37 academic institutions of the South
42:40 and young evaluator fellowships that
42:43 would support and give opportunities
42:44 for young evaluators and young
42:46 researchers to work on impact evaluations
42:52 so our immediate priorities are the
42:54 institutionalization of impact
42:56 evaluations in the new evaluation policy
43:01 the revision of which is ongoing right now
43:03 testing more cost-efficient methods
43:06 and I saw the question in the Q&A about
43:10 costing it was probably the first
43:12 question and very
43:15 relevant for this area of work so
43:19 the A/B testing that Jonas
43:22 mentioned is what we are going to
43:24 try to do as well
43:27 and we are planning this year
43:29 to launch these multi-country
43:31 initiatives specifically in
43:32 adaptive social protection child
43:35 protection and nutrition
43:52 Celeste I don't see Stefan joining
43:55 the panel Stefan
43:58 no I did not see him let me check so we
44:00 will wait for him yes because as I said at the
44:04 beginning he is from GIZ and will say a few
44:06 words but we can start
44:11 addressing some of the questions and maybe
44:15 Jonas if you want to answer the question
44:18 about costing
44:22 I mean I don't know
44:26 it ranges over a very large amount in
44:28 WFP the cost of our impact
44:30 evaluations comes down to the country
44:33 context the number of survey rounds and
44:36 the timeline of the impact
44:39 evaluation so at the low end we see
44:41 impact evaluations that are I guess what
44:43 we call more lean so between two and
44:45 three hundred thousand dollars and again
44:48 those are often ones that harness
44:50 monitoring data are more about the A/B
44:52 testing designs and are not going for
44:54 many years at the high end it can go way
44:57 above that if we're running surveys over
45:00 three to five years and
45:02 providing constant support
45:05 it can be multiple times that so just
45:07 for context that's actually not very
45:09 high compared to a lot of WFP's
45:11 centralized evaluations or decentralized
45:13 evaluations so you can do impact
45:15 evaluations for about the same cost as
45:18 any evaluation but you can also spend a
45:20 lot more if you have a reason to and
45:21 there are I guess important reasons to
45:24 collect data over time
45:28 thank you so I see Stefan has joined
45:32 thank you hi Stefan
45:35 I introduced you at the beginning
45:38 but I will say it again Stefan Beierl
45:42 social protection advisor at GIZ our
45:47 partner and we have been working with
45:50 Stefan in the last year quite
45:53 intensely developing the
45:56 portfolio on impact evaluation for
45:59 adaptive social protection and we
46:01 hope Stefan will give some of his thoughts
46:04 from the programmatic side on the
46:08 potential of this area of work
46:16 thanks zlata and hi everyone
46:18 sorry I was connected but in
46:19 spectator mode so that's why you
46:22 probably couldn't see me
46:27 let me maybe say a few things about
46:30 my current position and then I can
46:32 situate within that the way
46:35 the kind of evidence that
46:37 UNICEF is generating
46:40 comes in very handy there and how
46:43 useful that kind of work is and some
46:45 other reflections related to that I hope
46:46 that sort of goes into the intended
46:48 direction otherwise
46:50 feel free to steer in a different direction
46:54 yeah so currently being in the sector initiative
46:58 or sector program means that I'm
47:01 primarily tasked with advising
47:04 BMZ's sector unit on social protection
47:08 which means we very often have to
47:12 very rapidly react to all sorts of
47:15 different social protection related
47:18 questions that can sometimes be quite
47:20 political but also sometimes be very
47:22 concrete operational questions when it
47:25 comes to just quickly
47:26 responding to something that we hear
47:27 from some
47:30 operational programs that we're
47:32 implementing or whatever yeah so therefore
47:33 therefore
47:35 in that work
47:37 it is
47:41 absolutely vital that something like for
47:43 example the transfer project has done
47:45 over many years has been done because
47:50 without that my job would be so much
47:53 harder you know so it's
47:56 it's really invaluable and I think very
47:58 often for for the policy makers they
47:59 don't they don't see that side so much
48:02 because in a way we're the mediators between
48:03 between
48:07 researchers and evaluators on the
48:10 one hand and the policy makers on
48:13 the other hand yeah but for us in this
48:16 intermediate position that kind of
48:19 rigorous evidence
48:21 combined with operational
48:22 insights all the different kinds of
48:25 evaluation approaches that
48:27 zlata also highlighted
48:29 bringing that together and clearly
48:31 teasing out the gist and showing how
48:33 solid it is and what we know for which
48:35 context and all that having done that
48:38 in such a comprehensive way for social
48:40 protection and that
48:43 example of the Transfer Project
48:45 is absolutely invaluable so I'm very
48:48 happy and then leading now to what
48:49 zlata also has indicated that now
48:53 through the BMZ UNICEF partnership it
48:56 very much looks like there's going to be an
48:58 ASP so adaptive social protection
49:02 related evaluation partnership and my
49:04 hope is very much that what has been
49:06 achieved before for
49:08 one area will now be done for another
49:11 area so that going forward again my work
49:13 which is currently focused a lot on the
49:15 adaptive side of social protection will also
49:17 become much much easier because we can
49:19 stand on that firmly established
49:21 knowledge and we don't always have to go
49:23 back to the drawing board because we can
49:26 basically stand on those established
49:30 facts so this is something that
49:33 I personally feel is maybe under
49:35 appreciated sometimes so I just wanted
49:39 to flag that to everyone who is
49:41 working in evaluation and so on I've
49:42 worked in that area beforehand so
49:44 I know that it sometimes
49:46 can actually be a bit frustrating and
49:47 maybe sometimes one doesn't quite know
49:50 how it then filters into something but I
49:52 just want to say it actually is
49:53 incredibly valuable even though you
49:56 might sometimes not see it and
49:59 therefore I'm very happy that
50:00 to the extent that my current
50:02 position allows it I can still
50:05 facilitate between those two worlds and
50:07 sort of have at least one foot
50:10 still in that area
50:12 so I'll leave it at that but feel free
50:14 if you had any discussions earlier where
50:16 I wasn't there yet that I should get
50:17 into to bring them up
50:19 thank you
50:24 I just will reflect by saying that
50:27 it's a very common argument
50:31 that evidence from impact evaluation is
50:36 very contextualized it's only
50:39 applicable to one specific context it
50:43 cannot be replicated or generalized
50:44 this is true
50:48 unless the impact evaluation evidence is
50:50 generated at scale
50:54 and the Transfer Project showed us an
50:57 example of that scaling up of evidence
51:00 and then we reach the
51:03 saturation level we basically know that
51:07 okay this works in context A in context
51:11 B in context C then it means that it's
51:15 likely to work in context F right so
51:18 that's why the whole point of the
51:20 strategic approach of planning and
51:22 developing impact evaluation portfolios
51:25 in specific thematic windows or
51:29 priority thematic areas is to build this
51:31 evidence base that we are talking about
51:34 that allows us to
51:38 replicate the most successful
51:40 and most transformative interventions
51:44 but let's move on thank you so much
51:48 for your intervention and let's move to
51:51 the questions
51:54 Jonas do you see the questions
51:59 hi sorry this is Malika from UNICEF I
52:01 think I can moderate some of the questions
52:03 considering we
52:05 just have five minutes left
52:07 so the first question is for
52:10 Jonas the audience members wanted to
52:12 know how you integrate the different
52:14 components so literature review qualitative
52:15 and quantitative to answer
52:18 puzzling results and just related to
52:19 that if you could speak a little bit
52:22 about the feasibility assessment
52:24 to decide on the impact evaluations
52:27 in WFP yeah sure I mean very much
52:30 similar to what was presented by
52:33 zlata we do see the literature review as a key
52:34 starting point I mean when we do the
52:36 literature review we're looking at not
52:38 only which interventions were effective
52:39 but also what were the outcome measures
52:42 they used so we do also look at the
52:44 actual modules that were used and the
52:46 data that's available
52:47 for the cash and gender window that included
52:49 doing a meta-analysis actually of some
52:52 30 impact evaluations done prior to our
52:54 window to understand again what is the
52:57 average impact across many many studies
52:59 of a transfer and does that vary by
53:01 transfer size or by recipient so it
53:04 very much informs the pre-analysis plan
53:06 for the window now the windows keep going
53:07 but we do have multi-country
53:09 pre-analysis plans and so this design
53:12 where we were varying the targeting
53:14 towards women was informed by that
53:16 literature and then that design was
53:18 registered so we do register all our
53:20 designs and that was in the American
53:23 Economic Association registry
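The kind of meta-analysis described here, estimating the average impact across many studies, is often done with inverse-variance weighting: each study's effect estimate is weighted by the inverse of its variance, so more precise studies count for more. A minimal fixed-effect sketch with made-up study estimates (not the actual cash-and-gender numbers):

```python
import math

# (effect estimate, standard error) per impact evaluation; illustrative only.
studies = [
    (0.12, 0.05),
    (0.20, 0.08),
    (0.05, 0.04),
    (0.15, 0.06),
]

# Inverse-variance weights: weight_i = 1 / se_i^2.
weights = [1 / se**2 for _, se in studies]

# Pooled effect is the weighted mean of the study estimates.
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)

# The pooled standard error shrinks as studies accumulate.
pooled_se = math.sqrt(1 / sum(weights))
print(f"pooled effect: {pooled:.3f} ± {pooled_se:.3f}")
```

A full analysis of 30 heterogeneous evaluations would typically use a random-effects model and test whether the effect varies by moderators such as transfer size or recipient, but the pooling logic is the same.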
53:25 in terms of the qualitative this is
53:26 something that we've been playing around
53:29 with I have in past lives as both
53:31 academic and bureaucrat used
53:34 qualitative work to inform design so
53:36 I've personally been engaged in impact
53:38 evaluations where ethnographic work was
53:40 used to select the interventions to
53:42 test in the cases so far because
53:43 we're really focusing on WFP interventions
53:48 that are again high value and need to
53:50 be tested we've actually so far used the
53:53 qualitative work to unpack different
53:55 well anomalies or subgroups that we
53:57 see in the endline data so for El Salvador
53:59 it was looking at participation rates
54:01 household characteristics different
54:03 things that were not obviously
54:06 answerable using the survey data that we
54:08 had and then sampling from those to
54:10 understand those differences in more
54:11 detail using the qualitative work but
54:13 we're very flexible I mean we're
54:15 constantly learning while we do and
54:16 thinking about how to integrate those
54:19 two pieces and then again the intention
54:20 is to always feed back into the program
54:28 thank you the next questions are for
54:30 zlata
54:34 how many of the UNICEF 36 impact
54:36 evaluations were demanded by donors and
54:38 also if you could elaborate on the
54:41 second strategic pillar with some examples
54:48 it is difficult to say for sure
54:52 the Transfer Project which
54:54 constitutes the bulk of impact
54:56 evaluations in the social protection
55:02 window so to say
55:05 was funded by donors it was a
55:08 partnership and still is between the
55:10 University of North Carolina at Chapel Hill
55:16 UNICEF and FAO and my understanding is
55:19 it was donor funded by DFID and
55:21 other donors
55:26 for the others it's very difficult to say
55:29 sometimes it is part of a
55:31 bigger
55:34 program proposal
55:39 to the EU or to another country and
55:41 it's included already in the donor-funded project
55:45 so I would say probably
55:47 a substantial number of impact
55:50 evaluations are part of or
55:52 funded by donors
56:04 I think they're curious you know are
56:06 there any practical examples of the
56:13 mixed methods I think it's a
56:15 general approach for us for all
56:18 impact evaluations now to combine
56:21 quantitative and qualitative work
56:23 I think it would be very difficult
56:26 to find an impact evaluation
56:29 commissioned by UNICEF without any
56:32 substantive qualitative components so
56:36 it's more or less a common practice
56:38 the Mozambique example that I showed at
56:41 the beginning was done in parallel
56:45 with a process evaluation but was
56:47 pretty much integrated and
56:50 information was triangulated between the
56:54 results of both evaluations and for
56:57 non-experimental methods the methods
56:59 guide that we are preparing we
57:02 specifically searched and found a
57:05 number of specific examples and they
57:07 will be included in the methods guide
57:14 thank you thank you zlata and
57:17 in terms of time I think we are close
57:20 to wrapping up we just have a few minutes
57:27 well I want to thank everyone for the
57:31 questions I'm scrolling now through the
57:33 questions and answers and we will try to
57:37 answer them after this session
57:41 both Jonas and I so bear with
57:45 us and really thank you for
57:48 joining us today
57:51 thank you very much and do not
57:53 hesitate to reach out and ask any
57:55 follow-up questions some of them
57:56 will probably
57:59 need to be directly followed up
58:03 thank you very much everyone Jonas and
58:06 Stefan particularly and Malika and
58:08 Celeste thank you for the support
58:11 thank you everyone have a great day