0:02 welcome everyone to another Deep dive
0:04 into the world of project management and
0:07 uh today we're tackling something called
0:08 MEAL
0:13 monitoring evaluation accountability and
0:15 learning I like it and by the end of
0:17 this deep dive you'll not only
0:20 understand what makes projects tick but
0:21 also be able to ask the right questions
0:23 about any project you come across like a
0:26 detective almost for projects so let's
0:28 uh let's break it down monitoring and
0:31 evaluation I feel like those often get
0:33 used interchangeably yeah they do but
0:35 they're distinct absolutely monitoring
0:38 is like taking the pulse of a project
0:40 okay making sure it's on track you're
0:41 looking at are we reaching the intended
0:43 number of people are we staying within
0:45 budget evaluation on the other hand goes
0:48 a bit deeper it asks did we actually
0:49 make a difference was it worth the
0:51 resources the source material had a
0:53 great table comparing the two it really
0:56 showed how monitoring is this continuous
0:58 process and then evaluation is more
1:00 periodic right periodic and involves
1:02 external experts yeah and that makes
1:04 sense because monitoring is providing
1:07 the raw material if you will for
1:09 evaluation kind of like a doctor using
1:12 regular checkups to gather information
1:15 that informs a deeper diagnosis so they
1:17 work together then absolutely hand in hand
1:19 monitoring feeds into evaluation and
1:21 then evaluation can then lead to
1:23 adjustments in how we monitor exactly
1:25 okay so I've got monitoring we've got
1:27 evaluation where do accountability and
1:30 learning fit into all this well
1:31 accountability it's about demonstrating
1:35 responsibility okay using data to show
1:37 that the project is using resources
1:40 effectively and achieving those results
1:42 then you move into learning which is all
1:44 about taking those valuable lessons from
1:46 what worked what didn't work and using
1:49 them to improve future projects so it's
1:51 not just a one-off not at all we're
1:53 constantly learning constantly adapting
1:56 okay now the source material outlines
1:59 five phases of MEAL yes and it's
2:01 interesting because it's a loop it is a
2:03 cycle it's not linear it ensures that
2:06 you have that ongoing learning and
2:08 Improvement you're always checking in
2:12 assessing yeah adapting okay that makes
2:16 a lot of sense all right logic
2:19 models ah yes the backbone yeah the road
2:22 map of a project they visually map out
2:24 how it should
2:27 work connecting activities to intended
2:29 outcomes now there are different types
2:31 okay but we will focus on theory of
2:32 change
2:37 ToC and the results framework or RF so
2:38 theory of change that's like the big
2:40 picture document right exactly the grand
2:43 vision it defines those long-term goals
2:45 the preconditions needed to reach them
2:46 and this is really important the
2:48 assumptions assumptions yeah you have to
2:50 consider those factors outside the
2:52 project's direct control that need to be
2:54 true for the project to succeed so even
2:57 if the project team is doing
2:58 everything perfectly there are these
3:00 external factors external
3:03 factors influence success absolutely and
3:05 then results framework seems a little
3:08 bit more more focused it zooms in on the
3:11 specific things that the project team is
3:12 managing okay like in the Delta River
3:14 IDP project the
3:17 ToC identified multiple areas of need to
3:19 improve the lives of internally
3:23 displaced persons but the RF narrowed
3:26 the scope to what Unite Us the
3:28 organization implementing the project
3:30 was directly responsible for like
3:33 improving access to clean water and
3:35 hygiene so the ToC is kind of like the
3:38 big why the why and then the RF gets
3:41 into the how now assumptions so those are
3:42 interesting to me yes very important
3:44 because it's like
3:46 acknowledging that there could be these
3:48 risks these roadblocks that can really
3:50 make or break a project The Source
3:52 material calls the really critical ones
3:54 killer assumptions killer assumptions
3:57 yeah and a good example is in the
3:58 Delta River Project a killer
4:00 assumption was that the
4:03 government would provide latrines and
4:05 water systems if that didn't happen
4:07 whole project could have been severely
4:09 impacted wow so you really can't just
4:11 assume that everything's going to go as
4:13 planned so we've got to identify those
4:16 potential roadblocks be ready to adapt
4:19 yeah okay let's get into measuring
4:23 success so how do we track progress well
4:25 that's where indicators come in they're
4:26 the measurable factors that tell us if
4:28 we're on track and to make sure we're
4:30 using good indicators we use the SMART
4:33 acronym okay specific measurable
4:37 achievable relevant and time bound so a
4:39 good example from the Delta River
4:42 Project was by year three of the project
4:45 80% of IDPs demonstrate knowledge that
4:49 hands need to be washed with soap after
4:51 critical events there you go it's
4:53 specific measurable relevant I like it a
4:55 SMART indicator in action so we've got
4:57 our indicators now to actually gather
5:00 the data we need to choose the right
5:01 measurement methods right and that
5:03 brings us to the classic Duo
5:06 quantitative and qualitative data the
5:08 numbers versus the stories so
5:10 quantitative gives us those hard facts
5:13 and figures while qualitative helps us
5:16 understand perceptions experiences the
5:18 why behind the numbers and the source
5:20 material was saying that which one you
5:21 choose really depends on what you're
5:23 trying to measure absolutely your budget
5:25 the level of detail you need like to
5:27 measure handwashing behavior in the
5:29 Delta River Project they could
5:32 have used direct observation which is
5:34 more accurate but expensive or they
5:36 could have relied on questionnaires
5:37 which are
5:40 cheaper but maybe not as reliable it's
5:42 all about finding that right balance
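The questionnaire route described here comes down to simple arithmetic. As a minimal sketch, with entirely hypothetical survey responses (not project data), the Delta River SMART indicator could be computed and checked against its 80% target like this:

```python
# Hypothetical questionnaire responses: True means the respondent
# correctly stated that hands must be washed with soap after
# critical events. Illustrative data only, not from the project.
responses = [True, True, False, True, True, True, False, True, True, True]

TARGET = 0.80  # the SMART indicator's year-three target: 80% of IDPs

def indicator_value(answers):
    """Share of respondents demonstrating the target knowledge."""
    return sum(answers) / len(answers)

value = indicator_value(responses)
on_track = value >= TARGET
print(f"indicator: {value:.0%}, target met: {on_track}")
```

Direct observation would feed the same calculation; what changes is how much you trust each recorded True.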
5:43 okay and
5:47 then no matter which method you choose
5:49 data management yes super important it's
5:51 like keeping your kitchen tidy while
5:52 you're cooking I like that if everything
5:55 is organized and clean easy to find the
5:58 whole process runs smoother so we've got
6:01 data entry cleaning storage security
6:03 all those little details matter they all
6:04 matter this has been great so far we've
6:06 unpacked what MEAL is we've looked at
6:09 monitoring evaluation accountability
6:12 learning delved into logic models
6:15 assumptions SMART indicators choosing
6:17 the right data measurement methods
6:19 and even touched on data management and
6:21 we're just getting started we are oh
6:23 yeah in part two we're going to go even
6:25 further exploring how to analyze and
6:27 interpret that data and ultimately how
6:29 to use it to make a real difference well
6:31 so stay tuned everybody lots more on
6:39 MEAL welcome back deep divers in part
6:44 one we unpacked the basics of MEAL and
6:45 we saw how it's kind of like a
6:47 detective's toolkit for projects but
6:49 yeah just like a detective it's not
6:51 enough to just have the right tools you
6:54 need to know how to use them effectively
6:56 exactly and ethically we have to think
6:58 about those ethical considerations when
7:01 working with project data so today we're
7:06 diving into Data analysis interpretation
7:08 and most importantly data use the
7:10 exciting part let's start with analysis
7:11 we've gathered all this data we got our
7:14 spreadsheets our notes now what data
7:16 analysis is where we make sense of it
7:18 all think of it as sifting for gold okay
7:19 you've got a pan full of dirt and
7:21 Pebbles but you're looking for those
7:24 nuggets we use different techniques to
7:27 identify trends patterns and answer
7:28 those key questions so it's more than
7:31 just describing absolutely we want to
7:32 figure out why it happened and this is
7:35 where quantitative and qualitative
7:37 really start to work together right
7:39 exactly they complement each other
7:41 numbers give us one piece of the puzzle
7:44 the stories the experiences help us
7:45 understand the
7:48 nuances the why imagine we're evaluating
7:50 project that's trying to improve
7:52 maternal Health quantitative data might
7:55 show a decrease in maternal mortality
7:57 rates but
8:00 then the qualitative data
8:02 from interviews might reveal that while
8:04 the project led to better access to
8:07 health care transportation is still a
8:09 barrier that's a great example for a lot
8:11 of people so it adds that depth adds
8:13 context yeah but analysis is just the
8:16 first step then we need to interpret the
8:17 data that's right data interpretation is
8:19 crucial and this is where we connect the
8:22 dots draw conclusions and figure out
8:23 those implications this is where
8:24 critical thinking it is so important
8:26 this is so important we have to be
8:28 careful not to jump to conclusions or
8:30 let our biases
8:31 influence our interpretation and the
8:33 source material
8:36 emphasizes that no data set is perfect
8:38 right there's always uncertainty we need
8:41 to acknowledge that absolutely and not
8:43 overstate our findings we need to be
8:46 transparent yeah about our methods
8:48 assumptions and potential sources of
8:50 error I think it's also important to
8:52 remember that data
8:54 interpretation shouldn't happen in a
8:56 vacuum you're right it should be
8:58 collaborative yeah having the project
9:01 team stakeholders and the community
9:03 because then you get those diverse
9:05 perspectives exactly and it helps us
9:07 challenge our assumptions so analysis
9:09 interpretation those are great but
9:11 they're only valuable if they lead to
9:12 action and that's where data use comes in
9:14 data use where the rubber meets the road
9:16 let's put those insights into action so
9:20 imagine you're leading a project to
9:22 improve agricultural practices you
9:25 collect some data and you find out that
9:28 farmers are struggling to access quality
9:32 seeds data use would be using that
9:34 insight to maybe implement a program to
9:36 make seeds more available there you go
9:38 closing the loop closing the loop
9:41 between learning and action and this is
9:42 where adaptive management is so
9:45 important okay adaptive management being
9:48 flexible responsive willing to change
9:50 course based on the data based on what
9:52 the data is telling us and the source
9:54 material highlighted some key principles
9:56 like creating space for feedback being
9:58 data driven fostering a culture of
10:01 innovation that's all about learning and
10:04 adapting and data use isn't just about
10:07 internal improvements it's about
10:09 accountability transparency we need to
10:12 show our stakeholders our funders that
10:13 the Project's making a difference this
10:15 is where progress reporting comes in
10:17 exactly and those reports should tell a
10:19 compelling story not just charts and
10:22 graphs all right use visuals narratives
10:24 yeah storytelling so imagine you're
10:27 presenting to donors about the Delta
10:28 River Project instead of just saying
10:31 hand washing practices improved you show
10:32 them a heat map oh I like that it
10:34 demonstrates how that increase
10:38 correlates with a decrease in waterborne
10:40 illnesses powerful stuff so far we've
10:43 covered how to analyze and interpret data
10:46 use those insights to make decisions
10:48 adapt our strategies and communicate our
10:50 findings we've talked about critical
10:51 thinking
10:53 collaboration transparency but there's
10:56 one more piece to
10:59 explore evaluating project impact the
11:02 big one and that's what we'll be diving into
11:05 in part three of our MEAL deep dive I
11:07 can't wait stay
11:10 tuned welcome back deep divers we've
11:12 journeyed through the world of MEAL
11:15 learned how to monitor evaluate analyze
11:19 and use data to improve projects but
11:22 there's one final peak to conquer impact
11:25 the ultimate goal yeah did our project
11:26 really make a difference a lasting
11:28 difference it can feel like navigating a
11:30 maze though oh it can be tricky
11:33 isolating the effects of a project no
11:35 from all those other real world factors
11:37 it really is like solving a mystery you
11:38 have to piece together those Clues just
11:41 like a detective needs a good framework
11:43 we need a strong theory of change the
11:46 ToC to guide our impact evaluation because
11:49 it helps us map out how we expect our
11:51 project to create those changes exactly
11:52 without it it's like wandering around in
11:55 the dark you're lost you don't know if
11:57 you've arrived or how to get there the
11:59 ToC helps you identify those leverage
12:01 points okay where your project can make
12:03 the biggest difference so when it comes
12:06 to actually evaluating impact what are
12:07 some of the different approaches well
12:09 there are a few different types okay
12:11 you've got formative evaluations those
12:14 happen early on they help fine-tune the
12:15 approach then there's process
12:18 evaluations yes those focus on how well
12:19 the project is being implemented are
12:21 things running smoothly are we hitting
12:24 any roadblocks and then finally we've
12:27 got impact evaluations the big one
12:29 assessing the overall effectiveness I like
12:31 to think of it like different lenses
12:32 well that's a good way to put it you
12:34 know formative evaluations are like
12:37 using a magnifying glass okay to see
12:40 those fine details early on process
12:43 evaluations are like checking the engine
12:45 of a car making sure everything's
12:47 running smoothly and impact evaluations
12:49 are like taking that car for a test
12:51 drive see how it performs on the open
12:53 road but no matter what type of
12:55 evaluation we're doing there are some
12:57 guiding principles right absolutely the
13:00 source material highlighted
13:03 the OECD DAC criteria which are
13:05 pretty widely used for evaluating
13:07 development projects so we've got
13:08 relevance
13:12 efficiency effectiveness impact and
13:14 sustainability so starting with
13:16 relevance relevance making sure the
13:19 project addresses the real needs of the
13:21 community are we solving the right
13:23 problem then we've got efficiency using
13:25 resources wisely getting the most bang
13:27 for your buck then there's Effectiveness
13:30 are we achieving those outcomes the
13:33 milestones impact that long-term change
13:35 and sustainability are those positive changes
13:37 going to stick around even after the project
13:39 funding ends that's a really important
13:40 one it is and I think it's also
13:43 important to remember that evaluation
13:46 it's not about passing judgment yeah
13:48 it's about learning learning and
13:51 improving using those insights to do
13:52 better next time and being honest with
13:55 ourselves about what worked what didn't
13:57 it can't be just us oh absolutely
13:59 involving stakeholders so important
14:01 especially the community yeah the source
14:04 material mentions empowerment evaluation
14:06 where the community actually evaluates
14:09 the project shifting that power dynamic
14:10 yeah and
14:14 recognizing that the people closest
14:17 often have the most valuable insights
14:19 absolutely okay so let's get practical
14:21 what are some questions we should ask
14:24 when evaluating impact well a good place
14:26 to start yeah did the project achieve
14:28 its outcomes okay were those outcomes
14:29 achieved
14:31 efficiently did the project have any
14:34 unintended consequences good or bad good
14:36 or bad what factors contributed to
14:38 success or challenges those are great
14:40 questions and remember this isn't a
14:42 one-time thing no it's an ongoing process
14:45 let's revisit the Delta River IDP
14:46 project they used a bunch of different
14:49 methods they did focus groups household
14:51 surveys water quality testing and they
14:54 found that their efforts were making a
14:55 difference they really were they saw a
14:59 reduction in waterborne illnesses mhm
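The transcript doesn't give the actual figures, so purely as an illustration with invented baseline and endline numbers, a before/after comparison like the one described might be summarized as:

```python
# Invented illustrative numbers: reported waterborne-illness cases per
# 1,000 people, before and after the intervention. Not project data.
baseline_rate = 120.0  # cases per 1,000 at project start
endline_rate = 78.0    # cases per 1,000 at evaluation

def relative_reduction(before, after):
    """Fractional drop from baseline to endline."""
    return (before - after) / before

drop = relative_reduction(baseline_rate, endline_rate)
print(f"waterborne illness down {drop:.0%} from baseline")
```

A real evaluation would pair a figure like this with the qualitative evidence from the focus groups, exactly as the project team did.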
15:00 and from those focus groups they learned
15:03 that people felt empowered healthier
15:05 more they had the numbers yeah and the
15:07 stories the quantitative qualitative
15:09 evidence and they didn't stop there they
15:12 used those findings to advocate for
15:14 continued funding and support that's
15:16 what it's all about so as we wrap up
15:20 this deep dive any key takeaways well
15:23 MEAL it's a powerful framework for
15:26 designing implementing and evaluating
15:29 projects it's about ensuring that our
15:31 efforts lead to real results it's about
15:34 being accountable transparent embracing
15:36 learning and recognizing we don't have
15:38 all the answers but we can achieve
15:40 amazing things so next time you hear
15:43 about a project ask yourself how are
15:44 they measuring success are they
15:46 involving the community are they using
15:48 what they learn because those are the
15:50 questions that separate the truly
15:52 effective initiatives from the ones that
15:53 just go through the motions keep
15:56 exploring everyone keep learning and
15:57 keep striving to make a difference