0:02 Why Didn't You Test That? The podcast.
0:04 Want to improve your software? Join Huw
0:07 and Rich and their guests as they share
0:08 experiences and insights on the many
0:11 sides of better software delivery.
0:15 Hello again, we're back. It is Huw and
0:18 Rich once more, and we are joined by the
0:21 legend that is Paul Gerrard. I've known him
0:22 for many many years you may well have
0:24 listened to him speak at various
0:27 conferences. We normally end up
0:30 somewhere, sitting in a bar in Copenhagen
0:33 or somewhere like that, discussing
0:35 what's right and what's wrong
0:37 with the industry.
0:41 Um, yeah, so, you know, I think Paul
0:43 is a creative thinker, he's always
0:46 prodding and poking and challenging, and
0:49 I think that was probably one of the
0:51 key messages from his keynote at
0:54 EuroSTAR last year.
0:56 So, just to kind of
0:57 fill you in a bit, Paul:
1:00 I think Rich and I and our various
1:02 guests have explored quite a lot of
1:06 what's wrong with the
1:07 industry. You know, the reality
1:11 is that most systems are late and have
1:13 relatively poor quality,
1:15 with some massive
1:17 outliers where they're doing a really,
1:20 really good job. So maybe what
1:23 we'll try and do today is brainstorm a
1:25 little bit between us about
1:27 what people can actually do, what
1:31 sort of small, medium and long-term things, maybe
1:32 just starting with a few small things,
1:34 moving on to some medium things, that
1:36 have actually made
1:39 a significant difference to
1:41 the speed and the quality, and that's
1:44 kind of what it's about: maintainability,
1:46 testability, all of those types of things
1:47 that kind of drop out of what we are
1:50 really all about at the moment.
1:53 You know, I think
1:56 most managers would probably agree that
1:58 there are some issues,
2:01 and then I think, you know, if they can
2:03 just pick up maybe one or two things
2:04 here from us, that might lead them down a
2:06 path or might just actually give them a
2:08 few quick wins to improve a few things.
2:10 So I'm just going to hand over to Rich
2:12 now, so you can add a bit
2:15 of color, and then we'll unleash Paul
2:16 for a little bit and then maybe turn it
2:19 more into a conversation.
2:21 So I think, you know, based on my
2:23 experience, some of this might be
2:25 quite boring, right, and it kind of very
2:27 much feeds into the "go slower to
2:30 go faster" kind of mantra, in terms of, I
2:31 think, a lot of
2:34 companies, teams,
2:36 are very technology-centric and
2:37 they're looking for the next big
2:39 technology solution
2:41 to solve a problem that isn't
2:44 necessarily a technology problem.
2:47 And, you know, my history of
2:48 working within test teams and trying to
2:50 get automation working, I
2:53 kind of failed to do those things a lot,
2:55 a lot, right, and
2:58 I had the fortunate, I didn't
2:59 see it at the time, but the fortunate
3:01 gift, if you like, of taking over a
3:03 test data team at the time, right, and
3:05 trying to understand: what does
3:07 it mean to make a test data team work?
3:10 And we built out a technology
3:12 capability. Coincidentally it was
3:14 with a past company of
3:17 yours, Huw, and we had a solution
3:20 that was brilliant but didn't necessarily
3:21 solve a problem, okay, because we didn't
3:23 really look at what the problem was
3:25 underneath the covers and a lot of the
3:26 time it was around the fact that people
3:27 didn't really understand what they were
3:29 testing, people didn't really understand
3:30 how the systems were supposed to
3:33 work. And back to my point, go slower to
3:35 go faster what we found is actually
3:38 taking the time to do more analysis up
3:39 front, and I don't mean analysis
3:42 paralysis on this, I mean analysis to do
3:44 enough to get consensus to move forward
3:46 to the next iteration
3:48 to build out,
3:51 and, you know, I don't know
3:52 whether Paul regrets coming up with this
3:54 at the time, but the kind of the New
3:56 Model for testing: to build
3:59 out a model of what we're trying to
4:01 achieve, to get consensus, to get
4:04 transparency. Although perceived,
4:07 possibly, to be over-analysis,
4:11 I can't applaud the team enough
4:13 for doing that, to get, ultimately, the
4:15 industrialization
4:17 fundamentals in place to actually
4:20 accelerate far quicker and I think there
4:22 is so much to getting those fundamental
4:24 pieces in place that allowed people to
4:27 be agile with a little "a"
4:29 far more than they ever would looking at
4:31 the next kind of
4:33 technology that proposes to do
4:34 those things.
4:37 I don't know, I'll open it up there
4:39 to yourself, Paul, in terms of, first
4:40 question: do you regret that model?
4:42 Because I'm sure lots of people come
4:44 back to it. [Laughter]
4:51 No. Well, I could say certainly not, no.
4:54 What I would say is, I mean, touching
4:55 on a couple of themes you mentioned
4:57 there, about what you
4:58 called analysis, and just getting closer
5:00 to the problem, let's say, rather than
5:03 getting embattled and then
5:06 embittered with the supposed solution.
5:09 I mean, the first thing I would say is,
5:10 you can't get close enough to your
5:11 stakeholders. Now, one of the
5:13 things, well, I call stakeholders the
5:15 people who are the customers of what you
5:15 do.
5:17 uh you know so when you test and you
5:19 produce some knowledge some information
5:21 a report whatever it is in whatever
5:24 format that is going to be taken by
5:27 someone else, usually to make some
5:30 kind of decision: either to fix a bug, to
5:33 release a system, to transition to
5:35 another stage in a
5:37 project,
5:39 to abandon the project completely, to
5:41 have another meeting about
5:43 requirements, whatever. Okay, so everything
5:46 you do as a tester I think has value
5:49 one of the problems we have is, the
5:50 value is in the eye of those
5:51 stakeholders.
5:53 So the closer we can get to those
5:54 stakeholders, to know what they want from us,
5:57 the better.
5:58 sometimes it's difficult because big
6:00 organizations often have layers of you
6:03 know hierarchies and layers and buffers
6:05 between you and the real source of
6:07 knowledge, and so on and so forth. But a
6:08 question I ask when
6:11 I get down and dirty with the testers,
6:12 saying, look, you know,
6:14 what could we do better in this
6:15 part of the world,
6:16 I usually ask, well, who's your
6:18 stakeholder? And they usually look a
6:20 little bit puzzled: what do you mean, a
6:23 stakeholder I don't talk to the chair of
6:25 the management board you know every day
6:27 obviously uh what do you mean by a
6:29 stakeholder and I think oh my God here
6:31 we go. Because, like, I say, well, okay,
6:33 who reads your reports? And people
6:35 say well oh
6:37 I don't know really uh and then you say
6:39 well what do you put in those reports
6:42 and who values what you do and well the
6:44 problem I've got is I don't feel valued
6:45 at all, you know, and so on. And, like,
6:47 you're starting right from day one
6:49 on the back foot.
6:53 So I first say, look, have a think about
6:54 who are the people who will take what
6:57 you uh whatever you provide who will
6:59 achieve value in what you do but the
7:00 other thing is like well before you know
7:02 what the value is you've got to go and
7:03 meet them and say look what do you guys
7:05 want from me they are your customers
7:08 literally they are your internal
7:10 uh testing customers testing
7:13 stakeholders so uh identify who they are
7:15 what they're about, what they're
7:17 trying to do and achieve, which should
7:19 bring common knowledge, but sometimes
7:22 it's not as liquid. And then trying to
7:24 understand what decisions they need to
7:26 make, because the testing is going to
7:28 help them make better and more confident
7:29 decisions.
7:31 so understanding their decision process
7:33 what they're trying to get out of uh the
7:35 project how they think they're going to
7:38 get there and their understanding of
7:39 what is really of value and really
7:42 important and what is really scary to
7:43 them and what is a real risk and a real
7:46 challenge for them to get over yeah this
7:49 is all gold dust to the testing uh kind
7:51 of community like in a project because
7:53 then you can start saying okay uh okay
7:55 so it sounds like we need to pressure
7:57 the developers a little bit more to
8:01 figure out uh or to get a better handle
8:02 on the calculations that have been done
8:06 or the user experience is a bit more
8:08 important or just the end-to-end flow of
8:11 data and you know the whole you know
8:14 linked uh components you know delivering
8:16 some value at the end of a long chain of
8:18 events you know these are the kind of
8:19 things that you need to get to
8:21 understand because they give you a steer
8:23 about what your test strategy should be
8:27 Yeah, I mean, I think,
8:33 you're using the phrase testers, and I'm
8:35 sort of slightly wary of that phrase
8:38 nowadays I mean I think it's
8:40 let's just say, I don't know whether
8:42 the word is quality experts; let's just
8:45 say critical thinkers. Let's just think of
8:46 the tester as someone who's a critical
8:49 thinker that's really the primary role I
8:51 mean a lot of the stuff that you do with
8:52 some of your things you know it's two
8:54 two boxes on the screen how do you test
8:55 it you've got to be you know you've got
8:57 to be a critical thinker now if you're
9:00 engaging with a stakeholder what you'll
9:03 I think you know I've done this a lot
9:04 um is that they don't know what they
9:08 don't know so you've got to do a little
9:11 engage in a in a sort of optimistic and
9:13 positive conversation with them and
9:15 basically say oh do you know that
9:17 actually probably we're spending 70% of
9:20 our time hunting for bugs which
9:22 turn out not to be bugs
9:23 and that's generally because of
9:25 this and they'll go oh I didn't know
9:27 that why why why is that
9:29 um and then you're sort of engaging them
9:31 and say well actually it's because you
9:33 know maybe when you send some of these
9:35 things through we don't particularly
9:38 have access to some of the pieces of
9:39 information and they'll say oh yeah
9:40 we've got that that's just sitting over
9:42 here all right well fine can we
9:44 democratize that out and when do you
9:46 make the decisions on that oh yeah we
9:47 normally have a meeting once a month and
9:49 then we decide on the new meta rules you
9:50 know oh great
9:52 okay, well, you know, it would probably
9:53 help us out if you could
9:55 just, you know,
9:56 maybe get one of us to come to that
9:58 meeting or something like that that
10:00 would probably save us maybe 20% of our
10:02 time, whereas, you know,
10:03 we're sort of two cycles away
10:06 from you and maybe try and get them to
10:08 to learn a little bit about what it
10:10 takes to
10:15 improve a system, and what the
10:17 effects of what they're doing are upon
10:21 you. Now, not that that's the best way, you
10:22 know, there are different ways that you
10:25 can remanufacture the whole process, but
10:27 even just getting them just to kind of
10:29 understand a little bit more don't be
10:31 dogmatic about it, don't just say, oh,
10:33 this, you know. That's interesting,
10:35 you know, let me just explain a little
10:37 bit about what we're spending our
10:39 time on, and then they might go, oh, okay,
10:41 and then you're just opening up that
10:42 dialogue, which I guess is what you're
10:43 trying to say about just
10:45 engaging with the stakeholders. Yeah.
10:48 It's easier done in small teams,
10:50 there's no two ways about it uh if you
10:53 are the tester in a large program of 200
10:56 people uh it's obviously much harder to
10:58 get contact with the senior stakeholders
11:00 who in principle are leading
11:03 the effort and so should know what the
11:04 goals are and should know what their
11:06 biggest concerns are but too often
11:08 they're too far removed from the guys
11:10 on the ground. However, whoever you work
11:12 for, and whoever he or she works
11:14 for, they're the people who need to
11:17 have contact and make visible what
11:20 testers could do to say that we're here
11:22 to demonstrate things working but we're
11:24 also here to demonstrate things not
11:25 working so we can fix them and improve
11:28 the quality of whatever's delivered
11:29 uh it's a tough one
11:32 A little anecdote, though.
11:35 I was involved in a relatively
11:36 small project, there were only about a
11:38 dozen people involved or so
11:40 and it was a kind of a new business,
11:42 a startup, if you like,
11:44 being put together by a large financial
11:45 services business,
11:48 and the software development was
11:51 outsourced. The service itself,
11:54 the people who were going to operate the
11:56 service, they were kind of outsourced, if
11:57 you like, and there was really just a
12:02 project manager, a CEO, a desk, a phone,
12:03 and so on and so forth. That was it.
12:07 Anyway, so I staged a kind of a risk
12:08 workshop
12:11 uh to try and get to the bottom of the
12:13 things that were really going to scare
12:15 you know, and deflect, if you like,
12:18 from the success of the whole program
12:20 and there were about seven or
12:21 eight or nine people uh coming to this
12:23 meeting that I asked for and the first
12:25 place to arrive was the CEO
12:29 who wasn't invited. And I
12:31 hadn't met him; I said, okay, thanks for
12:32 coming,
12:34 but you weren't on the invite, why
12:36 are you here, kind of. And he said, well,
12:39 the things on the agenda are critical to
12:42 my success as a CEO in this business so
12:44 I would be remiss if I didn't show up.
12:46 So he'd very definitely
12:48 recognized that the conversations that
12:50 we're about to have about what we're
12:51 really trying to achieve, and what was
12:52 concerning, what could go wrong,
12:55 and so on and so forth uh he was as
12:56 interested in the outcome of that
12:58 meeting as anybody and contributed like
13:00 anybody else so
13:02 Quite often, people are still
13:04 remote; we don't talk the same
13:06 language as the guys at the board level
13:08 so we don't talk about their business
13:10 goals and the risks as they
13:12 perceive them, when in fact those are
13:15 absolutely the things they live
13:18 or die by, almost, you know. Their jobs are
13:19 on the line if they don't meet the goals,
13:22 or the thing fails, you know, shot down
13:25 in flames, kind of thing. So once you talk
13:26 the right language and you make
13:29 contact with people that uh
13:31 recognize what you're talking about
13:33 you're using their language goals and
13:37 risks yeah you're in a much better place
13:39 So I don't want to dwell forever on
13:41 stakeholders, but it's not always
13:43 possible obviously I think it's an
13:44 interesting one in terms of mapping out
13:47 uh what kinds of things you're trying to
13:49 prove to stakeholders in the company who
13:51 might have an interest or be accountable
13:52 for certain things right and because I
13:54 think when we're talking about testing a
13:55 lot of the time we jump straight to
13:57 functional testing we jump straight to
13:58 Performance testing and everything else
14:00 around that starts to fade away quite a
14:02 lot and I think you know absolutely
14:03 right in terms of
14:05 you know if you went to a lot of teams
14:07 and said what are you proving around
14:09 resiliency I don't think you get much of
14:11 a response a lot of the time but yet
14:13 like you say talking to the language you
14:15 know, companies will live and
14:17 die by the resiliency of their systems,
14:18 right,
14:21 but I don't think that it kind of
14:22 transcends
14:25 the kind of low-level development
14:28 and test approaches, into, you know, what
14:30 resiliency is being built
14:31 into a system and therefore how does
14:32 that then translate into what
14:35 stakeholders are and aren't
14:37 um accountable for aware of or want to
14:39 implement. I think there are
14:42 challenges there. Let me, um,
14:43 just to kind of move slightly off the
14:44 stakeholder thing and pick up on
14:46 something you said there Paul which was
14:48 contributed to the meeting. Now, this is
14:50 a big one with me, and, you
14:51 know, I mean, I've been running software
14:54 companies for a long time.
14:55 um and my wife I think I've mentioned
14:57 this earlier in the podcast said
14:59 creativity is not jealous
15:03 now if you're trying to run a meeting
15:06 where you're trying to come to a
15:07 consensus come to an understanding come
15:09 to a new idea
15:12 I think if we're looking for some advice
15:15 to give to people who are trying to make
15:17 changes in organizations then the
15:19 running of that meeting I think was
15:21 absolutely critical and maybe it's
15:23 something I mean I've written down here
15:25 sort of small
15:28 groups of skills improvements, you know,
15:30 sort of personal development Etc but I
15:32 think one of the ones is actually kind
15:35 of watching how some people run meetings
15:37 and you know pick out the best of them
15:39 but the best thing you can do if you've
15:41 managed to get that CEO in, and he or
15:42 she
15:44 basically
15:47 contributed in a sort of open
15:49 way, that is massive, that makes such a
15:51 difference the number of meetings where
15:54 everyone's grinding out an agenda
15:56 um I think we talked about men being the
15:57 main problem here
15:59 um you know in that you know you're
16:02 basically trying to come to a general
16:06 solution, and if you can get the
16:08 egos out of there and actually just try
16:10 and kick it around, and if someone's
16:11 being quiet, say, what do you
16:12 think, then,
16:14 and they might say nothing for a little
16:15 bit, and then they might
16:17 just come out with one little gem and I
16:18 think the best meetings especially at
16:20 Grid-Tools, I remember this, we used to
16:22 come out of the room and I say right we
16:23 came out with two good ideas can anyone
16:25 remember whose idea they were and they
16:27 went no
16:29 which is the "it's not jealous" bit, in that we
16:30 managed to come up
16:33 with some solutions, and they came from
16:36 really quiet people. Actually, I kind of
16:37 did write them down quietly, in terms
16:39 of where the genesis of those
16:41 ideas came from. And if you can run those
16:43 meetings, then I think a lot of this
16:46 communication is much easier
16:47 afterwards because you've sort of you've
16:48 worked on it together you've got that
16:50 collaborative experience. So if
16:51 there's ways, I don't know whether you
16:53 know of any kind of training courses
16:55 that people can go on to sort of
16:57 understand how you can manage these
16:59 meetings and not just repeat what the
17:00 last person said you know because
17:02 they're the boss you know which is the
17:04 big problem with the hierarchies, you
17:05 know.
17:07 But one is, if
17:09 you've got the ear of the chief
17:13 executive, which I did on a large SAP
17:16 project many years ago but I learned so
17:17 much from it
17:19 I remember going to a meeting of,
17:23 I don't know, about 12 or 14 people,
17:26 and it was one of the offshore
17:27 companies who were basically talking
17:29 about
17:31 testing something really quite
17:33 peripheral to this very large system
17:36 it was a 200-million-dollar SAP project,
17:39 and in my first week I was told, just
17:41 go to as many meetings as you can meet
17:42 everyone you can get to know them
17:44 whatever my role's an assurance manager
17:46 so I was there to oversee the testing
17:49 and to give advice earlier on and to
17:52 police it later on if you like and so I
17:54 was going to be helping people initially
17:56 but also policing what they did a bit
17:57 later on
17:59 so anyway I went to my first meeting and
18:03 one of the things, the culture in that
18:05 oil company, as it was, was
18:06 that if you don't know
18:08 someone in the room you either ask them
18:09 to introduce themselves or you introduce
18:11 yourself as the unknown person
18:13 so we went around the room and I got
18:15 to meet all these guys and I said well I
18:16 work for let's say let's call him John
18:20 Smith uh no one recognized that name
18:22 until I saw at the other end of the
18:24 table someone going, I can't do it, but
18:27 nudging the next person, saying, he works
18:28 with John Smith.
18:30 And I could see it going around the room;
18:32 it's like he was God.
18:34 and so when I said I think this test
18:36 plan is a bit light, I mean, I'm
18:38 an innocent bystander, I'm just
18:41 looking, whatever, they took it extremely
18:43 seriously, because they realized, crikey, he
18:45 reports to God you know and that is so
18:48 powerful so on the one hand it's great
18:49 to have the influence of the
18:51 stakeholders in the room if you can say
18:53 I've got the authority to say certain
18:55 things and to challenge, you know, if I'm
18:58 talking to an external firm, but also to
19:01 say look uh cards on the table it's a
19:03 safe environment uh we can say anything
19:05 you know if you've got a problem speak
19:07 up, let's hear it, and if you've got
19:08 a solution, even better, let's hear that
19:09 too.
19:11 so to me it's all about being safe it's
19:14 about knowing that you're operating with
19:16 the authority of more senior people that
19:19 you can say what you uh need to say to
19:21 get things done, to avoid problems, and so
19:22 on.
19:24 um you need to be a very good listener
19:27 sometimes, maybe, over time. I mean,
19:28 all this is all kind of old hat, you
19:31 know. So meeting management is an
19:34 art. I think some people have a,
19:36 what's the word,
19:39 almost like a disciplinarian kind of
19:41 thing, and just take control of meetings
19:43 and basically bark at people and just
19:44 get them to do things.
19:49 But I think I'm sort of
19:51 talking about the creative meetings
19:52 here. Now, if you need to get some
19:54 decisions made fine you lay them out
19:57 fine you know someone's prepared for it
20:00 but I'm talking about your sort of,
20:03 you know, what's the best way to improve
20:06 our process, this system, this feature,
20:07 yeah. You know, that is the kind of
20:10 thing where I think some training in
20:14 that space would be really useful,
20:16 because it pays massive dividends
20:19 well you mentioned critical thinking
20:21 Yeah, yeah, you mentioned
20:24 critical thinking, and
20:26 if the people out there don't know
20:28 what it is uh they should look it up and
20:30 just find out because it's where you you
20:33 essentially don't take anything at face
20:36 value. So if person A makes a
20:39 statement, you have to understand who
20:41 person A is and what perhaps their
20:43 agenda is, where they're coming
20:45 from and what angle of attack they're
20:47 coming from in terms of making progress
20:50 whether they are a supporter or a
20:52 resistor to the system because that
20:54 person might be managing a big team
20:55 that's going to be laid off because the
20:57 software is going to eliminate their
20:59 roles so they might not be a supporter
21:00 so you have to understand where they're
21:02 coming from you also have to understand
21:05 what their kind of background and
21:08 mentality is or attitude is to software
21:10 development they may love software and
21:12 want to help and contribute they might
21:14 be a Luddite and just refuse to get
21:16 involved not come to meetings always
21:17 negative you know so you have to
21:19 understand that
21:21 but the critical thinking aspect is
21:22 really about
21:25 can I take this person
21:27 at their word, you know, what they say is
21:29 what they believe, it could be true,
21:32 and they are truly trying to help and
21:33 that goes for everyone in the room so
21:35 the information that's shared in
21:38 meetings and shared on documents and
21:42 emails and slack and everywhere else
21:43 uh can you look a little bit further
21:45 than just the text you see because
21:48 you'll find schisms in projects between
21:50 individuals who may not want to work
21:52 together but have to because of the
21:54 project and people who are really
21:56 totally enthusiastic and massive
21:58 contributors and other people who really
22:00 don't you know and so critical thinking
22:02 is about saying do I take what I'm given
22:04 at face value or do I look a bit further
22:07 into it into its uh completeness its
22:09 accuracy its consistency and whether
22:11 it's true you know so when it comes to
22:14 requirements usually we have lots of
22:17 gaps in requirements that we cannot test
22:19 against and you say well if I can't test
22:22 it you can't damn well build it you know
22:24 to developers, and yet they have a try
22:26 anyway, you know. In the
22:28 absence of requirements they'll make it
22:31 up. Now, that's a bit of a harsh criticism
22:32 of developers,
22:34 but when people are under pressure to
22:36 deliver, if the requirements aren't good,
22:38 the developers have no choice but to
22:39 try and do the best they
22:41 can, and they will try and build software
22:42 that
22:45 um they believe meets requirements that
22:47 may not have been explicitly stated
22:48 now the tester's got a different problem
22:51 in that it's very hard to create a test
22:53 without knowing what the actual outcome
22:55 of that test would be just because if
22:57 you don't have a definition of a
22:59 behavior say
23:01 any outcome will do you know because you
23:03 don't have anything to compare it with
23:06 so typically testers are great people to
23:07 have in the room when requirements are
23:11 being elaborated, discussed, formalized,
23:13 whatever terminology you want to use
23:15 and so where the critical thinking comes in is
23:17 that critical thinkers would challenge
23:19 your requirements, and by that I mean
23:22 they don't go annoying people; they
23:24 would say here's an example you know
23:27 here's an example of the behavior of the
23:30 software if I do a b and c I think d e
23:31 and f should happen
23:32 is that right
23:35 now you might make that challenge and
23:37 give that example knowing there's a gap
23:38 in the requirements and the answer
23:40 cannot be found in the requirements and
23:42 that's why you've asked the question
23:44 and so by providing examples,
23:48 it's what I'd call a non-challenging
23:49 challenge, if that makes sense, you know.
23:52 it's not a difficult thing to answer and
23:55 what tends to happen is is the users
23:56 analysts stakeholders customers
23:58 whoever's in the room
24:00 will either say you've got that
24:02 perfectly right and I'll say well okay
24:05 but I got that from my imagination not
24:06 from the requirements, so is there a
24:08 gap in the requirements?
24:10 Or they might say, well, how did you get
24:12 that idea? And I'd say, well, there's no
24:14 requirement there so we need a
24:16 definition of how this software behaves
24:17 in this circumstance
24:19 and occasionally you get in meetings where
24:22 projects are under threat of uh
24:24 cancellation I mean I've had one project
24:26 canceled because of an awkward meeting
24:28 where it just hadn't been thought
24:32 through, and it wasn't a late-stage
24:34 challenge, it was quite early,
24:36 but even so
24:38 The tester role, I think, and the value
24:41 of what testers do, is they
24:43 don't take things for granted they don't
24:44 say okay the requirements are good
24:45 enough for me to write some code that
24:47 may or may not work that's not good
24:49 enough for a tester the tester has to
24:51 have certainty about what the software
24:54 is about to do, and what it
24:56 should do, and how they're about to test
24:59 it, so they have to have a much firmer
25:01 understanding, a firmer grasp. The
25:04 interpersonal skills required are asking
25:06 good questions being a good listener an
25:09 active listener uh being diplomatic when
25:10 it comes to asking some awkward
25:12 questions and saying I think we have a
25:14 problem here because here's an example
25:17 of uh behaviors that I think none of us
25:18 understand, because we haven't thought
25:20 it through, and so on. So there's a
25:22 whole raft of interpersonal skills
25:24 coming to play to try and get to the
25:26 bottom of certain problems which very
25:28 often at the start and the early phases
25:31 of a project are all about
25:32 requirements.
25:35 and also saying you know has anyone
25:36 thought about performance has anyone
25:38 thought about security you know I've had
25:39 that yeah we've all had those meetings
25:42 where you say I don't see on anyone's
25:44 agenda, or in any documentation, anything
25:46 about
25:48 reliability and availability and stuff
25:50 like that; I think it's important, yes or
25:51 no?
25:53 uh and a tester quite often is a person
25:55 who asks, because, like, am I responsible
25:59 for demonstrating that this system
26:01 fails over correctly, the availability
26:03 is up to scratch, the performance and
26:05 security, and so on and so forth?
26:07 So,
26:10 getting engaged with as many
26:12 people as you can, as early as you can in
26:14 the project,
26:16 um by and large pays off I mean I think
26:18 there's a massive contribution to make
26:20 there. Rich, anything to add on that?
26:22 I'll just go back, I think there's a
26:25 real,
26:27 uh, strong point around psychological
26:31 safety right and I think the seniors in
26:34 organizations and seniors in teams set
26:36 the agenda around
26:38 the expectation around challenge
26:40 and candid challenge and things like
26:42 that and some organizations have got it
26:44 right some organizations say they've got
26:46 it right but haven't some organizations
26:47 really don't care
26:49 and I think, you know,
26:52 that's a large factor in the ability
26:55 to, I think, ask a lot of critical
26:57 questions and kind of go and
27:01 resolve a lot of systemic problems
27:03 that kind of reside in organizations for
27:05 a long, long time, and, you know, underlie
27:07 the years of so many kind of failed
27:10 attempts at lots of things we do in
27:12 test, around kind of delivery and around
27:14 trying to get things automated, lots and
27:15 lots and lots of things.
27:19 Also, to your point again,
27:21 whether those things are strong or not, I
27:23 think there's always an opportunity for a
27:25 pause point very early on, when
27:29 projects are forming and storming in
27:30 terms of trying to figure out what kind
27:32 of their blueprints are to go and ask a
27:35 lot of questions around you know to the
27:37 point what is it what is important what
27:38 isn't important to set out the
27:41 expectation that these questions can
27:42 either be answered now or they need
27:45 answering right because if we get too
27:47 far down the line in a project and that
27:49 question's never been answered, chances
27:50 are, you know, by the time kind of
27:53 delivery is the only thing in the
27:55 crosshairs, that's gonna go way down
27:57 the list, when it gets to kind of should
27:59 we resolve this thing too difficult
28:01 let's pay lip service to it and fire it
28:04 out the window and you know yeah
28:07 I like the one where we you know it was like
28:11 652 critical bugs which just became 652
28:14 documentation issues, they just re-categorized
28:16 them, that was a classic,
28:18 very large company, never mind
28:20 um okay good I'm just going to bring in
28:22 one more thing we haven't really talked
28:23 about tech here today maybe we should do
28:25 another session on Tech and just talk
28:26 through that. I think we've had
28:31 another session on kind of AI,
28:32 etc. Now there's one thing, as well,
28:34 because I had my
28:36 65th birthday the other day, I've been
28:38 doing this I think even longer than you
28:39 thought
28:43 and uh, you know, looking at
28:45 the good and the bad
28:47 teams that I've worked with over the years
28:49 um I think there's a thing that
28:50 people sort of forget about, which is the
28:53 balance of the team, you know, in terms of
28:54 the different types of people that are
28:57 in that team. You know, um, do you
29:00 have a snowy for example which is uh you
29:02 know from from Nationwide who every team
29:04 should have a snowy
29:06 um so you know it's just Dean parent
29:08 when you
29:11 so you know, it's like, I don't want to
29:12 get you on
29:14 football or whatever, but I mean, you know,
29:16 you don't need everyone to do
29:19 everything, but you do need someone
29:22 who's sort of very, very anal about
29:24 details, and you inject
29:26 them into a certain point you do need
29:28 probably two creative people now they
29:31 might be creative in a sort of UI type
29:33 of way but they might not be
29:36 um you know creative in terms of putting
29:38 together complex chains of systems but
29:41 you might have an old an old hand who's
29:43 seen everything that's gone wrong in the
29:46 past you know that literally if we don't
29:48 get the validation right on our on our
29:50 payments messages which is you know I
29:51 don't know how many times I've seen this
29:53 one then all that's going to happen is
29:55 the bad data's gonna creep
29:57 through, it's going to make it through
29:59 one hop, two hops, three hops, four hops,
30:00 the fifth hop is going to pretty much
30:03 destroy the system now if you can stop
30:04 that stuff at the beginning now what you
30:06 need is kind of an old lag in there
30:07 who's sort of seen it all and said look
30:09 we really do need to focus on this and
30:11 maybe we should focus on a much more
30:13 rigorous set of
30:17 um rules to reject Data before it gets
30:18 in and you know that's a contribution
30:20 they would make you might have then
30:21 someone who's saying oh yeah yeah we're
30:22 going to use all this latest cool
30:23 technology we can do this that and the
30:26 other thing and you say cool, that's
30:28 a really good idea, but let's try
30:29 and set that up as: we're going to go
30:30 into prototype mode
30:33 for that, let's go and experiment with the
30:35 difference between, you know, Kafka and MQ
30:37 or MQ Light or whatever it is, let's just,
30:40 you know, because what we're
30:41 hearing from the analysts, who I don't
30:43 trust at all, is that this is,
30:45 um, you know, fantastic
30:47 um but then you go eh let's apply a bit
30:49 of critical thinking on this and do some
30:50 experimentation and you put that person
30:53 on that and I think having that blend
30:56 allows you to move along much more
30:58 um much more effectively I mean in
31:00 Holland they use colors, I think everyone
31:01 is allocated a color based on their
31:03 personality type, and they try and create
31:05 teams which have all the different
31:08 oranges, blues, reds, etc., so you can't
31:10 have everyone of the same type,
31:11 and I think it's almost like a transfer
31:14 system in, in America I'd say
31:15 soccer, football
31:18 um so where you almost say look we've
31:19 got three Strikers here you know and
31:21 it's just not really working so maybe we
31:23 can move them around between some of the
31:25 teams. So what are your thoughts, Rich,
31:27 on this one,
31:29 in terms of what you think a good
31:31 team is
31:33 so I think, um,
31:35 um,
31:37 there's two answers to that, right, in
31:38 that you've got a team that faces into
31:40 the problem today, and you've got a team
31:43 that faces in and
31:45 starts to resolve the problems
31:47 of tomorrow, right,
31:50 so by that what I mean is, um
31:56 a lot of teams
32:00 when they're first forming I think will
32:03 have you know if you if you look at lots
32:04 of organizations lots of test teams
32:06 they'll have lots of people with a lot
32:07 of head knowledge
32:10 and separately there'll be lots of
32:11 people that have technical testing
32:13 capabilities, i.e. the ability to code
32:14 some Frameworks or something like that
32:17 right, and those SMEs probably have a lot
32:22 of analytical understanding. That's great,
32:24 I have a silo of a test team that knows a lot
32:25 of things, okay, but what are we trying to
32:28 achieve, right? And by saying snowy,
32:29 you've probably
32:31 focused me on this a bit too much around
32:33 a certain team in a certain
32:36 situation, all right. So that's great for
32:38 a time being we're able to get to a
32:40 certain place, but we know that's not
32:42 sustainable, we know that's not
32:44 our objective, right. So what we want to
32:45 do is we want to codify our head
32:46 knowledge we don't want it stuck in
32:49 either testers or that 25-year Unisys
32:51 person that's been in the organization and
32:53 is just about to retire, we don't want
32:54 that stuck in their head, and actually
32:56 what they know is probably limited in
32:58 terms of its structure anyway, so let's
33:01 build back to Paul's model let's build a
33:02 model of common understanding of how
33:04 systems should work or what we're trying
33:07 to prove here right and how do we then
33:10 codify that and automate that because we
33:11 want to chuck out something as quickly
33:13 as possible but we all understand that
33:16 you know that shared knowledge that
33:17 understanding that needs to percolate
33:20 around the testing. Actually, you know,
33:21 we've already talked about collaboration,
33:24 where we need to build out relationships
33:26 with architects, with designers, to make
33:28 sure that you know we're not building
33:30 these things far too late in the
33:31 development life cycle they've got skin
33:33 in the game in terms of what we're
33:34 trying to achieve as well, and they're
33:37 not feeling challenged by "your thing
33:38 doesn't work", right. Actually we want to
33:41 build this thing together up front and
33:43 you know what in time if we're building
33:45 a model who cares who builds it right we
33:47 know that we want to get the different
33:49 perspectives in a room to build this
33:50 thing because we all know that there's
33:53 benefit in getting different lenses
33:55 different contexts on this thing because
33:57 we all see it in a different light right
33:59 and therefore if we understand those
34:01 things we can build out automation be
34:03 that design build and test off the back
34:06 of that we we create a lean pipeline
34:08 right of things that if it goes wrong
34:12 there is collective buy-in around what
34:14 we did and why it went wrong so it kind
34:16 of eliminates the blame game as well and
34:18 I think you know to my point
34:21 we started with a silo, a very good
34:23 individual, but what we built out
34:26 is a collective capability that has
34:30 um a collective accountability about it
34:32 going right or going wrong I think is
34:33 kind of
34:36 a good evolution, if you like,
34:38 of a skill set that I think we
34:41 should be building as a kind of a team
34:42 delivering test
34:46 and wider. Well, thoughts? I'm reminded
34:49 of, um,
34:51 something called, I think, Belbin
34:52 team roles, if you've heard of that,
34:53 it's been
34:56 around a long, long time, but I only
34:58 mention it because it's kind of
35:00 interesting. I think an
35:02 interesting aspect of this is that
35:06 people tend to be good at, and prefer,
35:07 certain roles
35:09 so, uh, from memory, I haven't done a
35:13 Belbin test for, oh, decades I'd say, but
35:15 I think I was a Plant, or I think it was
35:17 called a Co-ordinator, I was either a
35:20 team lead kind of role or the ideas guy,
35:22 you know, kind of the problem
35:24 solver, you know, oh he's the smart guy in
35:26 the corner, give it to him, he'll solve it,
35:28 that kind of game. I was definitely
35:30 not a Completer Finisher, who basically
35:32 does all the legwork and gets the job done
35:34 and gets it through the door, and
35:35 from memory there's
35:37 someone called a facilitator, I think, or
35:38 they may have a different name but
35:40 they're kind of the glue that holds the
35:42 team together, they provide like the
35:44 social context if you like of the
35:45 conversations meetings all that kind of
35:47 stuff and so on
35:49 um now one of the things about being a
35:51 consultant and I
35:54 uh, I mention this only because I don't
35:56 think people should be too fixed
35:58 in these specific roles. As a consultant,
36:00 when you come into a project and you
36:01 look around, you think, there's no
36:02 leadership here,
36:04 maybe that's the problem, or
36:06 there's a leader but there's no
36:09 thinker, or there's a leader and a thinker
36:10 but there are no completer finishers,
36:13 everybody's facilitating but no one's
36:14 actually doing their damn job, you know
36:17 so you find you get involved in
36:18 these projects in that way and say, like,
36:20 somebody needs to go away and do some
36:22 leg work for three days to come up with
36:25 a spec for the work or a plan or whatever
36:29 to get this uh project moving again
36:32 um and so what I found is uh not that I
36:34 can do every role by any means but I
36:36 found myself very often dropping into
36:38 roles I wasn't comfortable with but just
36:39 to get
36:40 you know, you get called in because
36:42 something's slipping behind. Okay, well,
36:44 what can we do about that? Okay, we need to,
36:47 in a hurry, get the automation sorted, or
36:49 we need to, in a hurry, clarify the
36:51 requirements, get some specs for the
36:54 testing to be done, so we can, yeah, I mean
36:55 I think that's cool it's actually a
36:57 really good experience
36:59 taking over those jobs because you
37:01 actually do understand the pain I mean I
37:03 remember, when we were
37:06 doing CA, we were bought by CA,
37:08 and I said, well, why don't we try and get
37:10 the sales people to come
37:13 and be a tester for a day.
37:15 Outraged, outraged.
37:17 um, you know, but anyway, they trooped in
37:20 to enchant, and we sat them down, and
37:23 we basically got them to, uh, we
37:26 found one screen and just said, okay, off
37:28 you go, you test it. So we got one
37:30 there and they just randomly
37:32 clicked around and looked grumpy, and
37:33 then we switched them around and said,
37:34 okay, what I'm going to do now is:
37:35 you're gonna
37:37 look at it and you're gonna write
37:39 down what the other person is going to
37:40 do to test it
37:42 um so then they had to kind of do that
37:43 and then thirdly we said well why don't
37:45 we just model it and we'll just generate
37:47 all the automation code and test it and
37:49 by the end of it they went, oh my God,
37:51 that's what people do all day long, and
37:53 it was like they'd basically been speaking
37:56 to QA managers all their life, you know,
37:59 and it was like, oh, you know. So, I
38:00 wasn't particularly at the point where I
38:01 started with this, but your comment about,
38:03 you know, having to put many hats on to
38:05 solve problems, actually it's really good
38:07 experience for everyone, so maybe
38:08 that's something we should
38:10 incorporate
38:11 the whole Thinking Hats thing, I think
38:15 it's a de Bono idea, where, uh, you
38:17 know, you get these multi-colored hats,
38:18 you know, like red, blue, green, black,
38:20 sort of stuff. I've
38:22 had a little experience with this
38:24 but uh literally you could be in a
38:25 meeting and everyone uh is wearing a
38:27 certain hat and by wearing the Hat you
38:30 say I'm gonna be the uh ideas guy
38:32 um, oh I'm gonna be the black hat, I'm
38:34 going to be, and I'm going to get this
38:35 wrong in terms of these roles, but
38:38 my thinking and my attitude to the
38:42 meeting is to be much more pragmatic,
38:45 practical, get things done. Another one
38:46 would be, I'm going to facilitate the
38:48 conversation, and so on, so forth. So it's
38:50 a bit like Belbin, but whatever. But every
38:52 now and then the leader of the
38:54 meeting, whoever it happens to be at the
38:56 time, says right,
38:58 and then you change your perspective, so
39:00 I'm going to be the customer now, and I'm
39:02 going to be the supplier, take the
39:05 supplier's position, you know, and
39:07 that gives people
39:09 different perspectives which is exactly
39:11 like uh sending your uh sales team into
39:13 the test team
39:18 like a recipe for disaster, but, um, I'm sure
39:21 I think many years ago we did a
39:23 performance test, but completely manual;
39:26 it was a help desk application
39:29 being built in a call center
39:31 and it was a brand new building brand
39:33 new teams, everybody was new, but there
39:34 were 100 people
39:35 already there, you know, women who had been
39:37 trained, but who were also there to help us
39:39 with this performance test, so we got
39:41 them all in the call center we had an
39:42 earlier early version of the software
39:44 and we brought some sales people in,
39:46 and they all got stuck in,
39:48 they had kind of scripts, but they had a
39:50 free hand in quite a lot of
39:52 what they were doing, and they
39:53 said they'd never had more fun in
39:56 business, yeah, it was like, I really
39:57 enjoyed it, right
39:59 um, we're running out of time, so, as
40:00 ever, Paul,
40:02 it usually goes on for quite a long time
40:11 anyway it's been a pleasure uh seeing
40:13 your happy face again
40:15 um thanks so much for your insights I I
40:16 do feel like maybe we should do another
40:18 one in a few months and maybe just focus
40:22 on Tech and you know stuff like that
40:23 yeah yeah
40:27 anyway, thanks Rich, yeah, thanks, we'll
40:29 leave it there, we'll leave it there, so
40:30 thanks so much,
40:32 bye-bye. Thanks for listening to this
40:34 episode of the Why Didn't You Test That
40:36 podcast. To continue the conversation,
40:39 head over to
40:41 curiositysoftware.ie and don't forget to
40:45 subscribe for more episodes [Music]