0:02 is there a principled reason why I should
0:04 delete my social media and if so what is
0:05 it
0:09 there are two one of them is for your
0:11 own good and the other is for society's
0:12 good
0:16 for your own good it's because you're
0:18 being subtly manipulated by algorithms
0:20 that are watching everything you do
0:24 constantly and then sending you changes
0:27 in your media feed in your diet that are
0:30 calculated to adjust you slightly to the
0:33 liking of some unseen advertiser and so
0:34 if you get off that you can have a
0:37 chance to experience a clear view of
0:40 yourself in your life but then the
0:42 reason for society might be even more
0:46 important Society has been gradually
0:48 darkened by this scheme in which
0:50 everyone is under surveillance all the
0:53 time and everyone is under this mild
0:56 version of behavior modification all the
1:00 time it's made people jittery and cranky
1:03 it's made teens especially depressed
1:06 which can be quite severe but it's made
1:09 our politics kind of unreal and strange
1:11 where we're not sure if elections are
1:14 real anymore we're not sure how much the
1:17 Russians affected brexit we do know that
1:19 it was a crankier affair that it might
1:21 have been otherwise you say it's not for
1:22 me as an individual is it bad for me
1:24 because I'm addicted why become
1:29 chemically hooked you have the founders
1:32 of the great Silicon Valley spying
1:34 empires like Facebook have publicly
1:36 declared that they intentionally
1:40 included addictive schemes in their
1:43 designs now we have to say this is what
1:46 I would call almost a stealthy addiction
1:49 it's a statistical addiction what
1:53 it says is we will get the broad
1:55 population to use the services a lot
1:58 we'll get them hooked through a scheme of
2:02 rewards and punishments and the reward
2:03 is when you're retweeted the punishment
2:05 is when you're treated badly by others
2:08 online and then within that we'll very
2:12 gradually start to leverage that to
2:13 change them so
2:15 it's this very kind of stealthy
2:18 manipulation of the population so it's
2:20 not as dramatic as a heroin addict or a
2:22 gambling addict but it is the same
2:24 principle but who's doing the
2:26 manipulating I mean there isn't some
2:28 sort of Wizard of Oz sitting
2:30 behind a screen well this is the
2:32 peculiarity of the situation the people
2:34 who run the tech companies like Google
2:36 and Facebook are not doing the
2:37 manipulating they're doing the addicting
2:40 but the manipulating which rides on the
2:42 back of the addicting is the paying
2:46 customer of such a company and
2:48 many of those customers are not at all
2:50 bad influences they might simply be
2:52 trying to promote their cars or their
2:56 perfumes or whatever and indeed I have
2:57 sympathy for them because they're
2:59 concerned that if they don't put money
3:01 into the system nobody will know about
3:02 them and you know is it different to
3:04 just television advertising or billboard
3:06 advertising or anything else the
3:08 difference is the constant feedback loop
3:10 so when you watch the television the
3:12 television isn't watching you when you
3:14 see the billboard the billboard isn't
3:16 seeing you and vast numbers of people
3:18 see the same thing on television and see
3:20 the same billboard when you use these
3:24 new designs, social media, search, YouTube,
3:26 when you see these things you're being
3:29 observed constantly and algorithms are
3:30 taking that information and changing
3:33 what you see next and they're searching
3:34 and searching and searching and they're
3:36 just blind robots there's no evil genius
3:39 here until they find those patterns
3:41 those little tricks that get you
3:43 and make you change your behavior in
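The blind trial-and-error loop he describes, algorithms observing reactions and amplifying whatever gets one, with no evil genius involved, can be sketched as a toy simulation. Everything here (the categories, the numbers, the update rule) is a hypothetical illustration, not any real platform's code:

```python
import random

# Toy sketch of the feedback loop described above: a "feed" that
# watches which items get a reaction and blindly amplifies whatever
# "worked". All names and numbers are hypothetical illustrations.

def run_feed(user_bias, rounds=1000, seed=0):
    """Simulate a feed that reweights itself toward engagement.

    user_bias maps a category to the probability the simulated user
    reacts to it when it is shown.
    """
    rng = random.Random(seed)
    categories = ["outrage", "cute", "news"]
    weights = {c: 1.0 for c in categories}  # start with an even mix
    for _ in range(rounds):
        total = sum(weights.values())
        # show one item, drawn according to the current weights
        shown = rng.choices(categories, [weights[c] / total for c in categories])[0]
        # did the (simulated) user react to it?
        engaged = rng.random() < user_bias.get(shown, 0.0)
        if engaged:
            weights[shown] *= 1.05  # amplify whatever got a reaction
    return weights

# A user even slightly more reactive to one category tends to end up
# with a feed dominated by it, because small advantages compound.
print(run_feed({"outrage": 0.3, "cute": 0.2, "news": 0.2}))
```

The point of the sketch is that nothing in the loop understands what "outrage" is; a small difference in reaction rates is enough for the optimizer to skew the whole mix.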
3:45 terms of society I mean you you you
3:47 through in this you know it's making
3:49 people depressed but is there any actual
3:52 evidence for that yeah unfortunately
3:55 there's a vast amount of evidence there
3:56 have been dozens of studies at this
3:59 point including studies released by
4:02 Facebook scientists so this is
4:04 something we can call a consensus and
4:06 when Facebook releases such things
4:07 they say oh but we do all these good
4:09 things that balance it out but there's
4:11 a general acknowledgement that
4:15 depression correlates the scariest
4:18 example is a correlation between rises
4:20 in teen suicide and the rise
4:22 in use of social media and so yes
4:24 unfortunately this is real are you sure
4:27 you can blame it on social media
4:28 is it not just that those two things may have
4:29 happened at the same time for other
4:32 reasons well here's a distinction we
4:33 have to make it's very similar to the
4:36 problem of global climate change we can
4:37 say statistically over the whole
4:39 population yes the correlation is real
4:42 and any particular person of course we
4:44 can't just as we can't blame any
4:46 particular storm on global warming
4:49 it's causality isn't it yeah I mean it
4:52 is causality and this is something
4:55 that's very well demonstrated so when
4:57 the company's own scientists are
4:59 publishing on this topic and come to
5:01 the same agreement I think it's time to
5:06 say this is real why have you sort of
5:08 turned on your own kind
5:11 I love Silicon Valley and I do not at
5:13 all feel that I've turned on my own kind
5:14 and just to be clear I'm very much a
5:16 part of this I've sold a company to
5:18 Google I'm not in any sense an outsider
5:21 I believe that what we're doing is not
5:25 in our own self-interest business
5:27 interests are a part of society if they
5:29 destroy society they destroy themselves
5:32 I believe it's very clear that we could
5:34 offer all of the good things and there
5:36 are many many good things in these
5:38 services and social media in particular
5:40 I'm convinced we can offer them without
5:42 this manipulation engine in the
5:44 background there's a world of other
5:46 business plans and I think they'd be
5:48 better for us so I don't think we're
5:52 being evil so much as we're being stupid
5:55 when it comes to Facebook has Facebook
5:59 made itself safe yet in terms of data
6:03 harvesting and scraping now well
6:06 Facebook's fundamental design is one
6:11 where the business model is to addict
6:13 you and then offer a channel to
6:15 third parties to take advantage
6:17 of that to change you in some way
6:19 without you realizing it's happening I
6:21 mean that's what it does so I
6:23 don't think any amount of tweaking can
6:27 fully heal it I think it needs a
6:29 different business plan I mean it's very
6:32 hard to throw a barrage of rules at
6:34 somebody who's following certain
6:35 incentives and then expect them to
6:36 really make a difference so when Mark
6:39 Zuckerberg says he's taking action and
6:40 you know he regrets what's happened and
6:42 all the rest of it you're saying he
6:45 can't make his own product a safe or
6:48 desirable product I believe that as long
6:51 as his business incentives are contrary
6:53 to the interests of the people who use
6:55 it who are different from the customers
6:58 then no matter how sincere he is and I
7:01 believe he is sincere and no matter
7:04 how clever he is he can't undo that
7:07 problem he has to go back to the basics
7:08 and change the nature of the business
7:10 plan and if he doesn't agree
7:12 with that and says we're just gonna
7:16 carry on how important is the
7:19 security of that data and the inability
7:22 to repeat what has happened with
7:25 Cambridge Analytica and all that kind
7:27 of data harvesting that went on
7:29 I don't believe that what happened with
7:33 Cambridge Analytica is the worst of it
7:35 the whole system is designed for this
7:37 like let's suppose that Facebook reforms
7:39 itself so that the next Cambridge
7:41 Analytica can't get access to that data
7:43 they can still get access to the same
7:47 results because that service is exactly
7:51 what Facebook sells I mean
7:55 this is you know there are bad actors
7:59 who are able to use Facebook in ways
8:01 that Facebook can't understand because
8:03 the way the service is designed is
8:05 fundamentally to be manipulative so I
8:09 think the data protection idea is a
8:11 sincere and good idea but it's certainly
8:13 not adequate it doesn't address the core
8:15 problem which is the manipulation engine
8:18 and as long as that is there a bad actor
8:21 can find a way to utilize it so to me
8:23 this concern about data protection
8:27 while laudable doesn't address the core
8:28 problem do you think they're all as bad
8:31 as each other I mean you know what why
8:33 is something like YouTube which is
8:35 basically just a way of watching video
8:39 bad for you YouTube is not necessarily
8:40 bad for you
8:42 remember this is a statistical
8:44 distribution so for some percentage of
8:46 people it'll have an effect of making
8:49 them crankier around election time and
8:51 feeling needier around the time they
8:53 might be making a purchase and so forth
8:55 and the way it works is that all the
8:57 data Google can get on you much of which
9:00 comes from just your email or whatever
9:03 else it might be is fed into an engine
9:05 that compares you with other people who
9:07 share some similar traits and YouTube's
9:09 ordering of videos that are presented to
9:13 you is designed to on the one hand
9:15 maximize your engagement so you won't
9:17 stop watching but that's achieved not
9:19 just by observing you but by a multitude
9:21 of people who are similar to you and
9:23 then when you do get an ad it's
9:25 contextualized in a way that has been
9:27 shown to be effective not only for you
9:29 but for this whole population so it's
9:32 this giant statistical thing and it's
9:34 bad for you because it leeches your free
9:37 will it makes you cranky it makes the
9:38 world a little darker because you're not
9:40 perceiving reality clearly anymore
9:43 you're being manipulated you're
9:49 being tricked in a way and the people
9:53 who are paying or maybe not paying just
9:56 using the system in a clever way to
9:59 get at you are not necessarily pleasant
10:00 people they're sort of the worst
10:03 actors in some cases but then some users
10:06 think look I can handle advertising you
10:07 know I know what I'm doing here I'm
10:10 getting a free service and you know they
10:11 think they're manipulating me but I know
10:15 what I'm doing the problem is that
10:18 behaviorist techniques are often invisible
10:20 to the person who's being manipulated
10:22 and this has a long history this has
10:25 been done for a long time it used to be
10:27 that the only way to be subjected to
10:29 continuous observation and modification
10:32 was to either be in an experiment you
10:34 could be in the basement of a psychology
10:36 building and have students tweaking you
10:38 for their projects or you could join a
10:40 cult or you could be in an abusive
10:42 relationship I mean this has been done
10:45 before and often the people who are in
10:46 these situations do not realize it's
10:47 happening to them in fact the whole
10:50 point is that it's sneaky it's
10:53 a mechanical approach to
10:55 manipulating people and because it's
10:57 so algorithmic it doesn't involve
10:59 direct communication and people don't
11:00 get the cues to understand what's
11:03 happening with them why do you think
11:05 social media has had the effect
11:07 that it has you know is it
11:09 because of the way people respond to
11:14 things on social media well I'd like to
11:16 give you a slightly detailed answer as
11:19 quickly as I can and that is that in
11:21 traditional behaviorism you would give
11:23 an animal or a person a little treat
11:25 like candy or maybe an electric shock
11:27 and you'd go back and forth between
11:30 positive and negative feedback and when
11:31 researchers try to determine whether
11:33 positivity or negativity is more
11:35 powerful they're roughly at parity
11:38 they're both important but the
11:40 difference with social media is that the
11:43 algorithms that are following you
11:45 respond very quickly they're looking for
11:47 the quick responses and the negative
11:49 responses like getting startled or
11:52 scared or irritated or angry tend to
11:56 rise faster than the positive
11:59 responses like building trust or feeling
12:01 good those things rise more slowly so
12:03 the algorithms naturally catch the
12:05 negativity and amplify it and introduce
12:08 negative people to each other and all of
12:10 this and so what this does is it means
12:12 that the algorithms discovered there's
12:15 more engagement possible say by
12:17 promoting ISIS than by promoting the Arab
12:19 Spring and so ISIS gets more mileage or by
12:22 promoting the Ku Klux Klan than Black
12:24 Lives Matter now in the big picture it's
12:27 not true that negativity is more
12:28 powerful but if you're doing this very
12:31 rapid measurement of human impulses
12:33 instead of accumulated human behavior
12:36 then it's the negativity that gets
12:38 amplified so you tend to have elections
12:41 that are more driven by rancor and abuse
12:43 and you tend to have outcomes that are
12:46 kind of crazy so the effects on the
12:49 media we consume the news are
12:51 also alarming because then it will be
12:54 the news that makes people angry that is
12:56 the news that gets seen in the future or
13:00 even now rather than you know a more balanced
13:02 diet of what's really going on in the
13:04 world well I think what goes on on a
13:07 show like this is that you have a bit of
13:09 a longer time horizon by which
13:12 you measure success so you have to
13:14 impress your viewership enough to tune
13:17 in but this is over a process of days
13:18 and weeks and months and years
13:20 and you build up a sense of rapport with
13:24 your viewership right if you're an
13:25 algorithm that's just looking at instant
13:27 responses you don't get that it's just
13:29 like how did I engage this person and
13:32 you'll find that
13:34 engagement more often by irritating
13:37 people than by educating them and so is
13:41 that how you create Trump would you
13:44 say or you know any of the other populist
13:45 leaders who are doing very well at the
13:48 moment partly from the internet I have
13:50 never known Trump well but I have met him
13:53 a few times over a fairly long period
13:54 over thirty years actually through
13:57 different circumstances and I will say
13:59 that while I never would have voted for
14:01 him as president and I always thought he
14:05 was somewhat untrustworthy and a bit
14:08 of a showman and a bit of a scammer he
14:10 never lost himself and became so
14:19 irritable until he had his own addiction
14:21 in this case to Twitter and it's
14:23 really damaged him I mean I view Trump
14:27 in a way as a victim oh yeah absolutely
14:29 his character has been really damaged by
14:31 his Twitter addiction because of the
14:33 reaction he gets from each tweet yeah so
14:36 you know what happens in addiction is
14:38 the addict becomes hooked not just on
14:39 the good part of the addiction
14:42 experience but on the whole cycle so a
14:44 gambler is not just addicted to winning
14:45 but to this whole process where they
14:48 mostly lose and in the same way the
14:50 Twitter addict or the social media
14:52 addict becomes addicted to this
14:54 engagement which is often unpleasant
14:57 where they're engaged in these you know
14:59 really abusive exchanges with other
15:01 human beings and only once in a while is
15:04 that you know you can watch Trump
15:05 every once in a while there will be
15:07 this tweet where somebody likes him and
15:09 that's when he gets his little, we
15:11 call it in the trade, dopamine hit
15:13 that's what it's called at Facebook for
15:15 instance he gets his little dopamine hit
15:18 and then he dives in for more negativity
15:19 and then he gets it again and you
15:22 can see the addiction playing out do you
15:24 think it's possible to create a
15:28 do-gooding social network yes I'm
15:30 absolutely positive and the way to do it
15:31 is to have a different business model
15:32 wherein
15:34 so right now we've created this bizarre
15:36 society that's unprecedented where if
15:38 any two people wish to communicate over
15:40 the Internet the only way that can
15:42 happen the only way it's financed is
15:44 if a third party believes that
15:46 those two can be manipulated in a sneaky
15:48 way it's an insane way to
15:51 structure civilization so we can keep
15:53 all the good stuff and there is good
15:55 stuff on social media of course we can
15:57 keep all that and just throw away the
15:59 manipulation business model and
16:01 substitute in a different business model
16:03 and there are many alternatives that
16:05 would be better they just have to be
16:07 honest it could be a paid service like a
16:09 Netflix where you're paying for it
16:11 you're the genuine customer it has to
16:13 keep your interest it could be like a
16:16 public library it could become a public
16:18 thing that isn't commercial at
16:21 all that's an option but what we did in
16:23 Silicon Valley is we wanted it both ways
16:25 we wanted everything open and free but
16:27 we wanted hero entrepreneurs and hackers
16:29 and so the only way to get that was this
16:32 advertising thing that gradually
16:33 turned into the manipulation engine as
16:36 the computers got faster and this
16:39 weird business plan once you can see
16:40 that there are alternatives you realize
16:42 how strange it is and how unsustainable
16:44 it is this is the thing we must get rid
16:46 of we don't have to get rid of the
16:47 smartphone we don't have to get rid of
16:49 the idea of social media we just have to
16:51 get rid of the manipulation machine
16:53 that's in the background just one last
16:55 thing as well that is also obsessing
17:00 parents screen time itself do you think
17:02 that is a bad thing or is it just what's
17:06 on the screen to be frank with you I
17:08 struggle with this question because I
17:11 have an 11-year-old and so I tend to
17:14 think that manipulation time when the
17:16 kids are being observed by algorithms
17:19 and tweaked by them is vastly worse than
17:23 just screen time by itself so I won't
17:25 include video games in with the social
17:26 media you know the things that are
17:28 manipulating them because while they're all
17:31 similarly addictive they're addictive
17:33 but not manipulative typically now
17:36 here I'm not sure how evil we've become
17:39 lately because there might be some video
17:40 games that are using behavior mod
17:43 techniques for pay that's conceivable I
17:45 can see how that could happen
17:46 if you're thinking about it out there don't
17:49 do it okay find something better to do
17:51 but the the mainstream video games are
17:54 not doing that they are addictive so
17:55 there are plenty of things that are
17:57 addictive that aren't leveraging that
17:59 for manipulation see these are two
18:00 different stages what do you think of
18:03 Fortnite I have not played it
18:05 but Fortnite is exactly
18:07 that it's getting people to pay for
18:10 things within their game no but see the
18:12 thing is getting them to pay is still
18:14 not manipulating them for a third party
18:16 that's getting them to buy stuff I mean
18:18 Amazon does that to get you to buy stuff
18:22 all kinds of people do that that
18:24 might be annoying you might object to it
18:26 especially if you feel your kids are
18:28 wasting money you might object to it you
18:31 might feel it's not an ideal example of
18:33 human behavior and character and maybe
18:34 there could be a better business model
18:38 whatever but it's not directly
18:40 manipulating you, say, to influence an
18:42 election it's not trying to change your
18:44 behavior out in the larger world and and
18:47 that's the thing that's really tragic
18:49 about designs like Facebook and Google
18:51 they are succeeding at doing that but
18:54 your advice tonight to everyone watching
18:58 this is delete all your accounts I would
19:01 like to make two very quick pitches on
19:04 that account one if you're a young
19:06 person and you've only lived with social
19:09 media your first duty is to yourself you
19:10 have to know yourself you should
19:12 experience travel you should experience
19:14 challenge to yourself you need to know
19:16 yourself and you can't know yourself
19:19 without perspective so at least give it
19:21 six months without social media and
19:24 really quit it don't quit Facebook
19:26 but keep another Facebook thing like
19:27 WhatsApp because then it'll still be
19:29 spying and manipulating you get rid of the
19:31 whole thing for six months and know
19:33 yourself and then you can decide I can't
19:35 tell you what's right you have to decide
19:37 but you can't until you know yourself
19:39 and then for the rest of society I'd say
19:42 as long as we can have some small
19:44 percentage of people who are off it then
19:46 the society can have voices to give
19:48 perspective if everybody's universally
19:50 part of this thing we cannot have
19:52 perspective we cannot have a real
19:54 conversation and it's too lonely right
19:57 now you know we need more people who are
19:59 just outside of that
20:00 who are thinking without the
20:02 manipulation and I think we'll find it
20:05 extraordinarily valuable to have them
20:09 are you just a New Age hippie I mean have
20:11 you just been through the mill and kind
20:13 of worked out I want to check out of all
20:16 this and let's just stop
20:19 do I seem New Age to you I don't know I
20:22 mean here's what
20:24 I'll tell you the bind you put me in is
20:27 that I'd be happy to trash the new age
20:29 and demonstrate that I'm not part of
20:31 that manner of thinking I'm certainly
20:33 not I hope I've come across as a
20:36 non-utopian but the problem is many of
20:38 my friends in California are quite new