1
00:00:16,630 --> 00:00:17,930
>> NARRATOR: Tonight--

2
00:00:17,930 --> 00:00:20,470
>> The race to become an A.I.
superpower is on...

3
00:00:20,470 --> 00:00:22,670
>> NARRATOR: The politics of
artificial intelligence...

4
00:00:22,670 --> 00:00:24,530
>> There will be
a Chinese tech sector

5
00:00:24,530 --> 00:00:26,200
and there will be
an American tech sector.

6
00:00:26,200 --> 00:00:27,730
>> NARRATOR: The new tech war.

7
00:00:27,730 --> 00:00:30,200
>> The more data,
the better the A.I. works.

8
00:00:30,200 --> 00:00:33,930
So in the age of A.I.,
where data is the new oil,

9
00:00:33,930 --> 00:00:36,170
China is the new Saudi Arabia.

10
00:00:36,170 --> 00:00:37,930
>> NARRATOR:
The future of work...

11
00:00:37,930 --> 00:00:40,100
>> When I increase productivity
through automation,

12
00:00:40,100 --> 00:00:42,070
jobs go away.

13
00:00:42,070 --> 00:00:46,100
>> I believe about 50% of jobs
will be somewhat

14
00:00:46,100 --> 00:00:50,000
or extremely threatened by A.I.
in the next 15 years or so.

15
00:00:50,000 --> 00:00:52,630
>> NARRATOR: A.I. and corporate
surveillance...

16
00:00:52,630 --> 00:00:55,500
>> We thought that we were
searching Google.

17
00:00:55,500 --> 00:00:57,930
We had no idea that Google
was searching us.

18
00:00:57,930 --> 00:01:00,170
>> NARRATOR: And the threat
to democracy.

19
00:01:00,170 --> 00:01:02,600
>> China is on its way
to building

20
00:01:02,600 --> 00:01:04,070
a total surveillance state.

21
00:01:04,070 --> 00:01:06,130
>> NARRATOR: Tonight on
"Frontline"...

22
00:01:06,130 --> 00:01:09,300
>> It has pervaded so many
elements of everyday life.

23
00:01:09,300 --> 00:01:11,930
How do we make it transparent
and accountable?

24
00:01:11,930 --> 00:01:13,770
>> NARRATOR:
..."In the Age of A.I."

25
00:01:16,530 --> 00:01:21,200
♪ ♪

26
00:01:34,900 --> 00:01:37,970
♪ ♪

27
00:01:42,700 --> 00:01:46,330
>> NARRATOR: This is the world's
most complex board game.

28
00:01:48,070 --> 00:01:51,530
There are more possible moves
in the game of Go

29
00:01:51,530 --> 00:01:55,800
than there are atoms
in the universe.

30
00:01:55,800 --> 00:02:01,700
Legend has it that in 2300 BCE,
Emperor Yao devised it

31
00:02:01,700 --> 00:02:08,000
to teach his son discipline,
concentration, and balance.

32
00:02:08,000 --> 00:02:12,570
And, over 4,000 years later,
this ancient Chinese game

33
00:02:12,570 --> 00:02:17,370
would signal the start
of a new industrial age.

34
00:02:17,370 --> 00:02:18,430
♪ ♪

35
00:02:25,930 --> 00:02:31,400
It was 2016, in Seoul,
South Korea.

36
00:02:31,400 --> 00:02:35,170
>> Can machines overtake
human intelligence?

37
00:02:35,170 --> 00:02:37,730
A breakthrough moment when the
world champion

38
00:02:37,730 --> 00:02:40,630
of the Asian board game Go
takes on an A.I. program

39
00:02:40,630 --> 00:02:42,530
developed by Google.

40
00:02:42,530 --> 00:02:49,430
>> (speaking Korean):

41
00:02:55,300 --> 00:02:56,770
>> In countries where
it's very popular,

42
00:02:56,770 --> 00:03:00,570
like China and Japan and,
and South Korea, to them,

43
00:03:00,570 --> 00:03:02,100
Go is not just a game, right?

44
00:03:02,100 --> 00:03:04,000
It's, like, how you learn
strategy.

45
00:03:04,000 --> 00:03:07,500
It has an almost spiritual
component.

46
00:03:07,500 --> 00:03:09,400
You know, if you talk
to South Koreans, right,

47
00:03:09,400 --> 00:03:11,500
and Lee Sedol is the world's
greatest Go player,

48
00:03:11,500 --> 00:03:13,730
he's a national hero
in South Korea.

49
00:03:13,730 --> 00:03:18,630
They were sure that Lee Sedol
would beat AlphaGo hands down.

50
00:03:18,630 --> 00:03:23,030
♪ ♪

51
00:03:23,030 --> 00:03:26,130
>> NARRATOR: Google's AlphaGo
was a computer program that,

52
00:03:26,130 --> 00:03:28,870
starting with the rules of Go

53
00:03:28,870 --> 00:03:31,330
and a database
of historical games,

54
00:03:31,330 --> 00:03:34,630
had been designed
to teach itself.

55
00:03:34,630 --> 00:03:38,700
>> I was one of the commentators
at the Lee Sedol games.

56
00:03:38,700 --> 00:03:42,700
And yes, it was watched by tens
of millions of people.

57
00:03:42,700 --> 00:03:44,300
(man speaking Korean)

58
00:03:44,300 --> 00:03:46,630
>> NARRATOR: Throughout
Southeast Asia,

59
00:03:46,630 --> 00:03:48,400
this was seen as
a sports spectacle

60
00:03:48,400 --> 00:03:49,800
with national pride at stake.

61
00:03:49,800 --> 00:03:51,030
>> Wow, that was a player guess.

62
00:03:51,030 --> 00:03:53,530
>> NARRATOR: But much more
was in play.

63
00:03:53,530 --> 00:03:55,730
This was the public unveiling

64
00:03:55,730 --> 00:03:57,830
of a form of artificial
intelligence

65
00:03:57,830 --> 00:04:00,400
called deep learning,

66
00:04:00,400 --> 00:04:03,400
that mimics the neural networks
of the human brain.

67
00:04:03,400 --> 00:04:05,430
>> So what happens with machine
learning,

68
00:04:05,430 --> 00:04:08,300
or artificial intelligence--
initially with AlphaGo--

69
00:04:08,300 --> 00:04:12,130
is that the machine is fed
all kinds of Go games,

70
00:04:12,130 --> 00:04:15,530
and then it studies them,
learns from them,

71
00:04:15,530 --> 00:04:17,830
and figures out its own moves.

72
00:04:17,830 --> 00:04:19,700
And because it's an A.I.
system--

73
00:04:19,700 --> 00:04:21,570
it's not just following
instructions,

74
00:04:21,570 --> 00:04:23,930
it's figuring out its own
instructions--

75
00:04:23,930 --> 00:04:26,930
it comes up with moves that
humans hadn't thought of before.

76
00:04:26,930 --> 00:04:31,130
So, it studies games that humans
have played, it knows the rules,

77
00:04:31,130 --> 00:04:36,130
and then it comes up
with creative moves.

78
00:04:36,130 --> 00:04:38,030
(woman speaking Korean)

79
00:04:39,600 --> 00:04:42,370
(speaking Korean):

80
00:04:42,370 --> 00:04:44,800
>> That's a very...
that's a very surprising move.

81
00:04:44,800 --> 00:04:47,670
>> I thought it was a mistake.

82
00:04:47,670 --> 00:04:51,470
>> NARRATOR: Game two, move 37.

83
00:04:51,470 --> 00:04:54,200
>> That move 37 was a move that
humans could not fathom,

84
00:04:54,200 --> 00:04:57,070
but yet it ended up being
brilliant

85
00:04:57,070 --> 00:05:00,470
and woke people up to say,

86
00:05:00,470 --> 00:05:03,100
"Wow, after thousands
of years of playing,

87
00:05:03,100 --> 00:05:06,330
we never thought about making
a move like that."

88
00:05:06,330 --> 00:05:08,370
>> Oh, he resigned.

89
00:05:08,370 --> 00:05:12,300
It looks like... Lee Sedol has
just resigned, actually.

90
00:05:12,300 --> 00:05:13,830
>> Yeah!
>> Yes.

91
00:05:13,830 --> 00:05:15,530
>> NARRATOR: In the end, the
scientists watched

92
00:05:15,530 --> 00:05:18,200
their algorithms win four
of the games.

93
00:05:18,200 --> 00:05:20,470
Lee Sedol took one.

94
00:05:20,470 --> 00:05:22,330
>> What happened with Go,
first and foremost,

95
00:05:22,330 --> 00:05:25,830
was a huge victory for DeepMind
and for A.I., right?

96
00:05:25,830 --> 00:05:28,170
It wasn't that the computers
beat the humans,

97
00:05:28,170 --> 00:05:31,970
it was that, you know, one type
of intelligence beat another.

98
00:05:31,970 --> 00:05:34,230
>> NARRATOR: Artificial
intelligence had proven

99
00:05:34,230 --> 00:05:36,770
it could marshal a vast amount
of data,

100
00:05:36,770 --> 00:05:40,300
beyond anything any human
could handle,

101
00:05:40,300 --> 00:05:44,400
and use it to teach itself how
to predict an outcome.

102
00:05:44,400 --> 00:05:48,400
The commercial implications
were enormous.

103
00:05:48,400 --> 00:05:51,670
>> AlphaGo is a,
is a toy game,

104
00:05:51,670 --> 00:05:57,530
but its success and its waking
everyone up, I think,

105
00:05:57,530 --> 00:06:03,770
is, is going to be remembered
as the pivotal moment

106
00:06:03,770 --> 00:06:07,230
where A.I. became mature

107
00:06:07,230 --> 00:06:09,100
and everybody jumped
on the bandwagon.

108
00:06:09,100 --> 00:06:10,570
♪ ♪

109
00:06:10,570 --> 00:06:14,270
>> NARRATOR: This is about the
consequences of that defeat.

110
00:06:14,270 --> 00:06:16,270
(man speaking local language)

111
00:06:16,270 --> 00:06:19,870
How the A.I. algorithms are
ushering in a new age

112
00:06:19,870 --> 00:06:24,200
of great potential and
prosperity,

113
00:06:24,200 --> 00:06:29,170
but an age that will also deepen
inequality, challenge democracy,

114
00:06:29,170 --> 00:06:35,200
and divide the world
into two A.I. superpowers.

115
00:06:35,200 --> 00:06:39,130
Tonight, five stories about how
artificial intelligence

116
00:06:39,130 --> 00:06:40,930
is changing our world.

117
00:06:40,930 --> 00:06:43,930
♪ ♪

118
00:06:51,800 --> 00:06:56,330
China has decided to chase
the A.I. future.

119
00:06:56,330 --> 00:06:58,770
>> The difference between
the internet mindset

120
00:06:58,770 --> 00:07:00,830
and the A.I. mindset...

121
00:07:00,830 --> 00:07:04,700
>> NARRATOR: A future made and
embraced by a new generation.

122
00:07:07,000 --> 00:07:10,770
>> Well, it's hard not to feel
the kind of immense energy,

123
00:07:10,770 --> 00:07:15,570
and also the obvious fact
of the demographics.

124
00:07:15,570 --> 00:07:18,770
They're mostly very young
people,

125
00:07:18,770 --> 00:07:22,830
so that this clearly is
technology which is being

126
00:07:22,830 --> 00:07:26,030
generated by a whole new
generation.

127
00:07:26,030 --> 00:07:27,800
>> NARRATOR: Orville Schell
is one of

128
00:07:27,800 --> 00:07:30,100
America's foremost
China scholars.

129
00:07:30,100 --> 00:07:31,730
>> (speaking Mandarin)

130
00:07:31,730 --> 00:07:34,830
>> NARRATOR: He first came here
45 years ago.

131
00:07:34,830 --> 00:07:38,270
>> When I, when I first came
here, in 1975,

132
00:07:38,270 --> 00:07:40,770
Chairman Mao was still alive,

133
00:07:40,770 --> 00:07:43,300
the Cultural Revolution
was coming on,

134
00:07:43,300 --> 00:07:47,830
and there wasn't a single whiff
of anything

135
00:07:47,830 --> 00:07:49,170
of what you see here.

136
00:07:49,170 --> 00:07:50,770
It was unimaginable.

137
00:07:50,770 --> 00:07:54,500
In fact, in those years,
one very much thought,

138
00:07:54,500 --> 00:08:00,530
"This is the way China is, this
is the way it's going to be."

139
00:08:00,530 --> 00:08:02,570
And the fact that it has gone
through

140
00:08:02,570 --> 00:08:06,330
so many different changes since
is quite extraordinary.

141
00:08:06,330 --> 00:08:08,270
(man giving instructions)

142
00:08:08,270 --> 00:08:11,770
>> NARRATOR: This extraordinary
progress goes back

143
00:08:11,770 --> 00:08:14,370
to that game of Go.

144
00:08:14,370 --> 00:08:16,830
>> I think that the government
recognized

145
00:08:16,830 --> 00:08:18,300
that this was a sort of critical
thing for the future,

146
00:08:18,300 --> 00:08:20,270
and, "We need to catch up
in this," that, you know,

147
00:08:20,270 --> 00:08:22,900
"We cannot have a foreign
company showing us up

148
00:08:22,900 --> 00:08:24,300
at our own game.

149
00:08:24,300 --> 00:08:25,730
And this is going to be
something that is going to be

150
00:08:25,730 --> 00:08:27,100
critically important
in the future."

151
00:08:27,100 --> 00:08:29,230
So, you know, we called it the
Sputnik moment for,

152
00:08:29,230 --> 00:08:31,000
for the Chinese government--

153
00:08:31,000 --> 00:08:33,970
the Chinese government kind of
woke up.

154
00:08:33,970 --> 00:08:36,600
>> (translated): As we often say
in China,

155
00:08:36,600 --> 00:08:39,700
"The beginning is the most
difficult part."

156
00:08:39,700 --> 00:08:42,630
>> NARRATOR: In 2017, Xi Jinping
announced

157
00:08:42,630 --> 00:08:44,570
the government's bold new plans

158
00:08:44,570 --> 00:08:47,570
to an audience
of foreign diplomats.

159
00:08:47,570 --> 00:08:51,000
China would catch up with the
U.S. in artificial intelligence

160
00:08:51,000 --> 00:08:55,170
by 2025 and lead the world
by 2030.

161
00:08:55,170 --> 00:08:57,500
>> (translated): ...and
intensified cooperation

162
00:08:57,500 --> 00:09:00,270
in frontier areas such as
digital economy,

163
00:09:00,270 --> 00:09:02,930
artificial intelligence,
nanotechnology,

164
00:09:02,930 --> 00:09:05,230
and quantum computing.

165
00:09:05,230 --> 00:09:08,730
♪ ♪

166
00:09:11,530 --> 00:09:15,200
>> NARRATOR: Today, China leads
the world in e-commerce.

167
00:09:18,370 --> 00:09:22,070
Drones deliver to rural
villages.

168
00:09:22,070 --> 00:09:25,070
And a society that bypassed
credit cards

169
00:09:25,070 --> 00:09:28,030
now shops in stores
without cashiers,

170
00:09:28,030 --> 00:09:33,200
where the currency
is facial recognition.

171
00:09:33,200 --> 00:09:36,230
>> No country has ever moved
that fast.

172
00:09:36,230 --> 00:09:38,730
And in a short two-and-a-half
years,

173
00:09:38,730 --> 00:09:43,400
China's A.I. implementation
really went from a minimal amount

174
00:09:43,400 --> 00:09:47,230
to probably about
17 or 18 unicorns,

175
00:09:47,230 --> 00:09:50,000
that is billion-dollar
companies, in A.I. today.

176
00:09:50,000 --> 00:09:55,130
And that, that progress is,
is hard to believe.

177
00:09:55,130 --> 00:09:57,830
>> NARRATOR: The progress was
powered by a new generation

178
00:09:57,830 --> 00:10:01,870
of ambitious young techs pouring
out of Chinese universities,

179
00:10:01,870 --> 00:10:05,570
competing with each other
for new ideas,

180
00:10:05,570 --> 00:10:11,630
and financed by a new cadre of
Chinese venture capitalists.

181
00:10:11,630 --> 00:10:13,600
This is Sinovation,

182
00:10:13,600 --> 00:10:17,100
created by U.S.-educated A.I.
scientist and businessman

183
00:10:17,100 --> 00:10:19,000
Kai-Fu Lee.

184
00:10:19,000 --> 00:10:24,170
>> These unicorns-- we've got
one, two, three, four, five,

185
00:10:24,170 --> 00:10:27,300
six, in the general A.I. area.

186
00:10:27,300 --> 00:10:29,630
And unicorn means a
billion-dollar company,

187
00:10:29,630 --> 00:10:33,870
a company whose valuation
or market capitalization

188
00:10:33,870 --> 00:10:36,870
is at $1 billion or higher.

189
00:10:36,870 --> 00:10:42,830
I think we put two unicorns
to show $5 billion or higher.

190
00:10:42,830 --> 00:10:45,300
>> NARRATOR: Kai-Fu Lee was born
in Taiwan.

191
00:10:45,300 --> 00:10:48,530
His parents sent him
to high school in Tennessee.

192
00:10:48,530 --> 00:10:51,270
His PhD thesis
at Carnegie Mellon

193
00:10:51,270 --> 00:10:53,900
was on computer speech
recognition,

194
00:10:53,900 --> 00:10:55,570
which took him to Apple.

195
00:10:55,570 --> 00:10:57,900
>> Well, reality is a step
closer to science fiction,

196
00:10:57,900 --> 00:11:00,770
with Apple Computer's
newly developed program...

197
00:11:00,770 --> 00:11:03,830
>> NARRATOR: And at 31,
an early measure of fame.

198
00:11:03,830 --> 00:11:06,100
>> Kai-Fu Lee,
the inventor of Apple's

199
00:11:06,100 --> 00:11:07,530
speech-recognition technology.

200
00:11:07,530 --> 00:11:10,400
>> Casper, copy this
to MacWrite II.

201
00:11:10,400 --> 00:11:12,730
Casper, paste.

202
00:11:12,730 --> 00:11:15,600
Casper, 72-point italic outline.

203
00:11:15,600 --> 00:11:18,970
>> NARRATOR: He would move on to
Microsoft research in Asia

204
00:11:18,970 --> 00:11:21,430
and became the head
of Google China.

205
00:11:21,430 --> 00:11:26,530
Ten years ago, he started
Sinovation in Beijing,

206
00:11:26,530 --> 00:11:30,700
and began looking for promising
startups and A.I. talent.

207
00:11:30,700 --> 00:11:33,500
>> So, the Chinese
entrepreneurial companies

208
00:11:33,500 --> 00:11:35,500
started as copycats.

209
00:11:35,500 --> 00:11:39,570
But over the last 15 years,
China has developed its own form

210
00:11:39,570 --> 00:11:45,100
of entrepreneurship, and that
entrepreneurship is described

211
00:11:45,100 --> 00:11:50,130
as tenacious, very fast,
winner-take-all,

212
00:11:50,130 --> 00:11:52,930
and an incredible work ethic.

213
00:11:52,930 --> 00:11:57,430
I would say these few thousand
Chinese top entrepreneurs,

214
00:11:57,430 --> 00:11:59,230
they could take on any
entrepreneur

215
00:11:59,230 --> 00:12:01,400
anywhere in the world.

216
00:12:01,400 --> 00:12:04,170
>> NARRATOR: Entrepreneurs like
Cao Xudong,

217
00:12:04,170 --> 00:12:10,100
the 33-year-old C.E.O. of
a new startup called Momenta.

218
00:12:10,100 --> 00:12:12,700
This is a ring road around
Beijing.

219
00:12:12,700 --> 00:12:15,470
The car is driving itself.

220
00:12:15,470 --> 00:12:18,730
♪ ♪

221
00:12:21,230 --> 00:12:24,470
>> You see, another cutting,
another cutting-in.

222
00:12:24,470 --> 00:12:26,670
>> Another cut-in, yeah, yeah.

223
00:12:26,670 --> 00:12:29,470
>> NARRATOR: Cao has no doubt
about the inevitability

224
00:12:29,470 --> 00:12:33,430
of autonomous vehicles.

225
00:12:33,430 --> 00:12:39,130
>> Just like AlphaGo can beat
the human player in, in Go,

226
00:12:39,130 --> 00:12:43,570
I think the machine will
definitely surpass

227
00:12:43,570 --> 00:12:47,370
the human driver, in the end.

228
00:12:47,370 --> 00:12:48,730
>> NARRATOR: Recently, there
have been cautions

229
00:12:48,730 --> 00:12:53,530
about how soon autonomous
vehicles will be deployed,

230
00:12:53,530 --> 00:12:55,700
but Cao and his team are
confident

231
00:12:55,700 --> 00:12:58,730
they're in for the long haul.

232
00:12:58,730 --> 00:13:01,030
>> U.S. will be the first
to deploy,

233
00:13:01,030 --> 00:13:03,830
but China may be the first
to popularize.

234
00:13:03,830 --> 00:13:05,270
It is 50-50 right now.

235
00:13:05,270 --> 00:13:07,000
U.S. is ahead in technology.

236
00:13:07,000 --> 00:13:10,030
China has a larger market,
and the Chinese government

237
00:13:10,030 --> 00:13:12,870
is helping with infrastructure
efforts--

238
00:13:12,870 --> 00:13:16,100
for example, building a new city
the size of Chicago

239
00:13:16,100 --> 00:13:18,670
with autonomous driving enabled,

240
00:13:18,670 --> 00:13:21,700
and also a new highway that has
sensors built in

241
00:13:21,700 --> 00:13:24,230
to help autonomous vehicles
be safer.

242
00:13:24,230 --> 00:13:27,470
>> NARRATOR: Their early
investors included

243
00:13:27,470 --> 00:13:29,470
Mercedes-Benz.

244
00:13:29,470 --> 00:13:33,430
>> I feel very lucky and very
inspired

245
00:13:33,430 --> 00:13:38,000
and very excited that we're
living in this era.

246
00:13:38,000 --> 00:13:40,800
♪ ♪

247
00:13:40,800 --> 00:13:42,630
>> NARRATOR: Life in China is
largely conducted

248
00:13:42,630 --> 00:13:45,000
on smartphones.

249
00:13:45,000 --> 00:13:48,270
A billion people use WeChat,
the equivalent of Facebook,

250
00:13:48,270 --> 00:13:51,030
Messenger, and PayPal,
and much more,

251
00:13:51,030 --> 00:13:54,200
combined into just one
super-app.

252
00:13:54,200 --> 00:13:55,870
And there are many more.

253
00:13:55,870 --> 00:14:00,070
>> China is the best place
for A.I. implementation today,

254
00:14:00,070 --> 00:14:04,230
because of the vast amount of data
that's available in China.

255
00:14:04,230 --> 00:14:07,670
China has a lot more users than
any other country,

256
00:14:07,670 --> 00:14:10,700
three to four times more than
the U.S.

257
00:14:10,700 --> 00:14:14,900
There are 50 times more mobile
payments than the U.S.

258
00:14:14,900 --> 00:14:17,300
There are ten times more food
deliveries,

259
00:14:17,300 --> 00:14:21,030
which serve as data to learn
more about user behavior

260
00:14:21,030 --> 00:14:22,870
than the U.S.

261
00:14:22,870 --> 00:14:26,570
300 times more shared bicycle
rides,

262
00:14:26,570 --> 00:14:30,400
and each shared bicycle ride
has all kinds of sensors

263
00:14:30,400 --> 00:14:32,730
submitting data up to the cloud.

264
00:14:32,730 --> 00:14:36,230
We're talking about maybe ten
times more data than the U.S.,

265
00:14:36,230 --> 00:14:41,230
and A.I. is basically run on
data and fueled by data.

266
00:14:41,230 --> 00:14:44,400
The more data, the better
the A.I. works,

267
00:14:44,400 --> 00:14:47,500
more important than how
brilliant the researcher is

268
00:14:47,500 --> 00:14:49,000
working on the problem.

269
00:14:49,000 --> 00:14:54,100
So, in the age of A.I.,
where data is the new oil,

270
00:14:54,100 --> 00:14:57,370
China is the new Saudi Arabia.

271
00:14:57,370 --> 00:14:59,570
>> NARRATOR: And access to all
that data

272
00:14:59,570 --> 00:15:02,870
means that the deep-learning
algorithm can quickly predict

273
00:15:02,870 --> 00:15:05,370
behavior, like the
creditworthiness of someone

274
00:15:05,370 --> 00:15:06,870
wanting a short-term loan.

275
00:15:06,870 --> 00:15:09,270
>> Here is our application.

276
00:15:09,270 --> 00:15:13,900
And customers can choose how much
money they want to borrow

277
00:15:13,900 --> 00:15:16,830
and how long they want
to borrow,

278
00:15:16,830 --> 00:15:21,130
and they can input
their data here.

279
00:15:21,130 --> 00:15:27,530
And after, after that, you can
just borrow very quickly.

280
00:15:27,530 --> 00:15:31,030
>> NARRATOR: The C.E.O. shows us
how quickly you can get a loan.

281
00:15:31,030 --> 00:15:33,100
>> It is, it is done.

282
00:15:33,100 --> 00:15:35,430
>> NARRATOR: It takes an average
of eight seconds.

283
00:15:35,430 --> 00:15:38,400
>> It has passed to banks.
>> Wow.

284
00:15:38,400 --> 00:15:40,170
>> NARRATOR:
In the eight seconds,

285
00:15:40,170 --> 00:15:42,930
the algorithm has assessed
5,000 personal features

286
00:15:42,930 --> 00:15:44,630
from all your data.

287
00:15:44,630 --> 00:15:50,500
>> 5,000 features that are
related to delinquency,

288
00:15:50,500 --> 00:15:57,800
when maybe the banks only use
a few, maybe, maybe ten features

289
00:15:57,800 --> 00:16:02,170
when they are doing
their risk assessment.

290
00:16:02,170 --> 00:16:03,630
>> NARRATOR: Processing millions
of transactions,

291
00:16:03,630 --> 00:16:06,930
it'll dig up features that would
never be apparent

292
00:16:06,930 --> 00:16:11,870
to a human loan officer,
like how confidently you type

293
00:16:11,870 --> 00:16:15,600
your loan application,
or, surprisingly,

294
00:16:15,600 --> 00:16:18,670
if you keep your cell phone
battery charged.

295
00:16:18,670 --> 00:16:21,300
>> It's very interesting, the
battery of the phone

296
00:16:21,300 --> 00:16:24,170
is related to their
delinquency rate.

297
00:16:24,170 --> 00:16:26,530
Someone who has a much
lower battery,

298
00:16:26,530 --> 00:16:31,400
they are much more dangerous
than others.

299
00:16:31,400 --> 00:16:34,370
>> It's probably unfathomable
to an American

300
00:16:34,370 --> 00:16:39,600
how a country can dramatically
evolve itself

301
00:16:39,600 --> 00:16:43,470
from a copycat laggard,
all of a sudden,

302
00:16:43,470 --> 00:16:48,030
to nearly as good as the U.S. in
technology.

303
00:16:48,030 --> 00:16:50,300
>> NARRATOR: Like this
facial-recognition startup

304
00:16:50,300 --> 00:16:51,800
he invested in.

305
00:16:51,800 --> 00:16:56,470
Megvii was started by three
young graduates in 2011.

306
00:16:56,470 --> 00:17:00,700
It's now a world leader in using
A.I. to identify people.

307
00:17:03,530 --> 00:17:05,000
>> It's pretty fast.

308
00:17:05,000 --> 00:17:07,530
For example,
on the mobile device,

309
00:17:07,530 --> 00:17:10,670
we have timed the
facial-recognition speed.

310
00:17:10,670 --> 00:17:13,830
It's actually less
than 100 milliseconds.

311
00:17:13,830 --> 00:17:15,830
So, that's very, very fast.

312
00:17:15,830 --> 00:17:19,970
So 0.1 second that we can, we
will be able to recognize you,

313
00:17:19,970 --> 00:17:24,200
even on a mobile device.

314
00:17:24,200 --> 00:17:26,300
>> NARRATOR: The company claims
the system is better

315
00:17:26,300 --> 00:17:30,170
than any human at identifying
people in its database.

316
00:17:30,170 --> 00:17:33,770
And for those who aren't,
it can describe them.

317
00:17:33,770 --> 00:17:36,530
Like our director--
what he's wearing,

318
00:17:36,530 --> 00:17:42,230
and a good guess at his age,
missing it by only a few months.

319
00:17:42,230 --> 00:17:46,970
>> We are the first one to
really take facial recognition

320
00:17:46,970 --> 00:17:50,570
to commercial quality.

321
00:17:50,570 --> 00:17:52,070
>> NARRATOR: That's why in
Beijing today,

322
00:17:52,070 --> 00:17:57,630
you can pay for your KFC
with a smile.

323
00:17:57,630 --> 00:17:59,070
>> You know, it's not so
surprising,

324
00:17:59,070 --> 00:18:01,230
we've seen Chinese companies
catching up to the U.S.

325
00:18:01,230 --> 00:18:02,630
in technology for a long time.

326
00:18:02,630 --> 00:18:05,200
And so, if particular effort
and attention is paid

327
00:18:05,200 --> 00:18:07,700
in a specific sector,
it's not so surprising

328
00:18:07,700 --> 00:18:09,600
that they would surpass
the rest of the world.

329
00:18:09,600 --> 00:18:12,000
And facial recognition is one of
the, really the first places

330
00:18:12,000 --> 00:18:15,300
we've seen that start to happen.

331
00:18:15,300 --> 00:18:18,230
>> NARRATOR: It's a technology
prized by the government,

332
00:18:18,230 --> 00:18:23,370
like this program in Shenzhen
to discourage jaywalking.

333
00:18:23,370 --> 00:18:27,400
Offenders are shamed in public--
and with facial recognition,

334
00:18:27,400 --> 00:18:31,330
can be instantly fined.

335
00:18:31,330 --> 00:18:34,570
Critics warn that the government
and some private companies

336
00:18:34,570 --> 00:18:37,170
have been building a national
database

337
00:18:37,170 --> 00:18:41,330
from dozens of experimental
social-credit programs.

338
00:18:41,330 --> 00:18:43,500
>> The government wants to
integrate

339
00:18:43,500 --> 00:18:48,670
all these individual behaviors,
or corporations' records,

340
00:18:48,670 --> 00:18:55,670
into some kind of metrics and
compute out a single number

341
00:18:55,670 --> 00:18:59,030
or a set of numbers associated
with an individual,

342
00:18:59,030 --> 00:19:04,670
a citizen, and using that
to implement an incentive

343
00:19:04,670 --> 00:19:06,130
or punishment system.

344
00:19:06,130 --> 00:19:07,400
>> NARRATOR: A high
social-credit number

345
00:19:07,400 --> 00:19:11,100
can be rewarded with discounts
on bus fares.

346
00:19:11,100 --> 00:19:15,800
A low number can lead
to a travel ban.

347
00:19:15,800 --> 00:19:18,430
Some say it's very popular
with a Chinese public

348
00:19:18,430 --> 00:19:21,500
that wants to punish
bad behavior.

349
00:19:21,500 --> 00:19:25,070
Others see a future that rewards
party loyalty

350
00:19:25,070 --> 00:19:28,570
and silences criticism.

351
00:19:28,570 --> 00:19:32,970
>> Right now, there is no final
system being implemented.

352
00:19:32,970 --> 00:19:41,070
And from those experiments, we
already see the possibility

353
00:19:41,070 --> 00:19:44,400
of what this social-credit
system can do to individuals.

354
00:19:44,400 --> 00:19:48,400
It's very powerful--
Orwellian-like--

355
00:19:48,400 --> 00:19:56,170
and it's extremely troublesome
in terms of civil liberty.

356
00:19:56,170 --> 00:19:58,270
>> NARRATOR: Every evening
in Shanghai,

357
00:19:58,270 --> 00:20:01,270
ever-present cameras record the
crowds

358
00:20:01,270 --> 00:20:03,430
as they surge down to the Bund,

359
00:20:03,430 --> 00:20:07,530
the promenade along the banks
of the Huangpu River.

360
00:20:07,530 --> 00:20:10,830
Once the great trading houses of
Europe came here to do business

361
00:20:10,830 --> 00:20:12,570
with the Middle Kingdom.

362
00:20:12,570 --> 00:20:15,600
In the last century,
they were all shut down

363
00:20:15,600 --> 00:20:18,200
by Mao's revolution.

364
00:20:18,200 --> 00:20:20,530
But now, in the age of A.I.,

365
00:20:20,530 --> 00:20:22,670
people come here to take
in a spectacle

366
00:20:22,670 --> 00:20:26,100
that reflects China's
remarkable progress.

367
00:20:26,100 --> 00:20:28,630
(spectators gasp)

368
00:20:28,630 --> 00:20:32,070
And illuminates the great
political paradox of capitalism

369
00:20:32,070 --> 00:20:37,200
taking root
in the communist state.

370
00:20:37,200 --> 00:20:40,800
>> People have called it
market Leninism,

371
00:20:40,800 --> 00:20:43,570
authoritarian capitalism.

372
00:20:43,570 --> 00:20:46,970
We are watching a kind
of a Petri dish

373
00:20:46,970 --> 00:20:54,270
in which an experiment of, you
know, extraordinary importance

374
00:20:54,270 --> 00:20:55,970
to the world is
being carried out.

375
00:20:55,970 --> 00:20:59,030
Whether you can combine these
things

376
00:20:59,030 --> 00:21:02,170
and get something
that's more powerful,

377
00:21:02,170 --> 00:21:04,870
that's coherent,
that's durable in the world.

378
00:21:04,870 --> 00:21:07,600
Whether you can bring together
a one-party state

379
00:21:07,600 --> 00:21:12,370
with an innovative sector,
both economically

380
00:21:12,370 --> 00:21:14,400
and technologically innovative,

381
00:21:14,400 --> 00:21:20,700
and that's something we thought
could not coexist.

382
00:21:20,700 --> 00:21:23,070
>> NARRATOR:
As China reinvents itself,

383
00:21:23,070 --> 00:21:25,170
it has set its sights
on leading the world

384
00:21:25,170 --> 00:21:29,170
in artificial intelligence
by 2030.

385
00:21:29,170 --> 00:21:32,230
But that means taking on the
world's most innovative

386
00:21:32,230 --> 00:21:34,100
A.I. culture.

387
00:21:34,100 --> 00:21:37,600
♪ ♪

388
00:21:46,900 --> 00:21:49,930
On an interstate
in the U.S. Southwest,

389
00:21:49,930 --> 00:21:52,830
artificial intelligence is at
work solving the problem

390
00:21:52,830 --> 00:21:56,030
that's become emblematic
of the new age,

391
00:21:56,030 --> 00:21:58,800
replacing a human driver.

392
00:21:58,800 --> 00:22:04,370
♪ ♪

393
00:22:04,370 --> 00:22:08,730
This is the company's C.E.O.,
24-year-old Alex Rodrigues.

394
00:22:11,630 --> 00:22:13,800
>> The more things we build
successfully,

395
00:22:13,800 --> 00:22:15,970
the less people ask questions

396
00:22:15,970 --> 00:22:18,870
about how old you are when you
have working trucks.

397
00:22:18,870 --> 00:22:21,800
>> NARRATOR: And this is what
he's built.

398
00:22:21,800 --> 00:22:24,670
Commercial goods are being
driven from California

399
00:22:24,670 --> 00:22:29,170
to Arizona on Interstate 10.

400
00:22:29,170 --> 00:22:34,270
There is a driver in the cab,
but he's not driving.

401
00:22:34,270 --> 00:22:40,630
It's a path set by a C.E.O.
with an unusual CV.

402
00:22:40,630 --> 00:22:42,930
>> Are we ready, Henry?

403
00:22:42,930 --> 00:22:47,730
The aim is to score these pucks
into the scoring area.

404
00:22:47,730 --> 00:22:51,400
So I, I did competitive robotics
starting when I was 11,

405
00:22:51,400 --> 00:22:53,130
and I took it very, very
seriously.

406
00:22:53,130 --> 00:22:55,900
To, to give you a sense, I won
the Robotics World Championships

407
00:22:55,900 --> 00:22:57,830
for the first time
when I was 13.

408
00:22:57,830 --> 00:22:59,430
I've been to worlds seven times

409
00:22:59,430 --> 00:23:02,200
between the ages of 13
and 20-ish.

410
00:23:02,200 --> 00:23:04,330
I eventually founded a team,

411
00:23:04,330 --> 00:23:07,000
did a lot of work at a
very high competitive level.

412
00:23:07,000 --> 00:23:08,470
Things looking pretty good.

413
00:23:08,470 --> 00:23:10,930
>> NARRATOR: This was a
prototype of sorts,

414
00:23:10,930 --> 00:23:15,130
from which he has built his
multi-million-dollar company.

415
00:23:15,130 --> 00:23:18,100
>> I hadn't built a robot in a
while, wanted to get back to it,

416
00:23:18,100 --> 00:23:21,030
and felt that this was by far
the most exciting piece

417
00:23:21,030 --> 00:23:22,930
of robotics technology that was
up and coming.

418
00:23:22,930 --> 00:23:25,170
A lot of people told us we
wouldn't be able to build it.

419
00:23:25,170 --> 00:23:28,570
But I knew roughly the techniques
that you would use.

420
00:23:28,570 --> 00:23:30,370
And I was pretty confident that
if you put them together,

421
00:23:30,370 --> 00:23:32,330
you would get something
that worked.

422
00:23:32,330 --> 00:23:35,900
Took the summer off, built in my
parents' garage a golf cart

423
00:23:35,900 --> 00:23:40,470
that could drive itself.

424
00:23:40,470 --> 00:23:42,430
>> NARRATOR: That golf cart
got the attention

425
00:23:42,430 --> 00:23:45,400
of Silicon Valley,
and the first of several rounds

426
00:23:45,400 --> 00:23:47,570
of venture capital.

427
00:23:47,570 --> 00:23:50,670
He formed a team and then
decided the business opportunity

428
00:23:50,670 --> 00:23:53,700
was in self-driving trucks.

429
00:23:53,700 --> 00:23:56,470
He says there's also
a human benefit.

430
00:23:56,470 --> 00:23:58,630
>> If we can build a truck
that's ten times safer

431
00:23:58,630 --> 00:24:02,770
than a human driver, then not
much else actually matters.

432
00:24:02,770 --> 00:24:05,770
When we talk to regulators,
especially,

433
00:24:05,770 --> 00:24:08,930
everyone agrees that the only
way that we're going to get

434
00:24:08,930 --> 00:24:11,770
to zero highway deaths,
which is everyone's objective,

435
00:24:11,770 --> 00:24:13,800
is to use self-driving.

436
00:24:13,800 --> 00:24:17,030
And so, I'm sure you've heard
the statistic,

437
00:24:17,030 --> 00:24:19,230
more than 90% of all crashes

438
00:24:19,230 --> 00:24:20,870
have a human driver
as the cause.

439
00:24:20,870 --> 00:24:24,230
So if you want to solve
traffic fatalities,

440
00:24:24,230 --> 00:24:28,170
which, in my opinion, are the
single biggest tragedy

441
00:24:28,170 --> 00:24:30,970
that happens year after year
in the United States,

442
00:24:30,970 --> 00:24:33,800
this is the only solution.

443
00:24:33,800 --> 00:24:36,230
>> NARRATOR:
It's an ambitious goal,

444
00:24:36,230 --> 00:24:38,430
but only possible because
of the recent breakthroughs

445
00:24:38,430 --> 00:24:40,170
in deep learning.

446
00:24:40,170 --> 00:24:42,300
>> Artificial intelligence is
one of those key pieces

447
00:24:42,300 --> 00:24:46,530
that has made it possible now
to do driverless vehicles

448
00:24:46,530 --> 00:24:49,070
where it wasn't possible
ten years ago,

449
00:24:49,070 --> 00:24:53,870
particularly in the ability
to see and understand scenes.

450
00:24:53,870 --> 00:24:57,130
A lot of people don't know this,
but it's remarkably hard

451
00:24:57,130 --> 00:24:58,870
for computers,
until very, very recently,

452
00:24:58,870 --> 00:25:02,800
to do even the most basic
visual tasks,

453
00:25:02,800 --> 00:25:04,530
like seeing a picture
of a person

454
00:25:04,530 --> 00:25:06,070
and knowing that it's a person.

455
00:25:06,070 --> 00:25:09,270
And we've made gigantic strides
with artificial intelligence

456
00:25:09,270 --> 00:25:11,530
in being able to do seeing and
understanding tasks,

457
00:25:11,530 --> 00:25:14,000
and that's obviously fundamental
to being able to understand

458
00:25:14,000 --> 00:25:15,770
the world around you
with the sensors that,

459
00:25:15,770 --> 00:25:19,870
that you have available.

460
00:25:19,870 --> 00:25:21,270
>> NARRATOR: That's now possible

461
00:25:21,270 --> 00:25:23,970
because of the algorithms
written by Yoshua Bengio

462
00:25:23,970 --> 00:25:28,070
and a small group of scientists.

463
00:25:28,070 --> 00:25:30,630
>> There are many aspects
of the world

464
00:25:30,630 --> 00:25:34,200
which we can't explain
with words.

465
00:25:34,200 --> 00:25:36,400
And that part of our knowledge
is actually

466
00:25:36,400 --> 00:25:39,170
probably the majority of it.

467
00:25:39,170 --> 00:25:41,400
So, like, the stuff we can
communicate verbally

468
00:25:41,400 --> 00:25:43,370
is the tip of the iceberg.

469
00:25:43,370 --> 00:25:48,600
And so to get at the bottom of
the iceberg, the solution was,

470
00:25:48,600 --> 00:25:53,000
the computers have to acquire
that knowledge by themselves

471
00:25:53,000 --> 00:25:54,500
from data, from examples.

472
00:25:54,500 --> 00:25:58,400
Just like children learn,
most not from their teachers,

473
00:25:58,400 --> 00:26:01,370
but from interacting
with the world,

474
00:26:01,370 --> 00:26:03,500
and playing around, and, and
trying things

475
00:26:03,500 --> 00:26:05,570
and seeing what works
and what doesn't work.

476
00:26:05,570 --> 00:26:07,870
>> NARRATOR: This is an early
demonstration.

477
00:26:07,870 --> 00:26:12,470
In 2013, DeepMind scientists
set a machine-learning program

478
00:26:12,470 --> 00:26:16,070
on the Atari video game
Breakout.

479
00:26:16,070 --> 00:26:19,870
The computer was only told
the goal-- to win the game.

480
00:26:19,870 --> 00:26:24,230
After 100 games, it learned to
use the bat at the bottom

481
00:26:24,230 --> 00:26:27,770
to hit the ball and break
the bricks at the top.

482
00:26:27,770 --> 00:26:33,030
After 300, it could do that
better than a human player.

483
00:26:33,030 --> 00:26:37,970
After 500 games, it came up with
a creative way to win the game--

484
00:26:37,970 --> 00:26:40,730
by digging a tunnel on the side

485
00:26:40,730 --> 00:26:42,270
and sending the ball
around the top

486
00:26:42,270 --> 00:26:44,730
to break many bricks
with one hit.

487
00:26:44,730 --> 00:26:48,170
That was deep learning.

488
00:26:48,170 --> 00:26:50,630
>> That's the A.I. program based
on learning,

489
00:26:50,630 --> 00:26:52,430
really, that has been
so successful

490
00:26:52,430 --> 00:26:54,870
in the last few years and has...

491
00:26:54,870 --> 00:26:57,430
It wasn't clear ten years ago
that it would work,

492
00:26:57,430 --> 00:27:00,600
but it has completely changed
the map

493
00:27:00,600 --> 00:27:06,570
and is now used in almost
every sector of society.

494
00:27:06,570 --> 00:27:08,970
>> Even the best and brightest
among us,

495
00:27:08,970 --> 00:27:11,000
we just don't have enough
compute power

496
00:27:11,000 --> 00:27:13,530
inside of our heads.

497
00:27:13,530 --> 00:27:16,000
>> NARRATOR: Amy Webb is a
professor at N.Y.U.

498
00:27:16,000 --> 00:27:19,970
and founder of the Future Today
Institute.

499
00:27:19,970 --> 00:27:26,270
>> As A.I. progresses, the great
promise is that they...

500
00:27:26,270 --> 00:27:30,700
they, these, these machines,
alongside of us,

501
00:27:30,700 --> 00:27:34,330
are able to think and imagine
and see things

502
00:27:34,330 --> 00:27:36,730
in ways that we never have
before,

503
00:27:36,730 --> 00:27:40,370
which means that maybe we have
some kind of new,

504
00:27:40,370 --> 00:27:45,330
weird, seemingly implausible
solution to climate change.

505
00:27:45,330 --> 00:27:49,530
Maybe we have some radically
different approach

506
00:27:49,530 --> 00:27:52,930
to dealing with
incurable cancers.

507
00:27:52,930 --> 00:27:58,330
The real practical and wonderful
promise is that machines help us

508
00:27:58,330 --> 00:28:02,170
be more creative, and,
using that creativity,

509
00:28:02,170 --> 00:28:06,430
we get to terrific solutions.

510
00:28:06,430 --> 00:28:09,500
>> NARRATOR: Solutions that
could come unexpectedly

511
00:28:09,500 --> 00:28:11,770
to urgent problems.

512
00:28:11,770 --> 00:28:13,700
>> It's going to change
the face of breast cancer.

513
00:28:13,700 --> 00:28:16,870
Right now, 40,000 women
in the U.S. alone

514
00:28:16,870 --> 00:28:19,670
die from breast cancer
every single year.

515
00:28:19,670 --> 00:28:21,870
>> NARRATOR: Dr. Connie Lehman
is head

516
00:28:21,870 --> 00:28:23,230
of the breast imaging center

517
00:28:23,230 --> 00:28:26,470
at Massachusetts General
Hospital in Boston.

518
00:28:26,470 --> 00:28:28,930
>> We've become so complacent
about it,

519
00:28:28,930 --> 00:28:31,070
we almost don't think it can
really be changed.

520
00:28:31,070 --> 00:28:33,470
We, we somehow think we should
put all of our energy

521
00:28:33,470 --> 00:28:36,670
into chemotherapies
to save women

522
00:28:36,670 --> 00:28:38,370
with metastatic breast cancer,

523
00:28:38,370 --> 00:28:41,430
and yet, you know, when we find
it early, we cure it,

524
00:28:41,430 --> 00:28:44,730
and we cure it without having
the ravages to the body

525
00:28:44,730 --> 00:28:46,830
when we diagnose it late.

526
00:28:46,830 --> 00:28:51,700
This shows the progression of a
small, small spot from one year

527
00:28:51,700 --> 00:28:54,530
to the next,
and then to the diagnosis

528
00:28:54,530 --> 00:28:57,730
of the small cancer here.

529
00:28:57,730 --> 00:28:59,830
>> NARRATOR: This is what
happened when a woman

530
00:28:59,830 --> 00:29:02,100
who had been diagnosed
with breast cancer

531
00:29:02,100 --> 00:29:04,170
started to ask questions

532
00:29:04,170 --> 00:29:07,370
about why it couldn't have been
diagnosed earlier.

533
00:29:07,370 --> 00:29:10,170
>> It really brings a lot of
anxiety,

534
00:29:10,170 --> 00:29:12,270
and you're asking the questions,
you know,

535
00:29:12,270 --> 00:29:13,530
"Am I going to survive?

536
00:29:13,530 --> 00:29:15,200
What's going to happen
to my son?"

537
00:29:15,200 --> 00:29:19,230
And I start asking
other questions.

538
00:29:19,230 --> 00:29:21,700
>> NARRATOR: She was used to
asking questions.

539
00:29:21,700 --> 00:29:24,970
At M.I.T.'s
artificial-intelligence lab,

540
00:29:24,970 --> 00:29:27,970
Professor Regina Barzilay uses
deep learning

541
00:29:27,970 --> 00:29:31,100
to teach the computer to
understand language,

542
00:29:31,100 --> 00:29:34,270
as well as read text and data.

543
00:29:34,270 --> 00:29:37,600
>> I was really surprised
that the very basic question

544
00:29:37,600 --> 00:29:39,970
that I ask my physicians,

545
00:29:39,970 --> 00:29:43,330
who were really excellent
physicians here at MGH,

546
00:29:43,330 --> 00:29:47,030
they couldn't give me answers
that I was looking for.

547
00:29:47,030 --> 00:29:50,770
>> NARRATOR: She was convinced
that if you analyze enough data,

548
00:29:50,770 --> 00:29:53,530
from mammograms
to diagnostic notes,

549
00:29:53,530 --> 00:29:56,800
the computer could predict
early-stage conditions.

550
00:29:56,800 --> 00:30:02,830
>> If we fast-forward from 2012
to '13 to 2014,

551
00:30:02,830 --> 00:30:05,900
we then see when Regina
was diagnosed,

552
00:30:05,900 --> 00:30:10,170
because of this spot on her
mammogram.

553
00:30:10,170 --> 00:30:14,830
Is it possible, with more
elegant computer applications,

554
00:30:14,830 --> 00:30:19,100
that we might have identified
this spot the year before,

555
00:30:19,100 --> 00:30:21,200
or even back here?

556
00:30:21,200 --> 00:30:22,830
>> So, those are standard
prediction problems

557
00:30:22,830 --> 00:30:26,870
in machine learning-- there is
nothing special about them.

558
00:30:26,870 --> 00:30:29,900
And to my big surprise,
none of the technologies

559
00:30:29,900 --> 00:30:33,130
that we are developing
at M.I.T.,

560
00:30:33,130 --> 00:30:38,730
even in the most simple form,
penetrates the hospital.

561
00:30:38,730 --> 00:30:41,670
>> NARRATOR: Regina and Connie
began the slow process

562
00:30:41,670 --> 00:30:45,070
of getting access to thousands
of mammograms and records

563
00:30:45,070 --> 00:30:46,770
from MGH's breast-imaging
program.

564
00:30:49,470 --> 00:30:53,130
>> So, our first foray was just
to take all of the patients

565
00:30:53,130 --> 00:30:56,130
we had at MGH during
a period of time,

566
00:30:56,130 --> 00:30:58,530
who had had breast surgery
for a certain type

567
00:30:58,530 --> 00:31:00,430
of high-risk lesion.

568
00:31:00,430 --> 00:31:03,700
And we found that most of them
didn't really need the surgery.

569
00:31:03,700 --> 00:31:05,070
They didn't have cancer.

570
00:31:05,070 --> 00:31:07,670
But about ten percent
did have cancer.

571
00:31:07,670 --> 00:31:10,570
With Regina's techniques
in deep learning

572
00:31:10,570 --> 00:31:13,370
and machine learning, we were
able to predict the women

573
00:31:13,370 --> 00:31:15,600
that truly needed the surgery
and separate out

574
00:31:15,600 --> 00:31:19,470
those that really could avoid
the unnecessary surgery.

575
00:31:19,470 --> 00:31:23,030
>> What machine can do, it can
take hundreds of thousands

576
00:31:23,030 --> 00:31:25,730
of images where the outcome
is known

577
00:31:25,730 --> 00:31:30,700
and learn, based on how, you
know, pixels are distributed,

578
00:31:30,700 --> 00:31:35,170
what are the very unique
patterns that correlate highly

579
00:31:35,170 --> 00:31:38,370
with future occurrence
of the disease.

580
00:31:38,370 --> 00:31:40,900
So, instead of using human
capacity

581
00:31:40,900 --> 00:31:44,770
to kind of recognize pattern,
formalize pattern--

582
00:31:44,770 --> 00:31:48,700
which is inherently limited
by our cognitive capacity

583
00:31:48,700 --> 00:31:50,800
and how much we can see
and remember--

584
00:31:50,800 --> 00:31:53,700
we're providing the machine with
a lot of data

585
00:31:53,700 --> 00:31:57,630
and making it learn
this prediction.

586
00:31:57,630 --> 00:32:02,370
>> So, we are using technology
not only to be better

587
00:32:02,370 --> 00:32:04,770
at assessing the breast density,

588
00:32:04,770 --> 00:32:07,200
but to get more to the point of
what we're trying to predict.

589
00:32:07,200 --> 00:32:10,930
"Does this woman have
a cancer now,

590
00:32:10,930 --> 00:32:13,170
and will she develop a cancer
in five years?"

591
00:32:13,170 --> 00:32:16,770
And that's, again, where
the artificial intelligence,

592
00:32:16,770 --> 00:32:18,700
machine and deep learning can
really help us

593
00:32:18,700 --> 00:32:20,770
and our patients.

594
00:32:20,770 --> 00:32:22,830
>> NARRATOR: In the age of A.I.,

595
00:32:22,830 --> 00:32:26,330
the algorithms are transporting
us into a universe

596
00:32:26,330 --> 00:32:29,970
of vast potential and
transforming almost every aspect

597
00:32:29,970 --> 00:32:34,200
of human endeavor and
experience.

598
00:32:34,200 --> 00:32:38,000
Andrew McAfee is a research
scientist at M.I.T.

599
00:32:38,000 --> 00:32:42,000
who co-authored
"The Second Machine Age."

600
00:32:42,000 --> 00:32:45,070
>> The great compliment that a
songwriter gives another one is,

601
00:32:45,070 --> 00:32:46,600
"Gosh, I wish I had written
that one."

602
00:32:46,600 --> 00:32:49,100
The great compliment a geek
gives another one is,

603
00:32:49,100 --> 00:32:50,900
"Wow, I wish I had drawn
that graph."

604
00:32:50,900 --> 00:32:53,630
So, I wish I had drawn
this graph.

605
00:32:53,630 --> 00:32:55,500
>> NARRATOR:
The graph uses a formula

606
00:32:55,500 --> 00:32:59,400
to show human development and
growth since 2000 BCE.

607
00:32:59,400 --> 00:33:01,570
>> The state of human
civilization

608
00:33:01,570 --> 00:33:04,970
is not very advanced, and it's
not getting better

609
00:33:04,970 --> 00:33:07,130
very quickly at all,
and this is true for thousands

610
00:33:07,130 --> 00:33:08,970
and thousands of years.

611
00:33:08,970 --> 00:33:12,470
When we, when we formed empires
and empires got overturned,

612
00:33:12,470 --> 00:33:16,530
when we tried democracy,
when we invented zero

613
00:33:16,530 --> 00:33:19,630
and mathematics and fundamental
discoveries about the universe,

614
00:33:19,630 --> 00:33:21,400
big deal.

615
00:33:21,400 --> 00:33:23,300
It just, the numbers don't
change very much.

616
00:33:23,300 --> 00:33:26,900
What's weird is that the numbers
change essentially in the blink

617
00:33:26,900 --> 00:33:28,370
of an eye at one point in time.

618
00:33:28,370 --> 00:33:32,030
And it goes from really
horizontal, unchanging,

619
00:33:32,030 --> 00:33:36,600
uninteresting, to, holy Toledo,
crazy vertical.

620
00:33:36,600 --> 00:33:39,200
And then the question is,
what on Earth happened

621
00:33:39,200 --> 00:33:40,570
to cause that change?

622
00:33:40,570 --> 00:33:42,770
And the answer
is the Industrial Revolution.

623
00:33:42,770 --> 00:33:44,800
There were other things that
happened,

624
00:33:44,800 --> 00:33:46,830
but really what fundamentally
happened is

625
00:33:46,830 --> 00:33:49,530
we overcame the limitations
of our muscle power.

626
00:33:49,530 --> 00:33:52,400
Something equally interesting is
happening right now.

627
00:33:52,400 --> 00:33:55,330
We are overcoming the
limitations of our minds.

628
00:33:55,330 --> 00:33:56,930
We're not getting rid of them,

629
00:33:56,930 --> 00:33:58,970
we're not making them
unnecessary,

630
00:33:58,970 --> 00:34:02,500
but, holy cow, can we leverage
them and amplify them now.

631
00:34:02,500 --> 00:34:04,170
You have to be a huge pessimist

632
00:34:04,170 --> 00:34:06,730
not to find that profoundly
good news.

633
00:34:06,730 --> 00:34:09,370
>> I really do think the world
has entered a new era.

634
00:34:09,370 --> 00:34:12,830
Artificial intelligence holds so
much promise,

635
00:34:12,830 --> 00:34:15,730
but it's going to reshape every
aspect of the economy,

636
00:34:15,730 --> 00:34:17,370
so many aspects of our lives.

637
00:34:17,370 --> 00:34:20,770
Because A.I. is a little bit
like electricity.

638
00:34:20,770 --> 00:34:22,670
Everybody's going to use it.

639
00:34:22,670 --> 00:34:26,400
Every company is going to be
incorporating A.I.,

640
00:34:26,400 --> 00:34:28,300
integrating it into
what they do,

641
00:34:28,300 --> 00:34:29,630
governments are going to be
using it,

642
00:34:29,630 --> 00:34:33,600
nonprofit organizations are
going to be using it.

643
00:34:33,600 --> 00:34:37,200
It's going to create all kinds
of benefits

644
00:34:37,200 --> 00:34:41,070
in ways large and small,
and challenges for us, as well.

645
00:34:41,070 --> 00:34:44,730
>> NARRATOR: The challenges,
the benefits--

646
00:34:44,730 --> 00:34:47,000
the autonomous truck
represents both

647
00:34:47,000 --> 00:34:50,070
as it maneuvers
into the marketplace.

648
00:34:50,070 --> 00:34:53,070
The engineers are confident
that, in spite of questions

649
00:34:53,070 --> 00:34:55,370
about when this will happen,

650
00:34:55,370 --> 00:34:57,330
they can get it working safely
sooner

651
00:34:57,330 --> 00:34:58,770
than most people realize.

652
00:34:58,770 --> 00:35:02,130
>> I think that you will see the
first vehicles operating

653
00:35:02,130 --> 00:35:05,570
with no one inside them moving
freight in the next few years,

654
00:35:05,570 --> 00:35:07,700
and then you're going to see
that expanding to more freight,

655
00:35:07,700 --> 00:35:11,030
more geographies,
more weather over time as,

656
00:35:11,030 --> 00:35:12,530
as that capability builds up.

657
00:35:12,530 --> 00:35:16,600
We're talking, like,
less than half a decade.

658
00:35:16,600 --> 00:35:19,370
>> NARRATOR: He already has a
Fortune 500 company

659
00:35:19,370 --> 00:35:23,830
as a client, shipping appliances
across the Southwest.

660
00:35:23,830 --> 00:35:27,330
He says the sales pitch
is straightforward.

661
00:35:27,330 --> 00:35:30,070
>> They spend hundreds of
millions of dollars a year

662
00:35:30,070 --> 00:35:31,670
shipping parts around
the country.

663
00:35:31,670 --> 00:35:34,100
We can bring that cost in half.

664
00:35:34,100 --> 00:35:36,930
And they're really excited to be
able to start working with us,

665
00:35:36,930 --> 00:35:39,800
both because of the potential,

666
00:35:39,800 --> 00:35:42,100
the potential savings from
deploying self-driving,

667
00:35:42,100 --> 00:35:44,470
and also because of all the
operational efficiencies

668
00:35:44,470 --> 00:35:47,830
that they see, the biggest one
being able to operate

669
00:35:47,830 --> 00:35:49,800
24 hours a day.

670
00:35:49,800 --> 00:35:51,970
So, right now, human drivers are
limited to 11 hours

671
00:35:51,970 --> 00:35:55,470
by federal law,
and a driverless truck

672
00:35:55,470 --> 00:35:57,000
obviously wouldn't have
that limitation.

673
00:35:57,000 --> 00:36:02,530
♪ ♪

674
00:36:02,530 --> 00:36:05,330
>> NARRATOR: The idea of a
driverless truck comes up often

675
00:36:05,330 --> 00:36:11,430
in discussions about artificial
intelligence.

676
00:36:11,430 --> 00:36:14,800
Steve Viscelli is a sociologist
who drove a truck

677
00:36:14,800 --> 00:36:20,330
while researching his book "The
Big Rig" about the industry.

678
00:36:20,330 --> 00:36:23,000
>> This is one of the most
remarkable stories

679
00:36:23,000 --> 00:36:25,830
in, in U.S. labor history,
I think,

680
00:36:25,830 --> 00:36:30,400
is, you know, the decline of,
of unionized trucking.

681
00:36:30,400 --> 00:36:33,600
The industry was deregulated
in 1980,

682
00:36:33,600 --> 00:36:37,400
and at that time, you know,
truck drivers were earning

683
00:36:37,400 --> 00:36:41,400
the equivalent of over
$100,000 in today's dollars.

684
00:36:41,400 --> 00:36:45,500
And today the typical truck
driver will earn

685
00:36:45,500 --> 00:36:50,370
a little over $40,000 a year.

686
00:36:50,370 --> 00:36:52,630
And I think it's
an important part

687
00:36:52,630 --> 00:36:54,230
of the automation story, right?

688
00:36:54,230 --> 00:36:56,900
Why are they so afraid of
automation?

689
00:36:56,900 --> 00:37:00,670
Because we've had four decades
of rising inequality in wages.

690
00:37:00,670 --> 00:37:03,330
And if anybody is going to take
it on the chin

691
00:37:03,330 --> 00:37:05,330
from automation
in the trucking industry,

692
00:37:05,330 --> 00:37:07,630
the, the first in line is going
to be the driver,

693
00:37:07,630 --> 00:37:12,300
without a doubt.

694
00:37:12,300 --> 00:37:14,730
>> NARRATOR: For his research,
Viscelli tracked down truckers

695
00:37:14,730 --> 00:37:17,600
and their families,
like Shawn and Hope Cumbee

696
00:37:17,600 --> 00:37:19,530
of Beaverton, Michigan.
>> Hi.

697
00:37:19,530 --> 00:37:20,870
>> Hey, Hope,
I'm Steve Viscelli.

698
00:37:20,870 --> 00:37:21,870
>> Hi, Steve, nice to meet you.
Come on in.

699
00:37:21,870 --> 00:37:24,800
>> Great to meet you, too,
thanks.

700
00:37:24,800 --> 00:37:26,430
>> NARRATOR: And their son
Charlie.

701
00:37:26,430 --> 00:37:31,730
>> This is Daddy, me,
Daddy, and Mommy.

702
00:37:31,730 --> 00:37:34,230
>> NARRATOR: But Daddy's not
here.

703
00:37:34,230 --> 00:37:38,900
Shawn Cumbee's truck has broken
down in Tennessee.

704
00:37:38,900 --> 00:37:43,470
Hope, who drove a truck herself,
knows the business well.

705
00:37:43,470 --> 00:37:46,870
>> We made $150,000, right,
in a year.

706
00:37:46,870 --> 00:37:48,070
That sounds great, right?

707
00:37:48,070 --> 00:37:50,400
That's, like, good money.

708
00:37:50,400 --> 00:37:53,870
We paid $100,000 in fuel, okay?

709
00:37:53,870 --> 00:37:57,030
So, right there,
now I made $50,000.

710
00:37:57,030 --> 00:37:59,030
But I didn't really, because,
you know,

711
00:37:59,030 --> 00:38:00,600
you get an oil change every
month,

712
00:38:00,600 --> 00:38:02,200
so that's $300 a month.

713
00:38:02,200 --> 00:38:04,170
You still have to do
all the maintenance.

714
00:38:04,170 --> 00:38:06,500
We had a motor blow out, right?

715
00:38:06,500 --> 00:38:09,170
$13,000. Right?

716
00:38:09,170 --> 00:38:11,800
I know, I mean, I choke up a
little just thinking about it,

717
00:38:11,800 --> 00:38:13,770
because it was...

718
00:38:13,770 --> 00:38:17,470
And it was 13,000, and we were
off work for two weeks.

719
00:38:17,470 --> 00:38:19,670
So, by the end of the year,
with that $150,000,

720
00:38:19,670 --> 00:38:22,670
by the end of the year,
we'd made about 20...

721
00:38:22,670 --> 00:38:26,030
About $22,000.

722
00:38:26,030 --> 00:38:28,400
>> NARRATOR: In a truck stop
in Tennessee,

723
00:38:28,400 --> 00:38:31,500
Shawn has been sidelined
waiting for a new part.

724
00:38:31,500 --> 00:38:35,300
The garage owner is letting him
stay in the truck to save money.

725
00:38:37,870 --> 00:38:39,770
>> Hi, baby.

726
00:38:39,770 --> 00:38:41,330
>> (on phone): Hey, how's it
going?

727
00:38:41,330 --> 00:38:42,730
>> It's going.
Chunky-butt!

728
00:38:42,730 --> 00:38:44,600
>> Hi, Daddy!
>> Hi, Chunky-butt.

729
00:38:44,600 --> 00:38:47,300
What're you doing?
>> (talking inaudibly)

730
00:38:47,300 --> 00:38:49,600
>> Believe it or not,
I do it because I love it.

731
00:38:49,600 --> 00:38:51,330
I mean, you know,
it's in the blood.

732
00:38:51,330 --> 00:38:52,900
Third-generation driver.

733
00:38:52,900 --> 00:38:55,230
And my granddaddy told me a long
time ago,

734
00:38:55,230 --> 00:38:58,630
when I was probably
11, 12 years old, probably,

735
00:38:58,630 --> 00:39:01,500
he said, "The world meets nobody
halfway.

736
00:39:01,500 --> 00:39:02,930
Nobody."

737
00:39:02,930 --> 00:39:07,030
He said, "If you want it,
you have to earn it."

738
00:39:07,030 --> 00:39:09,870
And that's what I do every day.

739
00:39:09,870 --> 00:39:11,330
I live by that creed.

740
00:39:11,330 --> 00:39:16,100
And I've lived by that
since it was told to me.

741
00:39:16,100 --> 00:39:18,300
>> So, if you're down for a week
in a truck,

742
00:39:18,300 --> 00:39:19,870
you still have to pay your
bills.

743
00:39:19,870 --> 00:39:22,100
I have enough money in my
checking account at all times

744
00:39:22,100 --> 00:39:23,470
to pay a month's worth of bills.

745
00:39:23,470 --> 00:39:25,070
That does not include my food.

746
00:39:25,070 --> 00:39:27,630
That doesn't include field trips
for my son's school.

747
00:39:27,630 --> 00:39:31,700
My son and I just went to our
yearly doctor appointment.

748
00:39:31,700 --> 00:39:36,270
I took, I took money out of my
son's piggy bank to pay for it,

749
00:39:36,270 --> 00:39:40,600
because it's not...
it's not scheduled in.

750
00:39:40,600 --> 00:39:43,430
It's, it's not something that
you can, you know, afford.

751
00:39:43,430 --> 00:39:45,500
I mean, like, when...

752
00:39:45,500 --> 00:39:46,900
(sighs): Sorry.

753
00:39:46,900 --> 00:39:48,970
>> It's okay.

754
00:39:48,970 --> 00:39:52,600
♪ ♪

755
00:39:57,230 --> 00:39:59,170
Have you guys ever talked about
self-driving trucks?

756
00:39:59,170 --> 00:40:00,500
Is he...

757
00:40:00,500 --> 00:40:03,130
>> (laughing): So, kind of.

758
00:40:03,130 --> 00:40:05,830
Um, I asked him once, you know.

759
00:40:05,830 --> 00:40:07,230
And he laughed so hard.

760
00:40:07,230 --> 00:40:10,330
He said, "No way will they
ever have a truck

761
00:40:10,330 --> 00:40:12,970
that can drive itself."

762
00:40:12,970 --> 00:40:15,230
>> It's kind of interesting when
you think about it, you know,

763
00:40:15,230 --> 00:40:17,730
they're putting all this new
technology into things,

764
00:40:17,730 --> 00:40:19,570
but, you know,
it's still man-made.

765
00:40:19,570 --> 00:40:22,970
And man, you know,
does make mistakes.

766
00:40:22,970 --> 00:40:26,170
I really don't see it being
a problem with the industry,

767
00:40:26,170 --> 00:40:28,770
'cause, one, you still got to
have a driver in it,

768
00:40:28,770 --> 00:40:30,330
because I don't see it
doing city.

769
00:40:30,330 --> 00:40:32,600
I don't see it doing,
you know, main things.

770
00:40:32,600 --> 00:40:34,700
I don't see it backing into
a dock.

771
00:40:34,700 --> 00:40:37,870
I don't see the automation part,
you know, doing...

772
00:40:37,870 --> 00:40:39,900
maybe the box-trailer side,
you know, I can see that,

773
00:40:39,900 --> 00:40:41,400
but not stuff like I do.

774
00:40:41,400 --> 00:40:44,830
So, I ain't really worried about
the automation of trucks.

775
00:40:44,830 --> 00:40:46,230
>> How near of a future is it?

776
00:40:46,230 --> 00:40:49,300
>> Yeah, self-driving, um...

777
00:40:49,300 --> 00:40:52,600
So, some, you know, some
companies are already operating.

778
00:40:52,600 --> 00:40:56,170
Embark, for instance, is one
that has been doing

779
00:40:56,170 --> 00:40:59,030
driverless trucks
on the interstate.

780
00:40:59,030 --> 00:41:01,930
And what's called exit-to-exit
self-driving.

781
00:41:01,930 --> 00:41:04,830
And they're currently running
real freight.

782
00:41:04,830 --> 00:41:07,530
>> Really?
>> Yeah, on I-10.

783
00:41:07,530 --> 00:41:10,530
♪ ♪

784
00:41:10,530 --> 00:41:15,170
>> (on P.A.): Shower guest 100,
your shower is now ready.

785
00:41:15,170 --> 00:41:18,430
>> NARRATOR: Over time, it has
become harder and harder

786
00:41:18,430 --> 00:41:21,230
for veteran independent drivers
like the Cumbees

787
00:41:21,230 --> 00:41:23,070
to make a living.

788
00:41:23,070 --> 00:41:25,070
They've been replaced by
younger,

789
00:41:25,070 --> 00:41:28,200
less experienced drivers.

790
00:41:28,200 --> 00:41:32,630
>> So, the, the trucking
industry's $740 billion a year,

791
00:41:32,630 --> 00:41:34,770
and, again, in, in many
of these operations,

792
00:41:34,770 --> 00:41:37,470
labor's a third of that cost.

793
00:41:37,470 --> 00:41:40,500
By my estimate, I, you know,
I think we're in the range

794
00:41:40,500 --> 00:41:42,970
of 300,000 or so jobs
in the foreseeable future

795
00:41:42,970 --> 00:41:47,930
that could be automated to some
significant extent.

796
00:41:47,930 --> 00:41:50,630
♪ ♪

797
00:41:50,630 --> 00:41:53,530
>> (groans)

798
00:41:53,530 --> 00:41:57,070
♪ ♪

799
00:42:03,000 --> 00:42:06,130
>> NARRATOR: The A.I. future
was built with great optimism

800
00:42:06,130 --> 00:42:09,100
out here in the West.

801
00:42:09,100 --> 00:42:12,630
In 2018, many of the people
who invented it

802
00:42:12,630 --> 00:42:16,170
gathered in San Francisco to
celebrate the 25th anniversary

803
00:42:16,170 --> 00:42:18,700
of the industry magazine.

804
00:42:18,700 --> 00:42:22,300
>> Howdy, welcome to WIRED25.

805
00:42:22,300 --> 00:42:24,200
>> NARRATOR: It is a
celebration, for sure,

806
00:42:24,200 --> 00:42:27,070
but there's also a growing sense
of caution

807
00:42:27,070 --> 00:42:28,670
and even skepticism.

808
00:42:31,130 --> 00:42:33,330
>> We're having a really good
weekend here.

809
00:42:33,330 --> 00:42:37,030
>> NARRATOR: Nick Thompson is
editor-in-chief of "Wired."

810
00:42:37,030 --> 00:42:40,030
>> When it started,
it was very much a magazine

811
00:42:40,030 --> 00:42:44,100
about what's coming and why you
should be excited about it.

812
00:42:44,100 --> 00:42:47,730
Optimism was the defining
feature of "Wired"

813
00:42:47,730 --> 00:42:49,400
for many, many years.

814
00:42:49,400 --> 00:42:53,130
Or, as our slogan used to be,
"Change Is Good."

815
00:42:53,130 --> 00:42:55,070
And over time,
it shifted a little bit.

816
00:42:55,070 --> 00:42:59,170
And now it's more,
"We love technology,

817
00:42:59,170 --> 00:43:00,630
but let's look at some
of the big issues,

818
00:43:00,630 --> 00:43:03,400
and let's look at some of them
critically,

819
00:43:03,400 --> 00:43:05,730
and let's look at the way
algorithms are changing

820
00:43:05,730 --> 00:43:07,930
the way we behave,
for good and for ill."

821
00:43:07,930 --> 00:43:12,030
So, the whole nature of "Wired"
has gone from a champion

822
00:43:12,030 --> 00:43:14,830
of technological change to more
of a observer

823
00:43:14,830 --> 00:43:16,700
of technological change.

824
00:43:16,700 --> 00:43:18,570
>> So, um, before we start...

825
00:43:18,570 --> 00:43:20,530
>> NARRATOR: There
are 25 speakers,

826
00:43:20,530 --> 00:43:23,700
all named as icons
of the last 25 years

827
00:43:23,700 --> 00:43:25,500
of technological progress.

828
00:43:25,500 --> 00:43:27,770
>> So, why is Apple so
secretive?

829
00:43:27,770 --> 00:43:29,470
>> (chuckling)

830
00:43:29,470 --> 00:43:31,630
>> NARRATOR: Jony Ive, who
designed Apple's iPhone.

831
00:43:31,630 --> 00:43:34,300
>> It would be bizarre
not to be.

832
00:43:34,300 --> 00:43:36,670
>> There's this question of,
like,

833
00:43:36,670 --> 00:43:39,000
what are we doing here in this
life, in this reality?

834
00:43:39,000 --> 00:43:43,170
>> NARRATOR: Jaron Lanier, who
pioneered virtual reality.

835
00:43:43,170 --> 00:43:46,500
And Jeff Bezos,
the founder of Amazon.

836
00:43:46,500 --> 00:43:47,870
>> Amazon was a garage startup.

837
00:43:47,870 --> 00:43:49,370
Now it's a very large company.

838
00:43:49,370 --> 00:43:50,570
Two kids in a dorm...

839
00:43:50,570 --> 00:43:52,070
>> NARRATOR: His message is,

840
00:43:52,070 --> 00:43:54,730
"All will be well
in the new world."

841
00:43:54,730 --> 00:43:58,470
>> I guess, first of all, I
remain incredibly optimistic

842
00:43:58,470 --> 00:43:59,630
about technology,

843
00:43:59,630 --> 00:44:01,830
and technologies always
are two-sided.

844
00:44:01,830 --> 00:44:03,230
But that's not new.

845
00:44:03,230 --> 00:44:05,400
That's always been the case.

846
00:44:05,400 --> 00:44:07,830
And, and we will figure it out.

847
00:44:07,830 --> 00:44:10,570
The last thing we would ever
want to do is stop the progress

848
00:44:10,570 --> 00:44:16,630
of new technologies,
even when they are dual-use.

849
00:44:16,630 --> 00:44:19,800
>> NARRATOR: But, says Thompson,
beneath the surface,

850
00:44:19,800 --> 00:44:22,530
there's a worry most of them
don't like to talk about.

851
00:44:22,530 --> 00:44:26,630
>> There are some people in
Silicon Valley who believe that,

852
00:44:26,630 --> 00:44:29,900
"You just have to trust
the technology.

853
00:44:29,900 --> 00:44:32,870
Throughout history, there's been
a complicated relationship

854
00:44:32,870 --> 00:44:34,470
between humans and machines,

855
00:44:34,470 --> 00:44:36,770
we've always worried about
machines,

856
00:44:36,770 --> 00:44:38,130
and it's always been fine.

857
00:44:38,130 --> 00:44:41,000
And we don't know how A.I. will
change the labor force,

858
00:44:41,000 --> 00:44:42,300
but it will be okay."

859
00:44:42,300 --> 00:44:44,070
So, that argument exists.

860
00:44:44,070 --> 00:44:45,700
There's another argument,

861
00:44:45,700 --> 00:44:48,170
which is what I think most of
them believe deep down,

862
00:44:48,170 --> 00:44:51,100
which is, "This is different.

863
00:44:51,100 --> 00:44:52,930
We're going to have labor-force
disruption

864
00:44:52,930 --> 00:44:55,030
like we've never seen before.

865
00:44:55,030 --> 00:44:59,370
And if that happens,
will they blame us?"

866
00:44:59,370 --> 00:45:02,600
>> NARRATOR: There is, however,
one of the WIRED25 icons

867
00:45:02,600 --> 00:45:05,800
willing to take on the issue.

868
00:45:05,800 --> 00:45:09,470
Onstage, Kai-Fu Lee dispenses
with one common fear.

869
00:45:09,470 --> 00:45:11,670
>> Well, I think there are so
many myths out there.

870
00:45:11,670 --> 00:45:14,530
I think one, one myth is that

871
00:45:14,530 --> 00:45:17,570
because A.I. is so good at a
single task,

872
00:45:17,570 --> 00:45:21,600
that one day we'll wake up, and
we'll all be enslaved

873
00:45:21,600 --> 00:45:24,100
or forced to plug our brains
to the A.I.

874
00:45:24,100 --> 00:45:28,800
But it is nowhere close
to displacing humans.

875
00:45:28,800 --> 00:45:32,130
>> NARRATOR: But in interviews
around the event and beyond,

876
00:45:32,130 --> 00:45:37,430
he takes a decidedly contrarian
position on A.I. and job loss.

877
00:45:37,430 --> 00:45:41,270
>> The A.I. giants want to paint
the rosier picture

878
00:45:41,270 --> 00:45:43,500
because they're happily
making money.

879
00:45:43,500 --> 00:45:47,330
So, I think they prefer not to
talk about the negative side.

880
00:45:47,330 --> 00:45:53,070
I believe about 50% of jobs
will be

881
00:45:53,070 --> 00:45:56,900
somewhat or extremely
threatened by A.I.

882
00:45:56,900 --> 00:46:00,500
in the next 15 years or so.

883
00:46:00,500 --> 00:46:02,570
>> NARRATOR: Kai-Fu Lee also
makes a great deal

884
00:46:02,570 --> 00:46:04,900
of money from A.I.

885
00:46:04,900 --> 00:46:06,800
What separates him from most of
his colleagues

886
00:46:06,800 --> 00:46:09,930
is that he's frank
about its downside.

887
00:46:09,930 --> 00:46:13,900
>> Yes, yes, we, we've made
about 40 investments in A.I.

888
00:46:13,900 --> 00:46:16,930
I think, based on these 40
investments,

889
00:46:16,930 --> 00:46:20,000
most of them are not impacting
human jobs.

890
00:46:20,000 --> 00:46:21,970
They're creating value,
making high margins,

891
00:46:21,970 --> 00:46:24,300
inventing a new model.

892
00:46:24,300 --> 00:46:27,730
But I could list seven or eight

893
00:46:27,730 --> 00:46:32,670
that would lead to a very clear
displacement of human jobs.

894
00:46:32,670 --> 00:46:34,370
>> NARRATOR: He says that A.I.
is coming,

895
00:46:34,370 --> 00:46:36,470
whether we like it or not.

896
00:46:36,470 --> 00:46:38,300
And he wants to warn society

897
00:46:38,300 --> 00:46:41,030
about what he sees as
inevitable.

898
00:46:41,030 --> 00:46:43,600
>> You have a view which I think
is different than many others,

899
00:46:43,600 --> 00:46:48,670
which is that A.I. is not going
to take blue-collar jobs

900
00:46:48,670 --> 00:46:51,230
so quickly, but is actually
going to take white-collar jobs.

901
00:46:51,230 --> 00:46:53,770
>> Yeah.
Well, both will happen.

902
00:46:53,770 --> 00:46:57,000
A.I. will be, at the same time,
a replacement for blue-collar,

903
00:46:57,000 --> 00:47:00,630
white-collar jobs, and be
a great symbiotic tool

904
00:47:00,630 --> 00:47:03,630
for doctors, lawyers, and you,
for example.

905
00:47:03,630 --> 00:47:05,700
But the white-collar jobs are
easier to take,

906
00:47:05,700 --> 00:47:10,030
because they're a pure
quantitative analytical process.

907
00:47:10,030 --> 00:47:15,370
Let's say reporters, traders,
telemarketing,

908
00:47:15,370 --> 00:47:17,270
telesales, customer service...

909
00:47:17,270 --> 00:47:18,730
>> Analysts?

910
00:47:18,730 --> 00:47:23,170
>> Analysts, yes, these can all
be replaced just by a software.

911
00:47:23,170 --> 00:47:26,330
To do blue-collar, some of the
work requires, you know,

912
00:47:26,330 --> 00:47:30,030
hand-eye coordination, things
that machines are not yet

913
00:47:30,030 --> 00:47:32,300
good enough to do.

914
00:47:32,300 --> 00:47:36,400
>> Today, there are many people
who are ringing the alarm,

915
00:47:36,400 --> 00:47:37,600
"Oh, my God, what are we going
to do?

916
00:47:37,600 --> 00:47:39,830
Half the jobs are going away."

917
00:47:39,830 --> 00:47:43,430
I believe that's true, but
here's the missing fact.

918
00:47:43,430 --> 00:47:46,400
I've done the research on this,
and if you go back 20, 30,

919
00:47:46,400 --> 00:47:50,930
or 40 years ago, you will find
that 50% of the jobs

920
00:47:50,930 --> 00:47:54,400
that people performed back then
are gone today.

921
00:47:54,400 --> 00:47:56,900
You know, where are all the
telephone operators,

922
00:47:56,900 --> 00:48:00,600
bowling-pin setters,
elevator operators?

923
00:48:00,600 --> 00:48:04,270
You used to have seas of
secretaries in corporations

924
00:48:04,270 --> 00:48:06,070
that have now been eliminated--
travel agents.

925
00:48:06,070 --> 00:48:08,770
You can just go through field
after field after field.

926
00:48:08,770 --> 00:48:12,100
That same pattern has recurred
many times throughout history,

927
00:48:12,100 --> 00:48:14,230
with each new wave
of automation.

928
00:48:14,230 --> 00:48:20,270
>> But I would argue that
history is only trustable

929
00:48:20,270 --> 00:48:24,670
if it is multiple repetitions
of similar events,

930
00:48:24,670 --> 00:48:28,670
not once-in-a-blue-moon
occurrence.

931
00:48:28,670 --> 00:48:33,070
So, over the history of many
tech inventions,

932
00:48:33,070 --> 00:48:34,770
most are small things.

933
00:48:34,770 --> 00:48:41,330
Only maybe three are at the
magnitude of A.I. revolution--

934
00:48:41,330 --> 00:48:44,730
the steam, steam engine,
electricity,

935
00:48:44,730 --> 00:48:46,570
and the computer revolution.

936
00:48:46,570 --> 00:48:48,970
I'd say everything else
is too small.

937
00:48:48,970 --> 00:48:52,670
And the reason I think it might
be something brand-new

938
00:48:52,670 --> 00:48:58,930
is that A.I. is fundamentally
replacing our cognitive process

939
00:48:58,930 --> 00:49:03,670
in doing a job in its
significant entirety,

940
00:49:03,670 --> 00:49:06,400
and it can do it dramatically
better.

941
00:49:06,400 --> 00:49:08,570
>> NARRATOR: This argument
about job loss

942
00:49:08,570 --> 00:49:11,470
in the age of A.I. was ignited
six years ago

943
00:49:11,470 --> 00:49:15,830
amid the gargoyles and spires
of Oxford University.

944
00:49:15,830 --> 00:49:19,970
Two researchers had been poring
through U.S. labor statistics,

945
00:49:19,970 --> 00:49:25,270
identifying jobs that could be
vulnerable to A.I. automation.

946
00:49:25,270 --> 00:49:27,300
>> Well, vulnerable to
automation,

947
00:49:27,300 --> 00:49:30,730
in the context that we discussed
five years ago now,

948
00:49:30,730 --> 00:49:34,430
essentially meant that those
jobs are potentially automatable

949
00:49:34,430 --> 00:49:36,900
over an unspecified number of
years.

950
00:49:36,900 --> 00:49:41,530
And the figure we came up with
was 47%.

951
00:49:41,530 --> 00:49:43,330
>> NARRATOR: 47%.

952
00:49:43,330 --> 00:49:46,470
That number quickly traveled
the world in headlines

953
00:49:46,470 --> 00:49:47,830
and news bulletins.

954
00:49:47,830 --> 00:49:51,030
But authors Carl Frey
and Michael Osborne

955
00:49:51,030 --> 00:49:52,770
offered a caution.

956
00:49:52,770 --> 00:49:57,670
They can't predict how many jobs
will be lost, or how quickly.

957
00:49:57,670 --> 00:50:02,430
But Frey believes that there are
lessons in history.

958
00:50:02,430 --> 00:50:04,830
>> And what worries me the most
is that there is actually

959
00:50:04,830 --> 00:50:08,830
one episode that looks quite
familiar to today,

960
00:50:08,830 --> 00:50:12,270
which is the British
Industrial Revolution,

961
00:50:12,270 --> 00:50:16,400
where wages didn't grow
for nine decades,

962
00:50:16,400 --> 00:50:20,530
and a lot of people actually
saw living standards decline

963
00:50:20,530 --> 00:50:23,870
as technology progressed.

964
00:50:23,870 --> 00:50:25,630
♪ ♪

965
00:50:25,630 --> 00:50:28,370
>> NARRATOR: Saginaw, Michigan,
knows about decline

966
00:50:28,370 --> 00:50:31,170
in living standards.

967
00:50:31,170 --> 00:50:34,900
Harry Cripps, an auto worker
and a local union president,

968
00:50:34,900 --> 00:50:40,730
has witnessed what 40 years of
automation can do to a town.

969
00:50:40,730 --> 00:50:43,470
>> You know, we're one of the
cities in the country that,

970
00:50:43,470 --> 00:50:47,170
I think we were left behind in
this recovery.

971
00:50:47,170 --> 00:50:51,670
And I just... I don't know how
we get on the bandwagon now.

972
00:50:54,770 --> 00:50:57,030
>> NARRATOR: Once, this was the
U.A.W. hall

973
00:50:57,030 --> 00:50:59,230
for one local union.

974
00:50:59,230 --> 00:51:03,670
Now, with falling membership,
it's shared by five locals.

975
00:51:03,670 --> 00:51:05,730
>> Rudy didn't get his shift.

976
00:51:05,730 --> 00:51:07,330
>> NARRATOR: This day,
it's the center

977
00:51:07,330 --> 00:51:09,570
for a Christmas food drive.

978
00:51:09,570 --> 00:51:12,030
Even in a growth economy,

979
00:51:12,030 --> 00:51:14,830
unemployment here is near
six percent.

980
00:51:14,830 --> 00:51:18,930
Poverty in Saginaw is over 30%.

981
00:51:21,830 --> 00:51:25,130
>> Our factory has about
1.9 million square feet.

982
00:51:25,130 --> 00:51:29,100
Back in the '70s, that 1.9
million square feet

983
00:51:29,100 --> 00:51:32,330
had about 7,500 U.A.W.
automotive workers

984
00:51:32,330 --> 00:51:34,300
making middle-class wage with
decent benefits

985
00:51:34,300 --> 00:51:36,770
and able to send their kids to
college and do all the things

986
00:51:36,770 --> 00:51:39,000
that the middle-class family
should be able to do.

987
00:51:39,000 --> 00:51:42,270
Our factory today, with
automation,

988
00:51:42,270 --> 00:51:46,300
would probably be about
700 United Auto Workers.

989
00:51:46,300 --> 00:51:50,130
That's a dramatic change.

990
00:51:50,130 --> 00:51:52,230
Lot of union brothers used
to work there, buddy.

991
00:51:52,230 --> 00:51:55,130
>> The TRW plant, that was
unfortunate.

992
00:51:55,130 --> 00:51:57,830
>> Delphi... looks like they're
starting to tear it down now.

993
00:51:57,830 --> 00:51:59,300
Wow.

994
00:51:59,300 --> 00:52:02,770
Automation is, is definitely
taking away a lot of jobs.

995
00:52:02,770 --> 00:52:05,530
Robots, I don't know how they
buy cars,

996
00:52:05,530 --> 00:52:07,300
I don't know how
they buy sandwiches,

997
00:52:07,300 --> 00:52:09,100
I don't know how they go to the
grocery store.

998
00:52:09,100 --> 00:52:11,430
They definitely don't pay taxes,
which serves the infrastructure.

999
00:52:11,430 --> 00:52:15,300
So, you don't have the sheriffs
and the police and the firemen,

1000
00:52:15,300 --> 00:52:18,830
and anybody else that supports
the city is gone,

1001
00:52:18,830 --> 00:52:19,900
'cause there's no tax base.

1002
00:52:19,900 --> 00:52:23,770
Robots don't pay taxes.

1003
00:52:23,770 --> 00:52:25,900
>> NARRATOR: The average
personal income in Saginaw

1004
00:52:25,900 --> 00:52:29,570
is $16,000 a year.

1005
00:52:29,570 --> 00:52:32,600
>> A lot of the families that I
work with here in the community,

1006
00:52:32,600 --> 00:52:33,830
both parents are working.

1007
00:52:33,830 --> 00:52:35,470
They're working two jobs.

1008
00:52:35,470 --> 00:52:38,370
Mainly, it's the wages,
you know,

1009
00:52:38,370 --> 00:52:43,270
people not making a decent wage
to be able to support a family.

1010
00:52:43,270 --> 00:52:46,930
Like, back in the day, my dad
even worked at the plant.

1011
00:52:46,930 --> 00:52:49,300
My mom stayed home,
raised the children.

1012
00:52:49,300 --> 00:52:52,000
And that give us the opportunity
to put food on the table,

1013
00:52:52,000 --> 00:52:53,370
and things of that nature.

1014
00:52:53,370 --> 00:52:56,000
And, and them times are gone.

1015
00:52:56,000 --> 00:52:57,930
>> If you look at this graph of
what's been happening

1016
00:52:57,930 --> 00:52:59,670
to America since the end
of World War II,

1017
00:52:59,670 --> 00:53:03,000
you see a line for our
productivity,

1018
00:53:03,000 --> 00:53:05,730
and our productivity
gets better over time.

1019
00:53:05,730 --> 00:53:08,830
It used to be the case
that our pay, our income,

1020
00:53:08,830 --> 00:53:12,700
would increase in lockstep with
those productivity increases.

1021
00:53:12,700 --> 00:53:17,570
The weird part about this graph
is how the income has decoupled,

1022
00:53:17,570 --> 00:53:21,900
is not going up the same way
that productivity is anymore.

1023
00:53:21,900 --> 00:53:24,170
>> NARRATOR: As automation has
taken over,

1024
00:53:24,170 --> 00:53:27,770
workers are either laid off or
left with less-skilled jobs

1025
00:53:27,770 --> 00:53:31,400
for less pay,
while productivity goes up.

1026
00:53:31,400 --> 00:53:33,100
>> There are still plenty
of factories in America.

1027
00:53:33,100 --> 00:53:35,430
We are a manufacturing
powerhouse,

1028
00:53:35,430 --> 00:53:37,670
but if you go walk around
an American factory,

1029
00:53:37,670 --> 00:53:40,070
you do not see long lines
of people

1030
00:53:40,070 --> 00:53:42,470
doing repetitive manual labor.

1031
00:53:42,470 --> 00:53:44,600
You see a whole lot
of automation.

1032
00:53:44,600 --> 00:53:46,230
If you go upstairs in that
factory

1033
00:53:46,230 --> 00:53:47,830
and look at the payroll
department,

1034
00:53:47,830 --> 00:53:51,130
you see one or two people
looking into a screen all day.

1035
00:53:51,130 --> 00:53:53,800
So, the activity is still there,

1036
00:53:53,800 --> 00:53:56,000
but the number of jobs
is very, very low,

1037
00:53:56,000 --> 00:53:58,330
because of automation
and tech progress.

1038
00:53:58,330 --> 00:54:01,130
Now, dealing with
that challenge,

1039
00:54:01,130 --> 00:54:02,900
and figuring out what
the next generation

1040
00:54:02,900 --> 00:54:05,700
of the American middle class
should be doing,

1041
00:54:05,700 --> 00:54:07,700
is a really important challenge,

1042
00:54:07,700 --> 00:54:10,530
because I am pretty confident
that we are never again

1043
00:54:10,530 --> 00:54:13,330
going to have this large,
stable, prosperous

1044
00:54:13,330 --> 00:54:15,730
middle class doing routine work.

1045
00:54:15,730 --> 00:54:19,430
♪ ♪

1046
00:54:19,430 --> 00:54:21,970
>> NARRATOR: Evidence of how
A.I. is likely to bring

1047
00:54:21,970 --> 00:54:25,530
accelerated change to the U.S.
workforce can be found

1048
00:54:25,530 --> 00:54:27,970
not far from Saginaw.

1049
00:54:27,970 --> 00:54:29,600
This is the U.S. headquarters

1050
00:54:29,600 --> 00:54:34,070
for one of the world's largest
builders of industrial robots,

1051
00:54:34,070 --> 00:54:38,030
a Japanese-owned company called
Fanuc Robotics.

1052
00:54:38,030 --> 00:54:41,230
>> We've been producing robots
for well over 35 years.

1053
00:54:41,230 --> 00:54:42,770
And you can imagine,
over the years,

1054
00:54:42,770 --> 00:54:45,330
they've changed quite a bit.

1055
00:54:45,330 --> 00:54:48,230
We're utilizing the artificial
intelligence

1056
00:54:48,230 --> 00:54:49,800
to really make the robots
easier to use

1057
00:54:49,800 --> 00:54:54,400
and be able to handle a broader
spectrum of opportunities.

1058
00:54:54,400 --> 00:54:57,770
We see a huge growth potential
in robotics.

1059
00:54:57,770 --> 00:55:00,330
And we see that growth potential
as being, really,

1060
00:55:00,330 --> 00:55:03,230
there's 90% of the market left.

1061
00:55:03,230 --> 00:55:05,230
>> NARRATOR: The industry says
optimistically

1062
00:55:05,230 --> 00:55:09,270
that with that growth,
they can create more jobs.

1063
00:55:09,270 --> 00:55:11,630
>> Even if there were five
people on a job,

1064
00:55:11,630 --> 00:55:12,870
and we reduced that down to two
people,

1065
00:55:12,870 --> 00:55:15,800
because we automated
some level of it,

1066
00:55:15,800 --> 00:55:18,570
we might produce two times more
parts than we did before,

1067
00:55:18,570 --> 00:55:20,170
because we automated it.

1068
00:55:20,170 --> 00:55:26,430
So now, there might be the need
for two more fork-truck drivers,

1069
00:55:26,430 --> 00:55:29,900
or two more quality-inspection
personnel.

1070
00:55:29,900 --> 00:55:31,870
So, although we reduce
some of the people,

1071
00:55:31,870 --> 00:55:36,100
we grow in other areas as we
produce more things.

1072
00:55:36,100 --> 00:55:41,070
>> When I increase productivity
through automation, I lose jobs.

1073
00:55:41,070 --> 00:55:42,370
Jobs go away.

1074
00:55:42,370 --> 00:55:45,170
And I don't care what the robot
manufacturers say,

1075
00:55:45,170 --> 00:55:47,830
you aren't replacing those ten
production people

1076
00:55:47,830 --> 00:55:51,570
that that robot is now doing
that job, with ten people.

1077
00:55:51,570 --> 00:55:54,830
You can increase productivity to
a level to stay competitive

1078
00:55:54,830 --> 00:55:58,970
with the global market-- that's
what they're trying to do.

1079
00:55:58,970 --> 00:56:00,530
♪ ♪

1080
00:56:00,530 --> 00:56:02,900
>> NARRATOR:
In the popular telling,

1081
00:56:02,900 --> 00:56:06,800
blame for widespread job loss
has been aimed overseas,

1082
00:56:06,800 --> 00:56:08,900
at what's called offshoring.

1083
00:56:08,900 --> 00:56:11,200
>> We want to keep
our factories here,

1084
00:56:11,200 --> 00:56:13,100
we want to keep
our manufacturing here.

1085
00:56:13,100 --> 00:56:17,470
We don't want them moving
to China, to Mexico, to Japan,

1086
00:56:17,470 --> 00:56:21,630
to India, to Vietnam.

1087
00:56:21,630 --> 00:56:23,770
>> NARRATOR: But it turns out
most of the job loss

1088
00:56:23,770 --> 00:56:26,370
isn't because of offshoring.

1089
00:56:26,370 --> 00:56:27,700
>> There's been offshoring.

1090
00:56:27,700 --> 00:56:32,300
And I think offshoring is
responsible for maybe 20%

1091
00:56:32,300 --> 00:56:34,000
of the jobs that have been lost.

1092
00:56:34,000 --> 00:56:36,270
I would say most of the jobs
that have been lost,

1093
00:56:36,270 --> 00:56:38,830
despite what most Americans
thinks, was due to automation

1094
00:56:38,830 --> 00:56:41,830
or productivity growth.

1095
00:56:41,830 --> 00:56:43,570
>> NARRATOR:
Mike Hicks is an economist

1096
00:56:43,570 --> 00:56:46,600
at Ball State University
in Muncie, Indiana.

1097
00:56:46,600 --> 00:56:50,300
He and sociologist Emily Wornell
have been documenting

1098
00:56:50,300 --> 00:56:52,670
employment trends
in Middle America.

1099
00:56:52,670 --> 00:56:57,130
Hicks says that automation has
been a mostly silent job killer,

1100
00:56:57,130 --> 00:56:59,200
lowering the standard of living.

1101
00:56:59,200 --> 00:57:02,400
>> So, in the last 15 years, the
standard of living has dropped

1102
00:57:02,400 --> 00:57:04,600
by 15, ten to 15 percent.

1103
00:57:04,600 --> 00:57:07,100
So, that's unusual
in a developed world.

1104
00:57:07,100 --> 00:57:08,600
A one-year decline
is a recession.

1105
00:57:08,600 --> 00:57:12,470
A 15-year decline gives
an entirely different sense

1106
00:57:12,470 --> 00:57:14,830
about the prospects
of a community.

1107
00:57:14,830 --> 00:57:18,500
And so that is common
from the Canadian border

1108
00:57:18,500 --> 00:57:20,970
to the Gulf of Mexico

1109
00:57:20,970 --> 00:57:23,300
in the middle swath
of the United States.

1110
00:57:23,300 --> 00:57:26,130
>> This is something we're gonna
do for you guys.

1111
00:57:26,130 --> 00:57:30,730
These were left over from our
suggestion drive that we did,

1112
00:57:30,730 --> 00:57:32,200
and we're going to give them
each two.

1113
00:57:32,200 --> 00:57:33,300
>> That is awesome.
>> I mean,

1114
00:57:33,300 --> 00:57:35,070
that is going to go a long ways,
right?

1115
00:57:35,070 --> 00:57:37,070
I mean, that'll really help that
family out during the holidays.

1116
00:57:37,070 --> 00:57:39,800
>> Yes, well, with the kids home
from school,

1117
00:57:39,800 --> 00:57:41,430
the families have three meals
a day that they got

1118
00:57:41,430 --> 00:57:43,170
to put on the table.

1119
00:57:43,170 --> 00:57:45,130
So, it's going to make a big
difference.

1120
00:57:45,130 --> 00:57:47,130
So, thank you, guys.
>> You're welcome.

1121
00:57:47,130 --> 00:57:48,830
>> This is wonderful.
>> Let them know Merry Christmas

1122
00:57:48,830 --> 00:57:50,370
on behalf of us here
at the local, okay?

1123
00:57:50,370 --> 00:57:52,930
>> Absolutely, you guys are
just, just amazing, thank you.

1124
00:57:52,930 --> 00:57:56,270
And please, tell, tell all the
workers how grateful

1125
00:57:56,270 --> 00:57:57,900
these families will be.
>> We will.

1126
00:57:57,900 --> 00:58:00,870
>> I mean, this is not a small
problem.

1127
00:58:00,870 --> 00:58:02,700
The need is so great.

1128
00:58:02,700 --> 00:58:05,830
And I can tell you
that it's all races,

1129
00:58:05,830 --> 00:58:08,070
it's all income classes

1130
00:58:08,070 --> 00:58:09,700
that you might think someone
might be from.

1131
00:58:09,700 --> 00:58:11,900
But I can tell you that when you
see it,

1132
00:58:11,900 --> 00:58:15,000
and you deliver this type
of gift to somebody

1133
00:58:15,000 --> 00:58:18,600
who is in need, just the
gratitude that they show you

1134
00:58:18,600 --> 00:58:22,470
is incredible.

1135
00:58:22,470 --> 00:58:26,470
>> We actually know that people
are at greater risk of mortality

1136
00:58:26,470 --> 00:58:30,130
for over 20 years after they
lose their job due to,

1137
00:58:30,130 --> 00:58:32,670
due to no fault of their own, so
something like automation

1138
00:58:32,670 --> 00:58:34,770
or offshoring.

1139
00:58:34,770 --> 00:58:36,970
They're at higher risk
for cardiovascular disease,

1140
00:58:36,970 --> 00:58:42,500
they're at higher risk
for depression and suicide.

1141
00:58:42,500 --> 00:58:44,630
But then with the
intergenerational impacts,

1142
00:58:44,630 --> 00:58:48,230
we also see their children
are more likely--

1143
00:58:48,230 --> 00:58:50,300
children of parents who have
lost their job

1144
00:58:50,300 --> 00:58:53,670
due to automation-- are more
likely to repeat a grade,

1145
00:58:53,670 --> 00:58:55,570
they're more likely to drop out
of school,

1146
00:58:55,570 --> 00:58:57,700
they're more likely to be
suspended from school,

1147
00:58:57,700 --> 00:58:59,470
and they have lower educational
attainment

1148
00:58:59,470 --> 00:59:03,200
over their entire lifetimes.

1149
00:59:03,200 --> 00:59:06,200
>> It's the future of this,
not the past, that scares me.

1150
00:59:06,200 --> 00:59:08,700
Because I think we're in the
early decades

1151
00:59:08,700 --> 00:59:11,170
of what is a multi-decade
adjustment period.

1152
00:59:11,170 --> 00:59:14,000
♪ ♪

1153
00:59:14,000 --> 00:59:18,170
>> NARRATOR: The world is being
re-imagined.

1154
00:59:18,170 --> 00:59:20,370
This is a supermarket.

1155
00:59:20,370 --> 00:59:24,800
Robots, guided by A.I., pack
everything from soap powder

1156
00:59:24,800 --> 00:59:29,530
to cantaloupes for online
consumers.

1157
00:59:29,530 --> 00:59:31,600
Machines that pick groceries,

1158
00:59:31,600 --> 00:59:35,170
machines that can also read
reports, learn routines,

1159
00:59:35,170 --> 00:59:38,730
and comprehend are reaching deep
into factories,

1160
00:59:38,730 --> 00:59:41,870
stores, and offices.

1161
00:59:41,870 --> 00:59:43,800
At a college in Goshen, Indiana,

1162
00:59:43,800 --> 00:59:47,030
a group of local business and
political leaders come together

1163
00:59:47,030 --> 00:59:52,830
to try to understand the impact
of A.I. and the new machines.

1164
00:59:52,830 --> 00:59:54,870
Molly Kinder studies
the future of work

1165
00:59:54,870 --> 00:59:56,470
at a Washington think tank.

1166
00:59:56,470 --> 00:59:58,970
>> How many people have gone
into a fast-food restaurant

1167
00:59:58,970 --> 01:00:01,370
and done a self-ordering?

1168
01:00:01,370 --> 01:00:02,530
Anyone, yes?

1169
01:00:02,530 --> 01:00:04,400
Panera, for instance,
is doing this.

1170
01:00:04,400 --> 01:00:08,270
Cashier was my first job,
and in, in, where I live,

1171
01:00:08,270 --> 01:00:10,830
in Washington, DC, it's actually
the number-one occupation

1172
01:00:10,830 --> 01:00:12,300
for the greater DC region.

1173
01:00:12,300 --> 01:00:14,670
There are millions of people who
work in cashier positions.

1174
01:00:14,670 --> 01:00:17,000
This is not a futuristic
challenge,

1175
01:00:17,000 --> 01:00:19,800
this is something that's
happening sooner than we think.

1176
01:00:19,800 --> 01:00:24,770
In the popular discussions about
robots and automation and work,

1177
01:00:24,770 --> 01:00:28,600
almost every image is of a man
on a factory floor

1178
01:00:28,600 --> 01:00:29,770
or a truck driver.

1179
01:00:29,770 --> 01:00:32,900
And yet, in our data, when we
looked,

1180
01:00:32,900 --> 01:00:35,900
women disproportionately hold
the jobs that today

1181
01:00:35,900 --> 01:00:37,900
are at highest risk
of automation.

1182
01:00:37,900 --> 01:00:40,800
And that's not really being
talked about,

1183
01:00:40,800 --> 01:00:43,700
and that's in part because women
are over-represented

1184
01:00:43,700 --> 01:00:45,570
in some of these marginalized
occupations,

1185
01:00:45,570 --> 01:00:48,230
like a cashier
or a fast-food worker.

1186
01:00:48,230 --> 01:00:53,670
And also in large numbers
in clerical jobs in offices--

1187
01:00:53,670 --> 01:00:57,400
HR departments,
payroll, finance,

1188
01:00:57,400 --> 01:01:00,900
a lot of that is more routine
processing information,

1189
01:01:00,900 --> 01:01:03,530
processing paper,
transferring data.

1190
01:01:03,530 --> 01:01:08,000
That has huge potential for
automation.

1191
01:01:08,000 --> 01:01:11,000
A.I. is going to do
some of that, software,

1192
01:01:11,000 --> 01:01:12,900
robots are going to do
some of that.

1193
01:01:12,900 --> 01:01:14,830
So how many people are still
working

1194
01:01:14,830 --> 01:01:16,300
as switchboard operators?

1195
01:01:16,300 --> 01:01:18,170
Probably none in this country.

1196
01:01:18,170 --> 01:01:20,470
>> NARRATOR: The workplace of
the future will demand

1197
01:01:20,470 --> 01:01:24,230
different skills, and gaining
them, says Molly Kinder,

1198
01:01:24,230 --> 01:01:26,300
will depend on who
can afford them.

1199
01:01:26,300 --> 01:01:28,570
>> I mean it's not a good
situation in the United States.

1200
01:01:28,570 --> 01:01:30,330
There's been some excellent
research that says

1201
01:01:30,330 --> 01:01:32,800
that half of Americans
couldn't afford

1202
01:01:32,800 --> 01:01:35,300
a $400 unexpected expense.

1203
01:01:35,300 --> 01:01:38,630
And if you want to get to a
$1,000, there's even less.

1204
01:01:38,630 --> 01:01:41,270
So imagine you're going to go
out without a month's pay,

1205
01:01:41,270 --> 01:01:43,330
two months' pay, a year.

1206
01:01:43,330 --> 01:01:47,030
Imagine you want to put savings
toward a course

1207
01:01:47,030 --> 01:01:49,670
to, to redevelop your career.

1208
01:01:49,670 --> 01:01:52,330
People can't afford to take time
off of work.

1209
01:01:52,330 --> 01:01:56,600
They don't have a cushion, so
this lack of economic stability,

1210
01:01:56,600 --> 01:01:59,500
married with the disruptions in
people's careers,

1211
01:01:59,500 --> 01:02:01,230
is a really toxic mix.

1212
01:02:01,230 --> 01:02:03,630
>> (blowing whistle)

1213
01:02:03,630 --> 01:02:05,530
>> NARRATOR: The new machines
will penetrate every sector

1214
01:02:05,530 --> 01:02:08,600
of the economy:
from insurance companies

1215
01:02:08,600 --> 01:02:11,130
to human resource departments;

1216
01:02:11,130 --> 01:02:14,030
from law firms to the trading
floors of Wall Street.

1217
01:02:14,030 --> 01:02:15,470
>> Wall Street's
going through it,

1218
01:02:15,470 --> 01:02:16,970
but every industry is going
through it.

1219
01:02:16,970 --> 01:02:19,630
Every company is looking at all
of the disruptive technologies,

1220
01:02:19,630 --> 01:02:23,630
could be robotics or drones
or blockchain.

1221
01:02:23,630 --> 01:02:27,130
And whatever it is, every
company's using everything

1222
01:02:27,130 --> 01:02:29,570
that's developed, everything
that's disruptive,

1223
01:02:29,570 --> 01:02:32,370
in thinking about, "How do
I apply that to my business

1224
01:02:32,370 --> 01:02:35,000
to make myself more efficient?"

1225
01:02:35,000 --> 01:02:37,400
And what efficiency means is,
mostly,

1226
01:02:37,400 --> 01:02:40,670
"How do I do this
with fewer workers?"

1227
01:02:43,900 --> 01:02:47,330
And I do think that when we look
at some of the studies

1228
01:02:47,330 --> 01:02:50,700
about opportunity
in this country,

1229
01:02:50,700 --> 01:02:53,030
and the inequality
of opportunity,

1230
01:02:53,030 --> 01:02:55,830
the likelihood that you won't be
able to advance

1231
01:02:55,830 --> 01:02:59,300
from where your parents were, I
think that's, that's,

1232
01:02:59,300 --> 01:03:02,000
is very serious and gets
to the heart of the way

1233
01:03:02,000 --> 01:03:06,430
we like to think of America as
the land of opportunity.

1234
01:03:06,430 --> 01:03:08,970
>> NARRATOR: Inequality has been
rising in America.

1235
01:03:08,970 --> 01:03:13,270
It used to be the top 1%
of earners-- here in red--

1236
01:03:13,270 --> 01:03:16,670
owned a relatively small portion
of the country's wealth.

1237
01:03:16,670 --> 01:03:20,100
Middle and lower earners--
in blue-- had the largest share.

1238
01:03:20,100 --> 01:03:24,970
Then, 15 years ago,
the lines crossed.

1239
01:03:24,970 --> 01:03:29,500
And inequality has been
increasing ever since.

1240
01:03:29,500 --> 01:03:31,830
>> There's many factors that are
driving inequality today,

1241
01:03:31,830 --> 01:03:33,330
and unfortunately,
artificial intelligence--

1242
01:03:33,330 --> 01:03:38,270
without being thoughtful
about it--

1243
01:03:38,270 --> 01:03:41,430
is a driver for increased
inequality

1244
01:03:41,430 --> 01:03:43,900
because it's a form of
automation,

1245
01:03:43,900 --> 01:03:47,000
and automation is the
substitution of capital

1246
01:03:47,000 --> 01:03:49,070
for labor.

1247
01:03:49,070 --> 01:03:52,800
And when you do that,
the people with the capital win.

1248
01:03:52,800 --> 01:03:55,800
So Karl Marx was right,

1249
01:03:55,800 --> 01:03:58,100
it's a struggle between capital
and labor,

1250
01:03:58,100 --> 01:03:59,600
and with artificial
intelligence,

1251
01:03:59,600 --> 01:04:02,830
we're putting our finger on the
scale on the side of capital,

1252
01:04:02,830 --> 01:04:05,770
and how we wish to distribute
the benefits,

1253
01:04:05,770 --> 01:04:07,230
the economic benefits,

1254
01:04:07,230 --> 01:04:09,130
that that will create is going
to be a major

1255
01:04:09,130 --> 01:04:13,430
moral consideration for society
over the next several decades.

1256
01:04:13,430 --> 01:04:19,370
>> This is really an outgrowth
of the increasing gaps

1257
01:04:19,370 --> 01:04:23,600
of haves and have-nots--
the wealthy getting wealthier,

1258
01:04:23,600 --> 01:04:24,870
the poor getting poorer.

1259
01:04:24,870 --> 01:04:28,400
It may not be specifically
related to A.I.,

1260
01:04:28,400 --> 01:04:30,770
but as... but A.I. will
exacerbate that.

1261
01:04:30,770 --> 01:04:36,430
And that, I think, will tear
the society apart,

1262
01:04:36,430 --> 01:04:38,930
because the rich will have just
too much,

1263
01:04:38,930 --> 01:04:44,200
and those who are have-nots will
have perhaps very little way

1264
01:04:44,200 --> 01:04:46,700
of digging themselves
out of the hole.

1265
01:04:46,700 --> 01:04:50,800
And with A.I. making its impact,
it, it'll be worse, I think.

1266
01:04:50,800 --> 01:04:56,170
♪ ♪

1267
01:04:56,170 --> 01:05:01,870
(crowd cheering and applauding)

1268
01:05:01,870 --> 01:05:05,000
>> (speaking on P.A.)

1269
01:05:05,000 --> 01:05:08,830
I'm here today for one main
reason.

1270
01:05:08,830 --> 01:05:12,630
To say thank you to Ohio.

1271
01:05:12,630 --> 01:05:17,400
(crowd cheering and applauding)

1272
01:05:17,400 --> 01:05:20,300
>> I think the Trump vote
was a protest.

1273
01:05:20,300 --> 01:05:22,030
I mean that for whatever reason,

1274
01:05:22,030 --> 01:05:25,000
whatever the hot button was
that, you know,

1275
01:05:25,000 --> 01:05:28,800
that really hit home with these
Americans who voted for him

1276
01:05:28,800 --> 01:05:30,800
were, it was a protest vote.

1277
01:05:30,800 --> 01:05:34,530
They didn't like the direction
things were going.

1278
01:05:34,530 --> 01:05:38,270
(crowd booing and shouting)

1279
01:05:39,170 --> 01:05:40,700
I'm scared.

1280
01:05:40,700 --> 01:05:42,900
I'm gonna be quite honest with
you, I worry about the future

1281
01:05:42,900 --> 01:05:47,100
of not just this country,
but the, the entire globe.

1282
01:05:47,100 --> 01:05:51,100
If we continue to go in an
automated system,

1283
01:05:51,100 --> 01:05:52,730
what are we going to do?

1284
01:05:52,730 --> 01:05:54,870
Now I've got a group of people
at the top

1285
01:05:54,870 --> 01:05:57,170
that are making all the money
and I don't have anybody

1286
01:05:57,170 --> 01:06:00,070
in the middle
that can support a family.

1287
01:06:00,070 --> 01:06:05,330
So do we have to go to the point
where we crash to come back?

1288
01:06:05,330 --> 01:06:06,630
And in this case,

1289
01:06:06,630 --> 01:06:08,030
the automation's already gonna
be there,

1290
01:06:08,030 --> 01:06:09,630
so I don't know how
you come back.

1291
01:06:09,630 --> 01:06:11,730
I'm really worried
about where this,

1292
01:06:11,730 --> 01:06:13,700
where this leads us
in the future.

1293
01:06:13,700 --> 01:06:17,000
♪ ♪

1294
01:06:27,200 --> 01:06:28,730
>> NARRATOR: The future is
largely being shaped

1295
01:06:28,730 --> 01:06:32,370
by a few hugely successful
tech companies.

1296
01:06:32,370 --> 01:06:35,700
They're constantly buying up
successful smaller companies

1297
01:06:35,700 --> 01:06:37,900
and recruiting talent.

1298
01:06:37,900 --> 01:06:39,830
Between the U.S. and China,

1299
01:06:39,830 --> 01:06:42,800
they employ a great majority of
the leading A.I. researchers

1300
01:06:42,800 --> 01:06:46,070
and scientists.

1301
01:06:46,070 --> 01:06:48,370
In the course of amassing
such power,

1302
01:06:48,370 --> 01:06:51,930
they've also become among the
richest companies in the world.

1303
01:06:51,930 --> 01:06:58,130
>> A.I. really is the ultimate
tool of wealth creation.

1304
01:06:58,130 --> 01:07:03,730
Think about the massive data
that, you know, Facebook has

1305
01:07:03,730 --> 01:07:08,270
on user preferences, and how
it can very smartly target

1306
01:07:08,270 --> 01:07:10,400
an ad that you might buy
something

1307
01:07:10,400 --> 01:07:16,430
and get a much bigger cut that
a smaller company couldn't do.

1308
01:07:16,430 --> 01:07:18,970
Same with Google,
same with Amazon.

1309
01:07:18,970 --> 01:07:23,300
So it's... A.I. is a set of
tools

1310
01:07:23,300 --> 01:07:26,500
that helps you maximize an
objective function,

1311
01:07:26,500 --> 01:07:32,200
and that objective function
initially will simply be,

1312
01:07:32,200 --> 01:07:34,200
make more money.

1313
01:07:34,200 --> 01:07:36,400
>> NARRATOR: And it is how these
companies make that money,

1314
01:07:36,400 --> 01:07:41,230
and how their algorithms reach
deeper and deeper into our work,

1315
01:07:41,230 --> 01:07:42,870
our daily lives,
and our democracy,

1316
01:07:42,870 --> 01:07:47,870
that makes many people
increasingly uncomfortable.

1317
01:07:47,870 --> 01:07:52,330
Pedro Domingos wrote the book
"The Master Algorithm."

1318
01:07:52,330 --> 01:07:55,470
>> Everywhere you go,
you generate a cloud of data.

1319
01:07:55,470 --> 01:07:58,500
You're trailing data, everything
that you do is producing data.

1320
01:07:58,500 --> 01:07:59,900
And then there are computers
looking at that data

1321
01:07:59,900 --> 01:08:02,970
that are learning, and these
computers are essentially

1322
01:08:02,970 --> 01:08:05,100
trying to serve you better.

1323
01:08:05,100 --> 01:08:07,270
They're trying to personalize
things to you.

1324
01:08:07,270 --> 01:08:08,900
They're trying to adapt
the world to you.

1325
01:08:08,900 --> 01:08:10,800
So on the one hand,
this is great,

1326
01:08:10,800 --> 01:08:12,430
because the world will get
adapted to you

1327
01:08:12,430 --> 01:08:15,900
without you even having to
explicitly adapt it.

1328
01:08:15,900 --> 01:08:18,600
There's also a danger, because
the entities in the companies

1329
01:08:18,600 --> 01:08:20,100
that are in control of those
algorithms

1330
01:08:20,100 --> 01:08:22,000
don't necessarily have the same
goals as you,

1331
01:08:22,000 --> 01:08:24,800
and this is where I think people
need to be aware that,

1332
01:08:24,800 --> 01:08:30,000
what's going on, so they can
have more control over it.

1333
01:08:30,000 --> 01:08:31,630
>> You know, we came into this
new world thinking

1334
01:08:31,630 --> 01:08:35,530
that we were users
of social media.

1335
01:08:35,530 --> 01:08:37,800
It didn't occur to us
that social media

1336
01:08:37,800 --> 01:08:40,430
was actually using us.

1337
01:08:40,430 --> 01:08:43,830
We thought that we were
searching Google.

1338
01:08:43,830 --> 01:08:48,700
We had no idea that Google
was searching us.

1339
01:08:48,700 --> 01:08:50,800
>> NARRATOR: Shoshana Zuboff
is a Harvard Business School

1340
01:08:50,800 --> 01:08:52,870
professor emerita.

1341
01:08:52,870 --> 01:08:55,930
In 1988, she wrote a definitive
book called

1342
01:08:55,930 --> 01:08:58,370
"In the Age of
the Smart Machine."

1343
01:08:58,370 --> 01:09:01,970
For the last seven years,
she has worked on a new book,

1344
01:09:01,970 --> 01:09:04,730
making the case that we have now
entered a new phase

1345
01:09:04,730 --> 01:09:09,970
of the economy, which she calls
"surveillance capitalism."

1346
01:09:09,970 --> 01:09:16,330
>> So, famously, industrial
capitalism claimed nature.

1347
01:09:16,330 --> 01:09:20,300
Innocent rivers, and meadows,
and forests, and so forth,

1348
01:09:20,300 --> 01:09:25,130
for the market dynamic to be
reborn as real estate,

1349
01:09:25,130 --> 01:09:27,970
as land that could be sold
and purchased.

1350
01:09:27,970 --> 01:09:32,230
Industrial capitalism claimed
work for the market dynamic

1351
01:09:32,230 --> 01:09:35,170
to reborn, to be reborn as labor

1352
01:09:35,170 --> 01:09:38,700
that could be sold
and purchased.

1353
01:09:38,700 --> 01:09:40,830
Now, here comes surveillance
capitalism,

1354
01:09:40,830 --> 01:09:47,700
following this pattern, but with
a dark and startling twist.

1355
01:09:47,700 --> 01:09:51,700
What surveillance capitalism
claims is private,

1356
01:09:51,700 --> 01:09:53,930
human experience.

1357
01:09:53,930 --> 01:09:58,970
Private, human experience is
claimed as a free source

1358
01:09:58,970 --> 01:10:05,800
of raw material, fabricated into
predictions of human behavior.

1359
01:10:05,800 --> 01:10:09,370
And it turns out that there are
a lot of businesses

1360
01:10:09,370 --> 01:10:14,270
that really want to know what
we will do now, soon, and later.

1361
01:10:17,800 --> 01:10:19,430
>> NARRATOR: Like most people,

1362
01:10:19,430 --> 01:10:21,470
Alastair Mactaggart
had no idea

1363
01:10:21,470 --> 01:10:23,600
about this new surveillance
business,

1364
01:10:23,600 --> 01:10:27,370
until one evening in 2015.

1365
01:10:27,370 --> 01:10:30,230
>> I had a conversation with a
fellow who's an engineer,

1366
01:10:30,230 --> 01:10:33,870
and I was just talking to him
one night at a,

1367
01:10:33,870 --> 01:10:35,270
you know, a dinner,
at a cocktail party.

1368
01:10:35,270 --> 01:10:37,470
And I... there had been
something in the press that day

1369
01:10:37,470 --> 01:10:39,900
about privacy in the paper,
and I remember asking him--

1370
01:10:39,900 --> 01:10:41,870
he worked for Google-- "What's
the big deal about all,

1371
01:10:41,870 --> 01:10:44,270
why are people so worked up
about it?"

1372
01:10:44,270 --> 01:10:45,670
And I thought it was gonna be
one of those conversations,

1373
01:10:45,670 --> 01:10:49,330
like, with, you know, if you
ever ask an airline pilot,

1374
01:10:49,330 --> 01:10:50,570
"Should I be worried about
flying?"

1375
01:10:50,570 --> 01:10:52,070
and they say,
"Oh, the most dangerous part

1376
01:10:52,070 --> 01:10:55,030
is coming to the airport,
you know, in the car."

1377
01:10:55,030 --> 01:10:57,730
And he said, "Oh, you'd be
horrified

1378
01:10:57,730 --> 01:10:59,900
if you knew how much we knew
about you."

1379
01:10:59,900 --> 01:11:02,100
And I remember that kind of
stuck in my head,

1380
01:11:02,100 --> 01:11:04,530
because it was not
what I expected.

1381
01:11:04,530 --> 01:11:08,400
>> NARRATOR: That question
would change his life.

1382
01:11:08,400 --> 01:11:09,930
A successful California real
estate developer,

1383
01:11:09,930 --> 01:11:15,730
Mactaggart began researching
the new business model.

1384
01:11:15,730 --> 01:11:17,870
>> What I've learned since is
that their entire business

1385
01:11:17,870 --> 01:11:20,630
is learning as much about you
as they can.

1386
01:11:20,630 --> 01:11:21,970
Everything about your thoughts,
and your desires,

1387
01:11:21,970 --> 01:11:25,730
and your dreams,
and who your friends are,

1388
01:11:25,730 --> 01:11:27,700
and what you're thinking, what
your private thoughts are.

1389
01:11:27,700 --> 01:11:29,770
And with that,
that's true power.

1390
01:11:29,770 --> 01:11:33,430
And so, I think...
I didn't know that at the time.

1391
01:11:33,430 --> 01:11:35,470
That their entire business
is basically mining

1392
01:11:35,470 --> 01:11:37,200
the data of your life.

1393
01:11:37,200 --> 01:11:39,070
♪ ♪

1394
01:11:39,070 --> 01:11:43,200
>> NARRATOR: Shoshana Zuboff had
been doing her own research.

1395
01:11:43,200 --> 01:11:45,970
>> You know, I'd been reading
and reading and reading.

1396
01:11:45,970 --> 01:11:48,370
From patents, to transcripts
of earnings calls,

1397
01:11:48,370 --> 01:11:50,430
research reports.

1398
01:11:50,430 --> 01:11:52,130
And, you know,
just literally everything,

1399
01:11:52,130 --> 01:11:56,730
for years and years and years.

1400
01:11:56,730 --> 01:11:57,930
>> NARRATOR: Her studies
included the early days

1401
01:11:57,930 --> 01:12:00,400
of Google, started in 1998

1402
01:12:00,400 --> 01:12:02,570
by two young Stanford grad
students,

1403
01:12:02,570 --> 01:12:06,330
Sergey Brin and Larry Page.

1404
01:12:06,330 --> 01:12:10,000
In the beginning, they had no
clear business model.

1405
01:12:10,000 --> 01:12:13,900
Their unofficial motto was,
"Don't Be Evil."

1406
01:12:13,900 --> 01:12:16,270
>> Right from the start,
the founders,

1407
01:12:16,270 --> 01:12:20,170
Larry Page and Sergey Brin,
they had been very public

1408
01:12:20,170 --> 01:12:26,200
about their antipathy
toward advertising.

1409
01:12:26,200 --> 01:12:31,300
Advertising would distort
the internet

1410
01:12:31,300 --> 01:12:37,800
and it would distort and
disfigure the, the purity

1411
01:12:37,800 --> 01:12:41,700
of any search engine,
including their own.

1412
01:12:41,700 --> 01:12:43,070
>> Once in love with e-commerce,

1413
01:12:43,070 --> 01:12:46,300
Wall Street has turned its back
on the dotcoms.

1414
01:12:46,300 --> 01:12:49,400
>> NARRATOR: Then came the
dotcom crash of the early 2000s.

1415
01:12:49,400 --> 01:12:51,400
>> ...has left hundreds of
unprofitable internet companies

1416
01:12:51,400 --> 01:12:54,830
begging for love and money.

1417
01:12:54,830 --> 01:12:56,730
>> NARRATOR: While Google had
rapidly become the default

1418
01:12:56,730 --> 01:12:58,830
search engine for tens of
millions of users,

1419
01:12:58,830 --> 01:13:04,070
their investors were pressuring
them to make more money.

1420
01:13:04,070 --> 01:13:06,100
Without a new business model,

1421
01:13:06,100 --> 01:13:10,530
the founders knew that the young
company was in danger.

1422
01:13:10,530 --> 01:13:14,470
>> In this state of emergency,
the founders decided,

1423
01:13:14,470 --> 01:13:19,300
"We've simply got to find a way
to save this company."

1424
01:13:19,300 --> 01:13:25,630
And so, parallel to this were
another set of discoveries,

1425
01:13:25,630 --> 01:13:30,970
where it turns out that whenever
we search or whenever we browse,

1426
01:13:30,970 --> 01:13:35,030
we're leaving behind traces--
digital traces--

1427
01:13:35,030 --> 01:13:37,230
of our behavior.

1428
01:13:37,230 --> 01:13:39,330
And those traces,
back in these days,

1429
01:13:39,330 --> 01:13:43,500
were called digital exhaust.

1430
01:13:43,500 --> 01:13:45,400
>> NARRATOR: They realized how
valuable this data could be

1431
01:13:45,400 --> 01:13:47,700
by applying machine learning
algorithms

1432
01:13:47,700 --> 01:13:52,570
to predict users' interests.

1433
01:13:52,570 --> 01:13:54,670
>> What happened was,
they decided to turn

1434
01:13:54,670 --> 01:13:57,630
to those data logs
in a systematic way,

1435
01:13:57,630 --> 01:14:01,670
and to begin to use these
surplus data

1436
01:14:01,670 --> 01:14:06,970
as a way to come up with
fine-grained predictions

1437
01:14:06,970 --> 01:14:11,330
of what a user would click on,
what kind of ad

1438
01:14:11,330 --> 01:14:14,230
a user would click on.

1439
01:14:14,230 --> 01:14:18,700
And inside Google, they started
seeing these revenues

1440
01:14:18,700 --> 01:14:22,830
pile up at a startling rate.

1441
01:14:22,830 --> 01:14:26,000
They realized that they had to
keep it secret.

1442
01:14:26,000 --> 01:14:28,930
They didn't want anyone to know
how much money they were making,

1443
01:14:28,930 --> 01:14:31,500
or how they were making it.

1444
01:14:31,500 --> 01:14:35,700
Because users had no idea that
these extra-behavioral data

1445
01:14:35,700 --> 01:14:39,070
that told so much about them,
you know, was just out there,

1446
01:14:39,070 --> 01:14:43,700
and now it was being used
to predict their future.

1447
01:14:43,700 --> 01:14:46,100
>> NARRATOR: When Google's
I.P.O. took place

1448
01:14:46,100 --> 01:14:47,170
just a few years later,

1449
01:14:47,170 --> 01:14:49,700
the company had a market
capitalization

1450
01:14:49,700 --> 01:14:53,230
of around $23 billion.

1451
01:14:53,230 --> 01:14:56,000
Google's stock was now as
valuable as General Motors.

1452
01:14:56,000 --> 01:14:59,100
♪ ♪

1453
01:14:59,100 --> 01:15:02,330
>> And it was only when Google
went public in 2004

1454
01:15:02,330 --> 01:15:05,600
that the numbers were released.

1455
01:15:05,600 --> 01:15:10,600
And it's at that point that we
learn that between the year 2000

1456
01:15:10,600 --> 01:15:14,500
and the year 2004, Google's
revenue line increased

1457
01:15:14,500 --> 01:15:20,400
by 3,590%.

1458
01:15:20,400 --> 01:15:22,470
>> Let's talk a little about
information, and search,

1459
01:15:22,470 --> 01:15:24,630
and how people consume it.

1460
01:15:24,630 --> 01:15:27,230
>> NARRATOR: By 2010, the C.E.O.
of Google, Eric Schmidt,

1461
01:15:27,230 --> 01:15:29,570
would tell "The Atlantic"
magazine...

1462
01:15:29,570 --> 01:15:33,230
>> ...is, we don't need you to
type at all.

1463
01:15:33,230 --> 01:15:35,930
Because we know where you are,
with your permission,

1464
01:15:35,930 --> 01:15:39,630
we know where you've been,
with your permission.

1465
01:15:39,630 --> 01:15:41,700
We can more or less guess what
you're thinking about.

1466
01:15:41,700 --> 01:15:44,200
(audience laughing)
Now, is that over the line?

1467
01:15:44,200 --> 01:15:45,800
>> NARRATOR: Eric Schmidt
and Google declined

1468
01:15:45,800 --> 01:15:49,370
to be interviewed
for this program.

1469
01:15:49,370 --> 01:15:52,670
Google's new business model for
predicting users' profiles

1470
01:15:52,670 --> 01:15:58,100
had migrated to other companies,
particularly Facebook.

1471
01:15:58,100 --> 01:16:00,130
Roger McNamee was an early
investor

1472
01:16:00,130 --> 01:16:02,330
and adviser to Facebook.

1473
01:16:02,330 --> 01:16:05,900
He's now a critic, and wrote
a book about the company.

1474
01:16:05,900 --> 01:16:08,930
He says he's concerned about how
widely companies like Facebook

1475
01:16:08,930 --> 01:16:11,770
and Google have been casting
the net for data.

1476
01:16:11,770 --> 01:16:13,530
>> And then they realized,
"Wait a minute,

1477
01:16:13,530 --> 01:16:16,470
there's all this data in
the economy we don't have."

1478
01:16:16,470 --> 01:16:18,700
So they went to credit card
processors,

1479
01:16:18,700 --> 01:16:20,430
and credit rating services,

1480
01:16:20,430 --> 01:16:23,030
and said, "We want
to buy your data."

1481
01:16:23,030 --> 01:16:25,100
They go to health and wellness
apps and say,

1482
01:16:25,100 --> 01:16:26,600
"Hey, you got women's
menstrual cycles?

1483
01:16:26,600 --> 01:16:28,530
We want all that stuff."

1484
01:16:28,530 --> 01:16:30,830
Why are they doing that?

1485
01:16:30,830 --> 01:16:34,430
They're doing that because
behavioral prediction

1486
01:16:34,430 --> 01:16:38,270
is about taking uncertainty
out of life.

1487
01:16:38,270 --> 01:16:40,670
Advertising and marketing
are all about uncertainty--

1488
01:16:40,670 --> 01:16:43,530
you never really know who's
going to buy your product.

1489
01:16:43,530 --> 01:16:45,500
Until now.

1490
01:16:45,500 --> 01:16:49,870
We have to recognize that we
gave technology a place

1491
01:16:49,870 --> 01:16:55,700
in our lives
that it had not earned.

1492
01:16:55,700 --> 01:17:00,530
That essentially, because
technology always made things

1493
01:17:00,530 --> 01:17:03,530
better in the '50s, '60s, '70s,
'80s, and '90s,

1494
01:17:03,530 --> 01:17:07,030
we developed a sense of
inevitability

1495
01:17:07,030 --> 01:17:10,000
that it will always make things
better.

1496
01:17:10,000 --> 01:17:13,630
We developed a trust, and the
industry earned good will

1497
01:17:13,630 --> 01:17:20,330
that Facebook and Google have
cashed in.

1498
01:17:20,330 --> 01:17:23,500
>> NARRATOR: The model is simply
this: provide a free service--

1499
01:17:23,500 --> 01:17:26,630
like Facebook-- and in exchange,
you collect the data

1500
01:17:26,630 --> 01:17:28,870
of the millions who use it.

1501
01:17:28,870 --> 01:17:31,800
♪ ♪

1502
01:17:31,800 --> 01:17:37,470
And every sliver of information
is valuable.

1503
01:17:37,470 --> 01:17:41,370
>> It's not just what you post,
it's that you post.

1504
01:17:41,370 --> 01:17:44,800
It's not just that you make
plans to see your friends later.

1505
01:17:44,800 --> 01:17:47,470
It's whether you say,
"I'll see you later,"

1506
01:17:47,470 --> 01:17:51,000
or, "I'll see you at 6:45."

1507
01:17:51,000 --> 01:17:54,030
It's not just that you talk
about the things

1508
01:17:54,030 --> 01:17:56,330
that you have to do today.

1509
01:17:56,330 --> 01:17:59,230
It's whether you simply rattle
them off in a,

1510
01:17:59,230 --> 01:18:04,500
in a rambling paragraph,
or list them as bullet points.

1511
01:18:04,500 --> 01:18:09,030
All of these tiny signals are
the behavioral surplus

1512
01:18:09,030 --> 01:18:13,630
that turns out to have immense
predictive value.

1513
01:18:13,630 --> 01:18:16,300
>> NARRATOR: In 2010, Facebook
experimented

1514
01:18:16,300 --> 01:18:19,070
with A.I.'s predictive powers
in what they called

1515
01:18:19,070 --> 01:18:21,830
a "social contagion" experiment.

1516
01:18:21,830 --> 01:18:25,470
They wanted to see if, through
online messaging,

1517
01:18:25,470 --> 01:18:30,070
they could influence real-world
behavior.

1518
01:18:30,070 --> 01:18:32,670
The aim was to get more people
to the polls

1519
01:18:32,670 --> 01:18:34,530
in the 2010 midterm elections.

1520
01:18:34,530 --> 01:18:38,000
>> Cleveland, I need you to keep
on fighting.

1521
01:18:38,000 --> 01:18:41,030
I need you to keep on believing.

1522
01:18:41,030 --> 01:18:42,530
>> NARRATOR: They offered
61 million users

1523
01:18:42,530 --> 01:18:45,470
an "I voted" button together
with faces of friends

1524
01:18:45,470 --> 01:18:47,330
who had voted.

1525
01:18:47,330 --> 01:18:52,000
A subset of users received
just the button.

1526
01:18:52,000 --> 01:18:56,130
In the end, they claimed to have
nudged 340,000 people to vote.

1527
01:19:00,270 --> 01:19:03,170
They would conduct other
"massive contagion" experiments.

1528
01:19:03,170 --> 01:19:06,600
Among them, one showing that by
adjusting their feeds,

1529
01:19:06,600 --> 01:19:12,000
they could make users
happy or sad.

1530
01:19:12,000 --> 01:19:13,300
>> When they went to write up
these findings,

1531
01:19:13,300 --> 01:19:16,270
they boasted about two things.

1532
01:19:16,270 --> 01:19:19,770
One was, "Oh, my goodness.

1533
01:19:19,770 --> 01:19:24,770
Now we know that we can use cues
in the online environment

1534
01:19:24,770 --> 01:19:28,830
to change real-world behavior.

1535
01:19:28,830 --> 01:19:31,770
That's big news."

1536
01:19:31,770 --> 01:19:35,600
The second thing that they
understood, and they celebrated,

1537
01:19:35,600 --> 01:19:39,030
was that, "We can do this in a
way that bypasses

1538
01:19:39,030 --> 01:19:43,370
the users' awareness."

1539
01:19:43,370 --> 01:19:47,500
>> Private corporations have
built a corporate surveillance

1540
01:19:47,500 --> 01:19:52,370
state without our awareness
or permission.

1541
01:19:52,370 --> 01:19:55,230
And the systems necessary to
make it work

1542
01:19:55,230 --> 01:19:58,430
are getting a lot better,
specifically with what are known

1543
01:19:58,430 --> 01:20:01,500
as internet-of-things
smart appliances, you know,

1544
01:20:01,500 --> 01:20:04,430
powered by the Alexa voice
recognition system,

1545
01:20:04,430 --> 01:20:06,870
or the Google Home system.

1546
01:20:06,870 --> 01:20:09,700
>> Okay, Google,
play the morning playlist.

1547
01:20:09,700 --> 01:20:12,200
>> Okay, playing morning
playlist.

1548
01:20:12,200 --> 01:20:14,300
♪ ♪

1549
01:20:14,300 --> 01:20:16,370
>> Okay, Google,
play music in all rooms.

1550
01:20:16,370 --> 01:20:18,030
♪ ♪

1551
01:20:18,030 --> 01:20:21,000
>> And those will put the
surveillance in places

1552
01:20:21,000 --> 01:20:22,270
we've never had it before--

1553
01:20:22,270 --> 01:20:24,800
living rooms, kitchens,
bedrooms.

1554
01:20:24,800 --> 01:20:27,300
And I find all of that
terrifying.

1555
01:20:27,300 --> 01:20:29,630
>> Okay, Google, I'm listening.

1556
01:20:29,630 --> 01:20:31,400
>> NARRATOR: The companies say
they're not using the data

1557
01:20:31,400 --> 01:20:36,770
to target ads, but to help A.I.
improve the user experience.

1558
01:20:36,770 --> 01:20:40,030
>> Alexa, turn on the fan.

1559
01:20:40,030 --> 01:20:41,500
(fan clicks on)

1560
01:20:41,500 --> 01:20:42,670
>> Okay.

1561
01:20:42,670 --> 01:20:43,830
>> NARRATOR: Meanwhile, they are
researching

1562
01:20:43,830 --> 01:20:45,930
and applying for patents

1563
01:20:45,930 --> 01:20:48,900
to expand their reach
into homes and lives.

1564
01:20:48,900 --> 01:20:51,230
>> Alexa, take a video.

1565
01:20:51,230 --> 01:20:52,670
(camera chirps)

1566
01:20:52,670 --> 01:20:54,570
>> The more and more that you
use spoken interfaces--

1567
01:20:54,570 --> 01:20:57,800
so smart speakers-- they're
being trained

1568
01:20:57,800 --> 01:21:00,770
not just to recognize
who you are,

1569
01:21:00,770 --> 01:21:03,970
but they're starting to take
baselines

1570
01:21:03,970 --> 01:21:09,770
and compare changes over time.

1571
01:21:09,770 --> 01:21:12,970
So does your cadence increase
or decrease?

1572
01:21:12,970 --> 01:21:15,600
Are you sneezing
while you're talking?

1573
01:21:15,600 --> 01:21:18,730
Is your voice a little wobbly?

1574
01:21:18,730 --> 01:21:21,570
The purpose of doing this is
to understand

1575
01:21:21,570 --> 01:21:24,430
more about you in real time.

1576
01:21:24,430 --> 01:21:27,830
So that a system could make
inferences, perhaps,

1577
01:21:27,830 --> 01:21:30,600
like, do you have a cold?

1578
01:21:30,600 --> 01:21:33,370
Are you in a manic phase?

1579
01:21:33,370 --> 01:21:35,100
Are you feeling depressed?

1580
01:21:35,100 --> 01:21:38,700
So that is an extraordinary
amount of information

1581
01:21:38,700 --> 01:21:41,670
that can be gleaned by you
simply waking up

1582
01:21:41,670 --> 01:21:45,330
and asking your smart speaker,
"What's the weather today?"

1583
01:21:45,330 --> 01:21:47,430
>> Alexa, what's the weather
for tonight?

1584
01:21:47,430 --> 01:21:50,630
>> Currently, in Pasadena, it's
58 degrees with cloudy skies.

1585
01:21:50,630 --> 01:21:52,900
>> Inside it is, then.

1586
01:21:52,900 --> 01:21:54,700
Dinner!

1587
01:21:54,700 --> 01:21:57,630
>> The point is that this
is the same

1588
01:21:57,630 --> 01:22:01,800
micro-behavioral targeting that
is directed

1589
01:22:01,800 --> 01:22:08,670
toward individuals based on
intimate, detailed understanding

1590
01:22:08,670 --> 01:22:11,000
of personalities.

1591
01:22:11,000 --> 01:22:15,600
So this is precisely what
Cambridge Analytica did,

1592
01:22:15,600 --> 01:22:19,300
simply pivoting from
the advertisers

1593
01:22:19,300 --> 01:22:23,500
to the political outcomes.

1594
01:22:23,500 --> 01:22:26,170
>> NARRATOR: The Cambridge
Analytica scandal of 2018

1595
01:22:26,170 --> 01:22:29,970
engulfed Facebook, forcing
Mark Zuckerberg to appear

1596
01:22:29,970 --> 01:22:32,900
before Congress to explain how
the data

1597
01:22:32,900 --> 01:22:36,300
of up to 87 million Facebook
users had been harvested

1598
01:22:36,300 --> 01:22:42,870
by a political consulting
company based in the U.K.

1599
01:22:42,870 --> 01:22:45,500
The purpose was to target
and manipulate voters

1600
01:22:45,500 --> 01:22:48,170
in the 2016 presidential
campaign,

1601
01:22:48,170 --> 01:22:51,800
as well as the Brexit
referendum.

1602
01:22:51,800 --> 01:22:53,870
Cambridge Analytica had been
largely funded

1603
01:22:53,870 --> 01:22:58,830
by conservative hedge fund
billionaire Robert Mercer.

1604
01:22:58,830 --> 01:23:02,370
>> And now we know that any
billionaire with enough money,

1605
01:23:02,370 --> 01:23:04,200
who can buy the data,

1606
01:23:04,200 --> 01:23:07,230
buy the machine intelligence
capabilities,

1607
01:23:07,230 --> 01:23:10,700
buy the skilled data scientists,

1608
01:23:10,700 --> 01:23:16,330
you know, they too can
commandeer the public,

1609
01:23:16,330 --> 01:23:23,370
and infect and infiltrate and
upend our democracy

1610
01:23:23,370 --> 01:23:27,770
with the same methodologies that
surveillance capitalism

1611
01:23:27,770 --> 01:23:32,270
uses every single day.

1612
01:23:32,270 --> 01:23:35,070
>> We didn't take a broad enough
view of our responsibility,

1613
01:23:35,070 --> 01:23:37,230
and that was a big mistake.

1614
01:23:37,230 --> 01:23:40,830
And it was my mistake,
and I'm sorry.

1615
01:23:40,830 --> 01:23:41,770
>> NARRATOR:
Zuckerberg has apologized

1616
01:23:41,770 --> 01:23:44,370
for numerous violations of
privacy,

1617
01:23:44,370 --> 01:23:47,130
and his company was recently
fined $5 billion

1618
01:23:47,130 --> 01:23:50,300
by the Federal Trade Commission.

1619
01:23:50,300 --> 01:23:53,230
He has said Facebook will now
make data protection a priority,

1620
01:23:53,230 --> 01:23:56,800
and the company has suspended
tens of thousands

1621
01:23:56,800 --> 01:23:59,400
of third-party apps from its
platform

1622
01:23:59,400 --> 01:24:02,930
as a result of an internal
investigation.

1623
01:24:02,930 --> 01:24:06,870
>> You know, I wish I could say
that after Cambridge Analytica,

1624
01:24:06,870 --> 01:24:09,030
we've learned our lesson and
that everything will be much

1625
01:24:09,030 --> 01:24:12,930
better after that, but I'm
afraid the opposite is true.

1626
01:24:12,930 --> 01:24:14,900
In some ways, Cambridge
Analytica was using tools

1627
01:24:14,900 --> 01:24:16,830
that were ten years old.

1628
01:24:16,830 --> 01:24:18,570
It was really, in some ways,
old-school,

1629
01:24:18,570 --> 01:24:20,670
first-wave data science.

1630
01:24:20,670 --> 01:24:22,270
What we're looking at now,
with current tools

1631
01:24:22,270 --> 01:24:26,270
and machine learning, is that
the ability for manipulation,

1632
01:24:26,270 --> 01:24:28,930
both in terms of elections
and opinions,

1633
01:24:28,930 --> 01:24:31,700
but more broadly,
just how information travels--

1634
01:24:31,700 --> 01:24:34,630
that is a much bigger problem,

1635
01:24:34,630 --> 01:24:36,200
and certainly much more serious
than what we faced

1636
01:24:36,200 --> 01:24:40,070
with Cambridge Analytica.

1637
01:24:40,070 --> 01:24:43,670
>> NARRATOR: A.I. pioneer Yoshua
Bengio also has concerns

1638
01:24:43,670 --> 01:24:48,470
about how his algorithms
are being used.

1639
01:24:48,470 --> 01:24:51,600
>> So the A.I.s are tools.

1640
01:24:51,600 --> 01:24:56,330
And they will serve the people
who control those tools.

1641
01:24:56,330 --> 01:25:01,700
If those people's interests go
against the, the values

1642
01:25:01,700 --> 01:25:04,670
of democracy, then democracy is
in danger.

1643
01:25:04,670 --> 01:25:10,430
So I believe that scientists
who contribute to science,

1644
01:25:10,430 --> 01:25:14,670
when that science can or will
have an impact on society,

1645
01:25:14,670 --> 01:25:17,670
those scientists have a
responsibility.

1646
01:25:17,670 --> 01:25:19,700
It's a little bit like the
physicists of,

1647
01:25:19,700 --> 01:25:21,800
around the Second World War,

1648
01:25:21,800 --> 01:25:25,100
who rose up to tell
the governments,

1649
01:25:25,100 --> 01:25:29,130
"Wait, nuclear power
can be dangerous

1650
01:25:29,130 --> 01:25:31,930
and nuclear war can be really,
really destructive."

1651
01:25:31,930 --> 01:25:36,400
And today, the equivalent of a
physicist of the '40s and '50s

1652
01:25:36,400 --> 01:25:38,730
and '60s are,
are the computer scientists

1653
01:25:38,730 --> 01:25:41,430
who are doing machine learning
and A.I.

1654
01:25:41,430 --> 01:25:45,000
♪ ♪

1655
01:25:45,000 --> 01:25:46,300
>> NARRATOR: One person who
wanted to do something

1656
01:25:46,300 --> 01:25:49,330
about the dangers was not
a computer scientist,

1657
01:25:49,330 --> 01:25:53,330
but an ordinary citizen.

1658
01:25:53,330 --> 01:25:55,600
Alastair Mactaggart was alarmed.

1659
01:25:55,600 --> 01:25:58,800
>> Voting is, for me,
the most alarming one.

1660
01:25:58,800 --> 01:26:00,570
If less than 100,000 votes
separated

1661
01:26:00,570 --> 01:26:03,330
the last two candidates in the
last presidential election,

1662
01:26:03,330 --> 01:26:06,900
in three states...

1663
01:26:06,900 --> 01:26:10,430
>> NARRATOR: He began a solitary
campaign.

1664
01:26:10,430 --> 01:26:12,100
>> We're talking about
convincing a relatively tiny

1665
01:26:12,100 --> 01:26:14,900
fraction of the voters
in a very...

1666
01:26:14,900 --> 01:26:17,700
in a handful of states
to either come out and vote

1667
01:26:17,700 --> 01:26:18,970
or stay home.

1668
01:26:18,970 --> 01:26:21,200
And remember, these companies
know everybody intimately.

1669
01:26:21,200 --> 01:26:24,470
They know who's a racist,
who's a misogynist,

1670
01:26:24,470 --> 01:26:26,770
who's a homophobe,
who's a conspiracy theorist.

1671
01:26:26,770 --> 01:26:28,770
They know the lazy people and
the gullible people.

1672
01:26:28,770 --> 01:26:31,130
They have access to the greatest
trove of personal information

1673
01:26:31,130 --> 01:26:32,670
that's ever been assembled.

1674
01:26:32,670 --> 01:26:35,370
They have the world's best data
scientists.

1675
01:26:35,370 --> 01:26:37,430
And they have essentially
a frictionless way

1676
01:26:37,430 --> 01:26:39,670
of communicating with you.

1677
01:26:39,670 --> 01:26:43,070
This is power.

1678
01:26:43,070 --> 01:26:44,800
>> NARRATOR: Mactaggart started
a signature drive

1679
01:26:44,800 --> 01:26:47,000
for a California ballot
initiative,

1680
01:26:47,000 --> 01:26:51,230
for a law to give consumers
control of their digital data.

1681
01:26:51,230 --> 01:26:54,670
In all, he would spend
$4 million of his own money

1682
01:26:54,670 --> 01:26:58,600
in an effort to rein in the
goliaths of Silicon Valley.

1683
01:26:58,600 --> 01:27:02,570
Google, Facebook, AT&T,
and Comcast

1684
01:27:02,570 --> 01:27:06,170
all opposed his initiative.

1685
01:27:06,170 --> 01:27:09,200
>> I'll tell you, I was scared.
Fear.

1686
01:27:09,200 --> 01:27:12,000
Fear of looking like
a world-class idiot.

1687
01:27:12,000 --> 01:27:14,930
The market cap of all the firms
arrayed against me were,

1688
01:27:14,930 --> 01:27:19,800
was over $6 trillion.

1689
01:27:19,800 --> 01:27:21,600
>> NARRATOR: He needed 500,000
signatures

1690
01:27:21,600 --> 01:27:25,070
to get his initiative
on the ballot.

1691
01:27:25,070 --> 01:27:27,370
He got well over 600,000.

1692
01:27:27,370 --> 01:27:33,530
Polls showed 80% approval
for a privacy law.

1693
01:27:33,530 --> 01:27:37,670
That made the politicians in
Sacramento pay attention.

1694
01:27:37,670 --> 01:27:40,030
So Mactaggart decided that
because he was holding

1695
01:27:40,030 --> 01:27:44,100
a strong hand, it was worth
negotiating with them.

1696
01:27:44,100 --> 01:27:46,530
>> And if AB-375 passes
by tomorrow

1697
01:27:46,530 --> 01:27:48,230
and is signed into law
by the governor,

1698
01:27:48,230 --> 01:27:49,770
we will withdraw the initiative.

1699
01:27:49,770 --> 01:27:51,270
Our deadline to do so is
tomorrow at 5:00.

1700
01:27:51,270 --> 01:27:53,470
>> NARRATOR:
At the very last moment,

1701
01:27:53,470 --> 01:27:55,600
a new law was rushed to the
floor of the state house.

1702
01:27:55,600 --> 01:27:57,470
>> Everyone take their seats,
please.

1703
01:27:57,470 --> 01:28:01,800
Mr. Secretary,
please call the roll.

1704
01:28:01,800 --> 01:28:05,900
>> The voting starts.
>> Alan, aye.

1705
01:28:05,900 --> 01:28:07,570
>> And the first guy,
I think, was a Republican,

1706
01:28:07,570 --> 01:28:08,870
and he voted for it.

1707
01:28:08,870 --> 01:28:10,600
And everybody had said the
Republicans won't vote for it

1708
01:28:10,600 --> 01:28:11,670
because it has this private
right of action,

1709
01:28:11,670 --> 01:28:13,830
where consumers can sue.

1710
01:28:13,830 --> 01:28:15,530
And the guy in the Senate,
he calls the name.

1711
01:28:15,530 --> 01:28:16,770
>> Aye, Roth.

1712
01:28:16,770 --> 01:28:17,830
Aye, Skinner.

1713
01:28:17,830 --> 01:28:19,000
Aye, Stern.

1714
01:28:19,000 --> 01:28:20,800
Aye, Stone.

1715
01:28:20,800 --> 01:28:23,600
>> You can see down below,
and everyone went green,

1716
01:28:23,600 --> 01:28:26,270
and then it passed unanimously.

1717
01:28:26,270 --> 01:28:29,770
>> Ayes, 36; noes, zero--
the measure passes.

1718
01:28:29,770 --> 01:28:32,200
Immediate transmittal to the...

1719
01:28:32,200 --> 01:28:34,530
>> So I was blown away.

1720
01:28:34,530 --> 01:28:36,500
It was, it was a day I will
never forget.

1721
01:28:41,770 --> 01:28:43,630
So in January, next year,
you as a California resident

1722
01:28:43,630 --> 01:28:45,900
will have the right to go to any
company and say,

1723
01:28:45,900 --> 01:28:47,270
"What have you collected on me
in the last 12 years...

1724
01:28:47,270 --> 01:28:48,700
12 months?

1725
01:28:48,700 --> 01:28:51,370
What of my personal information
do you have?"

1726
01:28:51,370 --> 01:28:52,300
So that's the first right.

1727
01:28:52,300 --> 01:28:54,200
It's right of... we call that
the right to know.

1728
01:28:54,200 --> 01:28:56,230
The second is the right
to say no.

1729
01:28:56,230 --> 01:28:59,030
And that's the right to go to
any company and click a button,

1730
01:28:59,030 --> 01:29:00,930
on any page where they're
collecting your information,

1731
01:29:00,930 --> 01:29:03,200
and say, "Do not sell
my information."

1732
01:29:03,200 --> 01:29:06,430
More importantly, we require
that they honor

1733
01:29:06,430 --> 01:29:09,370
what's called a third-party
opt-out.

1734
01:29:09,370 --> 01:29:11,130
You will click once
in your browser,

1735
01:29:11,130 --> 01:29:13,830
"Don't sell my information,"

1736
01:29:13,830 --> 01:29:18,230
and it will then send the signal
to every single website

1737
01:29:18,230 --> 01:29:21,400
that you visit: "Don't sell
this person's information."

1738
01:29:21,400 --> 01:29:22,970
And that's gonna have a huge
impact on the spread

1739
01:29:22,970 --> 01:29:25,530
of your information
across the internet.

1740
01:29:25,530 --> 01:29:27,900
>> NARRATOR: The tech companies
had been publicly cautious,

1741
01:29:27,900 --> 01:29:31,500
but privately alarmed
about regulation.

1742
01:29:31,500 --> 01:29:34,400
Then one tech giant came on
board in support

1743
01:29:34,400 --> 01:29:37,000
of Mactaggart's efforts.

1744
01:29:37,000 --> 01:29:39,670
>> I find the reaction among
other tech companies to,

1745
01:29:39,670 --> 01:29:42,730
at this point, be pretty much
all over the place.

1746
01:29:42,730 --> 01:29:45,970
Some people are saying,
"You're right to raise this.

1747
01:29:45,970 --> 01:29:47,430
These are good ideas."

1748
01:29:47,430 --> 01:29:49,100
Some people say, "We're not sure
these are good ideas,

1749
01:29:49,100 --> 01:29:50,870
but you're right to raise it,"

1750
01:29:50,870 --> 01:29:54,300
and some people are saying,
"We don't want regulation."

1751
01:29:54,300 --> 01:29:56,970
And so, you know, we have
conversations with people

1752
01:29:56,970 --> 01:30:00,030
where we point out that the auto
industry is better

1753
01:30:00,030 --> 01:30:03,130
because there are
safety standards.

1754
01:30:03,130 --> 01:30:05,430
Pharmaceuticals,
even food products,

1755
01:30:05,430 --> 01:30:08,200
all of these industries are
better because the public

1756
01:30:08,200 --> 01:30:11,000
has confidence in the products,

1757
01:30:11,000 --> 01:30:14,930
in part because of a mixture
of responsible companies

1758
01:30:14,930 --> 01:30:19,030
and responsible regulation.

1759
01:30:19,030 --> 01:30:21,400
>> NARRATOR: But the lobbyists
for big tech have been working

1760
01:30:21,400 --> 01:30:24,270
the corridors in Washington.

1761
01:30:24,270 --> 01:30:26,300
They're looking for
a more lenient

1762
01:30:26,300 --> 01:30:29,670
national privacy standard,
one that could perhaps override

1763
01:30:29,670 --> 01:30:33,300
the California law
and others like it.

1764
01:30:33,300 --> 01:30:34,570
But while hearings are held,

1765
01:30:34,570 --> 01:30:37,170
and anti-trust legislation
threatened,

1766
01:30:37,170 --> 01:30:40,530
the problem is that A.I.
has already spread so far

1767
01:30:40,530 --> 01:30:43,630
into our lives and work.

1768
01:30:43,630 --> 01:30:46,100
>> Well, it's in healthcare,
it's in education,

1769
01:30:46,100 --> 01:30:48,670
it's in criminal justice,
it's in the experience

1770
01:30:48,670 --> 01:30:51,230
of shopping as you walk down
the street.

1771
01:30:51,230 --> 01:30:54,700
It has pervaded so many elements
of everyday life,

1772
01:30:54,700 --> 01:30:57,370
and in a way that, in many
cases, is completely opaque

1773
01:30:57,370 --> 01:30:59,230
to people.

1774
01:30:59,230 --> 01:31:00,830
While we can see a phone and
look at it and we know that

1775
01:31:00,830 --> 01:31:02,970
there's some A.I. technology
behind it,

1776
01:31:02,970 --> 01:31:05,200
many of us don't know that when
we go for a job interview

1777
01:31:05,200 --> 01:31:07,030
and we sit down
and we have a conversation,

1778
01:31:07,030 --> 01:31:09,970
that we're being filmed, and
that our micro expressions

1779
01:31:09,970 --> 01:31:12,670
are being analyzed
by hiring companies.

1780
01:31:12,670 --> 01:31:14,700
Or that if you're in the
criminal justice system,

1781
01:31:14,700 --> 01:31:16,670
that there are risk assessment
algorithms

1782
01:31:16,670 --> 01:31:18,830
that are deciding
your "risk number,"

1783
01:31:18,830 --> 01:31:22,530
which could determine whether
or not you receive bail.

1784
01:31:22,530 --> 01:31:24,770
These are systems which, in many
cases, are hidden

1785
01:31:24,770 --> 01:31:28,400
in the back end of our sort
of social institutions.

1786
01:31:28,400 --> 01:31:29,400
And so, one of the big
challenges we have is,

1787
01:31:29,400 --> 01:31:31,970
how do we make that more
apparent?

1788
01:31:31,970 --> 01:31:32,830
How do we make it transparent?

1789
01:31:32,830 --> 01:31:36,700
And how do we make it
accountable?

1790
01:31:36,700 --> 01:31:39,830
>> For a very long time,
we have felt like as humans,

1791
01:31:39,830 --> 01:31:43,130
as Americans,
we have full agency

1792
01:31:43,130 --> 01:31:48,830
in determining our own futures--
what we read, what we see,

1793
01:31:48,830 --> 01:31:50,330
we're in charge.

1794
01:31:50,330 --> 01:31:53,070
What Cambridge Analytica taught
us,

1795
01:31:53,070 --> 01:31:55,770
and what Facebook continues
to teach us,

1796
01:31:55,770 --> 01:31:58,830
is that we don't have agency.

1797
01:31:58,830 --> 01:32:00,470
We're not in charge.

1798
01:32:00,470 --> 01:32:05,330
These are machines that are
automating some of our skills,

1799
01:32:05,330 --> 01:32:09,330
but have made decisions about
who...

1800
01:32:09,330 --> 01:32:12,630
Who we are.

1801
01:32:12,630 --> 01:32:16,300
And they're using that
information to tell others

1802
01:32:16,300 --> 01:32:19,570
the story of us.

1803
01:32:19,570 --> 01:32:22,130
♪ ♪

1804
01:32:32,970 --> 01:32:35,470
>> NARRATOR: In China,
in the age of A.I.,

1805
01:32:35,470 --> 01:32:38,130
there's no doubt
about who is in charge.

1806
01:32:38,130 --> 01:32:41,370
In an authoritarian state,
social stability

1807
01:32:41,370 --> 01:32:43,770
is the watchword
of the government.

1808
01:32:43,770 --> 01:32:47,930
(whistle blowing)

1809
01:32:47,930 --> 01:32:51,070
And artificial intelligence has
increased its ability to scan

1810
01:32:51,070 --> 01:32:54,430
the country for signs of unrest.

1811
01:32:54,430 --> 01:32:57,470
(whistle blowing)

1812
01:32:57,470 --> 01:33:00,430
It's been projected that over
600 million cameras

1813
01:33:00,430 --> 01:33:04,700
will be deployed by 2020.

1814
01:33:04,700 --> 01:33:07,730
Here, they may be used to
discourage jaywalking.

1815
01:33:07,730 --> 01:33:10,400
But they also serve to remind
people

1816
01:33:10,400 --> 01:33:14,830
that the state is watching.

1817
01:33:14,830 --> 01:33:18,030
>> And now, there is a project
called Sharp Eyes,

1818
01:33:18,030 --> 01:33:22,670
which is putting cameras
on every major street

1819
01:33:22,670 --> 01:33:29,730
and on the corner of every village
in China-- meaning everywhere.

1820
01:33:29,730 --> 01:33:33,530
Matching with the most advanced
artificial intelligence

1821
01:33:33,530 --> 01:33:36,830
algorithms, they can
actually use this data,

1822
01:33:36,830 --> 01:33:39,470
real-time data, to pick up
a face or pick up an action.

1823
01:33:39,470 --> 01:33:42,330
♪ ♪

1824
01:33:42,330 --> 01:33:44,370
>> NARRATOR: Frequent security
expos feature companies

1825
01:33:44,370 --> 01:33:48,530
like Megvii and its facial-
recognition technology.

1826
01:33:48,530 --> 01:33:51,970
They show off cameras with A.I.
that can track cars,

1827
01:33:51,970 --> 01:33:54,800
and identify individuals
by face,

1828
01:33:54,800 --> 01:33:58,270
or just by the way they walk.

1829
01:33:58,270 --> 01:34:02,130
>> The place is just filled with
these screens where you can see

1830
01:34:02,130 --> 01:34:04,530
the computers are actually
reading people's faces

1831
01:34:04,530 --> 01:34:07,630
and trying to digest that data,
and basically track

1832
01:34:07,630 --> 01:34:09,700
and identify who each person is.

1833
01:34:09,700 --> 01:34:11,600
And it's incredible to see so
many,

1834
01:34:11,600 --> 01:34:12,870
because just two
or three years ago,

1835
01:34:12,870 --> 01:34:14,700
we hardly saw
that kind of thing.

1836
01:34:14,700 --> 01:34:16,700
So, a big part of it is
government spending.

1837
01:34:16,700 --> 01:34:18,370
And so the technology's really
taken off,

1838
01:34:18,370 --> 01:34:21,530
and a lot of companies have
started to sort of glom onto

1839
01:34:21,530 --> 01:34:25,700
this idea that this
is the future.

1840
01:34:25,700 --> 01:34:29,470
>> China is on its way
to building

1841
01:34:29,470 --> 01:34:32,330
a total surveillance state.

1842
01:34:32,330 --> 01:34:33,830
>> NARRATOR: And this is the
test lab

1843
01:34:33,830 --> 01:34:36,370
for the surveillance state.

1844
01:34:36,370 --> 01:34:40,170
Here, in the far northwest of
China,

1845
01:34:40,170 --> 01:34:41,830
is the autonomous region
of Xinjiang.

1846
01:34:41,830 --> 01:34:45,170
Of the 25 million people
who live here,

1847
01:34:45,170 --> 01:34:48,170
almost half are a Muslim,
Turkic-speaking people

1848
01:34:48,170 --> 01:34:52,400
called the Uighurs.

1849
01:34:52,400 --> 01:34:53,900
(people shouting)

1850
01:34:53,900 --> 01:34:57,670
In 2009, tensions with local
Han Chinese led to protests

1851
01:34:57,670 --> 01:35:01,300
and then riots in the capital,
Urumqi.

1852
01:35:01,300 --> 01:35:04,200
(people shouting, guns firing)

1853
01:35:04,200 --> 01:35:05,670
(people shouting)

1854
01:35:08,200 --> 01:35:11,200
As the conflict has grown,
the authorities have brought in

1855
01:35:11,200 --> 01:35:13,670
more police,
and deployed extensive

1856
01:35:13,670 --> 01:35:17,530
surveillance technology.

1857
01:35:17,530 --> 01:35:20,700
That data feeds an A.I. system
that the government claims

1858
01:35:20,700 --> 01:35:24,300
can predict individuals prone
to "terrorism"

1859
01:35:24,300 --> 01:35:27,570
and detect those in need of
"re-education"

1860
01:35:27,570 --> 01:35:30,730
in scores of recently
built camps.

1861
01:35:30,730 --> 01:35:35,700
It is a campaign that has
alarmed human rights groups.

1862
01:35:35,700 --> 01:35:39,100
>> Chinese authorities are,
without any legal basis,

1863
01:35:39,100 --> 01:35:42,770
arbitrarily detaining up
to a million Turkic Muslims

1864
01:35:42,770 --> 01:35:44,800
simply on the basis
of their identity.

1865
01:35:44,800 --> 01:35:49,230
But even outside the facilities
in which these people

1866
01:35:49,230 --> 01:35:51,300
are being held, most of the
population there

1867
01:35:51,300 --> 01:35:53,470
is being subjected to
extraordinary levels

1868
01:35:53,470 --> 01:35:58,470
of high-tech surveillance such
that almost no aspect of life

1869
01:35:58,470 --> 01:36:01,100
anymore, you know, takes place
outside

1870
01:36:01,100 --> 01:36:02,600
the state's line of sight.

1871
01:36:02,600 --> 01:36:06,230
And so the kinds of behavior
that's now being monitored--

1872
01:36:06,230 --> 01:36:07,770
you know, which language do you
speak at home,

1873
01:36:07,770 --> 01:36:09,530
whether you're talking to your
relatives

1874
01:36:09,530 --> 01:36:13,330
in other countries,
how often you pray--

1875
01:36:13,330 --> 01:36:16,230
that information is now being
hoovered up

1876
01:36:16,230 --> 01:36:19,230
and used to decide whether
people should be subjected

1877
01:36:19,230 --> 01:36:21,800
to political re-education
in these camps.

1878
01:36:21,800 --> 01:36:24,570
>> NARRATOR: There have been
reports of torture

1879
01:36:24,570 --> 01:36:27,000
and deaths in the camps.

1880
01:36:27,000 --> 01:36:28,800
And for Uighurs on the outside,

1881
01:36:28,800 --> 01:36:31,670
Xinjiang has already been
described

1882
01:36:31,670 --> 01:36:34,770
as an "open-air prison."

1883
01:36:34,770 --> 01:36:36,930
>> Trying to have a normal life
as a Uighur

1884
01:36:36,930 --> 01:36:40,430
is impossible both inside
and outside of China.

1885
01:36:40,430 --> 01:36:43,530
Just imagine, while you're on
your way to work,

1886
01:36:43,530 --> 01:36:47,600
police subject you to
an I.D. scan,

1887
01:36:47,600 --> 01:36:51,770
forcing you to lift your chin,
while machines take your photo

1888
01:36:51,770 --> 01:36:54,970
and wait... you wait until you
find out if you can go.

1889
01:36:54,970 --> 01:36:59,100
Imagine police take your phone
and run a data scan,

1890
01:36:59,100 --> 01:37:02,500
and force you to install
compulsory software

1891
01:37:02,500 --> 01:37:07,630
allowing your phone calls and
messages to be monitored.

1892
01:37:07,630 --> 01:37:09,730
>> NARRATOR: Nury Turkel, a
lawyer and a prominent

1893
01:37:09,730 --> 01:37:14,470
Uighur activist, addresses a
demonstration in Washington, DC.

1894
01:37:14,470 --> 01:37:18,700
Many among the Uighur diaspora
have lost all contact

1895
01:37:18,700 --> 01:37:21,030
with their families back home.

1896
01:37:21,030 --> 01:37:26,100
Turkel warns that this dystopian
deployment of new technology

1897
01:37:26,100 --> 01:37:29,330
is a demonstration project
for authoritarian regimes

1898
01:37:29,330 --> 01:37:31,430
around the world.

1899
01:37:31,430 --> 01:37:35,430
>> They have bar codes on
somebody's home doors

1900
01:37:35,430 --> 01:37:39,730
to identify what kind of citizen
he is.

1901
01:37:39,730 --> 01:37:42,670
What we're talking about is a
collective punishment

1902
01:37:42,670 --> 01:37:45,100
of an ethnic group.

1903
01:37:45,100 --> 01:37:48,200
Not only that, the Chinese
government has been promoting

1904
01:37:48,200 --> 01:37:53,100
its methods, its technology,
it is...

1905
01:37:53,100 --> 01:37:58,630
to other countries, namely
Pakistan, Venezuela, Sudan,

1906
01:37:58,630 --> 01:38:04,400
and others to utilize, to
squelch political resentment

1907
01:38:04,400 --> 01:38:07,970
or prevent a political upheaval
in their various societies.

1908
01:38:07,970 --> 01:38:10,430
♪ ♪

1909
01:38:10,430 --> 01:38:13,500
>> NARRATOR: China has a grand
scheme to spread its technology

1910
01:38:13,500 --> 01:38:15,470
and influence around the world.

1911
01:38:15,470 --> 01:38:19,770
Launched in 2013, it started
along the old Silk Road

1912
01:38:19,770 --> 01:38:23,270
out of Xinjiang,
and now goes far beyond.

1913
01:38:23,270 --> 01:38:29,570
It's called "the Belt and Road
Initiative."

1914
01:38:29,570 --> 01:38:31,370
>> So effectively
what the Belt and Road

1915
01:38:31,370 --> 01:38:35,630
is is China's attempt to,
via spending and investment,

1916
01:38:35,630 --> 01:38:37,700
project its influence
all over the world.

1917
01:38:37,700 --> 01:38:39,830
And we've seen, you know,
massive infrastructure projects

1918
01:38:39,830 --> 01:38:43,300
going in in places like
Pakistan, in, in Venezuela,

1919
01:38:43,300 --> 01:38:45,330
in Ecuador, in Bolivia--

1920
01:38:45,330 --> 01:38:47,400
you know, all over the world,
Argentina,

1921
01:38:47,400 --> 01:38:49,630
in America's backyard,
in Africa.

1922
01:38:49,630 --> 01:38:51,530
Africa's been a huge place.

1923
01:38:51,530 --> 01:38:54,230
And what the Belt and Road
ultimately does is, it attempts

1924
01:38:54,230 --> 01:38:56,400
to kind of create a political
leverage

1925
01:38:56,400 --> 01:39:00,070
for the Chinese spending
campaign all over the globe.

1926
01:39:00,070 --> 01:39:03,600
>> NARRATOR: Like Xi Jinping's
2018 visit to Senegal,

1927
01:39:03,600 --> 01:39:06,700
where Chinese contractors had
just built a new stadium,

1928
01:39:06,700 --> 01:39:10,970
arranged loans for a new
infrastructure development,

1929
01:39:10,970 --> 01:39:13,270
and, said the Foreign Ministry,

1930
01:39:13,270 --> 01:39:16,370
there would be help
"maintaining social stability."

1931
01:39:16,370 --> 01:39:19,470
>> As China comes into these
countries and provides

1932
01:39:19,470 --> 01:39:21,770
these loans, what you end up
with is Chinese technology

1933
01:39:21,770 --> 01:39:24,430
being sold and built out by,
you know, by Chinese companies

1934
01:39:24,430 --> 01:39:26,370
in these countries.

1935
01:39:26,370 --> 01:39:27,500
We've started to see it already
in terms

1936
01:39:27,500 --> 01:39:29,170
of surveillance systems.

1937
01:39:29,170 --> 01:39:31,170
Not the kind of high-level A.I.
stuff yet, but, you know,

1938
01:39:31,170 --> 01:39:32,600
lower-level, camera-based,
you know,

1939
01:39:32,600 --> 01:39:36,670
manual sort of observation-type
things all over.

1940
01:39:36,670 --> 01:39:38,200
You know, you see it in
Cambodia, you see it in Ecuador,

1941
01:39:38,200 --> 01:39:39,770
you see it in Venezuela.

1942
01:39:39,770 --> 01:39:42,570
And what they do is, they sell
a dam, sell some other stuff,

1943
01:39:42,570 --> 01:39:44,100
and they say, "You know,
by the way, we can give you

1944
01:39:44,100 --> 01:39:46,730
these camera systems and,
for your emergency response.

1945
01:39:46,730 --> 01:39:49,000
And it'll cost you $300 million,

1946
01:39:49,000 --> 01:39:50,500
and we'll build a ton of
cameras,

1947
01:39:50,500 --> 01:39:52,800
and we'll build you a kind of,
you know, a main center

1948
01:39:52,800 --> 01:39:55,100
where you have police who can
watch these cameras."

1949
01:39:55,100 --> 01:39:57,870
And that's going in all over
the world already.

1950
01:39:57,870 --> 01:40:03,570
♪ ♪

1951
01:40:03,570 --> 01:40:06,600
>> There are 58 countries that
are starting to plug in

1952
01:40:06,600 --> 01:40:10,230
to China's vision of artificial
intelligence.

1953
01:40:10,230 --> 01:40:15,300
Which means effectively that
China is in the process

1954
01:40:15,300 --> 01:40:17,770
of raising a bamboo curtain.

1955
01:40:17,770 --> 01:40:20,770
One that does not need to...

1956
01:40:20,770 --> 01:40:24,130
One that is sort of
all-encompassing,

1957
01:40:24,130 --> 01:40:26,700
that has shared resources,

1958
01:40:26,700 --> 01:40:28,600
shared telecommunications
systems,

1959
01:40:28,600 --> 01:40:31,970
shared infrastructure,
shared digital systems--

1960
01:40:31,970 --> 01:40:35,400
even shared mobile-phone
technologies--

1961
01:40:35,400 --> 01:40:38,700
that is, that is quickly going
up all around the world

1962
01:40:38,700 --> 01:40:41,700
to the exclusion of us
in the West.

1963
01:40:41,700 --> 01:40:43,170
>> Well, one of the things
I worry about the most

1964
01:40:43,170 --> 01:40:45,130
is that the world
is gonna split in two,

1965
01:40:45,130 --> 01:40:47,230
and that there will be
a Chinese tech sector

1966
01:40:47,230 --> 01:40:48,970
and there will be an
American tech sector.

1967
01:40:48,970 --> 01:40:51,830
And countries will effectively
get to choose

1968
01:40:51,830 --> 01:40:53,170
which one they want.

1969
01:40:53,170 --> 01:40:55,770
It'll be kind of like the Cold
War, where you decide,

1970
01:40:55,770 --> 01:40:57,970
"Oh, are we gonna align
with the Soviet Union

1971
01:40:57,970 --> 01:40:59,570
or are we gonna align
with the United States?"

1972
01:40:59,570 --> 01:41:02,200
And the Third World gets to
choose this or that.

1973
01:41:02,200 --> 01:41:06,000
And that's not a world that's
good for anybody.

1974
01:41:06,000 --> 01:41:09,130
>> The markets in Asia and the
U.S. falling sharply

1975
01:41:09,130 --> 01:41:11,470
on news that a top Chinese
executive

1976
01:41:11,470 --> 01:41:13,100
has been arrested in Canada.

1977
01:41:13,100 --> 01:41:14,300
Her name is Sabrina Meng.

1978
01:41:14,300 --> 01:41:19,530
She is the CFO of the Chinese
telecom Huawei.

1979
01:41:19,530 --> 01:41:21,270
>> NARRATOR: News of the
dramatic arrest of an important

1980
01:41:21,270 --> 01:41:24,530
Huawei executive was ostensibly
about the company

1981
01:41:24,530 --> 01:41:26,430
doing business with Iran.

1982
01:41:26,430 --> 01:41:29,600
But it seemed to be more about
American distrust

1983
01:41:29,600 --> 01:41:32,600
of the company's technology.

1984
01:41:32,600 --> 01:41:33,900
From its headquarters
in southern China--

1985
01:41:33,900 --> 01:41:38,930
designed to look like fanciful
European capitals--

1986
01:41:38,930 --> 01:41:41,570
Huawei is the second-biggest
seller of smartphones,

1987
01:41:41,570 --> 01:41:45,630
and the world leader
in building 5G networks,

1988
01:41:45,630 --> 01:41:50,970
the high-speed backbone
for the age of A.I.

1989
01:41:50,970 --> 01:41:53,070
Huawei's C.E.O.,
a former officer

1990
01:41:53,070 --> 01:41:54,970
in the People's Liberation Army,

1991
01:41:54,970 --> 01:41:57,830
was defiant about
the American actions.

1992
01:41:57,830 --> 01:41:59,470
>> (speaking Mandarin)

1993
01:41:59,470 --> 01:42:02,600
(translated): There's no way
the U.S. can crush us.

1994
01:42:02,600 --> 01:42:08,500
The world needs Huawei because
we are more advanced.

1995
01:42:08,500 --> 01:42:12,900
If the lights go out in the
West, the East will still shine.

1996
01:42:12,900 --> 01:42:16,270
And if the North goes dark,
then there is still the South.

1997
01:42:16,270 --> 01:42:19,730
America doesn't represent
the world.

1998
01:42:19,730 --> 01:42:22,270
>> NARRATOR: The U.S. government
fears that as Huawei supplies

1999
01:42:22,270 --> 01:42:26,400
countries around the world
with 5G,

2000
01:42:26,400 --> 01:42:28,670
the Chinese government could
have back-door access

2001
01:42:28,670 --> 01:42:30,700
to their equipment.

2002
01:42:30,700 --> 01:42:34,400
Recently, the C.E.O. promised
complete transparency

2003
01:42:34,400 --> 01:42:36,700
into the company's software,

2004
01:42:36,700 --> 01:42:39,470
but U.S. authorities
are not convinced.

2005
01:42:39,470 --> 01:42:44,530
>> Nothing in China exists free
and clear of the party-state.

2006
01:42:44,530 --> 01:42:48,730
Those companies can only exist
and prosper

2007
01:42:48,730 --> 01:42:51,030
at the sufferance of the party.

2008
01:42:51,030 --> 01:42:55,030
And it's made very explicit that
when the party needs them,

2009
01:42:55,030 --> 01:42:58,900
they either have to respond
or they will be dethroned.

2010
01:42:58,900 --> 01:43:03,770
So this is the challenge with a
company like Huawei.

2011
01:43:03,770 --> 01:43:08,900
So Huawei, Ren Zhengfei, the
head of Huawei, he can say,

2012
01:43:08,900 --> 01:43:12,000
"Well, we... we're just a
private company and we just...

2013
01:43:12,000 --> 01:43:15,470
We don't take orders
from the Communist Party."

2014
01:43:15,470 --> 01:43:18,370
Well, maybe they haven't yet.

2015
01:43:18,370 --> 01:43:20,870
But what the Pentagon sees,

2016
01:43:20,870 --> 01:43:23,100
the National Intelligence
Council sees,

2017
01:43:23,100 --> 01:43:27,070
and what the FBI sees is,
"Well, maybe not yet."

2018
01:43:27,070 --> 01:43:30,200
But when the call comes,

2019
01:43:30,200 --> 01:43:35,430
everybody knows what the
company's response will be.

2020
01:43:35,430 --> 01:43:37,000
>> NARRATOR: The U.S. Commerce
Department

2021
01:43:37,000 --> 01:43:39,400
has recently blacklisted
eight companies

2022
01:43:39,400 --> 01:43:42,870
for doing business with
government agencies in Xinjiang,

2023
01:43:42,870 --> 01:43:45,370
claiming they are aiding
in the "repression"

2024
01:43:45,370 --> 01:43:49,300
of the Muslim minority.

2025
01:43:49,300 --> 01:43:52,270
Among the companies is Megvii.

2026
01:43:52,270 --> 01:43:55,170
They have strongly objected
to the blacklist,

2027
01:43:55,170 --> 01:43:57,630
saying that it's "a
misunderstanding of our company

2028
01:43:57,630 --> 01:44:01,500
and our technology."

2029
01:44:01,500 --> 01:44:04,430
♪ ♪

2030
01:44:04,430 --> 01:44:07,370
President Xi has increased his
authoritarian grip

2031
01:44:07,370 --> 01:44:11,070
on the country.

2032
01:44:11,070 --> 01:44:14,530
In 2018, he had the Chinese
constitution changed

2033
01:44:14,530 --> 01:44:20,070
so that he could be president
for life.

2034
01:44:20,070 --> 01:44:21,370
>> If you had asked me
20 years ago,

2035
01:44:21,370 --> 01:44:23,230
"What will happen to China?",
I would've said,

2036
01:44:23,230 --> 01:44:27,170
"Well, over time, the Great
Firewall will break down.

2037
01:44:27,170 --> 01:44:29,770
Of course, people will get
access to social media,

2038
01:44:29,770 --> 01:44:31,800
they'll get access to Google...

2039
01:44:31,800 --> 01:44:35,700
Eventually, it'll become a much
more democratic place,

2040
01:44:35,700 --> 01:44:38,370
with free expression
and lots of Western values."

2041
01:44:38,370 --> 01:44:41,870
And the last time I checked,
that has not happened.

2042
01:44:41,870 --> 01:44:46,600
In fact, technology's become
a tool of control.

2043
01:44:46,600 --> 01:44:48,570
And as China has gone through
this amazing period of growth

2044
01:44:48,570 --> 01:44:51,870
and wealth and openness in
certain ways,

2045
01:44:51,870 --> 01:44:53,430
there has not been the
democratic transformation

2046
01:44:53,430 --> 01:44:55,330
that I thought.

2047
01:44:55,330 --> 01:44:57,570
And it may turn out that,
in fact,

2048
01:44:57,570 --> 01:45:00,600
technology is a better tool for
authoritarian governments

2049
01:45:00,600 --> 01:45:02,570
than it is for democratic
governments.

2050
01:45:02,570 --> 01:45:04,900
>> NARRATOR: To dominate
the world in A.I.,

2051
01:45:04,900 --> 01:45:08,000
President Xi is depending on
Chinese tech

2052
01:45:08,000 --> 01:45:11,330
to lead the way.

2053
01:45:11,330 --> 01:45:13,030
While companies like
Baidu, Alibaba,

2054
01:45:13,030 --> 01:45:17,870
and Tencent are growing more
powerful and competitive,

2055
01:45:17,870 --> 01:45:20,500
they're also beginning to have
difficulty accessing

2056
01:45:20,500 --> 01:45:24,930
American technology, and are
racing to develop their own.

2057
01:45:27,600 --> 01:45:31,200
With a continuing trade war
and growing distrust,

2058
01:45:31,200 --> 01:45:33,630
the longtime argument for
engagement

2059
01:45:33,630 --> 01:45:38,230
between the two countries
has been losing ground.

2060
01:45:38,230 --> 01:45:42,100
>> I've seen more and more
of my colleagues move

2061
01:45:42,100 --> 01:45:44,270
from a position where they
thought,

2062
01:45:44,270 --> 01:45:47,330
"Well, if we just keep engaging
China,

2063
01:45:47,330 --> 01:45:50,800
the lines between
the two countries

2064
01:45:50,800 --> 01:45:52,800
will slowly converge."

2065
01:45:52,800 --> 01:45:56,570
You know, whether it's in
economics, technology, politics.

2066
01:45:56,570 --> 01:45:58,370
And to a position

2067
01:45:58,370 --> 01:46:01,230
where they now think
they're diverging.

2068
01:46:01,230 --> 01:46:05,000
So, in other words, the whole
idea of engagement

2069
01:46:05,000 --> 01:46:07,430
is coming under question.

2070
01:46:07,430 --> 01:46:15,600
And that's cast an entirely
different light on technology,

2071
01:46:15,600 --> 01:46:18,800
because if you're diverging and
you're heading into a world

2072
01:46:18,800 --> 01:46:23,900
of antagonism-- you know,
conflict, possibly,

2073
01:46:23,900 --> 01:46:25,930
then suddenly, technology is
something

2074
01:46:25,930 --> 01:46:27,930
that you don't want to share.

2075
01:46:27,930 --> 01:46:30,870
You want to sequester,

2076
01:46:30,870 --> 01:46:34,300
to protect your own national
interest.

2077
01:46:34,300 --> 01:46:38,130
And I think the tipping-point
moment we are at now,

2078
01:46:38,130 --> 01:46:41,130
which is what is casting
the whole question of things

2079
01:46:41,130 --> 01:46:45,130
like artificial intelligence
and technological innovation

2080
01:46:45,130 --> 01:46:47,470
into a completely different
framework,

2081
01:46:47,470 --> 01:46:51,700
is that if in fact China
and the U.S. are in some way

2082
01:46:51,700 --> 01:46:54,730
fundamentally antagonistic
to each other,

2083
01:46:54,730 --> 01:46:59,900
then we're in a completely
different world.

2084
01:46:59,900 --> 01:47:05,670
>> NARRATOR: In the age of A.I.,
a new reality is emerging.

2085
01:47:05,670 --> 01:47:07,600
That with so much accumulated
investment

2086
01:47:07,600 --> 01:47:11,800
and intellectual power, the
world is already dominated

2087
01:47:11,800 --> 01:47:16,200
by just two A.I. superpowers.

2088
01:47:16,200 --> 01:47:22,130
That's the premise of a new book
written by Kai-Fu Lee.

2089
01:47:22,130 --> 01:47:23,500
>> Hi, I'm Kai-Fu.

2090
01:47:23,500 --> 01:47:25,600
>> Hi, Dr. Lee, so
nice to meet you.

2091
01:47:25,600 --> 01:47:26,670
>> Really nice to meet you.

2092
01:47:26,670 --> 01:47:28,230
Look at all these dog ears.

2093
01:47:28,230 --> 01:47:30,000
I love, I love that.
>> You like that?

2094
01:47:30,000 --> 01:47:31,970
>> But I... but I don't like you
didn't buy the book,

2095
01:47:31,970 --> 01:47:33,470
you... you borrowed it.

2096
01:47:33,470 --> 01:47:35,570
>> I couldn't find it!
>> Oh, really?

2097
01:47:35,570 --> 01:47:36,900
>> Yeah!
>> And, and you...

2098
01:47:36,900 --> 01:47:39,130
you're coming to my talk?
>> Of course!

2099
01:47:39,130 --> 01:47:40,500
>> Oh, hi.
>> I did my homework,

2100
01:47:40,500 --> 01:47:41,600
I'm telling you.

2101
01:47:41,600 --> 01:47:42,600
>> Oh, my goodness, thank you.

2102
01:47:42,600 --> 01:47:44,670
Laurie, can you get this
gentleman a book?

2103
01:47:44,670 --> 01:47:46,430
(people talking in background)

2104
01:47:46,430 --> 01:47:47,730
>> NARRATOR: In his book
and in life,

2105
01:47:47,730 --> 01:47:50,830
the computer
scientist-cum-venture capitalist

2106
01:47:50,830 --> 01:47:52,230
walks a careful path.

2107
01:47:52,230 --> 01:47:56,370
Criticism of the Chinese
government is avoided,

2108
01:47:56,370 --> 01:47:58,700
while capitalist success
is celebrated.

2109
01:47:58,700 --> 01:48:00,800
>> I'm studying electrical
engineering.

2110
01:48:00,800 --> 01:48:03,230
>> Sure, send me a resume.
>> Okay, thanks.

2111
01:48:03,230 --> 01:48:06,230
>> NARRATOR: Now, with the rise
of the two superpowers,

2112
01:48:06,230 --> 01:48:09,400
he wants to warn the world
of what's coming.

2113
01:48:09,400 --> 01:48:11,600
>> Are you the new leaders?

2114
01:48:11,600 --> 01:48:14,230
>> If we're not the new leaders,
we're pretty close.

2115
01:48:14,230 --> 01:48:15,800
(laughs)

2116
01:48:15,800 --> 01:48:18,030
Thank you very much.
>> Thanks.

2117
01:48:18,030 --> 01:48:20,630
>> NARRATOR: "Never," he writes,
"has the potential

2118
01:48:20,630 --> 01:48:22,800
for human flourishing been
higher

2119
01:48:22,800 --> 01:48:26,230
or the stakes of failure
greater."

2120
01:48:26,230 --> 01:48:27,700
♪ ♪

2121
01:48:27,700 --> 01:48:32,070
>> So if one has to say who's
ahead, I would say today,

2122
01:48:32,070 --> 01:48:34,670
China is quickly catching up.

2123
01:48:34,670 --> 01:48:38,970
China actually began
its big push

2124
01:48:38,970 --> 01:48:42,370
in A.I. only two-and-a-half
years ago,

2125
01:48:42,370 --> 01:48:46,700
when the AlphaGo-Lee Sedol match
became the Sputnik moment.

2126
01:48:46,700 --> 01:48:49,870
>> NARRATOR: He says he believes
that the two A.I. superpowers

2127
01:48:49,870 --> 01:48:52,700
should lead the way and work
together

2128
01:48:52,700 --> 01:48:55,270
to make A.I. a force for good.

2129
01:48:55,270 --> 01:48:58,400
If we do, we may have a chance
of getting it right.

2130
01:48:58,400 --> 01:49:00,730
>> If we do a very good job
in the next 20 years,

2131
01:49:00,730 --> 01:49:04,100
A.I. will be viewed as an age of
enlightenment.

2132
01:49:04,100 --> 01:49:08,370
Our children and their children
will see A.I. as serendipity.

2133
01:49:08,370 --> 01:49:13,800
That A.I. is here to liberate us
from having to do routine jobs,

2134
01:49:13,800 --> 01:49:15,830
and push us to do what we love,

2135
01:49:15,830 --> 01:49:19,530
and push us to think what it
means to be human.

2136
01:49:19,530 --> 01:49:23,600
>> NARRATOR: But what if humans
mishandle this new power?

2137
01:49:23,600 --> 01:49:25,930
Kai-Fu Lee understands
the stakes.

2138
01:49:25,930 --> 01:49:28,270
After all, he invested early
in Megvii,

2139
01:49:28,270 --> 01:49:33,030
which is now on the U.S.
blacklist.

2140
01:49:33,030 --> 01:49:35,630
He says he's reduced his stake
and doesn't speak

2141
01:49:35,630 --> 01:49:38,070
for the company.

2142
01:49:38,070 --> 01:49:40,370
Asked about the government
using A.I.

2143
01:49:40,370 --> 01:49:44,570
for social control,
he chose his words carefully.

2144
01:49:44,570 --> 01:49:49,630
>> Um... A.I. is a technology
that can be used

2145
01:49:49,630 --> 01:49:52,230
for good and for evil.

2146
01:49:52,230 --> 01:50:00,700
So how... how do governments
limit themselves in,

2147
01:50:00,700 --> 01:50:04,670
on the one hand,
using this A.I. technology

2148
01:50:04,670 --> 01:50:07,970
and the database to maintain
a safe environment

2149
01:50:07,970 --> 01:50:11,400
for its citizens, but,
but not encroach

2150
01:50:11,400 --> 01:50:14,370
on an individual's rights
and privacy?

2151
01:50:14,370 --> 01:50:17,530
That, I think, is also a tricky
issue, I think,

2152
01:50:17,530 --> 01:50:19,200
for, for every country.

2153
01:50:19,200 --> 01:50:22,170
I think for... I think every
country will be tempted

2154
01:50:22,170 --> 01:50:26,030
to use A.I. probably
beyond the limits

2155
01:50:26,030 --> 01:50:29,930
within which you and I would
like the government to use it.

2156
01:50:35,370 --> 01:50:40,970
♪ ♪

2157
01:50:40,970 --> 01:50:43,030
>> NARRATOR: Emperor Yao devised
the game of Go

2158
01:50:43,030 --> 01:50:48,770
to teach his son discipline,
concentration, and balance.

2159
01:50:48,770 --> 01:50:52,630
Over 4,000 years later,
in the age of A.I.,

2160
01:50:52,630 --> 01:50:56,230
those words still resonate with
one of its architects.

2161
01:50:56,230 --> 01:50:58,330
♪ ♪

2162
01:50:58,330 --> 01:51:02,270
>> So A.I. can be used in many
ways that are very beneficial

2163
01:51:02,270 --> 01:51:03,700
for society.

2164
01:51:03,700 --> 01:51:08,230
But the current use of A.I.
isn't necessarily aligned

2165
01:51:08,230 --> 01:51:11,630
with the goals of building
a better society,

2166
01:51:11,630 --> 01:51:12,900
unfortunately.

2167
01:51:12,900 --> 01:51:16,570
But, but we could change that.

2168
01:51:16,570 --> 01:51:19,570
>> NARRATOR: In 2016, a game of
Go gave us a glimpse

2169
01:51:19,570 --> 01:51:24,970
of the future of artificial
intelligence.

2170
01:51:24,970 --> 01:51:27,500
Since then, it has become clear
that we will need

2171
01:51:27,500 --> 01:51:32,900
a careful strategy to harness
this new and awesome power.

2172
01:51:35,800 --> 01:51:38,600
>> I, I do think that democracy
is threatened by the progress

2173
01:51:38,600 --> 01:51:42,300
of these tools unless we improve
our social norms

2174
01:51:42,300 --> 01:51:46,430
and we increase
the collective wisdom

2175
01:51:46,430 --> 01:51:51,830
at the planet level to, to deal
with that increased power.

2176
01:51:51,830 --> 01:51:57,600
I'm hoping that my concerns are
unfounded,

2177
01:51:57,600 --> 01:51:59,900
but the stakes are so high

2178
01:51:59,900 --> 01:52:06,600
that I don't think we should
take these concerns lightly.

2179
01:52:06,600 --> 01:52:11,930
I don't think we can play with
those possibilities and just...

2180
01:52:11,930 --> 01:52:17,030
race ahead without thinking
about the potential outcomes.

2181
01:52:17,030 --> 01:52:20,870
♪ ♪

2182
01:52:27,330 --> 01:52:31,100
>> Go to pbs.org/frontline for
more of the impact

2183
01:52:31,100 --> 01:52:32,870
of A.I. on jobs.

2184
01:52:32,870 --> 01:52:37,730
>> I believe about fifty percent
of jobs will be somewhat

2185
01:52:37,730 --> 01:52:41,230
or extremely threatened by A.I.
in the next 15 years or so.

2186
01:52:41,230 --> 01:52:43,530
>> And a look at the potential
for racial bias

2187
01:52:43,530 --> 01:52:45,200
in this technology.

2188
01:52:45,200 --> 01:52:47,000
>> We've had issues with bias,
with discrimination,

2189
01:52:47,000 --> 01:52:48,670
with poor system design,
with errors.

2190
01:52:48,670 --> 01:52:51,500
>> Connect to the "Frontline"
community on Facebook

2191
01:52:51,500 --> 01:52:54,570
and Twitter, and watch anytime
on the PBS Video app

2192
01:52:54,570 --> 01:52:56,500
or pbs.org/frontline.

2193
01:52:58,130 --> 01:53:02,070
♪ ♪

2194
01:53:25,000 --> 01:53:26,800
>> For more on this and
other "Frontline" programs,

2195
01:53:26,800 --> 01:53:30,100
visit our website
at pbs.org/frontline.

2196
01:53:34,930 --> 01:53:37,400
♪ ♪

2197
01:53:40,300 --> 01:53:43,530
To order "Frontline's"
"In the Age of A.I." on DVD,

2198
01:53:43,530 --> 01:53:48,830
visit ShopPBS or call
1-800-PLAY-PBS.

2199
01:53:48,830 --> 01:53:52,570
This program is also available
on Amazon Prime Video.

2200
01:53:57,870 --> 01:54:01,170
♪ ♪


