1
00:00:18,630 --> 00:00:19,930
>> NARRATOR: Tonight--

2
00:00:19,930 --> 00:00:22,470
>> The race to become an A.I.
superpower is on...

3
00:00:22,470 --> 00:00:24,670
>> NARRATOR: The politics of
artificial intelligence...

4
00:00:24,670 --> 00:00:26,530
>> There will be
a Chinese tech sector

5
00:00:26,530 --> 00:00:28,200
and there will be
an American tech sector.

6
00:00:28,200 --> 00:00:29,730
>> NARRATOR: The new tech war.

7
00:00:29,730 --> 00:00:32,200
>> The more data,
the better the A.I. works.

8
00:00:32,200 --> 00:00:35,930
So in the age of A.I.,
where data is the new oil,

9
00:00:35,930 --> 00:00:38,170
China is the new Saudi Arabia.

10
00:00:38,170 --> 00:00:39,930
>> NARRATOR:
The future of work...

11
00:00:39,930 --> 00:00:42,100
>> When I increase productivity
through automation,

12
00:00:42,100 --> 00:00:44,070
jobs go away.

13
00:00:44,070 --> 00:00:48,100
>> I believe about 50% of jobs
will be somewhat

14
00:00:48,100 --> 00:00:52,000
or extremely threatened by A.I.
in the next 15 years or so.

15
00:00:52,000 --> 00:00:54,630
>> NARRATOR: A.I. and corporate
surveillance...

16
00:00:54,630 --> 00:00:57,500
>> We thought that we were
searching Google.

17
00:00:57,500 --> 00:00:59,930
We had no idea that Google
was searching us.

18
00:00:59,930 --> 00:01:02,170
>> NARRATOR: And the threat
to democracy.

19
00:01:02,170 --> 00:01:04,600
>> China is on its way
to building

20
00:01:04,600 --> 00:01:06,070
a total surveillance state.

21
00:01:06,070 --> 00:01:08,130
>> NARRATOR: Tonight on
"Frontline"...

22
00:01:08,130 --> 00:01:11,300
>> It has pervaded so many
elements of everyday life.

23
00:01:11,300 --> 00:01:13,930
How do we make it transparent
and accountable?

24
00:01:13,930 --> 00:01:15,770
>> NARRATOR:
..."In the Age of A.I."

25
00:01:18,530 --> 00:01:23,200
♪ ♪

26
00:01:36,900 --> 00:01:39,970
♪ ♪

27
00:01:44,700 --> 00:01:48,330
>> NARRATOR: This is the world's
most complex board game.

28
00:01:50,070 --> 00:01:53,530
There are more possible moves
in the game of Go

29
00:01:53,530 --> 00:01:57,800
than there are atoms
in the universe.

30
00:01:57,800 --> 00:02:03,700
Legend has it that in 2300 BCE,
Emperor Yao devised it

31
00:02:03,700 --> 00:02:10,000
to teach his son discipline,
concentration, and balance.

32
00:02:10,000 --> 00:02:14,570
And, over 4,000 years later,
this ancient Chinese game

33
00:02:14,570 --> 00:02:19,370
would signal the start
of a new industrial age.

34
00:02:19,370 --> 00:02:20,430
♪ ♪

35
00:02:27,930 --> 00:02:33,400
It was 2016, in Seoul,
South Korea.

36
00:02:33,400 --> 00:02:37,170
>> Can machines overtake
human intelligence?

37
00:02:37,170 --> 00:02:39,730
A breakthrough moment when the
world champion

38
00:02:39,730 --> 00:02:42,630
of the Asian board game Go
takes on an A.I. program

39
00:02:42,630 --> 00:02:44,530
developed by Google.

40
00:02:44,530 --> 00:02:51,430
>> (speaking Korean):

41
00:02:57,300 --> 00:02:58,770
>> In countries where
it's very popular,

42
00:02:58,770 --> 00:03:02,570
like China and Japan and,
and South Korea, to them,

43
00:03:02,570 --> 00:03:04,100
Go is not just a game, right?

44
00:03:04,100 --> 00:03:06,000
It's, like, how you learn
strategy.

45
00:03:06,000 --> 00:03:09,500
It has an almost spiritual
component.

46
00:03:09,500 --> 00:03:11,400
You know, if you talk
to South Koreans, right,

47
00:03:11,400 --> 00:03:13,500
and Lee Sedol is the world's
greatest Go player,

48
00:03:13,500 --> 00:03:15,730
he's a national hero
in South Korea.

49
00:03:15,730 --> 00:03:20,630
They were sure that Lee Sedol
would beat AlphaGo hands down.

51
00:03:20,630 --> 00:03:25,030
♪ ♪

52
00:03:25,030 --> 00:03:28,130
>> NARRATOR: Google's AlphaGo
was a computer program that,

53
00:03:28,130 --> 00:03:30,870
starting with the rules of Go

54
00:03:30,870 --> 00:03:33,330
and a database
of historical games,

55
00:03:33,330 --> 00:03:36,630
had been designed
to teach itself.

56
00:03:36,630 --> 00:03:40,700
>> I was one of the commentators
at the Lee Sedol games.

57
00:03:40,700 --> 00:03:44,700
And yes, it was watched by tens
of millions of people.

58
00:03:44,700 --> 00:03:46,300
(man speaking Korean)

59
00:03:46,300 --> 00:03:48,630
>> NARRATOR: Throughout
Southeast Asia,

60
00:03:48,630 --> 00:03:50,400
this was seen as
a sports spectacle

61
00:03:50,400 --> 00:03:51,800
with national pride at stake.

62
00:03:51,800 --> 00:03:53,030
>> Wow, that was a player guess.

63
00:03:53,030 --> 00:03:55,530
>> NARRATOR: But much more
was in play.

64
00:03:55,530 --> 00:03:57,730
This was the public unveiling

65
00:03:57,730 --> 00:03:59,830
of a form of artificial
intelligence

66
00:03:59,830 --> 00:04:02,400
called deep learning,

67
00:04:02,400 --> 00:04:05,400
that mimics the neural networks
of the human brain.

68
00:04:05,400 --> 00:04:07,430
>> So what happens with machine
learning,

69
00:04:07,430 --> 00:04:10,300
or artificial intelligence--
initially with AlphaGo--

70
00:04:10,300 --> 00:04:14,130
is that the machine is fed
all kinds of Go games,

71
00:04:14,130 --> 00:04:17,530
and then it studies them,
learns from them,

72
00:04:17,530 --> 00:04:19,830
and figures out its own moves.

73
00:04:19,830 --> 00:04:21,700
And because it's an A.I.
system--

74
00:04:21,700 --> 00:04:23,570
it's not just following
instructions,

75
00:04:23,570 --> 00:04:25,930
it's figuring out its own
instructions--

76
00:04:25,930 --> 00:04:28,930
it comes up with moves that
humans hadn't thought of before.

77
00:04:28,930 --> 00:04:33,130
So, it studies games that humans
have played, it knows the rules,

78
00:04:33,130 --> 00:04:38,130
and then it comes up
with creative moves.

79
00:04:38,130 --> 00:04:40,030
(woman speaking Korean)

80
00:04:41,600 --> 00:04:44,370
(speaking Korean):

81
00:04:44,370 --> 00:04:46,800
>> That's a very...
that's a very surprising move.

82
00:04:46,800 --> 00:04:49,670
>> I thought it was a mistake.

83
00:04:49,670 --> 00:04:53,470
>> NARRATOR: Game two, move 37.

84
00:04:53,470 --> 00:04:56,200
>> That move 37 was a move that
humans could not fathom,

85
00:04:56,200 --> 00:04:59,070
but yet it ended up being
brilliant

86
00:04:59,070 --> 00:05:02,470
and woke people up to say,

87
00:05:02,470 --> 00:05:05,100
"Wow, after thousands
of years of playing,

88
00:05:05,100 --> 00:05:08,330
we never thought about making
a move like that."

89
00:05:08,330 --> 00:05:10,370
>> Oh, he resigned.

90
00:05:10,370 --> 00:05:14,300
It looks like... Lee Sedol has
just resigned, actually.

91
00:05:14,300 --> 00:05:15,830
>> Yeah!
>> Yes.

92
00:05:15,830 --> 00:05:17,530
>> NARRATOR: In the end, the
scientists watched

93
00:05:17,530 --> 00:05:20,200
their algorithms win four
of the games.

94
00:05:20,200 --> 00:05:22,470
Lee Sedol took one.

95
00:05:22,470 --> 00:05:24,330
>> What happened with Go,
first and foremost,

96
00:05:24,330 --> 00:05:27,830
was a huge victory for DeepMind
and for A.I., right?

97
00:05:27,830 --> 00:05:30,170
It wasn't that the computers
beat the humans,

98
00:05:30,170 --> 00:05:33,970
it was that, you know, one type
of intelligence beat another.

99
00:05:33,970 --> 00:05:36,230
>> NARRATOR: Artificial
intelligence had proven

100
00:05:36,230 --> 00:05:38,770
it could marshal a vast amount
of data,

101
00:05:38,770 --> 00:05:42,300
beyond anything any human
could handle,

102
00:05:42,300 --> 00:05:46,400
and use it to teach itself how
to predict an outcome.

103
00:05:46,400 --> 00:05:50,400
The commercial implications
were enormous.

104
00:05:50,400 --> 00:05:53,670
>> While AlphaGo is a,
is a toy game,

105
00:05:53,670 --> 00:05:59,530
but its success and its waking
everyone up, I think,

106
00:05:59,530 --> 00:06:05,770
is, is going to be remembered
as the pivotal moment

107
00:06:05,770 --> 00:06:09,230
where A.I. became mature

108
00:06:09,230 --> 00:06:11,100
and everybody jumped
on the bandwagon.

109
00:06:11,100 --> 00:06:12,570
♪ ♪

110
00:06:12,570 --> 00:06:16,270
>> NARRATOR: This is about the
consequences of that defeat.

111
00:06:16,270 --> 00:06:18,270
(man speaking local language)

112
00:06:18,270 --> 00:06:21,870
How the A.I. algorithms are
ushering in a new age

114
00:06:21,870 --> 00:06:26,200
of great potential and
prosperity,

115
00:06:26,200 --> 00:06:31,170
but an age that will also deepen
inequality, challenge democracy,

116
00:06:31,170 --> 00:06:37,200
and divide the world
into two A.I. superpowers.

117
00:06:37,200 --> 00:06:41,130
Tonight, five stories about how
artificial intelligence

118
00:06:41,130 --> 00:06:42,930
is changing our world.

119
00:06:42,930 --> 00:06:45,930
♪ ♪

120
00:06:53,800 --> 00:06:58,330
China has decided to chase
the A.I. future.

121
00:06:58,330 --> 00:07:00,770
>> The difference between
the internet mindset

122
00:07:00,770 --> 00:07:02,830
and the A.I. mindset...

123
00:07:02,830 --> 00:07:06,700
>> NARRATOR: A future made and
embraced by a new generation.

124
00:07:09,000 --> 00:07:12,770
>> Well, it's hard not to feel
the kind of immense energy,

125
00:07:12,770 --> 00:07:17,570
and also the obvious fact
of the demographics.

126
00:07:17,570 --> 00:07:20,770
They're mostly very young
people,

127
00:07:20,770 --> 00:07:24,830
so that this clearly is
technology which is being

128
00:07:24,830 --> 00:07:28,030
generated by a whole new
generation.

129
00:07:28,030 --> 00:07:29,800
>> NARRATOR: Orville Schell
is one of

130
00:07:29,800 --> 00:07:32,100
America's foremost
China scholars.

131
00:07:32,100 --> 00:07:33,730
>> (speaking Mandarin)

132
00:07:33,730 --> 00:07:36,830
>> NARRATOR: He first came here
45 years ago.

133
00:07:36,830 --> 00:07:40,270
>> When I, when I first came
here, in 1975,

134
00:07:40,270 --> 00:07:42,770
Chairman Mao was still alive,

135
00:07:42,770 --> 00:07:45,300
the Cultural Revolution
was coming on,

136
00:07:45,300 --> 00:07:49,830
and there wasn't a single whiff
of anything

137
00:07:49,830 --> 00:07:51,170
of what you see here.

138
00:07:51,170 --> 00:07:52,770
It was unimaginable.

139
00:07:52,770 --> 00:07:56,500
In fact, in those years,
one very much thought,

140
00:07:56,500 --> 00:08:02,530
"This is the way China is, this
is the way it's going to be."

141
00:08:02,530 --> 00:08:04,570
And the fact that it has gone
through

142
00:08:04,570 --> 00:08:08,330
so many different changes since
is quite extraordinary.

143
00:08:08,330 --> 00:08:10,270
(man giving instructions)

144
00:08:10,270 --> 00:08:13,770
>> NARRATOR: This extraordinary
progress goes back

145
00:08:13,770 --> 00:08:16,370
to that game of Go.

146
00:08:16,370 --> 00:08:18,830
>> I think that the government
recognized

147
00:08:18,830 --> 00:08:20,300
that this was a sort of critical
thing for the future,

148
00:08:20,300 --> 00:08:22,270
and, "We need to catch up
in this," that, you know,

149
00:08:22,270 --> 00:08:24,900
"We cannot have a foreign
company showing us up

150
00:08:24,900 --> 00:08:26,300
at our own game.

151
00:08:26,300 --> 00:08:27,730
And this is going to be
something that is going to be

152
00:08:27,730 --> 00:08:29,100
critically important
in the future."

153
00:08:29,100 --> 00:08:31,230
So, you know, we called it the
Sputnik moment for,

154
00:08:31,230 --> 00:08:33,000
for the Chinese government--

155
00:08:33,000 --> 00:08:35,970
the Chinese government kind of
woke up.

156
00:08:35,970 --> 00:08:38,600
>> (translated): As we often say
in China,

157
00:08:38,600 --> 00:08:41,700
"The beginning is the most
difficult part."

158
00:08:41,700 --> 00:08:44,630
>> NARRATOR: In 2017, Xi Jinping
announced

159
00:08:44,630 --> 00:08:46,570
the government's bold new plans

160
00:08:46,570 --> 00:08:49,570
to an audience
of foreign diplomats.

161
00:08:49,570 --> 00:08:53,000
China would catch up with the
U.S. in artificial intelligence

162
00:08:53,000 --> 00:08:57,170
by 2025 and lead the world
by 2030.

163
00:08:57,170 --> 00:08:59,500
>> (translated): ...and
intensified cooperation

164
00:08:59,500 --> 00:09:02,270
in frontier areas such as
digital economy,

165
00:09:02,270 --> 00:09:04,930
artificial intelligence,
nanotechnology,

166
00:09:04,930 --> 00:09:07,230
and quantum computing.

167
00:09:07,230 --> 00:09:10,730
♪ ♪

168
00:09:13,530 --> 00:09:17,200
>> NARRATOR: Today, China leads
the world in e-commerce.

169
00:09:20,370 --> 00:09:24,070
Drones deliver to rural
villages.

170
00:09:24,070 --> 00:09:27,070
And a society that bypassed
credit cards

171
00:09:27,070 --> 00:09:30,030
now shops in stores
without cashiers,

172
00:09:30,030 --> 00:09:35,200
where the currency
is facial recognition.

173
00:09:35,200 --> 00:09:38,230
>> No country has ever moved
that fast.

174
00:09:38,230 --> 00:09:40,730
And in a short two-and-a-half
years,

175
00:09:40,730 --> 00:09:45,400
China's A.I. implementation
really went from minimal amount

176
00:09:45,400 --> 00:09:49,230
to probably about
17 or 18 unicorns,

177
00:09:49,230 --> 00:09:52,000
that is billion-dollar
companies, in A.I. today.

178
00:09:52,000 --> 00:09:57,130
And that, that progress is,
is hard to believe.

179
00:09:57,130 --> 00:09:59,830
>> NARRATOR: The progress was
powered by a new generation

180
00:09:59,830 --> 00:10:03,870
of ambitious young techs pouring
out of Chinese universities,

181
00:10:03,870 --> 00:10:07,570
competing with each other
for new ideas,

182
00:10:07,570 --> 00:10:13,630
and financed by a new cadre of
Chinese venture capitalists.

183
00:10:13,630 --> 00:10:15,600
This is Sinovation,

184
00:10:15,600 --> 00:10:19,100
created by U.S.-educated A.I.
scientist and businessman

185
00:10:19,100 --> 00:10:21,000
Kai-Fu Lee.

186
00:10:21,000 --> 00:10:26,170
>> These unicorns-- we've got
one, two, three, four, five,

187
00:10:26,170 --> 00:10:29,300
six, in the general A.I. area.

188
00:10:29,300 --> 00:10:31,630
And unicorn means a
billion-dollar company,

189
00:10:31,630 --> 00:10:35,870
a company whose valuation
or market capitalization

190
00:10:35,870 --> 00:10:38,870
is at $1 billion or higher.

191
00:10:38,870 --> 00:10:44,830
I think we put two unicorns
to show $5 billion or higher.

192
00:10:44,830 --> 00:10:47,300
>> NARRATOR: Kai-Fu Lee was born
in Taiwan.

193
00:10:47,300 --> 00:10:50,530
His parents sent him
to high school in Tennessee.

194
00:10:50,530 --> 00:10:53,270
His PhD thesis
at Carnegie Mellon

195
00:10:53,270 --> 00:10:55,900
was on computer speech
recognition,

196
00:10:55,900 --> 00:10:57,570
which took him to Apple.

197
00:10:57,570 --> 00:10:59,900
>> Well, reality is a step
closer to science fiction,

198
00:10:59,900 --> 00:11:02,770
with Apple Computer's
newly developed program...

199
00:11:02,770 --> 00:11:05,830
>> NARRATOR: And at 31,
an early measure of fame.

200
00:11:05,830 --> 00:11:08,100
>> Kai-Fu Lee,
the inventor of Apple's

201
00:11:08,100 --> 00:11:09,530
speech-recognition technology.

202
00:11:09,530 --> 00:11:12,400
>> Casper, copy this
to MacWrite II.

203
00:11:12,400 --> 00:11:14,730
Casper, paste.

204
00:11:14,730 --> 00:11:17,600
Casper, 72-point italic outline.

205
00:11:17,600 --> 00:11:20,970
>> NARRATOR: He would move on to
Microsoft Research in Asia

206
00:11:20,970 --> 00:11:23,430
and became the head
of Google China.

207
00:11:23,430 --> 00:11:28,530
Ten years ago, he started
Sinovation in Beijing,

208
00:11:28,530 --> 00:11:32,700
and began looking for promising
startups and A.I. talent.

209
00:11:32,700 --> 00:11:35,500
>> So, the Chinese
entrepreneurial companies

210
00:11:35,500 --> 00:11:37,500
started as copycats.

211
00:11:37,500 --> 00:11:41,570
But over the last 15 years,
China has developed its own form

212
00:11:41,570 --> 00:11:47,100
of entrepreneurship, and that
entrepreneurship is described

213
00:11:47,100 --> 00:11:52,130
as tenacious, very fast,
winner-take-all,

214
00:11:52,130 --> 00:11:54,930
and incredible work ethic.

215
00:11:54,930 --> 00:11:59,430
I would say these few thousand
Chinese top entrepreneurs,

216
00:11:59,430 --> 00:12:01,230
they could take on any
entrepreneur

217
00:12:01,230 --> 00:12:03,400
anywhere in the world.

218
00:12:03,400 --> 00:12:06,170
>> NARRATOR: Entrepreneurs like
Cao Xudong,

219
00:12:06,170 --> 00:12:12,100
the 33-year-old C.E.O. of
a new startup called Momenta.

220
00:12:12,100 --> 00:12:14,700
This is a ring road around
Beijing.

221
00:12:14,700 --> 00:12:17,470
The car is driving itself.

222
00:12:17,470 --> 00:12:20,730
♪ ♪

224
00:12:23,230 --> 00:12:26,470
>> You see, another cutting,
another cutting-in.

225
00:12:26,470 --> 00:12:28,670
>> Another cut-in, yeah, yeah.

226
00:12:28,670 --> 00:12:31,470
>> NARRATOR: Cao has no doubt
about the inevitability

227
00:12:31,470 --> 00:12:35,430
of autonomous vehicles.

228
00:12:35,430 --> 00:12:41,130
>> Just like AlphaGo can beat
the human player in, in Go,

229
00:12:41,130 --> 00:12:45,570
I think the machine will
definitely surpass

230
00:12:45,570 --> 00:12:49,370
the human driver, in the end.

231
00:12:49,370 --> 00:12:50,730
>> NARRATOR: Recently, there
have been cautions

232
00:12:50,730 --> 00:12:55,530
about how soon autonomous
vehicles will be deployed,

233
00:12:55,530 --> 00:12:57,700
but Cao and his team are
confident

234
00:12:57,700 --> 00:13:00,730
they're in for the long haul.

235
00:13:00,730 --> 00:13:03,030
>> U.S. will be the first
to deploy,

236
00:13:03,030 --> 00:13:05,830
but China may be the first
to popularize.

237
00:13:05,830 --> 00:13:07,270
It is 50-50 right now.

238
00:13:07,270 --> 00:13:09,000
U.S. is ahead in technology.

239
00:13:09,000 --> 00:13:12,030
China has a larger market,
and the Chinese government

240
00:13:12,030 --> 00:13:14,870
is helping with infrastructure
efforts--

241
00:13:14,870 --> 00:13:18,100
for example, building a new city
the size of Chicago

242
00:13:18,100 --> 00:13:20,670
with autonomous driving enabled,

243
00:13:20,670 --> 00:13:23,700
and also a new highway that has
sensors built in

244
00:13:23,700 --> 00:13:26,230
to help autonomous vehicles
be safer.

245
00:13:26,230 --> 00:13:29,470
>> NARRATOR: Their early
investors included

246
00:13:29,470 --> 00:13:31,470
Mercedes-Benz.

247
00:13:31,470 --> 00:13:35,430
>> I feel very lucky and very
inspiring

248
00:13:35,430 --> 00:13:40,000
and very exciting that we're
living in this era.

249
00:13:40,000 --> 00:13:42,800
♪ ♪

250
00:13:42,800 --> 00:13:44,630
>> NARRATOR: Life in China is
largely conducted

251
00:13:44,630 --> 00:13:47,000
on smartphones.

252
00:13:47,000 --> 00:13:50,270
A billion people use WeChat,
the equivalent of Facebook,

253
00:13:50,270 --> 00:13:53,030
Messenger, and PayPal,
and much more,

254
00:13:53,030 --> 00:13:56,200
combined into just one
super-app.

255
00:13:56,200 --> 00:13:57,870
And there are many more.

256
00:13:57,870 --> 00:14:02,070
>> China is the best place
for A.I. implementation today,

257
00:14:02,070 --> 00:14:06,230
because of the vast amount of data
that's available in China.

258
00:14:06,230 --> 00:14:09,670
China has a lot more users than
any other country,

259
00:14:09,670 --> 00:14:12,700
three to four times more than
the U.S.

260
00:14:12,700 --> 00:14:16,900
There are 50 times more mobile
payments than the U.S.

261
00:14:16,900 --> 00:14:19,300
There are ten times more food
deliveries,

262
00:14:19,300 --> 00:14:23,030
which serve as data to learn
more about user behavior

263
00:14:23,030 --> 00:14:24,870
than the U.S.

264
00:14:24,870 --> 00:14:28,570
300 times more shared bicycle
rides,

265
00:14:28,570 --> 00:14:32,400
and each shared bicycle ride
has all kinds of sensors

266
00:14:32,400 --> 00:14:34,730
submitting data up to the cloud.

267
00:14:34,730 --> 00:14:38,230
We're talking about maybe ten
times more data than the U.S.,

268
00:14:38,230 --> 00:14:43,230
and A.I. is basically run on
data and fueled by data.

269
00:14:43,230 --> 00:14:46,400
The more data, the better
the A.I. works,

270
00:14:46,400 --> 00:14:49,500
more importantly than how
brilliant the researcher is

271
00:14:49,500 --> 00:14:51,000
working on the problem.

272
00:14:51,000 --> 00:14:56,100
So, in the age of A.I.,
where data is the new oil,

273
00:14:56,100 --> 00:14:59,370
China is the new Saudi Arabia.

274
00:14:59,370 --> 00:15:01,570
>> NARRATOR: And access to all
that data

275
00:15:01,570 --> 00:15:04,870
means that the deep-learning
algorithm can quickly predict

276
00:15:04,870 --> 00:15:07,370
behavior, like the
creditworthiness of someone

277
00:15:07,370 --> 00:15:08,870
wanting a short-term loan.

278
00:15:08,870 --> 00:15:11,270
>> Here is our application.

279
00:15:11,270 --> 00:15:15,900
And customer can choose how many
money they want to borrow

280
00:15:15,900 --> 00:15:18,830
and how long they want
to borrow,

282
00:15:18,830 --> 00:15:23,130
and they can input
their datas here.

283
00:15:23,130 --> 00:15:29,530
And after, after that, you can
just borrow very quickly.

284
00:15:29,530 --> 00:15:33,030
>> NARRATOR: The C.E.O. shows us
how quickly you can get a loan.

285
00:15:33,030 --> 00:15:35,100
>> It is, it has done.

286
00:15:35,100 --> 00:15:37,430
>> NARRATOR: It takes an average
of eight seconds.

287
00:15:37,430 --> 00:15:40,400
>> It has passed to banks.
>> Wow.

288
00:15:40,400 --> 00:15:42,170
>> NARRATOR:
In the eight seconds,

289
00:15:42,170 --> 00:15:44,930
the algorithm has assessed
5,000 personal features

290
00:15:44,930 --> 00:15:46,630
from all your data.

291
00:15:46,630 --> 00:15:52,500
>> 5,000 features that is
related with the delinquency,

292
00:15:52,500 --> 00:15:59,800
when maybe the banks only use
few, maybe, maybe ten features

293
00:15:59,800 --> 00:16:04,170
when they are doing
their risk assessment.

294
00:16:04,170 --> 00:16:05,630
>> NARRATOR: Processing millions
of transactions,

295
00:16:05,630 --> 00:16:08,930
it'll dig up features that would
never be apparent

296
00:16:08,930 --> 00:16:13,870
to a human loan officer,
like how confidently you type

297
00:16:13,870 --> 00:16:17,600
your loan application,
or, surprisingly,

298
00:16:17,600 --> 00:16:20,670
if you keep your cell phone
battery charged.

299
00:16:20,670 --> 00:16:23,300
>> It's very interesting, the
battery of the phone

300
00:16:23,300 --> 00:16:26,170
is related with their
delinquency rate.

301
00:16:26,170 --> 00:16:28,530
Someone who has much more
lower battery,

302
00:16:28,530 --> 00:16:33,400
they get much more dangerous
than others.

303
00:16:33,400 --> 00:16:36,370
>> It's probably unfathomable
to an American

304
00:16:36,370 --> 00:16:41,600
how a country can dramatically
evolve itself

305
00:16:41,600 --> 00:16:45,470
from a copycat laggard to,
all of a sudden,

306
00:16:45,470 --> 00:16:50,030
to nearly as good as the U.S. in
technology.

307
00:16:50,030 --> 00:16:52,300
>> NARRATOR: Like this
facial-recognition startup

308
00:16:52,300 --> 00:16:53,800
he invested in.

309
00:16:53,800 --> 00:16:58,470
Megvii was started by three
young graduates in 2011.

310
00:16:58,470 --> 00:17:02,700
It's now a world leader in using
A.I. to identify people.

311
00:17:05,530 --> 00:17:07,000
>> It's pretty fast.

312
00:17:07,000 --> 00:17:09,530
For example,
on the mobile device,

313
00:17:09,530 --> 00:17:12,670
we have timed the
facial-recognition speed.

314
00:17:12,670 --> 00:17:15,830
It's actually less
than 100 milliseconds.

315
00:17:15,830 --> 00:17:17,830
So, that's very, very fast.

316
00:17:17,830 --> 00:17:21,970
So 0.1 second that we can, we
will be able to recognize you,

317
00:17:21,970 --> 00:17:26,200
even on a mobile device.

318
00:17:26,200 --> 00:17:28,300
>> NARRATOR: The company claims
the system is better

319
00:17:28,300 --> 00:17:32,170
than any human at identifying
people in its database.

320
00:17:32,170 --> 00:17:35,770
And for those who aren't,
it can describe them.

321
00:17:35,770 --> 00:17:38,530
Like our director--
what he's wearing,

322
00:17:38,530 --> 00:17:44,230
and a good guess at his age,
missing it by only a few months.

323
00:17:44,230 --> 00:17:48,970
>> We are the first one to
really take facial recognition

324
00:17:48,970 --> 00:17:52,570
to commercial quality.

325
00:17:52,570 --> 00:17:54,070
>> NARRATOR: That's why in
Beijing today,

326
00:17:54,070 --> 00:17:59,630
you can pay for your KFC
with a smile.

327
00:17:59,630 --> 00:18:01,070
>> You know, it's not so
surprising,

328
00:18:01,070 --> 00:18:03,230
we've seen Chinese companies
catching up to the U.S.

329
00:18:03,230 --> 00:18:04,630
in technology for a long time.

330
00:18:04,630 --> 00:18:07,200
And so, if particular effort
and attention is paid

331
00:18:07,200 --> 00:18:09,700
in a specific sector,
it's not so surprising

332
00:18:09,700 --> 00:18:11,600
that they would surpass
the rest of the world.

333
00:18:11,600 --> 00:18:14,000
And facial recognition is one of
the, really the first places

334
00:18:14,000 --> 00:18:17,300
we've seen that start to happen.

335
00:18:17,300 --> 00:18:20,230
>> NARRATOR: It's a technology
prized by the government,

337
00:18:20,230 --> 00:18:25,370
like this program in Shenzhen
to discourage jaywalking.

338
00:18:25,370 --> 00:18:29,400
Offenders are shamed in public--
and with facial recognition,

339
00:18:29,400 --> 00:18:33,330
can be instantly fined.

340
00:18:33,330 --> 00:18:36,570
Critics warn that the government
and some private companies

341
00:18:36,570 --> 00:18:39,170
have been building a national
database

342
00:18:39,170 --> 00:18:43,330
from dozens of experimental
social-credit programs.

343
00:18:43,330 --> 00:18:45,500
>> The government wants to
integrate

344
00:18:45,500 --> 00:18:50,670
all these individual behaviors,
or corporations' records,

345
00:18:50,670 --> 00:18:57,670
into some kind of metrics and
compute out a single number

346
00:18:57,670 --> 00:19:01,030
or set of number associated
with a individual,

347
00:19:01,030 --> 00:19:06,670
a citizen, and using that,
to implement a incentive

348
00:19:06,670 --> 00:19:08,130
or punishment system.

349
00:19:08,130 --> 00:19:09,400
>> NARRATOR: A high
social-credit number

350
00:19:09,400 --> 00:19:13,100
can be rewarded with discounts
on bus fares.

351
00:19:13,100 --> 00:19:17,800
A low number can lead
to a travel ban.

352
00:19:17,800 --> 00:19:20,430
Some say it's very popular
with a Chinese public

353
00:19:20,430 --> 00:19:23,500
that wants to punish
bad behavior.

354
00:19:23,500 --> 00:19:27,070
Others see a future that rewards
party loyalty

355
00:19:27,070 --> 00:19:30,570
and silences criticism.

356
00:19:30,570 --> 00:19:34,970
>> Right now, there is no final
system being implemented.

357
00:19:34,970 --> 00:19:43,070
And from those experiments, we
already see that the possibility

358
00:19:43,070 --> 00:19:46,400
of what this social-credit
system can do to individual.

359
00:19:46,400 --> 00:19:50,400
It's very powerful--
Orwellian-like--

360
00:19:50,400 --> 00:19:58,170
and it's extremely troublesome
in terms of civil liberty.

361
00:19:58,170 --> 00:20:00,270
>> NARRATOR: Every evening
in Shanghai,

362
00:20:00,270 --> 00:20:03,270
ever-present cameras record the
crowds

363
00:20:03,270 --> 00:20:05,430
as they surge down to the Bund,

364
00:20:05,430 --> 00:20:09,530
the promenade along the banks
of the Huangpu River.

365
00:20:09,530 --> 00:20:12,830
Once, the great trading houses of
Europe came here to do business

366
00:20:12,830 --> 00:20:14,570
with the Middle Kingdom.

367
00:20:14,570 --> 00:20:17,600
In the last century,
they were all shut down

368
00:20:17,600 --> 00:20:20,200
by Mao's revolution.

369
00:20:20,200 --> 00:20:22,530
But now, in the age of A.I.,

370
00:20:22,530 --> 00:20:24,670
people come here to take
in a spectacle

371
00:20:24,670 --> 00:20:28,100
that reflects China's
remarkable progress.

372
00:20:28,100 --> 00:20:30,630
(spectators gasp)

373
00:20:30,630 --> 00:20:34,070
And illuminates the great
political paradox of capitalism

374
00:20:34,070 --> 00:20:39,200
taking root
in the communist state.

375
00:20:39,200 --> 00:20:42,800
>> People have called it
market Leninism,

376
00:20:42,800 --> 00:20:45,570
authoritarian capitalism.

377
00:20:45,570 --> 00:20:48,970
We are watching a kind
of a Petri dish

378
00:20:48,970 --> 00:20:56,270
in which an experiment of, you
know, extraordinary importance

379
00:20:56,270 --> 00:20:57,970
to the world is
being carried out.

380
00:20:57,970 --> 00:21:01,030
Whether you can combine these
things

381
00:21:01,030 --> 00:21:04,170
and get something
that's more powerful,

382
00:21:04,170 --> 00:21:06,870
that's coherent,
that's durable in the world.

383
00:21:06,870 --> 00:21:09,600
Whether you can bring together
a one-party state

384
00:21:09,600 --> 00:21:14,370
with an innovative sector,
both economically

385
00:21:14,370 --> 00:21:16,400
and technologically innovative,

386
00:21:16,400 --> 00:21:22,700
and that's something we thought
could not coexist.

388
00:21:22,700 --> 00:21:25,070
>> NARRATOR:
As China reinvents itself,

389
00:21:25,070 --> 00:21:27,170
it has set its sights
on leading the world

390
00:21:27,170 --> 00:21:31,170
in artificial intelligence
by 2030.

391
00:21:31,170 --> 00:21:34,230
But that means taking on the
world's most innovative

392
00:21:34,230 --> 00:21:36,100
A.I. culture.

393
00:21:36,100 --> 00:21:39,600
♪ ♪

394
00:21:48,900 --> 00:21:51,930
On an interstate
in the U.S. Southwest,

395
00:21:51,930 --> 00:21:54,830
artificial intelligence is at
work solving the problem

396
00:21:54,830 --> 00:21:58,030
that's become emblematic
of the new age,

397
00:21:58,030 --> 00:22:00,800
replacing a human driver.

398
00:22:00,800 --> 00:22:06,370
♪ ♪

399
00:22:06,370 --> 00:22:10,730
This is the company's C.E.O.,
24-year-old Alex Rodrigues.

400
00:22:13,630 --> 00:22:15,800
>> The more things we build
successfully,

401
00:22:15,800 --> 00:22:17,970
the less people ask questions

402
00:22:17,970 --> 00:22:20,870
about how old you are when you
have working trucks.

403
00:22:20,870 --> 00:22:23,800
>> NARRATOR: And this is what
he's built.

404
00:22:23,800 --> 00:22:26,670
Commercial goods are being
driven from California

405
00:22:26,670 --> 00:22:31,170
to Arizona on Interstate 10.

406
00:22:31,170 --> 00:22:36,270
There is a driver in the cab,
but he's not driving.

407
00:22:36,270 --> 00:22:42,630
It's a path set by a C.E.O.
with an unusual CV.

408
00:22:42,630 --> 00:22:44,930
>> Are we ready, Henry?

409
00:22:44,930 --> 00:22:49,730
The aim is to score these pucks
into the scoring area.

410
00:22:49,730 --> 00:22:53,400
So I, I did competitive robotics
starting when I was 11,

411
00:22:53,400 --> 00:22:55,130
and I took it very, very
seriously.

412
00:22:55,130 --> 00:22:57,900
To, to give you a sense, I won
the Robotics World Championships

413
00:22:57,900 --> 00:22:59,830
for the first time
when I was 13.

414
00:22:59,830 --> 00:23:01,430
I've been to worlds seven times

415
00:23:01,430 --> 00:23:04,200
between the ages of 13
and 20-ish.

416
00:23:04,200 --> 00:23:06,330
I eventually founded a team,

417
00:23:06,330 --> 00:23:09,000
did a lot of work at a
very high competitive level.

418
00:23:09,000 --> 00:23:10,470
Things looking pretty good.

419
00:23:10,470 --> 00:23:12,930
>> NARRATOR: This was a
prototype of sorts,

420
00:23:12,930 --> 00:23:17,130
from which he has built his
multi-million-dollar company.

421
00:23:17,130 --> 00:23:20,100
>> I hadn't built a robot in a
while, wanted to get back to it,

422
00:23:20,100 --> 00:23:23,030
and felt that this was by far
the most exciting piece

423
00:23:23,030 --> 00:23:24,930
of robotics technology that was
up and coming.

424
00:23:24,930 --> 00:23:27,170
A lot of people told us we
wouldn't be able to build it.

425
00:23:27,170 --> 00:23:30,570
But knew roughly the techniques
that you would use.

426
00:23:30,570 --> 00:23:32,370
And I was pretty confident that
if you put them together,

427
00:23:32,370 --> 00:23:34,330
you would get something
that worked.

428
00:23:34,330 --> 00:23:37,900
Took the summer off, built in my
parents' garage a golf cart

429
00:23:37,900 --> 00:23:42,470
that could drive itself.

430
00:23:42,470 --> 00:23:44,430
>> NARRATOR: That golf cart
got the attention

431
00:23:44,430 --> 00:23:47,400
of Silicon Valley,
and the first of several rounds

432
00:23:47,400 --> 00:23:49,570
of venture capital.

433
00:23:49,570 --> 00:23:52,670
He formed a team and then
decided the business opportunity

434
00:23:52,670 --> 00:23:55,700
was in self-driving trucks.

435
00:23:55,700 --> 00:23:58,470
He says there's also
a human benefit.

436
00:23:58,470 --> 00:24:00,630
>> If we can build a truck
that's ten times safer

437
00:24:00,630 --> 00:24:04,770
than a human driver, then not
much else actually matters.

438
00:24:04,770 --> 00:24:07,770
When we talk to regulators,
especially,

439
00:24:07,770 --> 00:24:10,930
everyone agrees that the only
way that we're going to get

440
00:24:10,930 --> 00:24:13,770
to zero highway deaths,
which is everyone's objective,

441
00:24:13,770 --> 00:24:15,800
is to use self-driving.

442
00:24:15,800 --> 00:24:19,030
And so, I'm sure you've heard
the statistic,

444
00:24:19,030 --> 00:24:21,230
more than 90% of all crashes

445
00:24:21,230 --> 00:24:22,870
have a human driver
as the cause.

446
00:24:22,870 --> 00:24:26,230
So if you want to solve
traffic fatalities,

447
00:24:26,230 --> 00:24:30,170
which, in my opinion, are the
single biggest tragedy

448
00:24:30,170 --> 00:24:32,970
that happens year after year
in the United States,

449
00:24:32,970 --> 00:24:35,800
this is the only solution.

450
00:24:35,800 --> 00:24:38,230
>> NARRATOR:
It's an ambitious goal,

451
00:24:38,230 --> 00:24:40,430
but only possible because
of the recent breakthroughs

452
00:24:40,430 --> 00:24:42,170
in deep learning.

453
00:24:42,170 --> 00:24:44,300
>> Artificial intelligence is
one of those key pieces

454
00:24:44,300 --> 00:24:48,530
that has made it possible now
to do driverless vehicles

455
00:24:48,530 --> 00:24:51,070
where it wasn't possible
ten years ago,

456
00:24:51,070 --> 00:24:55,870
particularly in the ability
to see and understand scenes.

457
00:24:55,870 --> 00:24:59,130
A lot of people don't know this,
but it's remarkably hard

458
00:24:59,130 --> 00:25:00,870
for computers,
until very, very recently,

459
00:25:00,870 --> 00:25:04,800
to do even the most basic
visual tasks,

460
00:25:04,800 --> 00:25:06,530
like seeing a picture
of a person

461
00:25:06,530 --> 00:25:08,070
and knowing that it's a person.

462
00:25:08,070 --> 00:25:11,270
And we've made gigantic strides
with artificial intelligence

463
00:25:11,270 --> 00:25:13,530
in being able to see and
understanding tasks,

464
00:25:13,530 --> 00:25:16,000
and that's obviously fundamental
to being able to understand

465
00:25:16,000 --> 00:25:17,770
the world around you
with the sensors that,

466
00:25:17,770 --> 00:25:21,870
that you have available.

467
00:25:21,870 --> 00:25:23,270
>> NARRATOR: That's now possible

468
00:25:23,270 --> 00:25:25,970
because of the algorithms
written by Yoshua Bengio

469
00:25:25,970 --> 00:25:30,070
and a small group of scientists.

470
00:25:30,070 --> 00:25:32,630
>> There are many aspects
of the world

471
00:25:32,630 --> 00:25:36,200
which we can't explain
with words.

472
00:25:36,200 --> 00:25:38,400
And that part of our knowledge
is actually

473
00:25:38,400 --> 00:25:41,170
probably the majority of it.

474
00:25:41,170 --> 00:25:43,400
So, like, the stuff we can
communicate verbally

475
00:25:43,400 --> 00:25:45,370
is the tip of the iceberg.

476
00:25:45,370 --> 00:25:50,600
And so to get at the bottom of
the iceberg, the solution was,

477
00:25:50,600 --> 00:25:55,000
the computers have to acquire
that knowledge by themselves

478
00:25:55,000 --> 00:25:56,500
from data, from examples.

479
00:25:56,500 --> 00:26:00,400
Just like children learn,
most not from their teachers,

480
00:26:00,400 --> 00:26:03,370
but from interacting
with the world,

481
00:26:03,370 --> 00:26:05,500
and playing around, and, and
trying things

482
00:26:05,500 --> 00:26:07,570
and seeing what works
and what doesn't work.

483
00:26:07,570 --> 00:26:09,870
>> NARRATOR: This is an early
demonstration.

484
00:26:09,870 --> 00:26:14,470
In 2013, DeepMind scientists
set a machine-learning program

485
00:26:14,470 --> 00:26:18,070
on the Atari video game
Breakout.

486
00:26:18,070 --> 00:26:21,870
The computer was only told
the goal-- to win the game.

487
00:26:21,870 --> 00:26:26,230
After 100 games, it learned to
use the bat at the bottom

488
00:26:26,230 --> 00:26:29,770
to hit the ball and break
the bricks at the top.

489
00:26:29,770 --> 00:26:35,030
After 300, it could do that
better than a human player.

490
00:26:35,030 --> 00:26:39,970
After 500 games, it came up with
a creative way to win the game--

491
00:26:39,970 --> 00:26:42,730
by digging a tunnel on the side

492
00:26:42,730 --> 00:26:44,270
and sending the ball
around the top

493
00:26:44,270 --> 00:26:46,730
to break many bricks
with one hit.

494
00:26:46,730 --> 00:26:50,170
That was deep learning.

495
00:26:50,170 --> 00:26:52,630
>> That's the A.I. program based
on learning,

496
00:26:52,630 --> 00:26:54,430
really, that has been
so successful

497
00:26:54,430 --> 00:26:56,870
in the last few years and has...

498
00:26:56,870 --> 00:26:59,430
It wasn't clear ten years ago
that it would work,

499
00:26:59,430 --> 00:27:02,600
but it has completely changed
the map

500
00:27:02,600 --> 00:27:08,570
and is now used in almost
every sector of society.

501
00:27:08,570 --> 00:27:10,970
>> Even the best and brightest
among us,

502
00:27:10,970 --> 00:27:13,000
we just don't have enough
compute power

503
00:27:13,000 --> 00:27:15,530
inside of our heads.

504
00:27:15,530 --> 00:27:18,000
>> NARRATOR: Amy Webb is a
professor at N.Y.U.

505
00:27:18,000 --> 00:27:21,970
and founder of the Future Today
Institute.

506
00:27:18,000 --> 00:27:21,970
and founder of the Future Today
Institute.

507
00:27:21,970 --> 00:27:28,270
>> As A.I. progresses, the great
promise is that they...

508
00:27:28,270 --> 00:27:32,700
they, these, these machines,
alongside of us,

509
00:27:32,700 --> 00:27:36,330
are able to think and imagine
and see things

510
00:27:36,330 --> 00:27:38,730
in ways that we never have
before,

511
00:27:38,730 --> 00:27:42,370
which means that maybe we have
some kind of new,

512
00:27:42,370 --> 00:27:47,330
weird, seemingly implausible
solution to climate change.

513
00:27:47,330 --> 00:27:51,530
Maybe we have some radically
different approach

514
00:27:51,530 --> 00:27:54,930
to dealing with
incurable cancers.

515
00:27:54,930 --> 00:28:00,330
The real practical and wonderful
promise is that machines help us

516
00:28:00,330 --> 00:28:04,170
be more creative, and,
using that creativity,

517
00:28:04,170 --> 00:28:08,430
we get to terrific solutions.

518
00:28:08,430 --> 00:28:11,500
>> NARRATOR: Solutions that
could come unexpectedly

519
00:28:11,500 --> 00:28:13,770
to urgent problems.

520
00:28:13,770 --> 00:28:15,700
>> It's going to change
the face of breast cancer.

521
00:28:15,700 --> 00:28:18,870
Right now, 40,000 women
in the U.S. alone

522
00:28:18,870 --> 00:28:21,670
die from breast cancer
every single year.

523
00:28:21,670 --> 00:28:23,870
>> NARRATOR: Dr. Connie Lehman
is head

524
00:28:23,870 --> 00:28:25,230
of the breast imaging center

525
00:28:25,230 --> 00:28:28,470
at Massachusetts General
Hospital in Boston.

526
00:28:28,470 --> 00:28:30,930
>> We've become so complacent
about it,

527
00:28:30,930 --> 00:28:33,070
we almost don't think it can
really be changed.

528
00:28:33,070 --> 00:28:35,470
We, we somehow think we should
put all of our energy

529
00:28:35,470 --> 00:28:38,670
into chemotherapies
to save women

530
00:28:38,670 --> 00:28:40,370
with metastatic breast cancer,

531
00:28:40,370 --> 00:28:43,430
and yet, you know, when we find
it early, we cure it,

532
00:28:43,430 --> 00:28:46,730
and we cure it without having
the ravages to the body

533
00:28:46,730 --> 00:28:48,830
when we diagnose it late.

534
00:28:48,830 --> 00:28:53,700
This shows the progression of a
small, small spot from one year

535
00:28:53,700 --> 00:28:56,530
to the next,
and then to the diagnosis

536
00:28:56,530 --> 00:28:59,730
of the small cancer here.

537
00:28:59,730 --> 00:29:01,830
>> NARRATOR: This is what
happened when a woman

538
00:29:01,830 --> 00:29:04,100
who had been diagnosed
with breast cancer

539
00:29:04,100 --> 00:29:06,170
started to ask questions

540
00:29:06,170 --> 00:29:09,370
about why it couldn't have been
diagnosed earlier.

541
00:29:09,370 --> 00:29:12,170
>> It really brings a lot of
anxiety,

542
00:29:12,170 --> 00:29:14,270
and you're asking the questions,
you know,

543
00:29:14,270 --> 00:29:15,530
"Am I going to survive?

544
00:29:15,530 --> 00:29:17,200
What's going to happen
to my son?"

545
00:29:17,200 --> 00:29:21,230
And I start asking
other questions.

546
00:29:21,230 --> 00:29:23,700
>> NARRATOR: She was used to
asking questions.

547
00:29:23,700 --> 00:29:26,970
At M.I.T.'s
artificial-intelligence lab,

548
00:29:26,970 --> 00:29:29,970
Professor Regina Barzilay uses
deep learning

549
00:29:29,970 --> 00:29:33,100
to teach the computer to
understand language,

550
00:29:33,100 --> 00:29:36,270
as well as read text and data.

551
00:29:36,270 --> 00:29:39,600
>> I was really surprised
that the very basic question

552
00:29:39,600 --> 00:29:41,970
that I ask my physicians,

553
00:29:41,970 --> 00:29:45,330
which were really excellent
physicians here at MGH,

554
00:29:45,330 --> 00:29:49,030
they couldn't give me answers
that I was looking for.

555
00:29:49,030 --> 00:29:52,770
>> NARRATOR: She was convinced
that if you analyze enough data,

556
00:29:52,770 --> 00:29:55,530
from mammograms
to diagnostic notes,

557
00:29:55,530 --> 00:29:58,800
the computer could predict
early-stage conditions.

558
00:29:58,800 --> 00:30:04,830
>> If we fast-forward from 2012
to '13 to 2014,

559
00:30:04,830 --> 00:30:07,900
we then see when Regina
was diagnosed,

560
00:30:07,900 --> 00:30:12,170
because of this spot on her
mammogram.

561
00:30:12,170 --> 00:30:16,830
Is it possible, with more
elegant computer applications,

562
00:30:16,830 --> 00:30:21,100
that we might have identified
this spot the year before,

564
00:30:21,100 --> 00:30:23,200
or even back here?

565
00:30:23,200 --> 00:30:24,830
>> So, those are standard
prediction problems

566
00:30:24,830 --> 00:30:28,870
in machine learning-- there is
nothing special about them.

567
00:30:28,870 --> 00:30:31,900
And to my big surprise,
none of the technologies

568
00:30:31,900 --> 00:30:35,130
that we are developing
at M.I.T.,

569
00:30:35,130 --> 00:30:40,730
even in the most simple form,
doesn't penetrate the hospital.

570
00:30:40,730 --> 00:30:43,670
>> NARRATOR: Regina and Connie
began the slow process

571
00:30:43,670 --> 00:30:47,070
of getting access to thousands
of mammograms and records

572
00:30:47,070 --> 00:30:48,770
from MGH's breast-imaging
program.

573
00:30:51,470 --> 00:30:55,130
>> So, our first foray was just
to take all of the patients

574
00:30:55,130 --> 00:30:58,130
we had at MGH during
a period of time,

575
00:30:58,130 --> 00:31:00,530
who had had breast surgery
for a certain type

576
00:31:00,530 --> 00:31:02,430
of high-risk lesion.

577
00:31:02,430 --> 00:31:05,700
And we found that most of them
didn't really need the surgery.

578
00:31:05,700 --> 00:31:07,070
They didn't have cancer.

579
00:31:07,070 --> 00:31:09,670
But about ten percent
did have cancer.

580
00:31:09,670 --> 00:31:12,570
With Regina's techniques
in deep learning

581
00:31:12,570 --> 00:31:15,370
and machine learning, we were
able to predict the women

582
00:31:15,370 --> 00:31:17,600
that truly needed the surgery
and separate out

583
00:31:17,600 --> 00:31:21,470
those that really could avoid
the unnecessary surgery.

584
00:31:21,470 --> 00:31:25,030
>> What machine can do, it can
take hundreds of thousands

585
00:31:25,030 --> 00:31:27,730
of images where the outcome
is known

586
00:31:27,730 --> 00:31:32,700
and learn, based on how, you
know, pixels are distributed,

587
00:31:32,700 --> 00:31:37,170
what are the very unique
patterns that correlate highly

588
00:31:37,170 --> 00:31:40,370
with future occurrence
of the disease.

589
00:31:40,370 --> 00:31:42,900
So, instead of using human
capacity

590
00:31:42,900 --> 00:31:46,770
to kind of recognize pattern,
formalize pattern--

591
00:31:46,770 --> 00:31:50,700
which is inherently limited
by our cognitive capacity

592
00:31:50,700 --> 00:31:52,800
and how much we can see
and remember--

593
00:31:52,800 --> 00:31:55,700
we're providing machine with a
lot of data

594
00:31:55,700 --> 00:31:59,630
and make it learn
this prediction.

595
00:31:59,630 --> 00:32:04,370
>> So, we are using technology
not only to be better

596
00:32:04,370 --> 00:32:06,770
at assessing the breast density,

597
00:32:06,770 --> 00:32:09,200
but to get more to the point of
what we're trying to predict.

598
00:32:09,200 --> 00:32:12,930
"Does this woman have
a cancer now,

599
00:32:12,930 --> 00:32:15,170
and will she develop a cancer
in five years?"

600
00:32:15,170 --> 00:32:18,770
And that's, again, where
the artificial intelligence,

601
00:32:18,770 --> 00:32:20,700
machine and deep learning can
really help us

602
00:32:20,700 --> 00:32:22,770
and our patients.

603
00:32:22,770 --> 00:32:24,830
>> NARRATOR: In the age of A.I.,

604
00:32:24,830 --> 00:32:28,330
the algorithms are transporting
us into a universe

605
00:32:28,330 --> 00:32:31,970
of vast potential and
transforming almost every aspect

606
00:32:31,970 --> 00:32:36,200
of human endeavor and
experience.

607
00:32:36,200 --> 00:32:40,000
Andrew McAfee is a research
scientist at M.I.T.

608
00:32:40,000 --> 00:32:44,000
who co-authored
"The Second Machine Age."

609
00:32:44,000 --> 00:32:47,070
>> The great compliment that a
songwriter gives another one is,

610
00:32:47,070 --> 00:32:48,600
"Gosh, I wish I had written
that one."

611
00:32:48,600 --> 00:32:51,100
The great compliment a geek
gives another one is,

612
00:32:51,100 --> 00:32:52,900
"Wow, I wish I had drawn
that graph."

613
00:32:52,900 --> 00:32:55,630
So, I wish I had drawn
this graph.

614
00:32:55,630 --> 00:32:57,500
>> NARRATOR:
The graph uses a formula

615
00:32:57,500 --> 00:33:01,400
to show human development and
growth since 2000 BCE.

616
00:33:01,400 --> 00:33:03,570
>> The state of human
civilization

617
00:33:03,570 --> 00:33:06,970
is not very advanced, and it's
not getting better

618
00:33:06,970 --> 00:33:09,130
very quickly at all,
and this is true for thousands

619
00:33:09,130 --> 00:33:10,970
and thousands of years.

620
00:33:10,970 --> 00:33:14,470
When we, when we formed empires
and empires got overturned,

621
00:33:14,470 --> 00:33:18,530
when we tried democracy,
when we invented zero

622
00:33:18,530 --> 00:33:21,630
and mathematics and fundamental
discoveries about the universe,

624
00:33:21,630 --> 00:33:23,400
big deal.

625
00:33:23,400 --> 00:33:25,300
It just, the numbers don't
change very much.

626
00:33:25,300 --> 00:33:28,900
What's weird is that the numbers
change essentially in the blink

627
00:33:28,900 --> 00:33:30,370
of an eye at one point in time.

628
00:33:30,370 --> 00:33:34,030
And it goes from really
horizontal, unchanging,

629
00:33:34,030 --> 00:33:38,600
uninteresting, to, holy Toledo,
crazy vertical.

630
00:33:38,600 --> 00:33:41,200
And then the question is,
what on Earth happened

631
00:33:41,200 --> 00:33:42,570
to cause that change?

632
00:33:42,570 --> 00:33:44,770
And the answer
is the Industrial Revolution.

633
00:33:44,770 --> 00:33:46,800
There were other things that
happened,

634
00:33:46,800 --> 00:33:48,830
but really what fundamentally
happened is

635
00:33:48,830 --> 00:33:51,530
we overcame the limitations
of our muscle power.

636
00:33:51,530 --> 00:33:54,400
Something equally interesting is
happening right now.

637
00:33:54,400 --> 00:33:57,330
We are overcoming the
limitations of our minds.

638
00:33:57,330 --> 00:33:58,930
We're not getting rid of them,

639
00:33:58,930 --> 00:34:00,970
we're not making them
unnecessary,

640
00:34:00,970 --> 00:34:04,500
but, holy cow, can we leverage
them and amplify them now.

641
00:34:04,500 --> 00:34:06,170
You have to be a huge pessimist

642
00:34:06,170 --> 00:34:08,730
not to find that profoundly
good news.

643
00:34:08,730 --> 00:34:11,370
>> I really do think the world
has entered a new era.

644
00:34:11,370 --> 00:34:14,830
Artificial intelligence holds so
much promise,

645
00:34:14,830 --> 00:34:17,730
but it's going to reshape every
aspect of the economy,

646
00:34:17,730 --> 00:34:19,370
so many aspects of our lives.

647
00:34:19,370 --> 00:34:22,770
Because A.I. is a little bit
like electricity.

648
00:34:22,770 --> 00:34:24,670
Everybody's going to use it.

649
00:34:24,670 --> 00:34:28,400
Every company is going to be
incorporating A.I.,

650
00:34:28,400 --> 00:34:30,300
integrating it into
what they do,

651
00:34:30,300 --> 00:34:31,630
governments are going to be
using it,

652
00:34:31,630 --> 00:34:35,600
nonprofit organizations are
going to be using it.

653
00:34:35,600 --> 00:34:39,200
It's going to create all kinds
of benefits

654
00:34:39,200 --> 00:34:43,070
in ways large and small,
and challenges for us, as well.

655
00:34:43,070 --> 00:34:46,730
>> NARRATOR: The challenges,
the benefits--

656
00:34:46,730 --> 00:34:49,000
the autonomous truck
represents both

657
00:34:49,000 --> 00:34:52,070
as it maneuvers
into the marketplace.

658
00:34:52,070 --> 00:34:55,070
The engineers are confident
that, in spite of questions

659
00:34:55,070 --> 00:34:57,370
about when this will happen,

660
00:34:57,370 --> 00:34:59,330
they can get it working safely
sooner

661
00:34:59,330 --> 00:35:00,770
than most people realize.

662
00:35:00,770 --> 00:35:04,130
>> I think that you will see the
first vehicles operating

663
00:35:04,130 --> 00:35:07,570
with no one inside them moving
freight in the next few years,

664
00:35:07,570 --> 00:35:09,700
and then you're going to see
that expanding to more freight,

665
00:35:09,700 --> 00:35:13,030
more geographies,
more weather over time as,

666
00:35:13,030 --> 00:35:14,530
as that capability builds up.

667
00:35:14,530 --> 00:35:18,600
We're talking, like,
less than half a decade.

668
00:35:18,600 --> 00:35:21,370
>> NARRATOR: He already has a
Fortune 500 company

669
00:35:21,370 --> 00:35:25,830
as a client, shipping appliances
across the Southwest.

670
00:35:25,830 --> 00:35:29,330
He says the sales pitch
is straightforward.

671
00:35:29,330 --> 00:35:32,070
>> They spend hundreds of
millions of dollars a year

672
00:35:32,070 --> 00:35:33,670
shipping parts around
the country.

673
00:35:33,670 --> 00:35:36,100
We can bring that cost in half.

674
00:35:36,100 --> 00:35:38,930
And they're really excited to be
able to start working with us,

675
00:35:38,930 --> 00:35:41,800
both because of the potential,

676
00:35:41,800 --> 00:35:44,100
the potential savings from
deploying self-driving,

677
00:35:44,100 --> 00:35:46,470
and also because of all the
operational efficiencies

678
00:35:46,470 --> 00:35:49,830
that they see, the biggest one
being able to operate

679
00:35:49,830 --> 00:35:51,800
24 hours a day.

680
00:35:51,800 --> 00:35:53,970
So, right now, human drivers are
limited to 11 hours

681
00:35:53,970 --> 00:35:57,470
by federal law,
and a driverless truck

682
00:35:57,470 --> 00:35:59,000
obviously wouldn't have
that limitation.

683
00:35:59,000 --> 00:36:04,530
♪ ♪

684
00:36:04,530 --> 00:36:07,330
>> NARRATOR: The idea of a
driverless truck comes up often

685
00:36:07,330 --> 00:36:13,430
in discussions about artificial
intelligence.

686
00:36:13,430 --> 00:36:16,800
Steve Viscelli is a sociologist
who drove a truck

687
00:36:16,800 --> 00:36:22,330
while researching his book "The
Big Rig" about the industry.

689
00:36:22,330 --> 00:36:25,000
>> This is one of the most
remarkable stories

690
00:36:25,000 --> 00:36:27,830
in, in U.S. labor history,
I think,

691
00:36:27,830 --> 00:36:32,400
is, you know, the decline of,
of unionized trucking.

692
00:36:32,400 --> 00:36:35,600
The industry was deregulated
in 1980,

693
00:36:35,600 --> 00:36:39,400
and at that time, you know,
truck drivers were earning

694
00:36:39,400 --> 00:36:43,400
the equivalent of over
$100,000 in today's dollars.

695
00:36:43,400 --> 00:36:47,500
And today the typical truck
driver will earn

696
00:36:47,500 --> 00:36:52,370
a little over $40,000 a year.

697
00:36:52,370 --> 00:36:54,630
And I think it's
an important part

698
00:36:54,630 --> 00:36:56,230
of the automation story, right?

699
00:36:56,230 --> 00:36:58,900
Why are they so afraid of
automation?

700
00:36:58,900 --> 00:37:02,670
Because we've had four decades
of rising inequality in wages.

701
00:37:02,670 --> 00:37:05,330
And if anybody is going to take
it on the chin

702
00:37:05,330 --> 00:37:07,330
from automation
in the trucking industry,

703
00:37:07,330 --> 00:37:09,630
the, the first in line is going
to be the driver,

704
00:37:09,630 --> 00:37:14,300
without a doubt.

705
00:37:14,300 --> 00:37:16,730
>> NARRATOR: For his research,
Viscelli tracked down truckers

706
00:37:16,730 --> 00:37:19,600
and their families,
like Shawn and Hope Cumbee

707
00:37:19,600 --> 00:37:21,530
of Beaverton, Michigan.
>> Hi.

708
00:37:21,530 --> 00:37:22,870
>> Hey, Hope,
I'm Steve Viscelli.

709
00:37:22,870 --> 00:37:23,870
>> Hi, Steve, nice to meet you.
Come on in.

710
00:37:23,870 --> 00:37:26,800
>> Great to meet you, too,
thanks.

711
00:37:26,800 --> 00:37:28,430
>> NARRATOR: And their son
Charlie.

712
00:37:28,430 --> 00:37:33,730
>> This is Daddy, me,
Daddy, and Mommy.

713
00:37:33,730 --> 00:37:36,230
>> NARRATOR: But Daddy's not
here.

714
00:37:36,230 --> 00:37:40,900
Shawn Cumbee's truck has broken
down in Tennessee.

715
00:37:40,900 --> 00:37:45,470
Hope, who drove a truck herself,
knows the business well.

716
00:37:45,470 --> 00:37:48,870
>> We made $150,000, right,
in a year.

717
00:37:48,870 --> 00:37:50,070
That sounds great, right?

718
00:37:50,070 --> 00:37:52,400
That's, like, good money.

719
00:37:52,400 --> 00:37:55,870
We paid $100,000 in fuel, okay?

720
00:37:55,870 --> 00:37:59,030
So, right there,
now I made $50,000.

721
00:37:59,030 --> 00:38:01,030
But I didn't really, because,
you know,

722
00:38:01,030 --> 00:38:02,600
you get an oil change every
month,

723
00:38:02,600 --> 00:38:04,200
so that's $300 a month.

724
00:38:04,200 --> 00:38:06,170
You still have to do
all the maintenance.

725
00:38:06,170 --> 00:38:08,500
We had a motor blow out, right?

726
00:38:08,500 --> 00:38:11,170
$13,000. Right?

727
00:38:11,170 --> 00:38:13,800
I know, I mean, I choke up a
little just thinking about it,

728
00:38:13,800 --> 00:38:15,770
because it was...

729
00:38:15,770 --> 00:38:19,470
And it was 13,000, and we were
off work for two weeks.

730
00:38:19,470 --> 00:38:21,670
So, by the end of the year,
with that $150,000,

731
00:38:21,670 --> 00:38:24,670
by the end of the year,
we'd made about 20...

732
00:38:24,670 --> 00:38:28,030
About $22,000.

733
00:38:28,030 --> 00:38:30,400
>> NARRATOR: In a truck stop
in Tennessee,

734
00:38:30,400 --> 00:38:33,500
Shawn has been sidelined
waiting for a new part.

735
00:38:33,500 --> 00:38:37,300
The garage owner is letting him
stay in the truck to save money.

736
00:38:39,870 --> 00:38:41,770
>> Hi, baby.

737
00:38:41,770 --> 00:38:43,330
>> (on phone): Hey, how's it
going?

738
00:38:43,330 --> 00:38:44,730
>> It's going.
Chunky-butt!

739
00:38:44,730 --> 00:38:46,600
>> Hi, Daddy!
>> Hi, Chunky-butt.

740
00:38:46,600 --> 00:38:49,300
What're you doing?
>> (talking inaudibly)

741
00:38:49,300 --> 00:38:51,600
>> Believe it or not,
I do it because I love it.

742
00:38:51,600 --> 00:38:53,330
I mean, you know,
it's in the blood.

743
00:38:53,330 --> 00:38:54,900
Third-generation driver.

744
00:38:54,900 --> 00:38:57,230
And my granddaddy told me a long
time ago,

745
00:38:57,230 --> 00:39:00,630
when I was probably
11, 12 years old, probably,

746
00:39:00,630 --> 00:39:03,500
he said, "The world meets nobody
halfway.

747
00:39:03,500 --> 00:39:04,930
Nobody."

748
00:39:04,930 --> 00:39:09,030
He said, "If you want it,
you have to earn it."

749
00:39:09,030 --> 00:39:11,870
And that's what I do every day.

750
00:39:11,870 --> 00:39:13,330
I live by that creed.

751
00:39:13,330 --> 00:39:18,100
And I've lived by that
since it was told to me.

752
00:39:18,100 --> 00:39:20,300
>> So, if you're down for a week
in a truck,

754
00:39:20,300 --> 00:39:21,870
you still have to pay your
bills.

755
00:39:21,870 --> 00:39:24,100
I have enough money in my
checking account at all times

756
00:39:24,100 --> 00:39:25,470
to pay a month's worth of bills.

757
00:39:25,470 --> 00:39:27,070
That does not include my food.

758
00:39:27,070 --> 00:39:29,630
That doesn't include field trips
for my son's school.

759
00:39:29,630 --> 00:39:33,700
My son and I just went to our
yearly doctor appointment.

760
00:39:33,700 --> 00:39:38,270
I took, I took money out of my
son's piggy bank to pay for it,

761
00:39:38,270 --> 00:39:42,600
because it's not...
it's not scheduled in.

762
00:39:42,600 --> 00:39:45,430
It's, it's not something that
you can, you know, afford.

763
00:39:45,430 --> 00:39:47,500
I mean, like, when...

764
00:39:47,500 --> 00:39:48,900
(sighs): Sorry.

765
00:39:48,900 --> 00:39:50,970
>> It's okay.

766
00:39:50,970 --> 00:39:54,600
♪ ♪

767
00:39:59,230 --> 00:40:01,170
Have you guys ever talked about
self-driving trucks?

768
00:40:01,170 --> 00:40:02,500
Is he...

769
00:40:02,500 --> 00:40:05,130
>> (laughing): So, kind of.

770
00:40:05,130 --> 00:40:07,830
Um, I asked him once, you know.

771
00:40:07,830 --> 00:40:09,230
And he laughed so hard.

772
00:40:09,230 --> 00:40:12,330
He said, "No way will they
ever have a truck

773
00:40:12,330 --> 00:40:14,970
that can drive itself."

774
00:40:14,970 --> 00:40:17,230
>> It's kind of interesting when
you think about it, you know,

775
00:40:17,230 --> 00:40:19,730
they're putting all this new
technology into things,

776
00:40:19,730 --> 00:40:21,570
but, you know,
it's still man-made.

777
00:40:21,570 --> 00:40:24,970
And man, you know,
does make mistakes.

778
00:40:24,970 --> 00:40:28,170
I really don't see it being
a problem with the industry,

779
00:40:28,170 --> 00:40:30,770
'cause, one, you still got to
have a driver in it,

780
00:40:30,770 --> 00:40:32,330
because I don't see it
doing city.

781
00:40:32,330 --> 00:40:34,600
I don't see it doing,
you know, main things.

782
00:40:34,600 --> 00:40:36,700
I don't see it backing into
a dock.

783
00:40:36,700 --> 00:40:39,870
I don't see the automation part,
you know, doing...

784
00:40:39,870 --> 00:40:41,900
maybe the box-trailer side,
you know, I can see that,

785
00:40:41,900 --> 00:40:43,400
but not stuff like I do.

786
00:40:43,400 --> 00:40:46,830
So, I ain't really worried about
the automation of trucks.

787
00:40:46,830 --> 00:40:48,230
>> How near of a future is it?

788
00:40:48,230 --> 00:40:51,300
>> Yeah, self-driving, um...

789
00:40:51,300 --> 00:40:54,600
So, some, you know, some
companies are already operating.

790
00:40:54,600 --> 00:40:58,170
Embark, for instance, is one
that has been doing

791
00:40:58,170 --> 00:41:01,030
driverless trucks
on the interstate.

792
00:41:01,030 --> 00:41:03,930
And what's called exit-to-exit
self-driving.

793
00:41:03,930 --> 00:41:06,830
And they're currently running
real freight.

794
00:41:06,830 --> 00:41:09,530
>> Really?
>> Yeah, on I-10.

795
00:41:09,530 --> 00:41:12,530
♪ ♪

796
00:41:12,530 --> 00:41:17,170
>> (on P.A.): Shower guest 100,
your shower is now ready.

797
00:41:17,170 --> 00:41:20,430
>> NARRATOR: Over time, it has
become harder and harder

798
00:41:20,430 --> 00:41:23,230
for veteran independent drivers
like the Cumbees

799
00:41:23,230 --> 00:41:25,070
to make a living.

800
00:41:25,070 --> 00:41:27,070
They've been replaced by
younger,

801
00:41:27,070 --> 00:41:30,200
less experienced drivers.

802
00:41:30,200 --> 00:41:34,630
>> So, the, the trucking
industry's $740 billion a year,

803
00:41:34,630 --> 00:41:36,770
and, again, in, in many
of these operations,

804
00:41:36,770 --> 00:41:39,470
labor's a third of that cost.

805
00:41:39,470 --> 00:41:42,500
By my estimate, I, you know,
I think we're in the range

806
00:41:42,500 --> 00:41:44,970
of 300,000 or so jobs
in the foreseeable future

807
00:41:44,970 --> 00:41:49,930
that could be automated to some
significant extent.

808
00:41:49,930 --> 00:41:52,630
♪ ♪

809
00:41:52,630 --> 00:41:55,530
>> (groans)

810
00:41:55,530 --> 00:41:59,070
♪ ♪

811
00:42:05,000 --> 00:42:08,130
>> NARRATOR: The A.I. future
was built with great optimism

812
00:42:08,130 --> 00:42:11,100
out here in the West.

813
00:42:11,100 --> 00:42:14,630
In 2018, many of the people
who invented it

814
00:42:14,630 --> 00:42:18,170
gathered in San Francisco to
celebrate the 25th anniversary

815
00:42:18,170 --> 00:42:20,700
of the industry magazine.

817
00:42:20,700 --> 00:42:24,300
>> Howdy, welcome to WIRED25.

818
00:42:24,300 --> 00:42:26,200
>> NARRATOR: It is a
celebration, for sure,

819
00:42:26,200 --> 00:42:29,070
but there's also a growing sense
of caution

820
00:42:29,070 --> 00:42:30,670
and even skepticism.

821
00:42:33,130 --> 00:42:35,330
>> We're having a really good
weekend here.

822
00:42:35,330 --> 00:42:39,030
>> NARRATOR: Nick Thompson is
editor-in-chief of "Wired."

823
00:42:39,030 --> 00:42:42,030
>> When it started,
it was very much a magazine

824
00:42:42,030 --> 00:42:46,100
about what's coming and why you
should be excited about it.

825
00:42:46,100 --> 00:42:49,730
Optimism was the defining
feature of "Wired"

826
00:42:49,730 --> 00:42:51,400
for many, many years.

827
00:42:51,400 --> 00:42:55,130
Or, as our slogan used to be,
"Change Is Good."

828
00:42:55,130 --> 00:42:57,070
And over time,
it shifted a little bit.

829
00:42:57,070 --> 00:43:01,170
And now it's more,
"We love technology,

830
00:43:01,170 --> 00:43:02,630
but let's look at some
of the big issues,

831
00:43:02,630 --> 00:43:05,400
and let's look at some of them
critically,

832
00:43:05,400 --> 00:43:07,730
and let's look at the way
algorithms are changing

833
00:43:07,730 --> 00:43:09,930
the way we behave,
for good and for ill."

834
00:43:09,930 --> 00:43:14,030
So, the whole nature of "Wired"
has gone from a champion

835
00:43:14,030 --> 00:43:16,830
of technological change to more
of an observer

836
00:43:16,830 --> 00:43:18,700
of technological change.

837
00:43:18,700 --> 00:43:20,570
>> So, um, before we start...

838
00:43:20,570 --> 00:43:22,530
>> NARRATOR: There
are 25 speakers,

839
00:43:22,530 --> 00:43:25,700
all named as icons
of the last 25 years

840
00:43:25,700 --> 00:43:27,500
of technological progress.

841
00:43:27,500 --> 00:43:29,770
>> So, why is Apple so
secretive?

842
00:43:29,770 --> 00:43:31,470
>> (chuckling)

843
00:43:31,470 --> 00:43:33,630
>> NARRATOR: Jony Ive, who
designed Apple's iPhone.

844
00:43:33,630 --> 00:43:36,300
>> It would be bizarre
not to be.

845
00:43:36,300 --> 00:43:38,670
>> There's this question of,
like,

846
00:43:38,670 --> 00:43:41,000
what are we doing here in this
life, in this reality?

847
00:43:41,000 --> 00:43:45,170
>> NARRATOR: Jaron Lanier, who
pioneered virtual reality.

848
00:43:45,170 --> 00:43:48,500
And Jeff Bezos,
the founder of Amazon.

849
00:43:48,500 --> 00:43:49,870
>> Amazon was a garage startup.

850
00:43:49,870 --> 00:43:51,370
Now it's a very large company.

851
00:43:51,370 --> 00:43:52,570
Two kids in a dorm...

852
00:43:52,570 --> 00:43:54,070
>> NARRATOR: His message is,

853
00:43:54,070 --> 00:43:56,730
"All will be well
in the new world."

854
00:43:56,730 --> 00:44:00,470
>> I guess, first of all, I
remain incredibly optimistic

855
00:44:00,470 --> 00:44:01,630
about technology,

856
00:44:01,630 --> 00:44:03,830
and technologies always
are two-sided.

857
00:44:03,830 --> 00:44:05,230
But that's not new.

858
00:44:05,230 --> 00:44:07,400
That's always been the case.

859
00:44:07,400 --> 00:44:09,830
And, and we will figure it out.

860
00:44:09,830 --> 00:44:12,570
The last thing we would ever
want to do is stop the progress

861
00:44:12,570 --> 00:44:18,630
of new technologies,
even when they are dual-use.

862
00:44:18,630 --> 00:44:21,800
>> NARRATOR: But, says Thompson,
beneath the surface,

863
00:44:21,800 --> 00:44:24,530
there's a worry most of them
don't like to talk about.

864
00:44:24,530 --> 00:44:28,630
>> There are some people in
Silicon Valley who believe that,

865
00:44:28,630 --> 00:44:31,900
"You just have to trust
the technology.

866
00:44:31,900 --> 00:44:34,870
Throughout history, there's been
a complicated relationship

867
00:44:34,870 --> 00:44:36,470
between humans and machines,

868
00:44:36,470 --> 00:44:38,770
we've always worried about
machines,

869
00:44:38,770 --> 00:44:40,130
and it's always been fine.

870
00:44:40,130 --> 00:44:43,000
And we don't know how A.I. will
change the labor force,

871
00:44:43,000 --> 00:44:44,300
but it will be okay."

872
00:44:44,300 --> 00:44:46,070
So, that argument exists.

873
00:44:46,070 --> 00:44:47,700
There's another argument,

874
00:44:47,700 --> 00:44:50,170
which is what I think most of
them believe deep down,

875
00:44:50,170 --> 00:44:53,100
which is, "This is different.

876
00:44:53,100 --> 00:44:54,930
We're going to have labor-force
disruption

877
00:44:54,930 --> 00:44:57,030
like we've never seen before.

878
00:44:57,030 --> 00:45:01,370
And if that happens,
will they blame us?"

879
00:45:01,370 --> 00:45:04,600
>> NARRATOR: There is, however,
one of the WIRED25 icons

880
00:45:04,600 --> 00:45:07,800
willing to take on the issue.

881
00:45:07,800 --> 00:45:11,470
Onstage, Kai-Fu Lee dispenses
with one common fear.

882
00:45:11,470 --> 00:45:13,670
>> Well, I think there are so
many myths out there.

883
00:45:13,670 --> 00:45:16,530
I think one, one myth is that

884
00:45:16,530 --> 00:45:19,570
because A.I. is so good at a
single task,

886
00:45:19,570 --> 00:45:23,600
that one day we'll wake up, and
we'll all be enslaved

887
00:45:23,600 --> 00:45:26,100
or forced to plug our brains
to the A.I.

888
00:45:26,100 --> 00:45:30,800
But it is nowhere close
to displacing humans.

889
00:45:30,800 --> 00:45:34,130
>> NARRATOR: But in interviews
around the event and beyond,

890
00:45:34,130 --> 00:45:39,430
he takes a decidedly contrarian
position on A.I. and job loss.

891
00:45:39,430 --> 00:45:43,270
>> The A.I. giants want to paint
the rosier picture

892
00:45:43,270 --> 00:45:45,500
because they're happily
making money.

893
00:45:45,500 --> 00:45:49,330
So, I think they prefer not to
talk about the negative side.

894
00:45:49,330 --> 00:45:55,070
I believe about 50% of jobs
will be

895
00:45:55,070 --> 00:45:58,900
somewhat or extremely
threatened by A.I.

896
00:45:58,900 --> 00:46:02,500
in the next 15 years or so.

897
00:46:02,500 --> 00:46:04,570
>> NARRATOR: Kai-Fu Lee also
makes a great deal

898
00:46:04,570 --> 00:46:06,900
of money from A.I.

899
00:46:06,900 --> 00:46:08,800
What separates him from most of
his colleagues

900
00:46:08,800 --> 00:46:11,930
is that he's frank
about its downside.

901
00:46:11,930 --> 00:46:15,900
>> Yes, yes, we, we've made
about 40 investments in A.I.

902
00:46:15,900 --> 00:46:18,930
I think, based on these 40
investments,

903
00:46:18,930 --> 00:46:22,000
most of them are not impacting
human jobs.

904
00:46:22,000 --> 00:46:23,970
They're creating value,
making high margins,

905
00:46:23,970 --> 00:46:26,300
inventing a new model.

906
00:46:26,300 --> 00:46:29,730
But I could list seven or eight

907
00:46:29,730 --> 00:46:34,670
that would lead to a very clear
displacement of human jobs.

908
00:46:34,670 --> 00:46:36,370
>> NARRATOR: He says that A.I.
is coming,

909
00:46:36,370 --> 00:46:38,470
whether we like it or not.

910
00:46:38,470 --> 00:46:40,300
And he wants to warn society

911
00:46:40,300 --> 00:46:43,030
about what he sees as
inevitable.

912
00:46:43,030 --> 00:46:45,600
>> You have a view which I think
is different than many others,

913
00:46:45,600 --> 00:46:50,670
which is that A.I. is not going
to take blue-collar jobs

914
00:46:50,670 --> 00:46:53,230
so quickly, but is actually
going to take white-collar jobs.

915
00:46:53,230 --> 00:46:55,770
>> Yeah.
Well, both will happen.

916
00:46:55,770 --> 00:46:59,000
A.I. will be, at the same time,
a replacement for blue-collar,

917
00:46:59,000 --> 00:47:02,630
white-collar jobs, and be
a great symbiotic tool

918
00:47:02,630 --> 00:47:05,630
for doctors, lawyers, and you,
for example.

919
00:47:05,630 --> 00:47:07,700
But the white-collar jobs are
easier to take,

920
00:47:07,700 --> 00:47:12,030
because they're a pure
quantitative analytical process.

921
00:47:12,030 --> 00:47:17,370
Let's say reporters, traders,
telemarketing,

922
00:47:17,370 --> 00:47:19,270
telesales, customer service...

923
00:47:19,270 --> 00:47:20,730
>> Analysts?

924
00:47:20,730 --> 00:47:25,170
>> Analysts, yes, these can all
be replaced just by a software.

925
00:47:25,170 --> 00:47:28,330
To do blue-collar, some of the
work requires, you know,

926
00:47:28,330 --> 00:47:32,030
hand-eye coordination, things
that machines are not yet

927
00:47:32,030 --> 00:47:34,300
good enough to do.

928
00:47:34,300 --> 00:47:38,400
>> Today, there are many people
who are ringing the alarm,

929
00:47:38,400 --> 00:47:39,600
"Oh, my God, what are we going
to do?

930
00:47:39,600 --> 00:47:41,830
Half the jobs are going away."

931
00:47:41,830 --> 00:47:45,430
I believe that's true, but
here's the missing fact.

932
00:47:45,430 --> 00:47:48,400
I've done the research on this,
and if you go back 20, 30,

933
00:47:48,400 --> 00:47:52,930
or 40 years ago, you will find
that 50% of the jobs

934
00:47:52,930 --> 00:47:56,400
that people performed back then
are gone today.

935
00:47:56,400 --> 00:47:58,900
You know, where are all the
telephone operators,

936
00:47:58,900 --> 00:48:02,600
bowling-pin setters,
elevator operators?

937
00:48:02,600 --> 00:48:06,270
You used to have seas of
secretaries in corporations

938
00:48:06,270 --> 00:48:08,070
that have now been eliminated--
travel agents.

939
00:48:08,070 --> 00:48:10,770
You can just go through field
after field after field.

940
00:48:10,770 --> 00:48:14,100
That same pattern has recurred
many times throughout history,

941
00:48:14,100 --> 00:48:16,230
with each new wave
of automation.

942
00:48:16,230 --> 00:48:22,270
>> But I would argue that
history is only trustable

943
00:48:16,230 --> 00:48:22,270
>> But I would argue that
history is only trustable

944
00:48:22,270 --> 00:48:26,670
if it is multiple repetitions
of similar events,

945
00:48:26,670 --> 00:48:30,670
not once-in-a-blue-moon
occurrence.

946
00:48:30,670 --> 00:48:35,070
So, over the history of many
tech inventions,

947
00:48:35,070 --> 00:48:36,770
most are small things.

948
00:48:36,770 --> 00:48:43,330
Only maybe three are at the
magnitude of A.I. revolution--

949
00:48:43,330 --> 00:48:46,730
the steam, steam engine,
electricity,

950
00:48:46,730 --> 00:48:48,570
and the computer revolution.

951
00:48:48,570 --> 00:48:50,970
I'd say everything else
is too small.

952
00:48:50,970 --> 00:48:54,670
And the reason I think it might
be something brand-new

953
00:48:54,670 --> 00:49:00,930
is that A.I. is fundamentally
replacing our cognitive process

954
00:49:00,930 --> 00:49:05,670
in doing a job in its
significant entirety,

955
00:49:05,670 --> 00:49:08,400
and it can do it dramatically
better.

956
00:49:08,400 --> 00:49:10,570
>> NARRATOR: This argument
about job loss

957
00:49:10,570 --> 00:49:13,470
in the age of A.I. was ignited
six years ago

958
00:49:13,470 --> 00:49:17,830
amid the gargoyles and spires
of Oxford University.

959
00:49:17,830 --> 00:49:21,970
Two researchers had been poring
through U.S. labor statistics,

960
00:49:21,970 --> 00:49:27,270
identifying jobs that could be
vulnerable to A.I. automation.

961
00:49:27,270 --> 00:49:29,300
>> Well, vulnerable to
automation,

962
00:49:29,300 --> 00:49:32,730
in the context that we discussed
five years ago now,

963
00:49:32,730 --> 00:49:36,430
essentially meant that those
jobs are potentially automatable

964
00:49:36,430 --> 00:49:38,900
over an unspecified number of
years.

965
00:49:38,900 --> 00:49:43,530
And the figure we came up with
was 47%.

966
00:49:43,530 --> 00:49:45,330
>> NARRATOR: 47%.

967
00:49:45,330 --> 00:49:48,470
That number quickly traveled
the world in headlines

968
00:49:48,470 --> 00:49:49,830
and news bulletins.

969
00:49:49,830 --> 00:49:53,030
But authors Carl Frey
and Michael Osborne

970
00:49:53,030 --> 00:49:54,770
offered a caution.

971
00:49:54,770 --> 00:49:59,670
They can't predict how many jobs
will be lost, or how quickly.

972
00:49:59,670 --> 00:50:04,430
But Frey believes that there are
lessons in history.

973
00:50:04,430 --> 00:50:06,830
>> And what worries me the most
is that there is actually

974
00:50:06,830 --> 00:50:10,830
one episode that looks quite
familiar to today,

975
00:50:10,830 --> 00:50:14,270
which is the British
Industrial Revolution,

976
00:50:14,270 --> 00:50:18,400
where wages didn't grow
for nine decades,

977
00:50:18,400 --> 00:50:22,530
and a lot of people actually
saw living standards decline

978
00:50:22,530 --> 00:50:25,870
as technology progressed.

979
00:50:25,870 --> 00:50:27,630
♪ ♪

980
00:50:27,630 --> 00:50:30,370
>> NARRATOR: Saginaw, Michigan,
knows about decline

981
00:50:30,370 --> 00:50:33,170
in living standards.

982
00:50:33,170 --> 00:50:36,900
Harry Cripps, an auto worker
and a local union president,

983
00:50:36,900 --> 00:50:42,730
has witnessed what 40 years of
automation can do to a town.

984
00:50:42,730 --> 00:50:45,470
>> You know, we're one of the
cities in the country that,

985
00:50:45,470 --> 00:50:49,170
I think we were left behind in
this recovery.

986
00:50:49,170 --> 00:50:53,670
And I just... I don't know how
we get on the bandwagon now.

987
00:50:56,770 --> 00:50:59,030
>> NARRATOR: Once, this was the
U.A.W. hall

988
00:50:59,030 --> 00:51:01,230
for one local union.

989
00:51:01,230 --> 00:51:05,670
Now, with falling membership,
it's shared by five locals.

990
00:51:05,670 --> 00:51:07,730
>> Rudy didn't get his shift.

991
00:51:07,730 --> 00:51:09,330
>> NARRATOR: This day,
it's the center

992
00:51:09,330 --> 00:51:11,570
for a Christmas food drive.

993
00:51:11,570 --> 00:51:14,030
Even in a growth economy,

994
00:51:14,030 --> 00:51:16,830
unemployment here is near
six percent.

995
00:51:16,830 --> 00:51:20,930
Poverty in Saginaw is over 30%.

996
00:51:16,830 --> 00:51:20,930
Poverty in Saginaw is over 30%.

997
00:51:23,830 --> 00:51:27,130
>> Our factory has about
1.9 million square feet.

998
00:51:27,130 --> 00:51:31,100
Back in the '70s, that 1.9
million square feet

999
00:51:31,100 --> 00:51:34,330
had about 7,500 U.A.W.
automotive workers

1000
00:51:34,330 --> 00:51:36,300
making middle-class wage with
decent benefits

1001
00:51:36,300 --> 00:51:38,770
and able to send their kids to
college and do all the things

1002
00:51:38,770 --> 00:51:41,000
that the middle-class family
should be able to do.

1003
00:51:41,000 --> 00:51:44,270
Our factory today, with
automation,

1004
00:51:44,270 --> 00:51:48,300
would probably be about
700 United Auto Workers.

1005
00:51:48,300 --> 00:51:52,130
That's a dramatic change.

1006
00:51:52,130 --> 00:51:54,230
Lot of union brothers used
to work there, buddy.

1007
00:51:54,230 --> 00:51:57,130
>> The TRW plant, that was
unfortunate.

1008
00:51:57,130 --> 00:51:59,830
>> Delphi... looks like they're
starting to tear it down now.

1009
00:51:59,830 --> 00:52:01,300
Wow.

1010
00:52:01,300 --> 00:52:04,770
Automation is, is definitely
taking away a lot of jobs.

1011
00:52:04,770 --> 00:52:07,530
Robots, I don't know how they
buy cars,

1012
00:52:07,530 --> 00:52:09,300
I don't know how
they buy sandwiches,

1013
00:52:09,300 --> 00:52:11,100
I don't know how they go to the
grocery store.

1014
00:52:11,100 --> 00:52:13,430
They definitely don't pay taxes,
which serves the infrastructure.

1015
00:52:13,430 --> 00:52:17,300
So, you don't have the sheriffs
and the police and the firemen,

1016
00:52:17,300 --> 00:52:20,830
and anybody else that supports
the city is gone,

1017
00:52:20,830 --> 00:52:21,900
'cause there's no tax base.

1018
00:52:21,900 --> 00:52:25,770
Robots don't pay taxes.

1019
00:52:25,770 --> 00:52:27,900
>> NARRATOR: The average
personal income in Saginaw

1020
00:52:27,900 --> 00:52:31,570
is $16,000 a year.

1021
00:52:31,570 --> 00:52:34,600
>> A lot of the families that I
work with here in the community,

1022
00:52:34,600 --> 00:52:35,830
both parents are working.

1023
00:52:35,830 --> 00:52:37,470
They're working two jobs.

1024
00:52:37,470 --> 00:52:40,370
Mainly, it's the wages,
you know,

1025
00:52:40,370 --> 00:52:45,270
people not making a decent wage
to be able to support a family.

1026
00:52:45,270 --> 00:52:48,930
Like, back in the day, my dad
even worked at the plant.

1027
00:52:48,930 --> 00:52:51,300
My mom stayed home,
raised the children.

1028
00:52:51,300 --> 00:52:54,000
And that give us the opportunity
to put food on the table,

1029
00:52:54,000 --> 00:52:55,370
and things of that nature.

1030
00:52:55,370 --> 00:52:58,000
And, and them times are gone.

1031
00:52:58,000 --> 00:52:59,930
>> If you look at this graph of
what's been happening

1032
00:52:59,930 --> 00:53:01,670
to America since the end
of World War II,

1033
00:53:01,670 --> 00:53:05,000
you see a line for our
productivity,

1034
00:53:05,000 --> 00:53:07,730
and our productivity
gets better over time.

1035
00:53:07,730 --> 00:53:10,830
It used to be the case
that our pay, our income,

1036
00:53:10,830 --> 00:53:14,700
would increase in lockstep with
those productivity increases.

1037
00:53:14,700 --> 00:53:19,570
The weird part about this graph
is how the income has decoupled,

1038
00:53:19,570 --> 00:53:23,900
is not going up the same way
that productivity is anymore.

1039
00:53:23,900 --> 00:53:26,170
>> NARRATOR: As automation has
taken over,

1040
00:53:26,170 --> 00:53:29,770
workers are either laid off or
left with less-skilled jobs

1041
00:53:29,770 --> 00:53:33,400
for less pay,
while productivity goes up.

1042
00:53:33,400 --> 00:53:35,100
>> There are still plenty
of factories in America.

1043
00:53:35,100 --> 00:53:37,430
We are a manufacturing
powerhouse,

1044
00:53:37,430 --> 00:53:39,670
but if you go walk around
an American factory,

1045
00:53:39,670 --> 00:53:42,070
you do not see long lines
of people

1046
00:53:42,070 --> 00:53:44,470
doing repetitive manual labor.

1047
00:53:44,470 --> 00:53:46,600
You see a whole lot
of automation.

1048
00:53:46,600 --> 00:53:48,230
If you go upstairs in that
factory

1049
00:53:48,230 --> 00:53:49,830
and look at the payroll
department,

1050
00:53:49,830 --> 00:53:53,130
you see one or two people
looking into a screen all day.

1051
00:53:53,130 --> 00:53:55,800
So, the activity is still there,

1052
00:53:55,800 --> 00:53:58,000
but the number of jobs
is very, very low,

1053
00:53:58,000 --> 00:54:00,330
because of automation
and tech progress.

1054
00:54:00,330 --> 00:54:03,130
Now, dealing with
that challenge,

1055
00:54:03,130 --> 00:54:04,900
and figuring out what
the next generation

1056
00:54:04,900 --> 00:54:07,700
of the American middle class
should be doing,

1057
00:54:07,700 --> 00:54:09,700
is a really important challenge,

1058
00:54:09,700 --> 00:54:12,530
because I am pretty confident
that we are never again

1059
00:54:12,530 --> 00:54:15,330
going to have this large,
stable, prosperous

1060
00:54:15,330 --> 00:54:17,730
middle class doing routine work.

1061
00:54:17,730 --> 00:54:21,430
♪ ♪

1062
00:54:17,730 --> 00:54:21,430
♪ ♪

1063
00:54:21,430 --> 00:54:23,970
>> NARRATOR: Evidence of how
A.I. is likely to bring

1064
00:54:23,970 --> 00:54:27,530
accelerated change to the U.S.
workforce can be found

1065
00:54:27,530 --> 00:54:29,970
not far from Saginaw.

1066
00:54:29,970 --> 00:54:31,600
This is the U.S. headquarters

1067
00:54:31,600 --> 00:54:36,070
for one of the world's largest
builders of industrial robots,

1068
00:54:36,070 --> 00:54:40,030
a Japanese-owned company called
Fanuc Robotics.

1069
00:54:40,030 --> 00:54:43,230
>> We've been producing robots
for well over 35 years.

1070
00:54:43,230 --> 00:54:44,770
And you can imagine,
over the years,

1071
00:54:44,770 --> 00:54:47,330
they've changed quite a bit.

1072
00:54:47,330 --> 00:54:50,230
We're utilizing the artificial
intelligence

1073
00:54:50,230 --> 00:54:51,800
to really make the robots
easier to use

1074
00:54:51,800 --> 00:54:56,400
and be able to handle a broader
spectrum of opportunities.

1075
00:54:56,400 --> 00:54:59,770
We see a huge growth potential
in robotics.

1076
00:54:59,770 --> 00:55:02,330
And we see that growth potential
as being, really,

1077
00:55:02,330 --> 00:55:05,230
there's 90% of the market left.

1078
00:55:05,230 --> 00:55:07,230
>> NARRATOR: The industry says
optimistically

1079
00:55:07,230 --> 00:55:11,270
that with that growth,
they can create more jobs.

1080
00:55:11,270 --> 00:55:13,630
>> Even if there were five
people on a job,

1081
00:55:13,630 --> 00:55:14,870
and we reduced that down to two
people,

1082
00:55:14,870 --> 00:55:17,800
because we automated
some level of it,

1083
00:55:17,800 --> 00:55:20,570
we might produce two times more
parts than we did before,

1084
00:55:20,570 --> 00:55:22,170
because we automated it.

1085
00:55:22,170 --> 00:55:28,430
So now, there might be the need
for two more fork-truck drivers,

1086
00:55:28,430 --> 00:55:31,900
or two more quality-inspection
personnel.

1087
00:55:31,900 --> 00:55:33,870
So, although we reduce
some of the people,

1088
00:55:33,870 --> 00:55:38,100
we grow in other areas as we
produce more things.

1089
00:55:38,100 --> 00:55:43,070
>> When I increase productivity
through automation, I lose jobs.

1090
00:55:43,070 --> 00:55:44,370
Jobs go away.

1091
00:55:44,370 --> 00:55:47,170
And I don't care what the robot
manufacturers say,

1092
00:55:47,170 --> 00:55:49,830
you aren't replacing those ten
production people

1093
00:55:49,830 --> 00:55:53,570
that that robot is now doing
that job, with ten people.

1094
00:55:53,570 --> 00:55:56,830
You can increase productivity to
a level to stay competitive

1095
00:55:56,830 --> 00:56:00,970
with the global market-- that's
what they're trying to do.

1096
00:56:00,970 --> 00:56:02,530
♪ ♪

1097
00:56:02,530 --> 00:56:04,900
>> NARRATOR:
In the popular telling,

1098
00:56:04,900 --> 00:56:08,800
blame for widespread job loss
has been aimed overseas,

1099
00:56:08,800 --> 00:56:10,900
at what's called offshoring.

1100
00:56:10,900 --> 00:56:13,200
>> We want to keep
our factories here,

1101
00:56:13,200 --> 00:56:15,100
we want to keep
our manufacturing here.

1102
00:56:15,100 --> 00:56:19,470
We don't want them moving
to China, to Mexico, to Japan,

1103
00:56:19,470 --> 00:56:23,630
to India, to Vietnam.

1104
00:56:23,630 --> 00:56:25,770
>> NARRATOR: But it turns out
most of the job loss

1105
00:56:25,770 --> 00:56:28,370
isn't because of offshoring.

1106
00:56:28,370 --> 00:56:29,700
>> There's been offshoring.

1107
00:56:29,700 --> 00:56:34,300
And I think offshoring is
responsible for maybe 20%

1108
00:56:34,300 --> 00:56:36,000
of the jobs that have been lost.

1109
00:56:36,000 --> 00:56:38,270
I would say most of the jobs
that have been lost,

1110
00:56:38,270 --> 00:56:40,830
despite what most Americans
thinks, was due to automation

1111
00:56:40,830 --> 00:56:43,830
or productivity growth.

1112
00:56:43,830 --> 00:56:45,570
>> NARRATOR:
Mike Hicks is an economist

1113
00:56:45,570 --> 00:56:48,600
at Ball State University
in Muncie, Indiana.

1114
00:56:48,600 --> 00:56:52,300
He and sociologist Emily Wornell
have been documenting

1115
00:56:52,300 --> 00:56:54,670
employment trends
in Middle America.

1116
00:56:54,670 --> 00:56:59,130
Hicks says that automation has
been a mostly silent job killer,

1117
00:56:59,130 --> 00:57:01,200
lowering the standard of living.

1118
00:57:01,200 --> 00:57:04,400
>> So, in the last 15 years, the
standard of living has dropped

1119
00:57:04,400 --> 00:57:06,600
by 15, ten to 15 percent.

1120
00:57:06,600 --> 00:57:09,100
So, that's unusual
in a developed world.

1121
00:57:09,100 --> 00:57:10,600
A one-year decline
is a recession.

1122
00:57:10,600 --> 00:57:14,470
A 15-year decline gives
an entirely different sense

1123
00:57:14,470 --> 00:57:16,830
about the prospects
of a community.

1124
00:57:16,830 --> 00:57:20,500
And so that is common
from the Canadian border

1126
00:57:20,500 --> 00:57:22,970
to the Gulf of Mexico

1127
00:57:22,970 --> 00:57:25,300
in the middle swath
of the United States.

1128
00:57:25,300 --> 00:57:28,130
>> This is something we're gonna
do for you guys.

1129
00:57:28,130 --> 00:57:32,730
These were left over from our
suggestion drive that we did,

1130
00:57:32,730 --> 00:57:34,200
and we're going to give them
each two.

1131
00:57:34,200 --> 00:57:35,300
>> That is awesome.
>> I mean,

1132
00:57:35,300 --> 00:57:37,070
that is going to go a long ways,
right?

1133
00:57:37,070 --> 00:57:39,070
I mean, that'll really help that
family out during the holidays.

1134
00:57:39,070 --> 00:57:41,800
>> Yes, well, with the kids home
from school,

1135
00:57:41,800 --> 00:57:43,430
the families have three meals
a day that they got

1136
00:57:43,430 --> 00:57:45,170
to put on the table.

1137
00:57:45,170 --> 00:57:47,130
So, it's going to make a big
difference.

1138
00:57:47,130 --> 00:57:49,130
So, thank you, guys.
>> You're welcome.

1139
00:57:49,130 --> 00:57:50,830
>> This is wonderful.
>> Let them know Merry Christmas

1140
00:57:50,830 --> 00:57:52,370
on behalf of us here
at the local, okay?

1141
00:57:52,370 --> 00:57:54,930
>> Absolutely, you guys are
just, just amazing, thank you.

1142
00:57:54,930 --> 00:57:58,270
And please, tell, tell all the
workers how grateful

1143
00:57:58,270 --> 00:57:59,900
these families will be.
>> We will.

1144
00:57:59,900 --> 00:58:02,870
>> I mean, this is not a small
problem.

1145
00:58:02,870 --> 00:58:04,700
The need is so great.

1146
00:58:04,700 --> 00:58:07,830
And I can tell you
that it's all races,

1147
00:58:07,830 --> 00:58:10,070
it's all income classes

1148
00:58:10,070 --> 00:58:11,700
that you might think someone
might be from.

1149
00:58:11,700 --> 00:58:13,900
But I can tell you that when you
see it,

1150
00:58:13,900 --> 00:58:17,000
and you deliver this type
of gift to somebody

1151
00:58:17,000 --> 00:58:20,600
who is in need, just the
gratitude that they show you

1152
00:58:20,600 --> 00:58:24,470
is incredible.

1153
00:58:24,470 --> 00:58:28,470
>> We actually know that people
are at greater risk of mortality

1154
00:58:28,470 --> 00:58:32,130
for over 20 years after they
lose their job due to,

1155
00:58:32,130 --> 00:58:34,670
due to no fault of their own, so
something like automation

1156
00:58:34,670 --> 00:58:36,770
or offshoring.

1157
00:58:36,770 --> 00:58:38,970
They're at higher risk
for cardiovascular disease,

1158
00:58:38,970 --> 00:58:44,500
they're at higher risk
for depression and suicide.

1159
00:58:44,500 --> 00:58:46,630
But then with the
intergenerational impacts,

1160
00:58:46,630 --> 00:58:50,230
we also see their children
are more likely--

1161
00:58:50,230 --> 00:58:52,300
children of parents who have
lost their job

1162
00:58:52,300 --> 00:58:55,670
due to automation-- are more
likely to repeat a grade,

1163
00:58:55,670 --> 00:58:57,570
they're more likely to drop out
of school,

1164
00:58:57,570 --> 00:58:59,700
they're more likely to be
suspended from school,

1165
00:58:59,700 --> 00:59:01,470
and they have lower educational
attainment

1166
00:59:01,470 --> 00:59:05,200
over their entire lifetimes.

1167
00:59:05,200 --> 00:59:08,200
>> It's the future of this,
not the past, that scares me.

1168
00:59:08,200 --> 00:59:10,700
Because I think we're in the
early decades

1169
00:59:10,700 --> 00:59:13,170
of what is a multi-decade
adjustment period.

1170
00:59:13,170 --> 00:59:16,000
♪ ♪

1171
00:59:16,000 --> 00:59:20,170
>> NARRATOR: The world is being
re-imagined.

1172
00:59:20,170 --> 00:59:22,370
This is a supermarket.

1173
00:59:22,370 --> 00:59:26,800
Robots, guided by A.I., pack
everything from soap powder

1174
00:59:26,800 --> 00:59:31,530
to cantaloupes for online
consumers.

1175
00:59:31,530 --> 00:59:33,600
Machines that pick groceries,

1176
00:59:33,600 --> 00:59:37,170
machines that can also read
reports, learn routines,

1177
00:59:37,170 --> 00:59:40,730
and comprehend are reaching deep
into factories,

1178
00:59:40,730 --> 00:59:43,870
stores, and offices.

1179
00:59:43,870 --> 00:59:45,800
At a college in Goshen, Indiana,

1180
00:59:45,800 --> 00:59:49,030
a group of local business and
political leaders come together

1181
00:59:49,030 --> 00:59:54,830
to try to understand the impact
of A.I. and the new machines.

1182
00:59:54,830 --> 00:59:56,870
Molly Kinder studies
the future of work

1183
00:59:56,870 --> 00:59:58,470
at a Washington think tank.

1184
00:59:58,470 --> 01:00:00,970
>> How many people have gone
into a fast-food restaurant
1185
00:59:58,970 --> 01:00:01,370
and done a self-ordering?

1186
01:00:01,370 --> 01:00:02,530
Anyone, yes?

1187
01:00:02,530 --> 01:00:04,400
Panera, for instance,
is doing this.

1188
01:00:04,400 --> 01:00:08,270
Cashier was my first job,
and in, in, where I live,

1189
01:00:08,270 --> 01:00:10,830
in Washington, DC, it's actually
the number-one occupation

1190
01:00:10,830 --> 01:00:12,300
for the greater DC region.

1191
01:00:12,300 --> 01:00:14,670
There are millions of people who
work in cashier positions.

1192
01:00:14,670 --> 01:00:17,000
This is not a futuristic
challenge,