1
00:00:06,505 --> 00:00:09,575
Welcome
to YouTube Original Stages,

2
00:00:09,608 --> 00:00:12,979
once home to Howard Hughes's
Spruce Goose assembly hangar,

3
00:00:13,012 --> 00:00:14,847
and home to much of
the first<i> Iron Man,</i>

4
00:00:14,881 --> 00:00:16,483
filmed 12 years ago.

5
00:00:17,149 --> 00:00:18,885
Many happy memories here.

6
00:00:18,918 --> 00:00:21,354
And speaking
of taking a look back...

7
00:00:21,387 --> 00:00:22,655
Technology.

8
00:00:22,688 --> 00:00:24,123
It's advancing faster

9
00:00:24,156 --> 00:00:25,892
and taking less time
to be widely adopted

10
00:00:25,925 --> 00:00:27,393
than ever before,

11
00:00:27,427 --> 00:00:30,196
like as in
it took roughly 10,000 years

12
00:00:30,229 --> 00:00:32,065
to go from writing
to printing press,

13
00:00:32,098 --> 00:00:34,700
but only about 500 more
to get to email.

14
00:00:34,734 --> 00:00:38,171
Now it seems
we're at the dawn of a new age,

15
00:00:38,204 --> 00:00:39,739
the age of A.I...

16
00:00:39,772 --> 00:00:41,507
Artificial Intelligence.

17
00:00:41,541 --> 00:00:42,875
Please define.

18
00:00:42,909 --> 00:00:45,078
[automated voice speaking]

19
00:00:48,314 --> 00:00:50,015
Uh-huh, okay.
There you have it.

20
00:00:50,049 --> 00:00:52,017
What does it mean?
I don't know.

21
00:00:52,051 --> 00:00:53,552
Tons of folks
are working on it, right?

22
00:00:53,585 --> 00:00:55,155
Most people don't know
that much about it,

23
00:00:55,188 --> 00:00:56,822
and of course,
there's no shortage

24
00:00:56,856 --> 00:00:58,123
of data or opinions.

25
00:00:58,157 --> 00:00:59,859
Anyway, I've heard it said

26
00:00:59,893 --> 00:01:01,861
that the best way
to learn about a subject

27
00:01:01,894 --> 00:01:02,896
is to teach it,

28
00:01:02,929 --> 00:01:04,197
but to level with ya,

29
00:01:04,230 --> 00:01:06,999
I have a wildly
incomplete education...

30
00:01:07,032 --> 00:01:08,501
Not in my day job,

31
00:01:08,534 --> 00:01:11,137
where I've been A.I.-adjacent
for over a decade.

32
00:01:11,170 --> 00:01:13,473
Anyway, I figured now
would be as good a time as any

33
00:01:13,506 --> 00:01:15,407
to catch up
on the state of things

34
00:01:15,441 --> 00:01:17,543
regarding
this emerging phenomenon.

35
00:01:17,576 --> 00:01:20,779
My sense of it
is it kind of feels like

36
00:01:20,813 --> 00:01:23,115
Pandora's box,
maybe... ish?

37
00:01:23,148 --> 00:01:24,884
Much of my understanding
on this topic

38
00:01:24,917 --> 00:01:26,786
has come from sci-fi stories,

39
00:01:26,819 --> 00:01:28,054
which usually depict us

40
00:01:28,087 --> 00:01:30,590
heading toward Shangri-La
or dystopia.

41
00:01:30,623 --> 00:01:31,791
Like most things,

42
00:01:31,824 --> 00:01:34,660
I suspect the truth is probably
somewhere in the middle.

43
00:01:34,694 --> 00:01:35,628
Now, along the way,

44
00:01:35,661 --> 00:01:37,497
we'll demystify
some common misconceptions

45
00:01:37,530 --> 00:01:40,633
about things we thought
we understood,
but probably don't,

46
00:01:40,667 --> 00:01:42,135
terms such as

47
00:01:42,168 --> 00:01:44,136
"machine learning,"
"algorithms,"

48
00:01:44,170 --> 00:01:46,940
"computer vision"
and "Big Data,"

49
00:01:46,973 --> 00:01:48,741
they will be
conveniently unpacked

50
00:01:48,775 --> 00:01:51,544
to help us feel like
we know what we're doing,

51
00:01:51,577 --> 00:01:52,412
kinda.

52
00:01:52,445 --> 00:01:54,681
By the way, Pandora's box...

53
00:01:58,485 --> 00:01:59,652
wasn't a box.

54
00:02:00,853 --> 00:02:02,255
It...

55
00:02:02,922 --> 00:02:04,791
was a clay jar.

56
00:02:04,824 --> 00:02:06,492
How about that?

57
00:02:06,525 --> 00:02:08,428
Demystified.

58
00:02:11,096 --> 00:02:14,200
A.I. is teaching the machine,

59
00:02:14,234 --> 00:02:17,236
and the machine
becoming smart.

60
00:02:17,270 --> 00:02:19,705
Each time we create
a more powerful technology,

61
00:02:19,738 --> 00:02:22,441
we create a bigger lever
for changing the world.

62
00:02:22,475 --> 00:02:24,677
[computer]
<i> Autonomous driving started.</i>

63
00:02:24,711 --> 00:02:26,345
[Downey]
<i> It's an extraordinary time,</i>

64
00:02:26,378 --> 00:02:29,315
<i> one of unprecedented
 change and possibility.</i>

65
00:02:30,483 --> 00:02:32,318
<i> To help us understand
 what's happening,</i>

66
00:02:32,352 --> 00:02:34,187
<i> this series
 will look at innovators</i>

67
00:02:34,220 --> 00:02:35,921
<i> pushing the boundaries
 of A.I...</i>

68
00:02:35,954 --> 00:02:37,189
No, stop!

69
00:02:37,223 --> 00:02:38,991
[Downey]<i> ...and how
 their groundbreaking work</i>

70
00:02:39,025 --> 00:02:40,894
<i> is profoundly impacting
 our lives...</i>

71
00:02:40,927 --> 00:02:42,194
Yay! [laughing]

72
00:02:42,228 --> 00:02:44,463
[Downey]
<i> ...and the world around us.</i>

73
00:02:44,496 --> 00:02:47,266
<i> In this episode, we'll meet
 two different visionaries</i>

74
00:02:47,299 --> 00:02:49,302
<i>exploring identity, creativity,</i>

75
00:02:49,335 --> 00:02:52,438
<i> and collaboration
 between humans and machines.</i>

76
00:02:52,471 --> 00:02:55,007
Intelligence used to be
the province of only humans,

77
00:02:55,041 --> 00:02:56,375
but it no longer is.

78
00:02:56,409 --> 00:02:58,912
We don't program the machines.
They learn by themselves.

79
00:03:09,155 --> 00:03:12,024
Mm. Ah. That's good.

80
00:03:12,058 --> 00:03:14,093
All right.

81
00:03:14,126 --> 00:03:17,430
My background's always been
a mixture of art and science.

82
00:03:18,564 --> 00:03:21,767
I ended up doing a PhD
in bioengineering,

83
00:03:21,801 --> 00:03:24,437
then I ended up
in the film industry,

84
00:03:24,470 --> 00:03:27,073
working on<i> King Kong</i>
to<i> Avatar,</i>

85
00:03:27,106 --> 00:03:28,841
simulating faces.

86
00:03:30,009 --> 00:03:31,844
I'd got to a point
in my career

87
00:03:31,877 --> 00:03:32,912
where I'd been, you know,

88
00:03:32,946 --> 00:03:35,081
lucky enough to win
a couple of Academy Awards,

89
00:03:35,114 --> 00:03:37,283
so I thought,
"Okay, what happens

90
00:03:37,317 --> 00:03:40,319
if we actually tried to bring
those characters to life,

91
00:03:40,353 --> 00:03:42,722
that actually
you could interact with?"

92
00:03:43,256 --> 00:03:45,091
[toddler crying]

93
00:03:45,124 --> 00:03:47,126
Baby... Ooh.

94
00:03:47,160 --> 00:03:48,728
[toddler fusses]

95
00:03:48,761 --> 00:03:49,795
What can you see?

96
00:03:49,829 --> 00:03:54,199
So "Baby X" is a lifelike
simulation of a toddler.

97
00:03:54,233 --> 00:03:57,470
Hey. Are you
excited to be here?

98
00:03:57,503 --> 00:03:59,338
She's actually seeing me
through the web camera,

99
00:03:59,371 --> 00:04:02,141
she's listening
through the microphone.

100
00:04:02,174 --> 00:04:04,577
Woo... yeah.

101
00:04:04,610 --> 00:04:07,346
Baby X is about
exploring the nature

102
00:04:07,380 --> 00:04:09,349
of how would we build
a digital consciousness,

103
00:04:09,382 --> 00:04:10,549
if it's possible?

104
00:04:10,582 --> 00:04:12,218
We don't know
if it's possible,

105
00:04:12,251 --> 00:04:14,053
but we're chipping away
at that problem.

106
00:04:14,086 --> 00:04:15,588
Hey, Baby. Hey.

107
00:04:15,621 --> 00:04:17,423
[Downey]
<i>"Problem" is an understatement</i>

108
00:04:17,456 --> 00:04:19,024
<i> for what
 Mark's chipping away at.</i>

109
00:04:19,058 --> 00:04:20,359
<i> His vision of the future</i>

110
00:04:20,393 --> 00:04:22,562
<i> is one where
 human and machine cooperate,</i>

111
00:04:22,595 --> 00:04:25,397
<i> and the best way
 to achieve that, he thinks,</i>

112
00:04:25,431 --> 00:04:28,834
<i> is to make A.I.
 as lifelike as possible.</i>

113
00:04:28,867 --> 00:04:31,137
Peek-a-boo!

114
00:04:31,170 --> 00:04:32,604
[Baby X giggling]

115
00:04:32,638 --> 00:04:35,274
[Downey]<i> Which is why he began
 where most life begins...</i>

116
00:04:35,307 --> 00:04:36,575
<i> a baby...</i>

117
00:04:36,609 --> 00:04:39,078
<i> modeled
 after his own daughter.</i>

118
00:04:39,111 --> 00:04:41,614
So if we start
revealing her layers,

119
00:04:41,647 --> 00:04:43,582
she's driven by virtual muscles,

120
00:04:43,616 --> 00:04:45,551
and the virtual muscles,
in turn,

121
00:04:45,584 --> 00:04:47,553
are driven by a virtual brain.

122
00:04:47,586 --> 00:04:49,888
Now, these are
radically simplified models

123
00:04:49,922 --> 00:04:51,024
from the real thing,

124
00:04:51,057 --> 00:04:52,325
but nevertheless,

125
00:04:52,358 --> 00:04:54,593
they're models that
we can explore how they work,

126
00:04:54,627 --> 00:04:56,996
because we have
a real template that exists,

127
00:04:57,030 --> 00:04:58,398
the human brain.

128
00:04:59,932 --> 00:05:02,869
So, these are all driven
by neural networks.

129
00:05:03,869 --> 00:05:04,803
[Downey]<i> "Neural network"</i>

130
00:05:04,837 --> 00:05:06,773
<i> is a virtual,
 much simpler version</i>

131
00:05:06,806 --> 00:05:07,940
<i> of the human brain.</i>

132
00:05:07,974 --> 00:05:11,143
<i> The brain is the most
 complex system in our body.</i>

133
00:05:11,177 --> 00:05:14,880
<i> It's got 85 billion neurons,
 each of which fire non-stop,</i>

134
00:05:14,914 --> 00:05:19,385
<i> receiving, processing,
 and sending information.</i>

135
00:05:19,418 --> 00:05:22,021
<i> Baby X's brain
 is nowhere near as complex,</i>

136
00:05:22,054 --> 00:05:23,856
<i> but that's the goal.</i>

137
00:05:23,890 --> 00:05:26,625
<i> Instead of neurons,
 it's got nodes.</i>

138
00:05:26,658 --> 00:05:28,494
<i> The more the nodes
 are exposed to,</i>

139
00:05:28,527 --> 00:05:30,129
<i> the more they learn.</i>

140
00:05:30,163 --> 00:05:32,431
[Sagar] What we've learned
is it's very hard to build
a digital brain,

141
00:05:32,465 --> 00:05:34,466
but where we want to go
with it

142
00:05:34,500 --> 00:05:37,503
is we're trying to build
a human-like A.I.

143
00:05:37,537 --> 00:05:39,271
which has
a flexible intelligence

144
00:05:39,305 --> 00:05:40,873
that can relate to people.

145
00:05:41,607 --> 00:05:43,409
I think
the best kind of systems

146
00:05:43,442 --> 00:05:46,179
are when humans and A.I.
work together.

147
00:05:46,212 --> 00:05:49,549
One of the biggest
misconceptions of A.I.

148
00:05:49,582 --> 00:05:52,351
is that there is
a super-intelligent being,

149
00:05:52,384 --> 00:05:54,253
or what we call
a generalized A.I.,

150
00:05:54,286 --> 00:05:56,655
that knows all, can do all,

151
00:05:56,688 --> 00:05:59,124
smarter than all of us
put together.

152
00:05:59,157 --> 00:06:01,093
That is a total misconception.

153
00:06:01,127 --> 00:06:03,629
A.I. is built on us.

154
00:06:03,663 --> 00:06:06,365
A.I. is mimicking
our thought processes.

155
00:06:06,398 --> 00:06:09,569
A.I. is basically
an emulation of us.

156
00:06:11,037 --> 00:06:13,939
[Downey]
<i> Like visionaries before him,
 Mark's a dreamer.</i>

157
00:06:13,973 --> 00:06:16,475
<i> The current state
 of his moonshot, however,</i>

158
00:06:16,508 --> 00:06:17,809
<i> is a little more earthbound.</i>

159
00:06:17,843 --> 00:06:19,845
[computer]<i> Thank you
 for granting access</i>

160
00:06:19,879 --> 00:06:22,448
<i> to your microphone.
 It's good to hear you.</i>

161
00:06:22,481 --> 00:06:23,749
[Downey]<i> Today, most avatars</i>

162
00:06:23,783 --> 00:06:26,652
<i> are basically glorified
 customer-service reps.</i>

163
00:06:26,685 --> 00:06:27,720
[service avatar]
<i> Rest assured,</i>

164
00:06:27,753 --> 00:06:29,455
<i> your health
 is my primary concern.</i>

165
00:06:29,488 --> 00:06:31,290
[Downey]<i> They can answer
 simple questions</i>

166
00:06:31,323 --> 00:06:33,091
<i> and give scripted responses.</i>

167
00:06:33,125 --> 00:06:35,060
<i> I love
 helping our customers,</i>

168
00:06:35,094 --> 00:06:36,863
<i> so I'm keen to keep learning.</i>

169
00:06:36,896 --> 00:06:39,832
[Downey]<i> Beats dealing with
automated phone lines for sure,</i>

170
00:06:39,866 --> 00:06:42,268
<i> but it's a far cry
from Mark's ultimate vision...</i>

171
00:06:42,301 --> 00:06:43,635
[Sagar] Hey, Baby. Hey.

172
00:06:43,669 --> 00:06:46,672
[Downey]<i> ...to create avatars
 that can actually learn,</i>

173
00:06:46,706 --> 00:06:49,341
<i> interpret, and interact
 with the world around them,</i>

174
00:06:49,375 --> 00:06:51,310
<i> like a real human.</i>

175
00:06:51,343 --> 00:06:53,145
What's this?

176
00:06:53,178 --> 00:06:55,080
Spider.

177
00:06:55,114 --> 00:06:58,050
So we're starting
to get a spider forming
in her mind here,

178
00:06:58,083 --> 00:07:00,853
she's starting to associate
the word with the image.

179
00:07:00,887 --> 00:07:03,122
So, Baby... spider.

180
00:07:04,156 --> 00:07:05,324
Spider.

181
00:07:05,357 --> 00:07:06,893
Spider...

182
00:07:06,926 --> 00:07:10,096
Good! Okay, what's this?

183
00:07:10,830 --> 00:07:12,130
[Baby] Spider.

184
00:07:12,164 --> 00:07:14,233
No. This is a duck.

185
00:07:14,266 --> 00:07:15,901
Look at the duck.

186
00:07:15,934 --> 00:07:17,302
[Baby] Duck.

187
00:07:17,336 --> 00:07:18,904
[Sagar] Yeah.

188
00:07:18,937 --> 00:07:22,742
[Downey]
<i> Baby X uses a type of A.I.
 called "object recognition."</i>

189
00:07:23,809 --> 00:07:27,646
<i> Basically,
 it's how a computer sees...</i>

190
00:07:27,680 --> 00:07:29,915
<i> how it identifies an object,
 like a spider,</i>

191
00:07:29,949 --> 00:07:33,485
<i> or tells the difference
 between a spider and a duck.</i>

192
00:07:33,518 --> 00:07:36,255
<i> It's something
that you and I do naturally...</i>

193
00:07:36,288 --> 00:07:39,525
<i> ...but machines, like Baby X,
 need to learn from scratch,</i>

194
00:07:39,558 --> 00:07:42,695
<i> by basically sifting
through enormous piles of data</i>

195
00:07:42,728 --> 00:07:44,096
<i> to search for patterns,</i>

196
00:07:44,130 --> 00:07:46,165
<i> so that eventually,
 it can drive a car,</i>

197
00:07:46,199 --> 00:07:49,001
<i> or pick out a criminal
 in a crowded photograph,</i>

198
00:07:49,035 --> 00:07:52,438
<i> or tell the difference
 between me and... that guy.</i>

199
00:07:52,471 --> 00:07:55,607
[Sagar]
But now I'm gonna tell her
that spiders are scary.

200
00:07:55,641 --> 00:07:59,211
Look out!
Rawr! Scary spider! Rawr!

201
00:07:59,244 --> 00:08:00,813
[crying]

202
00:08:00,846 --> 00:08:03,248
Hey, hey. Don't cry.
It's okay. Hey...

203
00:08:03,282 --> 00:08:04,217
[Baby crying]

204
00:08:04,250 --> 00:08:05,417
Hey, it's okay.

205
00:08:05,451 --> 00:08:08,087
Now she's responding emotionally
to me as well,

206
00:08:08,120 --> 00:08:10,722
so we've gone all the way down

207
00:08:10,756 --> 00:08:14,593
to virtual neurotransmitters,
hormones, and so forth,

208
00:08:14,626 --> 00:08:15,995
so Baby X has a stress system.

209
00:08:16,595 --> 00:08:18,397
If I give her a fright...

210
00:08:18,431 --> 00:08:19,465
Boo!

211
00:08:19,498 --> 00:08:20,733
So we'll see basically

212
00:08:20,766 --> 00:08:22,902
some noradrenaline
was released then,

213
00:08:22,935 --> 00:08:25,737
and she's gone into a much more
vigilant state of mind.

214
00:08:25,771 --> 00:08:27,506
[Downey]
<i> What Mark is working on</i>

215
00:08:27,539 --> 00:08:29,875
<i> is known as
 "affective computing,"</i>

216
00:08:29,908 --> 00:08:33,612
<i> A.I. that interprets
 and simulates human emotion.</i>

217
00:08:33,646 --> 00:08:36,849
I believe that machines
are gonna interact with humans

218
00:08:36,882 --> 00:08:38,917
just the way
we interact with one another,

219
00:08:38,950 --> 00:08:41,120
through perception,
through conversation.

220
00:08:41,153 --> 00:08:44,390
So as A.I.
continues to become mainstream,

221
00:08:44,423 --> 00:08:46,759
it needs
to really understand humans,

222
00:08:46,792 --> 00:08:49,461
and so we want
to build emotion A.I.

223
00:08:49,495 --> 00:08:51,864
that enables machines
to have empathy.

224
00:08:51,897 --> 00:08:53,265
Hello, Pepa.

225
00:08:53,299 --> 00:08:55,233
<i> -Hello.</i>
-[man] Hello.

226
00:08:55,267 --> 00:08:57,036
<i> -Hello.</i>
-Hello.

227
00:08:57,069 --> 00:08:58,303
<i> -Hello.</i>
-[laughing]

228
00:08:58,336 --> 00:08:59,638
Oh, dear.

229
00:08:59,671 --> 00:09:02,041
<i> -We can do this forever.</i>
-I know we could. [laughs]

230
00:09:02,074 --> 00:09:04,009
[Howard] They've showed,
for example,

231
00:09:04,043 --> 00:09:07,112
older adults who have A.I. aides
at their nursing homes,

232
00:09:07,146 --> 00:09:08,647
they are happier

233
00:09:08,680 --> 00:09:10,783
with a robot that emotes
and is social

234
00:09:10,816 --> 00:09:12,851
than having no one there.

235
00:09:12,885 --> 00:09:16,422
That's really the enhancement
of human relationships.

236
00:09:16,455 --> 00:09:19,158
[Sagar] Hey...
Hello.

237
00:09:19,191 --> 00:09:20,826
You know, human cooperation

238
00:09:20,860 --> 00:09:23,763
is the most powerful force
in human history, right?

239
00:09:23,796 --> 00:09:26,065
Human cooperation
with intelligent machines

240
00:09:26,098 --> 00:09:28,868
will define
the next era of history.

241
00:09:28,901 --> 00:09:31,670
Using a machine
which is connected

242
00:09:31,704 --> 00:09:34,040
with the rest of the world
through the Internet,

243
00:09:34,073 --> 00:09:37,476
that can work as a creative,
collaborative partner?

244
00:09:37,510 --> 00:09:39,245
That's unbelievable.

245
00:09:47,820 --> 00:09:50,289
[will.i.am]
Jessica. Jessica.
One more time, one more time.

246
00:09:50,322 --> 00:09:52,792
We're gonna go
from just the first two verses,

247
00:09:52,825 --> 00:09:53,859
and the first two verses

248
00:09:53,892 --> 00:09:56,128
will take us
to three minutes, okay?

249
00:09:56,162 --> 00:09:57,463
I love music.

250
00:09:57,496 --> 00:09:59,665
The whole concept of music
is collaboration,

251
00:09:59,698 --> 00:10:01,734
so if there are some people
that see me as a musician,

252
00:10:01,767 --> 00:10:02,902
that's awesome.

253
00:10:06,706 --> 00:10:08,874
I first
became interested in A.I.

254
00:10:08,907 --> 00:10:11,510
because A.I.
is a very fruitful place
to create in.

255
00:10:11,543 --> 00:10:13,612
It's a new tool for us.

256
00:10:13,645 --> 00:10:16,182
I dream,
and make my dreams reality,

257
00:10:16,215 --> 00:10:17,783
whether the dream is a song

258
00:10:17,816 --> 00:10:20,853
or the dream
is an avatar of myself.

259
00:10:20,886 --> 00:10:24,022
One time, a friend was like,
"Well, you can't clone yourself.

260
00:10:24,055 --> 00:10:25,891
You can't be
in two places at once."

261
00:10:25,925 --> 00:10:28,094
That's the promise
of the avatar.

262
00:10:29,361 --> 00:10:30,563
I left it over there.

263
00:10:30,596 --> 00:10:32,764
All right, here we go.

264
00:10:32,797 --> 00:10:35,000
[Sagar] So, you're about
to enter the Matrix.

265
00:10:35,033 --> 00:10:39,672
I'm gonna sort of direct you
through just a bunch of poses.

266
00:10:39,705 --> 00:10:41,706
[will.i.am]
The team from Soul Machines

267
00:10:41,740 --> 00:10:44,143
is here to create
a digital avatar of myself.

268
00:10:44,176 --> 00:10:46,679
They had to put me
in this huge contraption

269
00:10:46,712 --> 00:10:48,414
with these crazy lights.

270
00:10:49,582 --> 00:10:50,916
What do you want me to do?

271
00:10:50,949 --> 00:10:52,751
[Sagar]
Your face is an instrument.

272
00:10:52,785 --> 00:10:55,421
All the wrinkles on the face
is like a signature,

273
00:10:55,454 --> 00:10:56,789
so we want to get

274
00:10:56,822 --> 00:10:59,758
the highest-quality digital
model of you that we can.

275
00:10:59,792 --> 00:11:01,626
Okay.
[chuckles]

276
00:11:01,660 --> 00:11:03,629
[Sagar] Yeah, that's perfect.
Okay, go.

277
00:11:03,662 --> 00:11:06,131
[rapid shutters snapping]

278
00:11:06,164 --> 00:11:09,935
[Sagar] So we have to capture
all the textures of their face.

279
00:11:09,968 --> 00:11:11,703
The geometry of their face...

280
00:11:11,736 --> 00:11:13,339
Big, gnashy teeth.

281
00:11:13,372 --> 00:11:15,140
How their face deforms

282
00:11:15,173 --> 00:11:17,443
to form the different
facial expressions.

283
00:11:17,476 --> 00:11:18,577
And how about a kiss?

284
00:11:18,610 --> 00:11:19,544
You could do...

285
00:11:19,578 --> 00:11:20,746
With my eyes closed?

286
00:11:20,779 --> 00:11:21,880
'Cause I don't kiss
with my eyes open.

287
00:11:21,913 --> 00:11:23,816
Every once in a while,
I peek.

288
00:11:23,849 --> 00:11:25,450
[cameras snapping]

289
00:11:25,484 --> 00:11:26,785
I wanted to have

290
00:11:26,819 --> 00:11:29,421
a digital avatar
around the idea of Idatity,

291
00:11:29,454 --> 00:11:32,791
and that's the marriage
of my data and my identity.

292
00:11:32,825 --> 00:11:35,193
Everyone's concerned
about, like, identity theft.

293
00:11:35,227 --> 00:11:38,063
Meanwhile,
everybody's giving away
all their data for free

294
00:11:38,097 --> 00:11:38,964
on the Internet.

295
00:11:38,998 --> 00:11:41,633
I'm what I like
and what I don't like,

296
00:11:41,667 --> 00:11:43,635
I'm where I go,
I'm who I know.

297
00:11:43,668 --> 00:11:45,704
I'm what I search.
I am my thumbprint.

298
00:11:45,738 --> 00:11:48,073
I am my data.
That's who I am.

299
00:11:48,106 --> 00:11:49,842
You pull your eyelids down
like that.

300
00:11:49,875 --> 00:11:51,209
We want to get that... yup.

301
00:11:51,243 --> 00:11:53,212
[will.i.am]
When I'm on Instagram
and I'm on Google,

302
00:11:53,245 --> 00:11:56,949
I'm actually programming
those algorithms
to better understand me.

303
00:11:56,982 --> 00:11:58,216
Awesome.

304
00:11:58,250 --> 00:12:00,719
In the future,
my avatar's gonna be
doing all that stuff,

305
00:12:00,752 --> 00:12:02,121
because I'm gonna program it.

306
00:12:02,154 --> 00:12:05,124
Get entertained through it,
get information through it,

307
00:12:05,157 --> 00:12:06,391
and you feel like

308
00:12:06,424 --> 00:12:09,828
you're having a FaceTime
with an intelligent entity.

309
00:12:09,861 --> 00:12:11,564
[laughing]
"Yo, check out this link."

310
00:12:11,597 --> 00:12:12,864
"Oh, wow, that's crazy."

311
00:12:12,898 --> 00:12:15,467
"Yo, can you post that
on my Twitter?"

312
00:12:15,501 --> 00:12:16,368
[laughter]

313
00:12:17,803 --> 00:12:19,838
-Hey.
-Hey.

314
00:12:19,872 --> 00:12:22,708
All right,
I'm the Soul Machines
lead audio engineer.

315
00:12:22,741 --> 00:12:26,412
Hopefully
we'll be able to build
an A.I. version of your voice.

316
00:12:26,445 --> 00:12:29,147
After creating Will's look,

317
00:12:29,180 --> 00:12:31,817
then we now
have to create his voice.

318
00:12:31,850 --> 00:12:34,820
For that, we actually have
to capture a lot of samples

319
00:12:34,853 --> 00:12:36,388
about how Will speaks,

320
00:12:36,422 --> 00:12:39,124
and that's actually
quite a challenging process.

321
00:12:39,157 --> 00:12:41,460
-Shall we kick off?
-Yeah, let's kick off.

322
00:12:41,493 --> 00:12:42,995
-A'ight, boo, here we go.
-Yeah.

323
00:12:43,028 --> 00:12:44,897
I'm Will,
and I'm happy to meet you.

324
00:12:44,930 --> 00:12:47,666
I'm here to bring
technology to life,

325
00:12:47,699 --> 00:12:50,468
and let's talk about
Artificial Intelligence.

326
00:12:50,502 --> 00:12:53,405
Oops. Really? Whoa.

327
00:12:53,438 --> 00:12:54,773
That's dope!

328
00:12:54,806 --> 00:12:57,142
So there's so many ways
of saying "dope," bro.

329
00:12:57,175 --> 00:12:58,010
Yeah, yeah.

330
00:12:58,043 --> 00:12:59,945
Now, how realistic
is it going to be?

331
00:12:59,978 --> 00:13:01,647
This will sound like you.

332
00:13:01,680 --> 00:13:04,717
The sentences
can be divided up into parts

333
00:13:04,750 --> 00:13:06,618
so that we can create words

334
00:13:06,651 --> 00:13:08,787
and build sentences,
like LEGO blocks.

335
00:13:08,820 --> 00:13:11,356
It will sound
exactly like you.

336
00:13:11,390 --> 00:13:13,993
Well, maybe we don't want
to have it too accurate.

337
00:13:14,026 --> 00:13:18,663
So you don't freak people out,
maybe I don't want it accurate.

338
00:13:18,697 --> 00:13:20,832
Maybe, there should be
some type of...

339
00:13:20,865 --> 00:13:21,934
"That's the A.I.,"

340
00:13:21,967 --> 00:13:23,568
'cause this
is all new ground.

341
00:13:23,602 --> 00:13:25,070
-Yeah.
-Like, we've...

342
00:13:25,104 --> 00:13:27,206
we are in an intersection
of a place

343
00:13:27,239 --> 00:13:28,907
that we've never been
in society,

344
00:13:28,940 --> 00:13:31,109
where people have to determine

345
00:13:31,143 --> 00:13:33,412
what's real
and what's not.

346
00:13:35,480 --> 00:13:37,382
[Downey]<i> While Mark
 jets back to New Zealand</i>

347
00:13:37,416 --> 00:13:39,918
<i> to try to create
 Will's digital doppelganger,</i>

348
00:13:39,951 --> 00:13:42,387
<i> Will's left waiting,
 and wondering...</i>

349
00:13:42,420 --> 00:13:44,522
<i> can Mark pull this off?</i>

350
00:13:44,556 --> 00:13:45,758
<i> What does it mean</i>

351
00:13:45,791 --> 00:13:47,659
<i> to have a lifelike
 avatar of you?</i>

352
00:13:47,692 --> 00:13:50,696
<i> A digital replicant
 of yourself?</i>

353
00:13:50,729 --> 00:13:52,498
<i> Is that a good idea?</i>

354
00:13:52,531 --> 00:13:54,466
<i> How far is too far?</i>

355
00:13:54,499 --> 00:13:56,735
[Domingos] We've been
collaborating with machines

356
00:13:56,768 --> 00:13:58,370
since the dawn of technology.

357
00:13:58,404 --> 00:14:00,138
I mean, even today,

358
00:14:00,171 --> 00:14:02,274
in some sense,
we are all cyborgs already.

359
00:14:02,307 --> 00:14:03,775
For example,

360
00:14:03,809 --> 00:14:06,144
you use OKCupid
to find a date,

361
00:14:06,177 --> 00:14:09,514
and then you use Yelp
to decide where to go, you know,

362
00:14:09,547 --> 00:14:10,882
what restaurant to go to,

363
00:14:10,915 --> 00:14:12,451
and then
you start driving your car,

364
00:14:12,484 --> 00:14:15,220
but there's a GPS system that
actually tells you where to go.

365
00:14:15,254 --> 00:14:17,923
So the human
and the machine decision-making

366
00:14:17,956 --> 00:14:19,491
are very tightly interwoven,

367
00:14:19,525 --> 00:14:22,294
and I think this will
only increase as we go forward.

368
00:14:25,764 --> 00:14:29,501
[Downey]
<i> Human collaboration
 with intelligent machines...</i>

369
00:14:29,535 --> 00:14:31,770
<i> A different musician
 in a different town</i>

370
00:14:31,803 --> 00:14:32,871
<i> with a different approach</i>

371
00:14:32,904 --> 00:14:34,940
<i> is giving the same problem
 a shot.</i>

372
00:14:34,973 --> 00:14:36,775
[Gil Weinberg]
People are concerned

373
00:14:36,808 --> 00:14:38,710
about A.I. replacing humans,

374
00:14:38,743 --> 00:14:40,445
and I think
it is not only

375
00:14:40,478 --> 00:14:42,948
not going to replace humans,
it's going to enhance humans.

376
00:14:45,784 --> 00:14:48,119
I'm Gil Weinberg.
I'm the founding director

377
00:14:48,153 --> 00:14:50,522
of Georgia Tech Center
for Music Technology.

378
00:14:50,556 --> 00:14:51,790
[plays piano]

379
00:14:51,823 --> 00:14:53,992
Ready?

380
00:14:54,025 --> 00:14:57,295
In my lab, we are trying
to create the new technologies

381
00:14:57,329 --> 00:15:00,799
that will explore
new ways to be expressive...

382
00:15:00,832 --> 00:15:02,801
to be creative...

383
00:15:02,835 --> 00:15:05,571
Shimon,
it's a marimba-playing robot.

384
00:15:05,604 --> 00:15:08,006
[playing marimba]

385
00:15:08,039 --> 00:15:11,643
What it does
is listen to humans playing,

386
00:15:11,676 --> 00:15:14,346
and it can improvise.

387
00:15:15,681 --> 00:15:18,350
Shimon is
our first robotic musician

388
00:15:18,383 --> 00:15:20,753
that has the ability
to find patterns,

389
00:15:20,786 --> 00:15:22,054
so, machine learning.

390
00:15:23,956 --> 00:15:25,190
Machine learning

391
00:15:25,224 --> 00:15:28,326
is the ability
to find patterns in data.

392
00:15:28,360 --> 00:15:31,463
So, for example,
if we feed Shimon
Miles Davis,

393
00:15:31,496 --> 00:15:32,463
it will try to see

394
00:15:32,497 --> 00:15:34,766
what note is he likely to play
after what note,

395
00:15:34,799 --> 00:15:38,103
and once it finds its patterns,
it can start to manipulate it,

396
00:15:38,136 --> 00:15:40,205
and I can have the robot
playing in a style

397
00:15:40,239 --> 00:15:43,575
that maybe is 30% Miles Davis,
30% Bach,

398
00:15:43,609 --> 00:15:46,377
30% Madonna,
and 10% my own,

399
00:15:46,411 --> 00:15:50,048
and create morphing of music
that humans would never create.

400
00:15:50,082 --> 00:15:51,450
[band playing tune]

401
00:15:55,320 --> 00:15:56,688
[Downey]
<i> Gil's groundbreaking work</i>

402
00:15:56,722 --> 00:15:59,725
<i> in artificial creativity
 and musical expression</i>

403
00:15:59,758 --> 00:16:02,561
<i> has been performed
 by symphonies
 around the world...</i>

404
00:16:03,695 --> 00:16:04,997
<i> ...but his innovation</i>

405
00:16:05,030 --> 00:16:07,566
<i> also caught the attention
 of another musician...</i>

406
00:16:07,599 --> 00:16:08,634
Okay.

407
00:16:08,667 --> 00:16:10,869
[Downey]<i> ...a guy who
 unexpectedly pushed Gil</i>

408
00:16:10,902 --> 00:16:12,671
<i> beyond enhancing robots</i>

409
00:16:12,704 --> 00:16:15,040
<i> to augmenting humans.</i>

410
00:16:15,073 --> 00:16:17,809
[Weinberg] I met Jason Barnes
about six years ago,

411
00:16:17,842 --> 00:16:20,912
when I was just about finishing
one phase of developing Shimon,

412
00:16:20,945 --> 00:16:24,549
and I was starting to think,
"What's next?"

413
00:16:24,583 --> 00:16:27,819
[Barnes] I got my first
drum kit when I was 15,
on Christmas,

414
00:16:27,852 --> 00:16:30,021
and when I lost my limb,
I was 22,

415
00:16:30,054 --> 00:16:32,358
so I was kind of used
to having two limbs.

416
00:16:34,392 --> 00:16:36,995
I started trying
to fabricate prosthetics

417
00:16:37,028 --> 00:16:38,697
to try
and get me back on the kit,

418
00:16:38,730 --> 00:16:41,934
which eventually led me
to working and collaborating
with Georgia Tech.

419
00:16:41,967 --> 00:16:44,235
[playing drums]

420
00:16:44,269 --> 00:16:46,772
[Weinberg] He told me
that he lost his arm,

421
00:16:46,805 --> 00:16:48,606
he was devastated,
he was depressed,

422
00:16:48,640 --> 00:16:49,808
music was his life,

423
00:16:49,841 --> 00:16:53,078
and he said,
"I saw that you develop
robotic musicians.

424
00:16:53,112 --> 00:16:55,080
Can you use some
of the technology that you have

425
00:16:55,113 --> 00:16:59,418
in order to allow me
to play again like I used to?"

426
00:16:59,451 --> 00:17:02,854
So that's the prosthetic arm
that we built for Jason.

427
00:17:02,888 --> 00:17:04,289
When he came to us,

428
00:17:04,323 --> 00:17:06,925
he just wanted to be able
to use sensors here

429
00:17:06,959 --> 00:17:09,961
so he can hold the stick
tight or loose.

430
00:17:09,994 --> 00:17:12,164
I suggested "Let's do that,
but also,

431
00:17:12,197 --> 00:17:13,398
let's have two sticks.

432
00:17:13,431 --> 00:17:15,534
One stick can operate
with a mind of its own,

433
00:17:15,567 --> 00:17:17,435
understanding the music
and improvising.

434
00:17:17,468 --> 00:17:20,305
One stick can operate based on
what you tell it
with your muscle,

435
00:17:20,338 --> 00:17:23,242
and also, each one of the sticks
can play 20 hertz...

436
00:17:24,710 --> 00:17:26,111
...faster than any humans,

437
00:17:26,144 --> 00:17:27,880
and together,
they can create polyrhythm,

438
00:17:27,913 --> 00:17:31,516
create all kind of textures
that humans cannot create."

439
00:17:31,550 --> 00:17:33,485
All right.
I think we're ready to play.

440
00:17:33,518 --> 00:17:35,554
[all playing tune]

441
00:17:38,123 --> 00:17:40,325
[Downey]<i> In some ways,
 the robotic drum arm</i>

442
00:17:40,359 --> 00:17:43,195
<i> allows Jason to play
 better than he ever has,</i>

443
00:17:43,228 --> 00:17:45,196
<i> but it still lacks
 the true function,</i>

444
00:17:45,230 --> 00:17:47,732
<i> or feeling,
 of a human hand.</i>

445
00:17:47,765 --> 00:17:49,033
[Weinberg] They don't provide

446
00:17:49,067 --> 00:17:51,469
the kind of dexterity
and subtle control

447
00:17:51,503 --> 00:17:53,806
that would
really allow anything.

448
00:17:55,607 --> 00:17:56,541
[Downey]<i> This revelation</i>

449
00:17:56,574 --> 00:17:58,877
<i> drove Gil
 to his next innovation...</i>

450
00:17:58,910 --> 00:18:02,146
<i> the Skywalker Hand.</i>

451
00:18:02,180 --> 00:18:04,382
<i> Inspired by Luke Skywalker
 from</i> Star Wars,

452
00:18:04,416 --> 00:18:07,252
<i> and created in collaboration
 with Jason,</i>

453
00:18:07,285 --> 00:18:09,187
<i> the revolutionary tech</i>

454
00:18:09,221 --> 00:18:11,690
<i> brings what was once
 the realm of sci-fi</i>

455
00:18:11,723 --> 00:18:13,725
<i>a little closer to our galaxy.</i>

456
00:18:13,758 --> 00:18:15,727
[Barnes] This is just like
a 3D-printed hand

457
00:18:15,760 --> 00:18:18,263
that you can, like,
download the files online.

458
00:18:18,297 --> 00:18:20,766
[Downey]<i> Currently,
most advanced prosthetic hands</i>

459
00:18:20,799 --> 00:18:24,135
<i> can't even thumbs-up
 or flip you the bird.</i>

460
00:18:24,168 --> 00:18:26,805
<i> They can only open or grip,</i>

461
00:18:26,839 --> 00:18:28,774
<i> using all five fingers
 at once.</i>

462
00:18:28,807 --> 00:18:32,210
Most of the prosthetics
that are available
on the market nowadays,

463
00:18:32,243 --> 00:18:34,279
um, actually use
EMG technology,

464
00:18:34,312 --> 00:18:35,814
which stands
for "electromyography,"

465
00:18:35,847 --> 00:18:38,616
and essentially what it does
is there are two sensors

466
00:18:38,650 --> 00:18:40,919
that make contact
with my residual limb,

467
00:18:40,952 --> 00:18:43,755
and they pick up electrical
signals from the muscles...

468
00:18:43,788 --> 00:18:46,090
So again, when I flex
and extend my residual limb,

469
00:18:46,124 --> 00:18:47,959
it will open
and close the hand,

470
00:18:47,992 --> 00:18:50,195
um, and I can rotate as well,

471
00:18:50,228 --> 00:18:51,763
but the problem with EMG

472
00:18:51,796 --> 00:18:54,999
is it's a very vague
electrical signal,
so zero to 100%.

473
00:18:55,032 --> 00:18:56,468
It's not very accurate at all.

474
00:18:56,502 --> 00:18:59,437
The Skywalker Hand
actually uses ultrasound tech.

475
00:18:59,470 --> 00:19:00,972
Ultrasound provides an image,

476
00:19:01,005 --> 00:19:04,443
and you can see
everything that's going on
inside of the arm.

477
00:19:04,476 --> 00:19:07,312
[Downey]<i> Ultrasound
uses high-frequency sound waves</i>

478
00:19:07,345 --> 00:19:10,182
<i> to capture live images
 from inside the body.</i>

479
00:19:11,282 --> 00:19:12,984
<i> As Jason flexes his muscles</i>

480
00:19:13,018 --> 00:19:14,786
<i> to move each
 of his missing fingers,</i>

481
00:19:14,819 --> 00:19:19,458
<i> ultrasound generates
 live images
 that visualize his intention.</i>

482
00:19:20,459 --> 00:19:23,128
<i> The A.I.
 then uses machine learning</i>

483
00:19:23,161 --> 00:19:24,462
<i> to predict patterns,</i>

484
00:19:24,496 --> 00:19:26,531
<i> letting a man who's lost
 one of his hands</i>

485
00:19:26,564 --> 00:19:29,400
<i> move all five of his fingers
 individually,</i>

486
00:19:29,434 --> 00:19:32,270
<i> even if he's as unpredictable
 as Keith Moon.</i>

487
00:19:32,304 --> 00:19:34,138
[Howard]
The work that Gil is doing

488
00:19:34,172 --> 00:19:35,440
is really important.

489
00:19:35,474 --> 00:19:37,909
Gil comes from
a non-engineering background,

490
00:19:37,943 --> 00:19:39,912
which means that his technology

491
00:19:39,945 --> 00:19:42,013
and the way
he thinks about robotics

492
00:19:42,046 --> 00:19:43,315
is actually quite different

493
00:19:43,348 --> 00:19:44,816
than, say, the way
I would think about it,

494
00:19:44,849 --> 00:19:46,718
since I come from
an engineering background.

495
00:19:46,751 --> 00:19:49,521
And the commonality is
that we want to design robots

496
00:19:49,554 --> 00:19:52,491
to really impact
and make a difference
in the world.

497
00:19:53,992 --> 00:19:56,261
[Weinberg] We were able
to create a proof of concept

498
00:19:56,294 --> 00:19:57,663
with Jason Barnes.

499
00:19:57,696 --> 00:20:01,433
Once we discovered that
we can do this with ultrasound,

500
00:20:01,466 --> 00:20:03,234
immediately I looked at,

501
00:20:03,268 --> 00:20:05,370
"Hey, let's try
to help more people."

502
00:20:10,208 --> 00:20:12,978
[Jay Schneider] That's okay,
just leave me hanging,
holding it.

503
00:20:13,011 --> 00:20:14,346
It's not heavy or anything.

504
00:20:14,379 --> 00:20:15,447
[Barnes] It's safe,
if you want to slide it back...

505
00:20:15,480 --> 00:20:17,349
No, no.
I'm messing with you.

506
00:20:17,382 --> 00:20:18,583
So I met Jason Barnes

507
00:20:18,617 --> 00:20:20,785
at an event called
"Lucky Fin Weekend."

508
00:20:20,819 --> 00:20:23,922
They're a foundation that deals
with limb difference.

509
00:20:23,955 --> 00:20:25,290
There we go.

510
00:20:25,323 --> 00:20:27,526
-Ah, all right.
-And it's out.

511
00:20:27,559 --> 00:20:29,361
[Schneider]
Do you ever work on your car

512
00:20:29,394 --> 00:20:30,762
without the hook?

513
00:20:30,795 --> 00:20:33,765
Not really. It's just way easier
and more efficient for me to...

514
00:20:33,798 --> 00:20:36,334
The hook, the hook really
trips me out, though, man.

515
00:20:36,368 --> 00:20:38,503
[Schneider]
When I lost my hand,

516
00:20:38,536 --> 00:20:39,938
it was close to 30 years ago,

517
00:20:39,971 --> 00:20:42,574
and prosthetics were
kind of stuck in the Dark Ages.

518
00:20:42,607 --> 00:20:44,710
[rock drums and bass playing]

519
00:20:47,879 --> 00:20:50,782
In general, they didn't
really do a whole lot,

520
00:20:50,815 --> 00:20:52,049
and even if they moved,

521
00:20:52,083 --> 00:20:55,821
they seemed to be more passive
than actually worthwhile to use.

522
00:20:58,757 --> 00:21:01,059
I don't like to talk
about my accident,

523
00:21:01,092 --> 00:21:03,495
because I don't feel
it defines me.

524
00:21:03,528 --> 00:21:05,330
The narrative
on limb-different people

525
00:21:05,363 --> 00:21:07,398
has been the accident.

526
00:21:07,432 --> 00:21:10,335
"This is what happened,
and these are these sad things,"

527
00:21:10,368 --> 00:21:12,771
and it becomes
inspiration porn.

528
00:21:14,839 --> 00:21:17,309
For me, for example, right,
if I do something,

529
00:21:17,342 --> 00:21:19,444
I have to, like,
smash it out of the park,

530
00:21:19,477 --> 00:21:21,079
because otherwise I feel like
there's gonna be this,

531
00:21:21,112 --> 00:21:24,149
"Oh, well, he did it good enough
because he's missing his hand."

532
00:21:24,182 --> 00:21:25,950
-Yeah, yeah.
-And I'm like, "F that!"

533
00:21:25,983 --> 00:21:29,520
Like, I want to...
I'm gonna be as good or better
than somebody with two hands

534
00:21:29,554 --> 00:21:31,023
doing whatever I'm doing,
you know?

535
00:21:32,390 --> 00:21:34,359
Prosthetics,
at this point in my life,

536
00:21:34,392 --> 00:21:37,829
don't really seem like something
I would want or need.

537
00:21:37,862 --> 00:21:39,998
[Weinberg]
Manual robotic prosthetics

538
00:21:40,031 --> 00:21:41,800
have not been adopted well.

539
00:21:41,833 --> 00:21:42,867
Amputees try them,

540
00:21:42,900 --> 00:21:44,736
and then they don't
continue to use them.

541
00:21:50,742 --> 00:21:53,078
[Barnes] Yeah, man,
you stoked to check out the lab?

542
00:21:53,111 --> 00:21:54,412
Yeah, yeah, for sure.

543
00:21:54,446 --> 00:21:57,282
Right now,
I'm the only amputee
that's ever used

544
00:21:57,315 --> 00:21:58,649
the Skywalker Arm before.

545
00:21:58,683 --> 00:22:00,618
Did you have...
were you right-handed?

546
00:22:00,651 --> 00:22:01,987
No, I was born
left-handed, actually.

547
00:22:02,020 --> 00:22:03,454
Oh, you lucky bastard.

548
00:22:03,487 --> 00:22:05,256
-Yeah, I know, right?
-I was right-handed.

549
00:22:05,289 --> 00:22:06,958
[Barnes] It was
extremely important

550
00:22:06,991 --> 00:22:09,561
to get as many different people
as we can in there,

551
00:22:09,594 --> 00:22:10,928
including other amputees.

552
00:22:10,962 --> 00:22:13,798
It's hard to find people
that are amputees in general,

553
00:22:13,832 --> 00:22:16,768
and then, like,
upper-extremity amputees
is the next thing,

554
00:22:16,801 --> 00:22:18,470
and then finding people
who are willing

555
00:22:18,503 --> 00:22:20,238
to step out
of their comfort zone

556
00:22:20,271 --> 00:22:22,106
-and then do this.
-Right.

557
00:22:22,139 --> 00:22:23,575
[Schneider] When I met Jason,

558
00:22:23,608 --> 00:22:26,545
I found it really interesting
that we had a lot in common,

559
00:22:26,578 --> 00:22:30,014
because we were both into cars,
we were both into music.

560
00:22:30,048 --> 00:22:31,783
-Hi, Gil.
-Hey. What's up?

561
00:22:31,816 --> 00:22:33,951
-Jason. Nice to meet ya.
-Nice meeting you.

562
00:22:33,985 --> 00:22:36,854
He's a step or two
ahead of me
with the technology stuff.

563
00:22:36,888 --> 00:22:39,724
[Barnes]
The way this hand works
is it essentially picks up

564
00:22:39,758 --> 00:22:42,093
the ultrasound signals
from my residual limb,

565
00:22:42,127 --> 00:22:43,862
so when I move my index finger,

566
00:22:43,895 --> 00:22:45,129
it'll move my index...

567
00:22:45,162 --> 00:22:47,232
ring...

568
00:22:47,265 --> 00:22:48,799
[Schneider]
Wow, for the first time,

569
00:22:48,833 --> 00:22:50,502
prosthetics are finally
getting to the point

570
00:22:50,535 --> 00:22:52,270
where they're getting
pretty close

571
00:22:52,303 --> 00:22:54,239
to an actual human hand.

572
00:22:54,272 --> 00:22:55,907
You know, it got me excited.
I was like,

573
00:22:55,941 --> 00:22:58,143
"This is the type of thing
that I've been waiting for."

574
00:22:58,176 --> 00:22:59,877
If I was ever going
to try one again,

575
00:22:59,910 --> 00:23:02,881
this would be the type of stuff
that I would want to check out.

576
00:23:02,914 --> 00:23:04,081
When I move my thumb...

577
00:23:04,115 --> 00:23:05,917
[laughter]

578
00:23:08,286 --> 00:23:10,354
I know from experience

579
00:23:10,388 --> 00:23:12,357
that it's not always
working perfectly.

580
00:23:12,391 --> 00:23:14,459
It's very interesting for me
to have someone else

581
00:23:14,492 --> 00:23:16,327
who comes
and tries our technology

582
00:23:16,360 --> 00:23:18,130
to see
if it can be generalized.

583
00:23:20,665 --> 00:23:23,434
Is my arm getting warmer
because you're wrapping it,

584
00:23:23,468 --> 00:23:24,836
or does that have
heat in it?

585
00:23:24,870 --> 00:23:26,737
-It does have heat in it.
-Oh, okay.

586
00:23:26,771 --> 00:23:29,741
First thing we need,
if we're gonna get Jay
to try the hand,

587
00:23:29,774 --> 00:23:32,310
is we need to get
a custom-fit socket to his arm

588
00:23:32,343 --> 00:23:34,546
that's comfortable
and fits nice and snug.

589
00:23:34,579 --> 00:23:36,347
You comfortable
when they do this?

590
00:23:36,381 --> 00:23:38,316
This is the most awkward part
for me.

591
00:23:38,349 --> 00:23:40,652
-Nah, it was kinda weird.
-Ah, yeah. Yeah.

592
00:23:40,685 --> 00:23:42,787
I was 12 years old
when I lost my hand

593
00:23:42,820 --> 00:23:44,689
and had a prosthetic
for six months,

594
00:23:44,723 --> 00:23:46,925
and pretty much ever since then,
I haven't used it,

595
00:23:46,958 --> 00:23:48,793
and it's been
close to 30 years now.

596
00:23:48,827 --> 00:23:50,828
And there's the impression
of your arm.

597
00:23:50,862 --> 00:23:52,430
That's way easier
than I thought it was gonna be.

598
00:23:52,463 --> 00:23:53,732
That's wild, yeah!

599
00:23:53,765 --> 00:23:55,966
It may not be right for me,
but this is something

600
00:23:56,000 --> 00:23:58,503
that could really, really
help people's lives.

601
00:23:58,536 --> 00:23:59,805
It would be really cool

602
00:23:59,838 --> 00:24:03,008
to have a hand in helping
to develop the technology.

603
00:24:04,242 --> 00:24:06,411
All right.

604
00:24:06,444 --> 00:24:07,779
All right, ready?

605
00:24:08,580 --> 00:24:10,448
Just slide it in.

606
00:24:10,481 --> 00:24:12,150
Turn this... tighten.

607
00:24:12,183 --> 00:24:13,451
[knob ratcheting]

608
00:24:13,485 --> 00:24:14,652
How tight?

609
00:24:14,686 --> 00:24:16,788
As tight as you can
before it really hurts...

610
00:24:16,821 --> 00:24:18,356
-Oh, really?
-...because the tighter it is,

611
00:24:18,389 --> 00:24:20,491
-the better reading we'll see.
-Okay.

612
00:24:20,525 --> 00:24:22,360
-Now we apply the probe...
-Okay.

613
00:24:22,394 --> 00:24:24,062
...so it can
read your movements.

614
00:24:24,095 --> 00:24:25,229
Now we also

615
00:24:25,263 --> 00:24:27,432
have to work on the algorithm
and the machine learning,

616
00:24:27,465 --> 00:24:29,300
and for this,
we will need you to train.

617
00:24:29,333 --> 00:24:30,701
Okay.

618
00:24:30,735 --> 00:24:33,137
An able-bodied person,
when you move your finger,

619
00:24:33,171 --> 00:24:35,039
you're not thinking
about moving your finger,

620
00:24:35,072 --> 00:24:37,708
you just do it, because
that's how we're hardwired,

621
00:24:37,742 --> 00:24:39,544
but, honestly,
I don't really remember

622
00:24:39,577 --> 00:24:41,646
what it was like
to even have that hand.

623
00:24:41,679 --> 00:24:44,149
[Weinberg] Even though
an amputee doesn't have a thumb,

624
00:24:44,182 --> 00:24:45,717
they still have the muscle.

625
00:24:45,750 --> 00:24:48,319
You still have
some kind of memory

626
00:24:48,352 --> 00:24:49,888
of how
you moved your fingers,

627
00:24:49,921 --> 00:24:52,490
and you can think about
moving your phantom fingers,

628
00:24:52,523 --> 00:24:54,525
and the muscles
would move accordingly,

629
00:24:54,559 --> 00:24:56,694
and that's exactly what we use
in order to, uh,

630
00:24:56,727 --> 00:24:59,564
recreate the motion
and put it in a prosthetic arm.

631
00:24:59,598 --> 00:25:03,534
But does Jay still remember
how to move fingers

632
00:25:03,568 --> 00:25:06,204
that he hasn't had
for, I believe, 30 years?

633
00:25:06,237 --> 00:25:08,606
Now we'll run the model,

634
00:25:08,640 --> 00:25:10,708
and you'll be able
to control the hand.

635
00:25:10,741 --> 00:25:13,444
[chuckles] You're optimistic.
I'm crossing fingers.

636
00:25:13,478 --> 00:25:15,379
Can I cross these fingers?
[laughs]

637
00:25:15,413 --> 00:25:17,281
Is that...
is that an option yet?

638
00:25:17,314 --> 00:25:19,750
Having Jay here for a day

639
00:25:19,783 --> 00:25:21,752
and hoping to get him to a point

640
00:25:21,786 --> 00:25:23,687
that he controls
finger by finger,

641
00:25:23,721 --> 00:25:25,923
I'm a little concerned
that it will not work

642
00:25:25,957 --> 00:25:28,026
in such a short period of time.

643
00:25:28,059 --> 00:25:29,794
Okay. And...

644
00:25:29,828 --> 00:25:32,730
-Ready?
-Yeah. You should try
each of the fingers.

645
00:25:32,763 --> 00:25:34,299
All right, that's the thumb...

646
00:25:35,500 --> 00:25:37,869
-Oh, shit!
-Unbelievable.

647
00:25:39,404 --> 00:25:41,338
All right, index...

648
00:25:41,372 --> 00:25:42,640
Yay!

649
00:25:42,674 --> 00:25:44,008
Wow, I'm surprised.

650
00:25:44,041 --> 00:25:46,043
Middle...

651
00:25:46,077 --> 00:25:47,245
[Barnes] Dude.

652
00:25:50,181 --> 00:25:51,917
Five for five?

653
00:25:53,918 --> 00:25:56,320
-[all cheering]
-All five of them!

654
00:25:56,353 --> 00:25:57,488
-Whoa.
-That's wild.

655
00:25:57,522 --> 00:25:59,324
All right,
let me do it again.

656
00:25:59,357 --> 00:26:00,625
You're a natural, man.

657
00:26:00,658 --> 00:26:02,527
Doesn't that feel crazy?

658
00:26:02,560 --> 00:26:04,295
-Yeah!
-Feels wild.

659
00:26:04,328 --> 00:26:06,931
-I didn't think
it'd be as good.
-I didn't either.

660
00:26:06,965 --> 00:26:09,234
He hit me in the back
after it worked, so...

661
00:26:09,267 --> 00:26:10,501
That's the first time.

662
00:26:10,535 --> 00:26:13,471
[Schneider]
It's like a game-changer,
even in its infancy,

663
00:26:13,504 --> 00:26:14,739
which is kind of insane,

664
00:26:14,772 --> 00:26:16,975
because it can
only get better from there.

665
00:26:17,008 --> 00:26:19,644
And it's really cool
to play a small part in that.

666
00:26:19,677 --> 00:26:21,946
[Weinberg]
Now we have two main goals.

667
00:26:21,979 --> 00:26:24,949
First,
you need to move your muscle
or your phantom finger,

668
00:26:24,983 --> 00:26:28,219
and immediately see a response,
so this is one direction
of research.

669
00:26:28,253 --> 00:26:31,455
The other direction
is to make it more accurate.

670
00:26:31,489 --> 00:26:33,157
Being able to type
on a keyboard,

671
00:26:33,191 --> 00:26:35,960
use a computer mouse,
uh, open a water bottle,

672
00:26:35,993 --> 00:26:38,262
things like that that
most people take for granted.

673
00:26:38,296 --> 00:26:41,933
It's kind of like a...
you know, sci-fi movie,
soon to be written.

674
00:26:41,966 --> 00:26:44,435
-[laughter]
-Give us five, right?

675
00:26:44,469 --> 00:26:46,805
That's awkward...
oh, robot to robot hand.

676
00:26:46,838 --> 00:26:47,906
Nice!

677
00:26:49,173 --> 00:26:51,676
-That's...
that was real, right?
-Yeah.

678
00:26:51,710 --> 00:26:53,877
If I find out you guys
had a button under that desk...

679
00:26:53,911 --> 00:26:56,046
No, nah, I promise.
I promise.

680
00:26:56,080 --> 00:26:58,616
[Downey]
<i>What began as one man's pursuit</i>

681
00:26:58,650 --> 00:27:01,185
<i> to innovate music
 through A.I. and robotics</i>

682
00:27:01,218 --> 00:27:04,222
<i> unexpectedly became
 something much greater.</i>

683
00:27:05,689 --> 00:27:08,259
<i> A human body
cooperating with a bionic hand</i>

684
00:27:08,292 --> 00:27:09,927
<i> is one thing...</i>

685
00:27:09,961 --> 00:27:12,030
<i> but is it possible
 to humanize a machine</i>

686
00:27:12,063 --> 00:27:15,066
<i> to the point that
 it truly seems lifelike?</i>

687
00:27:15,099 --> 00:27:18,937
<i> Or is that still sci-fi,
 and far, far away?</i>

688
00:27:25,009 --> 00:27:26,611
[Greg]
How did things go with Will?

689
00:27:26,645 --> 00:27:29,280
[Sagar] You know, one of
the real challenges there

690
00:27:29,313 --> 00:27:30,881
was just getting enough material

691
00:27:30,915 --> 00:27:33,084
that we could actually
come back with.

692
00:27:33,117 --> 00:27:36,754
We can't possibly capture
somebody's real personality,

693
00:27:36,787 --> 00:27:38,156
you know, that's impossible,

694
00:27:38,189 --> 00:27:40,324
but in order
for it to really work,

695
00:27:40,357 --> 00:27:44,195
it's really important
to capture a feeling of Will.

696
00:27:44,228 --> 00:27:45,463
Right, so...

697
00:27:45,497 --> 00:27:48,733
[Downey]<i> Will's avatar
 is actually Mark's first go</i>

698
00:27:48,767 --> 00:27:51,802
<i> at creating a digital copy
 of a real person.</i>

699
00:27:51,836 --> 00:27:53,838
Wow, that's looking
pretty good.

700
00:27:53,871 --> 00:27:56,441
[Downey]<i> He's not just
 trying to clone a human,</i>

701
00:27:56,474 --> 00:27:57,641
<i> by any stretch,</i>

702
00:27:57,675 --> 00:27:59,710
<i> but trying to create
 an artificial stand-in</i>

703
00:27:59,743 --> 00:28:01,746
<i> that's somewhat believable.</i>

704
00:28:01,779 --> 00:28:04,716
<i> Still, like most firsts,
 it's bumpy,</i>

705
00:28:04,749 --> 00:28:07,017
<i> and it's a cautious road
 into the unknown.</i>

706
00:28:07,051 --> 00:28:09,420
[tech] A big challenge
that I've found

707
00:28:09,453 --> 00:28:11,288
while I've been looking
through a lot of the images

708
00:28:11,322 --> 00:28:14,358
is it seems that Will was
moving a lot during the shots.

709
00:28:14,391 --> 00:28:16,928
[Colin Hodges] Okay. When
we're building digital Will,

710
00:28:16,961 --> 00:28:19,130
we have about eight artists
on our team

711
00:28:19,163 --> 00:28:20,064
that come together

712
00:28:20,097 --> 00:28:22,066
and pull all
of the different components

713
00:28:22,100 --> 00:28:24,234
to bring together
this real-time character

714
00:28:24,268 --> 00:28:26,637
that's driven by
the artificial intelligence

715
00:28:26,670 --> 00:28:29,107
to behave
like Will behaves.

716
00:28:29,707 --> 00:28:31,843
Big challenges we've got

717
00:28:31,876 --> 00:28:34,278
is how we create
Will's personality.

718
00:28:34,312 --> 00:28:35,546
Yeah. Like, the liveliness

719
00:28:35,580 --> 00:28:37,314
and the energy
that he generates,

720
00:28:37,348 --> 00:28:38,316
and the excitement.

721
00:28:39,950 --> 00:28:41,753
The facial hair
was a challenge.

722
00:28:41,786 --> 00:28:44,121
Because it's so sparse,
it's quite tricky to get

723
00:28:44,155 --> 00:28:46,090
the hair separated
from the skin.

724
00:28:46,124 --> 00:28:48,493
[Sagar] We have
to be able to synthesize

725
00:28:48,526 --> 00:28:51,829
the sort of feel that
you're interacting with Will.

726
00:28:51,862 --> 00:28:54,599
So, Teah,
I've got some stuff to hear.

727
00:28:54,632 --> 00:28:56,767
We've got 16 variations.

728
00:28:56,801 --> 00:28:58,269
-16 variations?
-Yeah.

729
00:28:58,302 --> 00:29:01,072
[Sagar] We take the voice data
that we've got,

730
00:29:01,105 --> 00:29:03,808
and then we can enable
the digital version of Will

731
00:29:03,841 --> 00:29:05,743
to say all kinds
of different things.

732
00:29:05,776 --> 00:29:07,411
[digital Will]
<i> Here's the forecast.</i>

733
00:29:07,444 --> 00:29:08,680
<i> Yo, check out the forecast.</i>

734
00:29:08,713 --> 00:29:10,014
<i> Yo, check out
 the weather and shit.</i>

735
00:29:10,048 --> 00:29:11,582
<i> Here's the weather.
 Check out the weather.</i>

736
00:29:11,615 --> 00:29:13,451
<i> Yah, 'bout to make it rain!</i>

737
00:29:13,484 --> 00:29:14,819
<i> Kinda.</i>

738
00:29:14,852 --> 00:29:16,654
[Sagar] That's fantastic...
the words,

739
00:29:16,687 --> 00:29:18,723
the delivery, emphasis...

740
00:29:18,756 --> 00:29:21,960
Shows you just how complex
people react.

741
00:29:23,561 --> 00:29:26,431
[will.i.am] It's awesome
where we are
in the world of tech.

742
00:29:27,598 --> 00:29:29,367
Scary where we are,
as well.

743
00:29:29,400 --> 00:29:32,570
My mind started thinking,
like, "Wait a second here.

744
00:29:32,604 --> 00:29:34,405
Why am I doing this?

745
00:29:34,439 --> 00:29:36,941
What's the endgame?"

746
00:29:37,942 --> 00:29:41,846
Because, eventually,
I won't be around,

747
00:29:41,880 --> 00:29:43,214
but it would.

748
00:29:43,248 --> 00:29:45,650
[Downey]<i> Will's endgame
 is more modest than Mark's:</i>

749
00:29:45,683 --> 00:29:48,719
<i> a beefed-up
 Instagram following,
 a virtual assistant,</i>

750
00:29:48,753 --> 00:29:51,556
<i> anything that might help him
 expand his creative outlets</i>

751
00:29:51,589 --> 00:29:55,960
<i> or free up time
 for more creative
 or philanthropic pursuits.</i>

752
00:29:57,628 --> 00:30:00,431
Okay, so, here we go.

753
00:30:00,464 --> 00:30:02,634
That's looking really different.

754
00:30:02,667 --> 00:30:04,201
It's gonna be
really interesting,

755
00:30:04,234 --> 00:30:06,104
because, you know,
it's not every day

756
00:30:06,137 --> 00:30:08,439
you get confronted
with your virtual self.

757
00:30:08,472 --> 00:30:09,874
Right.

758
00:30:09,907 --> 00:30:12,076
Does he feel
that this is like him?

759
00:30:12,109 --> 00:30:14,078
If it's not
representative of him

760
00:30:14,112 --> 00:30:15,880
or if he doesn't think
it's authentic,

761
00:30:15,914 --> 00:30:18,216
then he won't want
to support it.

762
00:30:22,353 --> 00:30:24,889
-What up, Mark?
<i> -Oh, hey, how are you?</i>

763
00:30:24,923 --> 00:30:26,758
-You can see me, right?
<i> -Yes.</i>

764
00:30:26,791 --> 00:30:29,460
<i> Yo, wassup?
 This is will.i.am.</i>

765
00:30:29,494 --> 00:30:30,662
[laughing]

766
00:30:30,695 --> 00:30:31,929
[Sagar]
This is the new version of you.

767
00:30:31,962 --> 00:30:33,798
We can give him glasses there.

768
00:30:33,831 --> 00:30:35,433
[will.i.am laughs]
That's awesome.

769
00:30:35,466 --> 00:30:38,836
I remember I had a pimple
on my face that day.
You captured it.

770
00:30:38,869 --> 00:30:40,771
<i> The good thing is,
 it's digital,</i>

771
00:30:40,804 --> 00:30:42,273
<i> and we can remove it
 really easily.</i>

772
00:30:42,306 --> 00:30:44,542
How come you didn't
remove that? [laughs]

773
00:30:44,575 --> 00:30:47,111
[Sagar]<i> You can make him do
 a variety of things.</i>

774
00:30:47,145 --> 00:30:49,046
<i> Let's play "Simon Says."</i>

775
00:30:49,079 --> 00:30:50,815
<i> Say, "I sound like a girl."</i>

776
00:30:50,848 --> 00:30:52,550
<i> I sound like a girl.</i>

777
00:30:52,584 --> 00:30:54,085
<i> Say that
 with a higher pitch.</i>

778
00:30:54,118 --> 00:30:56,054
[high voice]
<i> I sound like a girl.</i>

779
00:30:56,087 --> 00:30:58,056
<i> Raise your eyebrows.</i>

780
00:30:59,390 --> 00:31:00,624
<i> Poke out your tongue.</i>

781
00:31:00,658 --> 00:31:02,359
[Will laughs]

782
00:31:02,393 --> 00:31:04,995
[will.i.am]
Tell me about growing up
in Los Angeles.

783
00:31:05,029 --> 00:31:06,363
<i> I was born and raised
 in Boyle Heights,</i>

784
00:31:06,397 --> 00:31:08,933
<i> which is west
 of east Los Angeles,</i>

785
00:31:08,966 --> 00:31:10,601
<i> which is east of Hollywood.</i>

786
00:31:10,634 --> 00:31:12,903
<i> Just east of downtown.</i>

787
00:31:12,937 --> 00:31:14,739
[will.i.am] Should it
sound exactly like me?

788
00:31:14,772 --> 00:31:16,206
Nope.

789
00:31:16,240 --> 00:31:17,742
Should it sound
a little bit robotic?

790
00:31:17,775 --> 00:31:20,078
Yes. It should.

791
00:31:20,611 --> 00:31:22,112
For my mom.

792
00:31:22,145 --> 00:31:24,749
My mom should not be confused.

793
00:31:24,782 --> 00:31:26,083
What's your name?

794
00:31:26,116 --> 00:31:27,618
[in Spanish]<i> Mi nombre es</i> Will.

795
00:31:27,652 --> 00:31:29,254
[in English] You speak Spanish?

796
00:31:29,287 --> 00:31:30,220
<i> I don't know.</i>

797
00:31:30,254 --> 00:31:31,088
[laughing]

798
00:31:31,122 --> 00:31:33,123
I know it needs
some fine-tuning,

799
00:31:33,157 --> 00:31:35,392
but the way
it's looking so far

800
00:31:35,426 --> 00:31:36,694
is mind-blowing.

801
00:31:36,728 --> 00:31:38,062
Thanks, Mark.

802
00:31:38,096 --> 00:31:39,930
<i> Yeah, no worries.</i>

803
00:31:39,964 --> 00:31:41,833
[Sagar] How far
do you go down that path

804
00:31:41,866 --> 00:31:44,202
until you can label it
a living...

805
00:31:44,235 --> 00:31:47,071
a digital living character?

806
00:31:47,104 --> 00:31:50,274
This raises some of
the deepest questions

807
00:31:50,308 --> 00:31:53,311
in science
and philosophy, actually,

808
00:31:53,344 --> 00:31:55,346
you know,
the nature of free will.

809
00:31:55,380 --> 00:31:56,347
How do you actually

810
00:31:56,380 --> 00:31:58,449
build a character
which is truly autonomous?

811
00:31:58,482 --> 00:32:01,252
Peek-a-boo!

812
00:32:01,286 --> 00:32:02,286
[Baby X giggles]

813
00:32:02,319 --> 00:32:05,389
What is free will?
What does it take to do that?

814
00:32:05,423 --> 00:32:07,024
[Weinberg]
Artificial Intelligence

815
00:32:07,058 --> 00:32:09,060
is crucial
to the work we are doing,

816
00:32:09,093 --> 00:32:10,828
to inspire, to surprise,

817
00:32:10,861 --> 00:32:13,498
to push human creativity
and abilities

818
00:32:13,531 --> 00:32:15,033
to uncharted domains.

819
00:32:15,767 --> 00:32:16,834
[all cheering]

820
00:32:16,868 --> 00:32:18,069
Unbelievable.

821
00:32:18,102 --> 00:32:19,971
[playing drums]

822
00:32:22,940 --> 00:32:24,442
[Downey]<i> Free will...</i>

823
00:32:25,476 --> 00:32:27,644
<i> ...it's something
 we've been grappling with</i>

824
00:32:27,678 --> 00:32:30,348
<i> for thousands of years,
 from Aristotle to Descartes,</i>

825
00:32:30,381 --> 00:32:33,150
<i> and will continue
 to grapple with
 for a thousand more.</i>

826
00:32:33,184 --> 00:32:35,786
<i> Will we ever be able
 to make an A.I.</i>

827
00:32:35,819 --> 00:32:37,488
<i> that can think on its own?</i>

828
00:32:37,522 --> 00:32:39,891
<i> A second, artificial version
 of me</i>

829
00:32:39,924 --> 00:32:42,060
<i> that is truly autonomous?</i>

830
00:32:42,093 --> 00:32:45,329
<i> A Robert that can actually
 think and feel on his own,</i>

831
00:32:45,362 --> 00:32:47,999
<i> while this Robert here
 takes a nap?</i>

832
00:32:48,032 --> 00:32:49,867
[engines roaring]

833
00:32:49,901 --> 00:32:51,202
<i> Impossible?</i>

834
00:32:51,235 --> 00:32:52,636
<i> Well, when you consider</i>

835
00:32:52,670 --> 00:32:55,606
<i> what human cooperation
 has already accomplished...</i>

836
00:32:55,639 --> 00:32:57,675
<i> a man on the moon...</i>

837
00:32:57,708 --> 00:32:59,944
<i> decoding the human genome...</i>

838
00:32:59,977 --> 00:33:02,613
<i> discovering
 faraway galaxies...</i>

839
00:33:02,647 --> 00:33:05,816
<i> I'd put my money
 on dreamers like Mark and Gil</i>

840
00:33:05,850 --> 00:33:09,487
<i>over the "Earth is flat" folks
 any day.</i>

841
00:33:09,520 --> 00:33:12,523
<i> Until then... nap time.</i>

842
00:33:16,860 --> 00:33:18,162
[man 1] Look at our world today.

843
00:33:18,196 --> 00:33:19,697
Look at everything
we've created.

844
00:33:21,865 --> 00:33:23,500
Artificial Intelligence
is gonna be

845
00:33:23,534 --> 00:33:26,370
the technology that takes that
to the next level.

846
00:33:26,404 --> 00:33:28,472
[man 2] Artificial Intelligence
can help us

847
00:33:28,506 --> 00:33:30,541
to feed
the world's population.

848
00:33:30,574 --> 00:33:33,878
[man 3]
The fact that we can find
where famine might happen,

849
00:33:33,911 --> 00:33:35,546
it's mind-blowing.

850
00:33:35,579 --> 00:33:37,181
These are conflict areas,

851
00:33:37,214 --> 00:33:39,750
this is an area that we need
to look at protecting.

852
00:33:39,784 --> 00:33:41,285
Then launch A.I.

853
00:33:41,318 --> 00:33:44,155
[man 4]
We are going to release
the speed limit on your car.

854
00:33:46,190 --> 00:33:47,725
Tim, can you hear me?

855
00:33:47,758 --> 00:33:49,260
[man 5] With A.I.,

856
00:33:49,293 --> 00:33:51,295
ideas are easy,
execution is hard.

857
00:33:52,663 --> 00:33:55,766
[Domingos]
What excites me the most
about where we might be going

858
00:33:55,799 --> 00:33:57,034
is having more super-powers...

859
00:33:57,067 --> 00:33:58,402
[firefighter] I got him!

860
00:33:58,436 --> 00:34:00,705
[Domingos]
...and A.I. is super-powers
for our mind.

861
00:34:00,738 --> 00:34:03,241
[man 6]
Even though the limb
is synthetic materials,

862
00:34:03,274 --> 00:34:05,342
it moves as if
it's flesh and bone.

863
00:34:05,376 --> 00:34:07,011
[woman 1]
You start to think
about a world

864
00:34:07,044 --> 00:34:09,814
where you can prevent disease
before it happens.

865
00:34:09,847 --> 00:34:11,415
[man 7]
A.I. can give us that answer

866
00:34:11,448 --> 00:34:13,084
that we've been seeking
all along...

867
00:34:13,117 --> 00:34:14,585
"Are we alone?"

868
00:34:14,619 --> 00:34:15,752
Bah!

869
00:34:15,786 --> 00:34:17,688
[man 8] I love the idea
that there are passionate people

870
00:34:17,721 --> 00:34:19,757
dedicating their time and energy

871
00:34:19,790 --> 00:34:21,392
to making these things happen.


