1
00:00:07,206 --> 00:00:08,341
You know how they say

2
00:00:08,374 --> 00:00:10,743
there are two certainties
in life, right?

3
00:00:10,777 --> 00:00:12,445
Death and taxes.

4
00:00:12,478 --> 00:00:15,014
Can't we get rid of
one of those?

5
00:00:15,048 --> 00:00:17,584
See, 100 years ago,
life expectancy was only 45,

6
00:00:17,617 --> 00:00:18,718
can you believe that?

7
00:00:18,752 --> 00:00:20,886
Then by the 1950s,
it was up to 65,

8
00:00:20,920 --> 00:00:22,355
and today, it's almost 80.

9
00:00:22,388 --> 00:00:25,792
Tomorrow, who knows? Right?

10
00:00:25,825 --> 00:00:27,527
Healthcare has made
huge progress.

11
00:00:27,560 --> 00:00:30,529
We've eradicated epidemics
that used to kill millions,

12
00:00:30,563 --> 00:00:32,465
but life is fragile.

13
00:00:32,498 --> 00:00:34,934
People still get sick,
or pass away

14
00:00:34,967 --> 00:00:36,903
for reasons that maybe

15
00:00:36,936 --> 00:00:39,606
should be, someday, curable.

16
00:00:41,274 --> 00:00:44,344
What if we could
improve diagnosis?

17
00:00:44,377 --> 00:00:47,446
Innovate to predict illness
instead of just react to it?

18
00:00:47,480 --> 00:00:49,015
In this episode,

19
00:00:49,048 --> 00:00:51,183
we'll see how machine learning
is combating

20
00:00:51,217 --> 00:00:53,352
one of the leading
causes of blindness,

21
00:00:53,386 --> 00:00:56,055
and enabling a son
with a neurological disease

22
00:00:56,089 --> 00:00:58,123
to communicate with his family.

23
00:00:58,157 --> 00:01:00,259
AI is changing
the way we think

24
00:01:00,293 --> 00:01:01,861
about mind and body,

25
00:01:01,894 --> 00:01:03,162
life and death,

26
00:01:03,195 --> 00:01:05,098
and what we value most,

27
00:01:05,131 --> 00:01:07,100
our human experience.

28
00:01:07,967 --> 00:01:09,902
[fanfare music playing]

29
00:01:09,936 --> 00:01:14,006
[announcer]<i> ...and our other
 co-captain, Number 8!
 Tim Shaw!</i>

30
00:01:14,040 --> 00:01:16,176
[crowd cheering]

31
00:01:18,144 --> 00:01:20,546
[John Shaw]<i> We've lived
 with his football dream,</i>

32
00:01:20,580 --> 00:01:22,248
<i> all the way back
 to sixth grade</i>

33
00:01:22,281 --> 00:01:23,817
<i> when his coach said,</i>

34
00:01:23,850 --> 00:01:25,784
"This kid is gonna go
a long way."

35
00:01:25,818 --> 00:01:27,286
From that point on,

36
00:01:27,320 --> 00:01:29,556
Tim was doing pushups
in his bedroom at night,

37
00:01:29,589 --> 00:01:32,158
Tim was the first one
at practice.

38
00:01:32,191 --> 00:01:33,993
Tim took it seriously.

39
00:01:34,026 --> 00:01:38,498
[crowd screaming and cheering]

40
00:01:42,068 --> 00:01:44,270
[whistle blows]

41
00:01:44,303 --> 00:01:47,173
[announcer]
<i> Number 8, Tim Shaw!</i>

42
00:01:47,207 --> 00:01:49,075
[crowd cheering]

43
00:01:49,108 --> 00:01:51,210
<i> I don't know what
 they're doing out there,</i>

44
00:01:51,244 --> 00:01:53,179
<i> and I don't know
 who they comin' to!</i>

45
00:01:53,212 --> 00:01:55,648
[Robert Downey Jr.]
<i>For as long as he can remember,</i>

46
00:01:55,682 --> 00:01:58,351
<i> Tim Shaw dreamed
 of three letters...</i>

47
00:01:58,384 --> 00:02:00,119
<i> N-F-L.</i>

48
00:02:00,152 --> 00:02:01,220
[whistle blows]

49
00:02:01,254 --> 00:02:03,089
<i> He was a natural
 from the beginning.</i>

50
00:02:03,122 --> 00:02:05,458
<i> As a kid,
 he was fast and athletic.</i>

51
00:02:05,491 --> 00:02:09,495
<i> He grew into 235 pounds
 of pure muscle,</i>

52
00:02:09,528 --> 00:02:13,266
<i> and at 23,
 he was drafted to the pros.</i>

53
00:02:13,299 --> 00:02:16,068
<i> His dream was real.</i>

54
00:02:16,101 --> 00:02:19,505
<i> He was playing
 professional football.</i>

55
00:02:19,538 --> 00:02:21,441
[reporter]<i> Hello,
 I'm with Tim Shaw.</i>

56
00:02:21,474 --> 00:02:24,276
You get to start
this season right.
What's it feel like?

57
00:02:24,310 --> 00:02:26,278
It's that amazing
pre-game electricity,

58
00:02:26,312 --> 00:02:30,917
<i> the butterflies are there,
and I'm ready to hit somebody.
 You might wanna look out.</i>

59
00:02:30,950 --> 00:02:32,885
Hey, Titans fans,
it's Tim Shaw here,

60
00:02:32,919 --> 00:02:35,855
linebacker
and special teams animal.

61
00:02:35,888 --> 00:02:37,190
He loves what he does.

62
00:02:37,223 --> 00:02:39,525
He says,
"They pay me to hit people!"

63
00:02:39,559 --> 00:02:42,295
[crowd cheering]

64
00:02:42,328 --> 00:02:44,664
I'm here to bring you
some truth,

65
00:02:44,697 --> 00:02:46,232
a little bit of truth,

66
00:02:46,265 --> 00:02:47,900
and so we'll call it
T-Shaw's truth,

67
00:02:47,934 --> 00:02:50,737
'cause it's not
all the way true,
but it's my truth.

68
00:03:03,916 --> 00:03:06,553
[Tim Shaw speaking]

69
00:03:27,140 --> 00:03:29,308
[Tim from 2015 interview]
<i> In 2012,</i>

70
00:03:29,341 --> 00:03:31,777
<i> my body started to do things
 it hadn't done before.</i>

71
00:03:31,810 --> 00:03:34,647
<i> My muscles were twitching,
 I was stumbling,</i>

72
00:03:34,680 --> 00:03:38,985
or I was not making a play
I would have always made.

73
00:03:39,018 --> 00:03:40,553
I just wasn't
the same athlete,

74
00:03:40,586 --> 00:03:44,591
I wasn't the same
football player
that I'd always been.

75
00:03:45,658 --> 00:03:47,260
[Tim speaking]

76
00:04:03,443 --> 00:04:07,313
[Downey]<i> The three letters
 that had defined Tim's life
 up to that point</i>

77
00:04:07,347 --> 00:04:10,817
<i> were not the three letters
 that the doctor told him
 that day.</i>

78
00:04:10,850 --> 00:04:11,985
<i> A-L-S.</i>

79
00:04:12,819 --> 00:04:14,754
[Tim speaking]

80
00:04:37,609 --> 00:04:39,178
Okay...

81
00:04:39,211 --> 00:04:43,349
[Downey]<i> A-L-S, which
 stands for "amyotrophic
 lateral sclerosis,"</i>

82
00:04:43,382 --> 00:04:45,785
<i> is also known as
 Lou Gehrig's Disease.</i>

83
00:04:45,818 --> 00:04:49,488
<i>It causes the death of neurons
controlling voluntary muscles.</i>

84
00:04:49,521 --> 00:04:52,258
[Sharon Shaw]<i> He can't even
 scratch his head...</i>

85
00:04:52,291 --> 00:04:53,559
Better yet?

86
00:04:53,592 --> 00:04:55,627
<i> ...none of those
 physical things</i>

87
00:04:55,661 --> 00:04:58,397
that were
so easy for him before.

88
00:04:58,430 --> 00:05:01,233
He has to think about
every step he takes.

89
00:05:01,267 --> 00:05:03,469
So Tim's food comes
in this little container.

90
00:05:03,502 --> 00:05:05,071
We're gonna
mix it with water.

91
00:05:06,172 --> 00:05:07,407
[Tim speaking]

92
00:05:17,149 --> 00:05:19,152
[Downey]
<i> As the disease progresses,</i>

93
00:05:19,185 --> 00:05:20,719
<i> muscles weaken.</i>

94
00:05:20,753 --> 00:05:24,423
<i> Simple everyday actions,
 like walking, talking,
 and eating,</i>

95
00:05:24,456 --> 00:05:26,859
<i> take tremendous effort.</i>

96
00:05:49,915 --> 00:05:52,952
Tim used to call me
on the phone in the night,

97
00:05:52,985 --> 00:05:54,420
and he had voice recognition,

98
00:05:54,453 --> 00:05:56,221
and he would speak to the phone,

99
00:05:56,255 --> 00:05:57,957
and say, "Call Dad."

100
00:05:57,990 --> 00:06:02,228
His phone didn't recognize
the word "Dad."

101
00:06:02,261 --> 00:06:05,531
So, he had said to me...

102
00:06:05,564 --> 00:06:07,866
[voice breaking] "Dad,
I've changed your name.

103
00:06:07,900 --> 00:06:11,304
I'm calling...
I now call you 'Yo-yo.'"

104
00:06:11,337 --> 00:06:14,907
So he would say into his phone,
"Call Yo-yo."

105
00:06:16,742 --> 00:06:20,879
[Sharon]<i> Tim has stopped
 a lot of his communication.</i>

106
00:06:20,912 --> 00:06:23,315
He just doesn't talk
as much as he used to,

107
00:06:23,349 --> 00:06:25,384
and I, I miss that.

108
00:06:25,418 --> 00:06:26,752
I miss it.

109
00:06:26,785 --> 00:06:29,555
-What do you think
about my red beard?
-No opinion.

110
00:06:29,589 --> 00:06:31,290
[snorts] That means
he likes it,

111
00:06:31,324 --> 00:06:32,858
just doesn't wanna
say on camera.

112
00:06:32,892 --> 00:06:35,862
Now, my favorite was when
you had the handlebar moustache.

113
00:06:35,895 --> 00:06:38,998
[Downey]<i> Language,
 the ability to communicate
 with one another.</i>

114
00:06:39,031 --> 00:06:42,734
<i> It's something that makes us
 uniquely human,</i>

115
00:06:42,768 --> 00:06:47,139
<i> making communication
 an impactful application
 for AI.</i>

116
00:06:47,172 --> 00:06:49,842
[Sharon] Yeah, that'll be fun.

117
00:06:57,516 --> 00:06:59,318
[Julie Cattiau]
<i> My name is Julie.</i>

118
00:06:59,351 --> 00:07:01,120
<i> I'm a product manager
 here at Google.</i>

119
00:07:01,153 --> 00:07:04,490
<i> For the past year or so,
 I've been working on
 Project Euphonia.</i>

120
00:07:04,523 --> 00:07:06,759
Project Euphonia
has two different goals.

121
00:07:06,792 --> 00:07:09,529
One is to improve
speech recognition

122
00:07:09,562 --> 00:07:12,531
<i> for people who have a variety
 of medical conditions.</i>

123
00:07:12,564 --> 00:07:15,268
The second goal
is to give people
their voice back,

124
00:07:15,301 --> 00:07:18,571
which means actually recreating
the way they used to sound

125
00:07:18,604 --> 00:07:20,573
before they were diagnosed.

126
00:07:20,606 --> 00:07:22,341
<i> If you think about
 communication,</i>

127
00:07:22,374 --> 00:07:24,476
it starts with
understanding someone,

128
00:07:24,510 --> 00:07:26,178
and then being understood,

129
00:07:26,211 --> 00:07:27,980
and for a lot of people,

130
00:07:28,014 --> 00:07:31,884
<i> their voice is like
 their identity.</i>

131
00:07:31,917 --> 00:07:34,920
[Downey]
<i> In the US alone,
 roughly one in ten people</i>

132
00:07:34,954 --> 00:07:36,789
<i> suffer acquired
 speech impairments,</i>

133
00:07:36,822 --> 00:07:39,592
<i> which can be caused by
 anything from ALS,</i>

134
00:07:39,625 --> 00:07:43,029
<i> to strokes, to Parkinson's,
 to brain injuries.</i>

135
00:07:43,062 --> 00:07:45,397
<i>Solving it is a big challenge,</i>

136
00:07:45,431 --> 00:07:49,735
<i> which is why Julie partnered
 with a big thinker to help.</i>

137
00:08:07,319 --> 00:08:11,524
[Downey]<i> Dimitri
 is a world-class research
 scientist and inventor.</i>

138
00:08:11,557 --> 00:08:14,326
<i> He's worked at IBM,
 Princeton, and now Google,</i>

139
00:08:14,359 --> 00:08:18,197
<i> and holds over 150 patents.</i>

140
00:08:18,230 --> 00:08:19,431
<i> Accomplishments aside,</i>

141
00:08:19,464 --> 00:08:21,800
<i> communication
 is very personal to him.</i>

142
00:08:21,833 --> 00:08:24,770
Dimitri has a pretty
strong Russian accent,

143
00:08:24,803 --> 00:08:27,973
and also he learned English
when he was already deaf,

144
00:08:28,007 --> 00:08:30,209
so he never heard himself
speak English.

145
00:08:32,144 --> 00:08:34,079
Oh, you do? Oh, okay.

146
00:08:34,113 --> 00:08:37,049
[Downey]<i> Technology can't yet
 help him hear his own voice.</i>

147
00:08:37,082 --> 00:08:40,419
<i> He uses AI-powered
 Live Transcribe</i>

148
00:08:40,453 --> 00:08:42,321
<i> to help him communicate.</i>

149
00:08:42,355 --> 00:08:43,789
[Cattiau] Okay,
that's awesome.

150
00:08:43,823 --> 00:08:47,260
So we partnered up with Dimitri
to train a recognizer

151
00:08:47,293 --> 00:08:50,663
that did a much better job
at recognizing his voice.

152
00:08:50,696 --> 00:08:53,832
The model that you're using
right now for recognition,

153
00:08:53,865 --> 00:08:57,069
what data did you train it on?

154
00:09:01,307 --> 00:09:04,277
[Downey]<i> So, how does
 speech recognition work?</i>

155
00:09:07,312 --> 00:09:10,683
<i> First, the sound of our voice
 is converted into a waveform,</i>

156
00:09:10,716 --> 00:09:13,118
<i> which is really just
 a picture of the sound.</i>

157
00:09:13,152 --> 00:09:16,288
<i> Waveforms are then matched
 to transcriptions,</i>

158
00:09:16,321 --> 00:09:18,357
<i> or "labels" for each word.</i>

159
00:09:18,391 --> 00:09:21,726
<i>These maps exist for most words
 in the English language.</i>

160
00:09:21,760 --> 00:09:24,530
This is where
<i> machine learning takes over.</i>

161
00:09:24,563 --> 00:09:26,865
<i> Using millions
 of voice samples,</i>

162
00:09:26,898 --> 00:09:28,700
<i> a deep learning model
 is trained</i>

163
00:09:28,734 --> 00:09:31,704
<i> to map input sounds
 to output words.</i>

164
00:09:31,737 --> 00:09:35,341
<i>Then the algorithm uses rules,
 such as grammar and syntax,</i>

165
00:09:35,374 --> 00:09:37,176
<i> to predict each word
 in a sentence.</i>

166
00:09:37,209 --> 00:09:39,244
<i> This is how AI
 can tell the difference</i>

167
00:09:39,278 --> 00:09:42,581
<i> between "there," "their,"
 and "they're."</i>

168
00:09:42,614 --> 00:09:45,584
[Cattiau]<i> The speech
 recognition model
 that Google uses</i>

169
00:09:45,618 --> 00:09:47,619
works very well for people

170
00:09:47,652 --> 00:09:50,590
who have a voice
that sounds similar

171
00:09:50,623 --> 00:09:53,225
to the examples that were used
to train this model.

172
00:09:53,258 --> 00:09:56,462
In 90% of cases,
it will recognize
what you want to say.

173
00:09:56,495 --> 00:09:58,797
[Downey]
<i> Dimitri's not in that 90%.</i>

174
00:09:58,830 --> 00:10:01,033
For someone like him,
it doesn't work at all.

175
00:10:01,066 --> 00:10:04,870
<i> So he created a model
 based on a sample of one.</i>

176
00:10:23,489 --> 00:10:25,291
[Downey]<i> But making
 a new unique model</i>

177
00:10:25,324 --> 00:10:27,893
<i> with unique data
for every new and unique person</i>

178
00:10:27,927 --> 00:10:29,795
<i> is slow and inefficient.</i>

179
00:10:29,828 --> 00:10:32,097
<i> Tim calls his dad "Yo-yo."</i>

180
00:10:32,131 --> 00:10:35,700
<i> Others with ALS may call
 their dads something else.</i>

181
00:10:35,734 --> 00:10:37,502
<i> Can we build one machine</i>

182
00:10:37,536 --> 00:10:40,539
<i> that recognizes
 many different people,</i>

183
00:10:40,572 --> 00:10:42,374
<i> and how can we do it fast?</i>

184
00:10:42,408 --> 00:10:45,277
[Cattiau]<i> So this data
 doesn't really exist.</i>

185
00:10:45,310 --> 00:10:46,878
<i>We have to actually collect it.</i>

186
00:10:46,912 --> 00:10:50,549
So we started this partnership
with ALS TDI in Boston.

187
00:10:50,582 --> 00:10:52,818
They helped us collect
voice samples

188
00:10:52,851 --> 00:10:54,286
from people who have ALS.

189
00:10:54,319 --> 00:10:55,588
This is for you, T. Shaw.

190
00:10:55,621 --> 00:10:59,025
[all] One, two, three!

191
00:11:00,259 --> 00:11:02,894
[all cheering]

192
00:11:02,928 --> 00:11:07,265
I hereby accept your ALS
ice bucket challenge.

193
00:11:07,299 --> 00:11:09,468
[yelping softly]

194
00:11:09,501 --> 00:11:11,870
[Downey]<i> When the
 ice bucket challenge
 went viral,</i>

195
00:11:11,904 --> 00:11:17,443
<i> millions joined the fight,
 and raised over $220 million
 for ALS research.</i>

196
00:11:17,476 --> 00:11:20,779
There really is a straight line
from the ice bucket challenge

197
00:11:20,812 --> 00:11:22,448
to the Euphonia Project.

198
00:11:29,187 --> 00:11:31,957
<i> ALS Therapy
 Development Institute
 is an organization</i>

199
00:11:31,990 --> 00:11:35,227
<i> that's dedicated to finding
 treatments and cures for ALS.</i>

200
00:11:35,260 --> 00:11:37,029
<i> We are life-focused.</i>

201
00:11:37,062 --> 00:11:40,299
How can we use
technologies we have

202
00:11:40,332 --> 00:11:42,634
to help these people right away?

203
00:11:42,667 --> 00:11:45,604
Yeah, they're
actually noisier.

204
00:11:45,638 --> 00:11:47,406
That's a good point.

205
00:11:47,439 --> 00:11:49,107
I met Tim a few years ago

206
00:11:49,140 --> 00:11:50,976
shortly after
he had been diagnosed.

207
00:11:51,009 --> 00:11:53,645
Very difficult to go public,

208
00:11:53,678 --> 00:11:55,681
but it was made
very clear to me

209
00:11:55,714 --> 00:11:57,116
that the time was right.

210
00:11:57,149 --> 00:12:00,486
He was trying to understand
what to expect in his life,

211
00:12:00,519 --> 00:12:02,688
<i> but he was also trying
 to figure out,</i>

212
00:12:02,721 --> 00:12:04,690
<i> "All right,
 what part can I play?"</i>

213
00:12:04,723 --> 00:12:07,526
All the ice bucket challenges
and the awareness

214
00:12:07,559 --> 00:12:09,461
have really inspired me also.

215
00:12:09,494 --> 00:12:10,763
<i> If we can just step back,</i>

216
00:12:10,796 --> 00:12:13,064
<i> and say, "Where can I
 shine a light?"</i>

217
00:12:13,098 --> 00:12:15,067
<i> or "Where can I give a hand?"</i>

218
00:12:15,100 --> 00:12:17,436
When the ice bucket
challenge happened,

219
00:12:17,469 --> 00:12:21,640
we had this huge influx
of resources of cash,

220
00:12:21,673 --> 00:12:23,542
<i> and that gave us the ability</i>

221
00:12:23,576 --> 00:12:26,678
<i>to reach out to people with ALS
 who are in our programs</i>

222
00:12:26,711 --> 00:12:28,714
<i> to share their data with us.</i>

223
00:12:28,747 --> 00:12:31,583
That's what got us
the big enough data sets

224
00:12:31,616 --> 00:12:33,786
to really attract Google.

225
00:12:36,688 --> 00:12:38,790
[Downey]<i> Fernando
 didn't initially set out</i>

226
00:12:38,823 --> 00:12:40,893
<i> to make speech recognition
 work better,</i>

227
00:12:40,926 --> 00:12:44,263
<i> but in the process of better
 understanding the disease,</i>

228
00:12:44,296 --> 00:12:47,332
<i> he built a huge database
 of ALS voices,</i>

229
00:12:47,366 --> 00:12:51,437
<i> which may help Tim
 and many others.</i>

230
00:12:54,873 --> 00:12:56,742
[John] It automatically
uploaded it.

231
00:12:56,775 --> 00:12:57,977
[Tim] Oh.

232
00:13:59,571 --> 00:14:01,574
How many
have you done, Tim?

233
00:14:07,379 --> 00:14:08,880
2066?

234
00:14:08,914 --> 00:14:12,217
[Fernando Vieira]<i> Tim,
 he wants to find every way
 that he can help.</i>

235
00:14:12,251 --> 00:14:16,388
It's inspiring to see
his level of enthusiasm,

236
00:14:16,421 --> 00:14:20,626
and his willingness to record
lots and lots of voice samples.

237
00:14:20,659 --> 00:14:23,128
[Downey]<i> To turn
 all this data into real help,</i>

238
00:14:23,161 --> 00:14:25,197
<i> Fernando partnered
 with one of the people</i>

239
00:14:25,230 --> 00:14:28,199
<i> who started
 the Euphonia Project,
 Michael Brenner...</i>

240
00:14:28,233 --> 00:14:30,335
-Hey, Fernando.
-Hey, how are you doing?

241
00:14:30,368 --> 00:14:32,337
[Downey]<i> ...a Google
 research scientist</i>

242
00:14:32,371 --> 00:14:34,105
<i> and Harvard-trained
 mathematician</i>

243
00:14:34,138 --> 00:14:35,507
<i> who's using machine learning</i>

244
00:14:35,540 --> 00:14:38,343
<i> to solve scientific
 Hail Marys, like this one.</i>

245
00:14:38,377 --> 00:14:42,381
Tim Shaw has recorded
almost 2,000 utterances,

246
00:14:42,414 --> 00:14:44,616
and so we decided
to apply our technology

247
00:14:44,649 --> 00:14:48,553
to see if we could build
a recognizer that
understood him.

248
00:14:48,587 --> 00:14:51,122
[Tim speaking]

249
00:14:51,156 --> 00:14:54,292
The goal, right, for Tim,
is to get it so that it works

250
00:14:54,326 --> 00:14:56,829
outside of the things
that he recorded.

251
00:14:56,862 --> 00:14:58,963
The problem is
that we have no idea

252
00:14:58,997 --> 00:15:00,799
how big a set
this will work on.

253
00:15:00,832 --> 00:15:04,102
[Brenner]<i> Dimitri had recorded
 upwards of 15,000 sentences,</i>

254
00:15:04,136 --> 00:15:07,105
which is just an incredible
amount of data.

255
00:15:07,138 --> 00:15:09,674
<i> We couldn't possibly
 expect anyone else</i>

256
00:15:09,707 --> 00:15:11,376
<i> to record so many sentences,</i>

257
00:15:11,410 --> 00:15:14,012
<i> so we know that
 we have to be able to do this</i>

258
00:15:14,046 --> 00:15:16,147
with far fewer recordings
from a person.

259
00:15:16,181 --> 00:15:18,283
So it's not clear it will work.

260
00:15:18,317 --> 00:15:20,586
[Tim speaking]

261
00:15:22,287 --> 00:15:24,189
-That didn't
work at all.
-Not at all.

262
00:15:24,223 --> 00:15:26,191
He said, "I go
the opposite way,"

263
00:15:26,224 --> 00:15:28,060
and it says,
"I know that was."

264
00:15:28,093 --> 00:15:29,728
[Brenner]
When it doesn't recognize,

265
00:15:29,761 --> 00:15:32,864
<i> we jiggle around
 the parameters of
 the speech recognizer,</i>

266
00:15:32,897 --> 00:15:34,432
<i> then we give it
 another sentence,</i>

267
00:15:34,466 --> 00:15:37,069
and the idea is that
you'll get it to understand.

268
00:15:37,102 --> 00:15:39,338
[Tim's recording]
<i> Can we go to the beach?</i>

269
00:15:42,140 --> 00:15:43,742
-Yes! Got it.
-Got it.

270
00:15:43,776 --> 00:15:45,844
That's so cool.
Okay, let's try another.

271
00:15:45,878 --> 00:15:47,980
[Downey]<i> If Tim Shaw
 gets his voice back,</i>

272
00:15:48,013 --> 00:15:50,716
<i> he may no longer feel
 that he is defined,</i>

273
00:15:50,749 --> 00:15:53,585
<i> or constrained,
 by three letters,</i>

274
00:15:53,618 --> 00:15:55,187
<i> but that's a big "if."</i>

275
00:15:55,220 --> 00:15:58,289
<i> While Michael
 and team Euphonia work away,</i>

276
00:15:58,323 --> 00:16:01,560
<i>let's take a moment and imagine
 what else is possible</i>

277
00:16:01,593 --> 00:16:03,228
in the realm of the senses.

278
00:16:03,261 --> 00:16:05,396
<i> Speech.</i>

279
00:16:05,430 --> 00:16:07,499
<i> Hearing.</i>

280
00:16:07,532 --> 00:16:08,934
<i> Sight.</i>

281
00:16:08,967 --> 00:16:10,869
<i> Can AI predict blindness?</i>

282
00:16:10,902 --> 00:16:12,004
[truck horn beeps]

283
00:16:12,037 --> 00:16:13,405
<i> Or even prevent it?</i>

284
00:16:50,509 --> 00:16:53,178
[Downey]<i> Santhi does not
 have an easy life.</i>

285
00:16:53,211 --> 00:16:56,348
<i> It's made more difficult
 because she has diabetes,</i>

286
00:16:56,381 --> 00:16:58,717
<i>which is affecting her vision.</i>

287
00:17:05,223 --> 00:17:10,229
[Downey]<i> If Santhi doesn't
 get medical help soon,
 she may go blind.</i>

288
00:17:12,664 --> 00:17:15,067
[Dr. Jessica Mega]
<i> Complications of diabetes</i>

289
00:17:15,100 --> 00:17:16,502
<i> include heart disease,</i>

290
00:17:16,535 --> 00:17:17,669
<i> kidney disease,</i>

291
00:17:17,702 --> 00:17:20,506
but one of the really
important complications

292
00:17:20,539 --> 00:17:21,807
is diabetic retinopathy.

293
00:17:21,840 --> 00:17:24,309
The reason it's
so important is that

294
00:17:24,342 --> 00:17:26,979
it's one of the leading causes
of blindness worldwide.

295
00:17:27,012 --> 00:17:29,681
<i> This is particularly true
 in India.</i>

296
00:17:31,216 --> 00:17:32,584
[giving instructions]

297
00:17:53,171 --> 00:17:55,239
In the early stages,
it's symptomless,

298
00:17:55,273 --> 00:17:57,276
but that's when it's treatable,

299
00:17:57,309 --> 00:17:59,377
so you want to screen them
early on,

300
00:17:59,410 --> 00:18:01,346
<i> before they actually
 lose vision.</i>

301
00:18:05,150 --> 00:18:06,418
<i> In the early stages,</i>

302
00:18:06,451 --> 00:18:08,353
<i> if a doctor is
 examining the eye,</i>

303
00:18:08,387 --> 00:18:10,622
<i> or you take a picture
 of the back of the eye,</i>

304
00:18:10,656 --> 00:18:14,493
<i> you will see lots of those
 bleeding spots in the retina.</i>

305
00:18:21,533 --> 00:18:25,136
Today, the doctors are
not enough to do the screening.

306
00:18:25,169 --> 00:18:27,071
We are very limited
ophthalmologists,

307
00:18:27,105 --> 00:18:28,673
<i> so there should be other ways</i>

308
00:18:28,706 --> 00:18:31,043
<i> where you can screen
 the diabetic patients</i>

309
00:18:31,076 --> 00:18:32,644
for diabetic complications.

310
00:18:32,678 --> 00:18:33,845
[Downey]<i> In the US,</i>

311
00:18:33,878 --> 00:18:37,148
<i>there are about 74 eye doctors
 for every million people.</i>

312
00:18:37,181 --> 00:18:39,751
<i> In India, there are only 11.</i>

313
00:18:39,784 --> 00:18:42,587
<i> So just keeping up with
 the sheer number of patients,</i>

314
00:18:42,620 --> 00:18:45,423
<i> let alone giving them
 the attention and care
 they need,</i>

315
00:18:45,457 --> 00:18:48,360
<i> is overwhelming,
 if not impossible.</i>

316
00:18:48,393 --> 00:18:52,264
[Dr. R. Kim]<i> We probably see
 about 2,000 to 2,500 patients</i>

317
00:18:52,297 --> 00:18:53,765
every single day.

318
00:18:53,798 --> 00:18:56,201
[Mega]
<i> The interesting thing
 with diabetic retinopathy</i>

319
00:18:56,234 --> 00:18:59,338
<i> is there are ways
 to screen and get
 ahead of the problem.</i>

320
00:18:59,371 --> 00:19:02,340
The challenge is that
not enough patients
undergo screening.

321
00:19:02,373 --> 00:19:05,310
[Downey]<i> Like Tim Shaw's
 ALS speech recognizer,</i>

322
00:19:05,343 --> 00:19:08,012
<i> this problem is also
 about data,</i>

323
00:19:08,046 --> 00:19:09,648
<i> or lack of it.</i>

324
00:19:09,681 --> 00:19:13,118
<i> To prevent more people
from experiencing vision loss,</i>

325
00:19:13,152 --> 00:19:16,221
<i> Dr. Kim wanted to
 get ahead of the problem.</i>

326
00:19:19,958 --> 00:19:21,660
So there's a hemorrhage.

327
00:19:21,693 --> 00:19:24,662
All these are exudates.

328
00:19:24,696 --> 00:19:27,499
[Downey]<i> Dr. Kim
 called up a team at Google.</i>

329
00:19:27,532 --> 00:19:29,267
<i> Made up of doctors
 and engineers,</i>

330
00:19:29,301 --> 00:19:31,536
<i> they're exploring ways
 to use machine learning</i>

331
00:19:31,570 --> 00:19:35,373
<i> to solve some of the world's
 leading healthcare problems.</i>

332
00:19:35,407 --> 00:19:39,110
So we started with
could we train an AI model

333
00:19:39,144 --> 00:19:41,780
that can somehow help
read these images,

334
00:19:41,813 --> 00:19:45,850
<i> that can decrease the number
 of doctors required</i>

335
00:19:45,883 --> 00:19:47,018
<i> to do this task.</i>

336
00:19:47,051 --> 00:19:48,587
So this is
the normal view.

337
00:19:48,620 --> 00:19:50,522
When you start
looking more deeply,

338
00:19:50,555 --> 00:19:53,225
then this can be
a microaneurysm, right?

339
00:19:53,258 --> 00:19:55,460
-This one here?
-[man] Could be.

340
00:19:55,493 --> 00:19:58,296
[Downey]<i> The team
 uses the same kind
 of machine learning</i>

341
00:19:58,329 --> 00:20:00,665
<i> that allows us
 to organize our photos</i>

342
00:20:00,699 --> 00:20:02,667
<i> or tag friends
 on social media,</i>

343
00:20:02,701 --> 00:20:05,437
<i> image recognition.</i>

344
00:20:05,470 --> 00:20:07,005
<i> First, models are trained</i>

345
00:20:07,038 --> 00:20:10,442
<i> using tagged images of things
 like cats or dogs.</i>

346
00:20:10,475 --> 00:20:12,543
<i> After looking at
 thousands of examples,</i>

347
00:20:12,577 --> 00:20:15,747
<i> the algorithm learns
 to identify new images</i>

348
00:20:15,780 --> 00:20:17,382
<i> without any human help.</i>

349
00:20:17,415 --> 00:20:19,785
<i> For the retinopathy project,</i>

350
00:20:19,818 --> 00:20:21,886
<i> over 100,000 eye scans</i>

351
00:20:21,920 --> 00:20:23,421
<i> were graded
 by eye doctors</i>

352
00:20:23,455 --> 00:20:26,958
<i> who rated each eye scan
 on a scale from one to five,</i>

353
00:20:26,991 --> 00:20:28,560
<i> from healthy to diseased.</i>

354
00:20:28,593 --> 00:20:30,228
<i> These images were then used</i>

355
00:20:30,261 --> 00:20:32,564
<i> to train a machine
 learning algorithm.</i>

356
00:20:32,597 --> 00:20:35,199
<i> Over time,
 the AI learned to predict</i>

357
00:20:35,233 --> 00:20:37,835
<i> which eyes showed signs
 of disease.</i>

358
00:20:37,869 --> 00:20:40,238
[Dr. Lily Peng]
This is the assistant's view

359
00:20:40,272 --> 00:20:41,706
<i> where the model's predictions</i>

360
00:20:41,740 --> 00:20:45,243
<i> are actually projected
 on the original image,</i>

361
00:20:45,277 --> 00:20:49,047
and it's picking up
the pathologies very nicely.

362
00:20:49,080 --> 00:20:51,649
[Downey]<i> To get help
 implementing the technology,</i>

363
00:20:51,682 --> 00:20:53,718
<i> Lily's team reached out
 to Verily,</i>

364
00:20:53,751 --> 00:20:56,287
<i> the life sciences unit
 at Alphabet.</i>

365
00:20:56,320 --> 00:20:58,089
[Mega] So, how was India?

366
00:20:58,123 --> 00:20:59,357
[Peng] Oh, amazing!

367
00:20:59,391 --> 00:21:01,226
[Mega]<i> Verily
 came out of Google X,</i>

368
00:21:01,259 --> 00:21:05,296
and we sit at the intersection
of technology, life science,
and healthcare.

369
00:21:05,330 --> 00:21:07,532
<i> What we try to do is
 think about big problems</i>

370
00:21:07,565 --> 00:21:09,300
<i> that are affecting
 many patients,</i>

371
00:21:09,333 --> 00:21:11,436
<i> and how can we bring
 the best tools</i>

372
00:21:11,469 --> 00:21:12,603
<i> and best technologies</i>

373
00:21:12,637 --> 00:21:14,205
<i> to get ahead of the problems.</i>

374
00:21:14,239 --> 00:21:17,709
The technical pieces
are so important,
and so is the methodology.

375
00:21:17,743 --> 00:21:19,644
How do you capture
the right image,

376
00:21:19,677 --> 00:21:21,412
and how does the algorithm work,

377
00:21:21,446 --> 00:21:24,015
and how do you deploy
these tools not only here,

378
00:21:24,048 --> 00:21:25,550
but in rural conditions?

379
00:21:25,583 --> 00:21:28,186
If we can speed up
this diagnosis process

380
00:21:28,219 --> 00:21:30,088
and augment the clinical care,

381
00:21:30,121 --> 00:21:31,990
<i>then we can prevent blindness.</i>

382
00:21:32,023 --> 00:21:36,128
[Downey]<i> There aren't many
 bigger problems that
 affect more patients.</i>

383
00:21:36,161 --> 00:21:39,030
<i> Diabetes affects
 400 million worldwide,</i>

384
00:21:39,063 --> 00:21:41,433
<i> 70 million in India alone,</i>

385
00:21:41,466 --> 00:21:44,502
<i> which is why Jessica
 and Lily's teams</i>

386
00:21:44,536 --> 00:21:47,405
<i> began testing AI-enabled
 eye scanners there,</i>

387
00:21:47,439 --> 00:21:49,374
<i> in its most rural areas,</i>

388
00:21:49,407 --> 00:21:52,276
<i> like Dr. Kim's
 Aravind Eye Clinics.</i>

389
00:21:52,310 --> 00:21:55,146
-Is the camera on?
-Now it's on.

390
00:21:55,179 --> 00:21:56,147
Yeah.

391
00:21:56,180 --> 00:21:57,415
So once the camera is up,

392
00:21:57,449 --> 00:21:59,484
we need to check
network connectivity.

393
00:21:59,517 --> 00:22:01,419
[Sunny Virmani]
<i> The patient comes in.</i>

394
00:22:01,452 --> 00:22:03,521
They get pictures
of the back of the eye.

395
00:22:03,554 --> 00:22:05,389
One for the left eye,
and right eye.

396
00:22:05,423 --> 00:22:07,992
<i> The images are uploaded
 to this algorithm,</i>

397
00:22:08,026 --> 00:22:10,561
<i> and once the algorithm
 performs its analysis,</i>

398
00:22:10,595 --> 00:22:13,364
<i> it sends the results back
 to the system,</i>

399
00:22:13,397 --> 00:22:15,734
along with
a referral recommendation.

400
00:22:15,767 --> 00:22:17,936
It's good.
It's up and running.

401
00:22:17,969 --> 00:22:20,705
Because the algorithm
works in real time,

402
00:22:20,739 --> 00:22:23,442
you can get
a real-time answer to a doctor,

403
00:22:23,475 --> 00:22:26,445
<i> and that real-time answer
 comes back to the patient.</i>

404
00:22:30,382 --> 00:22:32,250
[Kim]<i> Once you have
 the algorithm,</i>

405
00:22:32,284 --> 00:22:34,853
<i> it's like taking
 your weight measurement.</i>

406
00:22:34,886 --> 00:22:36,054
<i> Within a few seconds,</i>

407
00:22:36,087 --> 00:22:39,358
the system tells you whether
you have retinopathy or not.

408
00:22:40,391 --> 00:22:41,793
[Downey]<i> In the past,</i>

409
00:22:41,826 --> 00:22:44,462
<i> Santhi's condition could've
 taken months to diagnose,</i>

410
00:22:44,495 --> 00:22:45,864
<i> if diagnosed at all.</i>

411
00:23:20,131 --> 00:23:23,501
[Downey]<i> By the time
 an eye doctor would've
 been able to see her,</i>

412
00:23:23,535 --> 00:23:26,671
<i> Santhi's diabetes might have
 caused her to go blind.</i>

413
00:23:27,705 --> 00:23:29,841
<i> Now, with the help
 of new technology,</i>

414
00:23:29,874 --> 00:23:31,376
<i> it's immediate,</i>

415
00:23:31,409 --> 00:23:33,378
<i> and she can take
 the hour-long bus ride</i>

416
00:23:33,411 --> 00:23:35,580
<i>to Dr. Kim's clinic in Madurai</i>

417
00:23:35,613 --> 00:23:37,149
<i> for same-day treatment.</i>

418
00:23:49,560 --> 00:23:51,163
[Downey]
<i> Now thousands of patients</i>

419
00:23:51,196 --> 00:23:53,631
<i> who may have waited weeks
 or months to be seen</i>

420
00:23:53,664 --> 00:23:56,601
<i> can get the help they need
 before it's too late.</i>

421
00:24:11,115 --> 00:24:12,951
Thank you, sir.

422
00:24:27,598 --> 00:24:29,033
[Downey]<i> Retinopathy</i>

423
00:24:29,067 --> 00:24:32,003
<i> is when high blood sugar
 damages the retina.</i>

424
00:24:32,037 --> 00:24:34,639
<i> Blood leaks,
 and the laser treatment</i>

425
00:24:34,672 --> 00:24:36,541
<i> basically "welds"
 the blood vessels</i>

426
00:24:36,574 --> 00:24:38,777
<i> to stop the leakage.</i>

427
00:24:38,810 --> 00:24:41,746
<i> Routine eye exams
 can spot the problem early.</i>

428
00:24:41,779 --> 00:24:44,048
<i> In rural or remote areas,
 like here,</i>

429
00:24:44,082 --> 00:24:48,753
<i> AI can step in and be
 that early detection system.</i>

430
00:24:57,662 --> 00:25:01,466
[Pedro Domingos] I think
one of the most important
applications of AI

431
00:25:01,499 --> 00:25:04,736
<i> is in places
 where doctors are scarce.</i>

432
00:25:04,769 --> 00:25:08,038
In a way, what AI does
is make intelligence cheap,

433
00:25:08,072 --> 00:25:11,509
and now imagine what
you can do when you make
intelligence cheap.

434
00:25:11,543 --> 00:25:13,978
People can go
to doctors they
couldn't before.

435
00:25:14,012 --> 00:25:16,848
<i> It may not be
 the impact that catches
 the most headlines,</i>

436
00:25:16,881 --> 00:25:20,218
<i> but in many ways it'll be
 the most important impact.</i>

437
00:25:31,829 --> 00:25:34,199
[family chattering happily]

438
00:25:40,238 --> 00:25:42,774
[Mega] AI now is
this next generation of tools

439
00:25:42,807 --> 00:25:45,510
<i>that we can apply to clinically
 meaningful problems,</i>

440
00:25:45,543 --> 00:25:49,314
so AI really starts
to democratize healthcare.

441
00:25:56,521 --> 00:25:59,090
[Mega] The work
with diabetic retinopathy

442
00:25:59,123 --> 00:26:02,260
is opening our eyes
to so much potential.

443
00:26:03,394 --> 00:26:05,230
<i> Even within these images,</i>

444
00:26:05,263 --> 00:26:07,565
<i> we're starting to see
 some interesting signals</i>

445
00:26:07,598 --> 00:26:11,002
<i> that might tell us about
 someone's risk factors
 for heart disease.</i>

446
00:26:11,035 --> 00:26:13,070
<i> And from there,
 you start to think about</i>

447
00:26:13,104 --> 00:26:16,274
<i> all of the images that
 we collect in medicine.</i>

448
00:26:16,307 --> 00:26:18,609
<i>Can you use AI or an algorithm</i>

449
00:26:18,642 --> 00:26:20,745
<i> to help patients and doctors</i>

450
00:26:20,778 --> 00:26:23,214
<i>get ahead of a given diagnosis?</i>

451
00:26:23,248 --> 00:26:26,750
Take cancer as an example
of how AI can help save lives.

452
00:26:26,784 --> 00:26:29,120
<i> We could take a sample
 of somebody's blood</i>

453
00:26:29,153 --> 00:26:31,489
<i> and look for the minuscule
 amounts of cancer DNA</i>

454
00:26:31,523 --> 00:26:36,561
or tumor DNA in that blood.
This is a great application
for machine learning.

455
00:26:36,594 --> 00:26:38,363
[Downey]
<i> And why stop there?</i>

456
00:26:38,396 --> 00:26:39,998
<i> Could AI accomplish</i>

457
00:26:40,031 --> 00:26:42,767
<i> what human researchers
 have not yet been able to?</i>

458
00:26:42,800 --> 00:26:45,136
Figuring out how cells
work well enough

459
00:26:45,170 --> 00:26:48,506
that you can understand
why a tumor grows
and how to stop it

460
00:26:48,540 --> 00:26:50,475
without hurting
the surrounding cells.

461
00:26:50,508 --> 00:26:52,677
[Downey]<i> And if cancer
 could be cured,</i>

462
00:26:52,710 --> 00:26:54,445
<i>maybe mental health disorders,</i>

463
00:26:54,478 --> 00:26:56,081
<i> like depression, or anxiety.</i>

464
00:26:56,114 --> 00:26:58,682
There are facial
and vocal biomarkers

465
00:26:58,716 --> 00:27:00,585
of these
mental health disorders.

466
00:27:00,619 --> 00:27:03,521
<i> People check their phones
 15 times an hour.</i>

467
00:27:03,555 --> 00:27:05,089
<i> So that's an opportunity</i>

468
00:27:05,123 --> 00:27:08,093
<i> to almost do, like,
 a well-being checkpoint.</i>

469
00:27:08,126 --> 00:27:10,462
You can flag that
to the individual,

470
00:27:10,495 --> 00:27:11,562
to a loved one,

471
00:27:11,595 --> 00:27:14,365
<i> or in some cases
 even to a doctor.</i>

472
00:27:14,399 --> 00:27:17,368
[Bran Ferren]
<i> If you look at the overall
 field of medicine,</i>

473
00:27:17,401 --> 00:27:21,939
<i> how do you do a great job
 of diagnosing illness?</i>

474
00:27:21,972 --> 00:27:24,575
<i>Having artificial intelligence,</i>

475
00:27:24,608 --> 00:27:27,946
the world's greatest
diagnostician, helps.

476
00:27:31,616 --> 00:27:33,050
[Downey]<i> At Google,</i>

477
00:27:33,084 --> 00:27:35,386
<i> Julie and the Euphonia team
 have been working for months</i>

478
00:27:35,419 --> 00:27:38,323
<i> trying to find a way
 for former NFL star Tim Shaw</i>

479
00:27:38,356 --> 00:27:39,958
<i> to get his voice back.</i>

480
00:27:39,991 --> 00:27:42,360
[Dimitri Kanevsky speaking]

481
00:27:47,665 --> 00:27:50,167
Yes! So Zach's team,
the DeepMind team,

482
00:27:50,201 --> 00:27:53,738
has built a model
that can imitate your voice.

483
00:27:53,771 --> 00:27:55,073
For Tim, we were lucky,

484
00:27:55,106 --> 00:27:59,043
because, you know, Tim has
a career of NFL player,

485
00:27:59,077 --> 00:28:02,613
so he did multiple
radio interviews
and TV interviews,

486
00:28:02,647 --> 00:28:04,349
so he sent us
this footage.

487
00:28:04,382 --> 00:28:07,685
<i> Hey, this is Tim Shaw,
 special teams animal.</i>

488
00:28:07,719 --> 00:28:10,721
<i> Christmas is coming,
 so we need to find out</i>

489
00:28:10,754 --> 00:28:12,356
<i> what the Titans players
 are doing.</i>

490
00:28:12,390 --> 00:28:14,859
<i> If you gotta hesitate,
 that's probably a "no."</i>

491
00:28:14,892 --> 00:28:17,261
[Cattiau]<i> Tim will be able
 to type what he wants,</i>

492
00:28:17,294 --> 00:28:21,565
and the prototype will say it
in Tim's voice.

493
00:28:21,598 --> 00:28:22,967
<i> I've always loved attention.</i>

494
00:28:23,000 --> 00:28:24,936
<i> Don't know if you
 know that about me.</i>

495
00:28:24,969 --> 00:28:26,904
[laughs]<i> She's gonna
 shave it for you.</i>

496
00:28:26,937 --> 00:28:29,073
[Downey]<i> Interpreting speech
 is one thing,</i>

497
00:28:29,106 --> 00:28:31,575
<i> but re-creating the way
 a real person sounds</i>

498
00:28:31,608 --> 00:28:33,311
is an order of magnitude harder.

499
00:28:33,344 --> 00:28:36,280
<i> Playing Tecmo Bowl,
 eating Christmas cookies,
 and turkey.</i>

500
00:28:36,313 --> 00:28:40,151
[Downey]<i> Voice imitation
 is also known as
 voice synthesis,</i>

501
00:28:40,185 --> 00:28:43,220
<i> which is basically
speech recognition in reverse.</i>

502
00:28:43,253 --> 00:28:46,925
<i> First, machine learning
 converts text back
 into waveforms.</i>

503
00:28:46,958 --> 00:28:50,061
<i> These waveforms are then
 used to create sound.</i>

504
00:28:50,094 --> 00:28:53,965
<i> This is how Alexa
 and Google Home
 are able to talk to us.</i>

505
00:28:53,998 --> 00:28:56,734
<i> Now the teams from DeepMind
 and Google AI</i>

506
00:28:56,768 --> 00:28:58,336
<i> are working to create a model</i>

507
00:28:58,369 --> 00:29:01,739
<i> to imitate the unique sound
 of Tim's voice.</i>

508
00:29:01,773 --> 00:29:03,841
Looks like it's computing.

509
00:29:03,874 --> 00:29:05,943
But it worked this morning?

510
00:29:05,976 --> 00:29:08,012
We have to set expectations
quite low.

511
00:29:08,045 --> 00:29:11,115
[Cattiau]<i> I don't know how
our model is going to perform.</i>

512
00:29:11,149 --> 00:29:13,851
I hope that Tim will understand

513
00:29:13,884 --> 00:29:16,287
<i>and actually see the technology
 for what it is,</i>

514
00:29:16,320 --> 00:29:20,692
which is a work in progress
and a research project.

515
00:29:20,725 --> 00:29:25,997
[Downey]<i> After six months
 of waiting, Tim Shaw is
 about to find out.</i>

516
00:29:26,030 --> 00:29:28,365
<i> The team working on
 his speech recognition model</i>

517
00:29:28,399 --> 00:29:31,102
<i> is coming to his house
 for a practice run.</i>

518
00:29:33,471 --> 00:29:34,539
[doorbell rings]

519
00:29:34,572 --> 00:29:36,174
[dog barks]

520
00:29:39,009 --> 00:29:41,079
[Sharon] Good girl,
come say hello.

521
00:29:41,913 --> 00:29:43,548
-Hi!
-Oh, hi!

522
00:29:43,581 --> 00:29:44,749
Welcome.

523
00:29:44,782 --> 00:29:45,850
-Hi!
-Come in.

524
00:29:45,883 --> 00:29:47,752
Thanks for having us.

525
00:29:47,785 --> 00:29:49,921
[Sharon] He's met
some of you before,
right?

526
00:29:49,954 --> 00:29:51,823
How are you
doing, Tim?

527
00:29:51,856 --> 00:29:53,591
-Hi, Tim.
-Good to see you.

528
00:29:53,625 --> 00:29:55,259
-Hello.
-Hello.

529
00:29:55,293 --> 00:29:56,327
Hi.

530
00:29:56,361 --> 00:29:59,097
Hi, I'm Julie.
We saw each other
on the camera.

531
00:30:01,065 --> 00:30:03,401
It's warmer here
than it is in Boston.

532
00:30:03,434 --> 00:30:05,202
[Sharon] As it should be.

533
00:30:05,236 --> 00:30:06,571
[all laughing]

534
00:30:09,340 --> 00:30:10,408
Okay.

535
00:30:11,408 --> 00:30:13,377
Lead the way, Tim.

536
00:30:13,410 --> 00:30:16,714
[Cattiau]<i> I'm excited
 to share with Tim
 and his parents</i>

537
00:30:16,747 --> 00:30:18,249
what we've been working on.

538
00:30:18,283 --> 00:30:21,018
I'm a little bit nervous.
I don't know if the app

539
00:30:21,052 --> 00:30:24,155
is going to behave
the way we hope it will behave,

540
00:30:24,188 --> 00:30:27,792
but I'm also very excited
to learn new things

541
00:30:27,825 --> 00:30:29,260
and to hear Tim's feedback.

542
00:30:29,293 --> 00:30:32,063
So I brought
two versions with me.

543
00:30:32,096 --> 00:30:34,899
I was supposed to pick,
but I decided
to just bring both

544
00:30:34,932 --> 00:30:37,335
just in case one is better
than the other,

545
00:30:37,368 --> 00:30:38,803
and, just so you know,

546
00:30:38,836 --> 00:30:41,639
this one here was trained

547
00:30:41,672 --> 00:30:44,075
only using recordings
of your voice,

548
00:30:44,108 --> 00:30:47,445
and this one here was trained
using recordings of your voice,

549
00:30:47,478 --> 00:30:51,215
and also from other participants
from ALS TDI

550
00:30:51,248 --> 00:30:54,685
who went through
the same exercise of...
[laughing]

551
00:30:54,718 --> 00:30:57,388
So, okay.

552
00:30:57,422 --> 00:30:59,924
I was hoping we could
give them a try.

553
00:30:59,958 --> 00:31:02,260
Are we ready?

554
00:31:03,995 --> 00:31:06,965
Who are you talking about?

555
00:31:12,236 --> 00:31:13,905
[app chimes]

556
00:31:15,206 --> 00:31:16,307
It got it.

557
00:31:16,340 --> 00:31:18,243
[John] It got it.

558
00:31:19,643 --> 00:31:20,912
[gasps]

559
00:31:24,582 --> 00:31:27,451
[Tim] Is he coming?

560
00:31:27,484 --> 00:31:28,453
[app chimes]

561
00:31:29,554 --> 00:31:30,889
Yes.

562
00:31:33,991 --> 00:31:36,961
Are you working today?

563
00:31:36,994 --> 00:31:38,396
[app chimes]

564
00:31:44,835 --> 00:31:46,004
[chuckling]

565
00:31:46,037 --> 00:31:48,873
It's wonderful.

566
00:31:48,906 --> 00:31:49,907
[Cattiau] Cool.

567
00:31:49,941 --> 00:31:51,909
Thank you
for trying this.

568
00:31:51,943 --> 00:31:53,044
-Wow!
-It's fabulous.

569
00:31:53,077 --> 00:31:55,279
[John] What I love,
it made mistakes,

570
00:31:55,312 --> 00:31:57,749
-and then it corrected itself.
-Yeah.

571
00:31:57,782 --> 00:31:59,650
I was watching it like,
"That's not it,"

572
00:31:59,683 --> 00:32:02,686
and then it went...
[mimics app]
Then it does it right.

573
00:32:02,720 --> 00:32:04,088
These were phrases,

574
00:32:04,122 --> 00:32:06,890
part of the 70%
that we actually used

575
00:32:06,924 --> 00:32:08,192
to train the model,

576
00:32:08,225 --> 00:32:11,795
but we also set aside
30% of the phrases,

577
00:32:11,828 --> 00:32:14,431
so this might not do as well,

578
00:32:14,465 --> 00:32:18,102
but I was hoping that
we could try some of these too.

579
00:32:18,136 --> 00:32:20,070
[John] So what
we've already done

580
00:32:20,104 --> 00:32:23,474
is him using phrases
that were used
to train the app.

581
00:32:23,508 --> 00:32:24,608
That's right.

582
00:32:24,641 --> 00:32:27,578
Now we're trying to see
if it can recognize phrases

583
00:32:27,611 --> 00:32:30,915
-that weren't part of that.
-[Cattiau] Yes, that's right.

584
00:32:30,948 --> 00:32:32,383
So let's give it a try?

585
00:32:33,917 --> 00:32:36,387
Do you want me to?

586
00:32:39,656 --> 00:32:43,061
Do you have the time to play?

587
00:32:44,828 --> 00:32:46,030
[app chimes]

588
00:32:47,965 --> 00:32:50,501
What happens afterwards?

589
00:32:52,469 --> 00:32:53,237
[app chimes]

590
00:32:53,270 --> 00:32:55,840
Huh. So, on the last one,

591
00:32:55,873 --> 00:32:58,242
this one got it,
and this one didn't.

592
00:32:58,275 --> 00:33:02,546
-We'll pause it. So...
-I love the first one,
where it says,

593
00:33:02,580 --> 00:33:04,782
-"Can you help me
take a shower?"
-[laughing]

594
00:33:04,815 --> 00:33:07,919
-[Cattiau] That's not at all
what he said.
-[John] I know,

595
00:33:07,952 --> 00:33:10,487
you've gotta be really careful
what you ask for.

596
00:33:10,521 --> 00:33:12,122
[all laughing]

597
00:33:12,156 --> 00:33:15,559
[John] So if, when it's
interpreting his voice,

598
00:33:15,592 --> 00:33:17,295
and it makes some errors,

599
00:33:17,328 --> 00:33:19,263
is there a way we can
correct it?

600
00:33:19,296 --> 00:33:22,066
Yeah. We want to add
the option

601
00:33:22,099 --> 00:33:24,335
for you guys to fix
the recordings,

602
00:33:24,368 --> 00:33:27,104
but as of today,
because this is
the very first time

603
00:33:27,137 --> 00:33:28,706
we actually tried this,

604
00:33:28,739 --> 00:33:30,474
we don't have it yet.

605
00:33:30,508 --> 00:33:32,910
[Cattiau]<i> This is still
 a work in progress.</i>

606
00:33:32,943 --> 00:33:34,811
<i> We have
 a speech recognition model</i>

607
00:33:34,845 --> 00:33:36,680
<i> that works for Tim Shaw,</i>

608
00:33:36,714 --> 00:33:38,449
which is, you know, one person,

609
00:33:38,482 --> 00:33:41,185
and we're really hoping
that, you know,

610
00:33:41,218 --> 00:33:43,254
this technology can work
for many people.

611
00:33:43,287 --> 00:33:46,090
There's something else
I want you to try,

612
00:33:46,123 --> 00:33:47,525
if that's okay?

613
00:33:47,558 --> 00:33:50,761
We're working with
another team at Google
called DeepMind.

614
00:33:50,794 --> 00:33:55,366
They're specialized in
voice imitation and synthesis.

615
00:33:56,500 --> 00:33:57,468
[Downey]<i> In 2019,</i>

616
00:33:57,501 --> 00:34:00,605
<i> Tim wrote a letter
 to his younger self.</i>

617
00:34:02,540 --> 00:34:05,776
<i> They are words written by
 a 34-year-old man with ALS</i>

618
00:34:05,810 --> 00:34:08,145
<i> who has trouble communicating</i>

619
00:34:08,179 --> 00:34:09,513
<i> sent back in time</i>

620
00:34:09,547 --> 00:34:15,252
<i> to a 22-year-old
 on the cusp of NFL greatness.</i>

621
00:34:15,285 --> 00:34:18,389
[Cattiau] So let me
give this a try.

622
00:34:18,422 --> 00:34:21,892
I just like using this letter
because it's just so beautiful,

623
00:34:21,925 --> 00:34:24,729
so let me see
if this is gonna work.

624
00:34:27,231 --> 00:34:30,567
[Tim's younger voice]
<i> So, I've decided to
 write you this letter</i>

625
00:34:30,601 --> 00:34:32,636
<i> 'cause I have so much
 to tell you.</i>

626
00:34:32,670 --> 00:34:33,905
<i> I want to explain to you</i>

627
00:34:33,938 --> 00:34:35,973
<i> why it's so difficult
 for me to speak,</i>

628
00:34:36,006 --> 00:34:37,974
<i> the diagnosis, all of it,</i>

629
00:34:38,008 --> 00:34:39,443
<i> and what my life is like now,</i>

630
00:34:39,476 --> 00:34:41,712
<i> 'cause one day,
 you will be in my shoes,</i>

631
00:34:41,745 --> 00:34:44,348
<i>living with the same struggles.</i>

632
00:34:57,961 --> 00:35:01,599
It's his voice,
that I'd forgotten.

633
00:35:14,945 --> 00:35:16,347
We do.

634
00:35:47,578 --> 00:35:49,280
[app chimes]

635
00:36:03,727 --> 00:36:05,028
[app chimes]

636
00:36:05,062 --> 00:36:06,998
We're so happy
to be working with you.

637
00:36:07,031 --> 00:36:08,466
It's really an honor.

638
00:36:12,536 --> 00:36:14,938
[John] The thought
that one day,

639
00:36:14,971 --> 00:36:18,409
that can be linked with this,

640
00:36:18,442 --> 00:36:20,310
and when you speak
as you are now,

641
00:36:20,344 --> 00:36:22,947
it will sound
like that, is...

642
00:36:29,319 --> 00:36:31,689
It's okay. We'll wait.

643
00:36:33,057 --> 00:36:34,758
[Cattiau]<i> There is
 a lot of unknown</i>

644
00:36:34,791 --> 00:36:37,861
<i> and still a lot of research
 to be conducted.</i>

645
00:36:37,894 --> 00:36:41,198
We're really trying to have
a proof of concept first,

646
00:36:41,232 --> 00:36:44,268
and then expand to not only
people who have ALS,

647
00:36:44,302 --> 00:36:45,669
<i> but people
 who had a stroke,</i>

648
00:36:45,702 --> 00:36:48,539
or a traumatic
brain injury,
multiple sclerosis,

649
00:36:48,572 --> 00:36:51,008
any types of
neurologic conditions.

650
00:36:51,041 --> 00:36:52,776
Maybe other languages,
too, you know?

651
00:36:52,809 --> 00:36:56,413
I would really like this
to work for French, for example.

652
00:36:56,446 --> 00:36:58,683
[Mega]<i> Wouldn't it be
 a wonderful opportunity</i>

653
00:36:58,716 --> 00:37:01,518
<i> to bring technology
to problems that we're solving</i>

654
00:37:01,551 --> 00:37:03,253
<i>in life science and healthcare,</i>

655
00:37:03,286 --> 00:37:05,222
and in fact,
it's a missed opportunity

656
00:37:05,256 --> 00:37:07,892
if we don't try to bring
the best technologies

657
00:37:07,925 --> 00:37:09,126
<i> to help people.</i>

658
00:37:09,159 --> 00:37:10,728
This is really
just the beginning.

659
00:37:10,761 --> 00:37:13,097
[Downey]
<i> Just the beginning indeed.</i>

660
00:37:13,130 --> 00:37:15,332
<i> Imagine the possibilities.</i>

661
00:37:15,365 --> 00:37:19,169
I think in the imaginable
future for AI and healthcare

662
00:37:19,202 --> 00:37:22,139
is that there is
no healthcare anymore,

663
00:37:22,173 --> 00:37:24,008
<i> because nobody needs it.</i>

664
00:37:24,041 --> 00:37:27,744
<i> You could have an AI
 that is directly talking
 to your immune system,</i>

665
00:37:27,778 --> 00:37:30,480
<i> and is actually preemptively
 creating the antibodies</i>

666
00:37:30,514 --> 00:37:32,917
<i> for the epidemics
 that are coming your way,</i>

667
00:37:32,950 --> 00:37:34,184
and will not be stopped.

668
00:37:34,218 --> 00:37:37,254
This will not happen
tomorrow, but it's
the long-term goal

669
00:37:37,288 --> 00:37:38,756
that we can point towards.

670
00:37:43,827 --> 00:37:46,163
[Downey]<i> Tim had never
 heard his own words</i>

671
00:37:46,196 --> 00:37:48,799
<i> read out loud before today.</i>

672
00:37:48,832 --> 00:37:51,568
Neither had his parents.

673
00:37:51,602 --> 00:37:54,939
[Tim]<i> Every single day
 is a struggle for me.</i>

674
00:37:54,972 --> 00:37:58,075
<i> I can barely move my arms.</i>

675
00:37:58,109 --> 00:38:00,410
[John] Have fun.

676
00:38:00,443 --> 00:38:02,346
<i> I can't walk on my own,</i>

677
00:38:02,379 --> 00:38:06,550
<i> so I recently started
 using a wheelchair.</i>

678
00:38:07,551 --> 00:38:10,454
<i> I have trouble
 chewing and swallowing.</i>

679
00:38:10,487 --> 00:38:12,857
<i>I'd kill for a good pork chop.</i>

680
00:38:12,890 --> 00:38:16,794
<i> Yes, my body is failing,
 but my mind is not giving up.</i>

681
00:38:18,262 --> 00:38:20,430
<i> Find what's most important
 in your life,</i>

682
00:38:20,464 --> 00:38:22,566
<i> and live for that.</i>

683
00:38:24,034 --> 00:38:26,937
<i> Don't let three letters, NFL,</i>

684
00:38:26,970 --> 00:38:28,372
<i> define you...</i>

685
00:38:28,406 --> 00:38:30,007
[crowd cheering]

686
00:38:31,809 --> 00:38:36,514
<i> ...the same way I refuse
to let three letters define me.</i>

687
00:38:43,254 --> 00:38:45,455
[John]<i> One of the things
 Tim has taught us,</i>

688
00:38:45,489 --> 00:38:48,925
<i> and I think it's a lesson
 for everyone...</i>

689
00:38:48,959 --> 00:38:51,962
Medically speaking,
Tim's life has an end to it.

690
00:38:51,995 --> 00:38:55,966
In fact, five years ago
we were told he only had
two to five years left.

691
00:38:55,999 --> 00:38:57,368
We're already past that.

692
00:38:58,735 --> 00:39:01,638
<i> He has learned very quickly</i>

693
00:39:01,672 --> 00:39:05,442
<i> that today
 is the day that we have,</i>

694
00:39:05,475 --> 00:39:08,579
and we can ruin today

695
00:39:08,612 --> 00:39:10,314
by thinking about yesterday

696
00:39:10,347 --> 00:39:12,082
and how wonderful it used to be,

697
00:39:12,116 --> 00:39:16,053
<i> and, "Oh, woe is me,"</i>
and "I wish it was like that."

698
00:39:16,086 --> 00:39:17,521
<i> We can also ruin today</i>

699
00:39:17,554 --> 00:39:19,222
<i> by looking into the future,</i>

700
00:39:19,256 --> 00:39:20,290
<i> and in Tim's case,</i>

701
00:39:20,323 --> 00:39:22,425
<i> how horrible
 this is going to be.</i>

702
00:39:22,459 --> 00:39:23,794
"This is going to happen,"

703
00:39:23,827 --> 00:39:25,996
<i> "I won't be able
 to do this anymore."</i>

704
00:39:26,029 --> 00:39:28,665
<i> So if we go either
 of those directions,</i>

705
00:39:28,698 --> 00:39:31,068
<i> it spoils us
 from being present today.</i>

706
00:39:31,101 --> 00:39:32,569
<i>That's a lesson for all of us.</i>

707
00:39:32,602 --> 00:39:35,406
<i> Whether we have
 an ALS diagnosis or not,</i>

708
00:39:35,439 --> 00:39:38,241
try to see the good
and the blessing of every day.

709
00:39:38,275 --> 00:39:40,644
You're here with us today.

710
00:39:40,677 --> 00:39:42,747
It's going to be a good day.
