WEBVTT

00:00.533 --> 00:03.700
(audience applauding)

00:11.370 --> 00:13.500
- I hope everyone had
a good lunch or is busy

00:13.500 --> 00:15.343
finishing up an excellent lunch.

00:18.800 --> 00:21.593
I'm joined by two close friends of mine,

00:22.670 --> 00:25.020
and I'm probably the only
person who can say this

00:25.020 --> 00:26.370
in the entire world.

00:26.370 --> 00:28.786
I work with and for both of them.

00:28.786 --> 00:29.670
(audience laughing)

00:29.670 --> 00:32.540
So I want to make sure I
disclose my conflict of interest

00:32.540 --> 00:34.300
to start with.

00:34.300 --> 00:35.810
So General Shanahan went

00:35.810 --> 00:38.500
to Michigan
- Go Blue

00:38.500 --> 00:42.380
- Go Blue, ROTC, entered
the service of our country

00:42.380 --> 00:43.453
in 1984.

00:45.010 --> 00:46.900
He's been promoted a gazillion times,

00:46.900 --> 00:48.120
he was in charge of a whole bunch

00:48.120 --> 00:49.360
of intelligence activities,

00:49.360 --> 00:51.430
a whole bunch of operational activities,

00:51.430 --> 00:55.540
and eventually we needed
somebody operationally

00:55.540 --> 00:58.220
to implement AI in the entire DoD

00:58.220 --> 01:01.720
and he was the perfect choice.

01:01.720 --> 01:04.370
So I work with him in my
role as Chairman of the DIB.

01:05.310 --> 01:09.133
Kent Walker was a federal prosecutor.

01:10.346 --> 01:13.910
Law and order federal prosecutor
who then chose to come

01:13.910 --> 01:15.370
to Silicon Valley, actually I think worked

01:15.370 --> 01:17.100
at eBay for a while.

01:17.100 --> 01:20.540
And then we snagged him, we
being Google, maybe 15 years

01:20.540 --> 01:21.730
ago,
- Yeah, I guess

01:21.730 --> 01:24.150
- 15 years every day together

01:25.070 --> 01:27.870
and during that time not
only did he set up our legal

01:27.870 --> 01:31.490
function but is now in
charge of all global policy,

01:31.490 --> 01:33.120
PR, all those sorts of things together

01:33.120 --> 01:35.630
so very, very significant players.

01:35.630 --> 01:39.100
And what I thought we should
do since you all have heard

01:39.100 --> 01:44.100
from me plenty, is to simply
start, and perhaps Kent,

01:44.460 --> 01:46.440
we should just have you make some comments

01:46.440 --> 01:48.730
about the world as you see it today.

01:48.730 --> 01:51.250
- Sure, so thank you very much Eric,

01:51.250 --> 01:53.000
General Shanahan, it's a
pleasure to be with you today

01:53.000 --> 01:54.530
and with all of you.

01:54.530 --> 01:58.450
The topic of today's panel,
Public-Private Partnerships,

01:58.450 --> 02:00.510
is extraordinarily important to me.

02:00.510 --> 02:03.990
I grew up in this community,
my father was in the service

02:03.990 --> 02:06.870
for 24 years, I was born
and spent the first years

02:06.870 --> 02:09.350
of my life on U.S. military bases.

02:09.350 --> 02:11.860
My father finished his career at Lockheed,

02:11.860 --> 02:14.700
so I feel a profound commitment
to getting this right,

02:14.700 --> 02:17.816
to making sure that the private
sector, the defense sector,

02:17.816 --> 02:22.090
and universities can work
together in the best possible way.

02:22.090 --> 02:25.410
Before we jump in to the
thoughts on how we can accomplish

02:25.410 --> 02:29.180
that, I wanted to take
on two issues up front.

02:29.180 --> 02:32.610
It's been frustrating
to hear concerns around

02:32.610 --> 02:35.650
our commitments to the
national security and defense

02:35.650 --> 02:38.880
and so I wanted to set the
record straight on two issues.

02:38.880 --> 02:40.360
First on China.

02:40.360 --> 02:43.750
In 2010 you may remember
that Google was public

02:43.750 --> 02:46.360
about an attack on our
infrastructure that originated

02:46.360 --> 02:50.930
in China, a sophisticated
cyber security attack.

02:50.930 --> 02:53.770
We learned a lot from that
experience and while a number

02:53.770 --> 02:56.600
of our peer companies have
significant commercial

02:56.600 --> 02:59.610
and AI operations in China,
we have chosen to scope

02:59.610 --> 03:02.260
our operations there very carefully.

03:02.260 --> 03:06.110
Our focus is on advertising
and on work supporting

03:06.110 --> 03:07.310
an open source platform.

03:08.150 --> 03:10.670
Second, with regard to
the more general question

03:10.670 --> 03:13.210
of national security and our engagement

03:13.210 --> 03:17.770
in the Maven Project, it
is an area where it's right

03:17.770 --> 03:20.410
that we decided to press the reset button

03:20.410 --> 03:22.750
until we had an opportunity
to develop our own set

03:22.750 --> 03:25.040
of AI principles, our own work with regard

03:25.040 --> 03:27.760
to internal standards
and review processes.

03:27.760 --> 03:30.680
But that was the decision
focused on a discrete contract,

03:30.680 --> 03:32.995
not a broader statement
about our willingness

03:32.995 --> 03:36.060
or history of working with
the Department of Defense

03:36.060 --> 03:38.290
and the National Security Agency.

03:38.290 --> 03:42.270
We continue to do that, we
are committed to doing that,

03:42.270 --> 03:45.200
and that work builds on
a long tradition of work

03:45.200 --> 03:49.860
throughout the Valley on
national security generally.

03:49.860 --> 03:53.300
It's important to remember
that the history of the Valley

03:53.300 --> 03:55.910
in large measure builds
on government technologies

03:55.910 --> 04:00.910
from radar to the internet
to GPS to some of the work

04:00.940 --> 04:04.160
on autonomous vehicles
and personal assistants

04:04.160 --> 04:05.760
you're seeing now.

04:05.760 --> 04:08.560
Just in the last couple of
weeks we had an extraordinary

04:08.560 --> 04:11.430
accomplishment with regard
to quantum supremacy

04:11.430 --> 04:13.730
which moved forward the
frontiers of science

04:13.730 --> 04:16.570
and technology, but that
was not an achievement

04:16.570 --> 04:17.920
by Google alone.

04:17.920 --> 04:20.220
It built on research that had been done

04:20.220 --> 04:22.320
at the University of
California, Santa Barbara,

04:22.320 --> 04:24.620
it benefited from extensive consultation

04:24.620 --> 04:27.190
with research scientists at NASA,

04:27.190 --> 04:30.070
it was carried out in many
ways on super computers

04:30.070 --> 04:31.900
from the Department of Energy.

04:31.900 --> 04:35.900
So those kinds of exchanges
and collaborations

04:35.900 --> 04:39.470
are really key to what has
made American technological

04:39.470 --> 04:42.700
innovation as successful as it's been.

04:42.700 --> 04:45.190
And just as we feel as we're contributing

04:45.190 --> 04:48.260
to the defense community,
national security community,

04:48.260 --> 04:51.570
a lot of that work, that
community, is a part of Google.

04:51.570 --> 04:54.000
We have lots of vets who work at Google.

04:54.000 --> 04:56.110
We go above and beyond to
make sure that reservists

04:56.110 --> 04:58.800
working at Google can complete
their military service

04:58.800 --> 05:01.860
while having thriving
careers, and even our tools

05:01.860 --> 05:04.580
we have tried to take steps
to make sure that vets

05:04.580 --> 05:08.620
transitioning to civilian
life can make the best use

05:08.620 --> 05:11.433
of their military skills
in the private sector.

05:12.530 --> 05:15.880
As we do that we also are
fully engaged in a wide

05:15.880 --> 05:18.840
variety of work with different agencies.

05:18.840 --> 05:21.340
With the JAIC we are working
on a number of national

05:21.340 --> 05:25.750
mission initiatives from
cyber security to health care

05:25.750 --> 05:27.470
to business automation.

05:27.470 --> 05:31.360
With DARPA we are working
on a number of fundamental

05:31.360 --> 05:34.710
projects to ensure the robustness of AI,

05:34.710 --> 05:38.770
to identify deep fakes and
to progress work on the end

05:38.770 --> 05:42.990
of Moore's Law and how
to progress the operation

05:42.990 --> 05:46.010
of hardware and use
software/hardware interfaces

05:46.010 --> 05:47.480
in better ways.

05:47.480 --> 05:50.650
So as we take on those
kinds of things we're eager

05:50.650 --> 05:54.040
to do more, we are pursuing
actively additional

05:54.040 --> 05:57.400
certifications that will
allow us to more fully engage

05:57.400 --> 06:00.200
across a range of these
different topic areas.

06:00.200 --> 06:02.410
And we think that's extremely important.

06:02.410 --> 06:04.430
At the same time we think
there's a great partnership

06:04.430 --> 06:07.450
to be had on the work
that the DIB has announced

06:07.450 --> 06:10.120
in the last week, their AI principles,

06:10.120 --> 06:12.860
which were I thought very well done,

06:12.860 --> 06:15.140
it's a lengthy document
but a thoughtful document.

06:15.140 --> 06:17.380
And really continues
to build on the groundwork

06:17.380 --> 06:20.230
that was laid by the
Department of Defense back

06:20.230 --> 06:23.940
in I think it was 2012 with
Directive 3000.09,

06:23.940 --> 06:27.030
which talked about
the use of human judgment

06:28.293 --> 06:29.850
in the application of
advanced technologies,

06:29.850 --> 06:32.880
the charter of the JAIC,
the work the DoD has done

06:32.880 --> 06:34.230
with its own AI principles.

06:35.242 --> 06:37.270
In the private sector,
we too have been trying

06:37.270 --> 06:38.360
to drive forward on this.

06:38.360 --> 06:42.090
We have not only put out
principles in very common

06:42.090 --> 06:44.430
and overlapping areas,
there's a lot in common

06:44.430 --> 06:45.680
in these questions.

06:45.680 --> 06:50.680
Safety, human judgment,
accountability, explainability,

06:50.970 --> 06:54.740
fairness are all critical
areas where different

06:56.090 --> 06:58.260
actors in the space, each
have different things

06:58.260 --> 07:01.430
to contribute and I think
that's critically important.

07:01.430 --> 07:04.550
This is a shared responsibility
to get this right.

07:04.550 --> 07:07.420
As the DIB report notes,
we need a global framework,

07:07.420 --> 07:10.110
we need a global approach
to these issues endorsing

07:10.110 --> 07:12.240
the OECD framework around these issues,

07:12.240 --> 07:15.940
extremely important and something
that we want to support,

07:15.940 --> 07:17.540
and we are working together to figure out

07:17.540 --> 07:19.660
where are the complementarities.

07:19.660 --> 07:21.130
Because at the end of
the day we are a proud

07:21.130 --> 07:24.010
American company, we are
committed to the defense

07:24.010 --> 07:26.590
of the United States,
our allies and the safety

07:26.590 --> 07:29.140
and security of the world,
we are eager to continue

07:29.140 --> 07:32.500
this work and think about
places we can work together

07:32.500 --> 07:34.250
to build on each other's strengths.

07:35.230 --> 07:37.100
- Well thank you Kent.

07:37.100 --> 07:39.930
General, take us through what you're up to

07:39.930 --> 07:41.045
at the JAIC.

07:41.045 --> 07:42.500
- Well so first of all let me say thanks,

07:42.500 --> 07:45.050
it is great to be here
and I thank both Eric

07:45.050 --> 07:46.650
and Kent for the opportunity to do this.

07:46.650 --> 07:49.870
Admittedly though, I will
say I'm a poor substitute

07:49.870 --> 07:51.570
for the Chairman of the
Joint Chiefs of Staff,

07:51.570 --> 07:54.090
General Milley, although I
say it's a lower probability

07:54.090 --> 07:55.800
of any headline-grabbing soundbites,

07:55.800 --> 07:57.770
so you get that with it.

07:57.770 --> 08:00.960
And I also confess this
is undoubtedly the first

08:00.960 --> 08:03.470
and last time I will serve as a warmup act

08:03.470 --> 08:05.150
for Dr. Henry Kissinger

08:05.150 --> 08:07.391
(audience laughing)

08:07.391 --> 08:10.090
but hang on for the main event, as I say.

08:10.090 --> 08:12.800
I not only welcome but
I relish the opportunity

08:12.800 --> 08:14.500
to have this broader conversation

08:14.500 --> 08:17.320
about public-private partnerships.

08:17.320 --> 08:19.700
When you ask me to reflect
back on my two years

08:19.700 --> 08:22.620
as the Director of Project
Maven, and just about

08:22.620 --> 08:25.650
a year in the seat as
the director of the JAIC,

08:25.650 --> 08:28.310
there is one over-arching
thing that continues

08:28.310 --> 08:30.840
to resonate strongly with me.

08:30.840 --> 08:34.750
It's the importance, and
I would say the necessity,

08:34.750 --> 08:37.260
of strengthening bonds between government,

08:37.260 --> 08:39.203
industry and academia.

08:40.130 --> 08:42.927
This was said this
morning, you brought it up

08:42.927 --> 08:44.630
and others had also mentioned it,

08:44.630 --> 08:48.350
this idea that this relationship
should be depicted

08:48.350 --> 08:49.410
as a triangle.

08:49.410 --> 08:51.870
And it actually should be in the form

08:51.870 --> 08:56.260
of an equilateral triangle:
government, academia, industry.

08:56.260 --> 08:59.920
I would suggest that that's
largely the form it did take

08:59.920 --> 09:02.960
beginning in the 1950s
and largely lasting

09:02.960 --> 09:05.050
until the early part of this decade.

09:05.050 --> 09:07.810
Walter Isaacson writes
about this very eloquently

09:07.810 --> 09:10.970
and powerfully in his
book, The Innovators.

09:10.970 --> 09:14.850
It is what really led
to the Silicon Valley of today.

09:14.850 --> 09:16.920
It's not the case today.

09:16.920 --> 09:19.880
At best the sides of the
triangle are no longer

09:19.880 --> 09:23.976
equidistant, you might
even say they are distorted

09:23.976 --> 09:27.350
or a little frayed in addition to being

09:27.350 --> 09:28.860
of different lengths.

09:28.860 --> 09:32.160
The reasons for that are
complex and they're multi-fold.

09:32.160 --> 09:36.700
Snowden, Apple encryption,
mismatched operating tempo

09:36.700 --> 09:39.810
and agility, different business
models, general mistrust

09:39.810 --> 09:41.680
between the government and industry.

09:41.680 --> 09:44.220
We started talking past each other instead

09:44.220 --> 09:45.800
of with each other.

09:45.800 --> 09:49.410
The task is made much more
difficult today by the fact

09:49.410 --> 09:52.180
that industry is moving
so much faster than the

09:52.180 --> 09:54.640
Department of Defense, in
fact the rest of government,

09:54.640 --> 09:58.040
when it comes to the adoption
and integration of AI.

09:58.040 --> 10:00.220
We're playing perpetual catch-up.

10:00.220 --> 10:03.460
And some employees in
the tech industry see no

10:03.460 --> 10:06.510
compelling reason to work
with the Department of Defense

10:06.510 --> 10:08.560
and even those who want to work with DoD,

10:08.560 --> 10:11.560
which I say is far more
than sometimes is portrayed,

10:11.560 --> 10:14.170
I say put everybody in
this room in that category,

10:14.170 --> 10:17.490
we don't make it easy for them.

10:17.490 --> 10:20.070
So I would just reinforce
some of the themes

10:20.070 --> 10:23.120
that are in the Security
Commission's report

10:23.120 --> 10:25.450
or the interim report,
and that is this idea

10:25.450 --> 10:30.450
of a shared sense of responsibility
about our AI future.

10:30.640 --> 10:33.860
A shared vision about
the importance of trust

10:33.860 --> 10:36.110
and transparency.

10:36.110 --> 10:39.340
Our national security depends on it.

10:39.340 --> 10:42.600
And even for those who for various reasons

10:42.600 --> 10:46.260
still view DoD with suspicion
or who are reluctant

10:46.260 --> 10:48.890
to accept that we are in
a strategic competition

10:48.890 --> 10:51.970
with China, I would hope they
would still agree with us

10:51.970 --> 10:55.890
that AI is a critical component
of our nation's prosperity,

10:55.890 --> 10:59.430
vitality and self-sufficiency.

10:59.430 --> 11:02.320
So in other words, no
matter where you stand

11:02.320 --> 11:05.320
with respect to the government's
future use of AI-enabling

11:05.320 --> 11:10.070
technologies, I submit
that we can never attain

11:10.070 --> 11:13.510
the vision outlined in the
Commission's interim report

11:13.510 --> 11:16.950
without industry and
academia with us together

11:16.950 --> 11:18.600
in an equal partnership.

11:18.600 --> 11:21.730
There's too much at stake to do otherwise,

11:21.730 --> 11:23.360
we are in this together.

11:23.360 --> 11:27.700
Public-private partnerships
are the very essence

11:27.700 --> 11:30.060
of America's success as a nation,

11:30.060 --> 11:32.950
not only in the Department
of Defense but across

11:32.950 --> 11:34.900
the entire United States government.

11:34.900 --> 11:36.900
So the message we want to send today,

11:36.900 --> 11:40.343
we have to get this triangle
back to what it used to be.

11:41.210 --> 11:42.623
- Well thank you General.

11:44.230 --> 11:46.950
I think I'm gonna ask a couple
of questions to both of you

11:46.950 --> 11:48.810
and let's start with the
same question to both.

11:48.810 --> 11:52.253
Kent, talk about Maven some more.

11:53.090 --> 11:54.482
- Sure.

11:54.482 --> 11:57.030
(audience laughing)

11:57.030 --> 11:59.810
Well I think it's no
secret that we came up

11:59.810 --> 12:01.200
as a consumer company.

12:01.200 --> 12:03.310
We are quickly evolving into also becoming

12:03.310 --> 12:06.000
an enterprise company and
putting a lot of resources

12:06.000 --> 12:08.090
into that, but there
are different protocols

12:08.090 --> 12:11.040
and different ways of
engaging, and as we go along

12:11.040 --> 12:14.400
that journey, I'd be lying
if I told you that everybody,

12:14.400 --> 12:16.670
all our employees have an identical view

12:16.670 --> 12:19.040
on a lot of hard issues, they don't.

12:19.040 --> 12:22.110
But in some ways that
debate and discussion

12:22.110 --> 12:25.600
is the positive as well as the negative.

12:25.600 --> 12:27.570
It's, in many ways it's
in our DNA but it's

12:27.570 --> 12:29.020
in the DNA of America.

12:29.020 --> 12:31.820
You could argue that that
kind of constructive debate

12:31.820 --> 12:34.660
is America's first innovation.

12:34.660 --> 12:36.470
You look at great research scientists

12:36.470 --> 12:39.280
like Richard Feynman who was
one of the leading thinkers

12:39.280 --> 12:41.850
in quantum mechanics
who was also notoriously

12:41.850 --> 12:44.370
iconoclastic, free-thinking guy.

12:44.370 --> 12:48.320
We think out of that
comes incredible strength.

12:48.320 --> 12:51.990
If we work together well we can
actually have a more robust,

12:51.990 --> 12:55.150
more resilient framework,
a framework that helps

12:55.150 --> 12:57.870
build social trust as well
as a framework that works

12:57.870 --> 12:59.480
for the world.

12:59.480 --> 13:01.880
So as we put forward our AI principles

13:01.880 --> 13:05.180
and our governing processes,
because an important thing

13:05.180 --> 13:08.010
to note is that the principles
in a sense are easy.

13:08.010 --> 13:11.970
As the DIB report notes,
the report devotes a couple

13:11.970 --> 13:14.770
of pages to the principles
and a long section

13:14.770 --> 13:17.690
to the implementation
because you quickly discover

13:17.690 --> 13:20.400
that a lot of the hard problems
are when the principles

13:20.400 --> 13:22.060
conflict and are challenging.

13:22.060 --> 13:24.270
We've had debates about
whether or not to publish

13:24.270 --> 13:26.734
a paper on lip reading which has...

13:26.734 --> 13:28.160
- [Eric] Say that again.

13:28.160 --> 13:30.430
- We have had debates
about whether to publish

13:30.430 --> 13:32.670
a paper on lip reading.

13:32.670 --> 13:34.940
Lip reading is a great
benefit to people who are

13:34.940 --> 13:37.230
hard of hearing around
the world, et cetera,

13:37.230 --> 13:38.920
but you could imagine it could be misused

13:38.920 --> 13:41.650
for surveillance and
other kinds of purposes.

13:41.650 --> 13:44.200
After reviewing a particular technology,

13:44.200 --> 13:45.720
we determined that it was appropriate

13:45.720 --> 13:48.170
to publish because that
particular technology

13:48.170 --> 13:51.120
was useful really only
in one-to-one settings

13:51.120 --> 13:53.200
not for surveillance at a distance.

13:53.200 --> 13:55.260
But it's an example of
the kinds of discussions

13:55.260 --> 13:57.780
we have around issues like lip reading

13:57.780 --> 14:00.900
or facial recognition or
other challenging questions

14:00.900 --> 14:03.500
where we have to come to
terms with the reality,

14:03.500 --> 14:05.080
the trade-offs that we're making.

14:05.080 --> 14:08.800
Very much the case in a
lot of these issues as well

14:08.800 --> 14:10.150
but we think there's an awful lot of room

14:10.150 --> 14:13.260
for collaboration and
coordination on cyber security,

14:13.260 --> 14:16.370
on logistics, on
transportation, on health care,

14:16.370 --> 14:18.150
many more topics where
we're already engaged

14:18.150 --> 14:19.710
with the military.

14:19.710 --> 14:20.870
- [Eric] General, same question.

14:20.870 --> 14:22.580
Tell us more about Maven.

14:22.580 --> 14:24.920
- Okay, so when we started Project Maven,

14:24.920 --> 14:27.610
our intent was to go
after commercial industry.

14:27.610 --> 14:30.340
Eric and the DIB had told us
this is where the solutions

14:30.340 --> 14:32.200
already exist, do not reinvent the wheel,

14:32.200 --> 14:33.230
it happens out there.

14:33.230 --> 14:35.520
And our approach was a simple one.

14:35.520 --> 14:38.740
We wanted everybody in the
market that was a small

14:38.740 --> 14:41.730
startup of 15 people, which
is one of the companies

14:41.730 --> 14:45.520
we got on contract, to
the biggest internet data

14:45.520 --> 14:47.920
cyber cloud companies in the world.

14:47.920 --> 14:50.060
And one of those happened
to be Google.

14:50.060 --> 14:52.960
Why did we go after,
or why did we go

14:52.960 --> 14:54.320
to Google with Project Maven,

14:54.320 --> 14:58.310
because we wanted to take the
best AI talent in the world

14:58.310 --> 15:01.120
and put it against our
most wicked problem set,

15:01.120 --> 15:03.230
wide-area motion imagery,
it's an extraordinarily

15:03.230 --> 15:07.210
difficult problem to go after
and we did a very successful

15:07.210 --> 15:09.590
collaboration with the
Google team on this.

15:09.590 --> 15:12.420
What was happening internal
to the company is how

15:12.420 --> 15:15.320
that played out is a little
bit, a different story,

15:15.320 --> 15:17.910
but we got all the way to
the end of the contract

15:17.910 --> 15:20.930
and we got products that
we were very pleased with.

15:20.930 --> 15:22.640
Now it was unfortunate
I think even for some

15:22.640 --> 15:24.800
of the software engineers on that project,

15:24.800 --> 15:26.450
they got to the point
where they almost felt

15:26.450 --> 15:29.500
a little bit ostracized
because others criticized

15:29.500 --> 15:31.410
them for working with the
Department of Defense.

15:31.410 --> 15:33.460
But day to day, from
the senior-most leader

15:33.460 --> 15:36.850
down to the people working
on the Project Maven team

15:36.850 --> 15:39.720
we got tremendous support on Project

15:39.720 --> 15:41.520
Maven from Google.

15:41.520 --> 15:44.630
What we found though, and
this is really the critique

15:44.630 --> 15:48.400
on both sides, is we lost
the narrative very quickly.

15:48.400 --> 15:50.980
And part of this was about
the company made a strategic

15:50.980 --> 15:53.230
decision really not to be
public about what they wanted

15:53.230 --> 15:54.063
to do.

15:54.063 --> 15:55.840
Our approach in the Department
of Defense was a willingness

15:55.840 --> 15:58.760
to talk as much as the company
wanted us to talk about,

15:58.760 --> 16:00.170
we'd do whatever the market would bear,

16:00.170 --> 16:01.750
in very general terms,
we didn't want to get

16:01.750 --> 16:03.240
into operational specifics.

16:03.240 --> 16:04.950
This was a project about intelligence,

16:04.950 --> 16:07.380
surveillance, reconnaissance on a drone,

16:07.380 --> 16:09.610
remotely piloted aircraft,
it had no weapons on it,

16:09.610 --> 16:11.650
it was not a weapons
project, it is not a weapons

16:11.650 --> 16:14.430
project, but what happened
is we started hearing

16:14.430 --> 16:16.390
these wild stories and assumptions

16:16.390 --> 16:19.530
about what Project Maven was
and was not to the point where

16:19.530 --> 16:21.810
if you google today, no pun intended,

16:21.810 --> 16:22.810
you actually google,

16:23.690 --> 16:26.320
the adjective controversial has now been

16:26.320 --> 16:28.530
inserted permanently in
front of Project Maven.

16:28.530 --> 16:30.800
It was not controversial to
me, it was not controversial

16:30.800 --> 16:33.190
to the team, I would say
it's not controversial

16:33.190 --> 16:35.640
to anybody right now
beyond some people who just

16:35.640 --> 16:36.900
don't like what we're doing.

16:36.900 --> 16:39.262
So I guess, to bring
it all the way full circle,

16:39.262 --> 16:42.030
this is an interesting
point I've thought a lot about,

16:42.030 --> 16:43.820
and I'm not sure everybody
fully appreciates

16:43.820 --> 16:45.610
or agrees with me.

16:45.610 --> 16:47.770
I view what happened
with Google and Maven as

16:47.770 --> 16:50.200
a little bit of a canary in a coal mine.

16:50.200 --> 16:52.630
The fact that it happened
when it did, as opposed to

16:52.630 --> 16:54.500
on the verge of a
conflict or a crisis where we're

16:54.500 --> 16:56.760
asking for their help,
we've gotten some of that

16:56.760 --> 16:58.340
out of the way, you've heard
Kent talking a little bit

16:58.340 --> 17:01.440
about a reset here
and how much the company

17:01.440 --> 17:04.000
and all the other
companies that we deal with

17:04.000 --> 17:06.100
want to work with the
Department of Defense.

17:06.100 --> 17:08.660
I think that narrative is
an important narrative,

17:08.660 --> 17:11.810
it happened, it would have
happened to somebody else

17:11.810 --> 17:14.630
at some point, but this
idea of transparency

17:14.630 --> 17:17.080
and a willingness to
talk about what each side

17:17.080 --> 17:19.520
is trying to achieve may
be the biggest lesson

17:19.520 --> 17:21.368
of all that I took from it.

17:21.368 --> 17:23.740
- It's a real tragedy
that we don't wear hats

17:23.740 --> 17:25.850
anymore because I could borrow three hats

17:25.850 --> 17:27.560
and figure out which hat I'm wearing.

17:27.560 --> 17:30.250
With my DIB hat on, I
can tell you that when

17:30.250 --> 17:34.300
I met General Shanahan, the real
problem inside the military

17:34.300 --> 17:37.880
is that we take these
exquisitely trained soldiers,

17:37.880 --> 17:40.160
airmen, so forth and so on,
and we put them in front

17:40.160 --> 17:42.750
of mind-numbing observational tasks.

17:42.750 --> 17:46.040
They literally watch screens all day.

17:46.040 --> 17:49.000
And it's a terrible
waste of the human asset

17:49.000 --> 17:50.820
that the military produces.

17:50.820 --> 17:53.650
And so there's a huge
opportunity to try to sort

17:53.650 --> 17:56.060
of get them to work at
a higher level position

17:56.060 --> 17:59.452
and that's why the DIB recommended,

17:59.452 --> 18:01.240
and indeed drove, the creation

18:01.240 --> 18:03.760
of the Joint Center for AI which Kent,

18:03.760 --> 18:06.070
you both stood up and now head.

18:06.070 --> 18:08.394
Let's talk about another
question for both of you

18:08.394 --> 18:11.780
which has to do with ethics.

18:11.780 --> 18:14.530
Now, in the middle of the
kerfuffle that went on

18:14.530 --> 18:17.810
inside of Google, Kent had
the good idea of having

18:17.810 --> 18:22.530
a formal AI ethics proposal,
and he drove inside of Google

18:22.530 --> 18:26.070
an ethics process which
produced a remarkable public

18:26.070 --> 18:30.330
document, now I have my
Google hat on, which I think

18:30.330 --> 18:33.220
is really quite definitive,
and I think maybe you could

18:33.220 --> 18:36.410
talk about that and then similarly,

18:36.410 --> 18:40.058
the DIB produced a
proposal to the military

18:40.058 --> 18:43.200
and I believe you are the
customer for the proposal

18:43.200 --> 18:46.093
that we wrote on military AI ethics.

18:47.870 --> 18:50.860
I assume both of you are
in favor, since Kent wrote

18:50.860 --> 18:53.330
the first one and all the other
industry companies have now

18:53.330 --> 18:56.450
largely copied variants of
your approach in one form

18:56.450 --> 18:58.870
or another, I assume you
are in favor of this.

18:58.870 --> 19:01.180
What are the consequences
of these ethics things,

19:01.180 --> 19:05.230
does it really work, does,
for example, does Google

19:05.230 --> 19:09.370
prevent, does Google turn off
things or stop doing things

19:09.370 --> 19:11.870
like in the last little
while, I mean, how does it

19:11.870 --> 19:14.603
actually work, and same
question for you General,

19:16.960 --> 19:19.180
there are people who claim
that the military won't operate

19:19.180 --> 19:20.830
under ethics principles.

19:20.830 --> 19:23.780
In our report we cite the
many rules the military

19:23.780 --> 19:24.960
is required to operate under.

19:24.960 --> 19:26.040
Maybe you could talk about that.

19:26.040 --> 19:26.890
So Kent?

19:26.890 --> 19:27.800
- Sure.

19:27.800 --> 19:30.700
So I think as the General
noted, having frameworks

19:30.700 --> 19:33.440
in place early on, both
the set of principles

19:33.440 --> 19:37.047
but then also the review
processes and escalation

19:37.047 --> 19:39.850
opportunities is a
critical part of internal

19:39.850 --> 19:42.120
as well as external transparency.

19:42.120 --> 19:44.500
It's right, among our
principles we've talked about

19:44.500 --> 19:47.560
surveillance being a concern,
so we want to make sure

19:47.560 --> 19:51.210
that some of the recognition
tools and the image

19:51.210 --> 19:53.510
tracking software that we're
developing are deployed

19:53.510 --> 19:54.800
in appropriate ways.

19:54.800 --> 19:56.480
We want to be a good partner,
we don't want to pull away

19:56.480 --> 19:59.410
support but we want to
make sure we know the scope

19:59.410 --> 20:02.640
of the project that we're
developing, and when we're

20:02.640 --> 20:04.920
licensing that for
commercial uses have a sense

20:04.920 --> 20:06.400
of the direction of travel there.

20:06.400 --> 20:08.550
I think that's a valuable
thing for both sides

20:08.550 --> 20:10.680
in terms of making sure
that expectations are clear

20:10.680 --> 20:13.110
and in terms of building
not only trust internally

20:13.110 --> 20:14.850
but trust across society.

20:14.850 --> 20:18.100
So another example would
be when it comes to general

20:18.100 --> 20:21.450
purpose APIs for facial
recognition where you don't know

20:21.450 --> 20:23.550
necessarily what use is
gonna be made of them,

20:23.550 --> 20:26.320
we said until we develop
more policy and more

20:26.320 --> 20:29.250
technological safeguards
we're gonna be very cautious

20:29.250 --> 20:30.830
about proceeding in that area.

20:30.830 --> 20:33.060
Another example is when
it comes to weapons,

20:33.060 --> 20:35.320
we have said this is a nascent technology,

20:35.320 --> 20:37.290
we want to be very careful
about the application

20:37.290 --> 20:40.230
of AI in this area, so that's not an area

20:40.230 --> 20:42.560
that we're pursuing, given our background,

20:42.560 --> 20:45.080
we recognize the limits of our experience

20:45.080 --> 20:46.000
in that area.

20:46.000 --> 20:48.070
Obviously the military is gonna be deeper

20:48.070 --> 20:49.990
and have more understanding
of safety implications

20:49.990 --> 20:50.823
and the like.

20:50.823 --> 20:52.350
So we're gonna continue
to work through these

20:52.350 --> 20:53.183
different areas.

20:53.183 --> 20:55.470
I think there's a remarkable
degree of convergence

20:55.470 --> 20:59.467
we see between the OECD, the DoD, the DIB,

21:00.500 --> 21:02.130
now internationally we're starting to see

21:02.130 --> 21:04.100
the European Commission
say they're coming up

21:04.100 --> 21:06.300
with regulations for
artificial intelligence

21:06.300 --> 21:07.670
in the next hundred days.

21:07.670 --> 21:09.500
I think this will be a
very interesting exercise

21:09.500 --> 21:12.280
as we all pursue kind of
a combination of how we

21:12.280 --> 21:15.180
build acceptance for this next
generation of technologies.

21:16.040 --> 21:19.560
- And so looking at it
through the DoD lens,

21:19.560 --> 21:22.080
this may be the best starting
point, when you talk,

21:22.080 --> 21:24.730
Kent mentions this area
of convergence between

21:24.730 --> 21:27.320
commercial industry,
academia and the government,

21:27.320 --> 21:29.270
probably the AI ethics
principles are as good

21:29.270 --> 21:31.580
as anything else to drive
a stake in the ground,

21:31.580 --> 21:34.550
do we agree on all of
these, some of these,

21:34.550 --> 21:36.480
and if we don't agree
let's get the conversation

21:36.480 --> 21:38.130
going, so it's a good starting point.

21:38.130 --> 21:39.810
Another part is, I need
to state the obvious,

21:39.810 --> 21:44.260
that I can tell you with certainty

21:44.260 --> 21:49.260
that China and Russia did not
embark on a 15-month process

21:49.730 --> 21:52.100
involving public hearings and discussion

21:52.100 --> 21:54.190
about the ethical, safe and lawful use

21:54.190 --> 21:56.310
of artificial intelligence,
they're not doing it.

21:56.310 --> 21:58.800
And I don't expect they ever will do it.

21:58.800 --> 22:01.260
So people may question what
the department is doing

22:01.260 --> 22:02.980
and why we're doing it,
but I tell you what,

22:02.980 --> 22:07.050
we just embarked on this long
process just to make sure

22:07.050 --> 22:09.680
we took into account all
of the different voices

22:09.680 --> 22:12.340
on the ethical use of
artificial intelligence

22:12.340 --> 22:14.210
and I would say the product
that's been delivered

22:14.210 --> 22:16.940
is an excellent product
shaped by a lot of people

22:16.940 --> 22:20.500
who spent time and attention on this.

22:20.500 --> 22:23.530
I've said this in other
settings: in

22:23.530 --> 22:25.860
pretty much 35 and a
half years in uniform,

22:25.860 --> 22:28.400
I have never spent as
much time on this question

22:28.400 --> 22:30.660
of the ethical use of a given technology.

22:30.660 --> 22:33.240
The Department of Defense
actually has a long

22:33.240 --> 22:35.680
and I would say a commendable
history despite flaws

22:35.680 --> 22:37.830
along the way of looking
at the ethical use

22:37.830 --> 22:40.700
of emerging technologies
whatever they are.

22:40.700 --> 22:42.890
There are differences with
artificial intelligence

22:42.890 --> 22:45.580
and what the DIB report
does very well is start

22:45.580 --> 22:47.950
with here is what's similar
to every other technology

22:47.950 --> 22:49.680
that has ever been fielded in the department,

22:49.680 --> 22:51.380
here are some areas that may be different,

22:51.380 --> 22:54.140
we're not quite sure yet,
and here are some substantive

22:54.140 --> 22:56.810
differences like systems
that learn on their own.

22:56.810 --> 22:59.130
That's a pretty good framework
for going after this.

22:59.130 --> 23:01.900
We have a way of looking
at this, no matter if it's

23:01.900 --> 23:04.210
artificial intelligence
or any other technology,

23:04.210 --> 23:07.440
our history, our processes,
our approach and our training

23:07.440 --> 23:09.950
are in place to look at
any emerging technology

23:09.950 --> 23:12.570
and how we bring it in from
a pilot and a prototype

23:12.570 --> 23:13.580
into production.

23:13.580 --> 23:15.760
So now that this report has been presented

23:15.760 --> 23:17.970
to the Secretary of Defense,

23:17.970 --> 23:19.180
I get two questions now.

23:19.180 --> 23:20.610
One, what do you think about the report?

23:20.610 --> 23:23.440
It's an excellent report,
provides the best possible

23:23.440 --> 23:25.000
starting point, and the
number two is what are you

23:25.000 --> 23:26.010
going to do about it.

23:26.010 --> 23:27.780
This is where it gets really complicated.

23:27.780 --> 23:30.280
We have to come up with
an implementation plan.

23:30.280 --> 23:32.740
It will not be a JAIC implementation plan,

23:32.740 --> 23:35.300
it will be a department-wide
implementation plan

23:35.300 --> 23:37.780
taking these recommendations,
putting something together

23:37.780 --> 23:40.740
through my boss, Mr. Dana
Deasy as the Chief Information

23:40.740 --> 23:43.390
Officer of the department in
making some recommendations

23:43.390 --> 23:45.960
on how we implement this
across the entire

23:45.960 --> 23:47.090
Department of Defense.

23:47.090 --> 23:49.820
That is not an overnight
task, this is gonna take us

23:49.820 --> 23:52.590
a while to get this right but
we now have an outstanding

23:52.590 --> 23:53.423
starting point.

23:54.310 --> 23:59.310
- So, that's a wonderful
framing for where we are,

23:59.780 --> 24:02.410
I'd like to push a little
bit on where this will go.

24:02.410 --> 24:06.280
Kent, let me give you an example.

24:06.280 --> 24:09.170
- OpenAI developed a
technology which will allow

24:09.170 --> 24:12.870
arbitrary rewriting of text
that was sufficiently good

24:12.870 --> 24:17.080
that they became concerned
and they didn't release it

24:17.080 --> 24:19.580
and said they only released
it in certain limited ways

24:19.580 --> 24:21.040
to certain researchers.

24:21.040 --> 24:24.040
That's an example, and I
asked them,

24:24.040 --> 24:26.770
I said did anyone put any pressure on you,

24:26.770 --> 24:30.030
and they said no, we just
thought it was our good judgment.

24:30.030 --> 24:35.030
You famously, very early
said on the face recognition

24:35.040 --> 24:38.220
thing, we're going to avoid
that as a general purpose

24:38.220 --> 24:39.900
because of the dangers.

24:39.900 --> 24:43.530
Where will the industry
end up in this sort of

24:43.530 --> 24:45.120
self-restraint thing.

24:45.120 --> 24:48.030
Is it going to be a
common set of principles,

24:48.030 --> 24:49.870
is this going to be, is
the industry going to have

24:49.870 --> 24:53.650
to have an AI ethics
common with respect to,

24:53.650 --> 24:55.590
you know, being careful?

24:55.590 --> 24:57.410
How will this play out in your view?

24:57.410 --> 24:59.390
- I think you already see
some of that start to work

24:59.390 --> 25:01.890
across the industry with the
Partnership on Artificial

25:01.890 --> 25:04.790
Intelligence to exchange
information on some of the work

25:04.790 --> 25:06.020
that's being done.

25:06.020 --> 25:08.610
It's going to be an evolving
question as we develop

25:08.610 --> 25:10.550
more infrastructure,
more of these frameworks

25:10.550 --> 25:13.130
about the appropriate limits
of the use of artificial

25:13.130 --> 25:15.450
intelligence, the appropriate
safeguards and checks

25:15.450 --> 25:18.660
and balances for a whole
variety of different areas.

25:18.660 --> 25:22.070
But I think, I'm hopeful,
that with a common groundwork,

25:22.070 --> 25:25.040
the way we've started to lay
already, we're on the path

25:25.040 --> 25:28.090
to doing that, but this is
true of any new technology.

25:28.090 --> 25:32.560
Any communications platform
from the radio to television

25:32.560 --> 25:35.040
to the internet you've
needed new regulatory

25:35.040 --> 25:37.440
infrastructures, new social conventions

25:37.440 --> 25:39.620
about how do you use
these different tools.

25:39.620 --> 25:42.230
This is an extraordinarily
powerful technology,

25:42.230 --> 25:45.190
we're at the early days, so
I think it's understandable

25:45.190 --> 25:47.250
that you're seeing a variety
of views come together

25:47.250 --> 25:49.080
but also notable that
you're seeing the degree

25:49.080 --> 25:51.250
of convergence that you're seeing.

25:51.250 --> 25:56.150
- So General, you have
talked inside the Pentagon

25:56.150 --> 25:59.713
about this notion of
a new kind of warfare.

26:01.468 --> 26:03.130
And I think the term that you all use

26:03.130 --> 26:04.363
is algorithmic warfare.

26:06.630 --> 26:09.610
Take us through, in the
same sense that Kent talked

26:09.610 --> 26:11.750
about how this new
emergent thing sort of new

26:11.750 --> 26:15.370
and powerful, what's new
and powerful about this

26:15.370 --> 26:17.930
technology in a military context.

26:17.930 --> 26:19.900
With your long experience
and understanding,

26:19.900 --> 26:21.350
how the military frames it.

26:21.350 --> 26:23.740
What's the language,
what's the positioning.

26:23.740 --> 26:26.640
- I go back to when we were formed, and then-

26:26.640 --> 26:29.840
Deputy Secretary of Defense
Bob Work was in the room,

26:29.840 --> 26:32.110
I'll never forget it, it's like yesterday,

26:32.110 --> 26:35.310
designating us, okay, you
are now formed as the team

26:35.310 --> 26:38.290
that's gonna figure out
how you actually field AI,

26:38.290 --> 26:40.220
get away from the research piece of it,

26:40.220 --> 26:43.730
which was all happening
wonderfully behind the scenes,

26:43.730 --> 26:46.000
but now we needed a team
that was focusing on fielding

26:46.000 --> 26:48.820
to the warfighter, and
the name that he gave us

26:48.820 --> 26:51.290
was the algorithmic warfare
cross-functional team.

26:51.290 --> 26:52.550
It's not an accidental name.

26:52.550 --> 26:54.350
It's become Project Maven
because it's much easier

26:54.350 --> 26:55.573
to say than AWCFT,

26:57.458 --> 27:00.423
(audience laughing)

27:00.423 --> 27:02.590
- And, your acronyms are gonna kill me.

27:02.590 --> 27:04.750
Okay, let's
- So let's just focus

27:04.750 --> 27:05.583
on algorithmic warfare.

27:05.583 --> 27:07.143
- Why don't you tell us
what algorithmic warfare is.

27:07.143 --> 27:09.790
- So it's the idea we're
going to face a fight in the

27:09.790 --> 27:12.120
future, we're used to fighting
for 20 years in a certain

27:12.120 --> 27:15.100
type of fight,
counter-terrorism, insurgencies,

27:15.100 --> 27:19.640
we are going to be shocked
by the speed, the chaos,

27:19.640 --> 27:21.970
the bloodiness and the
friction of the future fight

27:21.970 --> 27:24.260
in which this will be, maybe playing

27:24.260 --> 27:25.640
in microseconds at a time.

27:25.640 --> 27:27.430
How do we envision that fight happening.

27:27.430 --> 27:29.880
It has to be algorithm against algorithm.

27:29.880 --> 27:33.220
It is a, as you described
earlier, as we were talking

27:33.220 --> 27:36.450
about this, it's a Boydian
OODA, how fast can we

27:36.450 --> 27:38.420
get inside somebody's decision cycle.

27:38.420 --> 27:40.467
- Remind people what the OODA is.

27:40.467 --> 27:43.800
- Colonel John Boyd, Air
Force colonel who was

27:43.800 --> 27:47.550
sort of the author of the
observe, orient, decide, act,

27:47.550 --> 27:50.840
which is how you get through
the cycle of decision making

27:50.840 --> 27:52.770
which was really never
about the decide or the act,

27:52.770 --> 27:54.460
it was more about the observe and orient,

27:54.460 --> 27:56.230
I think really about the orient phase.

27:56.230 --> 27:58.323
But in this future fight we're looking at,

27:58.323 --> 28:00.390
this would be happening
so fast, if we're trying

28:00.390 --> 28:02.880
to do this by humans against machines

28:02.880 --> 28:05.310
and the other side has the
machines and the algorithms

28:05.310 --> 28:08.180
that we don't, we're at
an unacceptably high risk

28:08.180 --> 28:09.450
of losing that conflict.

28:09.450 --> 28:11.780
Now this is a challenging
one because I think part

28:11.780 --> 28:14.950
of what you're getting at
in that future scenario

28:14.950 --> 28:18.440
how are people going to be
assured that our algorithms

28:18.440 --> 28:20.580
are going to work as intended
and they don't take on

28:20.580 --> 28:22.200
a life of their own, so to speak.

28:22.200 --> 28:24.590
What we will fall back on,
I think this is a starting

28:24.590 --> 28:27.610
point for what the DIB principles gave us,

28:27.610 --> 28:30.380
is test, evaluation,
validation and verification.

28:30.380 --> 28:33.000
We have to do a lot more
work on the front end,

28:33.000 --> 28:35.650
by the time they field it
we know it's being fielded.

28:35.650 --> 28:39.010
But I think we're really
going to be at a disadvantage,

28:39.010 --> 28:42.760
if we think it's gonna be
a pure human-against-machine fight.

28:42.760 --> 28:45.350
It'll be human and machine on one side,

28:45.350 --> 28:48.200
human, machine on the other,
but the temporal dimension,

28:48.200 --> 28:51.010
this fleeting superiority
that you may be facing

28:51.010 --> 28:53.500
where decisions will be
made that fast, it might be

28:53.500 --> 28:54.990
algorithm against algorithm.

28:54.990 --> 28:58.190
- Yeah, that to me was the
key question in military matters

28:58.190 --> 29:01.650
is what happens when the
whole scenario is faster

29:01.650 --> 29:02.853
than human decision making.

29:02.853 --> 29:05.080
Because I understand the
way the military works

29:05.080 --> 29:08.210
is when there's a threat,
in general people check

29:08.210 --> 29:10.740
with their superior, there's
sort of a rule of engagement,

29:10.740 --> 29:13.200
there's human judgment,
it's all built around some

29:13.200 --> 29:15.460
number of minutes, right?

29:15.460 --> 29:17.520
Not some number of nanoseconds.

29:17.520 --> 29:21.420
How will the military
adjust its procedures

29:21.420 --> 29:23.440
to deal with this real possible threat?

29:23.440 --> 29:26.040
- It won't be driven from above.

29:26.040 --> 29:28.330
The innovation will happen
at the lowest possible level.

29:28.330 --> 29:30.610
What we have to be able
to do in places like JAIC

29:30.610 --> 29:33.520
or Maven is give people the
policies and the authorities

29:33.520 --> 29:36.080
and the framework to do
what they need to do.

29:36.080 --> 29:38.320
The innovation, the people
that will say I have a

29:38.320 --> 29:40.780
solution to this, I'm
going to write code,

29:40.780 --> 29:42.660
I'm going to develop an
algorithm and apply it to

29:42.660 --> 29:45.280
this problem set in the
field if you've given me

29:45.280 --> 29:47.160
the data, the tools, the
frameworks, the library,

29:47.160 --> 29:49.320
the standards, all those
other things, we can do it.

29:49.320 --> 29:51.580
So the idea is, in that
fight, it will be more

29:51.580 --> 29:54.340
decentralized than a lot
of people are comfortable

29:54.340 --> 29:56.960
with today and that brings risks with it.

29:56.960 --> 29:59.740
So we're talking about higher
risks, higher consequence,

29:59.740 --> 30:02.620
but it's either that or
risk losing the fight.

30:02.620 --> 30:05.890
So it's this idea of
decentralized development,

30:05.890 --> 30:09.590
decentralized experimentation,
decentralized innovation.

30:09.590 --> 30:12.030
The innovation as was
described in one of the panels

30:12.030 --> 30:13.750
this morning happens at the bottom.

30:13.750 --> 30:15.930
We've got to give them the push from above

30:15.930 --> 30:18.120
to make it succeed.
- And I'd also add,

30:18.120 --> 30:19.342
- Kent

30:19.342 --> 30:20.175
- in addition to the tempo component,

30:20.175 --> 30:23.840
there are new fronts in
cybersecurity and cyber defense.

30:23.840 --> 30:27.946
We're seeing already sort
of efforts to destabilize

30:27.946 --> 30:29.460
with disinformation
campaigns and the like.

30:29.460 --> 30:31.420
So the more we can work
together to recognize

30:31.420 --> 30:34.320
those patterns across a wider
battlefield if you will,

30:34.320 --> 30:35.570
the better for everybody.

30:36.580 --> 30:40.580
- Kent, do you have a
model for how the industry,

30:40.580 --> 30:43.130
one of the themes of our whole
conference is the industry

30:43.130 --> 30:45.720
and the government need
to work together broadly,

30:45.720 --> 30:48.210
and obviously we have
a senior general here,

30:48.210 --> 30:49.990
but I'm really referring to
the government as a whole,

30:49.990 --> 30:52.050
and there's a whole lot
more than just the DoD

30:52.050 --> 30:53.390
that needs AI.

30:53.390 --> 30:55.930
Do you have a model for how
the industry should work

30:55.930 --> 30:58.830
with the federal government,
the state governments,

30:58.830 --> 31:00.770
the DoDs and so forth?

31:00.770 --> 31:05.570
- Well, I've already talked
about two important elements

31:05.570 --> 31:08.460
that the DIB report touches on as well.

31:08.460 --> 31:11.180
The first is this notion of
trying to build broad trust

31:11.180 --> 31:13.060
in the application of new technologies,

31:13.060 --> 31:15.250
the second is the need
for a global framework

31:15.250 --> 31:17.000
which helps with that process.

31:17.000 --> 31:19.110
The third, as General Shanahan alluded to,

31:19.110 --> 31:22.500
is a more operational
administrative question

31:22.500 --> 31:26.700
of how do we make it as easy
as possible for new companies

31:26.700 --> 31:29.110
to enter into these kinds of partnerships.

31:29.110 --> 31:31.110
So a lot of the innovation,
a lot of the cutting edge

31:31.110 --> 31:33.880
research being done in Silicon
Valley is not being done

31:33.880 --> 31:36.260
by large companies, it's
being done by small companies.

31:36.260 --> 31:39.820
It's a rich ecosystem of
innovation and it's challenging

31:39.820 --> 31:43.070
even for a company of Google's
size to start to get more

31:43.070 --> 31:44.960
involved in that environment.

31:44.960 --> 31:47.710
It's doubly difficult for some
of these smaller companies.

31:47.710 --> 31:51.850
So as we look at modernizing
procurement from the military

31:51.850 --> 31:55.010
side and working with Congress
as well to make it as quick,

31:55.010 --> 31:59.440
as nimble, as flexible as
possible, responsive to new needs.

31:59.440 --> 32:02.420
Looking at increasing R and
D funding across the board

32:02.420 --> 32:05.130
because that's traditionally
been a really fertile

32:05.130 --> 32:07.960
ground for a lot of these
collaborative enterprises

32:07.960 --> 32:09.400
to move forward.

32:09.400 --> 32:13.000
Looking at human resources
exchanges, there's a lot

32:13.000 --> 32:16.350
of authorities out there
which authorize private sector

32:16.350 --> 32:19.270
people to come into the
government, but in practice

32:19.270 --> 32:20.850
it's harder than you would think.

32:20.850 --> 32:23.070
So a lot of that hard
work on the ground I think

32:23.070 --> 32:25.190
is important to making this a success.

32:25.190 --> 32:27.450
- For both of you, because
we're gonna be making

32:27.450 --> 32:30.338
recommendations that ideally
will end up in legislation

32:30.338 --> 32:33.973
a year from now, plus or minus,
are there specific things

32:33.973 --> 32:36.590
that we could do that would
promote private-public

32:36.590 --> 32:39.960
partnerships, for example
as you know the DoD has

32:39.960 --> 32:43.890
DIUX, SCO and a number
of other groups that work

32:43.890 --> 32:46.927
very closely, In-Q-Tel,
obviously the extraordinary

32:46.927 --> 32:49.650
contribution that DARPA
has played to our industry

32:49.650 --> 32:52.790
and to technology and to me
personally, so forth and so on.

32:52.790 --> 32:55.857
So the sum of all of
that, do you have a model,

32:55.857 --> 32:58.770
and I'll ask Kent the same
question, do you have specific

32:58.770 --> 33:01.040
things that would be
helpful that would decrease

33:01.040 --> 33:04.700
the friction and increase
the cohesion between small

33:04.700 --> 33:07.190
companies, large companies,
the federal government,

33:07.190 --> 33:08.860
procurement, the DoD?

33:08.860 --> 33:09.830
- A couple of different thoughts on that.

33:09.830 --> 33:12.030
First of all, there's so much
that has started to happen

33:12.030 --> 33:13.810
over the last couple of
years with places like

33:13.810 --> 33:16.810
Defense Digital Service,
DIU, Kessel Run, Compile

33:16.810 --> 33:20.180
to Combat, all what I call
these beginning insurgencies

33:20.180 --> 33:21.130
to get things moving.

33:21.130 --> 33:23.070
- And we should pause and
say that these are each

33:23.070 --> 33:25.410
small teams of software
people inside the DoD

33:25.410 --> 33:28.030
that have had an outsized
impact in changing

33:28.030 --> 33:30.740
the procedures in important
aspects in the Air Force

33:30.740 --> 33:33.510
for example, in some of the
AOCs and things like that.

33:33.510 --> 33:35.340
- So that all got it
started, but what we have to do

33:35.340 --> 33:37.130
is figure out how to
institutionalize it, how to make

33:37.130 --> 33:39.490
that systemic change across
the Department of Defense

33:39.490 --> 33:41.100
which is the next hard part to do.

33:41.100 --> 33:42.910
And you ask about what
are the ways we do that.

33:42.910 --> 33:44.790
I'll tell you one of the
biggest ones is just talent,

33:44.790 --> 33:47.530
bringing in talent from
the outside, from academia,

33:47.530 --> 33:48.363
from industry.

33:48.363 --> 33:50.710
Our chief scientist, Jo
Crisman, who is here today,

33:50.710 --> 33:53.480
has time in the government
in ARPA working for a startup

33:53.480 --> 33:56.160
in her last job, Nand
Mulchandani who is our chief

33:56.160 --> 33:58.310
technical officer, 25 years in the Valley,

33:58.310 --> 34:00.920
he comes in and within 24
hours takes a different

34:00.920 --> 34:02.450
view of what we're trying to do.

34:02.450 --> 34:04.470
We need a lot more of
that, we need sabbaticals,

34:04.470 --> 34:07.200
people coming in from
academia for a year or two,

34:07.200 --> 34:08.570
going back out to the outside.

34:08.570 --> 34:10.560
Us putting people in
Education With Industry,

34:10.560 --> 34:12.738
the Secretary of Defense Corporate Fellowship.

34:12.738 --> 34:14.550
That's all beginning to happen.

34:14.550 --> 34:17.450
It needs to scale to the
next level to really start

34:17.450 --> 34:20.000
to understand what we're
each talking about.

34:20.000 --> 34:21.730
Me going out to the
Valley and talking to the

34:21.730 --> 34:24.190
C-Suite only gets so far.

34:24.190 --> 34:26.513
It's the peer-to-peer
relationships and discussion

34:26.513 --> 34:28.250
that I think are gonna
be more important than

34:28.250 --> 34:29.083
anything else.

34:29.083 --> 34:31.190
- And I very much agree with
that, I think you've seen

34:31.190 --> 34:35.100
examples of it with the JAIC, with DARPA,

34:35.100 --> 34:36.890
we're priming the pump
on a variety of really

34:36.890 --> 34:41.300
important areas whether
that's training or in models

34:41.300 --> 34:43.277
and simulation, they're
helpful in that, recruiting,

34:43.277 --> 34:45.320
a number of different areas.

34:45.320 --> 34:48.790
Another important
component of this is the IT

34:48.790 --> 34:52.301
modernization because in
many ways, the AI kernel

34:52.301 --> 34:56.110
is critical but it comes
embedded within a larger

34:56.110 --> 34:59.190
environment of software that's
oftentimes very difficult

34:59.190 --> 35:01.390
because you have to
get security clearances

35:01.390 --> 35:03.710
and appropriate certification
for all the elements

35:03.710 --> 35:05.700
of that piece, so there's that combination

35:05.700 --> 35:08.780
of successful individual
experiments and trial runs

35:08.780 --> 35:11.710
to build the familiarity
at the peer-to-peer level

35:11.710 --> 35:14.190
but also the systemic
change to make it easier

35:14.190 --> 35:17.103
to have wider adoption of
the technology more broadly.

35:18.900 --> 35:21.440
- It's time for us to finish up.

35:21.440 --> 35:24.810
My objective in this
panel was to put to bed

35:24.810 --> 35:27.700
this notion that somehow
Silicon Valley wouldn't

35:27.700 --> 35:30.720
work with the military and
I think we've clearly seen

35:30.720 --> 35:32.390
examples, small companies,
large companies,

35:32.390 --> 35:34.860
strong statement from Kent
on those, and we just sort

35:34.860 --> 35:36.880
of move forward and build this collective

35:36.880 --> 35:40.840
partnership between the private
and the public sectors.

35:40.840 --> 35:43.780
Kent, can you sort of
summarize sort of the key

35:43.780 --> 35:46.690
takeaway that you want to
offer us, the key message,

35:46.690 --> 35:48.193
the key word, the key,

35:49.970 --> 35:52.740
why are you here and why
did you make a special trip

35:52.740 --> 35:54.023
just to make this point?

35:54.950 --> 35:57.930
- I want to be clear, and
I'll restate what I said

35:57.930 --> 35:58.763
at the beginning.

35:58.763 --> 36:00.720
We are a proud American
company, we are committed

36:00.720 --> 36:04.900
to the cause of national
defense for the United States

36:04.900 --> 36:08.000
of America, for our allies
and for peace and safety

36:08.000 --> 36:09.470
and security in the world.

36:09.470 --> 36:12.630
We approach that task
thoughtfully, as we do in

36:12.630 --> 36:15.690
approaching a variety of
advanced technologies.

36:15.690 --> 36:17.870
We want to be thoughtful
and make sure we have clear

36:17.870 --> 36:20.320
frameworks and transparency
and understanding

36:20.320 --> 36:21.530
as we move forward.

36:21.530 --> 36:23.440
I think that it's a
mission that the military

36:23.440 --> 36:25.810
and the U.S. government share,
and I'm looking forward,

36:25.810 --> 36:27.760
we're looking forward
to working more closely

36:27.760 --> 36:29.720
together in the future.

36:29.720 --> 36:32.740
- So General, you never like these things,

36:32.740 --> 36:35.490
but you're sort of the
top, you're the tip,

36:35.490 --> 36:39.510
you're the fellow who's going
to make this change happen

36:39.510 --> 36:43.670
across 3.2 million people, $660 billion,

36:43.670 --> 36:45.630
an enormous bureaucracy.

36:45.630 --> 36:47.480
How are you going to pull this off?

36:47.480 --> 36:49.029
- One person at a time.

36:49.029 --> 36:50.460
(audience laughing)

36:50.460 --> 36:53.610
It has to be a combination of top down.

36:53.610 --> 36:57.310
It was said on the previous
panel, you must have

36:57.310 --> 37:00.260
the full support of
leadership from the very top

37:00.260 --> 37:02.610
to show that it's a
priority for the department.

37:02.610 --> 37:04.990
That's critical but also insufficient.

37:04.990 --> 37:06.740
You have to have the bottom up innovation,

37:06.740 --> 37:07.950
the people pushing from below.

37:07.950 --> 37:09.670
It's there today, there's no question,

37:09.670 --> 37:11.380
some of them are represented in this room.

37:11.380 --> 37:13.360
They already know what that
future needs to look like,

37:13.360 --> 37:15.580
how do we meet that in
the middle, and give them

37:15.580 --> 37:18.210
the resources, the tools to succeed.

37:18.210 --> 37:20.610
The last thing I'll say
is this is intimidating,

37:20.610 --> 37:22.100
this is a daunting task.

37:22.100 --> 37:23.060
No way around it.

37:23.060 --> 37:25.690
It is a multi-generational
problem, it's going to

37:25.690 --> 37:28.030
require a multi-generational solution.

37:28.030 --> 37:29.890
I'm not gonna wake up
tomorrow and suddenly realize

37:29.890 --> 37:30.800
we've got this all right.

37:30.800 --> 37:33.380
We're gonna have some fits
and starts, some successes,

37:33.380 --> 37:35.820
some drawbacks, but
just keep plowing ahead,

37:35.820 --> 37:37.910
and with the resources and the commitment

37:37.910 --> 37:40.940
of the department behind
us I know we'll get there.

37:40.940 --> 37:41.773
- Well thank you.

37:41.773 --> 37:44.900
I think it's worth saying,
I've worked with Kent

37:44.900 --> 37:47.920
for 15 years and with my
Google hat on, I'll tell you

37:47.920 --> 37:51.180
I cannot be more proud of
the impact that he's had

37:51.180 --> 37:55.470
on our society, the scale and
the reach of our corporation.

37:55.470 --> 37:57.590
I think you can see this today.

37:57.590 --> 38:00.910
And General, I don't think
Bob Work could have chosen

38:00.910 --> 38:03.100
a better person to lead this.

38:03.100 --> 38:05.960
Our partnership with you
over the last three years,

38:05.960 --> 38:09.730
you really have moved the
resources, gotten the money,

38:09.730 --> 38:12.420
gotten the attention and
delivered, and there was

38:12.420 --> 38:13.580
no one before you.

38:13.580 --> 38:16.830
You are that person, so
thank you both very much,

38:16.830 --> 38:17.873
thank you all.

38:17.873 --> 38:21.123
(audience applauding)

