04/23/2019
Cybersecurity with Donnie Hasseltine
Donnie Hasseltine is a U.S. Marine Corps officer currently stationed in the Bay Area with the 23d Marine Regiment who served in Kosovo, Iraq, and Afghanistan. He is a graduate of the Virginia Military Institute, the Joint Military Intelligence College, the Naval War College, and recently completed an Executive Master in Cybersecurity at Brown University. […]
00:00:00.000 |
This is KZSU Stanford.
|
00:00:04.080 |
Welcome to entitled opinions.
|
00:00:06.280 |
My name is Robert Harrison.
|
00:00:08.280 |
We're coming to you from the Stanford campus.
|
00:00:10.520 |
When I got together with him last week to discuss our upcoming show,
|
00:00:27.400 |
the guest who joins me in the studio today remarked to me that entitled opinions
|
00:00:32.360 |
is the most idiosyncratic podcast he's ever listened to.
|
00:00:36.440 |
And I like the sound of that.
|
00:00:38.880 |
If it's not idiosyncratic, it's not entitled opinions.
|
00:00:42.600 |
And if it's not entitled opinions, well, it's not entitled opinions.
|
00:00:48.520 |
Idiosyncrasy suits the brigade just fine.
|
00:00:52.320 |
Those who listen to this show come in all varieties of weirdness.
|
00:00:57.240 |
The outcasts, the outsized, the outlived, the out of mind.
|
00:01:02.840 |
The brigade knows its own.
|
00:01:05.440 |
Lovers, hermits, pagans, saints, lepers, fugitives, misfits, contortionists,
|
00:01:13.000 |
and possibilities.
|
00:01:15.400 |
The perturbed, the persecuted, the pensive, the pinheaded, the palpitators.
|
00:01:21.240 |
They all tune in.
|
00:01:24.160 |
I wonder, by my troth, what would happen if we ever
|
00:01:27.200 |
got all of you together in a room or on a hillside one day?
|
00:01:32.200 |
It would be a surreal, great awakening for sure.
|
00:01:36.560 |
So be warned, this show is for those who feed on another kind of bread and another kind
|
00:01:41.760 |
of mushroom than you'll find at the deli next door.
|
00:01:46.960 |
Stay tuned friends, another episode of entitled opinions coming up.
|
00:01:50.960 |
[Music]
|
00:02:26.960 |
The topic of our show is cyber security.
|
00:02:34.880 |
That's something I know very little about and I'm going to sit back and learn about it
|
00:02:39.240 |
from the special guest who joins me in the studios of KZSU.
|
00:02:44.480 |
Today I want to be like one of the seamen in Conrad's Lord Jim who, I quote, "wallow
|
00:02:50.640 |
in their good chairs and think to themselves, hang exertion, let that Marlow talk."
|
00:02:57.800 |
Our Marlow today is Donnie Hasseltine, not our usual kind of guest on entitled opinions.
|
00:03:04.640 |
Donnie Hasseltine is a Marine officer and veteran of the Kosovo, Iraq and Afghanistan
|
00:03:09.560 |
campaigns, a graduate of Virginia Military Institute, Joint Military Intelligence College and Naval
|
00:03:16.960 |
War College. He was a military advisor for Stanford University's Hacking for Defense courses.
|
00:03:24.240 |
He has recently completed Brown University's Executive Master in Cyber Security Program.
|
00:03:30.720 |
Donnie, welcome to the program.
|
00:03:32.120 |
Thanks Robert, it's great to be here.
|
00:03:34.200 |
So the first thing I should point out to our listeners is that you're still on active
|
00:03:37.360 |
duty and that any opinions you may express on this show called entitled opinions are your
|
00:03:42.600 |
own and do not reflect the policy or position of the U.S. government.
|
00:03:46.320 |
That is correct, and I appreciate that, Robert.
|
00:03:48.320 |
All right, good.
|
00:03:49.320 |
So I actually looked up the word cyber security today just to orient myself and then I told
|
00:03:55.240 |
myself, "Why am I going to go and read through this stuff when I have you, Donnie,
|
00:03:59.880 |
here to tell me and our listeners to start off with what is cyber security exactly?"
|
00:04:04.760 |
Well, it's probably fitting you started with the Lord Jim quote, Robert, because it actually
|
00:04:08.960 |
comes from the Greek kubernetes, which means "steersman," and it grew up in the 1940s, actually,
|
00:04:15.960 |
when the term cybernetics grew out of kubernetes.
|
00:04:19.280 |
It was really a study of how systems, both living and machine, kind of communicate.
|
00:04:23.880 |
And over time, that shortened to cyber and eventually gave rise to terms like
|
00:04:29.680 |
cyborg, which came from cybernetic organism.
|
00:04:32.720 |
And now, really, we see there's virtually a cyber-everything, it seems, nowadays.
|
00:04:36.960 |
So Steersman.
|
00:04:38.200 |
Wow.
|
00:04:40.200 |
I knew I was going to learn a lot today.
|
00:04:42.400 |
It's off to a good start.
|
00:04:44.200 |
Yeah, so that's the etymology of the word, the history of the word.
|
00:04:49.680 |
And where do we stand now in this brave new world that we've entered into in the past decade
|
00:04:57.800 |
or two where almost everything about the organization of our society seems dependent
|
00:05:03.400 |
upon not only the computer, but the internet connections.
|
00:05:07.800 |
And when there's a malfunction with BART trains, you're listening to that traffic report
|
00:05:14.680 |
and they've been down for an hour, it's a disaster.
|
00:05:18.600 |
Absolutely.
|
00:05:19.600 |
So we are so completely dependent on an invisible cyber connectivity.
|
00:05:25.600 |
Absolutely.
|
00:05:26.600 |
And I think that, well, if you really look at it at the very base level, it's
|
00:05:29.840 |
really a transfer of information from place to place.
|
00:05:33.560 |
And when you talk about information, there are three things, or the triad in cybersecurity;
|
00:05:38.200 |
we call it the CIA triad, a different CIA: confidentiality, integrity, and availability.
|
00:05:43.880 |
And that is: can I pass information from me to you
|
00:05:48.520 |
confidentially so no one else can read it or see it or steal it?
|
00:05:53.400 |
Is there integrity to that data?
|
00:05:54.400 |
In other words, once you receive that data, can you verify that the data that I sent is
|
00:05:58.760 |
the same data you received?
|
00:06:00.600 |
And then the availability of it is that that information, and those pathways that connect
|
00:06:04.360 |
everything, are constantly available for the information to move across that superhighway.
|
00:06:08.800 |
So cybersecurity is really ensuring that that information, as it traverses places and spaces
|
00:06:14.600 |
and time, moves about in a secure and available fashion for both the consumers and the
|
00:06:20.040 |
providers of that information.
|
00:06:21.720 |
And that's really the essence of how it all works.
|
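[A minimal sketch by the editor, not from the conversation, to make the "integrity" leg of the CIA triad concrete: a cryptographic hash check, using only Python's standard library. The message contents are invented for illustration.]

    import hashlib

    def fingerprint(data: bytes) -> str:
        # A cryptographic hash is a short fingerprint of the data:
        # any change to the data changes the fingerprint.
        return hashlib.sha256(data).hexdigest()

    sent = b"train 12 departs track 3 at 08:15"   # hypothetical message
    sent_fp = fingerprint(sent)

    # ... the message crosses the network ...
    received = b"train 12 departs track 3 at 08:15"

    # Integrity: the receiver can verify the data arrived unaltered.
    print(fingerprint(received) == sent_fp)  # True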
00:06:24.640 |
So cyber security is also a response to cyber attacks.
|
00:06:28.280 |
Absolutely.
|
00:06:29.280 |
And if one were to take the example, which, just because it happened here last week, that
|
00:06:35.280 |
the BART trains went down for one whole hour, but that's not necessarily a security issue, or is the CIA
|
00:06:42.760 |
triad pertinent to an example like that?
|
00:06:44.880 |
Yeah, absolutely.
|
00:06:45.880 |
So I would say that cyber security is fairly broad.
|
00:06:47.760 |
And I think that when you, when you use a term like security, you immediately kind of jump
|
00:06:52.480 |
to the offense defense warfare type analogies.
|
00:06:55.840 |
And there's certainly a huge application for that in cyber security, but that's why I want
|
00:06:59.760 |
to go back down to that basis of information flow.
|
00:07:02.360 |
I mean, when you really even break apart a computer network, you're really talking about
|
00:07:06.920 |
just a database of information that you're finding a way to query and extract and then
|
00:07:12.440 |
compute that information in a way that can be managed.
|
00:07:14.840 |
So if you think about the BART example, there are all the timelines of when those trains
|
00:07:19.600 |
are supposed to be moving.
|
00:07:20.600 |
There's the data coming in on how fast they're moving and what tracks they're on.
|
00:07:25.280 |
And there's computing going on in that.
|
00:07:26.800 |
So any disruption in that can cause an accident, can cause trains to be delayed, can cause
|
00:07:30.760 |
a wide variety of things.
|
00:07:32.320 |
And that all presupposes that the information is secure and moving about in a secure
|
00:07:37.520 |
fashion.
|
00:07:38.520 |
So I always bring that up as just the basic thing to understand.
|
00:07:42.680 |
And then from there, you get to the point of, you know, a cyber incident could be an
|
00:07:48.320 |
accident, right?
|
00:07:49.320 |
You could have a computer go down, in the case of the BART example, where that broke down
|
00:07:53.480 |
an entire system, but no one actually tried to make it happen.
|
00:07:57.440 |
But you can also look at it from the human side of it: can a human then use knowledge,
|
00:08:03.400 |
skills or vulnerabilities to exploit that to cause an incident that they can benefit
|
00:08:08.680 |
from.
|
00:08:09.680 |
Right.
|
00:08:10.680 |
Well, so the real pressing question, we could have waited until the end, but I'm going
|
00:08:14.360 |
to ask it right away.
|
00:08:16.000 |
How vulnerable are we in general, given that if I go to my ATM and it's not working,
|
00:08:23.200 |
I, you know, I don't have access to cash, they can't do anything at the teller.
|
00:08:27.120 |
The supermarket will not check you out if their system isn't working.
|
00:08:31.880 |
My sense is that the vulnerability is truly extreme.
|
00:08:37.320 |
On the other hand, I don't know anything about security.
|
00:08:39.000 |
Yeah.
|
00:08:40.000 |
Well, you're both extremely vulnerable and not vulnerable at the same time.
|
00:08:44.360 |
And I know that's kind of an interesting way to put it, but the vast majority of cyber
|
00:08:49.800 |
attacks occur due to human error.
|
00:08:52.400 |
And I think that's the first thing to recognize is we think of computers as kind of like this.
|
00:08:58.360 |
We make them almost this godlike, infallible thing, but we've got to realize from the start
|
00:09:03.800 |
that computers are made by humans and they're used by humans.
|
00:09:07.000 |
And the vulnerability is at that human-computer interface.
|
00:09:09.400 |
To your point about you personally, and talking about how vulnerable we are,
|
00:09:14.360 |
I think you have to be very careful about that "we." Are you talking about you and I as
|
00:09:17.200 |
private citizens and individuals, or are we talking about larger organizations and governments?
|
00:09:21.120 |
And I think there was a great example a few years ago at DEF CON, which is the big hacker
|
00:09:25.760 |
convention in Las Vegas every year where a reporter hired a hacker to hack into him and
|
00:09:31.920 |
see how badly he could mess up his life, and he did a pretty good job of it, to cut to the
|
00:09:36.800 |
chase.
|
00:09:37.800 |
But there was someone in that video interview from one of the media organizations who
|
00:09:41.440 |
was a hacker himself.
|
00:09:43.320 |
And I can get the name for your listeners.
|
00:09:44.880 |
But he basically said, hey, are you concerned about getting beat up by a martial arts
|
00:09:51.080 |
expert like when you walk through campus at Stanford?
|
00:09:54.560 |
And the answer is probably no. But you know that if a black belt
|
00:09:58.860 |
jumped out of the bushes like you would have no chance, right?
|
00:10:01.600 |
And absolutely.
|
00:10:02.600 |
Well, there are people like that on the internet and the odds are like if you start going
|
00:10:08.200 |
down dark alleys and visiting places you shouldn't go on the internet, you're going to
|
00:10:11.080 |
interface with these individuals.
|
00:10:12.440 |
But if you stay on the main, lighted thoroughfares and stick to the basics, you're probably
|
00:10:17.240 |
going to be okay.
|
00:10:18.720 |
So there's a difference between knowing the threat.
|
00:10:20.480 |
You're talking about the individual.
|
00:10:21.480 |
Right.
|
00:10:22.480 |
I'm talking about the individual just just right now.
|
00:10:24.200 |
And I would say, just to briefly hit that, it's what we call cyber hygiene.
|
00:10:29.200 |
Don't click a link that's sent to you.
|
00:10:30.480 |
You know, validate the address.
|
00:10:32.280 |
Look at the URL.
|
00:10:33.960 |
If you get an update email from Apple or Google, instead of clicking the link, go to the actual
|
00:10:39.240 |
website.
|
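[An editor's sketch of the "look at the URL" advice, assuming a hypothetical allowlist; real mail clients and browsers do far more than this.]

    from urllib.parse import urlparse

    TRUSTED_HOSTS = {"apple.com", "google.com"}  # made-up allowlist

    def looks_trustworthy(url: str) -> bool:
        host = (urlparse(url).hostname or "").lower()
        # Accept the trusted domain and its subdomains, so
        # "accounts.google.com" passes but "google.com.evil.example" fails.
        return any(host == h or host.endswith("." + h) for h in TRUSTED_HOSTS)

    print(looks_trustworthy("https://accounts.google.com/reset"))     # True
    print(looks_trustworthy("http://google.com.evil.example/reset"))  # False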
00:10:40.240 |
And it may be worth a side point here.
|
00:10:42.720 |
I mean, as an example, with the recent elections and the WikiLeaks hack of the Democratic
|
00:10:49.400 |
National Committee, you know, the John Podesta emails are a perfect example, where he received
|
00:10:54.040 |
a spear phishing email trying to get him to click the link to reset his password.
|
00:10:59.720 |
So the hacker could then get his credentials and access it.
|
00:11:03.080 |
And he recognized that it didn't look right and talked to an IT person who said, Hey,
|
00:11:08.720 |
yeah, that's legitimate.
|
00:11:10.360 |
The IT person meant it was a legitimate spear phish,
|
00:11:13.120 |
as in, go reset your password now.
|
00:11:15.760 |
But it sounds like, from the reporting, John Podesta took it like that was a legitimate email and he clicked
|
00:11:19.840 |
the link and reset his password that way.
|
00:11:21.880 |
And that's what opened up that entire line of events.
|
00:11:24.480 |
So even someone who actually knew, hey, that doesn't look right.
|
00:11:28.160 |
Sometimes miscommunications at the human level allow access like that.
|
00:11:31.720 |
So the first thing is like, are you a big target?
|
00:11:33.760 |
Do you have a lot of money?
|
00:11:34.760 |
Are you a public figure?
|
00:11:35.760 |
Are you someone that someone wants to go after?
|
00:11:37.200 |
And the second piece, if you're not, you just do the basics right.
|
00:11:39.960 |
Have decent passwords, update your computers on a regular basis.
|
00:11:44.520 |
Don't click things.
|
00:11:45.520 |
You're probably going to be safe.
|
00:11:46.520 |
Yeah.
|
00:11:47.520 |
Well, I'm not a big target at all.
|
00:11:48.880 |
But whenever I get emails like that, I never click.
|
00:11:51.240 |
If they really want it, they'll come back to me.
|
00:11:54.920 |
Or they'll call you, which is even better.
|
00:11:57.200 |
So yeah, for the individual, I understand that if you navigate carefully or sensibly,
|
00:12:03.320 |
you can be fine.
|
00:12:04.400 |
But it's the larger system, systemic issues.
|
00:12:09.480 |
Werner Herzog, the filmmaker, had a documentary come out a few years ago called
|
00:12:14.880 |
Lo and Behold.
|
00:12:15.880 |
Absolutely.
|
00:12:16.880 |
Yeah.
|
00:12:17.880 |
You've seen that.
|
00:12:18.880 |
And there he just imagines, like solar flares or something that can affect the internet
|
00:12:23.960 |
and communications.
|
00:12:26.120 |
And then there are other nations trying to disable the entire society of an enemy.
|
00:12:35.200 |
Anyway, that's the systemic thing is what worries me the most, because that's where I feel
|
00:12:39.200 |
most helpless.
|
00:12:41.440 |
Because it has nothing to do with my decisions, nothing to do with my agency.
|
00:12:44.840 |
No, you're absolutely right.
|
00:12:45.840 |
Realistically, the biggest threat, maybe, to you as an individual is to have
|
00:12:50.800 |
someone take your credit card, in which case you can cancel it.
|
00:12:53.800 |
You have very specific actions you can take.
|
00:12:56.200 |
But were someone to take down the electric grid, there's really nothing you can do
|
00:12:59.520 |
about that.
|
00:13:00.520 |
And I think there's a lot of concerns about the things that could potentially take down
|
00:13:04.960 |
things.
|
00:13:05.960 |
And it's an increasing fear in a networked society.
|
00:13:07.720 |
I mean, everything we have is connected.
|
00:13:09.600 |
And we start looking at the future, the very near future, with things like 5G, you're
|
00:13:14.240 |
going to see even more things be connected.
|
00:13:16.000 |
In fact, things like autonomous vehicles and other things that really work in the future
|
00:13:21.440 |
require this type of very high speed, low latency network.
|
00:13:26.720 |
But when you have that, that offers a lot of openings.
|
00:13:29.280 |
And I think the way to look at it too is every time you touch the internet, I mean, where
|
00:13:34.080 |
you touch the internet is an endpoint and that's a vulnerability.
|
00:13:36.560 |
I mean, another anecdotal story.
|
00:13:38.840 |
There was an example of hackers getting into a casino through the aquarium, because the aquarium
|
00:13:42.880 |
was connected to the internet and wasn't protected.
|
00:13:44.760 |
So you need to think of anything that touches the internet as a window or a door.
|
00:13:48.880 |
And if you're in your house, do you lock or close it?
|
00:13:51.800 |
But I mean, to get in that broader point, I think you're absolutely right.
|
00:13:55.120 |
It brings a lot of fear out there because of the potential, but at the same time, it's
|
00:14:00.160 |
really hard sometimes to quantify and understand whether that fear is well placed.
|
00:14:03.840 |
I mean, you're right.
|
00:14:04.840 |
A solar flare could generate something akin to an electromagnetic pulse that could shut down
|
00:14:09.720 |
electricity and the internet for a period of time.
|
00:14:11.880 |
But how likely is it?
|
00:14:12.880 |
What are the potential effects?
|
00:14:13.880 |
And that's the whole thing where you're trying to balance that risk.
|
00:14:16.720 |
I think that the other thing just to be aware of is just recognizing just in that network
|
00:14:22.680 |
point, what are the fail safes?
|
00:14:25.360 |
There have been examples where we start looking at like cyber physical attacks or the
|
00:14:28.760 |
ones you're talking about.
|
00:14:29.760 |
Where can I use the network to access a physical thing?
|
00:14:33.920 |
And the most famous example of that is the Aurora exercise put on by Homeland Security
|
00:14:38.360 |
in 2007, where they took a generator that was connected to the internet and then
|
00:14:44.360 |
explode the generator just by sending incorrect information to get it to run out of cycle.
|
00:14:48.960 |
And that's kind of the great fear of that is how do we kind of protect ourselves against
|
00:14:54.800 |
that.
|
00:14:55.800 |
And in some cases, it's as simple as like, hey, do you have a physical off switch?
|
00:14:59.560 |
You know, some of these systems, they're all electronic when it goes haywire.
|
00:15:03.000 |
It's all electronic.
|
00:15:04.000 |
You can't actually do anything about it.
|
00:15:05.000 |
It's almost like old cars versus new cars: old cars I can repair. New cars?
|
00:15:08.240 |
I lift the hood.
|
00:15:09.240 |
It's plastic.
|
00:15:10.240 |
There's no way to get in there and get behind it.
|
00:15:12.360 |
So having some type of physical human interface that can bypass or pull yourself off the
|
00:15:17.560 |
network is generally good; once you've disconnected from the network, you may have lost that
|
00:15:21.760 |
service, but it also has prevented continued damage until you can diagnose it.
|
00:15:25.880 |
Yeah, it brings to my mind Battlestar Galactica, which is a television series with
|
00:15:30.520 |
the horrible apocalyptic destruction of almost all of humanity, where the only people surviving are the
|
00:15:40.560 |
ones aboard a starship that was not connected to the mainframe.
|
00:15:44.320 |
So, Donnie, the brave new world that we're entering, you mentioned AI, but let's go through
|
00:15:49.000 |
some of these technologies that many of my listeners might have only a vague idea about.
|
00:15:54.960 |
So cloud technologies.
|
00:15:56.360 |
Yeah, so I mean, simply put, cloud technology is really still just a standard type of computing
|
00:16:02.440 |
thing.
|
00:16:03.440 |
It's a computer, right?
|
00:16:04.440 |
It's just the data is not on your physical terminal.
|
00:16:06.600 |
So I have my laptop up.
|
00:16:08.760 |
There's data that is physically on my computer, but there's also stuff stored in the cloud.
|
00:16:13.200 |
And what that does is they distribute that information over servers all over the world,
|
00:16:17.880 |
which gives you some type of redundant access.
|
00:16:19.720 |
So as you save information, it's stored in multiple servers that are all networked together.
|
00:16:25.120 |
So the cloud is still a physical computer that is connected to the internet.
|
00:16:28.320 |
It's just not yours.
|
00:16:29.320 |
What that allows you to do is, if your terminal becomes disconnected or lost, you can still
|
00:16:33.680 |
eventually access that information or you can access it from effectively any point on the
|
00:16:37.640 |
internet.
|
00:16:38.640 |
So you're basically throwing the information up into the center of that network's group
|
00:16:41.720 |
of computers, knowing that if you have the right username, password, authentication,
|
00:16:46.200 |
information, you can access it wherever you go.
|
00:16:48.720 |
So it allows you to maximize scale.
|
00:16:51.960 |
And it allows you to basically, instead of having to keep buying memory, you can rent memory
|
00:16:57.120 |
on larger computers.
|
00:16:58.800 |
So it's a very efficient way to manage businesses.
|
00:17:00.880 |
It's a very efficient way to manage information.
|
00:17:03.440 |
And it's a way to basically ensure the availability of that CIA triad.
|
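[An editor's toy picture of the redundancy Donnie describes: the same object written to several servers, so losing one endpoint does not lose the data. The region names are invented.]

    # Three make-believe data centers.
    servers = {"us-west": {}, "us-east": {}, "europe": {}}

    def cloud_put(key, value):
        for store in servers.values():   # replicate to every server
            store[key] = value

    def cloud_get(key):
        for store in servers.values():   # any surviving replica will do
            if key in store:
                return store[key]
        raise KeyError(key)

    cloud_put("essay.docx", b"...")
    servers["us-west"].clear()           # one data center goes down
    print(cloud_get("essay.docx"))       # the document is still available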
00:17:08.480 |
So if I want to see a cloud, the physical machine of it, is there any place I can go around
|
00:17:15.840 |
here?
|
00:17:16.840 |
Yeah.
|
00:17:17.840 |
There's probably servers here on the Stanford campus where it's going to be a--
|
00:17:20.840 |
That's what it is.
|
00:17:21.840 |
So you're in a dark warehouse room that's going to have rows of aisles, like a supermarket,
|
00:17:25.120 |
and every aisle is going to have a set of probably Cisco or other computer switches with,
|
00:17:30.080 |
instead of one ethernet cable, 50 to 100 of them sticking out, and it's all tied together
|
00:17:34.960 |
and it's connected to the internet so that when you save your Word document, it's actually
|
00:17:39.880 |
going through the internet and saving on a memory chip somewhere in one of those points.
|
00:17:44.920 |
So this is an immense service, which we all avail ourselves of.
|
00:17:49.480 |
And it sounds like we really should be quite grateful for something which for many of us
|
00:17:54.120 |
comes for free, you know?
|
00:17:55.640 |
Absolutely.
|
00:17:56.640 |
And I think that, you don't realize it, but you have Gmail.
|
00:17:58.600 |
I mean, all your data is saved up on a cloud somewhere.
|
00:18:01.160 |
And the real big providers are Amazon Web Services, Google Cloud, and Microsoft Azure.
|
00:18:06.280 |
Why do they provide those services if I can introduce my suspicion?
|
00:18:09.960 |
Oh, well, yeah, because they're getting information that they can sell to be quite
|
00:18:14.440 |
Machiavellian about it, right?
|
00:18:15.480 |
I mean, there's no one-- as we say in the cybersecurity world, if you're not paying
|
00:18:19.000 |
for the product, you're the product, right?
|
00:18:21.080 |
So in some ways, if you see something where you have a free option and one that's $5 a month,
|
00:18:26.320 |
sometimes the $5 a month is the safe one, because you're buying the ability to control your
|
00:18:30.000 |
information, whereas the free version, they're going to monetize otherwise.
|
00:18:33.200 |
That's why I like Netflix.
|
00:18:34.200 |
I know that I'm paying for it every month.
|
00:18:36.000 |
Absolutely.
|
00:18:37.000 |
I'm not the product.
|
00:18:38.000 |
Well put, yeah, great.
|
00:18:39.640 |
So, artificial intelligence is something that I've done a show on.
|
00:18:45.480 |
And every year it seems that it's becoming more of a reality.
|
00:18:49.840 |
I mean, the hyperbolic imaginations that we used to have about it.
|
00:18:54.600 |
But the android is not far off.
|
00:18:58.040 |
The autonomous vehicle is already becoming a reality.
|
00:19:01.760 |
Where do things stand with artificial intelligence?
|
00:19:03.920 |
What challenges does that represent?
|
00:19:05.560 |
Yeah, I think that artificial intelligence is-- you've got to look at it in a family or
|
00:19:11.200 |
a field, a wider field.
|
00:19:13.160 |
And I think when people hear artificial intelligence, a lot of times they jump to generalized
|
00:19:17.840 |
AI, which is what we think of as the Terminator, something that can kind of think and act
|
00:19:23.120 |
fully on its own.
|
00:19:24.680 |
I still think we're pretty far away from that.
|
00:19:26.520 |
I think if you talked to some of the experts here at Stanford, they would back that up.
|
00:19:30.320 |
I think I told you an anecdote at one time where a study had kind of looked at how many
|
00:19:34.160 |
effectively on-off switches are in a neuron.
|
00:19:37.280 |
And when you figure that out, the human brain effectively has more computing power than
|
00:19:41.400 |
every computer on the planet tied together.
|
00:19:44.400 |
So I think we're a long way off to getting a point where we can create computers that
|
00:19:47.720 |
can move that fast.
|
00:19:48.720 |
Okay, hold on, can I interrupt?
|
00:19:51.040 |
Then how is it that no human grandmaster of chess can even stand a chance to beat either
|
00:19:57.840 |
the-- what do they call it?
|
00:20:00.200 |
The Stockfish, or the new one we've got, the DeepMind stuff.
|
00:20:03.480 |
The AlphaZero or something.
|
00:20:05.640 |
Is that just because it's computational?
|
00:20:07.600 |
That's speed and computation. And also, the real field of AI that we're pretty
|
00:20:12.320 |
good at or getting good at is specialized AI.
|
00:20:15.240 |
Like we can build something that can do some very specific tasks much, much faster than
|
00:20:21.040 |
a human.
|
00:20:22.040 |
And I think that that's the difference is that the thing that AI and machine learning
|
00:20:26.240 |
bring is that it allows you to move at machine speed, which is effectively light speed, as opposed
|
00:20:30.400 |
to human speed, which is significantly slower.
|
00:20:32.960 |
So in some cases, the human brain may have the computing power, but it is not organized
|
00:20:39.200 |
and designed for that level of precision and speed.
|
00:20:42.400 |
And I think that's the real piece out there.
|
00:20:43.960 |
But the challenge we're really struggling with in AI is you still have to teach the system.
|
00:20:50.400 |
There's no-- there's not really one that can really completely learn on its own.
|
00:20:53.960 |
So what that really ties into is the machine learning aspect where you build a series of
|
00:20:58.800 |
algorithms.
|
00:21:00.240 |
And those algorithms try to read, for example, an image.
|
00:21:03.040 |
You still have to have humans go in and look and correct those images to say, yes, that's a
|
00:21:07.320 |
cat or yes, that's a dog.
|
00:21:09.240 |
And eventually the computer gets fast enough to learn and it can recognize that.
|
00:21:12.240 |
And you see that in what you can do on Google today.
|
00:21:15.040 |
Eventually, and we're getting to it now, and there are some projects at Stanford that do
|
00:21:18.520 |
this, trying to do what's called unsupervised learning, where I can build a set of algorithms,
|
00:21:23.320 |
I can feed in information, the computer can start to learn that information itself and
|
00:21:28.200 |
have a human kind of check it on the backside.
|
00:21:30.800 |
So that is really what's going to lead us to a level where that AI really catapults
|
00:21:35.160 |
to a point where it can maneuver and operate and think on its own.
|
00:21:38.440 |
But again, I think we'll find it still stays in a fairly defined lane; like an autonomous
|
00:21:42.560 |
car, it's going to do great at driving a car.
|
00:21:45.480 |
It's not going to make you espresso.
|
00:21:46.880 |
You know, like it's not going to be able to jump from one task to another very easily.
|
00:21:50.200 |
Although I did read this, what for me was a fascinating account in this science section
|
00:21:56.160 |
of the New York Times, I think I know exactly when it was because it was right at the beginning
|
00:22:00.200 |
of a course I was teaching on Giambattista Vico, and I brought it in as an example of this
|
00:22:07.160 |
Alpha Zero chess AI, which apparently taught itself by playing itself, and it took only
|
00:22:14.360 |
like 20 minutes to play itself, millions of times.
|
00:22:17.400 |
And having done that millions of times, it learned kind of on its own, unlike the Stockfish,
|
00:22:23.000 |
which it crushed in a competition, which was the kind of AI that you were describing, where
|
00:22:29.680 |
human beings have to supervise, have to input the principles and get it to do the thing
|
00:22:33.880 |
that humans have to think for it.
|
00:22:35.720 |
Whereas apparently this new one kind of was thinking for itself and had moves that at least
|
00:22:40.160 |
looked very intuitive and surprising.
|
00:22:43.840 |
And yeah, I think you're referring to AlphaGo, which was the Google project that
|
00:22:48.760 |
did the Chinese game of Go, which is, I know, fascinating.
|
00:22:51.240 |
I know about the AlphaGo as well, but this is called AlphaZero.
|
00:22:55.040 |
And the mathematician from Cornell who was the author of this article was speaking about
|
00:22:58.920 |
how that will soon be an Alpha Infinity.
|
00:23:01.960 |
Anyway, the idea was that it did not play chess like a brute machine.
|
00:23:08.880 |
It played chess with, how should I say...
|
00:23:10.840 |
Yeah, with some finesse.
|
00:23:11.840 |
Finesse, and it even engaged in what is known as romantic chess, which is intuitive.
|
00:23:18.960 |
There's such an actual thing in the history of chess called romantic chess.
|
00:23:22.720 |
Anyway that author then goes on to imagine a kind of Alpha Infinity where humans will turn
|
00:23:32.120 |
to that new form of intelligence for all the knowledge that we ourselves would never be
|
00:23:39.280 |
able to arrive at on our own.
|
00:23:43.160 |
And anyway, it was interesting to me because this guy I was teaching, Vico, said that we
|
00:23:47.800 |
can only know that which we make.
|
00:23:50.680 |
That making is kind of the condition for knowing.
|
00:23:53.440 |
But how we can make something whose knowledge we ourselves can't know, if you extrapolate
|
00:23:59.200 |
to that kind of Alpha Infinity future, that is a dilemma, a conundrum, that
|
00:24:05.680 |
is a paradox.
|
00:24:06.680 |
And I think you're absolutely right.
|
00:24:08.480 |
I think that's something I think we're struggling with.
|
00:24:10.200 |
I mean, you see things where you have these computers and these AI now that do their own
|
00:24:14.720 |
art, and some of it's bizarre, some of it's interesting. And then there was the
|
00:24:19.880 |
one example not too long ago where they started having computers kind of connect.
|
00:24:24.440 |
These artificial intelligences talked to each other, and eventually they invented their
|
00:24:27.240 |
own language, and no one really knew what they were saying, or whether it was gibberish,
|
00:24:31.440 |
or what was going on; eventually they just shut it down. And you're right.
|
00:24:35.000 |
We are making things now that kind of hit that paradox where we're making things that
|
00:24:40.640 |
make things that we can't understand, which is kind of fascinating in human history, I think.
|
00:24:44.680 |
So you mentioned 5G just very briefly in passing and this is another kind of specter that
|
00:24:53.240 |
is looming.
|
00:24:54.240 |
I hear it on news programs: beware, the 5G era is coming, and everything
|
00:25:00.240 |
is going to change with 5G.
|
00:25:01.800 |
Can you tell us what 5G is about?
|
00:25:04.680 |
It's interesting because I think that a lot of times when you think about the previous
|
00:25:08.160 |
technologies, which have kind of retroactively been termed 1G, 2G, 3G and 4G. When you line things
|
00:25:13.080 |
up by number you think each step is the same distance, but really 5G is an exponential change from
|
00:25:19.160 |
4G. And to put it in more specific terms, it uses much smaller
|
00:25:26.440 |
radio wavelengths which allows you to send a lot more information.
|
00:25:30.360 |
The problem with smaller wavelengths is that they get deflected more easily, but the technology has now
|
00:25:34.360 |
allowed us to shrink the antenna size and make compact constellations of antennas
|
00:25:39.560 |
that allow you to broadcast and connect in certain ways.
|
00:25:42.680 |
And really what it allows you to do is right now really the fastest internet you're probably
|
00:25:45.760 |
going to get is through a cable modem at your house.
|
00:25:50.360 |
This will allow you to transmit over the air probably something 10 to 20 times faster
|
00:25:56.600 |
than what's currently wired cable internet.
|
00:25:59.280 |
It will probably rival optical fiber. And the other piece is that it's not just the speed, it's the
|
00:26:04.040 |
latency.
|
00:26:05.040 |
You know when you access a website it's not just you ask one question and the database
|
00:26:10.360 |
spits out the whole website.
|
00:26:12.160 |
There's hundreds of little packets that transfer over and if any one of those packets gets
|
00:26:16.200 |
delayed because they're all taking some kind of different pathways to your device that delays
|
00:26:21.000 |
the reconstruction of the image but this will drop latency by 60 to 120 times what we currently
|
00:26:27.440 |
have.
|
00:26:28.440 |
What that allows you to do is move high fidelity data very quickly across a wide range of
|
00:26:34.240 |
devices and really what that allows you to do is do things like autonomous vehicles because
|
00:26:38.080 |
for autonomous vehicles to work we started out with using cameras and lidar to map what's
|
00:26:42.960 |
around us, but the real thing that 5G brings is allowing the different cars to talk to each
|
00:26:48.360 |
other.
|
00:26:49.360 |
So instead of sensing a car and interpolating what its speed is and moving around it, you
|
00:26:53.440 |
can actually communicate all of the diagnostics for those two vehicles and really time
|
00:26:58.320 |
how you're moving together.
|
00:27:00.000 |
So that really gives us the ability to take all of our technology and really exponentially
|
00:27:05.440 |
increase the way it moves, because it goes back again to the information; everything drives
|
00:27:09.920 |
on information.
|
00:27:10.920 |
If you can move information faster and with less latency it allows you to do things that
|
00:27:15.320 |
previously were unimaginable.
|
00:27:17.320 |
Yeah I see it coming down here.
|
00:27:20.000 |
You know, I bought my car the last year before they started putting computers into
|
00:27:25.320 |
the cars because I don't like cars that are smarter than me and I don't like the idea that
|
00:27:32.800 |
they have this electronic dependency but I see the day coming where it's going to be
|
00:27:38.560 |
obligatory by law to have a communication system so that your car can communicate with
|
00:27:44.560 |
other cars on the road otherwise you're not going to be allowed on the road.
|
00:27:47.880 |
So it is a brave new world, definitely, with this 5G. So, increased speed. I never need
|
00:27:53.720 |
any more speed than what I have; I'll tell you how primitive I am.
|
00:27:58.160 |
If I call up Wikipedia, you know, if I Google "cyber security," it comes up right away.
|
00:28:05.520 |
I can live stream things on a computer, and it seems perfectly fast enough for me, but
|
00:28:13.440 |
I guess people need more speed for these other sorts of operations.
|
00:28:17.280 |
So just another two things quickly before we move into the question of where is it?
|
00:28:23.680 |
Where are the real threats coming from these days?
|
00:28:26.000 |
So you have this thing here on quantum and edge computing?
|
00:28:31.240 |
Yeah and I'll just briefly hit that and honestly I'm by no means an expert in those points
|
00:28:36.080 |
but you know quantum computing is really what we're looking at is taking what we now
|
00:28:41.200 |
think of as physical switches and dropping it down to the quantum level so in other words
|
00:28:44.920 |
can you use the actual movement of atoms as an on/off switch because if you take your computer
|
00:28:51.640 |
and boil it down to the very, very bottom, all it is is
|
00:28:55.440 |
millions of on/off switches, and that's how they communicate and save all the data,
|
00:29:00.360 |
but right now you're hitting the physical limits of being able to create a chip that
|
00:29:05.560 |
can have that number of switches.
|
00:29:08.000 |
A quantum device allows you to bring those switches down to the atomic level which allows
|
00:29:12.480 |
you to pack even more switches into a confined space, which means you can just do a lot
|
00:29:17.040 |
more computing power. And then edge computing is kind of working with how you can
|
00:29:21.600 |
manage the data storage like we talked about the cloud and get that computing as close as
|
00:29:26.440 |
it can to the point where it's actually needed, and really, that's being applied
|
00:29:30.480 |
to, as we talked about, autonomous vehicles and other things like that.
|
00:29:34.320 |
That's great.
|
00:29:35.320 |
So lastly, this category of the brave new world: blockchain and cryptocurrencies.
|
00:29:40.360 |
Yeah and blockchain has a lot of interesting potential.
|
00:29:44.040 |
The biggest thing I would say about this is that I think a lot of people look at blockchain
|
00:29:47.840 |
and cryptocurrencies as one and the same, and they're very different things.
|
00:29:51.920 |
Cryptocurrencies are based on blockchain technology and concepts but you can do blockchain
|
00:29:57.800 |
for a lot of different things. What blockchain really is, effectively, is a
|
00:30:02.000 |
ledger system.
|
00:30:03.000 |
What it allows you to do is I would basically write down say a transaction and I would
|
00:30:08.920 |
write it down and I would show it to other people like for example yourself and friends
|
00:30:14.200 |
and each of you would validate that transaction, say, yep, that's a valid transaction.
|
00:30:18.240 |
At that point there's a way you can cryptographically hash that which is basically to lock
|
00:30:22.280 |
the data so that anyone can check whether that data has been tampered with.
|
00:30:26.720 |
And as you do that, each time you add another link to that chain of calculations, so you end up getting
|
00:30:32.320 |
a series of transactions which prevents anyone from being able to go back and tamper with
|
00:30:37.520 |
a previous transaction.
|
00:30:38.880 |
So there is potential for things like any type of business or financial transactions
|
00:30:43.360 |
or things like real estate, for example: when you do title insurance and title research,
|
00:30:49.000 |
you could do all that on the blockchain, and then you have an irrevocable, unchangeable
|
00:30:53.240 |
ledger system to manage things and that's really what the cryptocurrencies are based on
|
00:30:57.800 |
on that type of technology, but they're kind of two different things.
|
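[An editor's sketch of the hash-chained ledger Donnie describes, in a few lines of Python; the transactions are invented. Each block stores the hash of the previous one, so editing an old entry breaks every later link.]

    import hashlib, json

    def block_hash(block):
        # Hash a canonical serialization of the block's contents.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    chain, prev = [], "0" * 64                        # genesis marker
    for tx in ["Alice pays Bob 5", "Bob pays Carol 2"]:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)

    chain[0]["tx"] = "Alice pays Bob 500"             # tamper with history
    print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False: detected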
00:31:02.480 |
Great.
|
00:31:03.480 |
So maybe we can now move to this question of where the big threats are coming from. We touched
|
00:31:09.080 |
on it at the beginning a little bit but we can get more specific when it comes to for
|
00:31:15.200 |
example, the question of nation states. Is cyber attack going to be a new form of warfare?
|
00:31:21.320 |
Yeah, and I think it certainly is, and it has already in some ways become one.
|
00:31:25.000 |
I mean if you think about human existence there's always a series of conflict going on
|
00:31:29.760 |
and there's always an aspect of countries and organizations positioning themselves for
|
00:31:33.560 |
advantage, either on an individual, on a business or a nation state level.
|
00:31:38.360 |
And really, when you take a look at it now, as we see with military weapons, nation states
|
00:31:43.440 |
have been the only real groups or organizations that can muster the resources to build large-scale
|
00:31:49.640 |
weapons, things like nuclear weapons.
|
00:31:51.680 |
So while in some cases you can now have individuals and transnational groups
|
00:31:57.600 |
muster some technologies, still, on the world stage, the most dangerous cyber weapons can only
|
00:32:01.840 |
be built and utilized by nation states and you'll often hear the term advanced persistent
|
00:32:07.480 |
threats, or APTs, referenced there, because those are really the people that can really get in and
|
00:32:12.760 |
really break apart computers or hack into things at a level that's really hard to see.
|
00:32:18.600 |
So the other actors beyond nation states, these advanced persistent threats, would be cyber
|
00:32:23.000 |
criminals and hacktivists. Cyber criminals are, as you'd say, people using cyber and hacking
|
00:32:27.800 |
to basically steal money and use it for business purposes. And then hacktivists: generally, hacktivists
|
00:32:33.200 |
are individuals or groups that have largely used it, in some cases, more as a defacement
|
00:32:38.000 |
or a political movement or a sign; they use the hacking and cyber offense and cyber weapons
|
00:32:44.720 |
to basically make a political statement.
|
00:32:46.640 |
Those are the three probably big big groups.
|
00:32:48.720 |
I think the biggest thing to realize is that the real large-scale types of efforts out there
|
00:32:53.400 |
have moved way beyond, you know, an individual in a garage, and into an organized
|
00:32:58.760 |
business and organized military weapon that can be used to really position yourself in conflict.
|
00:33:05.800 |
Okay so I gather that the threat is mostly human.
|
00:33:09.880 |
You're not too worried about any kind of natural disaster or something.
|
00:33:15.960 |
Well, yeah, I think that there's certainly an aspect of that, and even in the cyber
|
00:33:19.360 |
security realm there's a whole realm or domain on physical security and physical disaster
|
00:33:24.320 |
response that has to go into everything.
|
00:33:26.440 |
So I mean, that's a whole kind of different range: if we have an earthquake that
|
00:33:30.760 |
takes out one of those server banks, how is the network resilient
|
00:33:35.800 |
enough to kind of move past that.
|
00:33:37.200 |
So I don't mean to move past that and say that's not a concern, it certainly is, but I think
|
00:33:41.920 |
that the thing that captures people's imagination oftentimes is more the nation state hacking
|
00:33:47.080 |
more than the solar flare although you're right both could dramatically affect the
|
00:33:50.760 |
network and connectivity.
|
00:33:52.480 |
So speaking about those kinds of malicious, intentional attacks, how are these attacks executed?
|
00:33:59.640 |
You talked about phishing earlier, but there's probably a lot more to it than that.
|
00:34:03.600 |
Yeah, absolutely. I mean, believe it or not, phishing is actually one of the most commonly
|
00:34:09.240 |
used tactics. And I think the first thing to think about is what is known as the cyber
|
00:34:13.480 |
kill chain. That's pulled from a military background, and what it does is talk through
|
00:34:18.280 |
what you need to do to get a cyber weapon to work.
|
00:34:20.960 |
And the first point is reconnaissance, which is doing your research on the target and figuring
|
00:34:25.640 |
out something simple. If we're going to talk through a phishing example: I want to hack
|
00:34:30.520 |
Robert Harrison at Stanford. What is Stanford's email address? It's @stanford.edu.
|
00:34:35.240 |
Okay, what are the names? Is it first name dot last name? Is it initials? Doing that type
|
00:34:39.680 |
of research, and once you've identified the individuals or the email constructs, now you've
|
00:34:44.840 |
got information on how you might be able to access them.
|
00:34:48.320 |
Then you have to weaponize: you have to build something that can be packaged into a document
|
00:34:53.520 |
or a link or something else so that once I get you to click or download or access it, it
|
00:34:58.760 |
gives an opening for me to get in. At that point I email that to you; that's the delivery.
|
00:35:03.400 |
Exploitation is me exploiting you clicking on that link, and then that allows me to install
|
00:35:08.440 |
malware on your system. Then I work command and control to talk back and forth with that malware,
|
00:35:13.640 |
and then eventually what I try to do is do lateral movement across the network, where I pick
|
00:35:17.880 |
the weakest point, or the person that has access to the network I need, and I use that as a door
|
00:35:23.120 |
to get in and try to escalate my privilege and get further and further and deeper into that
|
00:35:27.280 |
network to find what I want, and then actually extract that data or deploy the attack. So I think
|
00:35:31.960 |
that's the first thing to think about: it's an extended chain, and if you disrupt any
|
00:35:35.080 |
step in that process, that kind of defeats the attack.
|
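[The stages Donnie just walked through, encoded by the editor as an ordered list; the names follow the commonly cited Lockheed Martin formulation of the cyber kill chain, which matches his description.]

    KILL_CHAIN = [
        "reconnaissance",         # research the target (names, email format)
        "weaponization",          # package an exploit into a document or link
        "delivery",               # e.g., the phishing email
        "exploitation",           # the victim clicks
        "installation",           # malware lands on the system
        "command_and_control",    # malware talks back to the attacker
        "actions_on_objectives",  # lateral movement, escalation, exfiltration
    ]

    def attack_succeeds(blocked_stages):
        # The attack works only if no stage is disrupted.
        return not any(stage in blocked_stages for stage in KILL_CHAIN)

    print(attack_succeeds({"delivery"}))  # False: a spam filter alone defeats it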
00:35:41.040 |
So phishing is often that delivery method, and there are a couple of different versions out there. I'll throw some out
|
00:35:44.920 |
because they get kind of creative. There's phishing, but then there's also what's called
|
00:35:48.240 |
spear phishing. With phishing, you tend to have the Nigerian prince email that goes out to
|
00:35:53.360 |
a hundred thousand people; spear phishing is, I'm going after the CEO of this company, or really
|
00:35:58.560 |
any specific individual. What they call whale phishing is when you go after a big fish, a specific
|
00:36:03.520 |
individual. So if I'm going to try to specifically engage the CEO or a government official,
|
00:36:07.800 |
that would be the whale phishing. And then there are two other things out there which are kind
|
00:36:11.520 |
of funny, called smishing and vishing. Smishing is getting you to click on a
|
00:36:16.800 |
link or getting access by SMS texting you, and vishing is just calling you up on
|
00:36:21.040 |
the phone and getting you to give your password over the phone. And again, at DEF CON
|
00:36:25.400 |
every year they have a fantastic competition where you can enter the competition and, if they
|
00:36:30.480 |
pick you, you get a chance to do research on a target, and they lock you in a glass booth in front
|
00:36:35.440 |
of the audience, and you have a set time limit to try to get as much information out of the
|
00:36:40.160 |
person on the phone. And eventually the winner that gets the most points ultimately
|
00:36:43.720 |
will get them to navigate to a malicious page and open up a document, all via
|
00:36:48.440 |
the phone. And you can go look online; there are some pretty fantastic examples of what people
|
00:36:54.120 |
will just tell you. A quick aside, even: there was one the other day where someone
|
00:36:58.120 |
just walked down the street and started interviewing people about computer passwords, and they say,
|
00:37:01.920 |
like, what do you think a good password would be? And it's like, oh well, maybe, you know,
|
00:37:06.800 |
something that's important to me, like maybe my favorite color, and then maybe, you know,
|
00:37:10.680 |
my favorite number, and then maybe a birthday. And they're like, oh yeah, that's great, so,
|
00:37:14.320 |
where are you from? Yeah, so what's your favorite color? Oh, it's green. Or, you know,
|
00:37:19.000 |
maybe my high school mascot. Oh really, what high school did you go to? And you'll see them
|
00:37:23.560 |
just literally tell them the framework for the password and the actual password itself
|
00:37:28.080 |
in a five-minute interview with a stranger on the street. So again, it kind of goes back to what
|
00:37:31.960 |
I actually said: the weakness is often human. So that's kind of the phishing, social engineering
|
00:37:37.640 |
piece. There are other things out there; let me kind of just run through a few of them,
|
00:37:41.720 |
and we can deep dive into any of them. One is what's often called denial
|
00:37:46.040 |
of service, and what that effectively allows you to do is, if you think of the internet
|
00:37:52.200 |
as two computers talking to each other, think about Amazon, right? Amazon has thousands,
|
00:37:56.560 |
hundreds of thousands, millions of people all over, shopping at the same time. Well, if you
|
00:38:00.400 |
figure out how many incoming requests that website can effectively handle, and I triple it
|
00:38:06.400 |
and do it all at the same time, it can crash the website. And so using denial of service, or
|
00:38:11.240 |
distributed denial of service, allows you to basically use multiple computers, and oftentimes
|
00:38:15.600 |
malware will do this: they'll get malware on your computer, and when you're not using your
|
00:38:18.920 |
computer, in the background someone's using that computational power to basically send queries
|
00:38:24.600 |
to a website to basically engage and bring down that website. That's kind of denial
|
00:38:29.280 |
of service, and again, it's more of a temporary jamming sort of thing. Then another thing
|
00:38:34.880 |
people do is ransomware; that's pretty common for cyber criminals, and some nation states do it
|
00:38:39.920 |
as well, where they'll go in and lock down information and then extract a ransom to unencrypt the
|
00:38:44.680 |
information. And then cryptojacking is another one out there; this relates to the cryptocurrencies,
|
00:38:49.880 |
where, because it's not tangible money, it's all digital, right, so if I can get
|
00:38:54.760 |
ways to break into that, I can extract that information. And a lot of times the way cryptocurrencies
|
00:39:00.840 |
reward you, the way you gain money, is actually by adding to that transaction computing, which
|
00:39:05.440 |
eventually takes more and more computing power. And what you'll find is people install
|
00:39:09.360 |
malware to muster millions of computers, like your own computer, without people realizing,
|
00:39:15.000 |
and when you're at home and not using it, again, they use that computing power to crank and
|
00:39:18.120 |
gain cryptocurrencies for the attacker. So those are some different things that are used out there
|
00:39:23.840 |
as far as the threats. So I'm going to assume that nation states are intensely aware of the
|
00:39:32.160 |
threats and are also taking steps to protect themselves from them. No, absolutely, and it's
|
00:39:39.480 |
not just protecting themselves against other nation states; it's protecting themselves against
|
00:39:43.080 |
the same things, cyber criminals, hacktivists. I mean, a while ago, when we had a, you know,
|
00:39:49.160 |
incident with a Chinese pilot, I can't remember the specifics of it, a few years ago, Chinese
|
00:39:55.120 |
hackers, not sure who, took down some US government websites and defaced them. You know,
|
00:40:01.440 |
we are constantly working to kind of protect our networks and ensure that they're safe
|
00:40:06.320 |
but it gets very interesting when you think about how different nation states think about
|
00:40:10.800 |
things. When you think about, for example, in the United States, you might have a company
|
00:40:17.120 |
with intellectual property; another nation state may desire to steal that intellectual property.
|
00:40:22.760 |
In some nation states, you have a case where the government will help protect their companies;
|
00:40:27.080 |
in other cases, like the United States, you have an interesting point where we don't
|
00:40:30.800 |
necessarily protect all of our individual companies; we help facilitate that, but it's
|
00:40:35.080 |
one of those points where, if you detect an intrusion, you need to call the
|
00:40:37.640 |
FBI to come in and assist. And then how do we kind of break that apart and discover who's
|
00:40:42.400 |
out there? There are a lot of best practices which our Homeland Security distributes; there are
|
00:40:47.880 |
also what are called ISACs, where basically different industry verticals gather together
|
00:40:52.480 |
and share information with each other, so that if one person's attacked, you get a little
|
00:40:56.240 |
bit of a herd immunity, where you share that very rapidly and try to protect other people in
|
00:41:00.440 |
that industry against potential attacks from whoever they might be. So, Donnie, what about
|
00:41:04.880 |
the issue that gets a lot of attention these days about elections and the hacking, or how,
|
00:41:11.880 |
you know, Russia is trying to, and I believe Russia has successfully, I don't want to say,
|
00:41:19.440 |
destabilize, but it certainly seems that, with the chaos that it has engendered,
|
00:41:27.280 |
if not deliberately then they lucked out, because our reaction has been, you know, really
|
00:41:33.760 |
quite hysterical in the last two years, and we've gone into a mode of civil war in this
|
00:41:39.680 |
country, because we have a president who doesn't come out and openly denounce, you know, Russia
|
00:41:46.680 |
for what it's been accused of doing by the intelligence services, which is tampering with the
|
00:41:52.120 |
elections. Even if it was more or less not nearly as bad as one might have imagined, the
|
00:41:58.040 |
idea is that we now have this uneasiness about whether our election process has the
|
00:42:05.320 |
same kind of integrity that we would have assumed it had before. And are we just
|
00:42:10.360 |
overreacting to this?
|
00:42:12.280 |
Uh, yes and no. And I mean, I'm not going to comment on any of the political aspects there
|
00:42:16.080 |
with the president and how it works with Russia. I will say, though, that
|
00:42:20.360 |
when you talk about social media, and the way social media has the ability to amplify
|
00:42:24.760 |
voices, good, bad, and different,
|
00:42:26.960 |
I think there is a lot of potential for what are called influence operations. And I think
|
00:42:30.480 |
that, um,
|
00:42:31.360 |
there are two ways to look at it.
|
00:42:33.400 |
One is whether individuals or groups within or without the United States are
|
00:42:37.800 |
stimulating that type of conflict and stoking that conflict.
|
00:42:40.960 |
That's one aspect: to potentially influence an election, potentially influence a
|
00:42:45.800 |
government. I think the other way to look at it is
|
00:42:47.880 |
the actual hacking of
|
00:42:49.760 |
elections. And I think that,
|
00:42:51.280 |
in some ways, what's kind of interesting is both deal with information. And
|
00:42:54.600 |
the fascinating thing about the United States system is we are so distributed,
|
00:42:58.200 |
and this goes into kind of the resilience: when you distribute
|
00:43:00.960 |
systems,
|
00:43:01.920 |
it's much harder to hack, because if
|
00:43:04.760 |
there are a hundred systems, you can only hit two of them;
|
00:43:07.120 |
you're going to get that resilience. When you take a look, in the United States I
|
00:43:09.520 |
think we have something close to
|
00:43:11.460 |
nineteen thousand different
|
00:43:13.760 |
individual local
|
00:43:15.640 |
entities that handle elections,
|
00:43:17.760 |
so being able to twist the election process is enormously hard if you're
|
00:43:21.440 |
trying to come at it, and they all use
|
00:43:23.080 |
different systems, different authentication methods.
|
00:43:25.520 |
To that, I think that
|
00:43:27.080 |
there's certainly room there
|
00:43:28.640 |
to improve ourselves in election security and
|
00:43:31.120 |
do a better public private partnership there to
|
00:43:33.840 |
find ways to protect the individual vote
|
00:43:36.920 |
and I think that's probably attainable.
|
00:43:38.680 |
The point that you mentioned at the beginning, which is,
|
00:43:41.760 |
how do we
|
00:43:43.600 |
work with information that may be coming from a bot
|
00:43:46.840 |
or a
|
00:43:47.840 |
foreign entity
|
00:43:48.920 |
or a
|
00:43:49.560 |
disparate group,
|
00:43:50.680 |
and how do we
|
00:43:51.520 |
adjust how that
|
00:43:52.760 |
influence works on our fellow citizens,
|
00:43:54.960 |
that's a much bigger problem, and I think it's less technical, and it's more,
|
00:43:57.660 |
because it gets into the free speech aspect.
|
00:43:59.880 |
I think, as you say, you're talking about manipulating social media
|
00:44:03.640 |
and fake news and, uh,
|
00:44:06.000 |
finding ways to, uh, spread
|
00:44:08.440 |
false stories and incite from the inside all these,
|
00:44:11.560 |
uh, implanted, well,
|
00:44:13.040 |
I mean, inflame the passions, right. I think right now you've seen that, if you look
|
00:44:16.400 |
at something called deepfakes, which is
|
00:44:18.840 |
the ability now to take
|
00:44:20.440 |
a video of a person,
|
00:44:23.280 |
record and track the facial movements, and you can actually now
|
00:44:26.800 |
replay that.
|
00:44:28.360 |
In other words, I can
|
00:44:30.880 |
take
|
00:44:31.880 |
a video of a public figure and then videotape you,
|
00:44:34.880 |
and you can say whatever you want
|
00:44:36.640 |
and it can actually make the video look like they're the one speaking. So
|
00:44:40.320 |
you now have the ability to put a world leader's face
|
00:44:43.800 |
on
|
00:44:44.400 |
another person, effectively. And does that qualify as a cyber attack?
|
00:44:48.040 |
I think
|
00:44:48.800 |
it's certainly being looked at from a cybersecurity standpoint, because
|
00:44:51.920 |
one of the things we're looking at is: how do we
|
00:44:54.920 |
authenticate those videos? How do we encrypt them and
|
00:44:58.160 |
effectively apply a digital signature? You know, when I send you an email I can
|
00:45:01.280 |
cryptographically sign it so you know it came from me.
|
00:45:03.760 |
Is there a way to do that and watermark those videos in such a way
|
00:45:07.260 |
that
|
00:45:08.320 |
we can make sure we know the source? Because
|
00:45:10.240 |
if you take,
|
00:45:11.680 |
you know, maybe a country that's less stable, or even a country that is stable,
|
00:45:15.120 |
and you suddenly
|
00:45:16.280 |
put that country's leader up
|
00:45:17.840 |
saying something that can incite violence,
|
00:45:20.120 |
you have the ability to have a pretty profound effect on things. I
|
00:45:23.000 |
think
|
00:45:24.560 |
the challenge there is really, can we differentiate between,
|
00:45:27.840 |
say, in the United States,
|
00:45:29.320 |
a fellow citizen exercising free speech
|
00:45:31.800 |
and a
|
00:45:33.320 |
foreign entity or
|
00:45:36.040 |
machine-learning-driven
|
00:45:38.040 |
social media voice that is trying to incite conflict.
|
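The email-signing analogy above maps onto standard public-key signatures. Here is a minimal sketch using Ed25519 keys from the third-party Python `cryptography` package; signing whole videos, or watermarking them robustly, is a much harder and still-open problem, and the message below is just a stand-in for real media data.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The signer keeps the private key; anyone can hold the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"hash of a video segment"  # stand-in for real media content
signature = private_key.sign(message)

public_key.verify(signature, message)  # passes silently: content is authentic
try:
    public_key.verify(signature, b"tampered segment")
except InvalidSignature:
    print("signature check failed: the content was altered")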
00:45:42.000 |
Yeah, that's...
|
00:45:44.040 |
well,
|
00:45:44.920 |
it's hard to keep up with
|
00:45:46.360 |
what is possible and not possible. I think, just before we came on
|
00:45:50.100 |
air, you were telling me that there's now technology that can tell
|
00:45:53.840 |
how a person sits, and what their
|
00:45:56.320 |
habits are that you can track.
|
00:45:58.600 |
Yeah, and I think, you know, as an example I would say that,
|
00:46:01.800 |
and this kind of ties into overall cybersecurity,
|
00:46:06.280 |
one of the key things we use, when you talk about how a normal person can
|
00:46:10.000 |
protect
|
00:46:10.480 |
themselves with some cybersecurity,
|
00:46:12.440 |
one of the things we use is called two-factor authentication,
|
00:46:15.360 |
and you see that in a lot of things. What that really means is
|
00:46:18.400 |
something you know,
|
00:46:19.440 |
like your password;
|
00:46:20.680 |
something you have, i.e. a phone or device;
|
00:46:23.560 |
and something you are, i.e. biometrics.
|
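As a concrete instance of the "something you have" factor, here is a minimal sketch of the time-based one-time passwords (TOTP, RFC 6238) that phone authenticator apps generate, using only the Python standard library; the base32 secret shown is a made-up demo value.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    # Decode the shared secret from base32, the format used at enrollment.
    key = base64.b32decode(secret_b32, casefold=True)
    # Moving factor: number of 30-second windows since the Unix epoch.
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical demo secret
```

The server holds the same secret and accepts a login only if the submitted code matches, so a stolen password alone is not enough.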
00:46:25.840 |
The fascinating thing is biometrics
|
00:46:28.000 |
at large. You can find things like
|
00:46:30.000 |
a thumbprint
|
00:46:30.860 |
and a retinal scan,
|
00:46:32.360 |
but now it gets to a point of, like,
|
00:46:34.280 |
what time do you wake up in the morning? How do you sit in a chair, to your
|
00:46:36.960 |
example?
|
00:46:37.960 |
You know, how do you move around?
|
00:46:39.400 |
And it can start basically quantifying
|
00:46:42.000 |
your human habits to develop a profile, to say,
|
00:46:45.200 |
I can identify someone not just by their face
|
00:46:48.200 |
or other features, but by
|
00:46:50.720 |
how they walk, how they talk,
|
00:46:52.560 |
how they sit,
|
00:46:53.360 |
how they move.
|
00:46:54.640 |
And that has one fascinating aspect, in that
|
00:46:57.480 |
now I can protect my network, because I know that
|
00:47:00.320 |
when someone sits down in the chair and logs in, if there's a disparity between
|
00:47:03.640 |
those two,
|
00:47:04.520 |
I can lock down that system.
|
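Here is a hedged sketch of how a behavioral profile could gate a login in the scenario just described; the feature names, profile statistics, and threshold are all illustrative assumptions, not any real product's method.

```python
# Hypothetical per-user profile: (mean, standard deviation) for each
# behavioral feature, learned from past sessions.
PROFILE = {
    "typing_ms_per_key": (182.0, 21.0),
    "mouse_speed_px_per_s": (640.0, 95.0),
    "login_hour": (9.0, 1.5),
}

def is_anomalous(session: dict[str, float], z_threshold: float = 3.0) -> bool:
    """Flag the session if any feature sits more than z_threshold
    standard deviations away from the stored profile."""
    for feature, (mean, std) in PROFILE.items():
        if abs(session[feature] - mean) / std > z_threshold:
            return True
    return False

# Behavior that looks nothing like the stored profile would trigger a lockdown.
print(is_anomalous({"typing_ms_per_key": 90.0,
                    "mouse_speed_px_per_s": 1400.0,
                    "login_hour": 3.0}))  # True
```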
00:47:06.000 |
But it also has profound implications for a surveillance-type aspect, in
|
00:47:09.800 |
what can be done with that information
|
00:47:11.720 |
once it's collected.
|
00:47:12.920 |
So I think that's another thing I would always throw out to you: when you
|
00:47:15.160 |
fill out forms and you're
|
00:47:16.520 |
trying to decide what to put into a computer,
|
00:47:19.080 |
don't put anything in that you can't change.
|
00:47:21.400 |
You know, like,
|
00:47:22.320 |
you can change your name, you can change a password,
|
00:47:24.280 |
you can change your hair color, right? But
|
00:47:26.400 |
you probably can't change a fingerprint, and you probably can't change some of
|
00:47:28.760 |
these other things about who you are, and that's
|
00:47:31.280 |
the whole point. Why is it that Apple, or why do the iPhones,
|
00:47:35.360 |
they all have to have the thumbprint, or now the facial
|
00:47:38.720 |
recognition? Everyone
|
00:47:40.440 |
finds that such an advantage. It's convenient.
|
00:47:43.320 |
And I think that's one of those things where security is always
|
00:47:46.280 |
balanced with convenience. So you can make something
|
00:47:49.280 |
completely secure, I mean, really nothing in computing is, but you can get something to
|
00:47:52.880 |
the ninety-nine percent level, right,
|
00:47:54.360 |
but it's terribly inconvenient, almost unusable.
|
00:47:57.200 |
But would you advise us against that? Well,
|
00:48:00.400 |
what I would always advise, as a cybersecurity professional, is
|
00:48:04.440 |
that you always have to find the balance of
|
00:48:07.080 |
usability and convenience with security,
|
00:48:09.880 |
and a lot of times that comes down to assessing the risk properly and deciding
|
00:48:13.800 |
what the likelihood of this is, because if it's a high likelihood and it's a
|
00:48:17.160 |
high payoff,
|
00:48:18.280 |
you probably want to ramp up the security and drop the convenience. Just like
|
00:48:21.680 |
it's fairly easy for you to log in to your Stanford computer, but it's probably hard for you to access
|
00:48:26.280 |
the servers
|
00:48:27.480 |
you know, directly, right?
|
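That weighing of likelihood against payoff can be made explicit with a tiny scoring sketch; the 1-to-5 scales and the cutoffs here are arbitrary assumptions, just to show the shape of the tradeoff.

```python
def risk_posture(likelihood: int, impact: int) -> str:
    """Score risk as likelihood x impact, each rated 1-5 (illustrative)."""
    score = likelihood * impact
    if score >= 15:
        return "high risk: ramp up security, accept less convenience"
    if score >= 6:
        return "medium risk: add controls such as two-factor authentication"
    return "low risk: favor convenience"

# Direct server access: a likely target with severe impact, so lock it down.
print(risk_posture(likelihood=4, impact=5))
```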
00:48:29.760 |
Well, in the few minutes we have remaining, can I ask, it's not a personal
|
00:48:33.040 |
question, but a question about your biography. You, being a Marine,
|
00:48:36.840 |
have
|
00:48:38.200 |
given a number of years of your life to
|
00:48:40.080 |
defense.
|
00:48:42.720 |
Does going into this new
|
00:48:44.360 |
realm of cybersecurity, is that continuous with your vocation
|
00:48:49.400 |
as a Marine, or is it something
|
00:48:51.840 |
different? Do you consider it still part of the national defense, or
|
00:48:56.760 |
some other path that you're going to get into on the other side? I think
|
00:49:00.080 |
Yeah, I've had a great time in the Marine Corps, and I've been very proud of my service, and I think
|
00:49:06.000 |
the thing about cybersecurity is, in some ways you could argue that's
|
00:49:09.560 |
really where the fight is now: it's in cyberspace.
|
00:49:12.800 |
And that's not to say there aren't opportunities on the military side, certainly,
|
00:49:15.960 |
but I think, you know, when you think about it, especially here in Silicon Valley,
|
00:49:18.720 |
I mean, the amount of
|
00:49:19.840 |
research and development and new technologies out here
|
00:49:22.600 |
in this area is absolutely phenomenal.
|
00:49:25.840 |
You've got to realize, though, that intellectual property, other people are
|
00:49:28.240 |
trying to take it, and that has national security implications. So in some ways,
|
00:49:32.120 |
if you can
|
00:49:33.160 |
bring cybersecurity to private industry, it does, I think, help the national
|
00:49:36.320 |
security of the nation as a whole.
|
00:49:39.480 |
So,
|
00:49:40.680 |
what are your plans now that you have your degree from Brown University?
|
00:49:44.080 |
Well, I'm right now in the process of closing out my career in the Marine Corps,
|
00:49:47.400 |
and I'm working to transition.
|
00:49:50.360 |
I'm about to start doing some cybersecurity work
|
00:49:52.560 |
for some software-as-a-service companies here in Silicon Valley,
|
00:49:55.840 |
and hopefully help protect them and give them some of that knowledge.
|
00:49:58.680 |
Well, that's great.
|
00:50:00.440 |
Thanks for all your service. We've been speaking with Donnie Hasseltine,
|
00:50:04.080 |
who's been our guest here to talk about something that
|
00:50:07.280 |
I've learned a lot about in the last hour, and I know that all my listeners
|
00:50:11.360 |
have learned a lot
|
00:50:12.640 |
from what you had to share with us. So thanks again for coming on Entitled
|
00:50:15.520 |
Opinions.
|
00:50:16.480 |
It's been a pleasure. Thank you, and take care.
|