
Former Navy Secretary Richard Danzig on Cybersecurity  CSPAN  January 1, 2016 5:10pm-6:17pm EST

5:10 pm
board, the president's intelligence advisory board, and the homeland security secretary's advisory council. that's not all. he's been a trustee of reed college and the rand corporation, a director of the center for a new american security, and a director of saffron hill ventures, a european investment firm. in addition, he has recently been a director of the national semiconductor corporation and of human genome sciences, listed on the nasdaq. and there's a few more things -- i think we'd be here for a while. last year, he wrote a very influential, very thought-provoking report, which is one of the reasons we invited him to give the keynote speech. i can tell you that a lot of policy
5:11 pm
attention was paid to that report, and things have been moving. so without further ado, richard. [applause] >> thank you. good work. well, thank you very much. thank you for that nice introduction. very wise of you to truncate it. my reputation as a speaker had preceded me -- when i came early i saw that you had given out the best speaker award before i spoke. it shows very good judgment. and angelos mentioned that he truncated my career, or the description of my career. maybe this event will actually truncate the career itself. i actually, among other things, was a law professor. i remember having a strong sense about my teaching qualities when a student came up to me at
5:12 pm
the end of one of my classes and, gushing with enthusiasm, said, professor danzig, i just don't know how you do it. every lecture of yours is better than the next one. [laughter] >> i thought about that for a while and decided to quit teaching. it led me eventually into government where, among other things, i became secretary of the navy. i was speaking once, admittedly at some length, when a marine got up to leave. that didn't seem to me appropriately compatible with the dignity of the secretary of the navy, so i stopped him and asked him where he was going. i didn't think the dignity of the secretary of the navy was enhanced by his answer, which was that he was going to get a haircut. i said, why didn't you get a haircut before i began speaking? he said, i didn't need one. [laughter] >> so these factors all suggest
5:13 pm
you were very wise to give out the award long before i spoke. but i thank you for the opportunity. i'm going to try and do something a little difficult here, which is to provide sort of a bridge between the technology world that you all know so well and the washington policymaker world that you're now physically embedded in, but also functionally embedded in, as to cybersecurity issues. the challenge is to talk between the two worlds in a way that's not only descriptive and analytic but also prescriptive and suggestive. this draws, in some measure, on a paper that i published this last year on this concept of living on a diet of poisoned fruit. here is the organization's site --
5:14 pm
you can download it from the web, if you'd like. i'm going to go further than the paper, but some chunk of the footnotes and the like will offer background, if you want to pursue it further. what i'd like to do today is give you a sense of the world as i see it, in terms of particularly admiring the problem first. i'm going to spend a little bit of time just emphasizing the character of what we're having to deal with. then i'm going to try to go deeper and analyze it. by analyze it, i mean to try and get at some of the key things that underlie the world as we see it, to abstract beyond the perhaps everyday concepts to some basic propositions about why we're in the situation we're in. and then i'm going to try and give a description of terrain that i think would be more familiar to you, which is just some of the kinds of things we are
5:15 pm
trying to do about it. what i want to get to particularly is a set of recommendations. i'm not going to attempt to be comprehensive in those recommendations. i want, rather, to suggest by and large things that are new, things that we're not doing, that are not so much on our agenda -- not that i've discovered some incredible curative unknown to the world, but i think there are things to be said about this that are not enough on our agendas. so let me start by, as i say, admiring the problem. a common kind of phrase used is the notion that this is a wicked problem, by which it's meant that it's highly interactive. it has a number of different components, and these different components create difficulties in our resolving it, because in fact different parts of the problem can't be resolved without connecting with other parts. so technology, for example,
5:16 pm
interacts with the legal system. i've shown you just a little sample of concerns here. i particularly emphasize that we are concerned obviously with business realities. it's great to provide patches -- or maybe it's not so great, but it's something -- to provide patches on systems that have known vulnerabilities. but when you actually look at why it is that people don't download those patches, or why they delay, the reasons are highly varied and relate to business imperatives. it may seem like this is just a simple fix that i ought to be able to drive people towards. but they're integrating the new software into systems that are complex, because some of these systems aren't being shut down on anything like the frequency that would permit an immediate response.
5:17 pm
stores will implement, a few stores at a time, then move on to other stores, so you're always having some parts of their system lagging in the patching. or a power company will have an annual shutdown for maintenance, and as far as they're concerned, that's the occasion for updating. the idea of updating more frequently is one that they can readily understand but not readily integrate into their business model. you find problems like this with real frequency. there are also just the kinds of cultural problems in organizations. i was talking with a chief information security officer for one of the really brand-name companies about his difficulties. he said, every rule you come up with, even the most obvious, immediately people ask for exceptions. i said, give me an example. he said, i issue a decree: never tell anyone your password. what could be
5:18 pm
more basic? he said, immediately the c.e.o. tells me of course he's sharing his password with his assistant. how else can he or she get into the e-mails? if you're going to have some theory of change, you're going to have to take account of these variables. i think another aspect of admiring the problem is simply the speed of change. it's very difficult, i think, for washington to cope with -- maybe even for you all as well. and the example i use in the national security establishment to kind of bring home the basic point is a historical analog. think about the introduction of gunpowder into europe, circa 1300. it's something that over time changes the character of warfare, which is essential.
5:19 pm
now, in the warfare context, our notions of defense, of building castle walls that are straight, have to be abandoned. notions of chivalry and leadership change dramatically, because if i stand in front of the army waving my sword i'm going to be shot dead. organizations change, because now i need mass for firepower and i need the ability to bring my troops into a state of training that's more sophisticated than if i simply raise up farmers to be a kind of posse to go deal with something for a few weeks. so i begin to require standing armies with trained officers who even understand something about ballistics and the like. then the state changes, because i need to have, in the state, a capability for sustaining these armies, which brings me to taxation and the like.
5:20 pm
i need a munitions industry, because if i don't have a munitions industry, i'm lost in any future kind of combat. so everything changes -- the nature of warfare, the nature of the economy, the nature of the state. my observation is a pretty simple one, which is that the coming of the information age is not less significant than the coming of gunpowder. but all those changes took place essentially over the course of two centuries. the changes that we've experienced by and large, and the major changes in the information world, have occurred over two decades. so the speed of assimilation is just very, very difficult for policymakers and others. an official put it nicely to me when we were talking about this and i made the point i've just described. he said, yeah, the problem is that the technology changes at the speed of moore's law, and the people don't. what's in our heads doesn't
5:21 pm
change that fast. we have all kinds of legacy systems operating in this context. so you see the dramatic changes that we've experienced up until now. i'm going to say a little bit about the future shortly. but what i'd like to particularly emphasize is that quote from a famous, well-known computer scientist, william faulkner, who said the past is not dead. it's not even past. do i have to explain this reference? i'm not sure. these overlaps, these continuities from the world that has passed and that remain embedded in our systems, give us fundamental problems. so let me give you one example from the national security world that's a little subtle and illustrative, perhaps beyond your experience. i think it's important, in the way government officials think about digital information,
5:22 pm
that by and large these capabilities were developed in the warfare context originally, and our biggest development of them was in the context of espionage and intelligence. the national security agency, n.s.a., is obviously the leading arena of capability in this regard for us. it's striking when you think about the national security world that it has some kind of implicit norms. in the cold war -- and there's a tim moore who helped me think this out -- we basically didn't interact in a direct conflict way with the soviet union. whenever possible we avoided that. it involved various proxies. think about the vietnam war, cuba, other issues like that. but by and large, we didn't have direct confrontation. there was some sense of restraint, of what was on and off the road. at the same time, in the
5:23 pm
espionage world, by and large, the bets were off. if you could do something, then you did something by way of discovering intelligence and the like -- we were directly involved and in competition. now comes the cyber-world, and the attitude, i think, became all bets are off. it's unrestricted. we don't have these kinds of restraints. as another example, if you use a weapon in d.o.d. -- if you want to introduce a weapon -- there's elaborate legal analysis that says, is this weapon consistent with the laws of war? but what happens in the cyberworld is that the tools that you are used to, that you would use for espionage and intelligence gathering, can also be used for offensive purposes on
5:24 pm
the battlefield. those tools are treated as if they were information-gathering tools, and we don't have the kind of legal structure or the conceptual structure around them that we have for other things. that's one of the reasons i think that the government is struggling with the reaction to the office of personnel management hack that you're seeing, because the general historical attitude has been there are two kinds of things: there's warfare and there's espionage. the cyber tools, you know, straddle both of them. when they straddle both of them, it creates complications. so we have historic ways of thinking, and while the world is so rapidly changing, those historic ways of thinking are handicapping or limiting us. we have this kind of compartmentalization that no longer works out. we don't have these kinds of understandings of distinctions between offense and defense that
5:25 pm
we used to have. they no longer begin to work. and another example is the notion that the private sector is different from the public sector. we used to think of warfare in the public sector kind of context -- by public sector, i mean government. what happens when you begin to cross that line more freely? i want to take you back a little bit and just give you a little flavor of a chinese document written at the end of the 20th century. i published a piece in the new york times in 1998, doing what every pentagon official regards as the most fundamental and wonderful and important thing to do, which was i introduced a new acronym. and i was very proud of it. the new acronym i thought was very cleverly designed. it was called new. it stood for nonexplosive warfare. the notion was there are a lot of things that are coming that
5:26 pm
are weapons that don't go bang, that don't explode or are not kinetic. this had, i think, absolutely characteristic success for me, which is to say nobody talks about it. but i nonetheless sort of want to try and revive it, by revealing it here. there were these two chinese colonels in 1999. they advanced the notion of unrestricted warfare, which you can read right up here. the basic notion was we're coming into an era of technological violence that erases the distinction between battlefield and non-battlefield, and that the new concepts enable a new kind of warfare. they then went on to talk about new concept weapons. we're not trying to kill and destroy so much as we're trying to control. remember, this is 1999.
5:27 pm
a single stock market crash, a single computer virus -- these can be new concept weapons. what we're trying to do to achieve victory is to control, not to kill. we're entering an era of political, economic and technological violence. some morning, people will awake to discover with surprise that a few gentle and kind things have begun to have offensive and lethal characteristics. well, you, in light of the experience of the last decade, will not be surprised at this. we've seen these things. we've lived them. we see it in the world of business, where we're dealing with things like i.p. theft and other kinds of difficulties that i've sketched here. we see it as well in individual
5:28 pm
lives -- not only the negatives but also how the positives are intertwined with the negatives, the sharing of data and the like. and we're seeing it, in general, in the context of the new warfare that i've suggested to you. so where are we going in regard to this? i don't know. i don't think you know. i published a paper a few years ago called "driving in the dark" which got some attention, because the gist of the argument, as some others have made, has been that we can't see the evolution of this complex future. the emergent realities are going to be challenging for us, because in fact our headlights only go so far. and if you look at predictions historically, they're not very valuable. go back to 1990 and look at the predictions about how technology change will impact national security. the most striking thing to me is the
5:29 pm
paucity of attention to the internet. the internet is there. it comes out of darpa starting in the 70's. it's relatively robust in the 80's. it's all there. but we don't see it, except in retrospect. there's a wonderful book called "everything is obvious once you know the answer." in retrospect, we can see all this. in prospect, we're not good predictors. we need to recognize that, because it's extremely relevant to what we're dealing with here. the fact is, while i can point to the fact that i know something about the pace of technology change, and i know that transition will continue in ever accelerating kinds of ways, there's a huge variety of actions and actors out there, and innovation will occur that i cannot predict.
5:30 pm
i do know, though, what i am concerned about as a national security analyst, and what policymakers ought to be concerned about. very particularly, i am concerned about the destruction of societal properties -- things like the financial system, power companies, and the like, that provide a backbone for our capabilities -- and i'm concerned about how things may evolve for individuals apart from the state. my first reaction as the internet of things evolved ever further was to ask whether this represented a set of risks from a national security standpoint. my concern was not just that someone could hack my refrigerator or cause an individual automobile accident, but that if i am a terrorist group like isis and i want to create havoc, lack of trust, indeterminacy, and other
5:31 pm
contexts like that in america, maybe if i can make people very unsure about the safety of their automobiles by periodically causing them to wreak havoc, i could achieve political ends in ways that i care about. so there is a sense of the problem. at this point you may feel a little bit like this is just too much in some dimensions to cope with from a policy standpoint, but clearly it needs to be thought about. among the other parts of my background, i was at one point a supreme court clerk working not far from here for a supreme court justice. another supreme court justice, besides the one that i was working for, was justice douglas, who was well-known as a misanthropic sort of guy.
5:32 pm
he kind of loved mankind in the abstract, but hated the rest of us. one day, telling a story about his father, which was quite illustrative, he said his father was a minister who wandered around the pacific northwest. one day the father mounted his pulpit, looked out at his audience, and found just one guy sitting out there, and he said to that guy, do you really want me to go ahead with this service? the guy looked up at him, and, justice douglas said, the cowboy said, well, preacher, i'm just a lowly cowhand, but if i went to the field to feed 40 horses and found just one, i would not let that horse go hungry. so he decided to give the whole service -- sermon, prayers, hymns -- walked to the back, shook hands with the congregation of one, and the cowboy shook hands with him.
5:33 pm
as the cowboy proceeded to wander off, the father could not stand it and yelled, how did you like that? and the cowboy said, well, preacher, i am just a lowly cowhand, but if i went out to feed a field of horses and found just one, i would not dump the whole load on him. [laughter] so we have to get past wringing our hands, and in saying that i hope i have contributed some. i think we need to get at the root causes, and i will give you a summary that represents an abstraction of the phenomenon -- the odd complexity of these systems. the microsoft operating system -- they do not reveal the number of lines of code, but the ballpark is 50 million lines of
5:34 pm
code. i asked a person at a major corporate financial company to estimate for me how many lines of code his company maintains and he is responsible for. his answer -- one trillion. these systems are, as others have observed, the most complex kinds of systems we have invented, and that means we have extraordinary difficulty observing them, extraordinary difficulty comprehending what is happening within them, and they have exceptional vulnerability. take the stark notion of one bug for every thousand lines of code. a bug does not equal a vulnerability, but it gives us some sense of what is involved when you try to write out 50 million lines of code.
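the rough arithmetic behind that rule of thumb, sketched in python (the one-bug-per-thousand-lines rate is the speaker's stark illustration, not a measured defect rate):

```python
# back-of-the-envelope: latent bugs in a 50-million-line code base
lines_of_code = 50_000_000     # ballpark size cited for a modern operating system
bugs_per_thousand_lines = 1    # stark illustrative rate; a bug is not necessarily a vulnerability
estimated_bugs = lines_of_code // 1000 * bugs_per_thousand_lines
print(estimated_bugs)          # 50,000 latent bugs -- and a trillion lines implies a billion
```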
5:35 pm
in fact, in conveying this point to policymakers, which is extremely important, i think, their first intuition is, you guys created this problem. it is a technology problem. fix it. either, if i am a right-wing politician, you guys are a bunch of peace-loving hippies who did not care enough about security, or, if i am a left-wing politician, you guys are all capitalists who wanted to get the software out the door because that is what you got paid for and you did not care enough. i say to them, think about something in the world you know -- the u.s. tax code. the u.s. tax code is 4 million words. write me a tax code that does not have any loopholes. now,
5:36 pm
you might suggest they are writing tax codes with the intention of loopholes, but if you write a 4 million word document and you give me an army of lawyers struggling to find vulnerabilities in that document, i will find them. you ought to understand you cannot create something of that level of complexity without having these kinds of errors. now give me 50 million lines of code, which are, of course, even less observable to the author. and remember, if you reflect on it, this is a mass production operation. it is not like some single person at microsoft sits there writing 50 million lines of code and comprehends it all. nobody comprehends it. it is put together as a variety of different things. i do not think there is a technical answer to this readily available at the scale and complexity at which we need it. when you get beneath admiring the problem and start analyzing it, it is compounded by the phenomenon of extensibility. my system has to work with an adobe system, etc., etc., and that interactive effect will create complexity beyond anything my own system did, even if i could
5:37 pm
somehow perfectly generate my own system. it is like the tax code having to work with a whole panoply of different business laws and state laws. beyond that, i have a communications problem. the systems are designed to communicate. you understand that. you can hardly begin to understand how novel that would have looked 40 or 50 years ago if you could go back. in the late 1990's, the director of the cia, george tenet, says with shock in testimony in the senate, the enemy is on our systems, our networks are open. and of course that is the case because of the nature of our communication. the more you let people in, the more you connect functions up, the more you enhance the risks associated with these complex systems. you understand how fundamental
5:38 pm
it is that we create this communicative power. the systems also transfer information -- take, for example, snowden. we have had historically many people like snowden -- people that come in and take documents, whatever their motives, and then hand them along. what is unique about snowden is 1.7 million documents. we have never in the history of espionage had anybody take 1.7 million documents, but it is a consequence of the fact that in these very complex and communicative systems, we transfer information, which is inherent in the virtues of the system.
5:39 pm
i want to create a world in which an analyst can get at information across a number of different domains. i want to have that ability. if i am running a power system, for example, i want to see the whole span of transmission lines and the like. or, if i am running a pipeline system, which valves are open and which valves are closed. i want, as it turns out, to collect information, and the information age enables me to do that. the internet of things will expand my capability to do that. what we did was to collect some 29.9 million documents, including from me, forms that ran 100 to 200 pages, including relatives, foreign contacts, histories, embarrassing evidence, and the like, put it all in one place, and created a situation so that anyone who hacked into the system could conveniently have it all, whereas in a pre-cyber, predigital age, it was not that concentrated. the digital age enhances that capability of concentration. a smart man at microsoft invented the phrase disintermediation.
5:40 pm
one of the advantages of the digital world is we take human beings out of the loop. it is terrifically advantageous. if i have to deal with people who are intermediaries in making my dinner reservations, travel reservations, buying my tickets, i feel frustrated as compared with digital opportunities to do that myself. on a larger, national scale, it is hugely valuable in government that i democratize information. when i was secretary of the navy, i introduced an intranet system that had all kinds of technological advantages and saved all kinds of money.
5:41 pm
what i really valued was that i could empower somebody in the bureaucracy who needed a new aircraft part to simply see the inventory and order it up without going through the silo of the warehousing people, the logistics people, and the like, all of whom held information as a source of power and created division within the organization. removing those human beings is usually beneficial, but it also removes gatekeepers, guardians, people that might observe what is happening -- wait a minute, somebody is exfiltrating this financial information, or i got a request
5:42 pm
not just for a new password, but for 50 new passwords -- all these changes that human beings might observe. finally, these systems are amazingly flexible. we value the fact that our computers or laptops can do so many different kinds of things -- word processing, communication, spreadsheets, etc. the basic point i want to offer comes back to the title of the paper. this is poisoned fruit. this is not a luddite position i am taking. there is no way to turn the clock back. i do not want to turn it back. but we need to recognize that inherent in each virtue that i have summarized along the side is a risk -- that to the degree that i concentrate or communicate or take people out of the loop,
5:43 pm
etc. -- to the degree i buy the benefits of this technology -- in each and every one of those steps i have introduced security consequences that give rise to greater risk. the virtue of the system is intertwined with its limitation, its liability, and its risk. that is fundamental. it is not just that the complexity of the system gives me these problems. one reason technology fixes do not just get me there is that every time i buy more security i tend to do so in ways that involve some sacrifice of virtues. i want to spend a minute, having talked about software, just to say a little bit about hardware. the hardware insecurities are quite real as well, and you are aware of that. an easy example i like to give is that people think about the supply chain in all kinds of sophisticated ways -- what is being made in china that goes into the f-35 or the latest fighter
5:44 pm
aircraft, etc. what i am struck by is that even if you preserve your whole system, if it turns on something as simple as a power adapter made with a device that can be used for hacking your cell phone, that creates a fundamental problem. the range of issues is extraordinarily great here, and from an espionage standpoint, i just would point out to you -- you are all familiar with the stuxnet experience. the iranians moved to a particular set of frequency converters and the like because they became convinced that some foreign power had hacked into what they were buying from abroad to install in their nuclear establishment, and they had to begin to produce their own stuff, which, then, of course, set them up for the vulnerabilities of their own stuff and introduced a variety of inefficiencies. the global supply chain gives rise to ever more vulnerability associated with the hardware world, and i want to show this sophisticated audience the point by just giving you a chance to reflect for a moment on a statistic you are not often probably exposed to. i want to ask a simple question with respect to transistors. there is a nice, little cartoon -- this guy says, "it is time for us to spend more time with our children." he says to his wife, "how many do we have?"
5:45 pm
if you think about that as a problem, think about the transistor world and imagine the question: how many transistors are manufactured globally every second? i just want you to think about the answer, and i'm not going to embarrass you by having you stand up, or embarrass me by assuming i already know the answer. when i first started thinking about this i did a back of the envelope calculation, and the number was so unnerving for me that i managed to get some friends at intel to go to work on it. they commandeered the research department and came up with a number that was so disorienting that we had a final couple of hours of phone calls and agreed on a number. i want you to think of the question -- how many transistors are manufactured worldwide per second?
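as a rough cross-check of the kind of back-of-the-envelope calculation the speaker describes -- the annual-output figure here is an assumed ballpark of my own, not the speaker's or intel's number:

```python
# rough estimate: transistors manufactured worldwide per second
transistors_per_year = 1e21            # assumed ballpark for annual global output, dominated by memory chips
seconds_per_year = 365 * 24 * 3600     # about 3.15e7 seconds
per_second = transistors_per_year / seconds_per_year
print(f"{per_second:.1e}")             # roughly 3e13, i.e. tens of trillions per second
```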
5:46 pm
just as a measure of how well you understand this -- do you have your number in your head? every second, 14 trillion transistors. the complexity of this system, the difficulty of policing it on the hardware side, needs to be appreciated. then, of course, there is the human side. here is a nice picture of snowden. before snowden, we had manning. the openness of the system to third parties is striking. one of the leading theories about stuxnet is the iranians thought they had the system air-gapped. there was a distance between the centrifuge system and outside software -- a physical separation. of course, all kinds of things happen -- patches come down, the system needs to be updated, contractors need to go in, and one of the theories is maybe some contractor got infected, brought in the virus, etc. if you are running a worldwide corporation, an aerospace
5:47 pm
corporation, for example, you have to integrate with all kinds of suppliers from all kinds of portions of the world, and that then causes you to share information. lots of people have access to the information -- huge problems -- and even when those people are well-intentioned, the ability to manipulate these people is pretty great. if you have not read the books on social engineering, it is worth reading them to realize that you, too, can be fooled with some clever social engineering. every system, when you look at it, winds up having management problems and configuration problems outside of the software and hardware. i've given you my password example already.
5:48 pm
so, you are familiar with many of the efforts to deal with this -- the countermeasures have a long history. we know we have tried barriers and training, but we have fundamental problems with these. they leak very badly. the screening and the antivirals -- you are familiar with the set of issues: the dependency on existing signatures, the way the antivirals lag the attacks, the way many import vulnerabilities themselves and can be used as the basis for exploits. we have done a lot of hunting for vulnerabilities. it is nice to see the rise of that effort. i think it is going to produce some benefit, and so is active defense, but monitoring the situation and the like yields limited benefit. we can create enclaves and encrypt to greater degrees -- a useful kind of thing, but again, the information needs to be shared, and when we get into the sharing, we get into all of
5:49 pm
the inherent software vulnerabilities that may exist. it is hard for me to believe, and when you talk to sophisticated inside operators, it is hard for them to believe, that they cannot get into almost everything if they really care enough and have enough resources. i talked to someone who makes a career of it. he goes around dealing with complex industrial systems. i asked him how many times he has been unable to penetrate a client. he says it might have happened once. it is that unusual. the vice president for security at
5:50 pm
google has said in a public context that when google organizes red teams, they succeed in getting in 60% of the time. they are thwarted 40% of the time. that is google defending itself. i think we overstate the degree to which we can defend these systems, and what happens is corporations like to hire red teams that affirm the qualities of the people that hired them, so you do not wind up getting good penetration analysis, ultimately, about what serious attackers will do. i'll come back to the deterrence point. i want to note that what we are doing is raising the cost for attackers, not actually preventing them. one of the things i did was go to a company that does vulnerability hunting and exploit development, and i asked them to go back to their records and show me -- as a rough indicator, and i do not want to make too much of this, it is just illustrative -- what has happened over time in terms of their production function for vulnerability discovery. how many researchers of medium quality do they need to find vulnerabilities? basically, their chart from 2006 to 2007 shows the production function, and it has gotten harder to find
5:51 pm
vulnerabilities as we get things like fuzzing and all kinds of tools that are out there. but if one researcher could once find two significant vulnerabilities in a year, and now finds only half of one -- that is to say, it takes two years to find one on average -- we have significantly raised the cost for attackers. it is now four times as difficult as it was before, by this rough, illustrative measure. but it also just means that if you hire four times as many people, you can produce the same number of vulnerabilities.
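the cost arithmetic in that illustration, written out (the before and after rates are the illustrative figures just described, not measured data):

```python
# how much the attacker's cost rises when researcher productivity falls
vulns_per_researcher_year_before = 2.0   # one researcher used to find ~2 significant vulnerabilities a year
vulns_per_researcher_year_after = 0.5    # now ~1 every two years
cost_multiplier = vulns_per_researcher_year_before / vulns_per_researcher_year_after
print(cost_multiplier)  # 4.0 -- roughly four times as many researchers for the same output
```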
5:52 pm
here is the kind of report we get every week -- a description of substantial vulnerabilities. look at the successes enjoyed by the top-level people, or even the people who are not at the very top of the distribution and yet win the top prizes. you understand all of this. hopefully this conveys the problem, and i want, in my closing minutes, to talk about my overview of how we can improve the situation given where we are. the fundamental proposition that emerges from this is presumed vulnerability. presume digital vulnerability, and in critical systems treat this as though it is contested territory -- a phrase used in some congressional testimony. create lean systems, that is, ones with fewer attack surfaces, and recognize that this is poisoned fruit -- go on a
5:53 pm
diet. ask yourself, do i really need this functionality? because it is introducing vulnerability, and that does not just mean enclaves and the like. the easiest example for me is a printer. most people think they want a printer to print. it seems pretty evident. they do not think enough about what marrying the fax capability with printing capability does for the outside world. how about the fact that my printer has a bluetooth capability that enhances its vulnerability? how do i feel about memory in my printer? most people are buying memory in their printer that they do not want. i come back to the example of snowden. he could steal 1.7 million documents. how come he could copy them? i can see why he could get access, but as an administrator, there is nothing that gave him the need to take the stuff out. as far as i can tell from the outside, the answer to that is the nsa people are not dumb. they disabled the computer
5:54 pm
capability that would enable copying, but snowden is not dumb, and he, with a screwdriver, re-enabled it. so my question is, why did it have that capability to begin with? and the answer is because we buy standard computers with a huge range of capabilities, when what we should be doing is thinking about buying leanness, slimming down the system. we have to think about the virtue of analog. one of the things frustrating about stuxnet was it was not just a penetration of the systems that control the centrifuges. it was, and we know this from the public documentation, a system that also deceived the iranian operators as to what was happening by playing back to them
5:55 pm
the simply standard operation of centrifuges when, in fact, the centrifuges were spinning out of control. a fundamental design proposition from this is: do not convert your situational observation capabilities and your safety systems to the same modality as your operating systems. if the iranians had had a plain, old analog system that measured the vibrations of centrifuges and, when the vibrations began to get out of control, sounded a physical alarm that rang, no digital attacker from a distance could have thwarted that system. maybe somebody could have gone in the room and disabled one, or five, or 10, but the centralization and the concentration of the digital system would have been offset by a plain, old analog system. or take another example:
5:56 pm
i have a friend who comes out of the central intelligence world. he is paranoid. his paranoia leads him into having video cameras everywhere in the house, but being a smart guy, he is paranoid about his video cameras and worried that people from the outside world might tap into his cameras and observe everything in his house. so what does he do? he puts an index card next to the video cameras when he goes out, so that if a camera is swiveled, the index card falls over. an analog system. all i am suggesting is we need to go through the system and, where possible, insert analog capabilities. this is a back-to-the-future prescription, but if you believe, as i do, that digital systems are inherently insecure, you want this complementarity here.
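a hypothetical sketch of that out-of-band analog check in python -- the function name, thresholds, and readings are invented for illustration, not any real control-system interface:

```python
def analog_cross_check(digital_rpm_report: float, analog_vibration_g: float,
                       nominal_rpm: float = 1200.0, vibration_limit_g: float = 2.0) -> bool:
    """Return True when an independent analog vibration sensor disagrees with
    what the digital control system claims, so a hard-wired bell should ring."""
    digital_says_normal = digital_rpm_report <= nominal_rpm
    analog_says_trouble = analog_vibration_g > vibration_limit_g
    # the point of the analog path: an attacker who rewrites the digital telemetry
    # cannot also rewrite a bell wired to a vibration switch
    return digital_says_normal and analog_says_trouble

# example: telemetry reports a normal speed while the analog sensor shows violent vibration
if analog_cross_check(digital_rpm_report=1000.0, analog_vibration_g=3.5):
    print("analog alarm: physical inspection required")
```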
5:57 pm
you also want to separate the systems so the contamination of one does not lead to others. you want, in the words of resilience theorists, to decrease the amount of coupling, the degree of integration. i like having an apple system alongside a microsoft system simply because i have, then, some diversity of vulnerabilities, or, to use a phrase repeatedly used, i want to avoid a monoculture. i want to create resilience in the system. from a public policy standpoint, i ought to be saying, hey, there are some systems out there -- the power system, the financial system -- that this country depends on so fundamentally that i am going to
5:58 pm
impose some degree of requirement on them that they measure their insecurity, because they are what we recognize in other arenas as too big to fail. we regulate our airline systems and we need to do similar things in the internet world. but you have to recognize the speed of change, so you cannot, in my view, regulate by saying we must do x, y, and z, because when you do that it becomes too limiting. i would not have some overall cyber czar impose it. i would encourage the relevant departments to use persuasion, subsidies, everything possible, including, ultimately, regulation, to get companies to the point where they provide a convincing case that they have done what they could to reduce the vulnerabilities they have supplied and the like.
5:59 pm
disaggregating the problem is very important. we need to recognize the fundamental differences between different industries in this context. for example, the finance and the power industries are dramatically different in their business cases. i will come back to this point about the interaction between technology and its culture. if i run a finance company, i am being attacked all the time, every day, millions of times. i am constantly refreshing my software and policing my boundaries. i understand that my fundamental assets are digital. they are not physical, and i am at the very cutting edge of software and the like. what i want from government is information about attacks. by and large i want them to help share that kind of information, and i want them to leave me a high degree of freedom.
6:00 pm
if i run a power company, i am not as used to these kinds of frequencies of attack. i have a much slower cycle of operation. my financial base is regulated. i cannot pour money into it at any given time. i have an annual maintenance downtime period, and i will react more slowly. i need a lot of basic education from the government about what is out there and some raising of my standards with regard to it. that is just an example. therefore, i am very inclined to push this problem, within washington, to each of the relevant departments. i am also very much an enthusiast about longer-term research and development in this arena. i have sketched why it is that i think the problems are inherent in the technology, but there are opportunities in terms of making encryption easier to use -- something we had just talked about here -- in terms of the use
6:01 pm
of formal languages to scale up our capabilities to provide more protection, and in terms of the basic design of our systems with a security focus, which would yield, i think, a lot more benefit. so, for example, i am the former secretary of the navy, so i say, as the navy develops its next generation of submarine or destroyer, let's make it a national goal to ask: how will i design the system so as to reduce the vulnerability, minimize the amount of poisoned fruit it consumes, maximize my use of analog and out-of-band systems, maximize my use of formal language protection at key junctures, etc.? when i design the system that way, i'm going to come up with something very different. in the navy world, i wanted the captain to see it all. of course, in the digital age that creates a concomitant set of vulnerabilities.
6:02 pm
how would i design my ship fundamentally differently, conceptually, if i take account of this? that is a project worth doing over the decades. my basic theme in pushing the analog notion to regulators was: you guys are really good historically at safety analysis, regulation, and the like. you say, for example, that we need -- i will make up a number -- 10 different cooling capabilities so that if one, or two, or three fail, i have the
6:03 pm
ability to bring in the others, and when you get to some significant number, you think you have built in resilience and safety. but if all of those redundant systems are run through the same digital controls, an attacker can reach them all through one channel, and you have created one failure point. you need as regulators to recognize that as we design new systems, and we need to create good thinking there. also, because we care so much about the business culture -- it is not just a technological problem -- the tendency is to invest in technology.
6:04 pm
i would like to see many times more investment in the anthropology associated with the use of these systems. it is worth asking why johnny does not use encryption. we need to understand better what is going on within the systems. we need a better pool of attack information, and i suggested in that paper a year ago an example of how this can be done. a private corporation in the faa world, the federal aviation world, recognized that the faa collected data on all aerial accidents, but did not on near misses. it was a big issue -- how do we share information about near misses? people said, i do not want the faa regulating it. so they set up a private entity. in the beginning, one or two airlines cooperated. with time, ever-increasing numbers did, and now it covers, essentially, all of the system. the faa has a seat on the board, but does not control the information.
6:05 pm
the information is then shared more broadly. that is a readily achievable model. i do not need legislation and the like. i can move forward in that regard. power companies are beginning to do some of this. i would like to add that i am very concerned about the federal workforce, and this is an example of how we might move, in this context, away from the traditional ways of doing business, in which historically those legacy systems are so embedded -- the past is not dead, it is not even past. we have a set of federal hiring and training standards that are very ill adapted to the digital world that i just described. my suggestion is we could create a federally funded research and development corporation that is a private company. there are many such that already exist. i am on the board of one. i am not pushing this for them, but just as a general proposition, we could create a new one that would be a place for people who are cyber-skilled to come together and be hired. how would you hire them? i do not really care, in the arena you work in, so much
6:06 pm
about degrees and traditional credentials. what i care about is actually finding vulnerabilities or dealing with other people's exploits. maybe i should hire the winners of competitions who achieve in this context. what do i want to do in terms of training and the like? i want to put these people together in a pool, because so much of what is learned is learned on the job. they learn from each other as peers. i want somebody who is cyber-skilled to run this. i want to think about putting this in silicon valley. why do i need it in washington? suddenly, i draw on a different talent pool that i can tell you i think we desperately need. i want to conclude by talking about norms and deterrence, and then i want to invite your questions. there is a lot of fuss now about
6:07 pm
the challenges of our conflicts with the chinese -- people indicted in pittsburgh from the pla, the people's liberation army in china. we have obvious sources of conflict over espionage and i.p. theft. the point i would make is we also have some areas of strategic stability. we have not yet seen states attack one another, to speak of, in the context of shutting down their power systems, undermining their financial systems -- the big, catastrophic kinds of things that i would be concerned about. i think we and, for example, china or russia have common interests in avoiding that kind of warfare. it is just not good for either of us, ultimately. to give you a slightly more technical example, i do not think it serves either china or us to probe each other's nuclear command and control networks with our espionage tools. why is that? we recognize we have systems that have gone under the name of mutually assured destruction, mad.
6:08 pm
if china only has the ability to strike first and we threaten to destroy their missiles, they need to launch. if they know they have the ability to survive an attack -- even if we attack first -- the situation will be more stable and they will be less likely to attack on a hair trigger. if the whole system depends on cyber systems, and the system is inherently insecure, and we are out there probing them and they are out there probing us, and the same tools that can be used for espionage can be used for offense, i have now introduced insecurity into the world. i have moved from a world of mad to a world of mud -- mutually unassured destruction. it is less stable. we can find our way towards some emergent norms where we agree that
6:09 pm
there are certain kinds of things we will not do. we will have problems with this -- challenges in negotiating it, enforcing it, and the like. we can talk about them if you like. but the reality is, when we entered the nuclear age we had no idea about arms control or how to do it, and slowly over time we began to find our way toward some stability, and we thought about these things and began to articulate them in norms, arms control agreements, and so forth, and we need to do the same thing in the cyber world. i have given you a real soup-to-nuts talk. it might be that i have dumped the whole load on you, but i suggest that we all see it is a brave new
6:10 pm
world, but that these changes are the equivalent of gunpowder over a period of time. we need to analyze the core of our problems, and i've tried to give you a summary of that today, and to think not just about the kinds of incremental activities i've made reference to, but also, most fundamentally, to try to structure institutions, programs, and norms in light of our analysis in ways that can make us stronger. will we ever be completely protected and ok in this area? no, but we can improve our batting average quite significantly. i cannot guarantee performance or success on every single pitch that is thrown at me, but i know how it is that i could get to a
6:11 pm
better world with a much higher success rate, a better batting average, and, boy, i would extraordinarily value that. i thank you all for not having gone out for haircuts, and i invite your questions. thanks very much. [applause] >> coming up tonight, a focus on the prison system. panelists discuss overcrowding, rehabilitation programs, and what happens to prisoners after they leave the system. [captions copyright national cable satellite corp. 2016] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org] >> i remember one male, in the bedroom with the doors closed -- it was his turn. he began to cry, and he was basically saying, don't i
6:12 pm
deserve the respect to be told the truth? i know my dad is not at a boot camp, because my cousin comes home on winter break and christmas. i did my best as a researcher -- they talked about not sharing the information with him because they did not know where to start. maybe there would be questions that they could not answer. >> retribution has also loomed large. over the course of the prison boom, we really shifted our motivation -- that somehow, these are deeply immoral actors that we want to submit to the harshest possible conditions.
6:13 pm
i think we are now seeing the consequences of what i feel is a larger strategy -- the 2 million individuals that we currently have in prison, the physically brutal conditions, releasing them into the streets, and thinking we will somehow have a policy for success. >> those are remarks from a recent conference on the criminal justice system. >> this sunday, michael ramirez on his career and recent book of satirical cartoons. >> i have a figure that's a conglomeration of extremist israeli settlers and a palestinian figure that, if you
6:14 pm
notice, he has his shoes on. both of these figures misuse religion for a political purpose. i am an equal opportunity offender. >> from the annual techcrunch disrupt, a demonstration of ibm's artificial intelligence computer system known as watson, the founder of an instant message service, and a conversation with snoop dogg. >> thank you for that unenthusiastic welcome back. we appreciate that. it's monday. it's very exciting to have him join us on the stage.
6:15 pm
we talked a little bit about ai. either way, the future is here. please welcome dr. john kelly from ibm and our moderator alex wilhelm. ♪ >> we will try something a little different. we will do a live demo. we hope it will work. >> what you're about to see is something that no one else has seen. you remember watson, the artificial intelligence machine on jeopardy. it was an open domain question-and-answer system at the forefront of artificial intelligence. fast forward to today. what you will see is that watson has ingested all of wikipedia.
6:16 pm
it answers questions and can inform opinions on any subject. >> watson? is it going to happen? >> watson, please start. today, we will demonstrate some of our capabilities generating arguments for or against specific topics. >> let's go ahead and do the "wikipedia is reliable" pro speech. >> scanned approximately 4 million wikipedia articles. scanned all 3,000 sentences.
