
Technology and Privacy Debate  CSPAN  July 5, 2017 2:09am-3:48am EDT

2:09 am
A dispute on whether tech companies should be required to help the government execute search warrants to acquire data. Debaters include John Yoo, the former Justice Department official who authored memos regarding the use of enhanced interrogation techniques, and former Homeland Security Secretary Michael Chertoff. The debate is hosted by the Intelligence Squared U.S. Foundation and the National Constitution Center in San Francisco. [applause] >> And, speaking of the applause, bring that energy again, because ultimately
2:10 am
we edit the evening's overall discourse down to about a 50-minute podcast and also a radio broadcast. Because of that, we're going to do bits of production right in front of you. You will see me saying things like, I will be right back. And I will not go anywhere; I will still be right here. And there will be times when we take breaks and I will ask for your spontaneous applause for atmospherics, or when I introduce the debaters, etc. I will let you know ahead of time when I will be making that kind of request. If I need to do that, I will say, put your hands together. Let's do that right now. Let's have one more round of applause. [applause] One of the few places in our lives where true
2:11 am
privacy can be found exists, oddly enough, on our smartphones, which are designed so that when you put the phone on lock, no one can get past its encryption, not even Apple with the iPhone or Google with its Pixel. Which is great, right? But not if you are in law enforcement and you have reason to believe a bad person's phone contains secrets that could solve crimes and stop terrorist attacks. In that case, should Apple or Google help the feds bust the encryption? Is this sort of patriotic? Or is the privacy that encryption represents something sacrosanct, and, not to mention, something fragile? If you put a backdoor into it, who knows who might come through it later? This sounds like the makings of a debate, so let's have it. Yes or no to this statement:
2:12 am
Tech companies should be required to help law enforcement execute search warrants to access customer data. That's our debate. We are at the San Francisco... I'm sorry, the SFJAZZ Center in San Francisco, in partnership with the National Constitution Center, with four superbly qualified debaters who will argue for and against the motion. The debate is in three rounds. The audience here votes to choose the winner, and only one side wins. We would like to have you tell us where you stand on this motion as you come in off the street. Take a look at the language; it's a lot of words: tech companies should be required to help law enforcement execute search warrants to access customer data. Go to the keypad under your seat.
2:13 am
Press number one if you agree with the motion. Push number two if you disagree, this team's position. Push number three if you are undecided, which is a perfectly reasonable position to be in as the debate starts. I will just wait a moment for full eye contact from everyone. OK. We are going to move on. You have a few more seconds to finish up. I will move on. We are debating the responsibility of tech companies when the government comes asking for data, especially encrypted data. We have one team arguing in support of the idea. The first debater for the motion, please welcome Stewart Baker.
2:14 am
John: Stewart, you have served in government in important positions. You served under President George W. Bush at the Department of Homeland Security as the first assistant secretary for policy. You have long argued that people who oppose government access to the kind of data we will be talking about tonight underappreciate how access to that data can enhance our security. Where did that appreciation come from? What do you know that they don't? Mr. Baker: It's not what I know, it's who I know. I have seen the people at the FBI, at the NSA, at the agencies, who are trying to protect us. More than half of them joined after 9/11, because of 9/11. They are absolutely under-resourced, overwhelmed. They need our help. Without our help, they will not succeed. That's why I believe that everyone owes them a duty of providing assistance when they can.
2:15 am
John: And that is why you are on this side. Can you tell us who your partner is? Mr. Baker: He is my debating partner for the second time, John Yoo. It is a pleasure. John: Ladies and gentlemen, John Yoo. [applause] John, you are a professor of law at Berkeley and a visiting scholar at the American Enterprise Institute. Following September 11, you worked on national security at the Department of Justice and wrote controversial memos, which will be in your obituary. [laughter] This is not the first time you have debated with us. The last time, we actually did it in Philadelphia, your hometown. Your mother was in the audience. You told us then that there was no way you could lose with her sitting there. Well, she's not here tonight. What does that do to your game? [laughter] Mr. Yoo: You keep inviting me, I keep losing, so you are the ones with the problem.
2:16 am
[laughter] My mom probably works at NSA now, so she's probably in the audience listening anyway. John: Ladies and gentlemen, the team arguing for this motion. [applause] Now let's meet the team arguing against it. First, welcome Michael Chertoff. [applause] Michael, you have debated with us a number of times before. You are the cofounder of the Chertoff Group. You were the second secretary of homeland security under George W. Bush. Before that, you worked as the head of the Department of Justice's criminal division. Way back, you were a young prosecutor and helped put quite a few mob figures behind bars. In those days, there was no digital data like today. Today, would it have made your job easier? Mr. Chertoff: Let me just say I'm delighted to be here, and John and Stewart were colleagues
2:17 am
when I was in government. We did it the old-fashioned way. The guys we used to wiretap worried they would be under electronic surveillance. They would leave the room, walk around the block, turn up the radio. We made our cases with witnesses, photographs, circumstantial evidence, and we were successful. We put guys away for 100 years apiece. John: Tell us who your partner is. Mr. Chertoff: Catherine Crump is my partner. She is a professor at Berkeley. I have not had the pleasure of debating with her, but I'm looking forward to it. John: Ladies and gentlemen, Catherine Crump. [applause] As Michael said, you are a professor of law, also at Berkeley, and acting director of the Samuelson Law, Technology, and Public Policy Clinic. You were a staff attorney for the ACLU. You have been sounding alarms
2:18 am
about the staggering amount of data that law enforcement can and does collect on people's actual movements by tracing their cell phones and photographing license plates. Day to day, what steps do you take to make yourself less digitally visible, or is it not even possible anymore? Ms. Crump: Today it's pretty tough. Online, you have tools to help you maintain privacy, but in the physical space, it is hard to do much but smile for the cameras. [laughter] John: Thank you. The team arguing against the motion. [applause] Now we move on to round one. Round one will be opening statements by each debater in turn. They will be six minutes each. Speaking first for the motion, tech companies should be required to help law enforcement execute search warrants to access customer data, here is Stewart Baker, former
2:19 am
general counsel for the National Security Agency. Ladies and gentlemen, Stewart Baker. [applause] Mr. Baker: Thank you. The way we have divided the argument, I will be talking about the obligation to help law enforcement when necessary, which I believe leads to the obligation of tech companies to provide assistance. John will be talking about why, particularly today, law enforcement needs help from technology companies. Let me start with the question, which is whether tech companies should be required to help law enforcement execute search warrants to gain customer data. I want to stress what that question doesn't require you to support in order to come out in the affirmative.
2:20 am
We would love it if you concluded that the government can require companies to put backdoors in their products or break their crypto. If you believe that, you are obviously going to support this motion. But that is not what the proposition says. It says they should be required to help law enforcement. To my mind, that does not mean they are always required, but that they are sometimes required to help law enforcement. That's not really a surprise, because everybody is required to help law enforcement in the right circumstances. If you have a unique ability to help law enforcement, and law enforcement can't solve the problem on its own, you have an obligation to assist law enforcement. This has been true for hundreds of years, well before the U.S. was founded. There was a common-law obligation to assist law
2:21 am
enforcement upon request, particularly when only you could provide that assistance. Actually, we all understand it. If you are a witness to a crime, if you have evidence of a crime in a file cabinet behind your desk, you are going to get a subpoena from the government, and you have an obligation to assist the government by providing them with the evidence you already have. This is the rule for all of us. We're all going to get the subpoenas, search warrants, requests for that data. If you are a landlord and your tenant is suspected of engaging in drug selling or some other crime, the government will come with a search warrant and ask for your help. They don't want to knock down the door; they want you to use the master key.
2:22 am
That will allow us to get in without the subject knowing he is being investigated, and that may turn out to be important. You have an obligation as a landlord to provide that assistance. This is a requirement for all of us. It is no different for tech companies. There's no Silicon Valley exceptionalism policy that applies. The Supreme Court has said exactly that, in a case involving New York Telephone, the company that is now Verizon. The government, asking for help, said, we would like you to assist us in carrying out an intercept of communications data, and the company said, "No, we don't feel like it. Why don't you do it?"
2:23 am
And the government said, "You are in a unique position to assist us in a way that will not be obvious to the criminal, and therefore you have an obligation to provide that assistance." The Supreme Court said the government was right. There's no special exception for phone companies or tech companies. You need to provide that help because it's part of your obligation as a citizen. I guess I shouldn't sit down without mentioning the elephant in the room, which of course is Apple against the FBI. I want to make clear that while I'm pretty skeptical about Apple's arguments in that case, you don't have to be equally skeptical to vote in the affirmative in this case. No one is arguing here that the obligation to help law enforcement is without boundary. If you can show it is too burdensome, that the government can do this without your help, that it is going to cost too much or hurt your customers, if
2:24 am
you can make a persuasive argument under current law, you don't have to provide the assistance. But if you can't, you are required to provide that assistance. The one place where I think Apple made an argument that is inconsistent with voting for this proposition is when they said, we can help, we just don't want to. That is exactly a defiance of the obligation that every other citizen has to provide assistance to the government. There is no exception that says just because you're the world's wealthiest company, you don't have to do this. If you agree with the proposition that there isn't a Silicon Valley exception from the obligations of citizenship,
2:25 am
then you want to vote in support of this motion. Thank you. John: Thank you, Stewart Baker. [applause] The motion is: tech companies should be required to help the government search customer data. Our next speaker, arguing against the motion, is Catherine Crump, former staff attorney for the ACLU and current professor at UC Berkeley. Catherine Crump. [applause] Ms. Crump: You don't need to believe that there is a Silicon Valley exception to the obligation to help in order to oppose this resolution. This is not about whether tech companies should hand over evidence they are capable of accessing in response to a properly obtained warrant. Of course they should. This is a case about whether the government should compel companies to design
2:26 am
iPhones less securely to facilitate access to data. The answer to that question should be no in this era of profound cyber insecurity. The government's role should be to encourage companies to design more secure devices. I'm going to talk about the importance of encryption in supporting free speech and commerce online. My partner will talk about why the widespread availability of encryption enhances national security rather than detracts from it. We rely on the internet for everything. We use it to communicate with friends and loved ones, understand medical diagnoses, and engage in banking. Corporations store their most valuable proprietary information online, and the government also stores vast troves of data digitally, including law enforcement and national security information. As a result, the security of the internet is critical.
2:27 am
Yet the systems we rely on to store all of this data are radically insecure. Last year, Pew Research reported that over half of Americans have personally experienced a major data breach. The issue is urgent. Having the contents of your email account dumped online can be devastating. Just ask Hillary Clinton's campaign manager, who not only found it personally embarrassing, but it well could have affected the course of a presidential election. Our data is leaking all the time, in large volumes. Companies have repeatedly failed to protect it. People increasingly realize their data is vulnerable. If we want the internet to continue to be a place where speech and commerce flourish, we need people to be able to share their thoughts and credit card numbers over the internet. Strong encryption is the best defense available against cyber attacks.
2:28 am
When strong encryption is deployed, users hold the keys to their own data. That means that data escapes from prying eyes, including the eyes of tech companies. Some say we ought to build a backdoor in order to allow law enforcement access to data. The problem is you cannot build a backdoor that works only for the U.S. government, good guys, or other people with good motives. If you build it for them, encryption will be weakened for everyone. No one should be able to hold a golden key or backdoor. Not tech companies, not the government, not anyone. Consider the recent outcome of the WannaCry ransomware attack.
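[Editor's illustration, not part of the debate.] The "duplicate key" point above can be made concrete with a toy sketch. The cipher here is a deliberately simplified hashlib-based keystream for demonstration only, not a real cipher such as AES; the point it shows is that anyone holding a copy of a symmetric key, whether the user or an escrow agent, can decrypt everything encrypted under it.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing the key with a counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream; decryption is the same operation.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # symmetric cipher: same key, same operation

user_key = hashlib.sha256(b"user passcode").digest()
escrow_copy = user_key  # a mandated "duplicate key" is just another copy

ct = encrypt(user_key, b"private message")
# The user can read the data...
assert decrypt(user_key, ct) == b"private message"
# ...and so can anyone who holds the escrowed copy, benign or not.
assert decrypt(escrow_copy, ct) == b"private message"
```

The design point is that the math does not distinguish who holds the copy; an escrowed key is exactly as powerful as the user's own.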
2:29 am
The government's ability to secure these types of secrets, to the extent it ever existed, is no longer present today. The closer you look at the issue of the feasibility of creating a backdoor, the more impractical such a solution becomes. Just think about this. What phones would it apply to? Would it apply to older phones, or would they be grandfathered in? What about phones built overseas? When a German traveler comes to the U.S. and their phone is not compliant, will they have to surrender the phone at the border? If so, that is a massive inconvenience. If not, that is a huge loophole. Who should be able to recover data? If the answer is that tech companies should be able to recover data, they won't be pressured to make data available just by our government, but by
2:30 am
every government around the world, no matter how despotic. To me, the issue is not about protecting us from the government. We have the rule of law, the Fourth Amendment, due process, and a culture of compliance to help us there. The issue is protecting us from the bad guys. There are a lot more bad guys than law enforcement agents. If we create an opportunity for government agents to use a backdoor, that is going to be taken advantage of many times over by criminals. They are unconstrained by laws and norms and don't get warrants. If they know there's a key or another way to access data, they will do everything they can to obtain it, and they will, in a way the distributed
2:31 am
structure of the internet was designed to prevent. In this day and age, we would be better off if companies increased security for user data rather than made it weaker. John: Thank you, Catherine Crump. [applause] We are halfway through the opening round of this debate. We have four debaters, two teams of two. We are debating the motion: tech companies should be required to help law enforcement execute search warrants to access customer data. Now to the third statement. John Yoo, professor of law at UC Berkeley, arguing for the motion. Ladies and gentlemen, John Yoo. [applause] Mr. Yoo: Thank you. It is wonderful to be here. It's a great venue. It is the cleanest jazz club I've ever been in. [laughter] I don't understand what kind of jazz is being made here, but
2:32 am
it's probably safe for all ages. It's wonderful to be back. It's great to be here because this is the fourth time I have had the pleasure of debating, and the third time of losing, actually. I will not go down as a four-time loser. Stewart, my partner, and I debated last year in Philadelphia. I lost. I told him to pander to the high-tech audience as much as possible. So what did he do? He did not wear a tie. It is also a great pleasure to be debating against my friend, whom I often think of as the finest lawyer I have encountered, certainly in government service. I have never been an opponent of
2:33 am
his. After many years of friendship, I will finally be able to say what I think about him, so watch out. It is also great to be here with my junior colleague. I should have made it a requirement of hiring her that she never debate me. I'm glad she is here. Now I'm going to get serious. I think I heard her concede on the question presented, and it is a very simple one. Should tech companies help law enforcement? I think Catherine said, yes, of course they should. So nobody has to listen to anything we say after that point. She cleverly, and this is why she is so smart and we hired her, she cleverly changed the debate to something about encryption. I don't know anything about encryption. I care about the Constitution. The Constitution doesn't say anything about encryption. But the Constitution does say something, and let
2:34 am
me hold out my prop: I am going to win this time. I have the Fourth Amendment. You should all get one of these; it is a pocket Constitution. If you write to the Supreme Court, they will send you a free one. They all have different versions; I am not sure which one you will get. [laughter] The Fourth Amendment says the right of the people to be secure in their persons, houses, papers and effects against unreasonable searches and seizures shall not be violated. Notice it doesn't say against all searches and seizures, or some; it says unreasonable searches and seizures. I hope everyone here tonight would at least agree that is the standard: what is reasonable. I am not going to tell you what is reasonable. I'm a citizen, and we all have our own views of what is unreasonable. I think it is for our elected representatives to vote on
2:35 am
legislation to decide what is reasonable or not. That is how we handled other changes in technology, from the telephone to money transfers to all kinds of things, where at the beginning people said, this is so different, we should have no rules or new rules. What we did was adapt the rules of the past to the new situation in a reasonable way. That is the way our society operates, and it is what we should do with data held by tech companies. If you think it is reasonable to use reasonableness, this is how I would do the test. According to the Supreme Court, in many cases, when you judge reasonableness, you balance the benefits of pursuing a particular action, in terms of whether it advances a government interest, against the loss of privacy. It is not categorical, that everything is off-limits or that the government can do
2:36 am
whatever it wants. It calls on us to make a choice, and the judges do that for us. In this case, the balance would involve the reduction of the possibility of terrorist attacks, I think. Stewart mentioned it would be remiss not to mention Apple versus FBI. I also think it would be remiss not to mention that the United Kingdom, the nation probably most similar to us in the world, has suffered two terrorist attacks in the past week, and there has been a spate of terrorist attacks not just in the United Kingdom but in Paris, in Brussels, and, let's not forget, in the United States. I think sometimes we have a short attention span. We are being led by a president who has an even shorter attention span, and I think we forget the things that happened in the last few years. Just in 2013, four years ago,
2:37 am
terrorists bombed the Boston Marathon and killed three people. Just two years ago in San Bernardino, 14 people were killed by terrorists. A gay nightclub was attacked in Orlando: 49 people killed, 53 people injured. The reason I mention these is not to raise the scare that there are terrorists all around, but that the government interest is to reduce those attacks. The only way to do that, in the kind of world we are living in, with terrorists who organize themselves in networks and take advantage of global commerce, is to get data and information on them, if we are going to have any chance to stop the attacks from succeeding in the future. Thank you very much. [applause] John: Thank you. And now, making his statement against the motion, former
2:38 am
secretary of homeland security, Michael Chertoff. [applause] Mr. Chertoff: I am not going to match you joke for joke; I'm going to need all of my time. I want to begin on a serious note. We all deeply feel for the families who have lost loved ones in London and Manchester and all over the world. We know it is important to do the best we can to stop these kinds of things from happening. I will tell you, and this is from the experience of having been on duty on September 11, the kind of capability tech companies provide to the U.S. government to detect terrorism is vastly greater than it has ever been. There is a treasure trove of information as it has developed, not only metadata about who is calling and who is sending messages, but geolocational data, all of this enhanced
2:39 am
with artificial intelligence and analytics. All of this is made available to the government when it is in the possession of a company, provided the government has a subpoena or some legal process. When you look at the resolution, it does not say, resolved: that tech companies should comply with lawful process, because nobody debates that. The question is, should they be required to go beyond subpoenas and warrants and turn things over on a voluntary basis, when there is not a basis for a warrant or there is no subpoena? Or, even more significantly, should tech companies be required to take steps to weaken encryption or other measures that protect information, simply because right now those tech companies don't have access to the information and therefore cannot comply with a requirement that it be turned over? Let me be clear.
2:40 am
There are many different kinds of applications you can use that do not give the service provider the ability to access the data. When the provider gets a subpoena or a search warrant, they give over what they can, and that may be the identity of the owner of the phone or an IP address; they turn over what may be available to them in terms of routing the messages from one point to another. That is the metadata. But they do not have the ability to turn over the messaging in unencrypted form, or, in some applications, the messages have disappeared, like under WhatsApp, and there's nothing to turn over. What the government is arguing for, and what this resolution is arguing for, is that tech companies have to go further. They have to organize themselves so they have the ability to decrypt, with a duplicate key, all of the data that gets transferred, so that they have the ability to store things that you think you have deleted, so
2:41 am
they can turn it over if there is a request. The fact of the matter is, under the Constitution and the traditions of this country, we do not require people to organize their lives so they store everything they say and write so that it can be available if somebody wants to come along later and investigate them. Beyond the values in the Constitution, what are we talking about? We are talking about national security. If you open up the newspapers, what you see is foreign nations hacking into our political parties, criminals stealing financial data, terrorists trying to get information about where Americans might be going, particularly American servicemen and women, so they can target them. The way to protect that data is not to expect the government to do it; it is to expect each individual to take the steps necessary to protect that information. Often that does require encryption.
2:42 am
Sometimes it requires choosing an application in which the message disappears once it has been received. Are there bad people who can use these things? Absolutely. More significant are the number of good people who use these to protect themselves. I would say it is a matter of national security to let the majority protect themselves. I would also tell you the world is not going to go dark, and we are not going to be in mortal peril, if in fact we have encrypted communications or disappearing messages that cannot be seized by the government. As I pointed out earlier, there is an enormous number of tools available, through metadata, locational data and similar things, that the government can and routinely does get from tech companies. In the case of the San Bernardino attackers, the data that had been uploaded to the cloud was turned over to the
2:43 am
government. It was only the data on the phone that had not been uploaded that was inaccessible to the tech company and allegedly to the government, although the government eventually did break into it. There is plenty out there to protect us. I would also say to you that even if there were a rule that said U.S. tech companies must have the capability to decrypt any message, or must have the ability to store and retrieve any message even if it disappears, that would not stop the bad guys. They would go to other parts of the world or onto the dark web and would simply buy encryption that could not be opened, or would simply buy a tool that allows them to make messages disappear. We would have reduced protection for law-abiding people and would not really have deterred the people who are not law-abiding. I believe the government and tech companies can work together, but in a way that does not sacrifice the security of an innocent person who wants to
2:44 am
protect his or her own financial data, private information and health information. Thank you very much. [applause] John: Thank you. That concludes round one of the debate. Now we move on to round two. In round two, the debaters take questions from me; they can address each other directly, or take questions from you, our live audience. We have two teams arguing for and against this motion: tech companies should be required to help law enforcement execute search warrants. The team arguing for the motion has been arguing that the law and history are clear: there is an obligation to help law enforcement, especially if you have an ability to offer that help. You cannot say no, but within reasonable bounds, and reasonableness is to be decided by the courts.
2:45 am
They emphasize the boundaries are there. However, they point out that there is no Silicon Valley exception to that rule, making reference to the position that Apple took in the case with the FBI, where, paraphrasing, Apple said, we can but we won't. Just like a landlord with a master key at the scene of a crime, they need to turn over the key. The other side is arguing that strong encryption is, to a significant degree, the crux of the matter as this issue moves forward. Strong encryption is the best defense against cyber attack. You cannot build a backdoor that only works for the good guys. They also argue that there are plenty of other ways for law enforcement to use data that is already available to it, through search warrants that are not resisted by tech companies. They are saying the world will not go dark just because the FBI cannot get its hands on that
2:46 am
master key. I want to stipulate that both teams recognize that technology companies have an obligation, and have been meeting the obligation, to hand over various kinds of data. They have been doing it a long time, particularly metadata, information in the cloud, etc. Whether a team needs to defend encryption or not, we are here in light of where this challenge has moved: into a world where, as illustrated by the Apple case, the question of whether to help law enforcement is in contention. I want to go first to the team arguing against the motion. Your opponents are basically saying that if a company has a unique ability to help, there is an obligation, a citizen's obligation, to do what needs to be done to turn the key in the lock. What is your response?
2:47 am
Ms. Crump: It is not an unlimited obligation. This came up in the Apple case, where you have to give reasonable assistance, and the real question was, at what point does what law enforcement is asking a company to do become an undue burden? Apple maintained that being required to, for example, create a master key would be an undue burden, relying on a lot of the security arguments we made earlier: the ability to protect data from being targeted by others. John: What about the point that when there is a unique ability, that changes the standard? When there is only one agent that can help law enforcement, I think that is what he was implying, that it changes the formula somewhat. Ms. Crump: I don't think it changes the formula. I think ultimately it comes down to burdensomeness,
2:48 am
burdensomeness in the sense that it will be burdensome on the company. A master key to an operating system that shuts down after you try to break into it would compromise not just one phone, but all of the phones. Therefore, if somebody got hold of that capability, it would not just be a single phone broken; it would be everyone's phone. That was the burden they were worried about. John: Your response? Mr. Baker: I feel the need to challenge the moderator as well as the other side, so I apologize. Both you and the other side suggested that no one is arguing that there is no obligation to help in these circumstances. In fact, when I was preparing
2:49 am
for this, I talked to a Manhattan district attorney who told me the following story. He said, you know, we used to bring the iPhone 4 to Apple, and they had every ability in the world to decrypt it. We brought them an order, we brought them the phone, and they gave us the information on the phone. We broke a lot of cases. Around the time FBI versus Apple was heating up, they called and said, take back your phones. We are out of the business of providing assistance to law enforcement. They said, basically, our minds have changed, we will not do it anymore. We still can; we choose not to. That is cold. That is abandoning the obligation
2:50 am
to help when you can. So yes, companies are required to help law enforcement when they can. That is my view of what the motion is, and it is not something that goes without saying in Silicon Valley, because at least Apple believes it can decide, for public relations reasons or because it does not fit its litigation strategy, to stop providing help it can provide. John: John? Mr. Yoo: I will take up the question of encryption. I think there is also a claim that encryption will make all of our data safe, and I don't think that is the case. I think that is overclaiming what technology can do. Catherine mentioned John Podesta's emails being hacked, by maybe the Russians, who knows. How did they hack in? They sent him a standard phishing thing.
the other thing i would say about encryption is yes, the government was asking apple to provide help to get into the phone itself. there are flaws in all of the operating systems, ways for people to hack in to samsung, android phones or apple phones. you download a patch every other day, it seems like. you repair the flaw. i think when mike says that this would break all of the products simultaneously, i don't think that is accurate. the government is asking for a way to get into this phone, held by someone who carried out a terrorist attack, and obviously you can fix it after. it is not like you're going to publish the flaw and say, come on in, now you know the password.
john: what i want to do moving forward is alternate voices from opposite sides of the stage. michael, since you turned your single into a double, i will give them a double. but i would like to make it single. which of you would like to respond? >> i do think it is important to understand what was at stake in the fbi case. first, let me point out, they ultimately hired a company that managed to circumvent the operating system feature, and discovered there was nothing of particular relevance on the phone. they got all the stuff backed up in the cloud from apple. the issue that was presented was, do you find a way to essentially create a vulnerability or a workaround for a feature in the operating system? as stewart kind of points out, they had stacked up a few hundred requests to break phones.
the reality is that once the vulnerability was created, it would be in constant demand for breaking things in the future. you might think, that is ok, because they could keep the vulnerability hidden so the bad guys cannot get it. the answer for that is wannacry. they shut down the national health service there. even the u.s. government cannot protect some of the tools and exploits they put together. >> what michael is sliding over is that what the government asked apple to do was to use a backdoor that apple had already built into its own phones. how many people here got a u2 album you did not want on your phone? [laughter] >> that was apple using the back door into your phone. they use it to run any code they want on your phone. we heard everybody on the other side of the debate saying it is a fatal hole, and yet apple uses it on your phone. apple decided, on balance, we have to give you security updates, and the only way we can do that is if we have this door. on balance, it is better to have the back door and protect it than to have no ability to update your phone. it is the ability to update your phone that the fbi had asked apple to use, to make a change in the code that would allow you to continue trying combinations after the first 10. that wasn't the secret that would've gotten somebody into a phone; the secret was how apple guarantees updates reach you. that backdoor already exists and is being used for apple to do good security updates and send you things. >> i don't want to get bogged down in a factual discussion. the key thing they wanted that apple did not want to do is create an exploit, an update that would change the operating system and remove the feature that says after you try a certain number of times, everything gets shut down and you are done. that was the tool they wanted created. that tool, once created, would have been a target for everybody who wants to break into phones. but i think the resolution is broader than that. nobody denies -- i have never heard anybody in the tech community say, we are not going to obey court orders or subpoenas. in this case, if they have access to data, they will turn it over. what is really at stake, and what has been debated at the heart of this, is: don't configure your systems in a way that denies you the ability to access information. that means, let's dumb down and lower protections. john: i want to put a question to
catherine. he was talking about a balance between privacy and the need to pursue and gather national security information. we're at a time when, we have to recognize, the balance is shifting; the threat to national security, particularly from terrorist groups exploiting technology, encrypted or not, is on the rise, and obviously dangerous. and therefore, we need to do a reconfiguration of the privacy issue. what is your take? >> i think that is to misunderstand this particular debate. it is not privacy versus security; it is more security versus less security. by using strong encryption, you secure the privacy of the data and you also improve security across the board for vulnerable data for corporations, governments and individuals. >> i would say, again, the touchstone is reasonableness.
i don't see where the constitution says it is up to apple to decide what is reasonable. it is up to us, through our government, to decide what the reasonable balance is between privacy and having information to try to increase the security of our country. i still think i hear the argument that stewart was arguing against, the idea that tech companies are somehow different. that they can willingly and intentionally design their systems to make it impossible for them to comply with these legal requests for information. everyone should be potentially subject; that is why we have a government and courts. i would much rather have a judge, or congress passing legislation, make that balance, rather than let apple, facebook or google or samsung in korea decide the balance between security and privacy. john: your opponent is saying the privacy and security issue is not the real issue; she
is saying it is security versus more security. her argument is for even greater security, which i think implies it would make it even more difficult for the government to get access. >> that is possible; the possible consequence of more encryption is better security for our country. i just ask why apple gets to decide that for the united states. i think if that is really a consequence of increasing encryption, our government should make that call. trade-offs exist everywhere. there is no kind of policy where we can have more of all the good stuff and no cost in bad stuff. everything is a trade-off. my question is, who determines the balance. >> i don't disagree with you, the government should pass a law. that is the debate. the question about the resolution is, should
the government pass a law that basically mandates to tech companies or anybody else: you cannot configure your products in such a way that will not allow you to comply with a court order to turn over information? if the decision is that you can't configure your products that way, you are going to wind up -- i think this is unwise -- you're going to hurt the security of everybody else that is innocent, that is not the subject of the subpoena. if a government were to pass a law and say, you should not be able to delete any emails ever that you generate, or you should not be able to use an application where the message disappears after it is read, or you should not be able to turn your phone off, you should keep it on to record everything you say all the time -- that would make it very easy for law enforcement, when they targeted you, to find evidence against you. it would also mean everybody else will be walking around with big brother. that would have privacy and
security implications. john: i also want to take up that point, that greater encryption serves everybody's security, that there should be more of it. that would imply less access for the government. >> i don't think so. i want to go back to the point that apple has a backdoor into our phones. they have balanced the value of having the backdoor -- john: why don't you think so? >> they balanced. they say, giving our customers more security makes us a more profitable company, and we can give them u2 albums. what they left out of the balance was the cost to all of us who suffer from crime. that is not on their balance sheet, and they did not take it into account. what i am arguing is we need to take into
account the consequences of encryption in deciding whether there ought to be access to the phone. and just as apple made the conclusion that we are better off on balance with a backdoor for them -- but a very well protected backdoor -- we should come to the conclusion that the backdoor or something like it should be used to protect the rest of us, and not just the profits of apple. john: catherine? >> i don't think any of us disagree that we want to help people who are victims of crimes. i think the question is what is the best way to do that. the consensus of tech experts is that if you create these backdoors, you will increase the vulnerability of people across the board, and given the fact that the vast majority of people are innocent and we all have sensitive data stored on these
systems, a backdoor is the wrong way to go. >> can i just ask one question? if the consensus of everyone in the tech community is that backdoors are a bad idea, why does apple have a backdoor? why does microsoft give us automatic updates we cannot turn down? they have built a backdoor in. >> i am not a computer scientist; i am relying on people who have made this point. none of us on the stage are computer scientists, but their view is that by installing a backdoor that would allow you to override encryption, you will make the data less secure. >> that doesn't override encryption. i am not worried about my u2 album. [laughter] >> the backdoors we are talking about, and what is being suggested, is a back door that allows you to decrypt something encrypted. >> apple can put anything on your phone to run anywhere, and
make it do anything. that is a backdoor. >> if you have an encrypted chat app, you have the key and the sender has the key, period. these things are designed exactly so that only two people, the sender and recipient, can get it. that is what they want to change. they want the government or somebody else to have a second key -- a vulnerability. >> that is not our view. our view is tech companies should help when they can. if you have built a product that does not have a backdoor, nobody is saying you have to help, because you cannot. john: i want to go back
to this. the opponents are arguing that the request to apple -- and we are using apple as an illustration -- the request that tech companies assist the feds in busting their encryption, is unreasonable. >> we should recognize that it is really the israelis that are good at this. we couldn't do it ourselves. john: is it unreasonable? >> on reasonableness, there is no formula. >> god, i think the senate would actually shut down and go on strike, which you should do -- you should nominate me, actually -- no. who is going to oversee the impeachment trial -- no.
you judge, and you look at the gains to our security. it is going to get worse, not better. as we encounter more battlefield successes on the ground and we start to eliminate this caliphate, isis will try to encourage more people to carry out the kinds of attacks we have been seeing in paris, london, and the united states. i don't think anyone is going to pretend there is no loss of privacy. john: and how do you balance that? >> the risk is to apple's business model. the risk is that it would get out to the bad guys. >> i don't have a problem with asking apple to do that, or the government compelling apple to do that.
john: how come? >> because -- i was in the government. the government thought there would be information on his phone that would lead to a broader conspiracy. mike's right, it turned out it didn't. but you don't know that beforehand. and that's an important -- you have to put yourself in the position of the people who are trying to protect their country at the time they're doing it. we don't know how big the conspiracy was. i think it's going to become a bigger and broader problem, these series of attacks. the loss of privacy, i think, on the other hand, is up to us as voters to decide. again, i don't think we should say, oh, apple gets to decide whether the loss of privacy -- john donvan: let me take -- let me take your justification for your position to catherine crump. how do you respond to everything that you just heard john say? catherine crump: i think that encryption, with users controlling the key, means that users are in control of their own data. it puts
them between the companies and others. and overriding that creates security problems. john donvan: but he's saying -- john's saying life and death. life and death just trumps it all. i'm going to change that word. [laughter] john yoo: i never use the word "trump." john donvan: yeah. john yoo: [unintelligible] will never -- it's like playing bridge. john donvan: the life and death issue is decisive and it [unintelligible]. catherine crump: and no one is denying that it can be a serious cost to law enforcement not to be able to access the content of someone's phone. the question is, how do you balance that cost against the cost of not having encryption, particularly in an era where law enforcement has lots of other information available to it. every time you walk around the city, you're picked up on myriad surveillance
cameras. encryption doesn't change that. automatic license plate readers blanket the streets. encryption doesn't change that. even when you can't access the content of communications, you will often be able to identify metadata about the communication: who sent the email, what time the phone calls took place. and many law enforcement and national security officials believe that that metadata can actually be what's really important to solving crimes. john donvan: so it's interesting that the opponents are saying, stewart baker, that you don't really need that. there's so much other information yielding so much information, and also, as michael chertoff pointed out, when the feds actually finally cracked the apple phone, there was apparently nothing on it. so what's your response to that? stewart baker: that would be overstating the case. john donvan: i know you're not -- i know you're not arguing for the cracking -- stewart baker: there's never a guarantee when you go into a phone that you're going to find the evidence you're hoping for. but on average, you do. you know, the argument sort of boils down
to, well, it's a great time to be a cop because there's so much data. technology is making life easier for you. and in some ways, that's true, but you know, technology is making life great for the criminals too. prior to 2016, no one imagined that the russians could change the outcome of an election just by sitting in moscow and having fun with the files that they stole, or that isis could recruit teenagers in minneapolis without ever coming into the united states to carry out attacks. technology is transforming crime in the same way it's transforming crime-busting, but it's not clear that on balance law enforcement ends up better.
and when we can see a real criminal law enforcement problem arising from new technology, of course we ought to consider regulating it. i should stress this is not the argument we have to make to win this. all we have to say is they have an obligation to help, and they are not doing it. john donvan: i'd like to go to audience questions now. right here behind the bars, you're wearing a white shirt, if you can stand up, a microphone will come down from here, right on this side. what's your name, sir?
male speaker: my question is -- john donvan: could you tell us your name, please? male speaker: stephen maine, resident of -- work in san francisco, resident marin county. isn't the core issue of the fourth amendment the protection of the expectation of confidentiality and the right of privacy, right? john donvan: great question. male speaker: isn't that the -- isn't that the core? john donvan: i think that's a challenge to john yoo's side, so i'd like to take it to john yoo. john yoo: yeah, i'm happy to hide behind the supreme court on this one. they don't say that the fourth amendment itself puts that value above all others. it says you balance it. you're quite right: the privacy interest, which we actually didn't talk about that much
tonight, so thanks for bringing it up. the privacy interest is society's reasonable expectation of privacy. and it could be phone calls, written letters, whatever. but you always balance that against security, right? i mean, the supreme court's been very clear that we have to balance the two values. it's hard to actually figure out how we measure what society's reasonable expectation of privacy is, and that's why, when we've had these technological changes in the past with the telegraph, the telephone, money transfers, ultimately we've asked congress to step in and pass a law and make a judgment. in the beginning the courts have done it, eventually congress. and in no case did our elected representatives or any of the judges say privacy -- i was going to say trumps every -- privacy trumps all other values. it's what's reasonable to us as a society to balance the two. john donvan: so i want to let michael chertoff actually follow up, if he'd like to, or catherine, if you'd like to. catherine. catherine crump: yeah, well, i think we agree about what the applicable standard is, right? it's a balance between an individual's expectation of privacy and the public safety needs on the other side. i think we just disagree about
how that comes out in this particular case. john donvan: down near the front row here. male speaker: my name's raphael. i'm actually going to berkeley law. [laughter] my question is essentially, you're saying that tech companies don't have the ability to help because the -- because the data's encrypted and the user has the key. what i'm saying is, does the company still have the key
because of artificial intelligence? to illustrate that, gmail now allows you to auto reply. so, that's based on your content. can you say that tech companies will -- do not have the key? michael chertoff: so -- john donvan: okay. michael chertoff. michael chertoff: yeah. so, some companies do keep a key -- or some enterprises do keep a
duplicate key because they want their employees, for example -- they want to be able to see what their employees are doing. no one is arguing on our side that tech companies should disobey court orders. if you have the capability -- if you have a key, a duplicate key, and a court ordered you to turn it over, game over. you turn it over. the question in the resolution is, is there an obligation to help -- meaning, do you have to configure your system in such a way that you'll always have that duplicate key? some companies don't maintain the duplicate key. and in that instance, they can't comply. and what the resolution would -- if congress were to adopt the principle in the resolution, congress would say, "oh, when you design encryption, you must always have a duplicate key or a backdoor." they tried to do that about 20 years ago with something we call the clipper chip. and it kind of failed, because
there were problems with the way they were being executed, in terms of being vulnerable. so, no one's arguing, "don't comply with an order if you can." what we're arguing is you're not obliged to arrange your life so it's easy for the government to get a court order to have you turn this over. john donvan: stewart baker to respond? stewart baker: so, i -- you made a good point, that for some companies, having that data is so important that they discourage encryption. their business model is such that they want the data. they don't really want you to encrypt it. and so, they make choices that disincentivize encryption. apple doesn't live off the data, and they have created a market niche for themselves that says, "come to us. we don't use your data." and there's a perfectly good argument that they were using the san bernardino case as an exercise in free marketing, to show that they were the privacy protectors and that google and their other competitors are not. i think, at the end of the day,
though, the question is, is anybody here comfortable saying, "we're going to trust our privacy and our security to the marketing and the technological profit-driven decisions of the tech companies?" does anybody think they have our interests at heart -- not just selling us stuff, plus keeping us safe? i just don't believe it, and i don't think we should rely on them to make that call. catherine crump: so, i think your comments, though, raise an interesting point, which is, what are the market incentives of tech companies? and for a lot of purposes, tech companies are not going to want to have data encrypted. so, for example, your gmail account isn't encrypted because you're going to want certain functionalities, and certain companies are going to want to be able to access the data in order to sell you advertisements, for example. so, i think you need to think about the scope of the encryption problem as being limited, because there are a lot of market incentives on the other side that are going to limit the use of this tool. john yoo: i think this raises, actually, an interesting point -- goes back to the first --
john donvan: john yoo. john yoo: -- question too, about encryption. i find it actually strange, as a society -- we're more than happy to surrender lots of privacy to companies to mine our emails, and then to pop up weird ads about things that they think i want to buy, places i've been. i think that when we consent -- you know, we actually decide what's reasonable -- are we going to let tech companies not only design systems that use artificial intelligence to make them unbreakable, but then also just say, "yeah, we're not going to help you try to figure out a way to defeat it" in the next crisis? i think, as a society, i find it very likely we're going to say, "the government should at least have the same right to mine that data" that we're giving all these companies already. john donvan: i want to remind you that we are in the question and answer section of this intelligence squared u.s. debate. i'm john donvan, your host. we have four debaters, two teams of two, debating this motion: tech companies should be required to help law enforcement execute search warrants to access customer data. going back to audience questions.
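the end-to-end design the debaters keep returning to -- a key held only by sender and recipient, which the platform itself cannot use -- can be illustrated with a toy sketch. this is illustrative only, not real cryptography and not any company's actual scheme:

```python
# Toy illustration (NOT real cryptography) of the end-to-end model the
# debaters describe: sender and recipient share a key; anyone without
# it -- the platform included -- sees only unreadable ciphertext.
import hashlib


def _keystream(key: bytes, n: int) -> bytes:
    # Stretch the shared key into n pseudo-random bytes (sketch only).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with a key-derived stream.
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))


def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return encrypt(key, ciphertext)  # XOR is its own inverse


msg = b"meet at noon"
shared_key = b"sender-and-recipient-only"
ct = encrypt(shared_key, msg)
assert decrypt(shared_key, ct) == msg       # the two key holders recover it
assert decrypt(b"someone else", ct) != msg  # anyone else gets garbage
```

the "second key" proposal debated here amounts to a third party also being able to run `decrypt` successfully -- which is exactly the property this design is built to prevent.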
and -- err -- [unintelligible] -- ma'am in the red sweater, if you could stand up. female speaker: hi. my name is kate conger. my question is about the life and death issue that you raised. earlier you were talking about how law enforcement needs access to encrypted messages to save lives. and i'm wondering how you balance those lives with the lives of our service men and women overseas whose locations are protected by encryption -- how you balance those lives against the lives of victims of intimate partner violence who might be hiding their information, their location from their spouse via encryption. why are the lives of terror victims worth more than the lives of service men and women -- of women who are being killed by their partners? john donvan: and your question is directed to? female speaker: stewart? would you like to take that?
john donvan: okay, stewart. yeah, no, first name basis. stewart baker: yes. [laughter] catherine crump: [unintelligible] doesn't have a tie on. stewart baker: yeah, exactly. yeah, come on. so no one is arguing that what we want is completely insecure phones that give away data that can get people killed. as i said, apple has built the technology that allows them to modify phones one at a time if necessary. and they have protected that successfully. and that means that the data that they've protected has not been given away to lead to the deaths of innocents. but they could use that technology to protect innocents, and they're not doing it. and in my view, they should. john donvan: would the other side like to respond? michael chertoff: i -- you know, we could go around and around
arguing particular facts in the case. i think they wanted apple to do a modification that would ultimately, if it got out, have affected all phones. but let's put -- john donvan: i just want to say that that's certainly the way apple presented the argument -- [talking simultaneously] john donvan: let michael finish. michael chertoff: but we're not on trial. we don't have evidence here. so let's take the broader proposition. if apple didn't add u2 music to your phone, i think your comment is dead right, exactly right. there is real security value to encryption. and if you require companies that have encryption, without u2 updates, to have a back door, you would weaken that encryption. that's what all the engineers say. and that means if a bad guy either discovered the vulnerability or got ahold of the exploit, they would then have the ability to compromise the safety of the people you are describing.
and if you said, well, that's okay, the government can protect it, i just have one word: wannacry. apparently, as reported in the press, the government wasn't so good at protecting the exploit based on a generally available vulnerability, and hospitals were shut down, and there was a global impact. and that's exactly the kind of thing you don't want to do. john yoo: can i jump in? john donvan: yes, please do. [applause] john yoo: two points. we've heard now this debate about whether it's apple's own back door that is at risk, or the magic of getting rid of the limits on how many times you can try to guess a password before the machine wipes out its data. well, you know, if you've done any coding at all, you know that the wipeout of the data has -- there's a line in there that says, "after x tries, wipe the data." and if you went in and changed the 10 to 1 million, you would have done what the government asked. that is not a secret. that is not hard. there is nothing that protects
you against that change being made other than apple's secret which is how to get the phone to accept the code change that it wants. there's no -- and there's no requirement to -- there was no argument by the fbi that the secret for how to do that should be given to the united states government to protect in some database that was subject to a leak. apple could have kept that secret and just fixed the phone so that it didn't wipe out the data after 10 tries. that's all they had to do, and they chose not to do it. john donvan: man in the light blue shirt because i saw you shaking your head during stewart's comment. but i don't want you to argue with stewart. i want you to ask a question. if you could stand up and tell us your name, please. male speaker: well, i heard that -- john donvan: what's your name? male speaker: i'm a former software engineer. i work for a legal department at a big tech company. and i too will be attending berkeley law. john donvan: yes!
[laughter] male speaker: you're all going to take catherine's classes then, you know. they don't want to read alexander hamilton with me, do they? male speaker: well, professor, i wanted to go back to a point you made a little earlier about how cryptography -- encryption -- is not a panacea. and you're absolutely right. the general consensus in computer security circles is that there's always another flaw. and in fact, the apple and fbi fight was mooted when the fbi found another way into the phone without apple's help. john donvan: i need you to go to a question now. male speaker: so because there are more flaws out there, doesn't that put the burden back on law enforcement rather than asking tech companies to help? john donvan: i'd like to let this side answer first, and then i would like to hear the other side's response to that. john yoo. john yoo: i think -- i'm not sure exactly how to answer the question. but i think there's a false choice here that's being presented by our worthy and handsome and attractive
opponents. and that is, there's a choice between letting the government have access -- and i think you heard it in the last question -- and complete vulnerability. i don't think that's true. i think it's very much as you describe. there are these programs and operating systems, and there are flaws, and then we correct them. and sometimes we can use the flaws to society's advantage, and then we correct them. it's not the case -- you know, some people often use this analogy of locks on doors. and i've heard it said, oh, what you're asking apple to do means no one can have locks on their doors. i don't think that's really what's going on, based on what i understand. i'm not a computer scientist, but i did have a trs-80 back in the early 1980s, late '70s. i bet very few people here can make that same claim. they had a radio shack -- john donvan: i can. john yoo: -- computer as a kid. male speaker: oh, you have so dated yourself. probably liked katrina and the waves, too, as his favorite band. john donvan: ultimately you made a very good wheel chuck. john yoo: but the point is, right, that it's not a choice that if
you help the government, all of a sudden everybody's data is suddenly visible. i think we use, you know, adaptation, and computer scientists will fix the flaw, and the locks are restored. all we're asking is to say to the locksmith, come to this house. please open this lock. we're not asking you to take all the locks off all the doors. john donvan: catherine crump. catherine crump: yeah, i think your point is a good one. encryption, while it may be the best tool we have available, often isn't perfect. the apple case illustrated that. they were able to get in using a vulnerability that they purchased. there will continue to be vulnerabilities. and particularly in high-profile, high-value investigations, they may continue to be used. john donvan: another question? right down front, sir. a mic's going to come for you. just one second. male speaker: now, you realize you just called on the former chief justice of the state of california. so whatever he says, i'm going to agree with it. [laughter] male speaker: i'm asking -- john donvan: please tell us your name. male speaker: -- as a layperson.
ronald george. john donvan: thank you. male speaker: i just wonder whether, in weighing privacy interests against security interests if congress, in considering a law requiring that such help be provided, were to be presented with credible evidence that there were plans to import a nuclear device into the united states whether that would change the position that the no side have, that kind of substantial showing, or whether you would still adhere to the same position. john donvan: michael chertoff. michael chertoff: well, you know, i think -- and the position we have, again, is that the companies have to comply with the law and their rules. the issue is when a company's capabilities are configured in such a way that they just don't have access to the information, that's going to frustrate law enforcement. if law enforcement can figure out its own way to get that, god bless them. but the real challenge presented here, and the argument that people in congress, some of them have made is, you should prevent
tech companies from organizing themselves in such a way that they don't have access to all the information when they're required to turn it over. so let's pick something a little less esoteric than encryption. there are applications now, messaging applications, in which, once you've read the message, it disappears. and should the government be able to say, wait a second, terrorists could be communicating about when that shipment of nuclear materials is coming in, and the message is going to disappear. we're never going to know what was said. so let's require every company to ensure that all those messages, although they appear to disappear, actually get stored. and that would apply to everybody, because you don't -- congress doesn't pass a law knowing who a particular terrorist is; the law is generally applicable. i would argue two things: that it would be inappropriate to pass a law like that because, at some point, you might really need to get a message that otherwise wouldn't be saved. but i would also tell you, based on my experience, going back to
a period after 9/11, the kind of data that is available now that is turned over by the tech companies and is generated by the tech companies precisely because they are business people, makes the kind of stuff we got in 2001 look like child's play. it would have been a dream to have the kind of data that's available now back 10 years ago when we were responding to 9/11. so if you look at technology development as to whether net net has been good for security, i will tell you it has dramatically tipped the balance in favor of security. john donvan: other side like to answer that? stewart baker. stewart baker: yeah, very briefly. i think that example shows that there are times when we simply will not leave the decision to the companies. it's -- and if you believe that, if you thought there was a phone that had data about the
importation of a nuclear device into the united states, no one would be saying, oh, well, it's apple's choice about whether they're going to use their backdoor to provide access. we would say this is a choice that ought to be made by society as a whole through our elected representatives. i -- and if you believe that, then i think the answer to the question is, yes, there are times when companies should be required to help law enforcement. john yoo: also, we would hope american companies would willingly try to help in such a situation; they shouldn't need the compulsion of the law. what worries us, i think, is this growing atmosphere that it's okay for tech companies to say no, we're not going to help the government, even with a threat on a scale as high as the one you're proposing, mr. chief justice. john donvan: ma'am on the aisle, there. if you could stand up, they'll be able to see you for the microphone. female speaker: hi, my name's audrey. i'm a genetic counselor, actually, so i work in genetic testing. this is a question for john or stewart -- i am agnostic as to which one of you answers the question.
but the cost of genetic testing has basically dropped dramatically, and there's a real philosophy that democratizing people's access to the information really requires the tech companies. it's a huge amount of data, and that data needs to get stored and that data needs to get analyzed. and my question would be: do you think there should be some kind of backdoor key? should there be something set up for genetic -- to store the genetic information where the
companies need to make it accessible to law enforcement, or is there a case where that's something that's actually going to be useful, where it makes sense to compel a company to reveal that type of information, where they're not really making a profit off of that type of backdoor? john yoo: so i'll answer, just because my brother is in this industry, too. so it seems to me that we've already made a choice about the privacy of genetic data -- not the exact hypothetical you have, but dna testing to track down crimes. you could have had a regime or a world where you could have -- we could have said the government is not allowed to know your dna sequence. it cannot do dna testing. that's part of your right to privacy, your individual right, and that would be very similar to your idea. and if we're going to -- if i'm going to hire a genetic testing
company to sequence my code, the government can never look at that either. i mean, companies could take that position. but we don't have that view, right? we are actually expanding quite broadly the use of dna testing to help us, you know, solve crimes. and, you know, it's not just the privacy -- you know, you're losing a little bit of privacy by letting the government do that, but we are also protecting victims. we're solving a lot of crimes. we're also proving that some people who were convicted were actually innocent, and those people are being released from jail. i would just say as a society we've already made the judgment you're asking about, that it's reasonable in certain circumstances for the government to have access to your dna testing in order to solve a crime. john donvan: catherine, would you like to take a crack at that? catherine crump: yeah, i don't think there's a point of disagreement here. it sounds like in your example the company itself can access
the data, and in that circumstance, if the company is capable of accessing the data, they need to comply with whatever lawful process the government uses to get the data. i think this debate is more focused on circumstances in which the data is protected even from the company. john donvan: and that concludes round two of this intelligence squared u.s. debate, where our motion is tech companies should be required to help law enforcement execute search warrants to access customer data. [applause] and now we move on to round three. round three is closing statements by each debater in turn. here making his closing statement in support of the motion, stewart baker, former assistant secretary for policy at the department of homeland security. stewart baker: first, for those of you who come back, i should introduce you to a concept that i learned at a conference yesterday from the israelis -- which is the israeli question -- which is a speech followed by
the words don't you agree? [laughter] i was thinking about this issue and researching it, and i came across a case with a woman named brittany mills who was -- who answered her door one day and was shot dead at point blank range. there are no -- the police know she knew the person she opened the door for. they know nothing else. they do know she had an iphone, that she kept a diary on it. her mother says that she was careful to keep those records. apple was not prepared to provide any assistance in finding out what's on that phone. i -- that can't be right. tim cook has given many speeches about how companies have values because people have values and apple has values. and they care about the environment, and they work hard
-- even sacrifice profits because of their concern for the environment. i think the message i would want to send them out of this debate is they need to have a concern for the brittany mills of the world as well. that privacy they were providing is not doing her any good, and she never wanted this kind of privacy. and so, i would ask that you vote to say "yes," companies can be required to help law enforcement to execute search warrants. thank you. [applause] john donvan: thank you, stewart baker. [applause] and here making her closing statement against the motion, catherine crump, acting director of the samuelson law, technology and public policy clinic at berkeley law. catherine crump: we all want to help the berkeley -- brittany millses of the world. but the question here is where is the greater good? are we going to make everyone's communication insecure in order to create a backdoor?
and i'll just tell one story -- which is that 20 years ago, the united states, in a law called calea, decided -- and most other countries decided -- to create a requirement that there would be a backdoor for telephone switches. about 10 years ago, someone illegally wiretapped the phones of many people in greece using one of these backdoors. it included the prime minister. it included the mayor of athens, and so on and so forth. so, when you create these backdoors, they are vulnerable. they can be abused. and the better choice is to try to secure everyone's data across the board. [applause] john donvan: thank you, catherine crump. [applause] and now making his closing statement in support of the motion, john yoo, law professor at uc berkeley. john yoo: so, unlike my other panelists, i don't have a good story. i'm not irish, i'm korean. we're not good at stories. so, i have no witty thing that's going to sum it all up, the way that jfk or tip o'neill could have. i wish --
stewart baker: he has a korean mom. he cannot go home if [inaudible] john yoo: that is true. please, please vote for us. my mom is asking you -- [laughter] -- actually, i wanted to go back to something jeff rosen said when he started this whole thing. and he said he always asked himself what brandeis would do. and i actually always ask myself, what would hamilton do? and the reason i ask is because hamilton is so cool and hip right now. they even make rap music about him. i'm not rich enough to have the pull to actually get in to see the show, but i hear hamilton even talks in rap. this is amazing to me. i've been studying hamilton for 25 years and i love the guy. and i think what hamilton said is something we should come back to, because hamilton was involved with drafting the constitution. he was the first treasury secretary. you all know this because all of you have seen the play. hamilton said the primary mission -- the purpose of government is the protection of the community from attacks. he didn't say it trumped everything. it doesn't mean that we have to live in a world with no protections, or no security, or no privacy for our data.
but it means that ultimately, when it comes down to it, and this is the question, i think, that chief justice george properly raised -- is that we all have to balance the needs of the government against our privacy rights. and as a society, we can sometimes and should decide that we want to trade off some amount of privacy for security. anyone who's telling you that that's a false choice, i think, is not being truthful. there's always a tradeoff in anything we do, any government policy that we reach. and i think, in this case, all we're asking you to acknowledge in voting "yes" for the resolution is that the government at some times has a right to protect us -- that should sometimes, in the right circumstances, require us to give up a small amount of privacy. john donvan: thank you, john yoo. [applause] and finally, making his closing statement against the motion, michael chertoff, executive chairman and cofounder of the
chertoff group. michael chertoff: well, thanks, everybody. very stimulating debate and great questions. look, i know stewart likes to talk a lot about the apple phone case. and we're not going to resolve the engineering question about whether what would have been required would have been to create a general vulnerability. but that's not what the resolution is about. the resolution is about whether tech companies or anybody, for that matter, should be required to help do whatever can be done in order to make things accessible to law enforcement. there's no doubt congress can pass a law. that's not the issue. the question is, would that be wise? and if you apply it in this circumstance, what you see the resolution says is, you should have to configure your platforms in such a way that you can always access information when there is a lawful demand to do so. the problem with that is it doesn't create security for everybody.
it creates security in some circumstances. if you look at what goes on around our world, if you look at the $80 million stolen from the bank of bangladesh, if you look at the efforts to influence elections in france, if you look at the personal data that is stolen and the 500 million yahoo! accounts that were hacked by the russians who got indicted, that's 500 million individuals whose personal information is out there. you realize that if you're weakening encryption or you limit the ability to protect the data, you are putting the security of the many at risk simply because the government would be benefited in some cases by getting access to the data. with all the tools the government has that the companies give them -- data backed up to the cloud, metadata, locational data -- sometimes the government will have to do it the hard way. but in the greater good of security for everybody, that may be the right way. and that's what i would say
congress ought to bear in mind when they look at this problem. thank you very much. john donvan: thank you, michael chertoff. and that concludes round three of this intelligence squared u.s. debate. [applause] and now it's time to learn which side you feel has argued the best. we want to ask you again to go to the keypads at your seat and vote for a second time. take a look at the motion. tech companies should be required to help law enforcement execute search warrants to access customer data. push number one if you agree with the motion, the side argued by this team. push two if you disagree with the motion, this team. push number three if you became or remain undecided. we give victory to the team whose numbers have changed the most in percentage points between the first and the second vote. so it's the difference between the first and the second vote, as opposed to the absolute vote, that determines victory for one team or the other.
it'll take about a minute and a half for the results to come in. but while that's happening, i just want to say a couple of things. as i mentioned in the beginning, the goal of intelligence squared u.s. is to raise the level of public discourse and to prove that people with disparate points of view, with real disagreements on principle nevertheless can sit down, exchange ideas, speak to one another civilly, maybe even change each other's minds. and i just want to say that the spirit in which these four debaters did that, the game they brought to the stage absolutely lived up to our principles. and i want to thank all of you for what you did. [applause] i also want to, again, thank the great jeffrey rosen for being our partner in this with the national constitution center. if you haven't been to philadelphia, we've done debates there. the center itself is spectacular. but as jeffrey also pointed out, the center goes far beyond
philadelphia. it's a national organization. it's getting everywhere. los angeles is not that far a hop. and so you heard about the program coming up there. but keep an eye on the ncc. it's really going places, and it's great to be partners with them. i also want to point out this about intelligence squared u.s. we're a nonprofit organization. we put these debates on, and then we release them for free to the public. as i mentioned, the podcast is out there. the public radio program is out there. we're in a lot of schools. a lot of schools now actually incorporate us as part of the curriculum, particularly in high schools and the upper grades of elementary school, and we're very, very proud of that. but i also want to say that we depend enormously on public support to keep that mission going. so, if you like what you saw, if you like what you do -- what we do, we'd appreciate it if you could give us some support, and there's a way to do it with your encrypted cell phone right now. [laughter]
if you -- you -- if you text the word "debate" to the following number, you'll get a link and you can make a contribution. and i know it's a cliché, but big or small, they all count. we appreciate them all. and so, that number is 797979, whose secret meaning is -- absolutely nothing. it's random. but we would greatly, greatly appreciate that. so, reminding you -- the motion -- somebody hacked into it. what putin -- >> putin wanted to change the outcome of the debate. >> the winner is la la land. >> i've got it now.
tech companies should be required to help law enforcement execute search warrants to access customer data. before the debate, in polling the live audience here in san francisco, 26 percent agreed with this motion. 47 percent were against the motion. 27 percent were undecided. those were the first results. one more time i'll say this -- it's the difference between the first and the second vote that determines our winner. so, let's look at the second vote. the team arguing for the motion -- their first vote was 26 percent. their second vote, 36 percent. they went up 10 percentage points. stewart baker: wow! john donvan: that's the number to beat. let's see the team against -- arguing against the motion. their first vote was 47 percent. their second vote was 58 percent. stewart baker: oh, come on! john donvan: they got 11 percentage points. they just -- [applause] -- snuck in. congratulations to the team arguing against the motion. our congratulations to them. thank you from me, john donvan, and intelligence squared u.s. we'll see you next time. [applause]
>> we really have had such a pleasure being here in san francisco. and thank you so much. find us on our podcast. make the contribution. thank you. [applause] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit] >> this afternoon we will have coverage of the liberian education minister talking about the use of charter schools in developing countries. the panel will look at countries around the world using something similar to charter schools. coverage begins at 4:00 p.m. eastern on c-span. tonight, amazonas supreme court justice on the debate

