Government Surveillance Privacy Conference - Part 4 CSPAN December 10, 2019 7:57am-10:13am EST
again, if you've been following privacy issues for a while, you will be familiar with what is sometimes called the going dark problem, a concern that the pervasive growth of strong encryption presents an obstacle to law enforcement agencies, especially since it's gone from a technology that's in the province of people with advanced technical knowledge and understanding, able to use difficult tools, to something that is baked into very
user-friendly technology in a way that doesn't require much sophistication at all. you probably use encryption more or less daily without even recognizing it, just by using a smartphone or a standard web browser. and increasingly this encryption is not just between end users and central entities that law enforcement can approach with a warrant, but end to end, meaning encrypted between users without access by an intermediary entity like google or facebook. that is causing a certain amount of nervousness among both law enforcement and intelligence agencies. in the past it's been framed by the threat of terrorism. a couple years back, there was an iphone used by the san bernardino shooter.
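the distinction just described can be sketched in a few lines. a toy illustration, NOT real cryptography -- the xor "cipher" and every name in it stand in for real primitives -- but it shows why an intermediary can be served a warrant in one model and not the other:

```python
# toy sketch (not real crypto): transport encryption vs end-to-end.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """one-time-pad-style xor; a stand-in for a real cipher."""
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"meet at noon"

# transport encryption: each hop is protected, but the provider holds
# the hop key, so it can read (and scan, or disclose) the content.
hop_key = secrets.token_bytes(len(msg))
server_sees = xor_bytes(xor_bytes(msg, hop_key), hop_key)
assert server_sees == msg  # the intermediary can recover the plaintext

# end-to-end: sender and recipient share a key the server never
# learns; the server only relays ciphertext it cannot read.
e2e_key = secrets.token_bytes(len(msg))    # known to the endpoints only
relayed = xor_bytes(msg, e2e_key)          # all the server ever sees
assert xor_bytes(relayed, e2e_key) == msg  # the recipient recovers it
```

in the first model a warrant served on the provider yields plaintext; in the second, the provider has nothing useful to hand over.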
more recently, with the announcement that facebook is intending to deploy end-to-end encryption, they're concerned that this will end the automated scanning of messages for child exploitation imagery and other kinds of files, and so cut off one significant point of access for law enforcement and prosecutors to go after child predators. we have an excellent panel arranged here. unfortunately, it is the season of illness. professor matt blaze is down with the flu and unable to join us. we still have an absolutely phenomenal panel. before this, i was a journalism nerd, and one of my previous hats was the washington editor of a great techie news site. i was happy to work there because it was one of the handful of places where you could expect deep dives into technical questions and, in my case, sometimes legal questions, written to be accessible to a generally educated audience, but without leaving out a lot of the nuance and details that might be interesting to someone with a little bit more of a knowledge base. i was pleased that we had sean gallagher, who is the i.t. editor, to act as moderator for this panel. sean will introduce the rest of our excellent panelists.
>> good afternoon. we're going to do a quick wake check here. we have awake people, and i hope everybody at home is awake. my name is sean gallagher. i'm the i.t. and national security editor, and i'm here today with robin green with facebook, with jim baker of r street, and with -- again, i just met him -- >> brad whitman. >> -- brad whitman with the
department of justice. and the topic at hand is encryption. going back two decades, encryption was something that was fought over and mostly settled. there was a chip -- the clipper chip -- that was to be embedded in devices that were going to have encrypted communications over them, which the federal government had presented as a standard. it was fought against, eventually ignored by industry, and proven to be vulnerable by our absent guest matt blaze, among others, and it was decided that having a back door into encryption was a bad idea. for some time over the past decade, going back several administrations, the fbi's leadership has pressed the case for some sort of limit on encryption and, as former fbi director comey put it, they wanted a golden key to encrypted
communications, because encrypted communications have become much more common than they were in the 1990s, to prevent criminals from going dark. in the latest version of this argument, they've used the incidence of online child sexual exploitation as a reason to raise the demand again. the attorney general asked facebook, in a letter that he signed along with officials from the united kingdom and australia, not to deploy end-to-end encryption across all of their products for messaging by default, arguing it would allow pedophiles to go dark. and they cited as a reason facebook being a major source of information about child pornography: about 80% of the reported cases of exchange of this material came from facebook in 2018.
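those reports come largely from automated scanning, which typically compares fingerprints of uploaded images against a database of hashes of known illegal material and reports matches to ncmec. real systems use perceptual hashes (photodna-style) that survive resizing and re-encoding; the plain sha-256 below is an exact-match stand-in, and the image bytes are hypothetical placeholders:

```python
# simplified, hedged sketch of hash-based scanning for known material.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# hypothetical database of fingerprints of known illegal material
known_bad = {fingerprint(b"known-bad-image-bytes")}

def scan_upload(image_bytes: bytes) -> bool:
    """true if the upload matches known material and should be reported."""
    return fingerprint(image_bytes) in known_bad

assert scan_upload(b"known-bad-image-bytes") is True
assert scan_upload(b"vacation-photo-bytes") is False
# under end-to-end encryption the server sees only ciphertext, so this
# server-side comparison is exactly the step that stops working.
```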
so they're seeking to ask facebook not to deploy end-to-end encryption until the company can provide some way for legal, warranted access to communications. technical experts have argued that any sort of back door weakens the protections encryption provides for everyone's legal communications, because it would make encryption more fragile. the question before the panel is: is there a way to have secure communication for the masses with encryption and still have legal access under warrant? where do the constitution and the laws of mathematics and physics come to equilibrium on this? can facebook provide secure communications for the rest of
us? i will allow our panelists to open with that. first i'll let robin speak briefly about it and then as quickly as possible bring it out to audience questions. >> thank you. thank you for inviting me to speak today at this very important event. i want to first start by talking about why facebook is moving our messaging services to end-to-end encryption. i started at facebook in february. before that i spent about eight years working in civil society on many of the same issues. starting in february, just as the announcement was made that we're shifting our services, was an exciting time to start. it's important to think about why this is happening.
ultimately, facebook has always been committed to helping people build communities and have their voices heard. many of our services -- facebook, instagram -- we think of as the public square. but what we're seeing increasingly is that people want to have more private communications, one-on-one or small-group communications. they're more conscious of the private information they're sharing with one another because they're having more personal communications online. whether it's sharing stories or personal information about your life and photos, or transacting business, people want to be sure that the communications they're having over their messaging services are secure -- secure from facebook, secure from external threats like hackers and other malicious actors, and secure from any other unintended recipient, including the government. we think it's important to make sure that people can have that
kind of control and confidence in their communications, to know that they have the privacy and security that's needed given how much data is getting shared and how private and sensitive those data are. in addition to that, we want to make sure that we do this right. and so we're not just flipping a switch. this is a long process. there are a lot of technical challenges that we're addressing to make sure that we do it in a way that, you know, is good for users and makes sure that we're providing them with true end-to-end encryption. and we're making our services interoperable so we can have a more streamlined experience across all of our services. we want to make sure we get the privacy parts of that right as well. beyond that, we have for years now been industry leaders when it comes to safety on our platform. as you mentioned, a large portion, you know, of the
information that ncmec receives comes from facebook. we're going to continue to put safety first in an end-to-end encrypted space, and we're thinking hard about how to be the industry leader on safety in encrypted messaging while making sure that people have that same strong end-to-end encryption where only they and their intended recipients can see the information. >> thank you. jim? brad, go ahead and kick off from your side. >> let me -- thanks for having me here today to talk about these issues. we in government, and i think we as a society, are confronting an epidemic of child exploitation and abuse, much of which is
facilitated through online platforms, with perpetrators sharing images of their acts. this includes absolutely horrific sexual abuse of children and toddlers. the numbers are absolutely staggering. in 2018, facebook made 16.8 million reports to the national center for missing and exploited children, 12 million from facebook messenger alone. we are grateful for these reports and we're grateful for the outstanding cooperation that we get from facebook. we rely on facebook and other companies, as do other governments around the world. thousands upon thousands of children have been safeguarded as a result of these reports. in march, as robin mentioned, facebook announced that it plans to implement end-to-end encryption across its messaging services, so it will no longer be able to see the content of those messages on its platforms. the ceo of facebook acknowledged
frankly that there are real safety concerns to address associated with this shift, and that we have a responsibility to work with law enforcement to help prevent the use of facebook for child exploitation as well as other social ills -- terrorism, organized crime and others. after the change, quote, we will never find all of the potential harm we do today when our security systems can see the messages themselves. so in response to this, the governments of not just the united states but also the united kingdom and australia wrote to the ceo of facebook in october of this year asking that he not implement end-to-end encryption without ensuring that there's no reduction in user safety, and without including a means of lawful access to the communications. this is something that we, charged with public safety and with protecting children in this country and around the world, felt it was our obligation to do. we haven't yet received a
response and we haven't been consulted. it's been suggested that pattern analysis of some kind can substitute for access to content to identify child sexual exploitation or other harms. we're skeptical that this can occur. without the content, you can't investigate and have evidence to prosecute the perpetrator. it's interesting to compare facebook with apple. apple's instant messaging service has been end-to-end encrypted. we received only 43 reports from apple in the same period. that's maybe some indication of what we're skeptical of and concerned about. to be clear, the government supports encryption.
we're not against encryption. we use encryption in the government. we are responsible also for cybersecurity and for prosecuting cyber crime. that's our responsibility. we rely on it, we understand that commerce is dependent on it, and our society is going to be dependent on it. what we oppose is end-to-end encryption that does not permit lawful access when necessary. we think it can be done safely. the concern has been that it can't be done safely but, look, facebook messenger today is not end-to-end encrypted and no one thinks it's not safe. the cloud is by and large not encrypted, as i understand it. no one says information stored in the cloud is not safe. we think solutions can be found and we want to work with companies to find solutions to this problem. that's it. >> jim? >> i'm looking forward to having a discussion about these issues.
i worked on going dark for a long, long time, and this has been a personal journey for me -- at the justice department, in the private sector, at the fbi, and since i've left the fbi. and so i take with great seriousness the comments that brad has made about the victims. there are real victims, because encryption does inhibit, it does slow down, it makes law enforcement less efficient and less effective. and in the san bernardino case, when i was at the fbi -- i was the general counsel there -- i thought we had a very serious and solemn obligation to the victims of the terrorist attack to do everything we could to run down every investigative lead. having in our possession one of the phones of one of the perpetrators, and having consent from the city of san bernardino
that actually owned the phone, because he was a city worker, and having a warrant to get into it, we thought it was the logical thing to do to try to get access to that information. apple disagreed. we ended up in court, and that legal dispute fizzled because a third party came forward and explained they had a way to allow us to get into the phone, so there was no judicial resolution of the matter -- the case was moot at that point because we had a way to get into the phone. but i have several concerns about the government's current approach, and i've had to rethink my own approach, which was strongly in favor of trying to find a way to enable the government to get access to communications. a couple of things have driven my thinking on this. number one is, the problem -- this at the end of the day is a
legal problem. it's not a technical problem. the sophisticated companies can write software to give access to the government. that can be done. the technical reality is that it can't be done in a way that provides the same substantial amount of cybersecurity that the kinds of encryption systems we have today do. i lost my verb there. you can rewrite the software, but it's not going to be as secure. that's the basic idea. the problem to me is not technical in that sense -- it could be done, but with significant risks attached to it. the problem is not the fourth amendment, because the government can go and get whatever warrant it wants for whatever device or system it wants under the various legal regimes that might apply. the problem is that there's no clear legal mechanism to force companies to
rewrite their software, to redesign their systems. the various legal provisions simply don't empower the government to get a court order to force companies to do what the government wants them to do. that just doesn't exist. to me -- the government, law enforcement agencies, myself -- we've been telling the public about this for years, we've been telling congress about this for years, and nothing has happened. congress has failed to act. there are a lot of reasons that we could go into about why that is, but they haven't done it. to me, that's just dealing with reality. the reality is congress has not acted and i don't foresee them acting in the future. the administration has revived this issue recently. there's a hearing next week to discuss all this. maybe that will start to have an impact. but honestly, i doubt it. that's one reality. i don't see congress giving the administration the legal tools that it needs to force companies to do this. the second reality, i think, is
that in my view, the country -- the united states and its allies -- faces an existential threat with respect to cybersecurity and malicious actors. our cybersecurity is that bad. it is subpar, poor. i don't know how else to describe it. encrypting stored data and spreading the use of encryption wherever we can in the very complex digital ecosystem that we all rely on to conduct our most essential services and business activities as a society -- encryption is not the only way and it's not a perfect way, but it's a significant way that we can protect ourselves from the very, very significant, existential threats that we face. so what i'm urging law enforcement to do is to rethink their
approach to encryption. because they are stewards of public safety and they have to protect the most people from the worst harm, they need to, i think, rethink their approach to encryption and actually embrace it. i think the right thing to do is to embrace it -- while recognizing that what brad says is true: there are real victims of crime who will suffer because encryption in certain circumstances will inhibit the ability of the government to do its job. it will slow them down, it will make them less efficient. they can use other investigative means, but having said all that, i think it's time for the government to rethink its approach to encryption and embrace it instead of trying to find ways to undermine it, quite frankly. >> thanks. a couple questions come to mind. first, i want to give everybody a chance to respond to each other. but i also want to add that there are a couple of concerns that come up from everyone's
points here. a lot of what is driving the demand for end-to-end encryption on facebook right now, and on other platforms as well, is a feeling of lack of privacy because of a loss of trust in some of the platform providers over the past few years -- things like the cambridge analytica scandal and the spreading of personal information through various means, admittedly algorithmically and not necessarily by people. but there's still a lot of concern about conversations being cached for long periods of time. asking facebook not to use end-to-end encryption -- doesn't that push people who would use end-to-end encryption on their platform off of the platforms?
there are other platforms that already offer end-to-end encryption and let you share with lots of people. so why would you specifically go after facebook in this case? i understand they're the major contributor to reports, but doesn't that create a situation where people who are aware of this debate -- perpetrators of those types of crimes -- move to another place where they can already go dark? >> first of all, on your question of whether people will move platforms over this issue, we haven't seen this to date. other platforms are available now, and people are still using facebook messenger today, which is not end-to-end encrypted. we haven't seen that movement to date is my point. second point is, we do not intend to single out facebook. facebook has been a good citizen to date, by all of the reports i mentioned. our concern is the shift to a
paradigm where we're concerned we're no longer getting the reports we're getting today. we would like all of industry to cooperate with the government and provide lawful access -- not just facebook, but the other companies as well. >> robin. >> we will continue to be good citizens after we move to end-to-end encryption. safety is one of the top priorities on our platform, and we're thinking very hard and taking our time to build these new tools in a way where we can be confident that we're addressing the safety concerns not only of law enforcement and the department of justice but of ourselves, the public and our users. nobody wants to be using platforms that have harmful activity on them, and so we're committed to a program, basically, of prevent, detect and respond. so first, prevention: we're looking for ways to
identify how bad actors are getting in touch with each other, how they're, you know, finding victims, so that we can prevent those connections from happening in the first place. and then we're looking to detect bad activity. no, we won't have the contents of messages -- we'll have to change our methods -- but we're going to be able to find what that bad activity looks like so we can take action on it, on the platform. and then we want to be able to respond. we want to make sure that people have the ability to report bad activity when it's happening. if you receive some kind of harmful or abusive message, you can file a report on facebook. if you do, you can consent to share that harmful or illegal activity with us, in which case we will have access to the contents and could share it. things will change. that's for certain. but what we are doing is engaging in a robust process. we're having conversations with governments and law enforcement about what are the kinds of
signals that you are seeing that are helpful and that are not content based, so we can figure out what are the ways to identify, you know, some of these problems. we're talking to public safety experts. we've had consultations with dozens of experts to make sure that we're getting all of the information that we need so that we can build a safe product. and similarly, we're having conversations with privacy experts. none of this works if people don't feel like they have the control and privacy that they desire. we are seeing a significant shift to end-to-end messaging -- 85% of messages worldwide are sent over encrypted messaging services. this is what people expect, and that's why we're looking to provide it. the way that people are using their messaging services demands it, because of the kind of cybersecurity threats you've mentioned. people are having private communications that they want to keep between themselves and their intended recipients, but they're also doing business. they're sharing intellectual
property information, they're sharing financial information and engaging in conversational commerce, and they share medical information. and so we have to make sure that they're secure. the one other thing that i'll just add is, you know, when we're thinking about how to do safety right -- you know, that was a stark statistic about apple, but there are ways you can continue reporting. i will, you know, share that whatsapp, for example, takes down 250,000 accounts because of harmful activity every month. we are able to find harmful activity even when we don't have access to content, and we're going to continue to do so. and we think we're well positioned to do this because we have spent so long leaning into safety on our services. >> is that mostly because of
user reporting on whatsapp? >> some of it is user reporting. a lot of the reports -- these are takedowns; i don't have that number off the top of my head -- but we will still continue doing scans for abusive content on our public platforms. in all the public spaces, nothing changes. we're still going to be looking for abusive content on facebook and on instagram. it's the messaging spaces where that changes. but there are still some public parts of the messaging spaces -- so, for example, profile photos and group names can be public. if you wind up using exploitative imagery as your profile photo, that is a good indication that this is not an okay account. we would be able to identify
that account because of the scanning, send that information to ncmec and take down the account. >> have you done any analysis of whether, out of the 17 million reports, there is going to be a drop-off in the reports that we get? your ceo has acknowledged that -- >> i can't speak to the percentage of, you know, decline, but certainly, you know, we think the reporting will change, right? it won't be the same kinds of image hashes, but we are consulting with law enforcement to find out how we can identify useful information for you that's not content-based and builds on the whatsapp privacy model when it comes to the data we have access to. >> jim, as far as other ways to go after this content, to pursue
people in an end-to-end environment, what type of techniques have you seen that could aid in going after these types of problems that don't require a man-in-the-middle back door? >> yeah, maybe a couple different observations along those lines. number one -- let me back up and talk about this issue a little bit more. society's failure to protect children is profound, and everybody in society shares blame for that. everybody, because we have not done what we need to do to protect children, period, full stop. and so even as we have heard from brad, even with the current communication systems that we have, we still have thousands and thousands of children saved. i was always worried when i was with the government about actually giving any facts and figures, because they often turned out to be wrong when they would send me out with these things in this
sort of area. but thousands and thousands of children are being abused, and society is failing them now. and the failure is systemic, and it has to do with way more than encrypted communications. it has to do with the inability of government to absorb all this material. it has to do with the technical systems that government has to deal with this kind of material and to deal with these perpetrators. it's a systemic failure across many dimensions, and society needs to deal with it in part by providing better tools and more money to the investigators and the centers that are trying to deal with this. that leads into something i've been thinking about a lot lately, which is that not only does government need to rethink encryption, it needs to rethink its investigations and how it does investigations -- embracing reality. the reality is these systems are
here, encryption is out of the box, and it's going to be used either in the united states or on other platforms. people are going to gravitate towards it to protect their communications, and unlawful actors are going to gravitate towards it, and they're going to find ways to communicate. government needs to adapt to the world we have today, not try to go back to the past -- figure out how to do a better job of analyzing data, doing deep data analytics with respect to finding the bad guys and the victims. they could invest much more in that. industry could assist law enforcement with that as well, something that might require changing some laws to accomplish. but doing more data analytics, making more use of open-source information, and i think also
reinvigorating government's ability to use human sources, informants in organizations, undercover operations -- the government has to do a better job of that. in my experience, those are the kinds of investigations that are the most effective, when you have good human sources in the places where they need to be. it's harder to do, it's more expensive, but it's more effective. >> we'll go to questions from the audience. >> i take jim's point, but in an investigation in a child exploitation case, there's no substitute for having access to the images of the child who's been exploited. you might have a toddler and the individual who's abusing that toddler; there's no one else involved in that transaction, and there's no other way to get that information. if that person is disseminating those images, there's no other way to get that information than
to have access to the content of those photographs, is what i would say. >> okay. >> the other point i would make in response to jim is, we're not trying to go back to the past here. we're trying to update laws from the past for today. we've had telephones forever, right? and we have a law called the communications assistance for law enforcement act, under which the telephone companies have to work with the government. we're updating those laws so that a different means of communication today will meet the same requirements that phone companies have had to meet for decades. we've had wiretaps for decades. the wiretap is a fundamental tool that we need to be able to use. why is it different on the internet? >> because -- do you want me to -- the digital ecosystem has changed substantially, and the volume, variety, and velocity of
the communications is a different world than it was five years ago, ten years ago. before the real advent of -- >> you can have the exact same voice communications via the internet as you can over regular telephone lines. i see no legal or moral justification for treating that differently. >> so my point is, then go to congress. >> absolutely. i agree with that. >> congress -- so -- my point is, there are victims that we've been talking about here. the children are victims, and other people -- kidnap victims -- there's a whole range of victims who exist and who will suffer as a result of crimes that law enforcement, proceeding in the way that it does today, cannot solve as quickly as it might otherwise, okay? there are victims there. there are also substantial risks to society with respect to our, again, societal failure to build
a digital ecosystem that is secure. we don't have that. and we are more dependent on it than we've ever been in the past, and if we have a significant, catastrophic failure for a period of time, i'm worried about society's ability to function effectively, and i think people will be harmed, injured, even die if we have a failure like that. so with victims on one side and victims on the other, how do we sort this out, given the risk to the digital ecosystem from, you know, doing something that would interfere with the ability to have encrypted communications? congress needs to resolve that. the elected representatives of the people need to balance that, step up to the plate, pass a law and change the landscape -- or not. but i don't think it's up to the private sector to sort that out. companies in the united states, i'm quite confident, will follow the law, whatever it is.
congress needs to act. so far the government has failed to persuade congress to act, and i think that's where the focus should be. >> i agree with you 100% on that point. >> do we have any questions from the audience? we have a microphone. let's start off with -- all right. i don't think we have a mic. i'll start up here. sir? >> my name is steven. i'm a retired foreign service officer and i also served two tours. i just have a few comments -- >> now we have a microphone for you. >> i just have a few comments on what you folks said, and i would appreciate your reactions. first, my assumption is that the vast majority of the people in this room, if not everyone, is against exploitation of children. >> of course. >> and i think it's a red herring to use that, because the
law enforcement authorities were dealing with this sort of problem and a whole range of other problems long before we had the technology that we're talking about. so there are other techniques to deal with it. end-to-end encryption and all the other technologies we're talking about have very legitimate uses. they help protect dissidents in third world countries, businesses here, et cetera, et cetera. the technology exists; you can't make it disappear. if you forbid facebook from providing something, i'll be able to get it, other people will be able to get it, you know, one way or the other. so i think it's not really feasible to even do what you're trying to talk about. when the attorney general talks about having a back door, quite frankly, he's just showing that he doesn't understand the technology that's involved. and i've had conversations with
former cia directors, and they're also of the opinion that it's not possible to do what you're describing. so, again, thank you for your comments and i would appreciate hearing what you have to say. >> thank you. that does bring up a number of issues that i've got in mind, and that is -- experts in the field have said that if you put a back door into a system, regardless of how you approach it, there's room for abuse and there's room for breaking it. there's also the concern that what can be warranted can also be abused in terms of access. we've seen a number of cases where legal access has been abused in the past, and i understand they're not the majority, but they happen. given that, and given the weaknesses that you would introduce into a system, what is your response to
that? this is something that legislation has to decide. but from a mathematical perspective, there is no known way -- and a lot of people have tried -- to build a back door into things that allows for only warranted access. the only way this access would work is if there was a man-in-the-middle type of arrangement where everything flows through the service provider and you're given access through the service provider. and the service provider can be compromised. how do we deal with the laws of physics and mathematics in this? >> i'm not a cryptographer, but the people at the nsa think the solution is doable. bill gates has said this is not a question of ability, it's a question of will. a number of governments have
said, australia, united kingdom, united states, governments in europe, governments in other parts of the world have all said this is doable. >> two former nsa directors have said it isn't. >> let's talk about the systems that exist today, right? that exist today. facebook has a system today, right, that is not end-to-end encrypted. maybe it's not as safe as it will be with the end-to-end encryption. companies have made decisions for their own business reasons to maintain access to the information. if they can maintain access to sell advertising, why can't they do it? apple has a key where they can do the software updates for those phones. they have that key at apple. they have to protect that key. it would be a huge security problem if they lose that key. all we're asking is that there be a key that we can get at for law enforcement. >> are you looking at a solution similar to what australia has legislated, where a provider can make a modification to software against specific individuals to allow access to their accounts? >> we're looking at any solution that will allow us access. we're willing to have a discussion with the companies about what they think is most effective and addresses cyber security. we're investigating those same crimes. but we think this can be done consistent with cyber security. these are the most innovative companies in the world. i think it's not credible that they can't solve this. >> we have another question.
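the apple update-key arrangement described just above can be made concrete with a toy sketch. this is not apple's actual design: real vendors use asymmetric signatures, the stdlib hmac here is a stand-in, and every name in it is hypothetical.

```python
# A minimal sketch of the update-signing arrangement described above:
# the vendor holds one signing key, devices install only updates that
# verify against it, and anyone holding that key can push arbitrary
# software. Real vendors use asymmetric signatures; the stdlib HMAC
# here is a stand-in, and every name is hypothetical.
import hashlib
import hmac

VENDOR_SIGNING_KEY = b"held-only-by-the-vendor"  # hypothetical secret

def sign_update(update: bytes, key: bytes) -> bytes:
    """Produce a signature over an update package."""
    return hmac.new(key, update, hashlib.sha256).digest()

def device_installs(update: bytes, signature: bytes) -> bool:
    """A device installs an update only if the signature verifies."""
    expected = hmac.new(VENDOR_SIGNING_KEY, update, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# A properly signed update installs; a forged signature does not.
patch = b"patch-1.2.3"
assert device_installs(patch, sign_update(patch, VENDOR_SIGNING_KEY))
assert not device_installs(patch, b"\x00" * 32)

# But a stolen key signs malware just as convincingly, which is why
# losing the key, or adding a second law-enforcement-held key, widens
# the attack surface the panel is debating.
malware = b"spyware-payload"
assert device_installs(malware, sign_update(malware, VENDOR_SIGNING_KEY))
```

the point of the sketch is only that whoever holds the key controls what devices will install, which is why both sides treat key protection as the central risk.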
where's the microphone? there's several. start off in the back, i guess. >> last year we heard about a proposal for lawful access to group messaging, and even the doj didn't support that publicly. my question is really, why hasn't the doj put together a technical solution that they think would work? because absent that, i think a lot of the people in this room are debating something that is sort of a hypothetical. >> we've talked about different options. i've talked about a couple of them today. our position has been, we think all the companies have different platforms and services, some of them are device makers, some of them are making communication systems. they need to come up with their own methods that are most
consistent with their business needs and with their technology for providing the access that we want, as opposed to having a government top-down solution. that's our philosophy. >> can i jump in real quick? >> if you dig through the video archives you will find a clip of me saying, in the past, what brad has been saying. i understood the problem exactly the way that he's articulating it now. having spent years and years working on this, my understanding is also that there actually is no technical solution that adequately, in the sense of perfectly, protects cyber security and provides the government access. it just doesn't exist. real quick, yes, the companies have different systems where
they've made different choices: some use encryption and some don't, all these different things. but, again, the fact remains that there is no system that provides cyber security, that provides strong encryption, and provides the government with access. that thing does not exist. if you're going to introduce some cyber security risk into a system, then that's a call that congress has to make. i come back to that. they've got to legislate if they want that to happen, and they then, on behalf of society, take the risk that some bad person, some bad organization, some bad foreign government is going to figure out a way to disrupt all the communications that we think today are encrypted. when you change things in this way, they're no longer going to be effectively encrypted, society is going to bear that burden, and so congress has to make that call.
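the trade-off just described, end-to-end encryption versus provider-held access, can be sketched in a few lines. this is not real cryptography: a sha-256 keystream stands in for a real cipher, and all the names are illustrative assumptions.

```python
# Toy contrast between end-to-end encryption and a provider-escrow
# arrangement. Not real cryptography: a SHA-256 keystream stands in
# for a real cipher, and all names are illustrative assumptions.
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XOR with a SHA-256-derived keystream (demo only)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# End-to-end: only the two endpoints ever hold the key; the relay
# (and anyone who compromises the relay) sees only ciphertext.
endpoint_key = os.urandom(32)
ciphertext = keystream_xor(endpoint_key, b"meet at noon")
assert keystream_xor(endpoint_key, ciphertext) == b"meet at noon"

# Provider escrow: the provider keeps a copy of every session key, so
# a warrant can be served -- but whoever copies the key store can
# decrypt everything that flowed through it.
escrowed_keys = {"alice-bob": os.urandom(32)}
ct = keystream_xor(escrowed_keys["alice-bob"], b"meet at noon")
stolen = dict(escrowed_keys)  # attacker copies the provider's key store
assert keystream_xor(stolen["alice-bob"], ct) == b"meet at noon"
```

this is the structural point both sides circle all panel: any key held somewhere other than the endpoints is a single place an attacker can go.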
>> what's facebook done in this space as far as looking at alternatives? has there been an examination of ways to do the back door? >> technologists have said that it's simply not possible to build an encrypted system with exceptional access and not have there be a potentially very dangerous vulnerability that can be exploited by malicious actors. it's just not something that's possible, and so, you know, we haven't, to my knowledge, invested in trying to build any such system and we certainly won't be investing in building any such system in the future. >> another question up there and -- >> thank you very much. i work for defending rights & dissent, which is an organization that defends the right to political expression. the church committee cites the
fbi's conduct against our organization as an example of an abuse of power. i guess what's concerning to me is the chilling effect that putting in this law enforcement back door to encryption could have on free speech. up until two or three years ago, thanks to a supreme court ruling, the socialist workers party was exempt from certain fec disclosure requirements, on the basis that by disclosing the names of their donors, they would be making those donors potentially liable to law enforcement abuse, based on a real history of it. so given that there are instances throughout our history where the government has been the malicious actor, including when it was against my organization, do you worry that putting this law enforcement back door into encryption will have a chilling impact on speech or help to facilitate those
types of abuses? >> so my answer to that is, we have to depend on our laws and our federal courts. what we're talking about here is only court-authorized access. today with that court-authorized access we can wiretap your phone. we can search your home. we can search your car. we can do all of those things when an independent judge has decided that we have probable cause to believe that you've engaged in criminal activity. that's been our standard since the founding of this country, right, that we can do that. we have to be able to do that to protect people, to be able to search those people's homes, cars, et cetera. the question now is, we have new technology. should we be able to have that same ability, with court approval to protect our privacy, civil liberties, first amendment rights, et cetera, when there's a new space, or are we going to have a new space that's closed off
from that. it's like a house down the street that your kids can go to, and if your kid has disappeared, there's no way to get him back. you can't find that child. we're talking about a new technology: is it going to be immune from lawful access or not? >> would you like to respond to that? >> i think there's a distinction, because i think what you're talking about would be able to access communications if there was exceptional access. the problem with exceptional access is that a front door for the government is a back door for malicious actors. end-to-end encryption means only you and your intended recipients are able to see the communications, and there's just no secure way to build in that kind of exceptional access for the
government alone. >> that's where we disagree. we think companies have maintained that access for themselves. like i mentioned earlier, apple can send software to the phones, so why can't we have that access as a government? facebook has that access today, right? and the government has access, and that's been the case since facebook was created. it's not been a problem to date. >> only -- >> it's been a secure system to date, right? >> users are demanding more secure -- >> maybe. i'm just saying it hasn't been a problem. >> we've got three people down there, so we'll start in the middle and work our way over. >> hi everyone, thanks for being here. i'm from cyberscoop. i wonder if you could speak to the situation we have here. there's absolutely a difference between facebook and the
department of justice on this issue. can we interpret this as a prelude to possible legislation? does the department of justice have plans to use this moment to put silicon valley on notice that amendments may be coming down the pike? >> so that's a broader discussion. >> i'd like to follow up then with another question, in terms of chilling speech if end-to-end encryption is delayed or not allowed in a broader sense in silicon valley. what can you say about what this would do to the market, in terms of your ability to access end-to-end encrypted conversations, if they take them to other markets outside the u.s., for example? thank you. >> so we want to work with foreign governments so we have solutions that are not applicable only to u.s. companies but to their -- well --
>> i'm from the center of technology. question for you. you said there would be a court order and lawful process. is that really always the case? or, if exceptional access was built in, are you telling us that the nsa, for example, wouldn't be able to exploit that exceptional access that was given to the fbi -- that there would never be the use, for example, of section 702 or executive order 12333 to access communications through the exceptional process? >> we're talking about court-authorized access. >> the nsa in the past has worked to break other encryption for the purpose of surveillance, so that doesn't mean they would be excluded from using that capability to go after foreign
intelligence targets. >> that's what the nsa does, that's what we pay them for, to break encryption. what a lot of people argue is, why don't you just try to break into the system? that that's a better model than having lawful access. i'm not sure why that's safer, why anyone thinks it's safer for us to identify vulnerabilities and exploits in the system and not tell anybody about them. why is that better? why is that a better model? jim well knows that's what the fbi spends a lot of time doing, trying to find those vulnerabilities. why is that better for anyone? why is that better protection? we all know there is no perfect security. there are always ways to break in. i agree there's no perfect security. it's always a balance. the way i think about it is, look, we regulate in other contexts. for automobiles, we say, okay, you've got to have fuel emission standards.
we know for a certainty your car is going to be less safe than if you have a car that's big and heavy, and that can result in more fatalities and traffic injuries, right? we make a decision as a society that we have competing goals and we want to have emissions standards, clean air, a safer planet. so we're willing to accept that cost. i think it's the same trade-off we're talking about here. there is no perfect security, the same way there's no perfect car that can be immune from any car accident whatsoever. we make these judgments as a society. i agree, ultimately these are things that congress should be tackling and has not tackled over the last several decades. they should be. there shouldn't be decisions made unilaterally by the -- >> let members of congress -- >> i agree. >> let members of congress cast a vote when everybody is telling them the result of that vote will be less cybersecurity, less
security for the american people. let them cast that vote. let them associate their name with that. >> more security for the child exploitation victims, all those out there who are being victimized by online activities. >> maybe. the failures with respect to children exist today. the world that you're talking about, the horrible world that you're describing, exists today, and government has failed. >> we're able to save many of those children because of the access we have -- >> how many victims are there still? how many unknown victims are there? >> let's get another question from the audience while we still have time. >> hi. my question is also for brad. jim baker alluded to the issue earlier about going out there with numbers that might not be correct. in june of 2018, if i'm remembering the month correctly, it came out that doj and the fbi had been using an inaccurate figure on the number
of locked phones that it was unable to access. the number was 7,800, and the news accounts said the number might actually be closer to 1,000. subsequently doj said, this number is wrong, but we're working on it. i'm wondering if you can give us any update, or if doj is working to give more information, as you seek to have the conversation about the true extent of this problem, when you have cases that are thwarted by encryption preventing you from getting access to the phones. >> i don't have an update on that. you're right, the number was erroneous originally. it's still a large number. it might not be 7,800. and that's a piece of the pie. that's device encryption, and devices we have in law enforcement custody and so forth. to answer your question, whether we have a new updated number, i'd have to get back to you on that. we may have one, i don't personally
know. >> we have time for one or two more questions. >> can i just make a point about the damn numbers? if government wants to persuade congress to do something, it's got to do a better job of counting. i know how hard that is. it's very hard to do. but they've got to do a better job, otherwise they're not going to prevail. >> martin moulten, dclp. mr. baker, why in the world would we trust the u.s. government, the top terrorist on the planet, when the fbi has had information on sex trafficker jeffrey epstein for more than a decade and has done nothing to incarcerate or investigate the perpetrators who have exploited children and girls from all over the world and all over the country and from new york city public schools? >> i don't know the details about the epstein case, but my
understanding is it's still actively being investigated, that the u.s. attorney's office in the southern district of new york is working on it along with the fbi. i'm not in the government anymore, so i can't explain what's happening with that. i would tell you i could not disagree more with your original statement about the united states government being terrorists. that's preposterous. i don't go along with that. with respect to the other matters, you're going to have to ask the government about that. >> thank you. my understanding is -- without getting too technical, on the one hand you have people saying it's not feasible to have a back door. on the other hand you talk about google and end-to-end encryption. i know they have two different encryption methods. one is for data in transit, the
other is for data at rest, for example when you're uploading to the cloud. when there's a gap between switching from the one modality to the other, that's where google goes in to get data that they use for marketing purposes, et cetera. so my intuition is that, as a point of fact, the fbi does have access when it wants it, from the technical perspective. but the issue is that then the doj can't quite use that information, because it's sort of -- it's, you know, fruit of the poisonous tree, it's been improperly accessed. is this a legal matter or a technical matter? it sounds like it's more of a legal matter, going back to your congress point. but technically, even what we call end to end has numerous access points where entities, either malefactors or the fbi, for example, government entities,
can get in. that's really not the issue. so i just would be interested in your thoughts on that. >> to interject a little bit here: the point you're talking about is that when it's end to end, the end points are themselves a point of access, whether the end is the storage on one side or the other. in transit, that happens in software, right? it would depend upon the software and the provider. that means there has to be some sort of injection of logic in the software that picks up on the data as it's translated from receive to store. so that would be exploiting the software that the vendor provides, and i don't believe facebook is looking at doing that. there are a number of steps where that could happen. i asked about this a while ago: couldn't you, say, have something on the client side where the
receiver gets the message and you can process the image to see if it's harmful? and that's not going to happen, because it requires too much overhead in different places, and it also totally breaks the whole idea of end to end. it ends the privacy. so, but, you know, it is a good question in terms of whether that sort of surveillance is a solution from the department of justice's perspective. that's something that would have to be legislated to happen, because it requires a change in software. >> i'm not sure what the question is. >> she's asking, so, the way encryption over the wire works, it's encrypted in one form and decrypted when it's received. when it's stored, it's encrypted in a different way. with google, it's encrypted with user credentials. it's not encrypted in the
public/private key type of exchange that happens, or with the key that's used for the session. it's a totally different type of encryption in storage. she's asking about exploiting the gap between the two to get the information that's passing over and processing it for security purposes. >> i don't have the answer to that technical question. but i think the comment is a good point that i've seen on both sides of this debate. i've heard it from many of the companies: look, you have access to all these categories, you don't get this. you can get these other three things. isn't that good enough for the government? right. we flip that around and say, well, if we can get access to these other things, why can't we get access to this? they have made decisions to maintain lawful access. for all the things people have been talking about being impossible to do, they say, for very sensitive data on these other platforms, for our own business reasons we're going to maintain access to these systems, but not for these other ones. they argue, look, we're allowing you to have these other ones.
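the gap the questioner describes, between transit encryption and at-rest encryption, can be sketched as follows. this is a hedged toy model, not any real provider's design: the xor cipher and every name are assumptions, with a session key standing in for tls and a credential-derived key standing in for storage encryption.

```python
# Hedged sketch of the "gap" between encryption in transit and at rest:
# data encrypted under a per-session key is decrypted at the server,
# exists there briefly as plaintext, then is re-encrypted at rest under
# a key derived from the user's credentials. Toy cipher; all names are
# assumptions, and no real provider works exactly this way.
import hashlib
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Symmetric toy cipher: XOR with a SHA-256-derived keystream."""
    stream = b""
    block = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + block.to_bytes(4, "big")).digest()
        block += 1
    return bytes(d ^ s for d, s in zip(data, stream))

session_key = os.urandom(32)                    # transit key (TLS-like)
in_transit = xor_cipher(session_key, b"tax documents")

# The server terminates the transit encryption. Plaintext exists here,
# which is where scanning for ads, spam, or abuse imagery can happen,
# and which end-to-end encryption would remove.
plaintext_at_server = xor_cipher(session_key, in_transit)
assert plaintext_at_server == b"tax documents"

# Re-encrypt at rest under a credential-derived key -- a different
# scheme, as the moderator notes, from the session's key exchange.
at_rest_key = hashlib.pbkdf2_hmac("sha256", b"user-password", b"salt", 100_000)
at_rest = xor_cipher(at_rest_key, plaintext_at_server)
assert xor_cipher(at_rest_key, at_rest) == b"tax documents"
```

the server-side plaintext step is the access point being debated: it exists in a transit-plus-storage design and disappears in an end-to-end one.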
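the client-side check the moderator raised, processing a received image on the device to see if it's harmful, could be sketched as a hash lookup. deployed systems use perceptual hashes such as photodna rather than exact sha-256, so this exact-match version is an illustrative assumption, and the list entry is hypothetical.

```python
# Minimal sketch of on-device scanning: after decryption on the
# receiving device, hash each file and compare against known-bad
# hashes. Deployed systems use perceptual hashes (e.g. PhotoDNA), not
# exact SHA-256; this exact-match version is an illustrative assumption.
import hashlib

KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-harmful-file").hexdigest(),  # hypothetical entry
}

def flag_on_device(received: bytes) -> bool:
    """Return True if a received file matches a known-bad hash."""
    return hashlib.sha256(received).hexdigest() in KNOWN_BAD_HASHES

assert flag_on_device(b"example-harmful-file")
assert not flag_on_device(b"holiday photo")
```

as the moderator notes, even this endpoint-side check changes the end-to-end privacy model, since the device is inspecting content on someone else's behalf.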
we the government are saying, we'll turn that on its head: you're going to allow us access to these, why not? >> and -- >> because it exists today, in many systems. every company here represented, or otherwise, will tell you, yeah, there are plenty of systems today that are secure. they're going to continue to maintain access, but they're not going to go to end-to-end encryption. >> we've got about a minute left if you would like to respond to that in any way. >> it's just, it's apples and oranges, because sometimes -- and it's up to the user to decide how much risk they want to take on that the company or anybody else is going to look at their communications. if you're using an e-mail system where, when i send you an e-mail, it's encrypted while it's traveling, then when it gets to you it's unencrypted and also the company can look at it, we know that, and we can make a risk-based assessment about whether we want to communicate certain data over that and whether we trust the company. in certain circumstances, however, we don't want the company to do that. i want to send you a message and
i want it to be the case that only the two of us can read it. that's what we're talking about here with real end-to-end encryption. that's what it's all about. we make that assessment, and for whatever reason that's the risk that we want to take or don't want to take. and so yes, in certain circumstances the companies make business choices and the customers make choices, and they accept the risk or they don't, or whatever. that's the multifaceted world that we live in today. end-to-end encryption is out of the box. the cat is out of the bag, whatever. it's not going back in. we have to figure out how to deal with it. >> all we ask is that the public is let into the mix. >> last word? >> i would say it's not just business reasons, it's policy reasons. we care about our users' privacy and the security of their data, and making sure they can have sensitive communications in a way where they don't have to worry about it being exploited.
you know, there are many, many cybersecurity threats, whether it's stored data and the billions and billions of records that are, you know, the subject of data breaches every single year, or whether it's other forms of exploitation. what jim's saying is right. the world out there, when it comes to cybersecurity, is pretty dangerous. but, you know, you're also raising extremely important points about the importance of safety on our platforms. we're extremely committed to making sure that we get that balance right: that we provide strong end-to-end encryption and find other non-content-based ways to address the safety issues. because we're committed to safety, we're committed to continuing to be the industry leader in this space. and we really value and are appreciative of the important work law enforcement does to keep the public safe. we're going to be doing our part. >> i'd like to thank the three of you for this. we could go on for hours, i'm
sure, on this. i know many of you still have questions, but we'll have to take them off stage. thank you for coming, thanks for watching, and thank you for being here to talk about this very important topic. >> thank you. >> thank you again, and thank all of you, both here and at home, for tuning into the 2019 cato surveillance conference. usually i end these days horrified by the enormous range of ways that we're observed, but weirdly, as this one draws to a close, i'm somewhat more hopeful, thinking about, you know, the number of people who are thinking about how to ensure these powers are kept in their proper place -- that they work for us rather than being tools to be used against
us. i do want to thank, one more time, not only all our speakers, but our wonderful, wonderful conference manager kiana graham, who does all the actual hard work while i get to stand up here and look clever for having assembled all this. everything that makes this conference come together and work so smoothly is to kiana's credit. join me in applauding her. and then, rather than stretch out my closing remarks unnecessarily, i'm going to invite everyone -- for those at home, i'm sorry you couldn't attend in person -- to join us in the atrium for beer and wine and hors d'oeuvres. thank you again.