
Cato Institute 2018 Surveillance Conference, Part 2 | CSPAN | December 14, 2018, 9:36pm-11:10pm EST

9:36 pm
slightly different presentation. >> And we should be clear: the data you see on the website represents those employees of the Department of Justice who elected to make contributions. That is not necessarily an accurate reflection of the overall political worldview of Department of Justice employees as a whole, so when talking about samples we have to be very careful. With that we will conclude this panel and take a 15-minute break; please give our panelists a shout-out. [applause]
9:37 pm
[inaudible conversations] As I mentioned in my remarks, there are so many fascinating issues surrounding surveillance and technology that if we were to cover them all with panels like the one you just saw, the conference would last three weeks. Because even I have limits to my capacity for these issues, for the last couple of years we have been inviting scholars to present shorter flash
9:38 pm
talks that focus tightly on a single subject and present the analysis they have been doing, in a way that allows us to get a sense of the range of hard questions we face as citizens and policymakers. Our morning flash talks run from facial recognition and social media surveillance to the global war over encryption. Look to the website for more information; you will find links from the speakers' names to more extensive biographies. We will begin with the recently passed legislation in Australia giving law enforcement access to encrypted software and tools.
9:39 pm
>> I'm with the Open Technology Institute. If you had told me a year ago that I would be here today speaking about Australia, I would have thought you were joking, but I'm glad to have the opportunity to speak to you today about how this new law could allow the United States to look down under for an encryption backdoor.
9:40 pm
For those of you who may not be familiar with the long-standing encryption debate, it pits security against security. The US Justice Department and FBI have complained that the increasing use of encryption means they cannot access communications even with a court order. Companies now build encryption by default into their products and services, and the companies themselves do not have access to user communications. The Justice Department and FBI want to require tech companies to guarantee that the government has exceptional access, or what they now call "responsible encryption," so that they can always access even encrypted messages; otherwise, they say, they are hampered in their ability to keep Americans safe from terrorists and other criminals.
9:41 pm
But security researchers and tech companies have said this would amount to an encryption backdoor: there is no way to guarantee that only the US government would be able to use the access mechanism. Rather, this amounts to deliberately building vulnerabilities into products and services, undermining security for all. It would harm everyone's privacy and cybersecurity and leave all of us more vulnerable to criminal activity. In addition, as panelists at an event OTI hosted last month explained, encryption protects economic security and the personal safety and freedom of individuals, including victims of domestic violence. This debate has been going on for years and has now gone global, to
9:42 pm
Australia. This past August the Australian government released the exposure draft of the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill. Unlike the U.S. Congress, which takes months or years before passing anything at all, Australia's Parliament wrapped up consideration in four months. Following a public comment period on the exposure draft, a slightly modified version of the bill was introduced in Parliament and referred to a joint committee, which opened a new public comment period. My organization organized an international coalition with tech companies and trade associations, and we filed three rounds of public comments on the bill outlining concerns, which I will describe in a moment. The committee held a
9:43 pm
series of hearings, and at the beginning of last week it issued a report recommending passage of the bill with certain amendments incorporated. Then, just last Thursday, December 6, Parliament released an updated version with 173 amendments nobody had ever seen before, and by the end of the day it was passed into law. So what does the Australian law actually do? As one commentator put it, the law now means any IT product, hardware or software, made in Australia will be automatically too risky to use for anyone concerned about cybersecurity. We are focusing on Schedule 1 of the law, which was designed to address encryption; there are also other sections that
9:44 pm
increase the government's hacking powers, but we are focusing on encryption and Schedule 1. It opens with what appears to be an encouraging statement prohibiting the government from demanding an encryption backdoor, and I have it up on the slide. It says that the government may not request or require communications providers to implement or build a systemic weakness or systemic vulnerability, and must not prevent them from rectifying systemic vulnerabilities. However, the law grants unprecedented new authorities to the government that undermine that promise. Specifically, the law creates three new and powerful tools for the Australian government: technical assistance requests
9:45 pm
or notices, and technical capability notices. The requests are voluntary; the notices are mandatory. The difference between the two kinds of notices depends on which government official is authorized to issue them. All three authorize the government to demand any "listed act or thing," and that is a long list. It includes things like removing one or more forms of electronic protection that are or were applied by, or on behalf of, the provider, and it includes modifying, or facilitating the modification of, the characteristics of a service provided by the designated communications provider. In short, the government can demand that tech companies weaken the security features of their products. For example, the Australian
9:46 pm
government can now make the same demand of Apple that the FBI made in the San Bernardino shooter case: that it build a new system to circumvent the iPhone's security features. As Apple explained in that case, building such a software tool would have made that technique widely available, threatening the cybersecurity of other users. As we know, in the lawsuit here in the US, the United States government argued that under the obscure All Writs Act, which dates back to 1789, it was permitted to make this demand of Apple. Apple, supported by other tech companies and privacy advocates, said the demand was unconstitutional. The Justice Department withdrew it before the court could resolve the legal question, because the FBI was able to pay an outside vendor to hack into the phone. Now, in Australia, the government can make these demands.
9:47 pm
Another worrisome scenario is that Australia could use its authority in the same way the United Kingdom is looking to use its own. Just this month, the UK's GCHQ put out a proposal under which tech companies would be asked or required to add the government as a silent participant to end-to-end encrypted chats. They argue that you don't even have to touch the encryption to add the government as a ghost user. There are several other threats the new Australian law poses to encryption. In the coalition's comments, in addition to explaining those
9:48 pm
powers, we also addressed three other key concerns. First, the law lacks any requirement for prior independent review or adequate oversight. The authorities for technical capability notices were modeled on the UK's Investigatory Powers Act of 2016, a law that also raises human rights concerns, but Section 254 of that act at least requires that judicial commissioners review and approve technical capability notices before they may be issued. While one may have questions about the adequacy and independence of that review under the UK law, Australia's authority poses an even greater threat to individual rights because there is no provision requiring independent review at all. In addition, Australia has no bill of rights, so while there are procedures under which providers may
9:49 pm
challenge those orders, challenges will be more difficult: tech companies do not have the same legal arguments available to them to protect individual rights as they would in the UK or US. Second, the law requires undue secrecy. Although it permits statistical transparency reporting, it also includes very strict nondisclosure requirements whenever the government issues a request or notice. Violation of the secrecy rules is a criminal offense punishable by up to five years in prison, and there are no limits on the duration of a gag order, such as a requirement, as in the US, that the gag lift when the reason for it no longer exists. And third, the
9:50 pm
definition of covered communications provider is extremely broad, including anyone who provides an electronic service with one or more end users in Australia. That means anybody doing business in Australia providing electronic services is subject to government demands to weaken the security of products and services. So what does this mean for the United States? Australia's legislation is part of a coordinated effort of the Five Eyes alliance, the intelligence alliance of Australia, Canada, New Zealand, the United Kingdom, and the United States that dates back to World War II. Since 2013 these countries have also held a Five Country Ministerial, a convening on strategy and information sharing for national security and law enforcement. Over the past two years they have focused on strategy and policy
9:51 pm
to weaken encryption. Just this past August, the five countries released a statement on access to evidence and encryption, which warned that if governments continue to encounter impediments to lawful access to encrypted communications, they may pursue legislative mandates. The very same month that statement came out, Australia released the exposure draft of this encryption bill. So now Australia's law provides the United States and other governments with a pathway to an encryption backdoor. Australia now has the authority to compel creation of a backdoor, and once providers are required to build a weakness in, other governments can exploit it. We already talked about Apple: if
9:52 pm
Australia issued a technical capability notice requiring Apple to circumvent the iPhone's security features, which is what the FBI sought in San Bernardino, and Apple complied, then Apple could no longer argue that it lacks the capability in similar cases. Similarly, if Australia required a messaging service to be re-engineered to be accessible to its legal system, those communications would be vulnerable to other governments as well. And finally, there is a risk the US government could point to Australia's law as a new model for "responsible encryption" legislation. So whether as a pathway or as a model, the law creates a threat to privacy that goes well beyond Australia's borders. Thank you. [applause]
9:53 pm
>> Thank you. Next: the French philosopher Michel Foucault is known for his analysis of the connection between surveillance and disciplinary training in his book usually translated as Discipline and Punish. Very naturally, close monitoring is always a key component of training in compliance, down to every little part of behavior, but it also
9:54 pm
means we need to worry about whether we are training children for compliance with surveillance. As the technological capability to monitor them more closely becomes a reality in widespread use, I often wonder whether we are preparing children to accept as normal a world in which everything they do is closely scrutinized. So: our next flash talk, on social media surveillance of students. >> Thank you so much. That is a perfect introduction, and I will be coming back to exactly that point when I end my presentation. I am senior counsel for the national security program at the Brennan Center for Justice, and I will talk today about social
9:55 pm
media surveillance. To start off, consider for a moment the deep saturation kids have online. According to a recent internet study, 97 percent of 13- to 17-year-olds are on at least one major social media platform, 95 percent have access to a smartphone, and 45 percent say they are online almost constantly. So there is clearly a lot of content out there and a lot of time that teens and younger kids are spending online. With that social media presence comes social media monitoring, and these tools are sold for a variety of purposes: to prevent bullying, school shootings, potential suicides, and other online threats. Not surprisingly,
9:56 pm
money is also being spent. This slide shows spending by public schools nationwide on nine major social media monitoring companies. You can see there are some spikes, some mountains and valleys, but overall it is a massive increase in spending, from 2010 through a spike in 2015 and then a big spike in the summer of 2018, potentially driven by the shooting in Parkland, Florida. Public school districts are spending more and more money on automated social media monitoring tools. This is similarly reflected in keyword searches of contracts between public schools and social media monitoring companies, again showing spikes over the last several years, with a significant decrease and then a
9:57 pm
major increase in 2018. Public money is being spent. Based on these statistics you might think schools are getting more dangerous, but in fact the opposite is true: they are actually getting safer. While it is true that this country has a uniquely high rate of school shootings among developed countries, and a single shooting is one too many, the overall crime decline in this country holds true in schools as well. The odds of a K-12 student being shot and killed at a public school are about one in 614 million; by way of comparison, the odds of dying from choking are about one in 3,400. In 1985, 10 percent of students aged 12 to 18 reported being the victim of a crime at school in the previous six months; in the 2015-2016 school year, it was just
9:58 pm
3 percent. So over that 30-year period it went from 10 percent down to 3 percent, which is a major decrease. And in general, less than 3 percent of youth homicides and less than 1 percent of youth suicides occur at school. Of course, social media monitoring aims to pick up threats off school grounds as well, but by any measure school is a pretty safe place. Now, the one state in the country that has legislated social media monitoring is Florida, after the shooting last February in which Nikolas Cruz shot and killed 17 students and staff members and injured 17 others. In the wake of that shooting, the Florida legislature passed a law that included the creation of an Office of Safe Schools within the state Department of Education. That office is required to coordinate with the Florida
9:59 pm
Department of Law Enforcement to maintain a centralized database. The law also established a public safety commission, which recently recommended the development of protocols for social media monitoring; legislation is likely to pass in the new year. As it turns out, Nikolas Cruz had been reported to the FBI and local police at least three times for disturbing posts. One call warned he could become a school shooter; another flagged a YouTube post in which a user with his name said he wanted to become "a professional school shooter," although he was not identified until after the shooting. So while there were warning signs on social media, it
10:00 pm
wasn't the case that the authorities were flying blind. They could see the warning signals and were trying to act on them. If the system failed those students, it wasn't a failure to see the posts: reporting that came out in August showed the district itself had failed at nearly every turn to provide him with the support services he needed. Still, you might ask: if monitoring could catch one future Nikolas Cruz, one future suicidal student, why not do it? If the stakes are that high, what's the harm? But there are reasons to be cautious about this kind of monitoring. The first is a real concern about the accuracy of social media monitoring tools.
10:01 pm
This plays out in a couple of different ways. One way these tools can be inaccurate is through overreach: they are likely to pull in more information than is going to be useful. By way of example, police in Jacksonville, Florida set up a social media monitoring tool with keywords that might indicate a risk of criminal activity. One of the words they chose was "bomb," thinking that if there were a bomb threat, the tool would turn it up. As it turned out, there were no bomb threats flagged online; instead the department was inundated with posts about things like pizza that was "the bomb." So things were coming in that were of little use. The second issue is underreach, by which I mean that the kinds of risks social media monitoring tools would like to find often aren't going to
10:02 pm
appear online at all. I mentioned earlier that Nikolas Cruz had posted about his intentions and people had reported him, so it's not clear what the extra value of monitoring software would have been. And as it turns out, to some extent he was the exception. The Brennan Center did a survey of major school shootings, and unfortunately that is a meaningful category: major school shootings since the Sandy Hook shooting in 2012. There was only one other perpetrator who, according to the public reporting, had put up social media postings, and that was Adam Lanza, the shooter in Newtown. He had posted in discussion forums about the Columbine shooting and had created Tumblr accounts that were named after school shooters. These were not a secret, and fellow users were able to see them, and though they may not have been able to know at the
10:03 pm
moment what they meant, it's hard to imagine that today these wouldn't have been reported to authorities. In fact, with Nikolas Cruz that is exactly what happened: individual concerned users reported it. The online profiles of the perpetrators of other major school shootings, which usually get a lot of reporting after the fact, don't show anything that would flag them for an automated tool. The perpetrator of a 2014 shooting in Troutdale, Oregon had a Facebook page showing that he liked military-themed games and first-person shooter games; he also liked knife and gun pages. Sure, these seem like warning signs, but in fact the official Facebook page for Call of Duty: WWII has nearly 24 million followers, and the Remington Arms Facebook page has over 1.3 million likes.
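The overreach problem in the Jacksonville anecdote above is easy to reproduce with a toy keyword filter. Everything in this sketch (the posts, the watchlist) is invented for illustration, not drawn from any real monitoring product, but it shows how naive keyword matching floods analysts with false positives:

```python
import re

# Hypothetical watchlist, modeled on the Jacksonville example.
WATCHED_KEYWORDS = {"bomb"}

# Invented sample posts; only illustrative.
posts = [
    "that pizza place downtown is the bomb",
    "my chem final was a bomb, total disaster",
    "photo dump from the beach this weekend",
    "this mixtape is the bomb dot com",
]

def flagged(post: str) -> bool:
    """Flag any post containing a watched keyword, with no sense of context."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    return not WATCHED_KEYWORDS.isdisjoint(words)

hits = [p for p in posts if flagged(p)]
print(len(hits))  # 3 of 4 posts flagged; none is an actual threat
```

Every hit here is slang or idiom, which is exactly the signal-to-noise problem the talk describes: the filter cannot distinguish a compliment about pizza from a threat.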
10:04 pm
So sending up a red flag about every single person who enjoys these pastimes would create a huge quantity of noise for very little signal. And finally, automated social media monitoring tools just have built-in shortcomings. I'll flag a terrific report from the Center for Democracy and Technology called Mixed Messages, which collects a lot of research on this. As their research shows, automated monitoring tools generally work best when the posts are in English and the tool is looking for something concrete. They can be easily fooled by lingo, slang, pop culture references, things like that. Maybe the best example comes from the 2015 trial of the Boston Marathon bomber. During the trial, the FBI produced as evidence several quotes from his Twitter account to try to show that he himself was an extremist and not just following his brother's orders. For instance, he had tweeted
10:05 pm
a quote that said "I shall die young," which maybe was suggesting something about his intent, but it was a quote from a Russian pop song, and he had linked to the pop song in the tweet. The agent hadn't bothered clicking on the link to see that it was a song lyric. Other quotes turned out to be from South Park episodes and Jay-Z songs. Social media is incredibly contextual, and neither automated tools nor human analysts are that great at parsing out that context. The second major concern is the risk of discrimination, and this comes in two forms. The first is that the keywords the tools are set to flag will themselves be discriminatory. An ACLU report found that when the Boston Police Department set up a social media monitoring tool, the hashtags it was flagging
10:06 pm
included Black Lives Matter, Ferguson, Muslim Lives Matter, and ummah, the Arabic word for community. These are not signs of a public safety threat. These tools are only as good as the people using them, and there are a lot of ways to use them to further a discriminatory mindset. The second form is the risk of discriminatory enforcement. There is going to be a huge amount of discretion in what is done with the results, including which students are brought in, punished, and subjected to criminal justice consequences. We already know that students of color at every level of schooling experience harsher discipline than white students, even for the same infractions, and even when they commit infractions at lower rates than white students. So there is a concern that social media monitoring could contribute to the school to
10:07 pm
prison pipeline. You may remember Ahmed Mohamed, the Muslim teenager who brought a homemade clock to his Dallas-area high school and was then arrested on the suspicion that it concealed a bomb. He was well known for bringing in electronics and fixing other people's electronics, and he had told his teachers and the principal repeatedly that it was in fact a clock. It raises suspicions that the scrutiny he was put under, and his ultimate arrest, were essentially grounded in Islamophobia. On the social media front, an Alabama school district paid a former FBI agent to go through students' social media accounts on the basis of anonymous tips. The district expelled a dozen students on the basis of what he found online; 86 percent of the students expelled were black, even though black students made up only 40 percent of the student body. Not surprisingly, when people are mistakenly identified as
10:08 pm
posing a threat because of their social media posts, the consequences can be serious. One Connecticut teenager posted on Snapchat a picture of a toy airsoft gun that resembled a real rifle; he said he thought it was awesome and knew his friends would think so too. Another student saw the post, was worried about it, and reported it to school officials. That does not strike me as a crazy thing to do. Although, as the student noted, if the officials had googled the name on the side of the gun or the manufacturer, they would have seen it was a toy gun, even though it did bear a resemblance to a real one. Instead of discussing it with him, or having a conversation about thinking before you post, officials not only suspended him for the day; he was arrested for breach of peace, a misdemeanor offense. Now, because it is so hard to
10:09 pm
reliably pinpoint social media posts that indicate some kind of live threat, monitoring companies have a kind of perverse incentive: an incentive to sweep up everything, so they can assure their customers that they'll spot that needle in the haystack. At the same time, they have very little reliable way of gauging their effectiveness. A 2015 investigation by the Christian Science Monitor revealed that none of the three major school social media monitoring companies it looked into had firm metrics for measuring effectiveness. At least one said, basically, we know we succeeded when we get a call from a school saying something we sent them was interesting. So this is really a perfect storm for a mindset of more, more, more. At the same time, parents and students often know very little about these tools. Research shows that while social media monitoring companies may assume that students are
10:10 pm
assenting to being tracked by virtue of posting on public sites, students more often believe that companies are prohibited from sharing personal information with third parties. So there is a real lack of information about how these programs operate, or rather, asymmetrical information. And finally, and this goes to Julian's point at the beginning, it's worth thinking about what it means for students to be under constant surveillance online. As a practical matter they may stop posting, or start posting less, or post in private forums, which will blunt any effectiveness these tools would have had. Maybe more concerningly, it teaches students to expect surveillance, and even to anticipate an authority figure's opinion and react accordingly. Some of this you could say is good digital hygiene; I think we all know that when we post something publicly we need to think
10:11 pm
about what that looks like, who might see it now, and who might see it in the future. But it's not clear it's healthy for students who are learning about a citizen's role in a democracy to know they are under that surveillance all the time and to be acting accordingly. So what does this all mean? At the very least, before a school or a school district rolls out a social media monitoring program, it's incumbent on officials to weigh the costs and benefits and to involve parents and students in a frank discussion of what it means. And if they decide not to set forth on a monitoring program, they should remember they are most likely not going dark: there are a lot of concerned people out there who will spot posts and report them. Thank you so much. [applause] >> Thanks so much, Rachel.
10:12 pm
There is a science fiction writer who has a more optimistic view of this: we are training our children to develop counterintelligence tradecraft just to be able to have a normal childhood, so the next generation will be very sophisticated about evading surveillance. I suppose we'll find out. Next, two talks that focus on privacy in public: the myriad ways that, just walking down an ordinary city street, we are being observed in ways we may not recognize, and also the ways existing networks of surveillance, like closed-circuit cameras, can be transformed in fairly deep ways by existing infrastructure becoming a platform for new methods of monitoring. The first of these is going to be an examination of camera
10:13 pm
networks for facial recognition surveillance, from Jake Laperruque of the Project On Government Oversight. >> Thank you so much for having me here. I'm Jake Laperruque, senior counsel at POGO, and I'm excited to be talking about facial recognition, and about a specific aspect of it: how camera networks of various kinds can empower and grow facial recognition surveillance into dragnets. As a quick start about facial recognition surveillance itself: this is no longer a sci-fi technology or Minority Report set in the future.
10:14 pm
It is happening now. The FBI conducts over 4,000 facial recognition searches a month. A quarter of all state and local police departments have the ability to run facial recognition scans. Customs and Border Protection uses facial recognition for outgoing international flights, plans to apply it at seaports and land ports across the country, and ICE is looking to buy facial recognition technology as well. It is a very real and live surveillance threat. Facial recognition depends on three key factors to be a powerful force for surveillance. First, you need a database of photos that are identified with people; the FBI can already search photos of about half of all American adults. Second, you need very powerful software that can scan across hundreds of millions of photos and scan faces rapidly; lots of companies are developing the
10:15 pm
technology, as are other companies and the government. And third, you need a network of cameras you can tap into and use to see people's faces everywhere, all the time. There are four areas with the potential to build out these camera networks: first, government surveillance cameras, CCTV; second, police body cameras; third, privately owned security cameras; and last, social media photo databases. So let's start with government CCTV programs. About a decade ago, then-Chicago mayor Richard Daley said he expected that one day we would have a police camera basically on every corner. I want you to keep that quote in mind as we talk more about CCTV in American cities, but first let's go to where we truly have a CCTV photo dragnet, and where it seems big brother status has been achieved, and that is China. China has by far the most powerful network of government
10:16 pm
surveillance cameras in the world. The country has an estimated 200 million government-run surveillance cameras, and the effects of this are profound. If you look at its cities, these networks are incredibly dense and incredibly powerful. For example, Beijing maintains over 46,000 CCTV cameras that blanket the city. State media and police in Beijing boast that this network gives them 100 percent coverage of the city and lets them see everything going on all the time. This can have powerful consequences for facial recognition. For example, a BBC reporter recently asked to test the system. He went to a city of 3.5 million people, gave his photo to the government to input into the system, and asked them to find him. Using their cameras and automatic facial recognition software, the system found him in 7 minutes, in a city of 3.5 million people.
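Under the hood, a search like the one the BBC reporter triggered reduces to comparing a "probe" face embedding, extracted from a camera frame, against a database of enrolled embeddings and returning the closest match above a threshold. The sketch below is a minimal illustration of that matching step only; the names, vectors, and threshold are all invented (real systems use learned embeddings with hundreds of dimensions and vendor-specific scoring), so treat it as a conceptual toy, not any actual system's algorithm:

```python
import math

# Hypothetical enrolled database: identity -> face embedding (toy 4-D vectors).
database = {
    "person_a": [0.9, 0.1, 0.3, 0.7],
    "person_b": [0.2, 0.8, 0.9, 0.1],
    "reporter": [0.1, 0.9, 0.2, 0.6],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def identify(probe, db, threshold=0.95):
    """Return the enrolled identity most similar to the probe, or None if no match is close enough."""
    best_id, best_sim = max(
        ((name, cosine_similarity(probe, emb)) for name, emb in db.items()),
        key=lambda pair: pair[1],
    )
    return best_id if best_sim >= threshold else None

# A camera frame yields an embedding close to the reporter's enrolled photo.
probe = [0.12, 0.88, 0.22, 0.58]
print(identify(probe, database))  # → reporter
```

The surveillance power comes from running this comparison continuously, against every face every camera sees, which is why the size and density of the camera network matters so much.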
10:17 pm
So that is surveillance cameras at their peak. CCTV exists in America to a strong degree as well, instituted in large cities such as New York, Chicago, Washington, and Los Angeles. In New York there is a CCTV network hub called the Domain Awareness System. The way it works is that all the cameras are networked into a centralized hub, where they can be subject to real-time viewing, analysis, and other tools; facial recognition could become one of those tools in the future. Oakland considered building its own domain awareness hub, which would have linked cameras used by government all across the city, involving everything from the port authority to police cars to cameras outside schools. Smaller cities such as St. Louis and New Orleans have mass CCTV networks and centralized hubs used to watch them. But the city with the largest network by far is Chicago.
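The per-square-mile comparison between Chicago and Beijing works out as simple division. The camera counts (30,000 and 46,000) are the figures cited in this talk; the land areas below are my own rough approximations, not numbers the speaker gives, so the densities are back-of-the-envelope only:

```python
# Back-of-the-envelope camera density check.
# Camera counts are from the talk; land areas are assumed approximations.
chicago_cameras, chicago_sq_mi = 30_000, 234     # ~234 sq mi of land (assumed)
beijing_cameras, beijing_sq_mi = 46_000, 6_336   # whole municipality (assumed)

chicago_density = chicago_cameras / chicago_sq_mi
beijing_density = beijing_cameras / beijing_sq_mi

print(round(chicago_density))  # ~128 cameras per square mile
print(round(beijing_density))  # far lower per square mile, despite more cameras overall
```

The point of the arithmetic is that a smaller absolute number of cameras can still produce a denser, and in that sense more intense, dragnet when packed into a smaller area.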
10:18 pm
Chicago is the closest to achieving big brother status in America. Right now Chicago maintains a police surveillance network of over 30,000 total cameras in the city. In some ways this surpasses the level of surveillance dragnet you'll see in China: although 30,000 cameras in Chicago is less than the total of 46,000 in Beijing, if you look at area density, Chicago's 128 cameras per square mile is far higher than the Beijing dragnet that boasts 100 percent coverage of the population. This can have really powerful effects for facial recognition, and it's starting to in America. We're seeing this in Orlando, which is working on a pilot program with Amazon's real-time Rekognition system. The way the system works is that you have cameras scanning throughout the city, and they will try to scan people's faces and identify
10:19 pm
them and flag any persons of interest, whatever "persons of interest" means; we're not sure. So that is government CCTV. Next I want to look at police body cameras. This is the area of greatest risk in terms of establishing video surveillance dragnets in the United States, and the simple reason is that body cameras are becoming incredibly popular in American police departments. Axon, America's largest body camera producer, already has systems in half of America's largest cities. This is a huge advantage because they offer cameras to police departments for free, so that departments then use Axon's video storage system. Studies of police departments from recent years indicate that 97 percent of the largest police departments in America either have body camera programs in place, are in pilot and testing stages, or, if they don't have them yet, are planning to
10:20 pm
build them in the future. So this is going to be a universal phenomenon: police wearing body cameras will be a common thing we see on the streets as beat cops walk by. Why is this a big deal for the proliferation of government surveillance cameras? Because cities have lots of police in them. On average, localities have between 16 and 24 police officers per 10,000 residents. In big cities that number gets higher: plenty of cities have 40 officers for every 10,000 residents or more, and DC is over 50. If you look at area density, some cities are packed with police officers. Ten different cities have over 20 police officers per square mile, and topping the list is New York City, which has well over 100 police officers for every square mile. Now, in terms of facial recognition, we have actually seen a little bit of progress here. Axon recently backtracked on a
10:21 pm
long-term plan to put facial recognition in its body cameras. They acknowledged that this technology is in a lot of ways flawed and very prone to misidentification, so they scrapped plans, which might have taken effect as soon as this year, to put facial recognition in their systems. But not all vendors are taking that cautious approach. Some are charging ahead with facial recognition in body cameras, and it's only a matter of time until companies like Axon are satisfied that it's good enough for their work and begin to institute it. After all, an Axon VP described their interest a couple of years ago by saying that by putting facial recognition in body cameras, one day every cop in America would be "RoboCop." This is very worrying because while virtually all police departments are charging ahead with police body cameras, very few are setting rules and standards for facial recognition. According to a scorecard on body cameras maintained by the Leadership Conference, basically
10:22 pm
no cities that pretty body camera programs have official rules on facial recognition. and that is many, many cities that are not acting with appropriate standards. so that's police body cameras. next i want to talk about private surveillance camerases and the capacity to build network surveillance cameras from them. now cooperating private cameras is a r similar to cctv and a way government could build out video surveillance networks but do so with very little work without the infrastructure, and at a fraction of the cost. we may not have the 200 million surveillance cameras that china does a but we do have 30 million privately owned security cameras throughout the country. so to tap into these instead of building your own cameras it's no surprise government may want to turn this into this. otherwise the quist aside a couple of cameras are amazon's
ring doorbell, a video doorbell system. just last night news broke that amazon had patented a technology to build facial recognition into those doorbells, connect them to police networks, and notify police when anyone suspicious came up. so, another fun innovation from amazon. now, police departments are not just thinking about this idea; they are proactively soliciting people with private cameras, asking them to register and to engage in formal agreements whereby those cameras can be accessed and readily used by law enforcement in video surveillance networks. i mentioned new york before, and the domain awareness system they have there that allows realtime streaming of video cameras. of the 6,000 cameras connected to new york's network, two-thirds are privately owned cameras covered by agreements that allow the new york police department to access and use them. washington, d.c. and a lot of other cities offer incentives to try to get people to hook up their cameras to police networks. so here is mayor bowser saying: purchase security cameras and connect them to our networks, and we will pay you to do it. excellent use of emojis, mayor bowser. that is privately owned cameras again: very similar to government cctv, a network of fixed cameras that can provide a video dragnet, except it can be co-opted rather than built. and that makes it a simpler risk, because we can't stop the government in its tracks by stopping construction; the cameras are already there, and we just have to worry about law enforcement tapping into them. last, i want to talk about social media photos. this is a different thing: we're not talking about cameras taking images, but images that have already been stockpiled. social media photos are potentially the greatest risk in
terms of a photo dragnet that could be used or co-opted by government for facial recognition, and that's because of the sheer size of the photo databases. we've already seen facial recognition used on social media to a limited degree. a few years ago the firm geofeedia got caught, and admitted, having run social media photos through facial recognition technology during protests in baltimore to find individuals with outstanding warrants and directly arrest and remove them from the crowd. now, luckily, when this came out as a product of aclu research, the companies responded properly and blocked and shut down geofeedia's access to their services. it's really important that social media companies continue to be vigilant on this front and limit their apis to prevent photo databases from becoming a means of government facial recognition surveillance. and i think it's important that companies think not only about data harvesting through apis, but about court orders evading those limits. we've seen similar means in the recent past. for example, a couple of years ago yahoo received and complied with a court order asking that it scan all email content in its databases for specific bits of content the government was looking for. it's not hard to imagine the government coming with a similar court order to someone who maintains photo databases and asking for a mass scan to find very particular faceprints. so we have google talking about surveillance transparency, and facebook talking about it; these companies maintain very large photo databases. google has over 200 million users storing photos in its cloud service, over 24 billion selfies, and billions of photos uploaded every day. it would be great if these
companies built out their transparency reports to think about possibly including a warrant canary for facial recognition, so that if the government ever does come with a broad, excessive order saying we want to start scanning all your photos with facial recognition, we will get the heads-up and be able to start acting. and with that, i want to conclude by talking about what actions we can take if we start to see these activities, and how we should respond. first of all, there's a lot of potential at the local level. oakland had a proposed domain awareness center that would have connected all their cameras into a hub. this was a success story: oakland activists, when they found out about it, got organized, got mad, and got it shut down. that's the sort of thing we can see in other cities if we take action. and i want to give a shout-out to a great program, the ccops campaign. this is an effort to improve transparency and limit surveillance in cities all across the country, and i'm sure it will do great work to limit advanced surveillance tools like facial recognition being built into video surveillance cameras. on the federal level, we have potential in terms of limiting and conditioning funds. we talked a little bit about government cctv; a lot of funds for local government cctv networks don't come from those localities, they come from the federal government. doj funds cctv through police grants very often. for example, orlando, which is running a cctv real-time facial recognition network, received funds from the department of justice. it would be great if, in the future, when doj hands out funds for video surveillance activities, it said: you cannot use this for facial recognition. dhs funds surveillance cameras for cities to a large degree as well, and this is another opportunity where setting strict rules and limits could be a very quick way of stopping these cameras from turning into a scanning and facial tracking network. the justice department issues grants in the tens of millions of dollars for police body cameras, but we see virtually no departments putting in good rules for facial recognition on body cameras. it would be a fast improvement if, when doj was handing out grants, they said: you need to put in effective rules, guidelines, and limits to protect privacy before we give you the money. so those are some actions we can take, and i think it is important we take them now, because we are very quickly approaching the point where, on a daily basis, we are all going to be like that bbc reporter: tracked down through an automated computer system, monitored by a million little eyes. thank you so much. you can read more at pogo.org, and i'm looking
forward to the rest of the conference. [applause] >> so the classic feature of surveillance that makes it a mechanism of power is that it is unequal. in jeremy bentham's panopticon, the ultimate surveillance prison, the prisoners knew they were under potential observation: they could be seen, but they could not see the viewers. so when it comes to public networks of cameras, one of the most effective things we can do to encourage people to react to the changes happening around them is to make them aware of those changes. so i'm really fascinated by a tool the electronic frontier foundation has developed to try to help people recognize the ways in which surveillance in public is exploding around us. i'd like to invite dave maass.
>> thank you for having me today. my name is dave maass and i'm with the electronic frontier foundation. if you're not familiar with us, we're based in san francisco, we've been around since 1990, and we exist to make sure our rights and liberties continue to exist as societies use technology. i particularly work on eff's street level surveillance project, which aims to ensure there is transparency, regulation, and public awareness of the various technologies that law enforcement is deploying in our communities. a lot of times that work looks like filing public records requests. so, for example, with license plate readers, eff teamed up with a partner organization to file hundreds of public records requests around the country to find out how law enforcement agencies were sharing license plate reader data among themselves. or take drones: we'll file a public records request for mission log reports on how uc berkeley police used drones to surveil protesters in 2017. or we'll file a public records request with the san francisco district attorney's office to get a spreadsheet of every camera in their database, similar to what jake was talking about. and this is all a problem because too often our work looks like this: us chucking public records at people and saying, here you go. here are documents on documentcloud, or here's a white paper we wrote, or a 3,000-word blog post. or even worse, it's me standing in front of you doing a powerpoint presentation, and if we're lucky i have a funny cartoon to go with it. i don't have one today, so i had to use this one. really, our work should look like this to the public: contextualized within their own communities. if i could, i would run a walking tour company where i could take people around and show them the surveillance technology around them. but i'm a very busy person, and i don't know that doing tour groups of six or seven people is a good way to get the message across. maybe, however, this concept can transfer over to virtual reality. if we take a step back and look at virtual reality and law enforcement technology, police are already working on virtual reality. there's a company out of georgia called motion reality that has a warehouse-sized space where police officers put on virtual reality helmets, are given realistic-feeling fake electronic firearms, and they're
wired up head to toe, and they go run scenarios and can replay them back. one of my favorite things about this is that they're covered in electrodes, so if they're shot, they get shocked and immobilized in that part of their body. there is a company that has taken one of these and modified it to work for field sobriety tests: the whole flashlight test would happen within a vr visor. and on the surveillance side there's something called bounce imaging. it's a little ball covered with cameras; a s.w.a.t. team officer might chuck it into a hostage situation, and then somebody could sit outside in virtual reality, looking around before they go in, while it records a 360-degree view of everything going on. so then i looked at what we can do on the other side with vr. i'm going to give you some background. this is one of our founders, john perry barlow, both a lyricist for the grateful dead and a digital pioneer. in 1990 he wrote an essay after he had gone and visited the early vr companies; he thought it was a psychedelic experience. of course, he thought a lot of things were, because i think he was on psychedelics a big chunk of the time. now we're going to jump 25 years, because a lot happened since then. in 2015 we saw vr start to move toward the mass commercial market: the oculus rift, the htc vive, and the playstation vr all came out by early 2016. for our organization there were two big questions we were looking at. first, what are the digital rights implications of virtual reality for our society? and second, what is the potential of virtual reality as an
advocacy tool and an educational tool? let's start with the privacy element. the intercept had a great piece in 2016 hypothesizing that virtual reality might enable the most nefarious kind of surveillance on the internet yet, and i tend to agree. it voiced a lot of concerns we had been talking about amongst ourselves but hadn't seen floated publicly yet. the reason is biometrics: virtual reality tends to rely on our physical characteristics in order to function. on a very basic level, that is how your head is moving, the distance between your hands and your head, how long your arms are, whether you're left-handed or right-handed. but even something as simple as how your head is moving in virtual reality can be correlated to mental health conditions. more advanced technology is starting to involve devices that measure your breath, track your eyes, or map out facial expressions, and that's a whole other world of it. and one of the creepy things is companies, in order to gather reactive biometrics, throwing stimuli at you in a fairly quiet manner without saying why, so they can find something measurable in how you respond. i'm not going to get into augmented reality much, but a lot of those devices are scanning the world around you in order to produce content. there was a research study from pluto vr that found 90% of vr users are taking steps to protect their privacy, whether that is adjusting their facebook settings or using an ad blocker. and while three-quarters of users said it was okay for companies to use their
biometric data for product development, they were against it being sold to other entities. now, as far as vr as an advocacy tool, we're not the first to try this. planned parenthood has an experience called "across the line" that puts people in the position of a woman trying to seek reproductive health services while angry protesters are there. peta has a virtual reality experience that lets you step inside a factory farming situation: what's it like to be a calf or a chicken at a farm. and some groups worked with the united nations to do virtual reality visualizations of data on air pollution, and they ran a bunch of un delegates through it in nairobi. so that brings us to eff's spot the surveillance project. this is, at its base, a virtual reality experience that uses a very basic simulation to teach people about the various spying technologies that police may deploy in their communities. when we were pursuing this, we had considerations: we wanted it to be a meaningful advocacy experience; we wanted not to collect biometric information; and, as an organization that supports open source and accessibility of technology, we wanted to make sure it worked on multiple platforms, not just the oculus or vive stores. and we needed it to function on a modest budget; we are a non-profit, and we are not sony. by a meaningful advocacy experience, i mean we didn't want to rely on the novelty factor of vr. you can take anything and put it in vr, and if it's somebody's first time using vr, they'll say it's amazing, regardless of what it is. we wanted to make sure ours was
presented in a way that only vr could allow. and we didn't want people to just be watching a movie; we wanted them to be doing something, to be challenged by it, and to learn information that, even though they were experiencing it in a virtual world, they would carry back to the real world. once you put the headset on, you're placed in a street scene in the western addition neighborhood of san francisco, where there is a police encounter going on between a young citizen and two officers. as you look around and find a piece of surveillance technology, you get a pop-up and a voice-over explaining what it is. it's not about how quickly you can go through it and score points; it is supposed to be an educational tool. we had four goals. one was: can we do virtual reality, can we do the experience cheaply, and if we can do it the first time, can we do other things down the road.
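the look-around-and-discover mechanic described here, where gazing at an object triggers a pop-up and voice-over, is a standard vr interaction pattern. a minimal sketch of such a gaze-dwell trigger, with illustrative names and timings that are my assumptions, not eff's actual code:

```python
# minimal sketch of a gaze-dwell trigger: keep looking at the same object
# for `dwell_frames` consecutive frames and its info pop-up fires once.
# names, dwell time, and the object list are all illustrative.

DWELL_FRAMES = 90  # ~1.5 seconds at 60 fps

def popups_fired(gaze_per_frame, dwell_frames=DWELL_FRAMES):
    """Return the objects whose pop-up fired, in order, given the object
    the user's gaze ray hit on each frame (None = gazing at nothing)."""
    fired = []
    current, held = None, 0
    for target in gaze_per_frame:
        if target is not None and target == current:
            held += 1
        else:
            current, held = target, (1 if target is not None else 0)
        # fire exactly once per object, only after a full continuous dwell
        if held == dwell_frames and target not in fired:
            fired.append(target)
    return fired

# a user glances at the body camera, looks away, then dwells on it, then a drone
frames = ["body_cam"] * 30 + [None] * 10 + ["body_cam"] * 90 + ["drone"] * 90
print(popups_fired(frames))  # → ['body_cam', 'drone']
```

note that the brief 30-frame glance does not fire the pop-up; only the sustained dwell does, which is what makes the experience about deliberately finding each technology rather than sweeping your gaze around.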
goal number two was to educate people about these forms of surveillance, and we also wanted to help them figure out where the technologies are in their own communities. and finally, we had a thought: police encounters are very stressful situations, protests are stressful situations, things move quickly, but it can be useful for people to take note of what surveillance technology they saw at those scenes. so perhaps, by putting people in a simulation where they're able to gain practice looking for these technologies, that skill might carry over to those higher-stress situations. so we decided not to go with a computer-generated environment but with a 360-degree photo. this is the ricoh theta v; you can see it here and on the screen. it has two fisheye lenses, one on each side, each capturing beyond 180 degrees, and it stitches the images together, so you're able to take a photo of everything. if you used it right now, you'd get all of this and this; the only thing you might not get is the base of the tripod underneath the camera. this helped us get past what people refer to as the uncanny valley: in video games, the more you try to create a realistic person or environment, the creepier it feels to people. by using an actual photo of a real scene, with a few things photoshopped in, we bypassed that altogether. this is what the photo we took looks like; obviously, once you're in the virtual reality headset, it's wrapped all the way around you. you can see there is a scenario going on there, and you can see us at the bottom. we're going to show you a little bit. this is what it looked like, and you don't see this in the game, so this is a behind-the-scenes exclusive: we were hiding under a longer version of this pole, outside this police station, hoping police would come
outside, and eventually they did. and, it being san francisco, nobody questioned two people with a weird piece of technology on the street. [laughter] which was great, because it was kind of the perfect shot for us. for those of you who won't have a chance to try it, this is what it looks like inside: if you looked at the body cam, you would get a pop-up that explains what it is, with a voice-over. vr is such a visual medium, but we didn't want you to have to be fully sighted to enjoy the experience or learn from it. so if you are only able to see out of one eye, or have limited visibility but a certain amount of awareness of the environment, you can still go in and learn things through the audio. we did our beta launch on november 5 at the internet archive, at the aaron swartz international hackathon. that's brewster kahle, the founder of the internet archive, testing it out. for the most part we're looking at having tables like this.
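as an aside, the "wrapped all the way around you" effect described a moment ago comes from storing the stitched 360-degree photo as an equirectangular image: the viewer maps each gaze direction (yaw, pitch) to a pixel. a minimal sketch of that mapping, with an assumed 4096x2048 panorama, purely to illustrate the projection, not the project's actual code:

```python
# map a gaze direction to a pixel in an equirectangular 360 photo.
# yaw: -180..180 degrees (0 = image center), pitch: -90..90 (0 = horizon).
# width/height are the panorama's dimensions; 2:1 is the standard ratio.

def gaze_to_pixel(yaw_deg, pitch_deg, width=4096, height=2048):
    # longitude maps linearly to x, latitude maps linearly to y
    x = (yaw_deg + 180.0) / 360.0 * (width - 1)
    y = (90.0 - pitch_deg) / 180.0 * (height - 1)
    return round(x), round(y)

print(gaze_to_pixel(0, 0))      # straight ahead lands at the image center
print(gaze_to_pixel(-180, 90))  # directly behind and up: top-left corner
```

the linear mapping is also why the tripod base is the one blind spot: the bottom row of the image corresponds to pitch -90, straight down at the camera mount.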
at this point, not a lot of people have these devices in their homes, even though this one recently dropped to $200. but it is something we can take to conferences, and our grassroots activists, when they're going to community groups, can bring it with them the way they would bring one-pagers or brochures. we've run about 500 people through it in the last month, and if you think about that in terms of an activism organization, being able to get people to spend 7 to 9 minutes focused exclusively on surveillance, that's a lot of time. it's also available on the internet, and one of the things i found gratifying is that, while portland, maine is about as far from san francisco as you can get, you can see there are hacker spaces and media labs there that are trying it out and having people demo it. and we started to see social media respond to it as well. my favorite tweet is the one in the middle: "vr tech is so rad, i went spinning through my apartment, lol sob," which is exactly what we were going for with this. so i feel pretty good about that. next steps for us: we're still in beta mode, so we're going to continue doing demos to gather user feedback and improve the experience. one of the things about working with open source technology is that there might be a tweak in the language and everything breaks, so we've had bugs come up, and we need to fix them and get everything stable for an april 2019 launch. once we have that, we'll send it out to communities, come up with an educational curriculum, and then look at what the next version of this project might be. and we have a few ideas. one is an internet of things version: a home or office where you see a printer and the ways you might be surveilled
through devices in your home. or, since not everyone is in san francisco, maybe people want to know what this looks like in new york city, so we build the same thing for other areas. or we move beyond vr to ar, and have ways for people's phones to project these things into the world. all of this depends on how the technology develops, what kind of interest we get, whether there is a return on investment, and what kind of grants there are. it's a new world, and we don't know where it will be in a year or five years. i can tell you where it's going to be after lunch, though, and that is just outside the lunchroom. if you want to try it out, i have two devices and i'd be happy to show you the camera. and if you do have a headset at home, or want to play around in your computer browser, it's eff.org -- spot. >> i love this idea. there is a concept called the tetris effect: the idea that when people play games, especially games involving pattern-recognition behavior, it very often spills over into their non-game lives. the tetris effect is named after the way people who play tetris start seeing shapes everywhere and seeing how they can fit together. it shows up in the assassin's creed games as the "bleeding effect," where someone reliving a simulation of his ancestors' lives takes on their superhuman murder abilities, which seems neither quite so realistic nor so desirable. but it does seem imaginable that people could learn to spot surveillance through games: a more useful version of the tetris effect. turning back to the question of encryption: as we heard from sharon bradford franklin earlier, law enforcement has for years been complaining that the spread
of encryption is causing them to go dark, making it more difficult to conduct electronic surveillance of communications. a fascinating report from the center for strategic and international studies points out that a lot of the difficulties law enforcement is having with intercepting electronic communications really don't have much to do with any need for backdoors, and that there is a lot of low-hanging fruit being left on the table that we ought to examine before we talk about legislating breaches in the platforms and tools we rely on. to talk about that, i want to invite jen daskal to discuss a report you'll find on the table outside. >> thank you, julian.
and thanks to cato for putting on this excellent conference. as julian said, the focus of my talk is the range of challenges that law enforcement faces in accessing digital evidence, separate and apart from the encryption-related challenges. this talk draws on a report i worked on with co-author will carter under the auspices of the center for strategic and international studies, csis. the debates about encryption will continue, but it was emphatically our view in working on the report that encryption, and the debates about encryption, have taken up so much of the limelight that a range of other challenges law enforcement faces have been neglected. they can be dealt with relatively easily, and they need to be dealt with now.
and, as i said, these challenges will continue no matter what happens with respect to encryption: even if, in fact, there ever were a clear decryption mandate, there would be other ongoing challenges that need to be dealt with. as our title, low-hanging fruit, indicates, these are problems that can be relatively easily solved. not completely; nothing in this space ever leads to a complete solution, and we make a mistake if we assume we are seeking one, or that we should try to eliminate all of the friction in the process. some of that friction is in fact healthy. but some of the friction is unnecessary, and actually collectively harmful to both security and privacy, and minimizing that friction is not only a laudable goal but one that is eminently achievable. to that end, i'll note that the report we worked on was endorsed by a number of individuals and
also groups and entities. it was endorsed by the former cia director john brennan; the former fbi general counsel ken wainstein; two other former senior justice department officials; the former boston police commissioner, ed davis; and a former assistant attorney general for national security, david kris. it has been praised by a number of different groups and providers, and several providers have introduced a number of reforms consistent with what we called for in this report. so now that i've given you the hard sell, i'm going to spend the remainder of my time talking about the substance: a little bit about the methodology we used in doing this report, a little bit about our findings, and ultimately our recommendations. this report stems from a year's worth of research, including a series of qualitative interviews with state, local, and federal law enforcement officials, prosecutors, representatives from a range of different tech companies, and members of the civil society community.
it also involved a quantitative survey of state, local, and federal law enforcement officials, and the survey results are notable; hopefully you can all read at least a little bit of this. according to the survey results, law enforcement officials reported difficulties accessing, analyzing, and utilizing digital evidence in over a third of their cases, and we believe that's a problem that is only going to continue to grow as digital information becomes more and more ubiquitous and digital evidence is needed in nearly every criminal investigation. this chart shows the responses to the question: what is the biggest challenge your department encounters in using digital evidence? accessing data from service providers was ranked as the key challenge amongst our respondents, separate and apart from questions about encryption. identifying which service provider has the data was reported as the number one challenge: 30% of our respondents ranked it as their biggest problem. obtaining the data once the provider was identified was the number two challenge: 29% of respondents ranked it as their biggest. accessing data from a device was next: 19% ranked it as the biggest challenge they faced. and collectively, analyzing data from devices and analyzing data disclosed by providers, which are two separate things, accounted for another 21% who ranked one of those as their biggest problem. that's important, because these are problems that can be fixed, or at least largely reduced, without huge changes in the system, but with more resources and more dedicated, systematic thought about how to address them. so, to the extent that law enforcement doesn't know where to go to get data of interest,
that is a problem that can be solved with better information flows and better training. to the extent that law enforcement faces challenges in obtaining data, that is a bigger challenge, and we heard two very different stories from the law enforcement officials we talked to and the provider community. the law enforcement officials talked about what they perceived as very long delays in getting information back from service providers, service providers dragging their feet, service providers having insufficient resources to respond to their needs, and requests being slow-walked or turned down in what they perceived to be invalid circumstances. providers, on their side, told us a very different story. they complained about what they saw as overbroad requests, about law enforcement asking for things that simply weren't available, and about delays being the fault of law enforcement: agencies internally debating and deciding whether or not to seek nondisclosure orders that would prohibit providers from telling customers their data had been sought, with providers holding off on turning over the data until they learned whether or not they had permission to tell the customer or subscriber. now, the data, interestingly, kind of supports both sides of the story. this chart shows the requests that u.s. law enforcement issued to six key companies, facebook, microsoft, twitter, google, yahoo, and apple, over time. it is based on the companies' own transparency reporting; there is no other good source of data. you see from this chart a pretty dramatic increase over time. it shows requests in six-month
intervals. by the six months ending in december 2017, there were roughly 650,000 to 700,000 requests to these six u.s.-based providers, nearly double, or at least a significant increase over, the volume of earlier periods. what's interesting about the chart is that the grant rates have hovered at more or less the same rate, around 80%, consistent over time in terms of the percentage of requests or demands that providers complied with. but that also means the absolute number of requests being turned down, the number of disclosure demands not complied with, is higher, given the bigger volume of requests. so to some extent, law enforcement is frustrated because it is seeing a bigger number of request denials, whereas providers are saying: we're pretty consistent in how we're treating these over time. two caveats about the data. first, the chart only shows where requests were made, not where they weren't: if law enforcement didn't know where to go, or was otherwise stymied, no request shows up. second, the grant rates say nothing about the legitimacy of the requests or the grounds for rejecting them, and there is, and should be, ongoing disagreement about the appropriate scope of requests. this is an area where some friction is not only healthy but actually productive, and it's just going to persist, inevitably, because there are different views about the appropriate scope of these requests. but there are also a number of areas, with respect to grant rates and law enforcement requests to providers, where there is unnecessary friction, and some reduction of that friction can support both privacy and
security at the same time. so some of the things that can be helpful in this regard are better up to date law enforcement guides provided by the providers, resourcing of law enforcement teams by the providers, better training and dissemination of training to state and local law enforcement officers, better training of judges that review and approve the range of requests subject to court order or warrants. and these have obvious security benefits in the sense that it provides law enforcement more streamlined ability to access data of interest but it also has privacy benefits to the extent it leads to better tailored, better more privacy protective requests and less -- and as a result more tailored more narrow requests. now, to the extent that law enforcement cannot interpret that if that's disclosed this is a problem that stemmed in part
from encryption but also what we heard from over and over again was the absence of technical tools to decipher nonencrypted data that was disclosed. so, this is a problem that results one from a absence of tools to some extent and a distribution problem. so sometimes some of the bigger law enforcement entities would have access to the appropriate tools, but it was not disseminated to the 18,000 state and local law enforcement entities that exist around the country. so despite what appears to be pretty clear need, and a pretty easy to identify solutions with respect to resourcing training, resources training and dissemnations of tools, the sole federal entity with an explicit mission to better facility cooperation between law enforcement and providers is an fbi's national domestic communication center, it has a
11:00 pm
budget of 11.4 million this fiscal year and that is spread out among several different programs designed to distribute knowledge about service providers policies and products, develop and share technical tools, train law enforcement, maintain a 24/7 hotline center among p many other initiatives. that is a drop in the bucket given the need out there. . .
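The grant-rate caveats discussed above can be made concrete with a small, hypothetical sketch. Nothing here comes from the talk's actual data; the provider names, counts, and the `RequestStats`/`grant_rate` names are invented for illustration. The point is only that a "grant rate" is a simple ratio over the requests that were made, and reveals nothing about whether individual requests were legitimate or properly scoped.

```python
# Illustrative only: computing provider "grant rates" from hypothetical
# transparency-report-style counts. Per the talk's caveats, these rates
# cover only requests actually made, and say nothing about a request's
# legitimacy or the grounds on which it was rejected.
from dataclasses import dataclass


@dataclass
class RequestStats:
    provider: str
    year: int
    received: int  # legal requests received by the provider
    granted: int   # requests for which some data was produced


def grant_rate(s: RequestStats) -> float:
    """Fraction of received requests that were granted (0.0 to 1.0)."""
    return s.granted / s.received if s.received else 0.0


stats = [
    RequestStats("ProviderA", 2017, 1200, 960),
    RequestStats("ProviderA", 2018, 1500, 1050),
    RequestStats("ProviderB", 2018, 800, 720),
]

for s in stats:
    print(f"{s.provider} {s.year}: {grant_rate(s):.0%} granted")
```

A falling rate for ProviderA here could mean broader requests, stricter review, or simply more requests from agencies that did not know the provider's policies; the ratio alone cannot distinguish these.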
>> what is working and what doesn't. So these are the recommendations: an entity, authorized and resourced by Congress, to do the kind of work that is needed, including distribution of research and grantmaking, analysis of the data that is collected, authentication systems to ensure that the person making a request is in fact entitled to make it, international efforts, and reporting to Congress about what is in fact going on. The NDCAC does not have an independent authorization act and is not adequately resourced to do its job, so we recommend situating this work in a broader digital policy office, building on what the NDCAC has been able to do on a very slim budget: disseminating information about service providers and providing a hotline system. We've also included a series of recommendations on training, and on how to disseminate that training to officers across the country, where there is rapid turnover, leading to better requests that are more helpful for law enforcement.

>> For providers: an online portal to facilitate the request process, which helps with authentication; rejections that come with an explanation and a dialogue, with rapid responses to what is being requested; and more transparency about the law enforcement requests providers receive, broken out by categories of request, including the different smaller categories as well. It's important, too, for the bigger providers to help develop best practices. These challenges will only grow over time, and these are structures and resources that need to be put in place now, because, as we argue, this has benefits for privacy and gives us a way forward on encryption. [applause]

>> But they are encrypted; as a spreadsheet, you need to open it. [laughter] And as we see very often, that ability to navigate is not always easy. We will take this up in the afternoon question-and-answer session as well. At the end of the day, please join me in thanking our speakers one last time. [applause] [inaudible conversations]
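The requester-authentication idea raised in the recommendations, that a provider portal should verify a request actually comes from an entitled law enforcement agency before it enters the review queue, and that rejections should come with a reason that supports a dialogue, could be sketched as follows. This is a hypothetical illustration, not any provider's real system; `VERIFIED_AGENCIES` and `submit_request` are invented names, and a real portal would rely on stronger credentials than an email domain.

```python
# Hypothetical sketch of portal-side requester authentication: check that
# the submitting account belongs to a verified law enforcement agency
# before a legal request is queued, and explain any rejection.
VERIFIED_AGENCIES = {
    # agency email domain -> agency name, vetted out of band
    "police.example.gov": "Example City Police Department",
}


def submit_request(submitter_email: str, request_id: str) -> str:
    domain = submitter_email.rsplit("@", 1)[-1].lower()
    agency = VERIFIED_AGENCIES.get(domain)
    if agency is None:
        # A stated reason supports the rapid-response dialogue the talk calls for
        return f"rejected: {domain} is not a verified agency domain"
    return f"accepted: {request_id} queued for review ({agency})"


print(submit_request("det.smith@police.example.gov", "REQ-1001"))
print(submit_request("someone@mail.example.com", "REQ-1002"))
```

The design choice worth noting is that authentication here gates entry to the queue, so provider legal teams spend review time only on requests from entitled submitters, which serves both the security and the privacy goals the talk pairs together.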
