
tv   National Freedom of Information Day Discussion on a Government Tracking and...  CSPAN  March 9, 2019 2:18am-3:36am EST

2:18 am
of the first and second which allows us to settle our differences in a court system so you do not have to jump into foxholes and dodge bullets. >> voices from the road on c-span. next, a discussion about the legal issues related to government surveillance technology like facial recognition and artificial intelligence. this is part of the freedom of information conference. it is about one hour 20 minutes.
2:19 am
know, the reason we are excited to have representative gomez here is that he has personal experience having been misidentified by facial recognition software. he was identified as someone who committed a crime. that inspired him to work to regulate this area a little bit and figure out what is going on with facial recognition software and advanced technologies. we are trying to develop frameworks and policies around these types of technologies. we are pleased to release a report today, "government inc: amazon, government security and secrecy," written with my colleagues. it is available out front.
2:20 am
we hope you pick one up. it is an in-depth look at the ways that private companies provide advanced technology to the government, and how they are doing so in secret. we look at the impact of these technologies and the secrecy around them, the impact they have on military decisions, civil rights, and privacy. there is so much we do not know about artificial intelligence systems. i do not know what goes into a computer that makes it spit out a decision, and i'd venture that the people impacted by artificial intelligence, like congressman gomez, who might be misidentified, do not either. neither do most of the people who use or oversee the technology. it is difficult even for ai programmers to understand the decisions of complex systems.
2:21 am
combine that mystifying technology with the secrecy that surrounds our government contracts, and that puts a lot of power in the hands of the companies who make the technology. one of those companies is amazon. when we began our research, we had no intention of making amazon our case study. that is where the research led. when we sought out experts to help us with our research, once we narrowed the focus to amazon, a handful of journalists were hesitant to talk to us. we could not figure out why, and then it dawned on us that they were concerned we might be involved in corporate opposition research targeting amazon, and they understandably did not want to play a part in that. we are not engaged in any form of corporate opposition research. we get no corporate funding whatsoever. this report is the result of our research.
2:22 am
more importantly, our recommendations apply across the board. we make policy recommendations for federal and state governments to demand transparency from any government contractor supplying artificial intelligence, machine learning, and surveillance technology to the government. we make recommendations for the contractors themselves to be more open about the services they provide and to conduct accuracy testing. the report covers the acquisition and use of artificial intelligence across the national security and law enforcement agencies, including the military, where this technology is integrated into military targeting decisions. it also covers scary vulnerabilities and the use of gift cards. i encourage you to pick up a copy of the report out front. we will not cover all of it during our panel discussion. our panel will focus on one type
2:23 am
of ai technology that is the subject of controversy, and that is facial recognition technology. to dig into these issues more deeply i am pleased to introduce our amazing panel. we have the director of the national immigration project of the national lawyers guild. we have the u.s. policy manager at access now. we have the senior counsel at the project on government oversight. each of them provided their insight and expertise for the report, and we thank them for their help. our moderator is a reporter for bloomberg news who writes extensively about the tech industry, focusing on issues of conflicts of interest and lobbying. she is covering the competition for the jedi contract, a $10 billion contract the dod plans to award to a single company.
2:24 am
she has gained a deep understanding of the role government contractors play in u.s. government efforts. we could not ask for a better media expert to shine a light on what the public does not know and needs to know about this dynamic. with that i turn it over to naomi. naomi: thank you for that kind introduction. i am excited to moderate this panel. major technology companies are driving the development of these technologies but also their rapid adoption in the public sector. i think there are a lot of unanswered questions about what this means and what the implications are. i am excited that we get to have a proactive discussion about how
2:25 am
these technologies are shaping the future of our country and communities. why don't we start off and have the panelists introduce what they do and how their work relates to this particular issue. do you want to start? >> sure. i work at an organization that is membership-based. i work at the intersection of immigration law and criminal justice. it is an unpopular area: immigrants in our criminal justice system. i work with immigrants who are facing the consequences of being in the criminal justice system, and the reason why this area of work has become everyone's business is because the technologies that
2:26 am
are being deployed are being targeted at -- or the justification for those technologies is -- a group of people i work with. to be clear, our organization is based upon this idea that all immigrants, no matter what happens, deserve a right to be heard, and they have a right to talk about their circumstances and families. automatic deportation and mandatory detention should not be part of our justice system, and we want to end that. we work with communities to help them better navigate the system, and to assist campaigns fighting this. we use advocacy to bring
2:27 am
together the conversation we need to have. and this group here. that is my role in this work. >> i am a u.s. policy manager at access now, an international organization. you may not have heard of us, but we work everywhere. the centerpiece of the organization is an amazing project which i am super impressed with and have nothing to do with, called the digital security helpline, staffed by technologists who provide around-the-clock support and assistance to a population we call the user at risk.
2:28 am
that means different things in different contexts in different countries. it is anybody who might be targeted by the corporate sector or the government for inalienable qualities about themselves. we try to help these people out when they have technical issues, be it needing to shut down their accounts because they may be arrested by the government, or having their website blocked. i am on the policy team. we try to support policies that will help this same community continue their work, support them, and promote their ability to take part in their individual societies. that could mean, in the context of ai, looking at how human rights need to be protected.
2:29 am
a lot of these conversations take place around ethics. we put out a report on artificial intelligence and human rights, trying to look at it through a human rights framework, to insert these conversations into the technical debate and make sure they are looking at the policy. naomi: great. jake? jake: i am a senior counsel at the project on government oversight. pogo is a watchdog organization that focuses on holding government accountable. we work with whistleblowers and with oversight institutions and roles within congress. we conduct investigations. the constitution project, the component that i am a part of, focuses on protecting rights and
2:30 am
holding government accountable, maintaining the balance of power between the government and the people. i focus on our surveillance policy. that can be everything as big as programs conducted by the nsa in secret to tap into internet cables, down to what is in the toolbelt of the officer on the street: what do they do to conduct surveillance. a big part of that is facial recognition. it is a technology spreading quickly. it has a huge impact on civil rights and civil liberties, and it is virtually unregulated in the united states right now. naomi: great. let us get a lay of the land first. based on your work, talk about some of the latest ways you see
2:31 am
government agencies and bodies adopting and using the latest facial recognition technology. jake, let us start with you. jake: there are a few different ways that you can use facial recognition in a law-enforcement context. i will go from what i think is the least controversial to the most worrisome. the first is when you have already booked someone and are taking mugshots. either you do not have an id or you want to verify it using facial recognition, going to the database to confirm an individual's id. more controversial is the investigative use in the field: stopping someone on the street for a cause, or say you have photos or video from a crime scene and you have an unidentified suspect and
2:32 am
using facial recognition for that. it seems there could be legitimate uses for those operations, but it is critical that you have oversight and limits, because there is also a huge potential for abuse in those cases. the most controversial, in terms of the scale of the privacy and public safety concerns, is what is starting to creep into law enforcement now: what amazon calls rekognition, real-time facial recognition. instead of looking at one person's face in a photo and telling the computer system to tell you who it is, it will scan every face in the crowd. the camera would look across all of you now, run it against a watchlist of people, and flag and say that these people match the list. orlando is running a pilot of this with amazon's rekognition system. they described it as a few cameras in public spaces that will expand
2:33 am
dramatically. when the camera detects a person of interest -- we don't know what "person of interest" means or how you get on that list -- it flags it for police officers, and they act accordingly. >> that is a comprehensive list. we are seeing a lot of this technology inserted into digital identities around the world: facial recognition-level photos being used in these id cards, which presumably governments could use to tie someone identified on the street back to the profile that they keep on them in other places. that is kind of happening in the u.s. with the real id program, where the image is a high-quality biometric. mostly we are seeing it in other countries. the thing i want to flag is that when we wrote our ai report, we came
2:34 am
across a company that is actively contracting with governments -- not the u.s., as far as we know -- to sell an ai system built on facial recognition that they say they can use to identify who terrorists are, who at-risk people are; they say they can identify your intelligence based on your face, all of these different kinds of qualities. there is no way to do this based on a face print. it is all a proxy for discrimination and bias. it is grossly racist, for lack of a better term. and this is being actively used by governments. adding that piece in there: these are ai tools that lack explainability. you do not know how they are coming to these conclusions, and they are being sold to governments based on what i would basically call junk science.
2:35 am
paromita: frankly, for me, the department of homeland security is the agency primarily involved in policing and detaining immigrants. for context, i make two primary assumptions to evaluate the technology. there is a mass incarceration crisis in the united states, and it is a crisis driven by anti-blackness that has now gone into the agency. the department of homeland security is detaining 55,000 people at any given time in the united states -- one of the largest jailers. dhs is now more of a policing entity. it has the largest number of federally armed agents in the united states. the way i think we think about
2:36 am
dhs as an immigration agency has to shift as we evaluate these technologies. it is a policing entity. the way that we began seeing it deployed was first in the criminal justice context, where we saw mobile fingerprint technology across the united states as people were being arrested. then we started seeing more of an international fingerprint technology, where people were being fingerprinted outside of the united states and that information was feeding in and being associated with facial prints. that was troubling for us because we do not know how people were picked up in the first place. and then, lastly, i think we saw that we could not really track who was doing it. we could not pin down the agency when we raised these questions
2:37 am
about facial profiles, because they always said that there was a contractor involved in collecting this data. what i think that meant for us was that we began to follow the money. what we have now seen, by following the money through our report, no tech for ice, is billions of dollars being put into dhs for facial recognition technology, biometrics, different kinds of sensors being deployed in airports, and ankle monitoring for immigrants. that will all include facial recognition technology. that was, i think, something that we were very deeply concerned about, because we never had the public conversation around how this technology is deployed with ice,
2:38 am
especially under a trump administration. and now that we are kind of in our worst-case scenario, we have seen the agency fast-track the deployment of these technologies on their website with very little or no public comment. naomi: interesting. i have a follow-up question for you. why do you think dhs and other government agencies are using private companies for this kind of technology? what is the need they need to fill with this, and what is driving the adoption? paromita: i think it has to do with the deep bipartisan consensus around technology generally -- that using technology in law enforcement is a good idea. it is the presumption that it is objective, that it is acquired without harm, and that
2:39 am
we do not actually need a public conversation because everyone is collecting it the right way for the right purpose. and it is really hard for the courts to attack it. dhs is a place especially where the immigration courts really do not have the capacity nor the will to look at these technologies. there will never be a reckoning for facial recognition in immigration court, and therefore there will be very little reckoning in an appeals court, beyond looking at a bad immigration case, whatever the decision may be. i think that when i look at the excitement in dhs in developing a portfolio around this, i think it just has to do with that deep consensus. the excitement in trade magazines when dhs announced its budget was widespread.
2:40 am
the conferences continue to happen. i can see that there are hundreds of corporations, including amazon, that are about applying this technology. i think dhs sees itself, like many law enforcement agencies, as a policing partner that should be tied with industry as they launch a new brand of policing, now that there is going to be very little accountability in it. naomi: it is also relatively cheap. >> they are spending a lot of money on it, but the idea that you could track every single person -- historically, that was prohibitively expensive for law enforcement. you cannot follow everyone. enter technology, and that limit goes away.
2:41 am
and they can do it. and the law has not caught up yet, so we do not have the right prohibitions in place. why not? they do not have the limits to it. the other factor is that vendors have a profit motive to sell these systems, and part of that is marketing something as really great. at pogo, we looked at the vendors and, for all of the concern about the tech, there are extremely exaggerated claims about what it is capable of doing. there will be materials that say an officer on the street can take a photo with a smartphone and instantly it can tell you if that person is a dangerous criminal. it does not work that well.
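the accuracy claims jake describes come down to how watchlist matching and match thresholds work, which can be shown with a toy sketch. everything here is a hypothetical illustration, not amazon's or any vendor's actual system: real systems compare learned face embeddings with hundreds of dimensions, and the names, vectors, and thresholds below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def watchlist_matches(probe, watchlist, threshold):
    """Return every watchlist name whose similarity to the probe meets the threshold."""
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# hypothetical embeddings -- real systems use much higher-dimensional vectors
watchlist = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.9, 0.3],
}
probe = [0.8, 0.3, 0.2]  # a face captured from a crowd camera

strict = watchlist_matches(probe, watchlist, threshold=0.99)  # []
loose = watchlist_matches(probe, watchlist, threshold=0.80)   # ["person_a"]
```

the same probe that produces no match at a strict threshold gets flagged at a looser one, which is why the threshold an agency actually runs in practice matters as much as the setting a vendor advertises.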
2:42 am
it is highly inaccurate. it is slower. these types of exaggerated claims are good for selling it, and they can be especially convincing if you have secrecy about the sales, without outside scrutiny. but it is very bad policy. amazon is another example. people were improperly identified by amazon's facial recognition in a study -- members of congress were improperly identified. amazon kicked back and said about the study that you have to set a higher confidence threshold: we recommend law enforcement use a 99% confidence threshold, really high, and you tested lower, so this will never happen with law enforcement. lo and behold, a report came out from washington county, oregon that said amazon helped them
2:43 am
develop their thresholds. a big factor is that vendors have an incentive to make these systems look a lot more glamorous and a lot less prone to errors, flaws, and problems than they actually are in reality. >> one more incentive: the debate around sanctuary cities is one that has been dogging the government for a long time -- a public conversation that we have been having about the role of law enforcement in immigration policing, in your city and in your town. i think dhs is looking for a backdoor. if you can create the interoperability quickly and easily without that conversation, it will undo the
2:44 am
really hard work i was involved in. and frankly, i can tell you that it troubles me. i did not want to come into this work on technology, but i was dragged into it as i saw the growth. and our communities, whenever we talked about it, were only hearing the conversation about progressive smart cities but did not really relate it to the policing. how do we reconcile that conversation -- what does it mean for communities to really want the smart, progressive city, and how does that relate to facial recognition technology? that is the question we were trying to answer. naomi: let us talk about some of the unintended consequences. what are some of the potential
2:45 am
impacts of government entities using facial recognition software that might occur that we might not be thinking of, or they might not be thinking of, as they adopt it? amie: i can think of two. one will be the collection of more information. facial recognition gets flagged for accuracy problems, and i always get worried that the answer to accuracy problems is collecting more data, specifically on the populations for which the accuracy is low. i do not want to necessarily single out amazon as creating bigger databases, but that might be one of the results here. so just to flag that this
2:46 am
question could have incredibly negative consequences. the other one is this: we elect people, presumably in the public interest, through a fair and open process, presumably. those people are supposed to make political decisions. that is their job. those decisions are being coded, without transparency, into these ai systems, and there is no accountability. we are undermining the electoral, democratic process of having decision-makers sit in elected office by outsourcing these decisions to these private companies. i do not even think that in most cases the elected officials know that these decisions have been made -- someone has to put a little more weight on this thing and a little less weight on that thing. it is about fine-tuning. if we do not have the proper level of explainability in how these systems work, we will
2:47 am
not have the accountability back to the people contracting for the system. the base impact is the impact on democracy. paromita: there are lawsuits that we have brought around the surveillance of immigrant activists. just yesterday, cbp was shown to have been surveilling and collecting face scans and pictures of people at the border who they thought were activists, presented as public safety during the family separation crisis -- the zero-tolerance immigration policy at our border. that database was laid bare. anyone can look at it. what we have seen from dhs is, as
2:48 am
it turns into a policing entity, it looks at protests and activism as anti-trump, anti-police, anti-law-enforcement activity, instead of looking at it as a conversation we need to have around immigration and immigration policy. and that is deeply troubling for me. because, if the technology augments that kind of surveillance and it is deployed in a way that immigrant activists are deported -- and they will be deported, and some of them are being deported -- i think it is a terrible condemnation of where we are now. jake: i would like to add some input to that and jump back to misidentification and the harm
2:49 am
of that. a personal example of that in practice: a report came out that hsi, a division of ice, has been compiling a list of anti-trump protesters. we have an investigative report that, last year, hsi was in talks with amazon to purchase software. the prospect that, if you are at one of these protests, a cctv on a street corner or the body camera of an officer could be scanning the crowd -- whether it is happening or not, the prospect that it is a realistic possibility will chill first amendment speech and chill participation in those events. that is problematic. in the absence of rules, you have that effect. on misidentification, i want to note that there is not just
2:50 am
risk in general but for certain demographics. there has been a lot of great work done at m.i.t. showing there are higher rates of misidentification for women and people of color. that will be the impact of facial recognition. it can cause a lot of problems if you are using this to stop people or arrest people, if it is ever used as a justification for a use of force. if it is used in investigations, you will augment the impact and harm to communities of color. we should think about it also outside of the law enforcement context. facial recognition could be used for public sector employment, to conduct background checks and security clearances. if you have a disproportionate rate of error and people are being flagged, they might lose a job opportunity or chances to move up in a job or for a security
2:51 am
clearance, and if your name is flagged -- you might not ever even know why there is a problem. one final bit on misidentification. i do think that amie is correct in that we should be wary about curing the problem by collecting more. there are also factors where there is simply no response, or responses where we as privacy advocates should be wary. for facial recognition, ideally you will have best results if someone is in a profile photo, like a dmv photo, which is why the fbi takes all of your dmv photos and puts them into a database. very scary. when you get to things like real-time facial recognition, where this is people moving around on a video feed, you have huge variance in lighting, you
2:52 am
do not have ideal angles, and you have low-resolution cameras. they will not provide the same level of accuracy, and as a result we should be much more skeptical of them and much more reluctant to take police action on the basis of what a computer tells us in those cases. naomi: interesting. how would you assess our current legislative framework in tackling some of these issues? we do have laws that aim to tackle discrimination, unlawful surveillance, privacy -- how stable a foundation are our current laws in tackling this new technology and the implications you have highlighted? putting you on the hot seat, paromita. paromita: i do not think first
2:53 am
amendment law is prepared for this. i mean, we do a lot of work with public defenders, and we do work with immigration attorneys. the presumption of accuracy that is given to facial recognition -- the presumption that the person is who they say they are, or committed a certain action that supposedly they have committed -- is really hard to fight. where i most see this is in this kind of huge profiling of alleged gang members. there are so many humongous databases of gang members. in chicago, they discovered there were
2:54 am
200,000. that is primarily black and brown men and young people. and the same thing in washington, d.c., the same thing in california. there are massive error rates -- the fact that in california, they found one-year-olds and infants on these databases, or black panthers in chicago on these databases. it is just a very easy collection of the faces associated -- for them to have these massive consequences. it has been hard to challenge in the courts, to get at the identification. it is presumed to be reliable. it is presumed to be good probable cause.
2:55 am
it has really not entered specifically into the courts ultimately. so, i do not think -- i am not a constitutional law expert by any means, but i do not think what we have seen so far, at least in our day-to-day practice, is adequate to meet the challenge -- for facial recognition technology, and for fingerprint technology as well. if we cannot even address fingerprint technology, then i do not know how we do facial. amie: the law is just so woefully far behind. the privacy act, which is from 1974 and supposed to give you insight into government information held about you -- everyone exempts themselves from those requirements, dhs with
2:56 am
great regularity. there was a recommendation last year to amend the privacy act to fix this, but it has not even been taken up, and there is no serious effort underway to fix that. there is no federal protection. we have something in california, finally, that is supposed to come into effect in 2020, but it is only california. we do see some serious steps to limit the collection of data about people by companies and to allow you to be able to access that information, but it is really up in the air right now what that will look like. and then, you go into the law enforcement sector -- the electronic communications privacy act. people have been working for as long as i have been working in this area to try to fix the electronic communications privacy act. even the fixes are not going to fix everything that we know is
2:57 am
weird about it now. and it is not going to prepare us for the future. one question i had -- i was talking to people before this panel -- you have the face, the picture, the front-facing, well-lit photograph. and then you have the biometric information in that photo. is that content or a record? i don't know the answer to that. and so, i could not tell you what law enforcement thinks it might have to do to gain access to that information. these are big questions we will have to think through, and the law will have to presumably catch up -- but we have been working on it for a while. >> that reminds me, on content and non-content, the law enforcement question. there are things that would take no law enforcement effort -- no government effort -- that would be easy.
2:58 am
putting transparency reports out on how many orders they get. it is time they start putting out numbers. and if nothing else, warrant canaries -- warn us if they are getting orders to scan faces. i would not be surprised. >> i do think there is a strong foundation for a constitutional right to limit facial recognition. in carpenter, last year, we saw that the supreme court did embrace this idea that you have a right to privacy, even in public spaces, if government's power becomes so great it can upend the balance between the government and the governed. on the other hand, i share that skepticism about the courts being able to fix this because of the speed with which they work. we had a basic concept that the government should not be able to continuously track you with your
2:59 am
cell phone all the time, but because it was so revolutionary for the supreme court to say that you have privacy rights in public sometimes, there are like five loopholes that seemed obvious, but they said they would get to them later; they did not want to rush too much. i think it will be years if not a decade before they can have a protection as strong as we need for facial recognition, and we do not have time to wait for that. this is a technology where over half of american adults are already in a database that can be scanned for facial recognition. more and more police departments are adopting it. we need to get government agencies to voluntarily adopt practices and pressure them to do that -- that will be an uphill battle. naomi: you have highlighted gaps in our laws for governing technology.
3:00 am
if you had a magic wand, where would you like policymakers and congressmen to start? what questions would you like them to tackle around this? >> pass the geolocation privacy act. pass data protection laws. i don't know where they should start because we need to pass so many laws. i would love to see, where are the exemptions going to be? whenever we ask about it, exemptions related to law enforcement and national security are being utilized right now. you cannot get information
3:01 am
about noncitizens, especially because they are exempt from the privacy act. if i had that magic wand, i would be looking at exemptions, and looking at reducing the amount of money we put into these technologies. dhs has now set up the office of biometric identity management within the agency, and it is getting hundreds of millions of dollars. they should not be getting that. the conversation about the border, which has been about a wall -- we need to make sure that when they say the technology wall, nonintrusive technology, or drones, where we are going to
3:02 am
see more cameras with better lighting, better pictures they can capture -- those things need to be discussed. money should not be appropriated to this agency, which we know is going to be using these technologies very recklessly and has a record of using them recklessly. i would like to see the shrinking of the budget which, to be clear, has tripled since dhs became an agency in 2003. as the money shifts to this area of policing and detaining, i think we should be concerned about what money we are appropriating also. jake: on the facial recognition front, a shameless plug i mentioned before: we just came out with a big report on facial recognition. it is available on our website, so you can read the whole thing,
3:03 am
and we include a set of policy recommendations in it. i won't go through the whole list, but some of that would be establishing a system of court approval for facial recognition searches, independent testing through an independent agency before the stuff can be deployed at all, and limiting it to serious crimes. going back to what i mentioned about the uphill battle we will face in the legislature: whenever you fight to limit police power, there is kickback from law enforcement, and i expect the same with any limitation on facial recognition. but it would be hard to argue this type of incredibly powerful surveillance tool should be used for littering and jaywalking and grabbing people off the street for unpaid parking tickets from a couple of years ago.
3:04 am
those are a couple of reforms that would be viable even in a challenging political environment. naomi: let's talk about how the private sector is operating in all of this. the government agencies are making the decisions to buy the technology, but what do you see as the role of private companies, and what is their responsibility as they continue to pitch their products to government bodies? jake: one thing they should do is, when congress sends them letters asking questions about their tech, they should respond, and this is something amazon has not done so well. in the last congress, a number of congressmen sent them letters asking about facial recognition. they have gotten very little back, and it shocks me that mounting attention from a number of members of congress seems to be getting ignored.
3:05 am
i suspect that cannot go on much longer before they start to get called in for hearings, or there starts to be talk of regulation that will make them feel more compelled to speak. i think it is very important that they be honest brokers in this discussion -- not putting out marketing materials that don't serve the public or law enforcement, when they create unrealistic expectations, and not making the circular argument amazon has put up about the accuracy settings for their product and how prone they can be to misidentification. even if you have a company considered an honest broker, independent testing is essential. a lot of facial recognition vendors test with nist. anyone who wants to sell to the government should go through independent testing before we are using taxpayer dollars and
putting those tools in the hands of government entities. under human rights law, it is very clear companies need to respect human rights. i would like to see more companies commit to conducting real human rights assessments that they publish about their technologies, and abiding by the conclusions experts come to about what they should do with them, because the report we are here today to discuss talks a lot about how these companies don't even know what the technology is going to be used for, or the extent to which it can be used to harm certain communities. any company working in this space should take notice that these violations could occur.
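the accuracy-settings point above can be made concrete. here is a minimal sketch -- all similarity scores are invented for illustration, not drawn from any vendor's system -- of how the confidence threshold a face-matching system uses trades false matches against missed matches:

```python
# Illustrative only: made-up similarity scores showing how the match
# threshold trades false matches (wrong people flagged) against
# missed matches (real matches thrown away).

def evaluate(threshold, genuine_scores, impostor_scores):
    """Count errors when a match is declared at or above `threshold`."""
    false_negatives = sum(s < threshold for s in genuine_scores)
    false_positives = sum(s >= threshold for s in impostor_scores)
    return false_positives, false_negatives

# hypothetical 0-100 similarity scores for pairs of photos:
GENUINE = [97, 93, 91, 88, 85, 82]   # pairs that ARE the same person
IMPOSTOR = [84, 81, 78, 70, 62, 55]  # pairs that are DIFFERENT people

for threshold in (80, 95, 99):
    fp, fn = evaluate(threshold, GENUINE, IMPOSTOR)
    print(f"threshold {threshold}: {fp} false matches, {fn} missed matches")
```

at a permissive threshold the system flags innocent people; at a strict one it misses real matches. this is why a vendor's quoted accuracy means little without independent testing of the settings actually deployed.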
there are a lot of experts in this space, and amazon should be working with them instead of rolling out technologies where there has been a record of harm and we don't have any accountability measures for them. we have zero. the problem here for me is, when i think about these companies and the communities in which they make inroads, like new york, or amazon that is coming here, they act very much the same way with the locality as they do with the technology.
the contracts that they try to enact with the locality are secret. they don't disclose them to the local communities asking: what are the tax benefits they are getting from coming into the city? what are the exemptions they are going to get from certain municipal requirements? they are not required to invest in the communities that they are coming into, and these are massive corporations. amazon, a $1 trillion company, would not want to build roads or help build roads. it is problematic for me that they act like they are principled collaborators with community, and on one hand have these technologies and go to conferences where they deploy them and set up human rights people within these corporations and claim they actually do the
business of accountability while, on the other hand, they want to look like close community partners. i think their contracts should not be secret. cities and towns should make the contracts open to the public. if they want to expand their footprint in the market, i really believe they should disclose what they are trying to do, and i think they should respond to the questions we put to them about the technologies they are designing for the government. if they want to play the role of honest broker, they should do those things, but i don't think they are. we still have to reckon with this idea of companies as the new railroad barons
of the united states, and how do we treat them now? what are the principles local communities should have? how does that pivot to federal reform, so that we can have the genuine public dialogue that we need to have? otherwise, we are making recommendations that hit them at the edges but don't force them to make real changes. in my opinion, and maybe this is not the place to make it -- i don't know how we win the accuracy argument with the corporation. they will always say their technology is accurate. they will always win that argument. i don't know how to stop them from doing that, so we should be making that argument, but also demanding transparency in their practices. >> this has been a mostly sobering conversation.
are there any things that encourage you about the way companies or government entities are embracing a policy proposal or meaningful limits on the use of a technology, or the way a particular company is coming to the table around the implications of facial recognition? any bright spots to take home today? amie: you should pick up this report -- it was amazing. the employee response to some of these technologies we have seen is really inspiring. i don't think it should be on the employees. that obligation shouldn't fall on them, but they have taken it up, and it is amazing to see them say, we are not going to do this anymore, and it has inspired change on the corporate level. the report discusses google saying, we will pull back from some of this. we're not going to go into the
jedi process, and that is a real change employees should feel really proud of. paromita: i totally agree. those are great conversations about workers, which is a conversation we are having across the country. technology workers are going to be part of that. >> two things that encourage me are, first, the level of public attention that seems to finally be rising up on this issue. i remember working on this all the way back in law school and being just as scared of it as i am now. the technology is going to destroy anonymity one day. we need to shut it down. no one had heard of facial recognition then. it was something in "minority report," and now there is a huge amount of attention in the news to all of that. the second, there is strong bipartisan consensus, a very
rare thing these days, that this is a technology that should not be given reckless treatment in how it is put into the world and used by government. there was a hearing on facial recognition last year, and there was universal condemnation about the way this was used and how broadly, and the lack of transparency and rules on it, from most democrats and republicans. hopefully, that can lead to good policy, and even if it doesn't immediately, the idea that you could actually get congress united on this is the sort of thing that could pressure companies to start making changes and being more responsible in how it is deployed. >> finally, what is your one burning question about the implications of governments using this technology that you think we should take away as we continue to
think about this issue? unanswered questions? amie: you can never solve everything. you can't stop all bad things from happening. on accuracy rates, you have to choose how many good people you want swept up in it and how many bad people you want to let go. what i keep asking is, what kind of world do we want to live in? what costs are we willing to accept by deploying these tools? because we will not get to perfection, and we are risking a lot in terms of civil rights, human rights, and civil liberties along the way. >> any last questions you have? paromita: the questions we had were about the companies -- who looked at the
infrastructure that amazon provides on the cloud, the people who designed the technology, the consultants that will be used, the new analysts that are going to be used in policing. i guess my question is, how do we incorporate that conversation with communities? because i think there is a real disconnect between what is happening in policy and how to connect it to communities. it feels like a very hard problem to solve because we don't have solutions. i feel like we do need that table. we need tables where we bring people from different walks of life to have the conversation, not only about litigation and policy, but about solutions on the ground, and those need to be coordinated. my question is, where are those
tables? if we don't have them now, who is building them? shouldn't we build them? this area needs to catch up with almost every other area of law. some of what is stopping us is getting to the standards that will be rolled out by government: what is accuracy? what is going to be due process? what is going to be probable cause? they will make definitions that will be the new definitions, right, in our law, and they will be given deference, especially in federal agencies. i need to have those conversations and think through strategies about how we counter and fight those inroads into all of our lives. >> in our last 30 seconds? >> i would say, how important
is obscurity? it is not something we think about much, but now that these programs can put a police tail on everyone in america all the time, how important is it that you can go unnoticed walking in and out of a medical clinic, a lawyer's office, an alcoholics anonymous meeting -- all those sensitive activities that, technically, you are not writing on a piece of paper you put in your desk in your house, but that, when most people think about it, are just as important for your privacy, for your identity. how much are we going to fight for that value now that new tech is putting it at risk? >> well, thank you so much for such a robust conversation. i certainly learned a lot, and i'm sure our audience has, as well. [applause]
let me add my thanks, as well, and while we give a moment for the panel to escape the stage, as it were, i am reminded of a seminal moment in privacy in the united states: the 1891 essay by the future justice brandeis, on the technology of the era and the invasion of privacy it represented. i mention this in the sense that we have faced challenges like this before. the technologies then were, first, the ability of newspapers to mass-produce your image in a photo, because until that time, your image was only known by the people you physically met. they were concerned about that. and the second was the use of an invasive technology of the era that was beginning to happen, in which the reference was that -- unlike a proper gentleman --
forgive me, it is international women's day, but this was 1891 -- unlike a proper gentleman who knocks upon the door and introduces himself, this device pierces the sanctity of one's home: the telephone, which could ring unbidden into your home. and telephones then had lines and cords. now we will move into our featured speaker. with that, i'll ask you to take over. lisa: i'm back. thanks again. i appreciated the panel discussion. it was insightful. this is a hard problem to solve, and i have the great pleasure right now of introducing our next speaker, who is working to solve this problem.
congressman jimmy gomez, who proudly represents california's 34th congressional district. prior to his election in 2017, congressman gomez served four and a half years in the california state assembly as the chair of the appropriations committee and was instrumental in shaping landmark legislation concerning public health, civic engagement, campaign finance disclosure, and access to education, just to name a few. these are issues near and dear to my heart, so i thank him for that. his role as a champion for civil rights and civil liberties is the reason he is giving special remarks here today. representative gomez has examined the civil rights implications and spread of facial recognition technology. he has demanded more information
to shed light on the role of amazon and other companies in providing tools to government agencies, and is leading efforts to create a policy and regulatory environment that catches up with the speed of technology. you've heard references to the fact that he was one of the 28 members of congress who were misidentified by amazon's facial recognition technology. a disproportionate share of the false matches were representatives of color, and this is despite the fact that they make up only 20% of congress. i am looking forward to hearing representative gomez enlighten us on his perspective of this and what we might do moving forward, and i look forward to working with him on the policy solutions. i would just, again, ask you to give a round of applause and welcome representative gomez. [applause]
rep. gomez: is it afternoon yet? not yet, not yet. thank you so much for that kind introduction, and i want to thank you for the invitation to come and speak at the national freedom of information act day. now, to technology. i represent the 34th congressional district, which is downtown l.a. and the eastside, quickly becoming one of the premier areas for new tech companies to relocate their businesses. we have the infrastructure, we have the human capital. we also have something most areas don't, which is space, and relatively affordable housing compared to silicon valley and venice beach, silicon beach. this is also in close proximity to the fashion district, to
entertainment, to green tech. it is all converging in downtown los angeles, where i represent, and as all of you know, technology is going to shape our lives, not only today but moving forward. some of these advancements totally benefit us, and others are proving less helpful or even dangerous. last year, the aclu did a test using amazon's rekognition technology that misidentified 28 members of congress. the aclu called my office, called my director for a comment, but didn't tell us the results. i asked what the results were. let me guess, it is mostly people of color.
this was before i ever saw the results, before i read the results or heard the results. the reason why? it seems people of color and minorities are always the second thought when it comes to any policy discussion, especially when it comes to tech. this example is something that woke me up, i would say, to this technology that is out there. and on a visit a few months before, i had gone to sxsw, the tech portion of it, and somebody mentioned diversity in tech and how that lack of diversity also leads to a bias in technology, and that sat with me. remember, the aclu ran that test using the same software that law enforcement is using.
they paid $12.33, and that is less than an uber ride from here to the lincoln memorial. it showed it is a deeply flawed technology, one that has severe consequences when implemented in real life, especially when it comes to latinos, african-americans, and undocumented immigrants. academics like yourselves and other folks have been showing face recognition is less accurate for darker-skinned faces and women. these concerns of mine became even more pronounced when i saw the test that was done and the fact that amazon is selling this to law enforcement and government agencies. i think it is one thing to use facial recognition technologies for consumer products, when someone is selling a pair of jeans, but we are not talking about
snapchat or instagram filters. we're talking about law enforcement. a mistake, as we have seen, can lead to a deadly interaction between law enforcement and the public. that is what i am deeply concerned about. that is what terrifies me, and i started conveying my concerns to jeff bezos directly, along with my colleagues john lewis and senator edward markey, both of whom were also misidentified. i reached out to amazon to ask several questions. has amazon built protections into their software to protect the privacy of innocent americans? does the software contain a mechanism for automatically deleting unused biometric data? what audits, if any, does amazon conduct of their software to ensure it is being used in an ethical and legal capacity?
is it integrated into police body camera technology or existing public camera networks? we also want to know the results of any internal accuracy or bias assessments amazon has performed on the software, and detailed information on how they test for facial recognition accuracy and bias, especially racial bias. these are just a few of the questions we have, and we develop more as time goes on, because we learn more and more and more. because this is relatively new technology, when it comes to facial recognition software being used by law enforcement, it is still in the wild west stages. we learn new things about the impact of the technology every single day, and that is why we need to address these questions immediately.
unfortunately, amazon has been delaying. they haven't been as forthright with answers as i would like. i'll be honest with you, they paid less attention to me when i was in the minority party, when i was on natural resources and oversight. but all of a sudden, when we became the majority party and i got on to ways and means, one of the most powerful committees in the house of representatives, their consideration of my concerns changed, especially when it came to the oversight committee. i think amazon is kind of asleep and doesn't really recognize the public furor starting to brew around the country. some of my colleagues from my legislative days in sacramento visited me in d.c. and asked what i was working on. i mentioned facial recognition as a big concern, and they said, you know what, we are looking into that, as well. i know one thing.
california can pass a piece of legislation -- introduce it in january and february and have it signed into law by october. if it starts there, it often leads the country and will spread from one state to another, so it is not just the federal government looking into it, but folks at the state level. not only do i want to see amazon address some of our concerns with the sense of urgency i think it deserves, but we also want to know: which law enforcement agencies have purchased their software? who are they marketing the software to? and are their products being used for massive government surveillance? i think the american people have the right to know. while we ask amazon these additional questions, we in congress need to ask ourselves important questions, as well. what is the role we will play in
the proliferation of this technology, particularly when used as a law-enforcement tool? what is the correct balance between regulation and law enforcement? what should be the mechanism for oversight and accountability? the answers will become clear as we learn more about this technology and its applications in the field. in the meantime, we need to continue an honest public dialogue with the stakeholders: tech companies like amazon, and the experts and academics who are on the front lines of research. since we sent our letter to amazon, i'm happy to report that they set up multiple meetings with my office, sent one of their top ai professionals to meet with my staff, and proposed to set up guidelines for facial recognition technologies that lawmakers could adapt. if they are serious, it wouldn't be in a blog post, it would be
in a press conference or press release, something more official. i know they are a tech company, and maybe a blog post is the way they do it, but it doesn't have the same weight as a real public statement. i think it's a good start, but it is definitely not enough to ease my concerns. i believe this software is just not ready for prime time, and right now i'm in discussions with members of my committee, the house committee on oversight and reform, to hold hearings on the use of this technology in the field. clearly our community has a long list of issues to tackle, but i believe that this is one of the issues at the intersection of privacy, civil liberties, and oversight, and the sooner we can address this issue, the sooner we can educate the public, identifying what the appropriate safeguards are, enhancing the
effectiveness of our local law enforcement officials and keeping our government and communities safe. i want to also bring up one last point. tech companies don't really know how to interact with government. they say, stay out of our business, we are the experts, and our technology and our issues evolve too quickly. maybe that is true. but we have a role and a responsibility to ask the questions that are laid out, because if we do not ask these questions, if we do not pause, then we will be stumbling into the night without understanding what is in front of us. and that is where we are headed. i am not somebody who is against tech companies. i am not somebody who doesn't want them to relocate to our districts.
but i think we have to put parameters around them that really force them to look at these issues. is it a regulatory mindset? is it direct intervention? what is it? these are a lot of the issues that we have dealt with in california. california tends to be at the forefront of a lot of these issues, but it is still difficult to pass legislation. i remember the battles when it came to body cameras -- the technology, how much it cost, who gets to use it, who gets to review it. i was originally told it would solve all our problems and people would be a lot safer because of it. but you have to ask these questions, because if we don't, the genie will be let out of the bottle and there's no way of putting it back in. i appreciate all of you coming and having this discussion. congress, one thing i know, it moves slow, and i have only been
here for a year and a half. but it is something that we will keep pushing. i also send my appreciation for this new report. i will take a look at it, try to learn from it, and see where we can push our case. i am actually heading back to sxsw for a panel on facial recognition with the aclu. this is something that we have to keep pushing. the american people, i don't think, truly understand the implications yet, and it is something that, if we don't continue to talk about it, is going to run away from us. this is my call to everybody that is watching and listening: thank you so much for being here. i have never claimed to be an expert on tech issues, but i do know how to ask questions, and i know that if we continue asking these questions, we will find the right balance between liberty and security.
thank you so much. [applause] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy.] [captions copyright national cable satellite corp. 2019] announcer: at that same event, congressman elijah cummings of maryland, who chairs the house oversight and reform committee, received an award from the news media for open government coalition. this is about 30 minutes. [applause] >> hi, everybody. good afternoon. as mentioned, my name is melissa, and i am the coalition director for the news media for open government, also known as nmog. we are pleased to honor the chairman of the house committee on oversight and reform, congressman elijah cummings, as the 2019 recipient of the sunshine in government

