
tv   House Oversight Hearing on Facial Recognition Technology  CSPAN  June 2, 2019 12:28pm-3:30pm EDT

12:28 pm
for god to heal people. tonight on c-span's q&a, a divinity school assistant professor and gospel scholar talks about her memoir, "everything happens for a reason," reflecting on being diagnosed with stage iv colon cancer at the age of 35. >> it is really gone, there is no pain in your stomach, right? >> you can see how quickly he moved from praying for her to his confidence in himself as that vehicle, then the idea that because she did not have pain in that moment, she is definitely healed. his very dramatic approach to faith healing is one i often found to be somewhat manipulative. >> q&a, tonight at 8:00 eastern on c-span. on wednesday, the house
12:29 pm
oversight and reform committee held a hearing to examine the use of facial recognition technology by the government and commercial entities, and its impact on civil rights and liberties. this is three hours. cummings: today, we are having our first hearing of this congress on the use of facial recognition technology. the oversight committee is uniquely suited to conduct a comprehensive review of this issue.
12:30 pm
we have extremely wide-ranging jurisdiction, covering federal, state and local entities and the private sector as well. cummings: i want to make clear that this is a bipartisan issue. both conservatives and liberals alike have real questions about when they are being monitored, why they are being monitored and who is monitoring them. what happens to this information after it is collected? we have been working closely with the ranking member. i sincerely appreciate his assistance and the assistance of his staff. facial recognition is a fascinating technology with huge potential to affect a number of different applications.
12:31 pm
right now, it is virtually unregulated. a report recommended that the fbi make numerous changes to its facial recognition database to ensure accuracy, transparency and privacy. in the last month, a letter was sent highlighting six priority recommendations that the fbi has yet to fully implement. all of this comes amid our rapidly expanding use of facial recognition. cities like san francisco are going in the complete opposite direction, banning the government's use of facial recognition technology altogether. we are seeing private companies
12:32 pm
use this technology more and more for advertisement, security and a variety of different customer experiences. again, there are virtually no controls on where this information goes. our committee held a hearing to review law enforcement use of facial recognition technology. as part of that hearing, we found that 18 states have memoranda of understanding to share their databases. half of american adults are part of facial recognition databases and they may not even know it. we have also heard testimony that facial recognition technology misidentifies women and minorities at a much higher rate than white males, increasing the risk of racial and gender bias. this issue is very personal for
12:33 pm
me. my district includes baltimore. after the tragic death of freddie gray at the hands of the police in 2015, my city took to the streets in anger. we also walked the streets of baltimore. we walked together to protest this tremendous loss of life and to urge our fellow citizens to find a peaceful solution to this crisis. the police used facial recognition technology to find and arrest protesters. it is likely that i and other members of our community who were exercising our rights were identified, scanned,
12:34 pm
and monitored using this technology. think about what i just said. whatever walk of life you come from, you could be part of this process. you could be at a gun rights rally or protesting gun violence. you could be pressing for the repeal of the aca or the expansion of health care. the government could monitor you without your knowledge and enter your face into a database. we need to do more to safeguard the rights of free speech and the rights of equal protection under the law under the 14th
12:35 pm
amendment. my hope is that today's hearing can be a broad review of these issues. we are honored and thankful to have such a distinguished panel as we have today. on june 4, we will be having our second hearing on this topic. we will hear from law enforcement witnesses. after that, i will be asking our subcommittees to conduct investigations on issues involving law enforcement, state and local issues and the private sector. our goal with this review is to identify sensible recommendations, legislative or otherwise, and also to recognize the benefits of this technology while protecting against abuse. mr. jordan: this is a critical
12:36 pm
hearing on a critical issue. congressional oversight is of paramount importance. i want to thank you. this committee has a history of working in a bipartisan manner when it comes to civil liberties and privacy rights. i champion your efforts on this. a few years ago, we had a hearing on stingray technology. some of you were at that hearing. this is a technology where we have this device. instead of people's cell phones going to the tower, the government can get your phone number and frankly can know exactly where you are standing. as the chairman mentioned, the potential for mischief, when you think about people exercising their first amendment liberties as some sort of political
12:37 pm
liberty, whether it is on the right or left, i think it is scary. we learned in that hearing that the irs was actually involved in using this technology. the same irs that targeted people for their political beliefs. we found that scary. now we talk about actual facial recognition and real-time video, as the chairman talked about. that is a scary thought. this is 1984, george orwell stuff. it troubles us all. i appreciate this hearing. i am aware that we will hear from our witnesses. we will talk about this important subject and how, as the chairman has said, it is virtually unregulated. i think that needs to change. with that, i would yield back, i
12:38 pm
look forward to hearing our witnesses and the discussion. >> i want to welcome our witnesses. we have the founder of the algorithmic justice league; mr. andrew ferguson of the university of the district of columbia's david a. clarke school of law; and an associate at the center on privacy and technology at georgetown university law center. i did the best i could with what i had. thank you very much. also, dr. cedric alexander.
12:39 pm
if you would all please stand and raise your right hand, i will swear you in. do you swear that the testimony you are about to give is the truth, the whole truth and nothing but the truth, so help you god? the witnesses have answered in the affirmative. the microphones are extremely sensitive, please speak directly into them. make sure it is on when you speak. without objection, your written statements will be made part of the record. you are recognized to give an oral presentation of your testimony. >> thank you, chairman and committee members, for the opportunity to testify. i am an algorithmic bias
12:40 pm
researcher. i have conducted studies that show some of the largest recorded racial and skin type biases in ai systems sold by ibm, microsoft and amazon. facial recognition technology has flaws. it has failed even on the face of oprah winfrey, labeling her male. i have had to resort to wearing a white mask to have my face detected by some of this technology. whiteface is the last thing i expected to do at m.i.t., the american epicenter of innovation. given what this is being employed for, not having my face detected could be seen as a benefit. the technology is being used to track muslim minorities.
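the disaggregated auditing described in this testimony, measuring error rates separately for each demographic subgroup rather than reporting one aggregate number, can be sketched in a few lines of python. the data below is hypothetical, for illustration only; it is not drawn from any real benchmark or vendor.

```python
# Disaggregated evaluation: compute a classifier's error rate per
# demographic subgroup instead of a single aggregate number.
# All records here are hypothetical, for illustration only.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit results for a gender classifier.
records = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "female", "male"),    # misgendered
    ("darker_female", "female", "female"),
    ("darker_female", "female", "male"),    # misgendered
    ("darker_female", "female", "female"),
]

rates = error_rates_by_group(records)
# The aggregate error rate hides the gap between subgroups.
aggregate = sum(1 for _, t, p in records if t != p) / len(records)
print(aggregate)               # 0.25
print(rates["lighter_male"])   # 0.0
print(rates["darker_female"])  # 0.5
```

an aggregate error rate of 25% here hides a subgroup gap of 0% versus 50%, which is why audits of this kind report results per subgroup.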
12:41 pm
there are many ways for this technology to fail. it could lead to false arrests and accusations. in rhode island, a brown university senior was misidentified as a terror suspect in the sri lanka easter bombings. police eventually corrected the mistake but the damage was done. she received death threats and her family was put at risk. mistaken identity is more than an inconvenience and it can lead to grave consequences. at a minimum, congress should pass a moratorium on the use of facial recognition, as the capacity for abuse, lack of oversight and technical immaturity pose too great a risk, especially for marginalized communities. the brown university senior is a woman of color under the age of 30. we fall into multiple groups that the technology repeatedly fails on the most.
12:42 pm
those with nonwhite skin, women and youths. due to the consequences of the failures of this technology, i decided to focus my research on facial analysis systems. these were tested with guessing the gender of a face. microsoft and amazon had errors of no more than 1% for white men. those errors rose to 30% for darker skinned women. i wondered how large tech companies could miss these issues. it came down to problematic data set choices. i found some surprising imbalances. one data set was 75% lighter skin, or what i like to call a pale male data set. we cannot analyze facial analysis technologies
12:43 pm
without talking about this. this must be made public and updated to better inform decision-makers about the maturity of facial analysis technology. this technology requires guidelines and oversight. companies like facebook built these capabilities by training their systems using our face data without our consent. regulations make a difference. facebook now makes facial recognition an option for users in europe. americans should have the same assurances that they will not be subject to this without consent. no one should be forced to submit their face data. just this week, a man sued uber after having his driver account deactivated over facial
12:44 pm
recognition failures. tenants have also protested an unnecessary facial recognition entry system. there is also bias in the use of this technology for health care purposes. facial recognition is being sold to schools, subjecting children to face surveillance. this could be the final frontier of privacy. congress must act now to uphold american freedoms and rights. congress should require all federal agencies and organizations using federal funding to stop using face technologies. thank you for the invitation to testify and i welcome your questions. >> thank you very much. prof. ferguson: thank you for the opportunity to testify today. i am a law professor who studies fourth amendment freedoms. i have been studying how new
12:45 pm
surveillance technologies shape police powers. i have a very simple message for you today: congress must act to limit facial recognition technologies. i have five main points. first, the fourth amendment will not save us from the privacy threat posed by facial recognition technology. the supreme court is making solid strides in trying to update these principles in the face of new technologies, but it is chasing an accelerating trend and will not catch up. second, the fourth amendment was never meant to be the sole source of government regulation. our entire system is premised on congress taking a leading role, guided by and in rare instances overruled by our constitution.
12:46 pm
we have welcomed congressional assistance in this area. it would be very unfortunate if privacy protection in the 21st century were left primarily to the federal courts using the blunt instrument of the fourth amendment. third, the few steps the supreme court has made offer guidance about how to avoid drafting a law that could get struck down on fourth amendment grounds. the supreme court struck down the practice at issue in carpenter v. united states. this fourth amendment floor should be a baseline consideration. fourth, as congress builds a scaffolding off that constitutional floor, we have to think about the technology not just through the
12:47 pm
lens of today but with an eye toward expanding surveillance technologies that will link and share data in ways that will reshape the existing power dynamics of government and the people. we are not just talking about technological hardware, cameras, computers and tools. legislation must improve privacy protections with an eye toward the sophistication of the systems of surveillance. finally, these fourth amendment questions must be coupled with civil rights and fundamental fairness when it comes to public safety protections. the burden of surveillance technology has never been shared equally across socioeconomic or racial groups. congress needs to regulate with racial justice in mind. in my written testimony, i laid out the different types of facial recognition that congress needs to address today.
12:48 pm
one is generalized monitoring without any individualized suspicion. separately, non-law enforcement and emergency uses of face recognition must also be analyzed. i have reached the conclusion that the fourth amendment will not resolve these core questions. i would like to emphasize two points that emerge from my legal analysis. first, federal legislation should be drafted to ban generalized face surveillance for all ordinary law enforcement purposes. whether using third-party images or government-collected ones, arbitrarily scanning and identifying individuals without any criminal suspicion to discover personal information about their location, interests or activities should be banned by law. second, federal legislation should
12:49 pm
authorize the use of facial recognition only for targeted searches based on probable cause, plus declarations that steps are taken to minimize the collection of other face images. this standard would apply to all face recognition, both third-party image scans and government-collected image scans. in my written testimony, i try to defend these recommendations as consistent with law and reality. i hope they offer a bipartisan way forward. at present, unregulated facial recognition technology should not be allowed to continue. it is too powerful, too chilling, too undermining of principles of privacy, liberty and security. i am happy to answer questions. ms. garvie: good morning, distinguished
12:50 pm
members of the committee, thank you for inviting me to speak to you today. face recognition poses unique threats to our liberties. i would like to raise three core points about our constitution that i hope will be helpful as this committee continues to examine this powerful technology. face recognition gives law enforcement a power it has never had before, and this power raises questions about our fourth and first amendment protections. police cannot secretly fingerprint people from across the street. they can't walk through a crowd demanding everybody produce their driver's license. but with face recognition, they can scan faces remotely and in secret. last year, the supreme court found that government monitoring of our movements across space can violate our privacy.
12:51 pm
face recognition enables precisely this type of monitoring. that has not stopped chicago, detroit and other cities from piloting and using this capability. ms. garvie: the first amendment protects the right to anonymous speech and association. face recognition technology threatens to upend this protection. law enforcement agencies themselves have acknowledged this, cautioning that the technology could be used as a form of social control, causing people to alter their behavior in public, leading to self-censorship and inhibition. that did not stop the baltimore county police from using facial recognition on the freddie gray protests. facial recognition makes mistakes, and the consequences will be disproportionately felt by african americans. communities of color are more in danger.
12:52 pm
studies have found that facial recognition is used on african americans up to two times more. people of color are disproportionately enrolled in facial recognition databases because of the mug shot databases they are built on. accuracy also varies depending on the person being searched. facial recognition makes mistakes, and it makes more misidentifications of african americans. this is like the brown university student wrongly identified as one of the sri lankan bombers earlier this month. police are using facial recognition in ways inconsistent with these principles. this threatens our due process rights.
12:53 pm
my research has uncovered the fact that police submit what can only be described as garbage data into facial recognition systems expecting good returns. in one case, a photo of woody harrelson was entered in place of a look-alike suspect. in another, they submitted a photo of a suspect with the mouth and eyes from another photo, essentially fabricating evidence. officers skip identification procedures and go straight to arresting someone on the basis of a facial recognition search. this is counter to common sense and to department policies. these practices raise serious questions about the accuracy of searches and the innocence of persons arrested because of a face recognition search. defendants are left in the dark about all of this, often never told that facial recognition was
12:54 pm
used to help identify them. this is information that under our constitutional right to due process must be turned over to the defense. it is not. for all these reasons, a moratorium on the use of this technology is both appropriate and necessary. it may be that we can establish common sense rules permitting uses that promote public safety while prohibiting those that threaten our civil rights and liberties. until then, this technology is too powerful, too pervasive and too susceptible to abuse to continue unchecked. thank you for your time. i look forward to answering questions. ms. guliani: thank you for the opportunity to testify today. law enforcement across the country, including the fbi, uses facial recognition without legislative approval and in most cases in secret. it is time to hit the pause
12:55 pm
button. congress must intervene to stop the use of this dangerous technology until we can fully debate what, if any, uses should be permitted by law enforcement. use of this technology is resulting in very real harm. chairman cummings, in your opening statement, you discussed the use of facial recognition at the freddie gray protests in baltimore. i worry about our rights. it is not the only disturbing example. in florida, there is the case of william lynch, an african-american man arrested and convicted of a $50 crime. this was based on a poor quality photo that was secretly taken. now mr. lynch cannot even get key information about the reliability of the algorithm used in his case so that he can challenge his conviction.
12:56 pm
i want to highlight three reasons why it is particularly urgent that congress act now and then offer two recommendations for areas where additional oversight is needed. why is this so urgent? we have never seen anything like this technology before. there are over 50 million surveillance cameras in this country. this, combined with face recognition, leads to a surveillance state. law enforcement is exploiting large-scale databases, like drivers license photos, for face matching. this impacts the rights of everybody in these databases. we don't have the option of leaving our faces at home to avoid being surveilled. this technology is more likely to harm vulnerable communities, including communities of color. it is less accurate on certain subgroups, including women and people with dark skin.
12:57 pm
we tested amazon's facial recognition product against members of congress. there were 28 false matches, including representative gomez. 40% of the false matches were members of color. in the real world, poor communities and communities of color are over-policed. they are more likely to be stopped, arrested and to have charges brought against them. this heightens the risk of associated errors. this technology is also not being used consistently with the constitution. facial recognition is even more invasive than the tracking at issue in the carpenter case. it is being used without a warrant and without protection.
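false match counts like the 28 reported in this testimony depend heavily on the similarity threshold the operator chooses; amazon has reportedly recommended a much higher threshold for law enforcement than the default. a minimal sketch of the trade-off, using hypothetical similarity scores rather than any real system's output:

```python
# How the match threshold drives false matches in a face
# identification system. All similarity scores are hypothetical.

def false_matches(scores, threshold):
    """scores: list of (is_same_person, similarity in [0, 1]).
    Returns the number of different-person pairs wrongly matched."""
    return sum(1 for same, s in scores if not same and s >= threshold)

# Hypothetical comparison scores between probe photos and a database.
scores = [
    (True, 0.97), (True, 0.91), (True, 0.88),     # genuine pairs
    (False, 0.85), (False, 0.82), (False, 0.74),  # impostor pairs
    (False, 0.61), (False, 0.40),
]

# A stricter threshold yields fewer false matches; a looser default
# of 0.80 lets two impostor pairs through.
print(false_matches(scores, 0.95))  # 0
print(false_matches(scores, 0.80))  # 2
```

raising the threshold cuts false matches but can also cause the system to miss genuine matches; where to set it is a policy choice as much as a technical one.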
12:58 pm
the government is also not complying with its obligations. in one county, over a 15-year timeframe, they used face recognition in investigations, yet the county public defender reported never once receiving that information as evidence, as is required by supreme court precedent. to debate this issue, we have to do so with facts. one, how are ice, the fbi and other organizations using this technology? the fbi has broken more promises than it has kept. it has not fully tested the accuracy of its systems, nor does the agency seem to be complying with the constitution. ice has met with amazon representatives, but we know little else.
12:59 pm
the committee should examine these issues. two, we should look at companies that are aggressively marketing this technology to the government: how accurate their technologies are and what responsibility they take to prevent abuse. companies are marketing these technologies for serious uses, like identifying someone during a police encounter, and we know far too little. amazon won't disclose who uses this technology, and companies like microsoft and facebook have not received significant congressional attention. finally, efforts across the country should stop the dangerous spread of this technology. amazon shareholders are taking an unprecedented step, forcing the company to study the human rights impacts. congress should put a moratorium on law enforcement use.
1:00 pm
i look forward to your questions. dr. alexander: thank you for having me here today. i'll speak from the perspective of a 40-year police veteran. based on a 2016 investigation by the center on privacy and technology, at least a quarter of u.s. law enforcement agencies use facial recognition searches of their own databases or those of other agencies. 16 states permit the fbi to use the technology to compare suspects' faces with images of
1:01 pm
state ids. law enforcement sometimes uses facial recognition prudently and wisely, sometimes recklessly. the washington post reported that some of these agencies used altered photos, artist sketches and even celebrity look-alikes in facial recognition searches. using artificial intelligence to confer on a highly subjective visual impression a halo of digital certainty is neither fact-based nor just. but it is not illegal, for the simple reason that no federal laws govern the use of facial recognition. at this point, law enforcement use of facial recognition is not only unregulated by law, it operates without any consensus on best practices. artificial intelligence systems do not invent results from thin air. they operate from databases of
1:02 pm
identified faces in an attempt to match one of those identified faces with the face of a subject of interest. an artificial intelligence system is only as good as its databases, and there is currently no standard to govern the content of any facial recognition database. who is in it? who knows? what we do know is that these databases fail to represent the size and diversity of the american population and are therefore inherently biased samples. real-time video surveillance can identify criminal activity in progress, but for the purposes of investigation, the fourth amendment guarantees against unreasonable search and seizure. a warrant should be required.
1:03 pm
even where we leave the constitution aside, is the concept of surveillance and facial recognition going to impinge on our rights? these questions are barely even addressed. how accurate is facial recognition technology? the answer is, it depends. what error rate is acceptable for law enforcement? identifying a partially turned face is like trying to read a badly smudged fingerprint. the same applies for poor quality images, which result in erroneous identifications. one might think that artificial
1:04 pm
intelligence would preclude racial and other biases. yet facial recognition algorithms marketed by software suppliers in this country were significantly more likely to misidentify black women than white men. error rates were lowest in the case of white men and highest in the case of darker skinned females. this is serious enough that san francisco considered a citywide ban. to some, this seems to be an overreaction, but without even generally agreed-upon best practices, it is an understandable overreaction. human beings are hardwired by evolution to
1:05 pm
suspect and fear the unknown. facial recognition plays into that primal fear. the georgetown law center on privacy and technology reports that defense attorneys never received facial recognition evidence as part of disclosure, which many of us know is required and which may prove the innocence of a defendant. only a small minority of agencies disclose how frequently they use facial recognition technology. very few agencies even monitor proper use of facial recognition systems. the vast majority of agencies don't have oversight to detect misuse. neither federal, state nor most local governments subject police policies concerning facial
1:06 pm
recognition to legislative or public review. unchecked, these technologies threaten human rights and civil rights and promote fear and suspicion. like so many digital technologies, facial recognition was not long ago the stuff that we thought of as science fiction. now many of us carry a smartphone that recognizes our face when we take it out to make a call or send a text. it has become a normal part of everyday living, and most americans have no trouble accepting that facial recognition can be of value to law enforcement. but without the judicious and just application of human intelligence, including full disclosure, transparency, public accountability, proper oversight and
1:07 pm
regulation, they are blunt instruments. blunt instruments become weapons. >> thank you. rep. hill: thank you to all of you for being here. this is of particular interest to me because one of the communities in my district is planning on implementing facial recognition technology in the city. in the carpenter case, the court found that police violated the fourth amendment when they collected cell phone location data without a warrant. the court noted that congress is a better arbiter of controlling lawful use of technologies than the courts. legislation is much preferable to the development of an entirely new body of fourth amendment case law for many reasons, including the complexity of the subject and the fourth amendment's limited scope. ms. garvie, can you speak
1:08 pm
to how this is developing? ms. garvie: i can. very fast. the algorithms are getting better, and the results of the systems will keep getting better. but it doesn't matter how good an algorithm gets; if law enforcement agencies put wrong data in, they will not get reliable results. these algorithms are not magic. rep. hill: do you think the supreme court can rule quickly enough on this? prof. ferguson: this kind of technology should be regulated first by congress; the fourth amendment floor will still exist. this body has the primary responsibility to
1:09 pm
regulate in this field. rep. hill: from your testimonies and your answers right now, it sounds like federal legislation is necessary in order to avoid a patchwork on this. san francisco barred law enforcement from using this technology. this attracted attention from around the country. at the same time, we have local governments adopting this. the san francisco ordinance is inspiring further bans in places like oakland. many states have also partnered with the fbi to grant access to their collections of drivers licenses and photos for this. the patchwork of rules on facial recognition can be difficult to navigate. a person could be afforded
1:10 pm
certain protections but then drive several miles and be subject to a wholly different regime. can you discuss the range of different standards we see across the states? ms. guliani: by and large, when we are hearing from communities, it is with a great deal of anxiety. this is done without community input, without clear rules and the right standards to protect the first amendment and other core values. one of the important things about this hearing is that people ask these questions and then we have the debate. until we have that, we should not be using the technology. rep. hill: if we are talking about a local government or a company that wants to implement this, what advice would you give? what recommendations do you have for us in this body? do we need to move quickly and
1:11 pm
set up a baseline of how local governments should operate at this stage? ms. guliani: do not use the technology until there has been a legislative process. you have seen this pop up in a lot of cities around the country. the idea is that we should not put the cart before the horse; we should study the harms before rolling something out. that would be my greatest recommendation, and we should do that federally as well with this technology. >> i would add that most state and local law enforcement systems have been purchased using federal grant money. that means congress has incredible power to decide how much transparency goes into implementing these technologies and what rules are in place as well. prof. ferguson: i think congress is able to set the floor and allow local
1:12 pm
governments and cities to build off that. in places like san francisco, we have seen real leadership on democratic control over surveillance technologies, but i don't think that is a reason for congress not to also act. rep. hill: could you also provide us recommendations about that? prof. ferguson: i would be happy to work with the committee going forward. rep. hill: thank you, i yield back. rep. jordan: facial recognition systems make mistakes. they disproportionately impact african americans and people of color. this appears to be a direct violation of americans' first amendment and fourth amendment liberties. it seems to threaten americans' due process rights. all that happens in a country
1:13 pm
with 50 million surveillance cameras. is that accurate? ms. guliani: that is correct. rep. jordan: how does the fbi get the data in the first place? ms. guliani: they used state drivers license databases. states have allowed these to be used by the fbi. they also use passport photos. rep. jordan: who made the decision to allow the fbi to do that? ms. guliani: there were memorandums of understanding in the 16 states. in at least one state, that was against a law, and ultimately the attorney general had to suspend that use. rep. jordan: did the state legislature and the governor actually pass legislation saying it was ok for the fbi to access every single person in the state that has a drivers license? ms. guliani: no, that is the
1:14 pm
problem. this was all in secret. rep. jordan: some unelected person at the fbi talked to some unelected person at the state level and they said go ahead? in the case of ohio, we have 11 million people, most of them drive. here are 10 million people that you could have this data of. ms. guliani: and the people who wanted a drivers license did not always know the systems were operating. rep. jordan: that was my next point, the individuals themselves did not give permission. ms. guliani: that is right. rep. jordan: were they notified at any time when they take their pictures? any type of information given to them that this may go to the fbi? ms. guliani: i think people have been unaware of the systems and how they operate. rep. jordan: the individual is unaware, nor did their representatives make a decision. yet that information is going to the fbi.
1:15 pm
that scares me, particularly in the context of the example you used. do you really want the government to have this capability? with the things that we learned were engaged in at the fbi and the bias that existed, no one in an elected position made the determination? ms. guliani: that is right. rep. jordan: when the fbi has this and they access the database, what sort of things do they have to put in place before they access it? is there a type of probable cause? a type of due process? ms. guliani: they don't have to meet a probable cause standard. they're not even notifying people in cases where it is relevant to their case. rep. jordan: 50 million cameras, a violation of first amendment and fourth amendment liberties, all kinds of mistakes.
1:16 pm
this disproportionately affects african americans. no due process, no elected officials gave the ok. does the fbi share this information with other federal agencies? ms. guliani: they have partnerships with federal agencies like the state department to scan through their passport photos. we don't know very much about how other federal agencies are using this. rep. jordan: does the irs have access to this? do they have a partnership with the fbi or any other federal agency? ms. guliani: i don't know the answer to that question. there should probably be some kind of restriction. rep. jordan: mr. ferguson, you said we should just abandon it? prof. ferguson: i think there should be regulation of face recognition technologies.
1:17 pm
if we're not going to regulate, we should push the pause button on this technology now. it is as dangerous as you are expressing. rep. jordan: it seems to me it is time for a timeout. 50 million cameras, real concerns. what troubles me is the fact that no one in an elected position made a decision on this. this is more than half the population of the country. that is scary, particularly in light of what we have seen. we have to remember the framework. years ago, the irs targeted people for their political beliefs. they did it. no matter what side you're on, this should concern us all. this is why i appreciate the chairman's work on this issue and the bipartisan nature of this. i yield back. >> just to clarify, you are recommending a
1:18 pm
moratorium? >> until there is scientific evidence that shows that these technologies have reached maturity. with what we know from human-centric computer systems, as they are based on statistical methods, there is no way the technology will be 100% flawless. there are trade-offs that need to be made. the academic research does not exist to say this is what it looks like for meaningful thresholds. >> thank you for this hearing, and the ranking member. we have these cameras everywhere. we are a little late, late in saying that you should not be surveilling people when there is nowhere that we don't surveil people. i remember when we first began to surveil people, it became an issue.
1:19 pm
we wondered if this was really right. i don't know if it was ever challenged. this takes me back to my days as a law professor. there are all kinds of hypotheticals that occur to me. i have to ask you, mr. ferguson, and maybe ms. guliani: is there a connection to the misidentification of people that happens all the time, where the police draw in people based on someone saying, that is who i saw? what is the difference? how are we to argue that in court? ms. guliani: one of the big differences is with an eyewitness, you can ask them how far away they were, if they were intoxicated.
1:20 pm
with algorithms, you cannot put them on the stand in the same way. the technology is being presented as if it is perfect when it is not. >> let me ask you about police encounters. suppose the police stop you in the first instance; there is some probable cause there. you can contest it. can they put your photo in a database that the police have, having already had probable cause? you can show up in court. >> right now, a police officer can. there is no regulation on any of this. the concern is that may not be the way we want to go forward. that is a use. there are companies that sell technology for that reason.
1:21 pm
it is a reason to act. rep. norton: i raise a hypothetical because i think we are already doing what we are afraid of, and that we ought to look very closely at regulation. or watch out, we will be regulating stuff that is already done by law enforcement, to which we have given a pass. i wonder if there have been any recent supreme court decisions on cell phone location data? prof. ferguson: when you're talking about a system of cameras that can track where you
1:22 pm
go, the principles that chief justice roberts was concerned about apply: this idea of the permanent record where you can go back and see where people have gone. this footage is available and you can track where you have been at every moment. we don't like arbitrary police powers, and all of that speaks to the fact that the supreme court might see this as a fourth amendment violation. unfortunately, these cases take a long time to get there. relying on the fourth amendment may not be the place we want to be. i think congress has the opportunity to act now. rep. norton: in light of issues raised by the ranking member, i mentioned that this
1:23 pm
monitoring of facial recognition has been done on a mass scale over time. do you think this is a violation of the fourth amendment, in doing the very monitoring that is done now? for example, if we have an inauguration, that monitoring is done all the time. that is monitoring on a mass scale. if it is done over time and is part of regular surveillance for the safety of those involved, do you think the court would see that monitoring over time as unconstitutional? >> you may answer the question.
1:24 pm
>> that decision is available. when you mentioned tracking people over long periods of time, the court said that the tracking was unconstitutional. i think there are significant first amendment concerns. if there is a policy of identifying every protester every time they go to a protest, i think there is strong case law that would raise constitutional concerns with that. fundamentally, they decided it decades after we all started using cell phones. it takes time for these things to work through the system. it is harder when people are not receiving notice. >> were you trying to answer the question as well, mr. alexander? dr. alexander: the question you raised was a very good one. let me respond from a
1:25 pm
law-enforcement perspective. this technology that we are referring to can be very valuable in terms of keeping our communities safe and keeping our country safe. there are opportunities for that. the problem that has occurred is like a horse that has already got out of the gate. now we are trying to catch up with it. when you think about the vast utilization of facial recognition that is going on and the questions we are posing today, they're going to come with a great deal of challenges. i kind of cringe in some ways when i hear my colleagues respond, saying there should be a complete moratorium on facial recognition. i'm not sure that is the answer. what i am more concerned about is the misuse of the technology and how we think about that, and how we differentiate between when it is being used
1:26 pm
correctly and when it is not. here is the problem i have for policing in this country as it relates to this technology. the police end up being the end-user of a technology created by some software or technology firm somewhere. i go out and use the technology. if i am not properly trained, if i am not properly supervised, if there is no policy, if there is no transparency about how and when this technology is being utilized, the police chief, that police department, ends up being the bad guy. god knows that is one thing that police don't need, considering the environment we are all very much in. there is a place for this technology, but i think more importantly, to me and for any of my colleagues out there, i
1:27 pm
need to be able to be sure that i can train my people adequately. the software companies can't just pass this technology to me; i have to be sure that my people are trained. there are ethics and morals that go along with it, there are policies, there are standards. there are good practices that we know and feel good about. i'm not sure that a total moratorium works in light of the fact that we live under a great deal of threat. can we still use this technology while we are trying to develop some standards? >> thank you very much. >> i could not have said any better the concerns shared about this being done without
1:28 pm
accountability by people elected to represent them. this is scary. how far agencies have gone, this is real scary stuff. dr. alexander, i like your analogy of the horse getting out of the gate too quick. i have the tendency of, let's err on the side of liberty. what would be limited, appropriate use, and at what point do we need to have the technology developed? >> for me, that is a very tough question, but i think this hearing and the hearings that are to come, and maybe even
1:29 pm
some smaller sessions, particularly with federal law enforcement that utilizes this technology to fight off potential threats on a much larger scale. i think when you start talking about local policing, that is another issue in itself. i think they have an opportunity in terms of how they are using this technology, and i think it can best benefit them if we can develop some limited framework in which they can operate, because it is a serious problem we have. it is not as transparent as it should be, and it is certainly going to create a great deal of anger among americans in the country when people feel their first and fourth amendments are violated and we find ourselves in the position we don't want to be in. this is going to require
1:30 pm
further conversation beyond today, but it is something we have to act on now. i'm not sure if a total moratorium on this is going to be the answer, because we still have a homeland we have to protect and there's still some value in facial recognition. >> in my testimony, i lay out the way to regulate this, which involves probable cause, requiring a sworn affidavit to be able to search the database. there could be other pieces to this: the documentation so you could answer the questions of whether it was used and by whom, and to make sure there was a process in place to see what had happened and be able to check the abuses.
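the controls the witness lists, a sworn probable-cause showing before any search plus documentation of who searched and why, can be sketched as a gate around the database. everything in this sketch is a hypothetical illustration, not any agency's actual system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SearchRequest:
    officer: str            # who is asking
    case_id: str            # which investigation
    sworn_affidavit: bool   # probable cause attested under oath
    basis: str              # the written probable-cause basis

@dataclass
class AuditedFaceDatabase:
    """Refuses searches without a sworn probable-cause showing and
    logs who searched, for what case, on what basis, and when."""
    audit_log: list = field(default_factory=list)

    def search(self, req: SearchRequest):
        if not (req.sworn_affidavit and req.basis):
            raise PermissionError("denied: no sworn probable-cause affidavit")
        self.audit_log.append({
            "officer": req.officer,
            "case_id": req.case_id,
            "basis": req.basis,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        return []  # placeholder: candidate records would go here

db = AuditedFaceDatabase()
db.search(SearchRequest("officer_a", "case-001", True, "warrant 19-123"))
```

the point of the log is the audit step the witness describes: after the fact, a reviewer can answer whether the database was used, by whom, and on what basis.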
1:31 pm
you mentioned facebook in your remarks and i find that interesting, because i am extremely concerned about the government having this unchecked. i would be curious to get your thoughts, and also if you want to speak to that too. >> you're looking at a platform that has over 2.6 billion users, and over time, facebook has amassed facial recognition capabilities using all the photos we tagged, without our permission. what we are seeing is that we don't necessarily have to accept this as the default, so in the eu, where their gdpr was
1:32 pm
passed, they have an option where you have to opt in. right now, we don't have that in the u.s., and that is something we could immediately require today. >> just to add to that, it is certainly something where federal legislation is needed. you can look at illinois for biometric laws. very importantly, they have a private right of action, so if facebook or any other company violates my rights and uses my image without my permission, i can take them to court, and that is an important accountability mechanism. >> the other point to bring up is that oftentimes data is collected for one use and ends up in another scenario. a recent example is a photo sharing company where users uploaded images and later on, they found out the photos
1:33 pm
were used to train facial recognition, so we definitely need data protection when it comes to the disclosure around data. >> mr. clay. >> thank you, and i thank the witnesses for being here. the use of facial recognition technology has already generated a heated debate on whether it is a necessary tool to combat crime or an unlawful breach of privacy. the technology identifies people's faces and matches them against a list of images of missing people and wanted people. critics have described the technology as orwellian.
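the matching step described above, comparing a probe face against a watchlist, is typically a nearest-neighbor search over numeric "embedding" vectors. a minimal sketch with made-up vectors (real systems derive these from images with a neural network):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# hypothetical enrolled embeddings (a watchlist of missing/wanted people)
gallery = {
    "missing_person_1": [0.9, 0.1, 0.3],
    "wanted_person_7": [0.2, 0.8, 0.5],
}

def identify(probe, threshold=0.95):
    """Return the best gallery match if it clears the threshold,
    else None. A lower threshold returns more matches, including
    false ones."""
    name, score = max(((n, cosine(probe, e)) for n, e in gallery.items()),
                      key=lambda t: t[1])
    return (name, score) if score >= threshold else (None, score)
```

note that the search always produces a "best" candidate; only the threshold decides whether that candidate is reported as a match, which is why threshold choice matters so much in the testimony that follows.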
1:34 pm
i have questions about the accuracy, protections against misidentification, and obviously civil liberties issues. i want to hear more from you about the testing of amazon's rekognition software. the aclu alleges amazon's software incorrectly matched members of congress, identifying them as people who have been arrested for a crime. if members of congress can be falsely matched with a mug shot, what should be the concern for the average american about this facial recognition? >> i think one of the most interesting things is that running the test cost about $12
1:35 pm
and we took photos of members of congress, matched them against mugshots, and we found 28 false matches, and these were independently verified. i think that is one of the things that is important to note: it is not just our test, others have noted similar problems with other face recognition algorithms. imagine this in the real world. you are arrested or pulled over by police and they say, we identified you as this person. you don't have the information. they got it wrong. the prudent thing to do would be to hit the pause button, understand the danger, understand whether this technology is helpful, and let
1:36 pm
legislators like this decide. >> has the aclu shared the data that resulted in false matches with criminal mugshots? >> we had it independently verified by a stanford professor. some members do not want to have their photos in newspapers. researchers also tested amazon rekognition using the same methodology that we used. that methodology was available over a year before we tested, and they also found error rates of over 30% for darker-skinned women
1:37 pm
and 0% error for those with lighter skin. >> has any of that been corrected? >> after the first studies we did, they did improve their accuracy disparities, but even when they improved, they still performed better on men and those with lighter skin. >> you know what the collateral damage can be through misidentification, and i have fought for years to free people who were wrongfully convicted, and this is where this is going with a lack of rules and
1:38 pm
regulation and technology. >> we don't even have the reporting requirements. at least in the u.k., there are results. there is a big report that came out and showed over 2000 innocent people had their faces misidentified, and so they are building on what the aclu said. we don't have any kind of requirement in the united states. >> all in the name of profits. before we go, listening to all of this, it seems like we have a defective system.
1:39 pm
you can say that it has good uses, but it also can be extremely harmful, and when you balance people's rights, that is the problem. in the law, we are constantly trying to do a balancing act, but when you have a product that is defective, that is the problem. it has a chilling effect on that. i'm going to come back to you. thank you. >> the supreme court case held that the government is required to release to the defense evidence
1:40 pm
that is exculpatory, and i'm worried that this technology presents a threat to that. for instance, if the system returns multiple photos and they narrow it down to a single suspect, does it require the fbi to share those other photos that were similar to the suspect in question? >> there could be a scope of exculpatory evidence: that the algorithm has a reliability problem, or that it returned similar photos indicating they could be the person. you could say, look, i have been misidentified and other people were tagged by the system. one of the concerns is we are not seeing brady disclosures, and this has been used thousands of times.
1:41 pm
judges are not having the opportunity to rule on critical issues. >> when a human makes an identification or misidentification, you can cross examine the human. all of those things you can cross examine, but you cannot cross examine an algorithm. has the government been providing descriptions of how the algorithms work? >> they have not, and we have found that defendants usually will not know if it was used in the first instance. law enforcement agencies don't have access to the training data, because these are private companies that develop the
1:42 pm
systems and it is considered a trade secret, so the officer and agency say they cannot turn that over, so we have not seen public defenders having access. face recognition systems are designed to return multiple matches. essentially, the algorithm was playing witness, saying with 90% confidence this is this other guy, and yet the person who is 70% is the one who is charged. >> i have a couple of minutes left. we have talked about the cases where facial recognition doesn't work. it is very concerning. now i want to ask about the case where it works 100% of the time, where there are no mistakes
1:43 pm
and nobody gets left out. can you speak briefly to how china is using real-time facial surveillance? >> we see china as a roadmap of what is possible. it is a system where everybody is enrolled in the back end and there are enough cameras to track where someone is anytime they show their face in public. upload their photo and see where they have been over the last two weeks: that public rally, an alcoholics anonymous meeting, or a rehab clinic. that information is now available at the click of a button or the upload of a photo. that is what this looks like with no rules. >> do we have any evidence that federal or u.s. agencies are
1:44 pm
using cameras and monitoring them today. >> at least two major jurisdictions have purchased this capability and have paid to maintain it. chicago says they do not use it, and detroit does not deny they are using it, specifically at locations like gas stations and liquor stores and churches and clinics and schools. >> is there a minimum threshold of suspicion or evidence before your face is searched in real-time in one of these databases? >> in those jurisdictions, there are rules about who ends up in the databases. by and large, there are no rules around this. >> thank you, i yield back. >> thank you, this is a
1:45 pm
difficult discussion as we try to balance our private rights with the potential proper use of facial recognition technology. i want to focus on three different areas. the first one is getting a sense of the panel: do some of the panel believe we outright need to stop this technology from being used? second, if we do agree to proper use, some sort of definition around that for private use; and third, if there are violations of that proper use, what would we consider appropriate penalties? is there anyone on the panel that flat-out says we need to stop this technology? >> it depends on the kind of
1:46 pm
technology we are talking about. we have facial recognition being used in employment. a company purports to do video analysis on candidates for jobs with verbal and nonverbal cues. there's a case where an uber driver had their account deactivated because uber uses a face verification system to determine if you are actually the driver, and you had transgender folks kicked out of uber because of these misidentifications. i would not necessarily say it is a flat-out ban, but we need to be specific about which cases we are talking about. as it stands right now, i think there should be a moratorium, and
1:47 pm
we don't have regulations. >> i would say two things. one, there will be uses of this technology where we have real-time tracking, and two, i think if any are permissible, we need the facts. in a u.k. study, there was a 95% inaccuracy rate. without that data, it is hard to answer all the questions. >> my concern is that bad actors are always going to use the tools that they can access, and if they can access these tools, even though we want to prohibit it from happening, they will access it. my sense is that we need to figure out what is proper legislation, and if we do move to that question, law-enforcement
1:48 pm
versus private: law-enforcement has been using digital enhancement of photos for years and i don't think anyone is suggesting that is stepping over the line. my question is, how do we make sure law-enforcement, in using this technology, is using it in the proper way? >> let me respond to that, and i think to your question, quite frankly, there is a string that goes between them, and it goes to the fact that there may be a place to utilize this technology. the problem is, the technology is developed by a software company and sold to a police department without proper training, without proper oversight, and with unintended consequences. that becomes a real problem.
1:49 pm
we know that this technology has been given to the police, and when police were asked to describe how it's used, they cannot do so. that is a problem, and they should not be utilizing this technology. going back to the question that was asked a moment ago, have there been times where this technology has been proven to be valuable for law enforcement to keep communities safe? yes, we are trying to keep communities safe and at the same time trying not to violate people's first and fourth amendment rights. they rub against each other and we somehow have to figure this out, but i don't think you can do one and just ignore the other. >> with this technology, we see there are mistakes. is there a greater propensity for mistakes, whether it is
1:50 pm
artist rendering or photographs in general? >> i think the concerns are different. we talked about how law-enforcement can identify you without asking for your id, and i raised a couple of points. frankly, the community would not be able to raise that with their leaders. the scale is different, the secrecy is different, and so we have to address those before we can talk about the good uses. >> thank you. >> i wanted to ask questions about facial recognition technology reforms. first question to professor
1:51 pm
ferguson: should states and municipalities be able to enact their own facial recognition technology laws? >> i think state and local governments can do that and create more protection. >> you testified that some have taken steps to ban it. can you speak on what those look like? >> in san francisco, there was a recent ban on use by the government, and there has been legislation introduced that would put a moratorium in place and study the technology, and not permit use until the study is complete. there are 13 localities that require, for technology like face recognition, that there has to be a process and there needs to be a look at
1:52 pm
privacy impacts. >> is it agreed by all the panel that the federal government needs to set the floor before municipalities create their own rules and regulations, or no? >> they both need to act, both the state and federal government. >> we have 50 million cameras in the country, a system that, as we said earlier, makes mistakes all the time and disproportionately hurts people of color. i think it violates first amendment liberties and due process standards. no elected officials are weighing in on this, and then i also think there is this chilling impact, this
1:53 pm
intimidation concept out there, and it seems to me in some ways similar to the naacp case, where disclosure was the issue; this is going to be constant disclosure. am i reaching, or are there similarities to the whole intimidation that happens, in that this is going to be in effect a constant disclosure of what you're up to? >> i think there is nothing more american than the freedoms of speech and association, and we have seen that this technology can chill both of those: being able to support an incumbent, or a political candidate who wants to go against the incumbent.
1:54 pm
we're not going to be able to act in ways that we used to. >> for intimidation reasons, this is in effect the same thing? >> it is the same problem, yes. >> i think that is one of the fundamental concerns that we need to address. i don't think any of us want to live in a world where we have a camera everywhere you are. how do we do that so it doesn't look like china? if there is a framework put in place, i think it needs to address the real concerns. in my opinion, we don't have the solutions and we should not use the technology until we can
1:55 pm
assure people's rights are protected. again, i just want to say thanks. i want to thank our panel, and thank you again for this hearing. we need to do that, and i would say sooner rather than later. >> i have to run to the same meeting. this is a sweet spot that brings progressives and conservatives together, and when you have a diverse group on this committee here, as you might see, i'm here to tell you we are serious about this. let's get together, and the time is now, before it gets out of control. i will yield back. >> i think we can get something done here in a bipartisan way. that is music to my ears and i
1:56 pm
thank you very much. [no audio] i ask that it be entered into the record. >> thank you, and i want to thank the witnesses. when there is a technology, it will be used however the users see fit, whatever the cost is, without any public input or limitations, and congress has not been paying much attention, so i want to say thank you for this and the work you have done. it was great to be with you at m.i.t., and i see another member of congress there
1:57 pm
as well. we have heard a lot of disturbing examples about mass data collection. mandating affirmative consent makes a lot of sense. how would that curb the effects of machine bias? >> consent has to happen, because oftentimes the technology is being used without our knowledge. the stories abound where people say, i did not even know the technology was being used. the consent matters when you're thinking about a case in brooklyn where the system is being installed against tenants' will, but at least you have public defenders who are coming in. so regardless of the bias
1:58 pm
and how the technology works, there should be a choice. secondly, we need to know how well the technology works, and what research has shown is that the benchmarks for the technology are not even reflective of the american people, so we have to start there to make sure we have a baseline for what is going on, and that there is continuous oversight, because regardless of accuracy, these systems can be abused in all kinds of ways. >> in your report, you cited an example where they were looking for a suspect who resembled the celebrity woody harrelson, so they used a photo of woody harrelson to get the identity. why is this allowed to happen?
1:59 pm
>> because there are no rules and no transparency to decide if there was [no audio] and for the court to say, is this producing reliable evidence? defendants are not able to challenge it in court. >> they also use artist sketches, so there will be an artist rendering of who they think it might be. how does that work with respect to protecting defendants? >> forensic sketches are admitted in at least six jurisdictions around the country. imagine if we had a fingerprint examiner who, where the fingerprints were smudged, drew in
2:00 pm
the finger ridges with a pen. that would be a reason for a mistrial. >> again, if the racial bias in the technology was eliminated, would you still recommend mandating affirmative consent? >> even if you improve the racial biases, there's a case where they searched by skin type and clothing, so you can also automate the racial profiling. even if you made these disparities go away, which right
2:01 pm
now the evidence does not support, yes, you would still need consent in the use of the technologies. >> thank you. >> thank you very much. thank you for holding this hearing today on facial recognition technology. the subcommittee held a hearing on artificial intelligence that i chaired. we discussed issues of bias. as facial recognition is used more and more, it is vital that this technology not perpetuate real-world biases that harm communities. what are some of the benefits that facial recognition technology can provide? >> one is the use of facial
2:02 pm
analysis technology in health care. could we spot something like a stroke or heart disease or other things that might actually be perceptible from this? that is the promise, and oftentimes i see the promise is not met by the reality. you have research coming out from the university of toronto that shows even for health care systems using facial analysis technology, you are starting to see disparities when you look at age, or at someone who has dementia versus someone who does not. research can continue to explore, but until we show it actually meets the promise, it should not be used. >> in your testimony, you detailed the lack of testing of facial recognition technology. how do you believe this testing should be
2:03 pm
conducted? >> something we have to acknowledge as we are looking at these technologies is that one metric, accuracy, is not enough. not only do we want to know how accurate this system might be, we want to know how it fails. nationally, we have the national institute of standards and technology. theirs are voluntary tests, but that is one agency figuring out the necessary metrics that could be put in place. what we are seeing is that the way these systems are tested is very limited. as i mentioned earlier on, we cannot actually have a full sense of progress. we need to change the way in which we evaluate facial
2:04 pm
technology so we truly understand who it works for and who it fails on. >> i'm not sure how much time i have. should the use case dictate the level of maturity required, and should the government have different standards? >> i definitely believe the use case matters. if you are using facial technology for snapchat, that is different. the use case absolutely matters. >> you mentioned a study where the author has recommended facial recognition only be used to identify individuals already detained by law-enforcement on suspicion of criminal misconduct. do you believe this is an appropriate use, and are there any other safeguards you would
2:05 pm
like to see implemented? >> i do. >> we talked about having legislation. who else do you think should be at the table? >> i believe it is up to communities to decide and take a close look at how this technology is being used and the limitations, and decide whether the risks outweigh the benefits. in my personal view, i think some communities will come out differently and will say there are instances where law-enforcement needs to know who is in custody, where fingerprinting has failed, and that may be an appropriate use of the technology. that
2:06 pm
decision is to be made by lawmakers and the community, not by law enforcement alone. >> the developer of the software needs to be at the table, public safety needs to be at the table, the aclu needs to be at the table, and other legal persons as well, so that if we are going to utilize this technology in public safety and law enforcement, i think one thing needs to be made clear to these software manufacturers: if you're going to develop this technology, it is going to have to meet a standard set by the scientists and those that are here. it needs to meet that standard. if it cannot meet that standard, then there's no place for it in our society. police need to be at the table so that if your jurisdiction decides to acquire this technology, you will be held to
2:07 pm
a standard as well. not just training, but the way you apply it, how it is applied, and you will be responsible for sharing with your local community how this technology was developed, why it was developed and the use of it, and also where it may not be as effective as we think, because this is a huge, very complicated piece of technology that may have some benefits that you just heard, but also a significant amount of challenges attached to it. >> is there a set amount of hours that police and law enforcement are trained right now, or is it just hit or miss? >> i cannot say that specifically. we know there are agencies out there right now, and there are persons who can certainly attest, where this technology is
2:08 pm
introduced and there is very little training and certainly no policy. when you ask them, tell me about the technology and how it works, they can't do it, and that is a real problem. >> to me, the shame of it is the thought of people being arrested, losing their jobs and everything, based on errors. that is the problem. one of the things in question is that congressman john lewis and i were mistaken for each other. if i go out there right now, there will be five or six people out there who call me john lewis, and i've had them in my district
2:09 pm
where i live. they have called him me. that is a problem. they only know he is john lewis because of the name. >> thank you to all of you for being here today. as technology continues to evolve and grow, the question of proper use for facial recognition in our society is becoming increasingly important. how far are we from having facial recognition technology that is 100% accurate on photos of people in all demographics? >> i cannot speak to where the technology is at, but i will say, based on how law-enforcement agencies are using the technology, it doesn't matter how good these algorithms get if the images
2:10 pm
submitted are low quality. >> are there any cities that are deploying real-time face surveillance? >> we have seen chicago acquire it. they do not use the capability. about a handful of other agencies, in los angeles, west virginia, have either purchased or piloted the technology as well. >> are there any federal agencies, to your knowledge, that utilize real-time face surveillance? >> we know there is at least one that acquired
2:11 pm
or is using amazon recognition, which is the same face capability as orlando's. [no audio] >> i think one of the concerns is we don't have a handle on the cases. number one, we are not seeing searches resulting in false matches disclosed, and whether defendants are given all the information. [no audio] >> including the use of face
2:12 pm
technology. those defendants were not told that face recognition was used. a vast majority of them were not. >> i have had many people tell me they have seen it somewhere else. what were the federal standards and what did they look like? >> one of the questions we should be asking is on what basis it is being used. we have talked about how it raises fundamental concerns with regard to first amendment rights. there are very real risks with
2:13 pm
the technology, and we should be asking if there are other alternatives that are less invasive and less concerning. [no audio] >> because of those risks, the fourth amendment provides good guidance for prohibitions on uses that would violate it. >> that was my next question. thank you. i yield back my time. >> thank you, mr. chair. amazon can scan your face without your consent and sell it to the government, all without our knowledge, correct? >> yes.
2:14 pm
i would like to seek unanimous consent to submit an article on how amazon met with ice officials over a facial recognition system that could identify immigrants. it is not just amazon that is doing this. microsoft, a very large number of tech corporations, correct? >> correct. >> do you think it is fair to say americans are being spied on on a massive scale without their knowledge? >> i would make a distinction between what facebook and other companies are doing. we need more specifics on this. i would say most systems operate on mug shot databases, so information that has been collected by agencies rather than companies. >> mr. ferguson, what are the
2:15 pm
prime constitutional concerns about the use of facial recognition technology? -- companies, governments, agencies can essentially use our biometric data without our consent, and this is outrageous because this is america and we have a right to privacy. isn't that right? >> yes. >> what was the supreme court case that identified the right to privacy? >> i do not remember the specific case. >> was there a landmark supreme court decision that established that recently? >> we have seen the carpenter case, where the court said it was
2:16 pm
unconstitutional to search warrantlessly -- you cannot search without a warrant. >> more specifically, it was roe that established the right to privacy. >> that was addressed as well. >> that right to privacy was alluded to partly in that case. it does not just give a right to my uterus, it gives a right to my hands, shoulders, knees, toes and my face, and in some ways, part of the case we see here, although it is not all of it, is that our right to privacy is about our entire body
2:17 pm
and the same principle that keeps a legislator out of my body keeps an algorithm out of our faces. do you think it is fair to draw the connection? >> i think when we are talking about privacy, it is important to think about more than our face. we are seeing the fbi use voice recognition, all biometrics that raise the same concerns the panelists have talked about today. >> i heard your opening statement, and we saw these are effective to different degrees, so are they the most effective on women? >> no. >> most effective on people of color? >> no. >> people of different gender expression? >> no. >> what demographic
2:18 pm
are they the most effective on? >> white men. that is the demographic that developed this technology, and they are trying to impose it on the entire country. >> do you think this could exacerbate the inequalities in our criminal justice system? >> it already is. the propensity for these systems to misidentify black individuals or brown individuals, and you have confirmation bias where if i
2:19 pm
am said to be a criminal, i am targeted, so there's a case with an 18-year-old african-american man who was misidentified in apple stores as a thief, and he was falsely arrested multiple times because of this misidentification, and you have the case where you are thinking about putting facial technology on police body cams that can be used to confirm a presumption of guilt, even if that has not been proven, because you have these algorithms that we already have sufficient information showing [indiscernible] communities of color. >> i guess this could be for anybody. china makes a lot of use of this technology, correct?
2:20 pm
>> could you let me know to what degree it exists in china, and i believe they are looking at selling this technology to other countries. >> we probably don't know the full degree, but we do see the capabilities. the government is attempting to enroll all citizens to be able to effectively identify them at any given time, in addition to classifying people by the characteristics of their face, including who is a uighur, a muslim minority. >> have they tried to sell it to other countries? >> yes. >> i'm not sure what other countries they are selling it to, but what are the benefits they describe the technology as being used for? >> china has created a true
2:21 pm
surveillance state where they are able to use hundreds of millions of cameras and artificial intelligence to identify people on the streets and so forth. for certain governments, that is attractive. >> why is that attractive? >> they have rolled it out to prevent jaywalking. if you jaywalk across, you will be shamed for jaywalking. >> as i understand, it is used to monitor people who may think one way or the other. >> it is being used to surveil religious minorities, to see the way they think or react. that is a lot of human rights abuses we see coming out.
2:22 pm
we have people placed there, so we know exactly what is going on there. and they sell this. >> i think they have a free market in the economy, so it is a wealthy society, but they have complete control over what they can do. >> the scariest thing is that the technology could be rolled out here, and there's no law saying that it could not be. this is a big part of what they export, this form of government, to other countries. do you know any others that are
2:23 pm
biting at this or starting out? >> this is a multibillion dollar industry, and there's a tremendous investment by the chinese government in improving it. >> do you know if there are other countries they have tested selling this to? >> there's an example with zimbabwe, where they provided that government with the technology, and it enables them to have the data, so you see the emergence of data colonialism. so they are telling zimbabwe they can do what china does. what is valuable is the dark-skinned faces, so they can train the system,
2:24 pm
which they can then sell to the u.s. >> as i understand it, there is a clear concern about a government using these systems -- as we have microaggressions, as we begin to have politically incorrect gathering places, a gun show or something. is it something we should fear, that the government would use it to identify people who have ideas that are not politically correct? >> law enforcement agencies themselves have expressed this concern. it could be used as a form of control. this is something law enforcement agencies
2:25 pm
themselves have recognized. >> thank you for having a hearing on this topic. this is very important. >> thank you. last week, san francisco became the first city to ban the use of the technology, and a similar ban is being considered in my district in massachusetts, and it is the first on the east coast to propose such a ban. the technology has been used by law enforcement since the early 2000s, but there are concerns about due process. i believe federal agencies should not use the technology without authorization, and these are a perfect example as to why. companies have been pushing this despite knowing that it only works 30% of the time.
2:26 pm
it underscores the need for congressional oversight. may i say i'm so proud that you are in the massachusetts seventh. i'm so glad that you call it home. in an article last year, you described why these inaccuracies exist and referred to the [indiscernible] >> you might have heard of the male gaze and the white gaze. these are descriptions of who has the power to decide, and so when i talk about it, i'm invoking the male gaze and the white gaze. it is a question of whose priorities and preferences are shaping the technology we
2:27 pm
are seeing, so right now, the way i look at ai, we see that it is mainly male and doesn't represent the majority of society. representative: in your georgetown report, you found there is no independent testing. is this still the case today? >> looking at the differential error rates between race and gender in their studies, they have yet to look at intersectional consequences, but they are starting to look. representative: are there measures developers of the technology can take now to increase the accuracy of facial recognition systems? ms. buolamwini: it tends to be
2:28 pm
around the kind of data that is used to train the systems in the first place, but we have to be cautious. even if you make an accurate facial recognition system, it can and will be abused without regulations. representative: you raised the example of willie lynch, who has been trying to get details of the facial recognition algorithm that led to his arrest and conviction. in his case it was a poor-quality photo. can you talk about the challenges individuals face in rebutting a facial recognition match? >> you can put an eyewitness on the stand, you can raise questions about their eyesight, how far away they were, whether they were intoxicated at the time they made the identification. it is different with facial recognition. people assume it is 100% accurate. a lot of individuals are not able to get information about the reliability of the
2:29 pm
algorithm. the case you referenced is an ongoing case where willie lynch is fighting to get information about an algorithm that could be used to challenge his conviction. representative: do you believe the fbi or other law enforcement agencies have adopted sufficient safeguards to prevent these abuses? >> absolutely not. additional oversight is needed. when the fbi rolled out the system, they made promises about accuracy, promises about protecting first amendment rights. and years later a lot of those promises have been broken. the agency has not acknowledged responsibility to test the systems they use that were created by external partners. those things are cause for concern and should cause us to question whether the systems
2:30 pm
should still be operating, given the lack of safeguards. representative: the supreme court recognized recently that a person does not surrender fourth amendment protection by venturing into the public sphere. facial recognition shatters expectations americans have that the government cannot track our movements without suspicion and a warrant. it is difficult to comprehend the impact such surveillance would have on our lives. with facial recognition technology deployed throughout a city, anyone with access could track a person's associations and religious, medical and recreational activities. would the government's access to face data have a chilling effect on first amendment and other constitutional activities and rights, such as gun ownership and free exercise of religion,
2:31 pm
freedom of speech and of the press and the right of the people to peaceably assemble? and how could data from facial recognition be weaponized by the government against activities protected by the constitution? >> yes. to your first question, this is something law enforcement has acknowledged. the mere threat or fear of monitoring, or of identifying every single person at a protest around particularly contentious or disputed concepts, could cause people to stay home, to not have those conversations, to not engage in those discussions that are so valuable for participation in an active democracy. we have seen examples of requests for facial recognition
2:32 pm
without cause. in vermont there was a request for a facial recognition match even though the individual was not suspected of a crime. they were the girlfriend of a fugitive. there was another case where the basis of the request was somebody asking concerning questions at a gun store, without allegations they had committed a crime. that speaks to your concern. we all don't want to live in a world where we can be identified secretly and on a massive scale. that is what facial recognition allows. representative: if we require law enforcement to run a search on facial recognition data from surveillance cameras, would it be possible for cameras to use facial recognition technology in public areas without effectively gathering or discovering information on innocent people who are not the subject of an investigation? >> no. that is not the way the systems work. in order to identify the face of a person you are looking for, you have to scan every face of everyone else who you are not
2:33 pm
looking for. >> that is correct. even a probable cause standard may not be enough. you have to take steps to minimize if you are going to do this, which is why probable cause plus or something higher should be part of legislation on this. representative: what danger could access to this data pose to the rule of law and keeping our government free of corruption? could abuse of facial recognition give individuals the ability to influence political or business decisions, or to unfairly benefit or target such decisions? do we need increased protections to make sure these abuses don't occur? >> the risk of abuse is substantial. number one, the technology is cheap. you can run thousands of searches for a handful of dollars. number two, it is being done secretly.
2:34 pm
individuals don't necessarily know and can't raise concerns. and three, it is being done on a massive scale. we talked about access to driver's license photos, to the extent that it affects everybody in those databases. we are getting to a point where virtually everybody is going to be in a facial recognition database, which gives the government enormous power. we need to think about those concerns before moving forward with this technology. identifying somebody just because they show up in public with a camera present could be used, and it can and will be used, in the absence of rules. chairman cummings: thank you very much. mr. gomez. i want people to imagine they are driving home from work, and they see in the rearview mirror red and blue lights. they have no idea why the police
2:35 pm
are behind them. they were not speeding. they did not run a stop sign. but they pull over like everybody should. a voice over a loudspeaker commands them to exit the vehicle, and as you do, you see police officers, guns drawn and pointing right at you. one wrong move, a mistake, a misunderstanding, a miscommunication, can mean the difference between life and death. that is what is called a felony stop, one of the most common, high-risk situations police find themselves in. and it all started earlier in the day when a police officer ran a search in a facial recognition database, and it incorrectly identified you as a violent felon, and you had no idea that it even occurred. that is one scenario i think about when it comes to this technology, one of the many
2:36 pm
things that could go wrong. i was not even paying attention to this technology until i was misidentified last year during the aclu test of members of congress. it really sparked an interest and curiosity in this technology, and i felt deep in my gut that something is wrong with this technology. i started looking into it. since last year, my office has had nine meetings with representatives from amazon. we asked questions of experts across the spectrum, and my concerns only grow day by day. as of february of this year, amazon had not submitted its controversial facial recognition technology to a third-party testing institute known as nist. in a january 2019 blog post,
2:37 pm
amazon stated amazon recognition can't be downloaded for testing outside amazon. in short, amazon would not submit to outside testing of their algorithm. despite the fact amazon had not submitted its facial recognition product to outside testing, it still sold that product to police departments. in 2017, police in washington county, oregon started using amazon's facial recognition technology. do you think third-party testing is important for safe deployment of facial recognition technology? ms. buolamwini: absolutely. one thing we are doing at the algorithmic justice league is testing these companies when we can. we are only able to do the tests for the output. so we don't know how these companies are training the system, we don't know the processes in place when they are selling the system, all we know
2:38 pm
is what we test on our more inclusive data sets and what the outcomes are. so we absolutely need third-party testing, and we need to make sure the benchmarks from the national institute of standards and technology, nist, are comprehensive enough. one benchmark was 85% male and 75% lighter skinned, so even when companies like microsoft test with nist, even when we have those results, we need to see what data set it is being evaluated on. representative gomez: if it is a data set that is incorrect, it is going to lead to incorrect results. ms. buolamwini: in 2014, facebook reported 90% accuracy on the gold standard benchmark at the
2:39 pm
time. but when you look at that benchmark, it was 70% male and around 80% white individuals, or over 80%. so you don't know how well it actually does on people who are not as well represented. representative gomez: what organizations are equipped to accurately test facial recognition technology? ms. buolamwini: the national institute of standards and technology is doing ongoing testing, but they need to be better. representative gomez: this is a major concern. you are seeing both parties and across the ideological spectrum showing reservations about the use of this technology. i am glad to see the chairman of this committee look at this issue, and i look forward to the next hearings. i yield back. chairman cummings: i expect we are going to get out some legislation on this.
2:40 pm
i talked to the ranking member. there is a lot of agreement. do you have an all-out moratorium and at the same time try to see how this process can be perfected? you are absolutely right, there is a lot of agreement here. thank god. >> with little to no input, the city of detroit created one of the nation's most pervasive and sophisticated surveillance networks with real-time facial recognition technology. the system is tracking residents with hundreds of public and private cameras at parks, schools, immigration centers, gas stations, churches, health centers and apartment buildings. the 13th congressional district includes
2:41 pm
people who bear overwhelming burdens, and policing in our communities has become more militarized and flawed. we have for-profit companies pushing so-called technology that has never been tested in communities of color, let alone been studied enough to conclude it makes our communities safer. dr. alexander, have you seen police departments develop their own approval processes concerning the use of facial recognition? if so, how are these policies determined? >> dr. alexander stepped out, but i could take a stab at answering. our research includes foia requests to a couple of hundred agencies. we have seen some develop policies. detroit does have a policy around facial recognition. concerningly, that policy states their face surveillance system may be expanded beyond
2:42 pm
its current conception, to drones and body cams as well. so it is not uncommon to see policies saying there might be some restrictions, but also affirmatively encouraging the use of the technology and reserving the right to use it far beyond existing capabilities. representative: do you know if these policies include justification on the need for the program? for the policy that detroit has, does it prevent them from sharing this data or information with any federal or state agencies? >> most policies i have read don't provide an initial justification beyond that it is a law enforcement investigative tool. i would have to get back to you on what detroit's policy specifically says. i don't recall any language either prohibiting or allowing that. representative: are we seeing
2:43 pm
any uniformity in these policies across law enforcement agencies? >> no. representative: have any of these policies proven to be effective? we all agree there are too many flaws for it to be effective, correct? >> correct. and one of the problems is, most of these policies are not available to the public. we are seeing far more policies now thanks to foia requests and foia litigation to try to get them -- lapd, nypd -- but other jurisdictions tell us they have no records even though they may have records or systems. there is a fundamental lack of transparency around this as well. representative: if i may submit for the record, an article about researchers alarmed by detroit's facial recognition program, talking about little to no transparency. chairman cummings: without objection, so ordered. representative: we have heard
2:44 pm
organizations advocate for police oversight or judicial approval for the use of facial recognition technology on a case-by-case basis, which would require a probable-cause standard. professor ferguson, can you speak to the benefits of this approval process? professor ferguson: there would be at least a check if there was a probable cause standard. with the danger of facial recognition, it might need to be higher than just probable cause. you would take care of the minimization requirements, you would be certain of the data and what happened to it, but it is a check. right now this technology is being deployed without a check in most cases. having federal, state and local legislation on it and having a third-party check of a judge would be an improvement. representative: some argue probable cause plus would be too
2:45 pm
burdensome for law enforcement officials and would not allow them to move quickly. do you agree? >> no, i don't agree it is that hard. if you have probable cause for a crime and want to search a database, judges can now get warrants electronically on their ipads. i don't think it is much of a burden anymore. representative: i couldn't agree more. a person's freedom is at stake. representative lynch: let me congratulate you on an excellent hearing. i want to thank the witnesses. you have done a tremendous service to this committee. i am sure your input will be reflected in legislation that comes out of here. i would like to ask unanimous consent to submit a massachusetts senate resolution
2:46 pm
and a massachusetts house resolution, legislation to put a moratorium on facial recognition in my state of massachusetts. thank you. i read this excellent book, "the age of surveillance capitalism." it really changed the way i look at all of this. i am the chairman of a task force in congress on the financial services committee, and she did a wonderful job with the book. and i believe that after reading this, our focus today, just on facial recognition, just on law enforcement's use of this, and just on public use, is far too narrow.
2:47 pm
we have about 257 million smartphones in this country. 100 million are iphones. we click, i agree, when we use those apps, even though the average american spends about 14 seconds reading that agreement, and we click, i agree. so what we don't know is that when we click i agree, it allows apple, google, facebook, to use and share, to sell all our information. so they track not only what we look like, but who our friends are, where we go every day, tracking our motions, every selfie we take that gets uploaded, every book we have read, every movie we see, how we drive. there is an app where if you
2:48 pm
let them track you and you don't drive crazy, they will lower your rates. with the internet of things, my coffee maker and my toaster are hooked up to the internet. i'm not sure i need that. my vacuum cleaner, although i don't use it often, is hooked up to the internet of things. my iwatch. i'm a victim here. i have everything going. [laughter] the problem is, we have total surveillance and we are looking at this one little piece of it, facial recognition. i worry that we are missing all of the other dangers here just because it is not the government, it is facebook, google, microsoft, apple. opm gathered all our information here in congress and then they were hacked.
2:49 pm
they did not encrypt social security numbers or anything. we lost it to the chinese. they got all that information. and so now we are allowing google, facebook, to maintain these huge and granular descriptions of who we are, and they have been getting hacked as well. is there a bigger mission here we should be pursuing in terms of rights? i believe in the right to be forgotten. i believe in the right not to be surveilled. i believe there should be sanctuary for us, that we don't have to be surveilled all the time. is that something you think about as well? ms. buolamwini: absolutely. the one thing i can credit the massachusetts bill for addressing is, instead of just saying we are going to look at facial recognition, they talk about biometric surveillance, so we are talking about voice
2:50 pm
recognition, gait analysis, anything that is remote sensing. do we need to talk beyond facial analysis technology? absolutely. let's look at self-driving cars. a study came out of georgia tech showing that for pedestrian tracking, self-driving cars were less accurate for darker skinned individuals than lighter skinned individuals. so when we talk about human centric computer vision, it is not just facial recognition that should be concerning. >> many organizations have called on congress to pass baseline consumer privacy legislation to put guardrails on how companies deal with your private data, including biometrics. this is very much needed. that legislation has to include not just protections, but real enforcement, so when your data is misused, you have an opportunity to go to court and get accountability.
2:51 pm
representative: mr. chairman, i want to thank you and the panel. this has been terrific. i don't know if it is true or not, but i think we are part of the same book club. i think i suggested the book to mr. lynch, and it is a fabulous book, "the age of surveillance capitalism." i was also misidentified. i was hoping i would be misidentified as george clooney, just by my perception of myself. i want to talk about this as a representative. denial is not a river in egypt. i have watched tech companies transform, and i include amazon as part of this culture.
2:52 pm
i have listened to them talk about disruption. if i had questions as an elected official, i was inhibiting innovation. from my perspective, having met with ceos, one of them once told me he didn't want to deal with people like me, and i laughed, because what he was expressing was that people from the government were slow, didn't understand how this culture was transforming the world. and i really think they believed that, they bought into it. it is not about the billions of dollars they make. and my argument, including with amazon, is, it would be nice if you tried to work with us to deal with societal impacts, because you have to deal with them one way or the other. but the problem now is, when i listen to the talk of supreme court cases, i think of louis brandeis writing his paper on privacy and convincing other justices like oliver wendell holmes that they were wrong
2:53 pm
about it. he said americans had the right to be left alone. how can we say that any longer? none of us are left alone. there was a story in "forbes" this month about echo and alexa listening to children. i think i am encouraged by what i have heard in a bipartisan way today. we need to stop. this has gone too far. we are not starting at a point where we are just beginning the deployment of this. it is already deployed. and to mr. lynch's comment, it is being deployed for everything we do. and there are benefits, and we can see that, but we need a timeout societally, as europe has led us on, to say no. the example in san francisco is interesting, knowing people in the local government in san francisco. when scooters were trying to get
2:54 pm
permits, it was great what san francisco did, the hub of innovation and disruption. the two companies that came and asked for permission were the ones that got permission. the ones that didn't were horrified when they were told, we are not going to give you permits to use these. you should have come in in the first place. and i think we have a responsibility to be more responsive, but they are not even coming halfway. this is a moment for us in a bipartisan way to say, stop. there are benefits to this for people on this planet, but you need to have our input, and we have already lost our right to be left alone. they have de facto taken that away from us. ms. guliani, could you respond? the cultural attitude they have taken, and how they will apply
2:55 pm
it politically vis-a-vis campaigns and other things -- they really believe they have done no harm and you and i are in the way by raising questions about how they deploy this technology. >> one thing we have seen with private companies is that they are actively marketing some of these uses. amazon was pushing some of the most concerning uses, face recognition on body-worn cameras, opening the possibility of a near-surveillance state. they are not passive actors in the system and should be forced to take responsibility. that should include questions about how accurate their technology is, are they disclosing problems and risks, and are they saying no when they should say no? when it comes to some of the law enforcement uses that aren't life-and-death scenarios, these companies should not be selling to the government in those scenarios, given the risks we've talked about. representative: given a lack of
2:56 pm
regulatory enforcement, how do we provide civil enforcement for laws that are not being enforced right now, in my view? >> there are questions about whether they have been honest in their disclosures, and there is investigation and analysis into that. from a regulatory standpoint, it is up to congress and others to say, now we are going to hit the pause button. let's not roll out the technology before we understand the harms and think about whether there are, frankly, better alternatives that are more protective of privacy and ones that may also have law enforcement benefits. representative: i ask unanimous consent to enter this letter from the information technology and innovation foundation into the record. chairman cummings: so ordered. representative: if i can pick up
2:57 pm
where we just were, the ubiquity of this technology strikes me. maybe we have already mostly lost this battle. airports increasingly are using facial recognition technology to process passengers in a more expeditious way, correct? >> cbp has introduced face recognition plans. they have not done rulemaking. some of their plans go far beyond what congress authorized. representative: clear's technology is already in airports? >> yes. representative: so we already have it. are you aware of restrictions on that private company in terms of how it uses whatever data is collected? >> i don't know that company specifically. with regards to airport use,
2:58 pm
there are a lot of concerns. for example, requiring americans to provide a biometric, their face, and questions about whether there is a way to opt out. i tried to opt out. it was not an easy process when i was traveling internationally. there are questions about the buildout. it has been presented as, we are using this to make travel faster. but when you look at some documents, some of the use cases are far beyond that, to find people of interest, whatever that means. representative: i think that is something for further examination as well. what restrictions exist on private companies using this technology? from a constitutional point of view, what restrictions can there be, or should there be? that is worthy of examination as well. let me ask about a real-life example.
2:59 pm
the fbi apparently has agreements with various states in terms of driver licenses, including states that use facial recognition technology for their driver licenses. i don't know that that is regulated at all. i don't know that the average citizen getting the driver license or getting it renewed understands that they have passively agreed to allow that piece of data to go to a federal law enforcement agency to be used however they apparently deem fit. i wonder if you would comment on that. at one point, the fbi was urged to determine whether external facial
3:00 pm
recognition systems are sufficiently accurate to be justified for fbi use, and whether they would agree to limit it if it wasn't. and the fbi actually refused to do that, raising questions in terms of misuse or misapplication of the technology. what about the fbi and that relationship with states? should we be concerned? >> that is a good question for the fbi, but it's a good question for local, state and federal law enforcement. in the public safety community, we exchange information back and forth with each other on a constant basis. and in this particular case you are referring to, that would not be unusual. what we do know is that this has been very much unregulated without any oversight whatsoever. and in light of the fact we are looking at technology that is
3:01 pm
questionable, it is raising concern as we have continued this afternoon in this hearing. that is part of what has to be addressed, and further questions have to be asked at the federal, state and local level about the sharing of information that is very sensitive and very questionable when it comes to our constitutional liberties. that does raise a great deal of concern. and that is part of the complexity in this, because for me to be effective as a chief of police at a local level, i'm dependent on my state and federal partners, and vice versa. because we have seen benefit in so much, not around facial recognition technology but just
3:02 pm
being able to share and communicate with each other. >> when it comes to fbi use, we should be concerned. these are systems that have been in place for years. the fbi is not even acknowledging a responsibility to fully test the accuracy of systems it is using and relying on. that builds to a larger question of, do we really want this massive database of all of our faces? we are approaching a place where virtually every adult will have their face in a system that can be searched by the fbi. in the state of virginia, i don't want my face that is on my driver license in any database. [laughter] representative: the witnesses have described a technology of
3:03 pm
potential totalitarian surveillance and social control. thank you for calling this extremely important hearing, and as chair of the civil rights and civil liberties subcommittee, we will work with you to follow up and make sure we are looking at all the dimensions of our privacy that are threatened by this and similar technologies. i want to thank all the witnesses for their testimony. professor ferguson, back in the day we wrote a book together. that was back in the days when i wrote books; today i write tweets. i'm glad you are still writing books and articles. i know something that has interested you a lot is the right to protest and petition for redress of grievances in the district of columbia. since 2017 i've been to a lot of protests here, the women's march, the climate march, the science
3:04 pm
march, the march for our lives, and on and on. and i'm wondering, if people knew this technology were being deployed by the government, that they were being photographed, what effect do you think that would have? >> it would fundamentally undermine the first amendment and the right of free expression and freedom of association. it is chilling and a problem and it needs to be banned. >> i couldn't agree more. the last thing we want before someone goes to a protest to exercise their constitutional rights is to think, am i going to have my face scanned? representative: china seems to be taking a step that need not be taken by our society. it has been leveraging facial recognition technology for a social scoring system. the new york times says, beijing
3:05 pm
is embracing technology and artificial intelligence to identify and track 1.4 billion people. it wants to assemble a vast national surveillance system with crucial help from its thriving technology industry. we are now seeing that most companies that develop facial recognition systems also offer real-time software. do we know how many are selling technology to government actors in the u.s.? >> most if not all companies that market face recognition to law enforcement in the u.s. also advertise the ability to do face surveillance. we have no idea how widespread this is thanks to a fundamental absence of transparency. we have limited visibility into what chicago, detroit, orlando, the secret service in washington, and new york are doing, thanks to foia records and
3:06 pm
investigative journalists' work. but for a vast majority of jurisdictions we have no idea. representative: so you can't estimate how many communities are deploying this technology now? >> no. we can estimate that facial recognition, used as an investigative tool and potentially a surveillance tool, is accessible to at least a quarter of all law-enforcement agencies across the u.s. that is a conservative estimate based on 300 or so records requests, when there are 18,000 law enforcement agencies across the country. representative: you make powerful arguments in your call for a moratorium on the use of this technology. what objections would you anticipate from people who say there are legitimate law enforcement uses that are helping to solve cases and
3:07 pm
identify suspects? >> that is the objection, that there is this hypothetical good-use case. but we have to look at the reality. in the united kingdom they have performance metrics, false-positive match rates over 90%. so the promise of security versus reality doesn't match up. representative: and that is positively dangerous, because you are violating somebody's civil liberties and you are leaving the real criminal out there at large because you chose the wrong person. >> true. i also wanted to touch on your point about 1.4 billion people surveilled in china. facebook has 2.6 billion people. as representative lynch spoke to, it is not just state surveillance we have to think about. we have to think about corporate surveillance. facebook has a patent where they say, because we have all these face prints collected, often without consent, we can give you an option as a retailer to
3:08 pm
identify somebody who walks into the store, and in their patent they say we can give that face a trustworthiness score. and based on that trustworthiness score, we might determine if you have access or not to a valuable good. representative: facebook is selling this now? >> this is a patent they filed and something they could do with the capabilities they have. as we talk about state surveillance, we have to absolutely be thinking about corporate surveillance as well, and surveillance capitalism. representative: and the sale of that is built into whatever contract or licensing agreement that people spend 10 seconds signing off on when they set up a facebook account? ms. buolamwini: you would not have to consent. representative: something interesting just took place. a number of us signed a letter expressing concern about the
3:09 pm
chinese government-owned and controlled businesses getting contracts to run subway systems in america, including in the nation's capital. there are a lot of civil liberties and national security concerns raised by it. i want to introduce a front-page article from "the washington post": gop leader mccarthy blocked bipartisan bid to limit china's role in u.s. transit system. chairman cummings: i want to thank all of you for an excellent presentation. i think you really laid out the problem. as i tell my staff: what, so what, and now what.
3:10 pm
professor ferguson, thank you. the aclu released documents revealing that the baltimore county police department partnered with a private company to identify individuals protesting the shooting of freddie gray in 2015. the company stated, quote, "police officers were even able to run social media photos through facial recognition technology to discover rioters with outstanding warrants, and arrest them directly from the crowd." to be clear, police may have used facial
3:11 pm
recognition technology on citizens' personal photos from social media to identify and arrest them while they were exercising their first-amendment right to assemble. as someone who was in the crowd, i find this discovery to be very disturbing. professor ferguson, the constitution endows us with the right of the people to peaceably assemble. i ask each of you, how would widespread police use of facial recognition technology at protests affect a citizen's right to peaceably assemble? >> it is fundamentally american to protest and fundamentally un-american to chill that kind of protest. baltimore is a great example of both the problems of public
3:12 pm
surveillance and then using third-party image aggregators like facebook and other social media groups to do that kind of private surveillance. it will chill future protests, it will chill the desire of american citizens to say that the government may have acted inappropriately, and it is a real problem which deserves congressional regulation. >> this is what i have been saying from the onset. if this type of technology is not utilized in an ethical, moral, constitutional type of way, it continues to do exactly what it did to you out there, congressman, and other people. it separates the community from its public safety. there is a lack of trust. there is a lack of legitimacy. there is a fear of you being a watchdog over me in a warrior
3:13 pm
sense, as opposed to being a guardian of the community. no one should have been subjected to what you just articulated. that is the wrong use of this technology, particularly when you have individuals, as was eloquently stated by mr. ferguson, who are just trying to exercise their first amendment right. and for people to be able to assemble and not be subjected to this type of technology, which was used wrongly by baltimore, i will say that publicly and privately, because they lacked the training and lacked the understanding of the potential for what this technology can do to harm their relationship with the community. that is a serious problem. these companies that develop this technology, they also have to be held responsible.
3:14 pm
and those police departments that acquire that technology from these companies have to be held accountable as well. but after listening myself to much of the testimony that has been stated here today, i came here in hopes of being able to say, for public safety, for law enforcement itself, there is good use of this technology. but i think a lot of things that we have talked about now have to be further examined before we can continue with this technology. because my concern, as a former law-enforcement official, is that i don't want technology being utilized by police that is going to separate them further from the community they already serve. if there is a benefit, let's find it, utilize it, keep our communities safe. because there is no exception and no shortcut around that.
3:15 pm
chairman cummings: we have had a lengthy discussion here, but i have heard very little about court cases. i'm surprised i haven't. has this been tested, has it been an issue, and are there court cases with regard to this? it seems to me the ranking member made good points, that you have the fbi making agreements with police departments, nobody elected in the process, using this stuff. and you have the arguments you have all made with regard to the effectiveness of the machinery. what is happening on that front? >> there is no court of appeals that has addressed the constitutionality of real-time facial recognition or matching against a driver license database. one big reason is, for a
3:16 pm
defendant to raise that challenge, they have to be notified, and people are not being notified. so that is insulating this from a judicial review that is sorely needed. there are other bodies of case law, the carpenter decision and others, which are relevant and could apply to uses of face recognition. but we need notice so these cases can come before the courts. without that it becomes difficult to develop case law. >> one way people are getting notice is by having economic opportunity denied. there was a man in missouri who submitted a case against uber technologies. the reason is because uber requires selfie verification to make sure you are the driver you say you are. and in his particular case,
3:17 pm
because he had to lighten his photo so he could be recognized, uber said he doctored the photo and unilaterally kicked him off the platform with no recourse. this was just filed. i don't know how it will go through, but the only reason the person knew was because they no longer had access to this economic opportunity. chairman cummings: how do you think the use of facial recognition technology to surveil crowds is changing police tactics? one thing i noticed in baltimore is that they, i think they still do it, use a helicopter and take images. how does that relate to all of this? >> you're talking about drone technology, i would imagine, sir. a lot of this is in many ways in its early stages, but it goes back to the entire privacy issue.
3:18 pm
one of the biggest things i find from my experience is that when new technology is developed and we take that technology and introduce it into our communities across the country, we never tell our communities what it is, why it is being utilized and how it helps benefit public safety. what ends up happening is that people draw their own conclusions around it, and suspicion of the work police are doing, because oftentimes they operate in a clandestine sense. but as this technology continues to emerge, whether we are using air support with infrared cameras, drone technology, whatever the case may be, as we continue to evolve our technology, when that technology comes to my front door as a law enforcement official, i want to know all about it. i want to ask all the right
3:19 pm
questions. i want the type of people that are at the table with us right now to ask the appropriate questions. so as we advance this technology, we are able to educate my community in terms of what it means, what is the benefit, and what are the challenges associated with it. that makes that type of technology much better for people to digest and be able to understand. and as we run across problems that evolve as a result of it, we are able to work with our communities to help resolve those, even if we have to enact new legislation. chairman cummings: an hour ago you said you were not anxious to see a moratorium. it sounds like you may have changed that a bit. >> one thing, chairman: i support technology, but i support good technology and i support technology that
3:20 pm
has oversight and policies written around it. i would rather not see a moratorium; however, if the issues articulated here today are as serious as we believe them to be, we have to ask ourselves that question. but here is the thing we have to be cautious of if we are going to put a moratorium on this technology. we do want to know what have been the benefits, if any. what have been the benefits, and how do we utilize some of those benefits in some constructive way until we work out the bigger problems around the issues we have discussed here today? i don't want to throw the baby out with the bathwater if there is some way in which this technology, which i'm going to make a reasonable assumption
3:21 pm
based on my own experience has in some ways been useful, but it is going to continue to harm the american people, then there is certainly something we need to consider: putting a pause to it, if you will, through this process of learning more and putting legislation around it. representative: we have to resolve fundamental problems. how are we going to prevent this technology from having a disparate impact, either because of inaccuracy or existing biases in the judicial system? how do we prevent the buildup of a surveillance state, where there is a camera on every corner? how do we protect our first amendment liberties, so that no one
3:22 pm
says to themselves, i can't go to this protest because i'm afraid my face will be scanned? we can't move forward with this technology until we can answer those fundamental questions. mr. chairman, thank you for holding this hearing. it is critical that part of our aspiration as government is making government responsive and making sure we are ahead of the curve, so we are not operating out of a reactive space. there are forces out there, whether it is a corporation trying to squeeze a dollar out of every bit of information about your life, or whether it is foreign governments trying to hack these databases for that information as well. there are folks out there that know the kind of world they want to create that
3:23 pm
advances their interests. it is encouraging that this is a strong, bipartisan issue. whether we are concerned about this for civil liberties reasons, criminal justice reform reasons, or the right to privacy, this is about who we are as america and the america that is going to be established as technology plays an increasingly large role in our societal infrastructure. we have to make sure american values and our bill of rights and constitutional protections get translated to the internet and the digital age. i want to thank our witnesses and our chairman for this hearing. we have to get something done. chairman cummings: and we will. representative: and i look forward to working with our colleagues on the other side of the
3:24 pm
aisle, and caucuses aligned around these basic principles. representative gomez: we are not anti-technology and we are not anti-innovation. but we have to be very aware that we are not stumbling in blind, and at the same time giving up liberties and protections we have all cherished, not only for decades but for generations. there are ways to balance between innovation and protecting individual rights and civil liberties. we get that. but this is an issue that must be looked at. i was never planning on working on this issue; the issue came to me. thank you, aclu. i just got word that shareholders of amazon did not pass a ban on the
3:25 pm
sale of facial recognition technology, and that just means it is more important that congress acts. thank you, mr. chairman, for your leadership. chairman cummings: without objection, the following statements will be included in the hearing record: facing the future of surveillance; the electronic privacy information center on the fbi next-generation identification program; and number three, a case study on partnering to protect the public during the freddie gray riots. i would like to thank our witnesses. i have been here now for 23 years.
3:26 pm
it's one of the best hearings i have seen, really. you were all very thorough and very detailed. members will have five legislative days to submit additional written questions for the witnesses to the chair, which will then be forwarded to the witnesses. i ask our witnesses to please respond as promptly as you can. again, i want to thank all of you for your patience. sorry we started late; we had some meetings that had gone on for a while, but thank you very much. this meeting is adjourned. [captions copyright national cable satellite corp. 2019] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy.]
president trump released more tweets about why he decided to impose a series of tariffs on mexican goods beginning on june 10. the president tweeting, "people have been saying for years we should talk to mexico. the problem is that mexico is an abuser of the united states, taking but never giving. it has been this way for decades. either they stop the invasion of our country by drug dealers, cartels, human traffickers, coyotes and illegal immigrants, which they can do very easily, or our many companies and jobs that have been foolishly allowed to move to the south of the border will be brought back to the u.s. through taxation." >> in april,

