C-SPAN Hearing on Violent Extremism and Digital Responsibility, September 20, 2019, 5:50pm-8:02pm EDT
from their platforms. this is a matter of serious importance to the safety and well-being of our nation's communities. i sincerely hope we can engage in a collaborative discussion about what more can be done within the jurisdiction of this committee to keep our communities safe from those wishing to do us harm. today we welcome representatives from the world's largest social media companies and online platforms. we will hear from ms. monika bickert, head of global policy management for facebook; mr. nick pickles, public policy director of twitter; mr. derek slater, global director of information policy at google; and mr. george selim, senior vice president of programs for the anti-defamation league.
over the past two decades the united states has led the world in the development of social media and other services that allow people to connect with one another. open platforms like google, twitter, facebook, instagram and youtube have dramatically changed the way we communicate and have been used positively, providing spaces for like-minded groups to come together and shedding light on abuses of power throughout the world. but no matter how great the benefits these platforms provide to society, it is important to consider how they can be used for evil at home and abroad. on august 3, 2019, 20 people were killed and more than two dozen injured in a mass shooting at an el paso shopping center. police have said that they are reasonably confident that the suspect posted a manifesto to a website called 8chan.
the manifesto appeared 27 minutes prior to the shooting. moderators removed the original post, but users continued sharing copies. following the shooting, president trump called on social media companies to work in partnership with local, state and federal agencies to develop tools that can detect mass shooters before they strike. we will talk about that challenge today. the el paso shooting isn't the only recent example of mass violence with an online dimension. in march, 51 people were killed and 49 injured in shootings at two mosques in christchurch, new zealand. the perpetrator used a body camera and livestreamed the footage to his facebook followers. users began to re-upload the footage
to facebook and other sites. copies of the footage quickly spread, and facebook removed 1.5 million videos of the massacre within 24 hours of the attack; 1.2 million uploads of the videos were blocked before they could be posted. like the el paso shooter, the christchurch shooter uploaded a manifesto. the 2016 shooting at the pulse nightclub in orlando, florida killed 49. the orlando shooter was reportedly radicalized by isis and other jihadist propaganda through online sources. days after the attack, the fbi director stated investigators were highly confident that the shooter was self-radicalized through the internet. according to an official involved in the investigation, analysis of the shooter's electronic devices revealed that he
had consumed, quote, a hell of a lot of jihadist propaganda, unquote, including isis beheading videos. survivors and family members of victims of the shooting brought a lawsuit against three social media platforms under the anti-terrorism act. the sixth circuit dismissed the lawsuit on the grounds that this was not an act of international terrorism. with over 3.2 billion internet users, this committee recognizes the challenge facing social media companies and online platforms. their ability to act and remove content threatening violence from their sites raises questions about the tracking of users' online activity.
might such systems invade an individual's privacy, erode due process, or violate constitutional rights? the automatic removal of threatening content may also impact an online platform's ability to detect possible warning signs. the first amendment offers strong protections against restricting certain speech. this undeniably adds to the complexity of our task. i hope these witnesses will speak to these challenges and how their companies are navigating through them. in today's internet-connected society, misinformation, fake news, deep fakes and viral online conspiracy theories have become the norm. this hearing is an opportunity for witnesses to discuss how their platforms go about identifying content and material that threatens violence and poses a real and potentially immediate danger to the public.
our witnesses will also discuss how their content moderation processes work. this includes addressing how human review or technological tools are employed to remove or otherwise limit violent content before it is posted and disseminated across the internet. communication with law enforcement officials at the federal, state and local level is critical to protecting our neighborhoods and communities. we would like to know how companies are coordinating with law enforcement when violent or extremist content is identified. finally, i hope witnesses will discuss how congress can assist in ongoing efforts to remove content promoting violence from online platforms and whether best practices or codes of conduct in this area would help increase safety both online and offline. i look forward to hearing testimonies from our witnesses
as we engage in a constructive discussion about mutual solutions to a pressing issue, and i'm delighted at this point to recognize my friend and ranking member, senator cantwell. >> thank you mr. chairman, and thank you for holding this important hearing, and to our witnesses for being here this morning. across the country we are seeing and experiencing a surge of hate, so we need to think harder about the tools and resources we have to combat this problem both online and offline. while the first amendment to the constitution protects free speech, speech that incites imminent violence is not protected. congress should update laws that prohibit threats, harassment, stalking and intimidation to make sure we stop this online behavior. in testimony before the senate judiciary committee in july, the federal bureau of investigation, fbi, director said that white supremacist violence is on the rise. he said the fbi takes this extremely seriously and has
made over 100 arrests this year. we have seen this in my state over the last several years, where we suffered a shooting at the jewish federation center in seattle, the shooting of a sikh man in kent, washington, and a bombing attempt at a martin luther king day parade in spokane, and over the last year we have seen a rise in the desecration of synagogues and mosques. the rise of hate across the country has also led to multiple mass shootings, including at the tree of life congregation in pittsburgh, the pulse nightclub in orlando, and most recently the walmart in el paso. social media is used to amplify that hate: the parkland high school shooter posted images of himself with guns and knives on instagram prior to the attack on students, and the el paso shooter posted an anti-immigration manifesto on
a message board, and my colleague just mentioned the streaming of live content related to the christchurch shooting and the horrific incidents that happened there. the myanmar military engaged in a systematic campaign on facebook, using fake names and sham accounts, to promote violence against the rohingya. these human lives were cut short by hatred and extremism that, as we have seen, has become more common. this is a particular problem on the dark web, where we see certain websites host, 24/7, 365, what amount to hate rallies. having technology to stop the spread from mainstream websites to these dark websites is a start, but there needs to be a more comprehensive and coordinated effort to make sure people are not directed into these cesspools. i believe calling on the department of justice to make sure we are working across the board, on an international basis, with companies as well, to fight this issue is an important thing to be done.
we don't want to push people off the social media platforms only to then be on the dark web, where we have less ability to find them. we need to do more at the department of justice to shut down these dark websites, and social media companies need to work with us to make sure that we are doing this. on the subject of initiatives, the state of washington has passed three gun initiatives voted on by the people: closing loopholes related to private sales and enacting extreme risk laws. all were voted on by a majority of people in our state and successfully passed. i do appreciate that just last week representatives from various companies of all sizes sent a letter asking for passage of legislation requiring background checks. i very much appreciate that.
it would help keep guns out of the hands of those who are determined to be dangerous. so this morning we look forward to asking you about ways in which we can better fight these issues. i do want us to think about ways in which we can all work together to address them. i feel that, working together, there are successful tools that we can deploy in trying to fight extremism online. thank you mr. chairman for the hearing. >> thank you very much. now we will hear oral testimonies from our witnesses. your full statements will be submitted for the record without objection. we ask you to limit your comments at this point to five minutes. >> thank you.
thank you for the opportunity to be here today, to answer your questions and describe our efforts in these areas. my name is monika bickert, and i am facebook's vice president for global policy management and counterterrorism. i am responsible for our rules around content on facebook and the company's response to terrorists who attempt to use our services. on behalf of everyone at facebook, i'd like to begin by expressing my sympathy and solidarity with the victims, families, communities and everybody affected by the recent terrible attacks across the country. in the face of such heinous acts, we remain committed to standing against hate and violence. we are thankful to be able to provide a way for those affected by the horrific violence to communicate with loved ones, organize events that bring people together in grief, raise money
to help those communities and begin to heal. our mission is to give people the power to connect with one another and build community, but we know that people need to be safe in order to build that community. that's why we have rules in place against harmful conduct, including hate speech and inciting violence. our goal is to ensure facebook is a place where people can express themselves but where they are also safe. while we are not aware of any connection between the recent attacks and our platform, we certainly recognize that we all have a role to play in keeping our communities safe. that's why we remove content that encourages real-world harm. this includes content that incites violence, promotes or publicizes crime, coordinates harmful activities, or encourages suicide or self-injury. we don't allow any individuals or organizations that proclaim a
violent mission, advocate for violence, or engage in violence to have any presence on facebook, even if they are talking about something unrelated. this includes organizations and individuals involved in or advocating for terrorist activities, domestic and international, and organized hate, which includes white supremacy, white separatism, white nationalism and other violent movements. we also don't allow any content posted by anyone that praises or supports these individuals, organizations or their actions. when we find content that violates our standards, we remove it promptly. we also disable accounts when we see severe or repeated violations, and we work with law enforcement directly when we believe there is a risk of imminent harm or a threat to public safety. while there's always room for improvement, we have already removed a great deal of content, and much of that
is removed before anybody has reported it to us. our efforts to improve are focused in three areas. first, building new technical solutions that allow us to proactively find violating content. second, investing in people who can implement these policies: at facebook we now have more than 30,000 people across the company who are working on safety and security efforts, including more than 350 people whose primary focus is countering hate and terrorism. third, building partnerships with other companies, civil society, researchers and governments so that together we can come up with shared solutions. we are proud of the work we've done thus far to make facebook safer, but the work will never be complete. we know that bad actors will
continue with more sophisticated efforts, and we are dedicated to continuing to advance our work and show our progress. we look forward to working with the committee, regulators, others in the tech industry and civil society to continue this progress. again, i appreciate the opportunity to be here today. i look forward to your questions. thank you. >> thank you. twitter is publicly committed to improving the collective health, openness and civility of public conversation on our platform. our policies are designed to keep people safe on twitter, and they continuously evolve to reflect the realities we see. we are working faster to remove content before it is reported, including terrorist content. tackling terrorism and
the prevention of these attacks requires a whole-of-society response, including from social media. let me be clear: twitter is incentivized to keep terrorists off our service, both from a business standpoint and under the current legal frameworks. such content does not serve our business interests, it breaks our rules, and it is fundamentally contrary to our values. communities in america and around the world have been impacted by mass violence and terrorism with tragic frequency in recent years. these events demand a robust public policy response from everybody. tech companies have a role to play, however it is important to recognize that content removal alone cannot resolve these issues. first, we take a zero-tolerance approach to terrorism. individuals may not promote terrorism or engage in recruitment for terrorist acts.
since 2015, we've suspended more than 1.5 million accounts for violations of rules linked to terrorism, and we continue to see more than 90 percent of these accounts suspended through our own proactive measures. in the majority of cases, we take action before these accounts have even tweeted, and the remaining 10 percent is identified through user reports. secondly, we prohibit violent extremist groups. these are defined in our rules as groups that, whether by their statements on or off the platform, use or promote violence against civilians to further their cause, whatever their ideology. since introducing this policy in 2017, we've taken action on more than 186 groups globally and suspended more than 2,000 unique accounts. thirdly, we do not allow hateful content on our service. accounts may not threaten or make violent comments or attack
people. wherever these rules are broken, we will take action, up to and including permanent removal for terrorist activity. we prohibit the buying, selling or transaction of weapons, firearms, explosives and ammunition, and instructions on making them. we will take appropriate action on any account found to be engaged in this activity, including permanent suspension where appropriate. additionally, we prohibit the promotion of weapons and accessories globally through our paid advertising policies. collaboration with our industry peers and civil society is critically important to addressing the common threats from terrorism globally. in june 2017, we launched the global internet forum to counter terrorism, a partnership with youtube and facebook. this facilitates, among other things, information sharing, technical cooperation and
research collaboration, including with academic institutions. twitter and technology companies have a role to play in ensuring our platforms cannot be exploited by those promoting violence, but this cannot be the only public policy response. removing content alone will not stop those who are determined to cause harm, and if and when we remove content from our platforms, it can move these ideologies into the dark corners of the internet where they cannot be challenged. as larger companies have improved their efforts, this content continues to migrate to less-governed platforms and services. we are committed to learning and improving in every part we have to play. addressing mass violence requires a whole-of-society response, and we welcome the opportunity to continue to work with industry peers, government institutions, law enforcement, academics and civil society to find the right solutions. thank you for your time today. >> thank you.
my name is derek slater, and i'm google's global director of information policy. i lead a team that advises the company on public policy issues involving online content, including terrorism and hate speech. before i begin, i'd like to take a moment on behalf of everyone at google to express our horror at the recent attacks and our condolences to the families, friends and communities of the victims. although google's services were not involved in these recent attacks, we have engaged with people around the globe to be sure our platforms are not used to support violence or hate speech. in my testimony today i will focus on three key areas where we are making progress to help protect people. first, we work with governments and law enforcement; second, we prohibit products that would
cause damage, harm or injury; and third, we enforce policies around terrorist content and hate speech. first, google engages in an ongoing dialogue with law enforcement and government agencies to understand the threat landscape and respond for the safety of our users and the broader public. for example, when we have a good-faith belief that there is a threat to life or of serious bodily harm made on our platforms in the united states, the google cybercrime investigation group reports it to regional intelligence services in california so that it gets into the hands of law enforcement. we are also deeply committed to working with government, the tech industry and academia. since 2017, we've done this in particular through the global internet forum to counter terrorism, of which youtube is a founding company. in its first years, the forum instituted joint content incident protocols and also released
its first counter-speech campaign toolkit. second, we take the threat of gun violence very seriously, and our advertising policies have long prohibited the promotion of weapons, ammunition and similar products designed to cause damage, harm or injury. similarly, we also prohibit instructions for making such harmful products. we employ a number of proactive and reactive measures to be sure these policies are appropriately enforced, and we are constantly improving our systems and updating our enforcement and review processes. on youtube, we have rigorous policies and programs to defend against the use of our platform to spread hate or violence. for the past two years we have invested heavily in machines and people to quickly identify and
remove content that violates our policies. we use machine learning technology to enforce our policies effectively at scale. over 10,000 people across google are tasked with reviewing and removing content, and an intelligence desk proactively looks for new trends. we have improved escalation pathways. finally, we go beyond removals by actively creating programs to promote beneficial counter-speech, such as the creators for change program and alphabet's redirect method. this work has produced tangible results: over 87 percent of the 9 million videos we removed in the second quarter of 2019 were first flagged by our automated systems, and more than 80 percent of those auto-flagged videos were removed before they received a single view. overall, videos that violate our policies account for a fraction of a percent of views. our efforts do not end there. we are constantly evolving to meet these challenges and find ways to
improve our policies. for example, youtube recently updated its hate speech policy to prohibit videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. as we have ramped up enforcement of this policy, we have already seen a spike in removals. in conclusion, we take the safety of our users very seriously and value our close and collaborative relationships with law enforcement and government agencies. we understand these are difficult issues, and we want to be responsible actors who are part of the solution. as these issues evolve, we will continue to invest to meet the challenges. we look forward to collaborating with the committee. thank you for your time. i look forward to taking your questions. >> thank you.
mr. selim, your group is referred to as adl for short. we appreciate you being with us today and are happy to receive your testimony. >> thank you for the opportunity to be here with the distinguished members of this committee this morning. my name is george selim, and i serve as senior vice president for programs at the adl. for decades, the adl has fought against bigotry and anti-semitism by exposing extremist groups and individuals who spread hate and incite violence. today, the adl is the foremost non-governmental authority on domestic terrorism, extremism, hate groups and hate crimes. i personally have served in several roles in the government's national security apparatus, at the departments of justice and homeland security and the white house national security council, and outside government on the front lines of combating anti-semitism and all forms of bigotry at the adl.
in my testimony i'd like to share with you some key data, findings and analysis, and urge this committee to take action to counter a severe national security threat: the threat of online white supremacist extremism that has taken lives in our communities. the alleged el paso shooter posted a manifesto prior to the attack in which he expressed support for the accused shooter in christchurch, new zealand, who had also posted on 8chan. before the massacre in poway, california, the alleged shooter posted a link to his manifesto on 8chan, citing the terrorists in new zealand and in the pittsburgh tree of life attack. three killing sprees, three white supremacist manifestos. one targeted muslims, another jews, and a third targeted immigrant communities. one thing these three killers had in common was 8chan, an online platform that had become the go-to for many bigots
and extremists. access to mainstream platforms has significantly driven the scale, speed and effectiveness of these forms of extremist attacks. our adl research shows that domestic extremist violence is trending up, and anti-semitic hate is trending up. the fbi data shows similar trends. the online environment today amplifies hateful voices worldwide and facilitates the coordination, recruitment and propaganda that fuels the extremism that terrorizes our communities. all of our communities. whether through government, the private sector or civil society, immediate action is needed to counter hate that could take innocent lives. we have tried to address that hate and its rapid spread online. we have been part of conversations to improve terms of service and content moderation programs and
better support those individuals experiencing hate and harassment on those platforms. we appreciate this work greatly, but much more needs to be done. adl has called on the companies at this hearing, as well as many others, to be far more transparent about the prevalence and nature of hate on their platforms. we need meaningful transparency to give actionable information to policymakers and stakeholders. but the growth of hate and extremist violence will not be solved by addressing these issues online alone. we urge this committee to take immediate action. first, our nation's leaders must clearly and forcefully call out bigotry in all of its forms at every opportunity. our nation's law enforcement leadership must make enforcing hate crimes laws a top priority. our communities need this congress's immediate action in a variety of ways: to address domestic terrorism and extremism and create transparent and comprehensive reporting, such as that required in the
domestic terrorism prevention act, and similar measures in the domestic terrorism data act. our federal legal system lacks the means to prosecute white supremacist terrorists as terrorists. congress should explore whether it is possible to craft a rights-protecting domestic terrorism statute. any statute that congress considers would need to include specific, careful congressional and civil liberties oversight to ensure the spirit of such protections is faithfully executed. in addition, the state department should examine whether certain foreign white supremacist groups meet the criteria for designation as ftos, or foreign terrorist organizations. for technology and social media companies, we look forward to companies expanding their terms of service, exploring the accountability and governance challenges, aspiring to greater transparency in how they address these issues, and partnering with civil society groups to help in all of these efforts. adl stands ready to work with both the
government and the private sector to better address all forms of threats online. this is an all-hands-on-deck moment to protect all of our communities. i look forward to your questions, mr. chairman, ranking member and other distinguished members of this committee. thank you. >> thank you. ms. bickert, on your platform, how do you define violent content and extremist content? >> thank you mr. chairman. we will remove any content that incites or celebrates any violent act, physical injury or death of another person. we also remove any organization that has proclaimed a violent mission or engaged in acts of violence. we also don't allow anybody who has engaged in organized hate to
have a presence on our site, and we remove hate speech. we define that as an attack on a person based on his or her characteristics, like race, religion, sexual orientation or gender; we list them out in our policies. >> it is harder to define extremist than violent, isn't that correct? >> yes, we see people use that word in different ways. any organization that has proclaimed a violent mission or engaged in documented acts of violence, we remove. it doesn't matter what the reason is; we just don't allow it. >> mr. pickles, what is your platform's definition of extremist? >> similar to facebook, we agree that the word itself can be subjective in some contexts; people can be extremely active on an issue, and that in itself isn't a conflict. so we have a three-stage test.
for any banned groups, the test is: we identify them through their stated purpose, publications or actions as extremist; they engage in violence, currently engage in violence or promote violence; and they target civilians. so we have a three-stage test, regardless of the ideology involved. we believe that framing allows us to protect speech while removing violent extremism from our site. our rule against calls for or wishes of harm against people is much broader and, again, not dependent on ideology. >> mr. slater, can you add any nuances? >> our approach is broadly similar: we remove designated foreign terrorist organizations, glorification of violence, incitement to violence and hate speech. so, broadly similar.
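the three-stage test mr. pickles describes can be read as a conjunction of three criteria, applied regardless of ideology. the sketch below is only an illustration of that logic; the field and function names are our own invention, and twitter's actual rules are written prose applied by human reviewers, not code.

```python
from dataclasses import dataclass

@dataclass
class GroupProfile:
    """Evidence gathered from a group's stated purpose, publications, or actions."""
    identifies_as_extremist: bool          # prong 1: identified as extremist
    engages_in_or_promotes_violence: bool  # prong 2: engages in or promotes violence
    targets_civilians: bool                # prong 3: targets civilians

def meets_ban_test(group: GroupProfile) -> bool:
    # All three prongs must hold; ideology plays no part in the test.
    return (group.identifies_as_extremist
            and group.engages_in_or_promotes_violence
            and group.targets_civilians)

# A group that is merely "extremely active" on an issue fails the test,
# which is how this framing protects ordinary speech.
print(meets_ban_test(GroupProfile(False, False, False)))  # False
print(meets_ban_test(GroupProfile(True, True, True)))     # True
```

the conjunction matters: requiring all three prongs keeps the rule narrow enough that heated but non-violent advocacy is never caught by it.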
>> mr. selim has suggested that your three platforms need to be more transparent. what do you say to that, mr. slater? >> thank you, mr. chairman. i think transparency is the bedrock of the work we do, particularly around online content. we try to help people understand what the rules are and how we are enforcing them, and we continue to get better at that. we look forward to working with this committee. in the last year on youtube, we introduced a community guidelines enforcement report where you can go in and see how many videos we've removed in a quarter, for what reasons, and which were flagged by machines versus users, and we break it down by violent extremism, hate speech and other safety categories. so i think this is a really key issue, and we look forward to continuing to improve. >> mr. selim, perhaps you could help them
understand why you frankly don't believe they're quite transparent enough at this point. >> to be clear, the point i'm making on transparency is to make sure that there are more clearly delineated categories between, to the point mr. slater was making, what the machines or algorithms remove or stop from going up in the first place, and what users of any of these platforms flag as a violation of terms of service. there are degrees of inconsistency across the platforms at the table, as well as others. so to get a holistic picture of a certain issue, what individuals may flag and what algorithms pull down are counted differently. what we are asking for in transparency is a much more balanced and consistent approach across all of the platforms. >> mr. pickles, is he
touching on something that has a point? >> absolutely. i think the balance between companies investing in technology and understanding what came down because a person reported it versus because technology found it is very important. we now publish a breakdown across six policy areas. we receive 11 million user reports every year, but 40 percent of the content that we remove, we remove because technology found it, not because of user reports. telling that story in a meaningful way is a challenge. >> and that percentage at facebook is bigger? >> when it comes to violent content and terror content, 99 percent of what we remove is spotted by our technical tools before we get a report. some of it is found by artificial intelligence; some of it is image matching of known videos, where we use
software to reduce a video to basically a digital fingerprint, and we are able to stop uploads of that video again. we have worked with adl for years on this, and i think transparency is key; i think we'd all agree. over the past year and a half, we have published not only our detailed implementation guidelines for exactly how we define hate speech and violence but also reports on exactly how much we are removing in each category and, as mr. pickles says, how much is actually caught by our technical tools before we get user reports. >> thank you very much. >> cantwell: mr. selim, you mentioned 8chan, but what do you think we need to do to monitor incitement on 8chan and other dark websites? >> i think you can approach this issue from two categories. there are a number of measures, some of which i noted
in my written statement submitted to this committee, that these companies as well as others can take to create a greater degree of transparency and standards, so that we can have a really accurate measure of the types of hatred and bigotry that exist in the online environment at large. as a result of that increased or better data, we can make better policies that apply to content moderation, terms of service, etc. so i think having good data is the framework for better policies and better application of content moderation programs. >> you are saying there is more that they can do? >> yes ma'am, much more they can do. >> i noticed in your statement that you include auditing: third-party evaluations for that transparency as well as accountability. as i mentioned in my opening statement, all of this can be driven to the dark web, and then we have less access.
what more do you think we should be doing? >> a number of measures. the first is having our public policy start from a place where the victims are the focus. whether it is pittsburgh, el paso or any number of cities that other panelists and members of this committee have mentioned in their statements, we need more measures that combat extremism and domestic terrorism, with an eye toward preventing other such horrific tragedies. in order to do that, we really need better accounting of hate-related incidents, etc. if we start from that place, i think we can make better policy and better programs at the federal, state and local levels, and also in private industry as well. >> that's one of the reasons i'm definitely going to be calling on the
department of justice to thanks what more can we do in this. several years ago, microsoft and others have worked on working on international bases child pornography to better skill law enforcement at policing crime scenes online. i would assume that the representative today would be supportive and they may be helpful and even financially helpful in trying to address these crimes as they exist today. as hate crimes on the dark side of the web. is that or do i have any responses from our tech companies here? >> thank you senator cantwell, this is it across the industry we have been working on for the past few years in a manner very similar to how the industry came together online. we launch the global global internet terrorism both of my colleagues referred to, as a way
of getting industry to create sort of a no-go zone for this terrorist and violent extremist content. as part of that, we have trained hundreds of smaller companies on best practices and make technology available to them. the reality is, the bigger companies often are able to build technical tools that will stop videos at the time of upload; that is much harder for the smaller companies, which is why we provide technology to them. we now have 14 companies involved in a hash-sharing consortium so that we can help the smaller companies stop terrorist content at the time of upload. >> i appreciate that there's more that you can do on your own sites. but setting that aside for a minute, what do you think we should do about 8chan and the dark websites? what do we all think we should do? >> i can tell you we ban any link that is connected to 8chan, where these manifestoes have appeared.
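The hash-sharing approach described in the testimony above can be illustrated with a minimal sketch. This is not the consortium's actual implementation: production systems use perceptual hashes that survive re-encoding and cropping, whereas this sketch uses exact sha-256 matching only to stay self-contained, and all function names are illustrative.

```python
import hashlib

# Illustrative shared database of fingerprints of known terrorist
# or violent content, contributed by member companies. A real
# system would hold perceptual hashes (PhotoDNA-style), which
# tolerate re-encoding; exact SHA-256 is used here for simplicity.
shared_hash_db = set()

def fingerprint(content: bytes) -> str:
    """Reduce a file to a compact digital fingerprint."""
    return hashlib.sha256(content).hexdigest()

def register_known_bad(content: bytes) -> str:
    """A member company contributes the fingerprint of removed content."""
    h = fingerprint(content)
    shared_hash_db.add(h)
    return h

def should_block_upload(content: bytes) -> bool:
    """Any member company checks new uploads against the shared set."""
    return fingerprint(content) in shared_hash_db

# usage: once one company registers a video, all members can block it
video = b"...bytes of a previously removed video..."
register_known_bad(video)
assert should_block_upload(video) is True
assert should_block_upload(b"unrelated home video") is False
```

The design choice the witnesses describe follows from the lookup being cheap: a small company needs only the shared hash set and a fingerprint function to block re-uploads at upload time, without building its own detection models.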
so those manifestoes, whether from the el paso shooting or others, are not available through facebook. >> i'm saying, what more can you do, law enforcement and government together, besides what you do to enforce it? >> i think certainly, if there is criminal activity on these platforms, the primary tools we have in our toolbox are related to content. if people are promoting violence against individuals, those are in many cases crimes. i think it's something that should be looked at, and if we can strengthen the industry's cooperation with law enforcement, we can make sure that the information sharing is as strong as it needs to be to support those investigations. >> so you think we need more law enforcement resources addressing this issue? >> on the question of resources, there was a speech at george washington university last week looking at the legal framework around some of these cases. if there are opportunities to strengthen them, as we mentioned, i think that is a worthwhile public policy conversation to have. >> i definitely think we need a lot more law enforcement resources. i look at the progress with everybody fighting on other issues, and i think this is something, and i hear that from mr. selim, where resources matter. >> senator fischer. >> thank you, mr. chairman. in june, i held a subcommittee hearing on persuasive design. as we discussed, facebook, twitter and youtube are engineered to track, capture and keep our attention, whether it's through predicting the next video to keep us watching or deciding what content to push to the top of our newsfeeds. i think we have to realize that when platforms fail to block extremist content online, this content doesn't just slip
through the cracks; it is amplified to a wider audience. and we saw this during the christchurch shooting. the new zealand terrorist's facebook live broadcast was up for up to an hour, according to the wall street journal, before it was removed, and it gained thousands of views during that timeframe. ms. bickert, how do you confront the increased risk from how your algorithms boost content while gaps still exist in getting dangerous content off of the platform? you touched on that a little bit in your response to senator wicker, but how are you targeting solutions to address that specific tension that we see? >> senator, thank you for the question. it's a real area of focus. there are three things we are doing. probably the most significant is technological improvements; i'll come back to that in a second. second is making sure that we are staffed to very quickly review reports as they come in. so with the christchurch video, once that was reported to us by law enforcement, we were able to remove it within minutes. that response time is critical to stopping the virality you mentioned. and finally, partnerships. we have hundreds of ngos and civil society organizations that we partner with; if they are seeing something, they can flag it for us through a special channel. and going back to the technology briefly: with the horrific christchurch video, one of the challenges for us was that our artificial intelligence tools did not spot the violence in the video. what we are doing going forward is working with law enforcement agencies, including in the us and the uk, to try to gather videos that could be helpful training data for our technical tools. this is one of the many efforts we have to try to improve these machine learning technologies so
we can stop the next viral video at the time of upload, or even at the time of creation. >> you talk about working with law enforcement, and you said law enforcement contacted you. is that reciprocal? do you see something show up and then in turn try to get it to law enforcement as soon as possible so that individuals can be identified? what is the working relationship there? >> absolutely, senator. we have a team that is our law enforcement outreach team. anytime we identify a credible imminent harm, we will reach out proactively to law enforcement agencies, and we do that regularly. also, when there is some sort of mass violence incident, we reach out to them even if we have no indication that our services were involved at all. we want to be sure the lines of communication are open and that they know how to submit emergency process to us. we respond around the clock and in a very timely fashion, because we know that every minute is critical in this type of situation. >> i'm a former prosecutor myself, so these things are very personal to me. the platforms represented here today have increased your efforts to take down this harmful content, but as we know, there are still shortfalls in getting that response made not just in a timely manner but in a way that is going to truly have an effect. mr. slater, when it comes to liability, do social media platforms need more skin in the game, so that you can ensure better accountability and be able to reach some kind of timely solution? >> thank you, senator, for the question. i think if you look at the practices that we are all investing in, certainly from our perspective, we are getting better over time. the framework strikes a reasonable balance. in particular, it both provides
protection from liability for third-party content, and also acts as a sword, not just a shield, empowering us and giving us the legal certainty that we need to invest in these technologies and people to monitor, track and remove this sort of violent content. i think in that way the legal framework continues to work well. >> can you comment on this as well, mr. selim? do you think there is enough real motivation for social media platforms to prioritize some kind of solutions out there? that's what this hearing is about: finding solutions so that we can curb that online hate, which i think continues to grow. >> when thinking through the issues of content moderation, the authority that exists within the current legal framework for the companies represented at this table is sufficient for them to take action on issues of content moderation, transparency reporting, etc. there certainly is a degree of legal authority that affords these companies, as well as others, the opportunity to take any number of measures. >> ms. bickert, in your testimony, you say that facebook live will ban a user for 30 days for a first-time violation of its platform policies. is that enough? can users be banned permanently? would that be something to look at? >> one serious violation will lead to a temporary removal of the ability to use live; however, if we see repeated violations, we simply take the account away. that is something we do across the board, not just with hate but with other content as well.
thank you. >> thank you, mr. chairman, and thank you all for being here today, and thank you for outlining the increased attention and intensity of effort that you are providing to this profoundly significant area. i welcome that you are doing more, but i would suggest that even more needs to be done, and that you need to do better in the resources and technological capabilities you devote to doing more. just to take the question that senator fischer asked of you, mr. selim: your answer was that they have the authority, which provides them with the opportunity. the question is, really, don't they need more incentive to do more and to do it better, to prevent the kind of mass violence that may be spurred by hate speech appearing on these sites, or that may in fact actually be a signal of violence to come? i just want to highlight that perpetrators of mass violence provide clear signals and signs that they are about to kill people. that is the reason senator graham and i have a bipartisan measure to provide incentives for more states to adopt extreme risk protection order laws, which will give law enforcement the information they need to take guns away from people who are dangerous to themselves or to others. and that information is so critically important to prevent
mass violence but also suicides and domestic violence, and in case after case the information and signals often appear on the internet. in fact, just this past december in monroe, a clearly troubled young man made a series of anti-semitic rants and violent posts online, in which he bragged about shooting up a school while armed with an ar-15 style weapon, and posted on facebook that he was, quote, "shooting for 30." fortunately, the adl saw that post, it went to the fbi, and the adl's vigilance prevented another tree of life attack. an official from florida met with me and told me about a similar incident involving a young man in coral springs who said he was about ready to shoot up the high school there. law enforcement was able to prevent it. so my question, to facebook, twitter and google, is: what more can you do to make sure that these kinds of signs involving references to guns, and it may not be hate speech, but references to possible violence with guns or use of guns, are made available to law enforcement? >> thank you, senator. one of the biggest things we can do is engage with law enforcement to find out what is working in the relationship and what isn't. it was that dialogue over the past
year that has led to us establishing a portal through which they can electronically submit requests for content with legal process, and we can respond very quickly. >> what are you doing proactively? i apologize for interrupting, but my time is limited. proactively, what are you doing with the technology you have to identify the signs and signals that somebody is about to use a gun in a dangerous way, that someone is dangerous to himself or others and is about to use a gun? >> we are now using technology to try to identify any of those early signs, including threatened violence but also suicide or self-injury. >> and do you report that to law enforcement? >> yes, we do. in 2018, we reported many cases of suicide or self-injury, where we detected them by artificial intelligence, to law enforcement, and they were able to intervene and in many cases save lives. >> where we have a credible threat, a risk to others or to themselves, we work with the fbi to make sure they have the information. >> similarly, where we have a good-faith belief of a credible threat, we will proactively refer it to the northern california regional intelligence center, and they will then fan it out to the right source. >> because my time has expired, i would ask each of you if you would please give me more details in writing, as a follow-up, on what identification signs you use, what kind of technology, and how you think it can be improved, assuming that congress approves, and i hope it will, the emergency risk protection order statute to provide incentives so that more than just the 18 states now, but others as well, would do the same. thank you. >> thank you.
your participation here today is appreciated, preserving openness on your platforms while seeking responsible management of the actions of those who use your services to spread extremist and violent content. we held a hearing looking at terrorist recruitment and propaganda online, where we discussed the cross-sharing of information between facebook, microsoft, and youtube, which allowed each of those companies to identify potential extremism faster and more efficiently. so i'll just direct this question: how effective has that shared database of hashes been? >> through the shared database, we now have more than 200,000 distinct hashes of terror propaganda, and that has allowed, and i can speak for facebook only, but that has allowed facebook to remove a lot more than we would otherwise have been able to. >> the reassuring thing is that we don't just share hashes now. we have grown the partnership, so if we see a link to content like a manifesto, we are able to share that across industry. furthermore, i think there are areas where we have improved. we now have real-time communications in a crisis, so industry can talk to each other in real time operationally; they can share things that are not content related but situational. that partnership between industry now also involves law enforcement, which wasn't there when we last had that hearing. so i think it's not just about the hash program; it's broadening out new programs and developing the network. >> broadly, i would say, look at how we have been improving over time. surely our systems are not perfect, and they will have to evolve to deal with bad actors, but on the whole we are doing a better job, in part because of technology sharing, removing this sort of content before it has wide exposure of any sort. >> i would only add that the threat environment that we are in today as a country has changed and evolved in the past 24 to 36 months. likewise, the tactics and techniques that these platforms, as well as others, use need to evolve, whether the terrorist threat online is foreign or domestic, to keep pace with the threat environment that we are in today. >> are there similar partnerships with smaller platforms to specifically identify mass violence? >> senator, one of the things that we've done over time is expand the mandate of the global internet forum to counter terrorism. we relatively recently expanded it to include mass violent incidents.
we are now sharing both our crisis protocol and a broader variety of content. >> youtube's automated recommendation system has come under criticism for potentially steering users to violent content. earlier this year i led a subcommittee hearing on the use of persuasive technologies and the role algorithms play in content selection. i asked the witness that google provided at the time for that hearing several specific questions about youtube, and they were not thoroughly answered. i would just say that providing complete answers to questions is essential as we look to work together as partners to combat many of the issues discussed here today. i would like your commitment to provide thorough responses to any questions you might get for the record. do i have that? >> to the best of our ability. >> in addition, i would like to explore the nexus between persuasive technologies and today's topics. specifically, what percentage of youtube video views are the result of youtube automatically suggesting or playing another video after the user finishes watching a video? >> i don't have that specific statistic. the purpose of our recommendation system is to show people videos that they may like because they're similar to the ones they watched before. also, we do recognize the concern around borderline content, content that may not be removed but that brushes right up against those lines, and we have introduced changes this year to reduce recommendations of content that is borderline. >> if you can get the number, that's got to be available. so the question again is to ask you specifically: what is youtube doing to reduce the risk that these features are pointing the user to
violent content? >> the change we made in january has been key, and it is working well. we have reduced views from those recommendations by 50 percent just since january. as the systems get better, we hope that will improve further. >> thank you. based on precedence at the gavel, we next have senator blackburn, followed by senator scott. >> mr. chairman, i want to thank each of you for being here this morning. thank you for talking with us. this committee has looked at this issue of the algorithms and their utilization for some time, and we are going to continue to do so, looking at content, and at the extremist content that is online. it is certainly important, and we know that there is a host of solutions, and we need to come to an agreement and an understanding of how you are going to use these technologies to really protect our citizens. social media companies are in essence open public forums, and they should be, where people can interact with one another. part of your responsibility in this is to have an objective cop on the beat, and to be able to see what is happening, because you are looking at it in real time. but what has unfortunately happened, in many instances, is that you don't get an objective view. you don't get a consistent view. you get a subjective view. and this is problematic. it leads to confusion for the public that is using the virtual space for entertainment, for their transactional life, and for obtaining their news. so indeed, as we look at this issue, we are looking for you to approach it in a consistent and objective manner, and we welcome the opportunity to visit with you today. ms. bickert, i've got a couple of things i want to talk to you about. we've all heard about these third-party facilities where contractors are working long hours looking at grotesque and violent images, and they are doing this day in and day out. so talk a little bit about how you transition from that to using modern technologies, and what facebook is going to do in order to capture this content, extract it, and minimize harm. you talked about how you've got 30,000 employees that are working
on safety and security, and then third-party entities that are working on this. let's talk about the impact on those individuals, and then talk about the use of technologies to speed this up and make it more consistent and accurate. >> making sure that we're enforcing our policies is a priority for us, and making sure that our content reviewers are healthy and safe in their jobs is paramount. among the things we do, we make sure that we are using technology to make their jobs easier and to limit the amount of content, and the types of content, that they have to see. a couple of examples: with child exploitation videos, with graphic violence, with terror propaganda, we are now able to use technology to review a lot of that content so that people don't have to. and in situations... >> let me ask you this, sorry to interrupt, but we need to move forward. your reviewers, your 30,000 reviewers, are they scattered around the country or around the globe, or where are they located? >> we have 30,000 people working in safety and security; some of them are engineers and lawyers. the content reviewers, of whom we have more than 15,000, are based around the world. and for many of them, not only are we using technology in the ways i mentioned, but even when we cannot make a decision on the content using technology alone, there are things we can do, like muting the volume or separating a video into still frames, that can make the experience better for the reviewer. >> let me ask you about this: mark zuckerberg, in a washington post op-ed, had called for us to regulate that lawful
but awful speech. so tell me how you think you could define, or we could define, lawful but awful speech without overreaching or infringing on somebody's first amendment free-speech rights. >> one of the things we are looking for in our dialogue with governments is clarity on the actions that governments want us to take. we have our set of policies that lay down very clearly how we define things, and we don't do that in a vacuum. we do that with a lot of input from civil society organizations and academics around the world. but we also like to hear the views of governments, so that we can make sure that we are mindful of all of the different sensitivities. >> ours are constitutionally based. i am out of time. mr. pickles, i'm going to submit a question to you for the record. mr. selim, i've got one that i'm going to send to you. mr. slater, i always have questions for google; you can depend on me to get one to you. we do hope that you all are addressing your prioritization issues also. with that, mr. chairman, i yield back. >> thank you very much. senator scott. >> thank you. i'm glad we are having a meaningful conversation about what's happening in our nation. so let's face the facts: our culture is producing an underclass of predominantly white young men who place no value on human life. these individuals live purposeless lives in anonymity, increasingly chasing after the most evil desires, sometimes with racial hatred. as you all know, when i was governor, we had the horrible shooting at the school in parkland. within three weeks, we passed historic legislation, including risk protection orders, that we developed with law
enforcement and mental health counselors and educators to come up with the right solution. now, with regard to the shooting in parkland: the killer, nikolas cruz, had a long history of violent behavior. in september of 2017, the fbi learned that someone with the username "nikolas cruz" had posted a comment on a youtube video that said, "i am going to be a professional school shooter." he made other threatening comments on various platforms. the individual whose video nikolas cruz posted the comment on reported it to the fbi. unfortunately, the fbi closed the investigation after 16 days without ever contacting nikolas cruz. the fbi claimed they were unable to identify the person who made the comments. tragically, we now have 17 innocent lives that were lost because of nikolas cruz. my question is for mr. slater: how is a platform like youtube, which is owned by google, not able to track down the ip address and identity of the person who made the comment? when did youtube remove the content, and did you report the comment to law enforcement? if so, to whom and when? if you did report this comment to law enforcement, did you follow up, and what was the process? was there any follow-up to see if there was any corrective action? >> i do not have the details on the specific facts that you are raising. going forward, looking ahead, parkland was a moment for us to proactively reach out to law enforcement and ask how we can do this better. that's part of how we work more
closely with the northern california regional intelligence center, to make sure that when we do have these threats, we can go to a one-stop shop that can get them to the right local law enforcement, instead of us calling people. this month, or in the last month, there was an incident where pbs was streaming and somebody put a threat in a live chat. we referred that to the regional intelligence center and then to the orlando police, who took the person into custody appropriately. this was reported in the news. that is not to say things are perfect, but i do think we continue to improve over time. >> with regard to that, you'll get me the information? who did you contact, and when? when was it taken down? to this day, i cannot get an answer on what anybody did with regard to this shooter, what youtube did, what the fbi did; nobody wants to talk about it. so if you get me the information: are you comfortable that if another nikolas cruz puts something up, you have a process, you will contact somebody, and there will be a follow-up process? >> senator, i think our processes are getting better all the time, and they are robust. i think this is an area where it's an evolving challenge, both because technology evolves and because tactics evolve; they might use codewords and so on. i'll follow up with the team and see how our practices are operating and how we can work together. >> mr. pickles: how can nicolas maduro, who is committing crimes against his citizens, withholding clean water, food and medicine, still have a twitter account with 3.7 million followers?
>> you highlight behavior that has taken place off the platform, and the question for us, as a company that provides a public space for dialogue, is whether someone is breaking our rules on our service. we recognize there are situations and circumstances in different countries, and so we do take the view, and hope, that the dialogue from that person being on the platform helps contribute to addressing what you outlined. >> he has been doing it for a long time, and it is not getting better in venezuela. it's getting worse. >> i think this is a good illustration of how the role of technology meets other parts of public policy responses: if we removed that person's account, the situation would not change. so we need to bear in mind how the other levers come into play. >> maduro talks about things and continues to act like he's a world leader, and it seems to me that what you're doing is allowing him to continue to do that. >> his current account has not broken our rules. if he broke our rules, he would be treated the same as any other user, and we would take action when necessary. >> i know we have votes that have started and you're trying to get to other people. i'd be happy to work with the senator going forward on this issue. i do think we're not doing enough, and i spoke specifically in my opening statement about what happened on facebook as another example, and i'd be happy to work with you on this issue. >> yes, thank you, senator, and thanks to senator cantwell and senator scott for raising this. i am shocked to hear that they're going to leave it open until 11:30,
which is generally what happens. senator duckworth. >> thank you, mr. chairman. while i do appreciate this committee's consideration of issues of extremism in social media, many i think would agree that today's hearing follows a long history of congressional inaction on gun violence. according to the gun violence archive, since 2019 began 260 days ago, we have witnessed 318 mass shootings in the u.s., more than one per day. mass shootings are incidents where at least four people are shot, excluding the shooter. after 20 children, six adults and the shooter lost their lives at sandy hook in 2012, many elected officials, including myself, declared an end to congressional inaction. "no more," we said. but since that day our nation has suffered 2,226 mass shootings. think about that number for a minute. but here we are, focused not on gun violence but rather on social media. i am not going to say there's no connection, but every other country on the planet has social media, video games, online harassment, crime, and mental health issues, and they do not have mass shootings like we do. nothing highlights the absurdity of our approach to solving the gun violence crisis more than seeing 318 mass shootings in 260 days and then holding a hearing on extremism and social media. this is a chart from the digital marketing institute that, according to their website, highlights the average number of hours that social media users spend on platforms like facebook and twitter. as you will see, the united states is relatively middle of the pack when it comes to time spent online. my question to you both: do you
agree that americans' use of social media is not especially unique on a per capita basis? in other words, are you aware of specific trends on your platforms that explain the amount of gun violence in the united states? >> this will not come out of your time; some of us cannot see the details. >> it shows the average number of hours that social media users spend using social media each day via any device. >> the arrow points to the united states? >> the highest is the philippines and the lowest is japan; the u.s. is right in the middle. american users... i have a four-and-a-half-year-old and an 18-month-old, and when i get home she says "iphone," and she is on it and knows how to go to youtube kids and goes right to what she wants. would you both agree that we are in the middle of the pack compared to the rest of the world? >> yes, senator, according to this study, which i'm not familiar with, yes. >> are you aware of specific trends on your platform that would explain the amount of gun violence in the united states? >> no. i think it reflects the fact that most of our users are outside the united states, and i think you're right; it speaks for itself. >> mr. selim, you brought up the role that video games can play in hate and harassment. i agree that discrimination is wrong regardless of the platform used, but if a meaningful connection between video games and gun violence existed, you would think the widespread use of video games would reflect that connection, correct? look at this chart on the availability of guns in the u.s.: the amount of time both japan and south korea spend on video games is
far greater than anywhere else; we are third. and if you look at the number of gun violence incidents and gun deaths, here is the u.s., and we are not the biggest users of video games. would this be accurate? >> senator, thank you for your question. i have not read the specific study, but i do have one data point, if i may share it with you for a moment. according to a report looking at extremist-related murders and homicides over the past decade, research shows that 73% of extremist-related murders and homicides were in fact committed with firearms. to the extent that you're making the point that extremists with weapons result in violence and homicide, we have the data that backs that point up. >> thank you. as we are reminded daily, there are individuals who use social media platforms to disparage others, draw false equivalencies, and question facts. some use online platforms to spread hate, but neither that nor video games explains the 2,226 mass shootings since sandy hook. social media allows individuals to develop online communities and to share ideas. it is weak gun laws that allow the hate to become deadly. there is a clear and undeniable connection between the number of guns in the united states and the number of gun deaths in our communities. look at this chart: the number of guns per 100 people and the number of gun-related deaths per 100,000 people. we are up here; here's the rest of the world, some of whom use more social media than we do, some of whom engage in more video games than we do. we are saturated in weapons that were designed for war but made available to anyone who attends a local gun show, where they can buy a 100-round drum.
i did not have one when i served in iraq. we send marines into battle with 100-round drums, but you can buy them at gun shows. congress should expand background checks and address high-capacity magazines, which are involved in some 60 percent of these shootings; this is what we need to do, and it is not controversial. it is well past time that leader mcconnell brings h.r. 8, the house-passed background checks bill, to the senate floor for a vote. i hope leader mcconnell will also allow votes on the keep americans safe act, the disarm hate act, and the domestic terrorism prevention act; they will keep our children and our neighbors safer. i hope my republican colleagues will join in these bipartisan efforts. thank you, and i yield back. >> senator duckworth, let's do this: so we can have a complete record, if you would reduce those three posters to a size that we can copy, they will be admitted into the record at this point in the hearing without objection. >> thank you very much, sir. >> senator young. >> thank you, mr. chairman. i want to thank all of our panelists for being here today; i appreciate your testimony and your answering our questions. we all need to collaborate in curbing online extremism, which i understand to be one of multiple causes that we can cite as we think about the issue of mass casualty events and extremist events more generally. the nation is wrestling with mass violence, extremism and digital responsibility, and for some of
these events, in my home state of indiana, hoosiers in crown point, indiana recently experienced firsthand how a person can become radicalized over the internet, something that i know many of you have studied and are working on. in 2016, a man was arrested and convicted for planning a terrorism attack after being radicalized by isis over the internet. thankfully the fbi and the indianapolis joint terrorism task force intervened before any violent attack occurred. however, that is not always the case, as we know, and we see this across the country. that is why it's good we can have this hearing to work together collaboratively, knowing that your products and platforms provide incredible value to consumers and they obviously were not intended for this
purpose. so it's our responsibility in congress, and definitely your responsibility as businesspeople, to make sure that we monitor how the great value that you provide can be misused in illicit, improper and dangerous ways. in one minute or less each, because i have three minutes left, i would request that the representatives from google, facebook and twitter tell us why americans should be confident that each of the companies is taking this issue seriously and why americans should be optimistic about your efforts going forward. >> one minute each. >> indeed. >> google. >> thank you, senator. i would start by pointing to the youtube community guidelines enforcement report, which details every quarter the videos we removed, the reasons why, and how many were flagged first by
machines. dealing with this issue and removing violent content is a combination of technology and people: technology can find patterns, and people can help with the nuances. we have seen over time that the technology is getting better and better at taking down the content faster, before people have viewed it. of the roughly 9 million videos that we removed in the second quarter of this year, the vast majority were first flagged by machines, and 80% of those were removed before a single view; with violent extremism content we do even better in terms of removal before wide viewing. we are already seeing advancements in machine learning across the industry broadly. and the thing about machine learning is that it learns from fixes: if we were wrong, the systems will get better. so why should you be optimistic? the systems ideally will continue to get better. will they be
perfect? no, but they will continue to evolve. i do think there is reason for optimism, and i think there's reason for optimism based on the collaboration between all of us here today. >> facebook. >> thank you, senator. the first thing i will say is that facebook won't work as a service if it is not a safe place; this is something we are aware of every day. we want the people who come together on facebook to be safe. one of the things we have is a team of 350 people whose jobs are primarily dedicated to countering terrorism and hate, and they bring real expertise. my background is more than a decade as a federal criminal prosecutor, and the people i've hired onto this team have backgrounds in law enforcement and in academia studying terrorism and radicalization. this is something that people come to work on at facebook because this is what they care
about; they're not just assigned to work on it at facebook. this is bringing in expertise, and i want to make that clear. finally, similar to my colleagues, we have taken steps to make what we're doing very transparent. the reports over the past year and a half show a steady increase in our ability to detect terrorist violence and hate much earlier, when it's uploaded to the site and before anybody reports it to us. now more than 99% of the violent videos and terrorist propaganda that we remove from the site, we are finding ourselves before it's been reported to us. >> thank you. twitter. >> i think people can be optimistic. a few years ago, at the peak of the so-called islamic state, people challenged our industry to do more and be better. i now look at a time where 90% of the terrorist content that twitter removes is found through technology. i have met with independent academics
who talk about the isis community being decimated on twitter. i look at the collaboration that we have between our companies, which did not exist when i joined twitter five and a half years ago. all of those areas have brought better technology, faster response and a more aggressive posture. there is benefit in other areas, but i think we can also be confident that no one is going to tell this committee our work is done. every one of us will leave here today knowing we have more to do; the threat is adversarial and we have to keep it up. >> thank you so much. whether it's five days, five weeks, five months or five years, i know i only have five minutes, and i am running one minute over, mr. chairman. >> thank you. senator rosen, you're next. i have to go vote, and i can assure you i will not let them close that vote until you've asked your questions and i get over there. >> i appreciate it, senator. thank you for holding this important hearing. i want to
thank all the witnesses for being here to talk about this real and difficult issue. the rise of extremism online is a serious threat. the internet has unfortunately proven a valuable tool for extremists connecting with one another in various forums to spread hate and dangerous ideologies. while we're here today to focus on the proliferation of extremism online, which is incredibly important, we must not lose sight of the fact that violent individuals who find communities online that feed their hatred have acted in the name of hate. we cannot ignore the fact that the absence of some simple gun safety measures, like background checks, is allowing individuals to access dangerous weapons far too easily. and we know the majority of americans want us to support that. i represent the great state of nevada, and as we approach, unfortunately, the two-year anniversary of the october
shooting in las vegas, the deadliest mass shooting in modern american history, we know coordination within and between law enforcement agencies is more important than ever. the counterterrorism center, also known as a fusion center, is an example of collaboration between 27 different law enforcement agencies to rapidly and accurately respond to terrorist and other threats. with las vegas hosting nearly 50 million tourists and visitors each year, the fusion center is responsible for preventing countless crimes and even acts of terrorism. so to all of you: can you please discuss with us your coordination efforts with law enforcement when a violent or threatening event is identified on your platform, and what do you need from us as legislators to promote and facilitate the
partnership to keep our communities safe from another shooting like the one in october. >> thank you, senator. that attack was incredibly tragic, and our hearts are with those who suffered and still suffer from it. our relationship with law enforcement is an ongoing effort. we have a team that does training to make sure that law enforcement understands how they can best work with us; that is something that we do proactively. anytime there is a mass violence incident, we reach out to law enforcement immediately, even if we are not aware of connections between our service and the incident; we want to make sure they know where we are and how to reach us. we also have an online portal through which they can submit legal process, including emergency requests, and a team staffed 24 hours a day so we can respond quickly. we proactively refer imminent
threats to law enforcement whenever we find them. >> thank you. >> thank you, senator. i want to echo the sympathy for your constituents who were victims of that horrible tragedy. the lessons that we have learned since that attack have been used to inform us, for example, in not waiting for the motive of the shooter to be known before acting. one of the challenges we had is that we might have looked for an organizational affiliation before we would call something a terrorist attack; we don't wait anymore, we act first to stop people. as monika said, we work with law enforcement to report credible threats. one of the questions that i, along with colleagues and a number of agencies, discussed yesterday is how we can improve our collaboration: there's a huge amount of information within law enforcement and within the
dhs umbrella that is classified that might help us understand the threat and the trends and build awareness. so understanding how more of that information could be shared with industry would be to better effect. >> can you provide us in writing what you might need to help you better cooperate to protect our communities? >> that was a subject of the meeting yesterday; we had a very productive conversation. >> our heartfelt sympathy as well; tragedies like that one inform the ways that we proactively cooperate with law enforcement, refer credible threats and make emergency disclosures expeditiously. >> thank you. i see my time is up. i'm going to submit a question for the record about combating violent anti-semitism online, and i know other people are waiting. i appreciate your time and your commitment to working on and solving the issue.
>> thank you, senator rosen. your questions will be submitted for the record. >> i want to start with a simple yes or no question; answer either yes or no, or yes or no with a brief one-sentence caveat. i'd like to hear from each of the three of you: do you provide a platform that you regard and present to the public as neutral in the political sense? >> yes, senator, our rules are politically neutral and we apply them neutrally. >> so you aspire to neutrality as between left and right. >> we want to be a service for all, regardless of politics. >> mr. pickles? >> we enforce our rules impartially, without regard to ideology. >> mr. slater? >> similarly, we apply our policies without regard to political ideology, though we are not neutral
against terrorism. >> i appreciate you pointing that out; that is of course not what we are talking about, and that leads into the next question that i wanted to raise with each of you. i think the work each of you is doing in this area is important, and it's important for anyone occupying this space to protect those who access your services by removing pornography, terrorist content and the like. there's a lot of debate surrounding this. as you know, section 230 of the communications decency act has received a lot of criticism; it protects websites from being held liable as the publisher of information provided by another information content provider. significantly, section 230's good
samaritan provision gives you the promise that you will not be held liable for taking down the type of objectionable content that we're talking about, whether it's something that is constitutionally protected or not. so for each of the same witnesses again, i'd ask: each of you represents a private company, and each of you is accountable to your consumers and within your company; in some sense, you have incentives to provide an enjoyable experience on your respective platforms. so a question about section 230, particularly: do the good samaritan provisions help you in your efforts to swiftly take down things like pornography and terrorist
content off your platforms? and would it be more difficult without the legal certainty that section 230 provides? >> absolutely, senator. section 230 is critical to our efforts on safety and security. >> mr. pickles? >> i'd go further and say it has been critical to the leadership of american industry in the technology sector. >> mr. slater? >> absolutely, yes. >> another related point: imagine a world where this is suddenly taken away and those provisions no longer exist. large companies like yours would, in fact, i strongly suspect, still be able to filter out this content between the artificial intelligence capabilities at your disposal and the human resources that you have; i suspect you could and probably would still do your best to perform the same function. what about a startup? what about a company trying to
enter this market? what would happen? >> senator, thank you for that question. this reminds me of industry conversations involving smaller companies. when the global internet forum to counter terrorism was formed in june of 2017, we were having closed-door sessions with companies large and small to talk about the best ways to combat the threat of terrorism online. those smaller companies were very concerned about liability; section 230 is very important for them to be able to proactively act on that content. >> i would say it's a fundamental part of maintaining a company, and without it the ecosystem is less competitive. >> if i may add to that, the u.s. has section 230 as part of the reason why we have been a leader
in economic growth, innovation and technological development; other countries that do not have something like it suffer, and study after study has shown that. i'd be happy to discuss that more. >> if it were to be taken away -- all three of your companies, and yours in particular, mr. slater, are not exactly known for being small businesses, or businesses with a modest economic impact, but you can identify with the concern being expressed. if we were to take that away, google would be able to keep up with what it needs to do, but wouldn't it be harder for someone to start a new search company, a new tech platform of one sort or another -- somebody starting out in the same position your company was in a couple of decades ago? wouldn't that be exponentially more difficult? >> i think it would create problems even for incumbents, but certainly small and medium-size businesses would have a lot of trouble getting their arms around that significant a change to the
fundamental legal framework of the internet. >> thank you. my time has expired. >> senator baldwin. >> thank you. i want to begin by thanking our committee chairman for holding this hearing. it's a vital conversation for us to be having, and we need to be taking a hard look at how we address the rising tide of online extremism and its real-world consequences in our country. i do have some questions for you on this important topic. first, i want to echo what my colleagues have already said, which is that there is much more that the senate must do to address gun violence, whether or not it is connected to hatred on the internet. more than 200 days ago, the house of representatives passed a bipartisan universal background check bill, and this common-sense gun safety measure has an extraordinary level of
public support. it deserves a vote on the senate floor. and i feel like we cannot just have hearings; we have to act to reduce gun violence. the center on extremism has closely studied hate crimes and extremist violence in this country. is it fair to say that there has been an alarming increase in bias-motivated crimes, including extremist violence, in the last several years? >> yes, that is accurate. >> in the case of extremist killings, what role do you feel that access to firearms has played in the increase? >> thank you for the question. as i briefly alluded to earlier, just to expand on what i was mentioning: according to a recent report, among extremists
across the ideological spectrum who committed murders or homicides in the united states, 73% of those acts were committed with firearms. >> thank you. what impact do you believe this increase in hate crimes, including extremist killings, has had on the minority communities whose members have been the target of these attacks? let me just add to that question: one of the unique aspects of a hate crime is that it not only victimizes the targeted victims but strikes fear among those who share the same characteristic with the victim or victims. >> senator, thank you for making this point. in the past 24 months, we saw in calendar year 2017 a 57% increase in anti-semitic incidents across the country. the fbi and d.o.j.'s own hate crime data showed a 17% increase in hate crimes in calendar year
2017. we continue to see these statistics year after year, and it's imperative -- and part of my testimony today, and my submitted testimony, speaks to this -- that we have greater enhancement and enforcement of hate crime laws and protections for victims. >> i am an original cosponsor of the disarm hate act, legislation which would bar those convicted of misdemeanor hate crimes from obtaining firearms. do you agree that this measure could help keep guns out of the hands of individuals who might engage in extremist violence? >> yes, senator, and thank you for your leadership, and to all members who have supported this legislation. >> thank you. i appreciate the efforts that our witnesses from the social media companies have described regarding their companies' efforts to combat online extremism, including providing transparency to their users and the general public. it is critically important to
understand how you are addressing problems within your existing services and platforms, and i would like to learn more about how you are thinking about this issue as you develop and introduce new products. in other words, i think a lot of us feel that the approach of rapidly introducing a new product and then assessing the consequences later is a problem. so i would like to ask you: how do you plan to build combating extremism into the next generation of ways in which individuals engage online? why don't we start with you. >> thank you for the question, senator. safety by design is an important part of building new products at our company. one of the things that we have built in the past few years is a new product policy team, whose
responsibility is to make sure they're aware of new products and features that are being built. in explaining to these engineers who are think of all the wonderful ways that they can be used, all of the abuse scenarios that we can envision to make sure we have reporting mechanism or other safety features in place. >> i think as i said earlier, when a very adversarial phase, we know it'll change the behavior and over time we have a future, aof policy decision, the key processes in the discussion, and how can this be used against us, how will people change your behavior. i think you're absolutely right, we need to share that with more companies. and working with more companies around the world to share that with them and help them understand the challenges is also valuable. >> similarly, with product
managers and engineers from the conception of an idea all the way through development and possible release -- from the ground up, safety by design. >> thank you. >> i want to thank the witnesses. i will be taking over the chair, and i will call on myself as the next questioner. i want to ask all of you: your companies' technologies are famous for their algorithms, which seem to have the ability to pinpoint what people want. you can put an e-mail out, or even, some people think, just talk about your interest in yellow sweaters, and the next thing you know there are ads popping up on facebook about yellow sweaters. who knows how that happened, but for a lot of us it happens. it's pretty impressive, but here's my question: if your
algorithm technology is so good at pinpointing things like that -- what people are interested in -- and then circulating ads, what are the challenges with regard to directing that technology to help us and help you find what has been talked about on both sides of the aisle, which is that the people who are committing this violence are typically disaffected young males? aren't there signs, aren't there things that you can do with the technology that you do so well in other spaces, to at least provide more warning signs of this violence from these kinds of individuals, who in some ways already have a profile online? i will throw that out to any of you. are you working on it? >> thank you for the question, senator. technology plays a huge role in
what we're doing to enforce the safety policies at facebook. in the area of terrorism, extremism and violence, it is not just the matching software that we have to stop things like known terror propaganda; we are using artificial intelligence and machine learning to get better at identifying new content that we have not seen before that might be promoting violence or trying to incite violence or harmful behavior. anytime we find a credible threat of physical harm, we proactively send that to law enforcement. and it's getting better every day. >> are you using the algorithms and advanced technologies that you use in other spaces to identify those threats? >> there are certain cross-learnings across the company; different products work in different ways. >> is it a priority the way it would be for selling yellow sweaters? >> absolutely. >> and is that true of all the companies? >> absolutely, investing in technologies to find this content is
a priority. >> it is a top priority. >> senator, i would only add to this part of the conversation, as somebody who has studied the research and data around these issues for nearly two decades, that the environment we are in today has changed significantly. white supremacist terrorists in the united states have training camps in the same way that foreign terrorist groups do: their training camp, where they connect, learn and coordinate with one another, is the online space. it's imperative, on the question you're asking, that machine learning, technology and artificial intelligence continue to advance to disrupt that environment and make it an inhospitable place for individuals who want to promote violent content, so the abuse is disrupted. >> this is a bigger policy question. all of your companies have this tension: you want
more eyeballs, more clicks, more time on facebook or google or twitter, and yet there are increasing studies showing, for example, the number of young men and women, young girls, who feel a sense of loneliness from their time online; there are indications that among teenagers those feelings are increasing, particularly for young girls. one of the things i worry about -- we're all dealing with the opioid epidemic, and we're looking back saying, my god, what did we do? how did we get to this position -- the policies in the '90s and other things -- such that 72,000 americans died of overdoses last year. so we're kind of looking backwards and saying, how did this happen? in your c-suites of
policymaking, do you ever wonder: are we going to be looking back in 20 years saying, how did we addict a bunch of young americans to looking at the damn iphones eight hours a day? twenty years from now, will we be seeing the social and physical and psychological ramifications, where we all might be kicking ourselves saying, why did we allow that to happen? do you ever think about that? because i think about that, and it worries me. but you have a tension, because don't you want more face time? you want young teenagers spending seven hours a day staring at their iphones because that helps your revenues. do you worry that 15 to 20 years from now we will be in the same spot that we are with opioids, saying, what did we do to our kids and our citizens? do any of you worry about that -- about the negative implications of
what's happening in society right now? >> senator, thank you for the question. as a mother, i take these questions of wellness very seriously, and the company does as well. this is something that we look at, and we talk to groups to make sure that we are crafting products and policies that are in the best long-term interest of the people that connect through facebook. i also want to say, we have seen social media be a place of support for those who are thinking of harming themselves, struggling with eating disorders or opioid addiction, or getting exposed to hateful content. so we are also exploring and developing ways of linking people up with health resources. we have already stood that up for opioid addiction and thoughts of self-harm, and for people who are asking
or searching for hateful content, we provide them with health resources. we think this could be a positive thing for overall wellness. >> we have similar programs in place for opioids and for people who are referencing self-harm or suicide; we intervene and provide them with a source of support. that has also been rolled out around the world. we've also recognized issues in the industry, and we need to invest to make sure the people who are using our services also have the skills and the awareness to use them responsibly. finally, our ceo has committed the company to looking at the health of the conversation -- that means not just the metrics you referenced but much broader metrics to measure the health of the conversation rather than just revenue. >> thank you, mr. chairman. >> thank you, senator sullivan. senator cruz. >> thank you, mr. chairman, and i
will say thank you to my friend from alaska for sharing this deep void and longing in his heart. i want to reassure you: for christmas you'll be getting the yellow sweater. [laughter] mr. slater, i want to start with you. i want to talk about project dragonfly. in august of 2018 it was reported that google was developing a censored search engine under the alias of project dragonfly. in response to those concerns, alphabet shareholders requested that the company publish a human rights impact assessment by
october 30 of this year, examining the actual and potential impacts of censored google search in china. however, during alphabet's shareholder meeting on june 19, the proposal for the assessment was rejected, and the alphabet board of directors specifically encouraged shareholders to vote against it. alphabet commented that google has been open about its desire to increase its ability to serve users in china and other countries, that it had considered a variety of options for services in china in a way that is consistent with its mission, and that it had gradually expanded its offerings to consumers in china. so i want to start with clarity. mr. slater, has google ceased any and all development work on project dragonfly? >> senator, to my knowledge, yes. >> has google committed to forgoing future projects that may be named differently but would be focused on developing a censored search engine in china? >> senator, we have nothing to
announce at this time, and i think we will look very carefully at things like human rights. we work with the global network initiative on an ongoing basis to evaluate how our principles, practices and products comply with human rights and the law. >> so, roughly contemporaneously, google decided that it did not want to work with the u.s. department of defense. how does google justify having been willing to work with the chinese government on complex projects including artificial intelligence, and at the same time not being willing to help the department of defense, under project maven, develop ways to minimize civilian casualties through better a.i.? how do you reconcile those two approaches? >> senator, as we talked about
today, we do partner with law enforcement, and we do partner with the military in certain ways, offering some of our services. also, as a business, we draw responsible lines about where we want to be in business, including limitations on getting into the field of building weapons and so on. and we will continue to evaluate that over time. >> let me shift to a different topic. this panel has talked about combating extremism and the efforts of social media to do that. many americans, including myself, have a long-standing concern that when big tech says it is combating extremism, that is often a shield for advancing political censorship. mr. pickles, i want to talk about how recently twitter extended its pattern of censorship to the level that it took down the twitter account of the senate majority leader, mitch
mcconnell. that, i found, was a remarkable thing for twitter to do, and it did so because that account, as i understand it, had sent out a video of angry protesters outside of senator mcconnell's house, including an organizer of black lives matter in louisville who is heard in the video saying the senate majority leader "should've broken his little wrinkled-ass" neck, and somebody else who had a voodoo doll of the majority leader, and angry protesters who said "just stab the m-f'er in the heart," although they did not abbreviate m-f. the senate majority leader sent out those threats of violence against him and found, rather remarkably, his own
twitter account taken down. how does twitter explain that? >> thank you, senator, for the opportunity to discuss this. it raises something we have been asked about around the world in many jurisdictions: the safety of people who hold public office. when we saw the video posted by numerous users -- it identified someone's home and contained, as you referenced, quite severe threats -- out of an abundance of caution we did remove the video. we did not remove the account; we removed the single tweet from everybody who posted it, because the essence of the video was someone's personal home, where the senate majority leader was residing at the time, with several violent references; we thought we should remove it. we then discussed this further, and once we understood that their intent was to call attention to those very threats of violence, we did put the video back on twitter with a warning message saying this is sensitive media. but it's
a balance that we are striking; i've been in many different situations where i've been asked the exact opposite -- that similar content should be removed. this is something that we strive to get right every day, but my first thought was of the safety of leader mcconnell and his family. >> would you agree there's a difference between someone posting a video where they are threatening someone else and the target of that threat posting the video? >> i think that is fair, but in a situation with the person's home visible in the video, there is still a risk there, and harm could have occurred because the home was visible. we appreciated discussing this with the leader's campaign team and senate office. but our motivation was to prevent harm, not the issues you may be alluding to.
>> thank you. >> mr. pickles, have you rethought your policy since the incident senator cruz asked about? i will call your attention to ms. bickert's testimony on page two, which says the propaganda and symbols of these organizations and individuals are not allowed to be shared on the platform unless they are being used to condemn or inform. is that language instructive to your platform, and don't you think that it was readily evident from the beginning that senator mcconnell
and his campaign posted that video to condemn and inform? >> i think this is a relevant issue. we as a company have taken a more aggressive posture, and we did see people posting excerpts of the manifesto and content of the video to condemn it; we decided in both circumstances we would remove it. the same for other attacks more recently: on news sites where images of manifestos appear, even where they are condemning them, we have taken the decision to remove the material. it's something that is a constant tension, and it illustrates and highlights for us the complexity of getting this right. but if we are going to err on the side of caution, fewer violent threats and fewer people posting them on a platform is mostly a good thing. we have to work harder at taking into account the kind of context you outlined, but this is the
first time -- and i've been with the company five and a half years -- that i've been asked why we did not leave something up that contained a violent threat. that in itself shows the complexity of the situation. >> in terms of the context, in this instance it was the owner of the home who chose to inform the world about what was being said against him, and it was the individual himself who posted this. it seems to be a clear-cut case, and in that instance quite different from the condemnation of the larger incident of the christchurch violence. i would suggest that it should not have taken very long for
twitter to understand that. senator sullivan, you're recognized. >> i have a couple of follow-up questions. mr. slater, back to senator cruz's question: i think whether a company wants to work with the pentagon is a decision that the leadership of each individual company has to make, and that is certainly something they can defend. what troubles a number of us is that there is a declaration that you're not willing to work with the department of defense on certain issues, and yet there is a willingness to work with one of our country's potential adversaries, particularly on sensitive technological issues that are important to the competition between the two nations. do you understand why that has caused bipartisan concern here,
and how should we address that? should congress take action on those kinds of situations? i'm not saying everybody has to work for the pentagon, that's your decision, but if you don't want to work to help with the nation's defense and you're working with a country that poses a very significant long-term threat to the united states, do you understand why that causes concern? >> senator, i do appreciate the concern. we are a company and a business that wants to draw responsible lines, and we will engage with you, the committee and others to make sure we're doing that. >> do you think, in instances of that, where a company says we will not do anything in terms of national defense with the u.s. department of defense but will work with the chinese, something very clear and obvious, do you think
there's something we should do to prevent that? or penalize that? >> i think it's an important question. we as a business try to draw consistent lines, but the details certainly matter. >> mr. pickles, let me ask a final question. you said that the twitter account of maduro in venezuela has not broken any of the rules. what are the rules, and at what point would you look at somebody who is certainly not treating his citizens well, and senator scott has been a leader on this issue, but, what are those rules, and at what point would you look at what they're doing to their own citizens as a reason to not provide them the platform that you have? >> firstly, the rules apply to
everyone the same. i can make a full copy available. for example, if a twitter account was used in some of the ways that we have seen around the world to encourage violence against minorities or to organize violence, we would take action on those accounts for breaking those rules. >> will twitter allow putin to have an account, or xi jinping to have an account? >> if they are acting within our rules. some governments have sought to manipulate our platform to spread propaganda by breaking our rules. one of those is venezuela. we made a public declaration of every account that we removed from twitter for engaging in information operations covertly, which we believe that government is responsible for,
and we made the whole archive available to the public and researchers. we have taken the same steps with information operations from china, iran and russia, and we believe it's not just a single twitter account; when governments seek to manipulate our platform, we will take action to remove them and make it public so people can learn. >> so if a government takes violent action against its own citizens, is that breaking the twitter rules? >> that is behavior happening off-line; the key question is what is happening on twitter. >> thank you, mr. chairman. >> thank you, senator sullivan, and our witnesses. the hearing record will remain open for two weeks, and during this time senators are asked to submit any questions for the record. upon receipt, the witnesses are requested to submit their complete written answers to the committee as soon as possible, but no later than the close of business wednesday, october 2, 2019.
[inaudible conversations] >> coming up tonight on c-span2, a discussion on ways to improve relations between police and the community, cyber security and how companies can protect themselves by sharing information over the internet. that is followed by a hearing on holocaust insurance claims.
who spoke before the house judiciary committee. they will talk about ways to improve police relations within the community and increasing accountability among law enforcement. this is three hours and 40 minutes. >> the house committee on the judiciary will come to order. without objection, the chair is authorized to declare a recess of the committee at any time. we welcome everyone to this morning's oversight hearing on police practices. before we begin, i want to briefly recognize susan jensen, whose last day on the committee, after more than 20 years of service, is tomorrow. susan is highly respected on both sides of the aisle as one of the preeminent experts on