
Facebook CEO Zuckerberg Testifies on User Data | C-SPAN | April 10, 2018, 2:17pm-5:01pm EDT

2:17 pm
organizations, like record labels or corporations, have larger consequences that may require a large amount of monetary reparation. it is a good rule of thumb to ask permission from creators and give them credit. if licenses are required, make sure you both know what you are signing on to. those making creative work, know your rights and do what you can do to protect them. creators share their works with their viewers, and viewers share their work with the world. so help the creators you know share their work with the world by making sure you know your copyright. >> to watch all the documentaries, visit studentcam.org.
2:18 pm
>> and live now to the hart senate office building on capitol hill for testimony from facebook founder and c.e.o. mark zuckerberg. he is going to testify about users' personal data. this is a joint committee hearing. we expect it to get under way shortly. watch this online at c-span.org and listen with the free c-span radio app.
senator grassley: the committee on the judiciary and commerce, science and transportation will come to order. we welcome everyone to today's hearing on facebook's social media privacy and the use and abuse of data. although not unprecedented, this
2:32 pm
is a unique hearing. the issues we will consider range from data privacy and security to consumer protection and federal trade commission enforcement, touching on the jurisdictions of these two committees. we have 44 members between our two committees. that may not seem like a large group by facebook standards, but it is significant here for a hearing in the united states senate. we will do our best to keep things moving efficiently given our circumstances. we will begin with opening statements from the chairmen and ranking members of each committee, starting with chairman thune, and then proceed with mr. zuckerberg's opening statement. we will then move on to questioning. each member will have five
2:33 pm
minutes to question witnesses. i would like to remind the members of both committees that time limits will be and must be strictly enforced given the numbers that we have here today. if you are over your time, chairman thune and i will make sure to let you know. there will not be a second round, either. of course, there will be the usual follow-up written questions for the record. questioning will alternate between majority and minority and between committees. we will proceed in order based on respective committee seniority. we anticipate a couple of short breaks later in the afternoon. and so it's my pleasure to recognize the chairman of the commerce committee, chairman thune, for his opening statement. senator thune: thank you, chairman grassley. today's hearing is
2:34 pm
extraordinary. extraordinary to hold a joint committee hearing. even more extraordinary to have a single c.e.o. testify before nearly half of the united states senate. but then facebook is pretty extraordinary. more than two billion people use facebook every month. 1.4 billion people use it every day. more than the population of any country on earth except china, and more than 1,500 times the population of my home state of south dakota. and many american adults get some of their news from facebook. in many respects, facebook's incredible reach is why we are here today. we are here because of what you have described as a breach of trust. a quiz app used by 300,000 people led to information about
2:35 pm
87 million facebook users being obtained by cambridge analytica. there are plenty of questions about cambridge analytica, and we will hold a separate hearing on that. but as you said, this is not likely to be an isolated incident, a fact demonstrated by facebook's suspension of another firm this past weekend. you have promised that when facebook discovers other apps that had access to large amounts of user data, you will ban them. and that's appropriate. but it is unlikely to be enough for the two billion facebook users. one reason that so many people are worried about this incident is what it says about how facebook works. the idea that for every person who decided to try an app, information about nearly 300 other people was scraped from your services is, to put it mildly,
2:36 pm
disturbing. and the fact that those 87 million people may have consented to making that data available doesn't make most people feel any better. the recent revelation that malicious actors were able to use default privacy settings to match email addresses and phone numbers found on the so-called dark web to public profiles, potentially affecting all facebook users, only adds fuel to the fire. what binds these two incidents is that they don't appear to be caused by the kind of negligence that allows typical breaches to happen. instead, they appear to be the result of people exploiting the very tools you created to manipulate users' information. i know facebook intends to take steps to address these issues. nevertheless, some have warned that the actions facebook is taking to ensure that third parties don't obtain data from
2:37 pm
unsuspecting users, while necessary, will actually serve to enhance facebook's own ability to market such data exclusively. most of us understand that whether you are using facebook or google or some other online services, we are trading certain information about ourselves for free or low-cost services. but for this model to persist, both sides of the bargain need to know the stakes that are involved. right now, i'm not convinced that facebook users have the information they need to make meaningful choices. in the past, many of my colleagues on both sides of the aisle have been willing to defer to tech companies' efforts to regulate themselves. but this may be changing. just last month, in an overwhelmingly bipartisan fashion, congress voted to make it easier for prosecutors and victims to go after websites that knowingly facilitate sex trafficking. this should be a wake-up call for the tech community.
2:38 pm
we want to hear more, without delay, about what facebook and other companies plan to do to take greater responsibility for what happens on their platforms. how will you protect users' data? how will you inform users about the changes you are making? and how do you intend to proactively stop harmful conduct instead of being forced to respond to it months or years later? mr. zuckerberg, you and the company you have created represent the american dream. many are incredibly inspired by what you have done. at the same time, you have an obligation, and it is up to you, to ensure that that dream doesn't become a privacy nightmare for the scores of people who use facebook. this hearing is an opportunity to speak to those who believe in facebook and those who are deeply skeptical about it.
2:39 pm
we are listening. america is listening, and quite possibly the world is listening, too. senator grassley: ranking member feinstein. senator feinstein: thank you very much, mr. chairman. thank you for holding this hearing. mr. zuckerberg, thank you for being here. you have a real opportunity this afternoon to lead the industry and demonstrate a meaningful commitment to protecting individual privacy. we have learned over the past few months, and we have learned a great deal that's alarming. we have seen how foreign actors are abusing social media platforms, like facebook, to interfere in elections and take millions of americans' personal information without their knowledge in order to manipulate public opinion and target individual voters. specifically, on february 16,
2:40 pm
special counsel mueller issued an indictment against the russia-based internet research agency and 13 of its employees for interference operations targeting the united states. through this 37-page indictment, we learned that the i.r.a. ran a coordinated campaign through 470 facebook accounts and pages. the campaign included ads and false information to create discord and harm secretary clinton's campaign, and the content was seen by an estimated 157 million americans. a month later, on march 17, news broke that cambridge analytica exploited the personal information of approximately 50 million facebook users without
2:41 pm
their knowledge or permission. and last week, we learned that number was even higher: 87 million facebook users who had their private information taken without their consent. specifically, using a personality quiz he created, professor kogan collected the personal information of 300,000 facebook users and then collected data on millions of their friends. it appears the information collected included everything these individuals had on their facebook pages and, according to some reports, even included private direct messages between users. the professor is said to have taken data from over 70 million americans. it has also been reported that he sold this data to cambridge analytica for
2:42 pm
$800,000. cambridge analytica then took this data and created a psychological warfare tool to influence united states elections. in fact, the c.e.o. declared that cambridge analytica ran all the digital campaign, the television campaign, and its data informed all the strategy for the trump campaign. the reporting has also speculated that cambridge analytica worked with the internet research agency to help russia identify which american voters to target with its propaganda. i'm concerned that press reports indicate facebook learned about this breach in 2015, but appears not to have taken significant steps to address it until this year. this hearing is important and i
2:43 pm
appreciate the conversation we had yesterday. and i believe that facebook, through your presence here today and the words you are about to tell us, will indicate how strongly your industry will regulate and/or reform the platforms that it controls. i believe this is extraordinarily important. you lead a big company with 27,000 employees. and we very much look forward to your comments. thank you, mr. chairman. senator grassley: thank you, senator feinstein. the history and growth of facebook mirrors that of many of our technological giants. founded by mr. zuckerberg in 2004, facebook has exploded over the past 14 years. facebook currently has over two billion monthly active users across the world, over 25,000 employees, and offices in 13 u.s.
2:44 pm
cities and various other countries. like their expanding user base, the data collected on facebook users has also skyrocketed. they have moved on from schools, likes, and relationship statuses. today facebook has access to dozens of data points, from events you have attended to your location based upon your mobile device. it is no secret that facebook makes money off this data through advertising revenue, although many seem confused by, or altogether unaware of, this fact. facebook generated $40 billion in revenue in 2017, with about 98% coming from advertising across facebook and instagram. significant data collection is also occurring at google,
2:45 pm
twitter, apple and amazon. and an ever-expanding portfolio of products and services offered by these companies grants endless opportunities to collect increasing amounts of information on their customers. as we get more free or extremely low-cost services, the tradeoff for the american consumer is to provide more personal data. the potential for further growth and innovation based on the collection of data is limitless. however, the potential for abuse is also significant. while the contours of the cambridge analytica situation are still coming to light, there was clearly a breach of consumer trust and a likely improper transfer of data. the judiciary committee will hold a separate hearing
2:46 pm
exploring cambridge and other data privacy issues. more importantly, though, these events have ignited a larger discussion on consumers' expectations and the future of data privacy in our society. it has exposed that consumers may not fully understand or appreciate the extent to which their data is collected, protected, transferred, used and misused. data has been used in advertising and political campaigns for decades. the amount and type of data obtained, however, has seen a very dramatic change. campaigns including those of presidents bush, obama and trump all used these increasing amounts of data to focus on microtargeting and personalization over numerous social media platforms, and especially facebook.
2:47 pm
in fact, president obama's campaign utilized the same facebook feature as cambridge analytica to capture the information of not just millions of app users but millions of their friends. the digital director for that campaign described the data-scraping app as something that would, quote, wind up being the most groundbreaking piece of technology for this campaign, end of quote. so the effectiveness of these social media tactics, and their use across the political spectrum, cannot be ignored. our policy towards data privacy and security must keep pace with these changes. data privacy should be
2:48 pm
tethered to consumer needs and expectations. now, at a minimum, consumers must have the transparency necessary to make an informed decision about whether to share their data and how it can be used. consumers ought to have clear information, not opaque policies and click-through consent pages. the tech industry has an obligation to respond to widespread and growing concerns over data privacy and security and to restore the public's trust. the status quo no longer works. moreover, congress must determine if and how we need to strengthen privacy standards to ensure transparency and understanding for the billions of consumers who utilize these products. senator nelson. senator nelson:
2:49 pm
mr. zuckerberg, let me cut to the chase. if you and other social media companies do not get your act in order, none of us are going to have any privacy anymore. that's what we are facing. we are talking about personally identifiable information that, if not kept by the social media companies from theft, a value that we have in america being our personal privacy, we won't have it anymore. that's the advent of technology, and of course all of us are part of it. from the moment we wake up in the morning until we go to bed, we are on those handheld tablets, and companies like facebook are
2:50 pm
tracing our activities and collecting information. facebook has a responsibility to protect this personal information. we had a good discussion yesterday. we went over all of this. you told me that the company had failed to do so. it's not the first time that facebook has mishandled its users' information. the f.t.c. found that facebook's privacy policies had deceived users in the past. and in the present case, we recognize that cambridge analytica and an app developer lied to consumers and lied to you, lied to facebook. but did facebook watch over the operations? we want to know that. and why didn't facebook notify
2:51 pm
87 million users that their personally identifiable information had been taken, and that it was also being used for unauthorized political purposes? why were they not informed? so only now -- and i appreciate our conversation -- only now has facebook pledged to inform those consumers whose accounts were compromised. i think you are genuine. i got that sense in conversing with you. you want to do the right thing. you want to enact reforms. we want to know if it's going to be enough. and i hope that will be in the answers today. now, since we still don't know what cambridge analytica has
2:52 pm
done with this data, you heard chairman thune say, as we have discussed, we want to haul cambridge analytica in to answer these questions at a separate hearing. i want to thank chairman thune for working with all of us on scheduling a hearing. there's obviously a great deal of interest in this subject. i hope we can get to the bottom of this. and if facebook and other online companies will not or cannot fix the privacy invasions, then we are going to have to -- we, the congress. how can american consumers trust folks like your company to be caretakers of their most personal and identifiable
2:53 pm
information? and that's the question. thank you. senator grassley: thank you, my colleagues and senator nelson. our witness today is mark zuckerberg, founder, chairman and chief executive officer of facebook. mr. zuckerberg launched facebook on february 4, 2004, at the age of 19. at that time he was a student at harvard university. as i mentioned previously, his company now has over $40 billion in annual revenue and two billion active users. mr. zuckerberg, along with his wife, established the chan zuckerberg initiative for philanthropic causes. i turn to you. welcome to the committee. and whatever your statement is orally, if you have a longer one, it will be included in the record.
2:54 pm
so proceed, sir. . zuckerberg: members of the committee, we think a number of important issues around privacy, safety and democracy. and you are rightfully will have hard questions for me to answer. before i talk about the steps we are talking to address them, i want to talk about how we got here. facebook is an i'd is particular and optimistic company. for most of our existence, we focus on all of the good connecting people can do. people everywhere have gotten a powerful new tool for staying connected to the people they love, for making their voices heard and building communities and businesses. st recently, we seen the #metoo movement on facebook. after hurricane harvey, people came together to raise $20 million for relief and more than 70 million small businesses use
2:55 pm
facebook to create jobs and grow. but it's clear we didn't do enough to prevent these tools from being used for harm as well, and that includes fake news and foreign interference. we didn't take a broad enough view of our responsibility, and that was a big mistake. it was my mistake, and i'm sorry. i started facebook. i run it, and i'm responsible for what happens here. so now we have to go through every part of our relationship with people and make sure that we are taking a broad enough view of our responsibility. it's not enough to just connect people. we have to make sure that those connections are positive. it's not enough to give people a voice. we need to make sure that people aren't using it to harm other people or spread misinformation. and it's not enough to give people control over their information.
2:56 pm
we need to make sure the developers they share it with protect their information, too. across the board, we have a responsibility to not just build tools but to make sure they are used for good. it will take some time to work through the changes we need to make across the company, but i'm committed to getting this right. this includes the basic responsibility of protecting people's information, which we failed to do with cambridge analytica. here are a few things that we are doing to address this and to prevent it from happening again. first, we are getting to the bottom of exactly what cambridge analytica did and telling everyone affected. what we know is that cambridge analytica improperly accessed some information about millions of facebook members by buying it from an app developer. this was information that people generally share publicly on their facebook pages, like their names and their profile picture and the pages
2:57 pm
they follow. when we first contacted cambridge analytica, they told us they had deleted the data. about a month ago we heard new reports that suggested that wasn't true. and now we are working with governments in the u.s., u.k. and around the world to do a full audit and get rid of any data they may still have. second, to make sure no other app developer out there is misusing data, we are investigating every single app that had access to a large amount of information in the past. if we find that someone improperly used data, we are going to ban them from facebook. to prevent this from happening again going forward, we are also making sure that developers can't access as much information now. the good news is that we already made big changes to our platform in 2014 that would have prevented this situation with cambridge analytica from occurring
2:58 pm
again today. there is more to do, and you can find more details on the steps we are taking in my written statement. my top priority has always been our social mission of connecting people, building community and bringing the world closer together. advertisers and developers will never take priority over that as long as i'm running facebook. i started facebook when i was in college. we have come a long way since then. we now serve more than two billion people around the world, and every day people use our services to stay connected to the people that matter to them most. i believe deeply in what we are doing, and i know that when we address these challenges, we will look back and view giving people a voice as a positive force in the world. these aren't just issues for facebook and our community; they are issues and challenges for all of us as americans.
2:59 pm
thank you for having me here today, and i'm ready to take your questions. senator grassley: i remind members that we are operating under the five-minute rule, and that applies to those of us who are chairing the committee as well. i will start with you. facebook handles extensive amounts of personal data for billions of users. a significant amount of that data is shared with third-party developers who utilize your platform. as of this year, you did not actively monitor whether that data was transferred by such developers to other parties. moreover, your policies only prohibit transfers by developers to parties seeking to profit from such data. number one, besides the
3:00 pm
professor's transfer, do you know of any instances where user data was improperly transferred to a third party in breach of facebook's terms? if so, how many times has that happened, and was facebook only made aware of that transfer by some third party? mr. zuckerberg: mr. chairman, thank you. as i mentioned, we're now conducting a full investigation into every single app that had access to a large amount of information, before we locked down the platform to prevent developers from accessing this information around 2014. we believe that we're going to be investigating many apps, tens of thousands of apps, and if we find any suspicious activity, we're going to conduct a full audit of those apps to understand how they're using their data and if they're doing anything improper. if we find that they're doing anything improper, we'll ban them from facebook and we will tell everyone affected. as for past activity, i don't
3:01 pm
have all the examples of apps that we've banned here. but if you'd like, i can have my team follow up with you after this. mr. grassley: have you ever required an audit to ensure the deletion of improperly transferred data, and if so, how many times? mr. zuckerberg: mr. chairman, yes, we have. i don't have the exact figure on how many times we have. but overall the way we've enforced our platform policies in the past is we have looked at patterns of how apps have used our a.p.i.'s and accessed information, as well as looked into reports that people have made to us about apps that might be doing sketchy things. going forward, we're going to take a more proactive position on this and do much more regular spot checks and other reviews of apps, as well as increasing the amount of audits that we do. and, again, i can make sure that our team follows up with you on anything about the specific past stats that would be interesting. mr. grassley: i was going to assume that sitting here today
3:02 pm
you have no idea, and if i'm wrong on that, you're telling me, i think, that you're able to supply those figures to us, at least as of this point. mr. zuckerberg: mr. chairman, i will have my team follow up with you on what information we have. mr. grassley: but right now you have no certainty of whether or not -- how much of that's going on, right? ok. facebook collects massive amounts of data from consumers, including content, networks, contact lists, device information, location and information from third parties. yet your data policy is only a few pages long and provides consumers with only a few examples of what is collected and how it might be used. the examples given emphasize benign uses such as connecting with friends, but your policy does not give any indication of more controversial uses
3:03 pm
of such data. my question: why doesn't facebook disclose to its users all the ways the data might be used by facebook and other third parties, and what is facebook's responsibility to inform users about that information? mr. zuckerberg: mr. chairman, i believe it's important to tell people exactly how the information that they share on facebook is going to be used. that's why every single time you go to share something on facebook, whether it's a photo on facebook or a message in messenger, every single time there's a control right there about who you're going to be sharing it with, whether it's your friends, or public, or a specific group, and you can change that and control that in line. to your broader point about the privacy policy, this gets into an issue that i think we and others in the tech industry have found challenging, which is that long privacy policies are very confusing. and if you make it long and
3:04 pm
spell out all the details, then you're probably going to reduce the percent of people who read it and make it accessible to them. so one of the things that we've struggled with over time is to make something that is as simple as possible so people can understand it, as well as giving them controls in line in the product, in the context of when they're actually trying to use them, taking into account that we don't expect that most people will want to go through and read a full legal document. mr. grassley: senator nelson. mr. nelson: thank you, mr. chairman. yesterday when we talked, i gave the relatively harmless example that i'm communicating with my friends on facebook and indicate that i love a certain kind of chocolate. and all of a sudden i start receiving advertisements for chocolate. what if i don't want to receive those commercial
3:05 pm
advertisements? so, your chief operating officer, ms. sandberg, suggested on the nbc today show that facebook users who do not want their personal information used for advertising might have to pay for that protection. pay for it. are you actually considering having facebook users pay for you not to use that information? mr. zuckerberg: senator, people have a control over how their information is used in ads in the product today. so if you want to have an experience where your ads aren't targeted using all the information that we have available, you can turn off third-party information. what we've found is that even though some people don't like ads, people really don't like
3:06 pm
ads that aren't relevant. and while there is some discomfort, for sure, with using information in making ads more relevant, the overwhelming feedback we get from our community is that people would rather have us show relevant content there than not. so we offer this control that you're referencing. some people use it. it's not the majority of people on facebook. and i think that that's a good level of control to offer. i think what sheryl was saying was that in order to not run ads at all, we would still need some sort of business model. mr. nelson: and that is your business model. so i take it that -- and i use the harmless example of chocolate. but if it got into more personal things, communicating with friends, and i want to cut it off, i'm going to have to pay you in order for you not to send me,
3:07 pm
using my personal information, something that i don't want. that, in essence, is what i understood ms. sandberg to say. is that correct? mr. zuckerberg: yes, senator. although, to be clear, we don't offer an option today for people to pay to not show ads. we think offering an ad-supported service is the most aligned with our mission of trying to help connect everyone in the world, because we want to offer a free service that everyone can afford. that's the only way that we can reach billions of people. mr. nelson: so, therefore, you consider my personally identifiable data the company's data, not my data. is that it? mr. zuckerberg: no, senator. actually, the first line of our terms of service says that you control and own the information and content that you put on facebook. mr. nelson: well, the recent scandal is obviously frustrating, not only because it affected 87 million, but because it seems
3:08 pm
to be part of a pattern of lax data practices by the company going back years. back in 2011, there was a settlement with the f.t.c., and now we discover yet another incident where the data failed to be protected. when you discovered that cambridge analytica had fraudulently obtained all this information, why didn't you inform those 87 million? mr. zuckerberg: when we learned in 2015 that cambridge analytica had bought data, we did take action. we took down the app, and we demanded that both the app developer and cambridge analytica delete and stop using any data they had. they told us that they did this.
3:09 pm
that is not a mistake we will make again. mr. nelson: you did that and apologized for it, but you didn't notify them. do you think that you have an ethical obligation to notify those 87 million facebook users? mr. zuckerberg: senator, when we heard back from cambridge analytica that they told us they were not using the data and had deleted it, we considered it a closed case. in retrospect, that was clearly a mistake. we shouldn't have taken their word for it. and we've updated our policies and how we're going to operate the company to make sure we don't make that mistake again. mr. nelson: did anybody notify the f.t.c.? mr. zuckerberg: no, senator, for the same reason. we considered it a closed case. mr. thune: would you do that differently today, presumably?
3:10 pm
mr. zuckerberg: yes. mr. thune: this may be your first appearance before congress, but it's not the first time that facebook has faced tough questions about its privacy policies. wired magazine recently noted you have a 14-year history of apologizing for ill-advised decisions regarding user privacy, not unlike the one that you made just now in your opening statement. after more than a decade of promises to do better, how is today's apology different, and why should we trust facebook to make the necessary changes to ensure user privacy and give people a clearer picture of your privacy policies? mr. zuckerberg: thank you, mr. chairman. so we have made a lot of mistakes in running the company. i think it's pretty much impossible, i believe, to start a company in your dorm room and then grow it to be at the scale that we're at now without making some mistakes. and because our service is about helping people connect
3:11 pm
and share information, those mistakes have been different. we try not to make the same mistake multiple times, but in general a lot of the mistakes are around how people connect to each other, just because of the nature of the service. overall, i would say that we're going through a broader philosophical shift in how we approach our responsibility as a company. for the first 10 or 12 years of the company, i viewed our responsibility as primarily building tools, that if we could put those tools in people's hands, then that would empower people to do good things. what we've learned now, across a number of issues, not just data privacy but also fake news and foreign interference in elections, is that we need to take a more proactive role and a broader view of our responsibility. it's not enough to just build tools. we need to make sure they're used for good, and that means that we need to now take a more active view in policing the ecosystem and in watching and kind of looking out and making sure that all of the members in our community are using these tools in a way that's going to be good and healthy.
3:12 pm
at the end of the day, this is going to be something where people will measure us by our results on this. it's not that i expect anything i say here today to necessarily change people's view, but i'm committed to getting this right, and i believe that over the coming years, once we fully work all these solutions through, people will see real differences. mr. thune: i'm glad that you all have gotten that message. as we discussed in my office yesterday, the line between legitimate political discourse and hate speech can sometimes be hard to identify, especially when you're relying on artificial intelligence and other technology for discovery. can you discuss what steps facebook currently takes when making these evaluations, the challenges that you face and any examples of where you may draw the line between what is and what is not hate speech? mr. zuckerberg: yes, mr. chairman. i'll speak to hate speech and then i'll talk about enforcing
3:13 pm
our content policies more broadly. so, actually, maybe, if you're ok with it, i'll go in the other order. from the beginning of the company in 2004, when i started it in my dorm room, it was me and my roommate. we didn't have a.i. technology that could look at the content people were sharing, so we basically had to enforce our content policies reactively. people could share what they wanted, and then if someone in the community found it to be offensive or against our policies, they'd flag it for us and we'd look at it reactively. now, increasingly, we're developing a.i. tools that can identify certain classes of bad activity proactively and flag it for our team. by the end of this year we're going to have more than 20,000 people working on security and content review, working across all these things. so when content gets flagged to us, we have those people look at it, and if it violates our policies, we take it down.
3:14 pm
some problems lend themselves more easily to a.i. solutions than others. hate speech is one of the hardest, because determining if something is hate speech is very linguistically nuanced. you need to understand what is a slur and whether something is hateful, not just in english -- the majority of people on facebook use it in languages that are different across the world. contrast that, for example, with an area like finding terrorist propaganda, which we've actually been very successful at deploying a.i. tools on already. today, as we sit here, 99% of the isis and al qaeda content that we take down on facebook, our a.i. systems flag before any human sees it. that's a success in terms of rolling out a.i. tools that can proactively police and enforce safety across the community. on hate speech, i am optimistic that over a five- to 10-year period we will have a.i.
3:15 pm
tools that can get into the nuances of different types of content to be more accurate in flagging things for our systems. but today we're just not there on that. so a lot of this is still reactive. people flag it to us. we have people look at it. we have policies to try to make it as not subjective as possible, but until we get it more automated, there's a higher error rate than i'm happy with. ms. feinstein: thank you, mr. chairman. mr. zuckerberg, what is facebook doing to prevent foreign actors from interfering in u.s. elections? mr. zuckerberg: thank you, senator. this is one of my top priorities in 2018 -- to get this right. one of my greatest regrets in running the company is that we were slow in identifying the russian information operations in 2016. we expected them to do a number of more traditional cyberattacks, which we did identify and notify the campaigns that they were trying
3:16 pm
to hack into them. but we were slow in identifying the type of new information operations. ms. feinstein: when did you identify new operations? mr. zuckerberg: it was right around the time of the 2016 election itself. so since then -- 2018 is an incredibly important year for elections. not just with the u.s. midterms, but around the world there are important elections in india, in brazil and mexico and pakistan and in hungary that we want to make sure that we do everything we can to protect the integrity of those elections. now, i have more confidence that we're going to get this right because since the 2016 election, there have been several important elections around the world where we've had a better record. there's the french presidential election. there's the german election. there was the u.s. senate alabama special election last year. ms. feinstein: explain what is better about the record. mr. zuckerberg: we've deployed new a.i. tools that do a better job of identifying fake accounts, that may be trying to interfere in elections or spread misinformation. and between those three
3:17 pm
elections, we were able to proactively remove tens of thousands of accounts before they could contribute significant harm. the nature of these attacks, though, is that there are people in russia whose job it is to try to exploit our systems and other internet systems and other systems as well. so this is an arms race. they're going to keep on getting better at this, and we need to invest in keeping getting better at this, too, which is why one of the things i mentioned before is that we're going to have more than 20,000 people by the end of the year working on security and content review across the company. ms. feinstein: speak for a moment about automated bots that spread disinformation. what are you doing to punish those who exploit your platform in that regard? mr. zuckerberg: well, you're not allowed to have a fake account on facebook. your content has to be authentic. so we build technical tools to try to identify when people are creating fake accounts, especially large networks of
3:18 pm
fake accounts like the russians have, in order to remove all of that content. after the 2016 election, our top priority was protecting the integrity of other elections around the world. but at the same time we had a parallel effort to trace back to russia the i.r.a. activity -- the internet research agency, the part of the russian government that did this activity in 2016. and just last week we were able to determine that a number of russian media organizations that were sanctioned by the russian regulator were operated and controlled by this internet research agency. so we took the step last week -- it was a pretty big step for us -- of taking down sanctioned news organizations in russia as part of an operation to remove 270 fake accounts and pages, part of their broader network in russia, that was actually not targeting international interference as much as -- i'm sorry, let me correct that. it was primarily targeting --
3:19 pm
spreading misinformation in russia itself, as well as certain russian-speaking neighboring countries. ms. feinstein: how many accounts of this type have you taken down? mr. zuckerberg: in the i.r.a. specifically, the ones that we've pegged back to the i.r.a., we can identify the 470 in the american elections and the 270 that we specifically went after in russia last week. there were many others that our systems catch which are more difficult to attribute specifically to russian intelligence, but the number would be in the tens of thousands of fake accounts that we remove, and i'm happy to have my team follow up with you on more information if that would be helpful. ms. feinstein: would you please? i think this is very important. if you knew in 2015 that cambridge analytica was using professor kogan's information, why didn't facebook ban cambridge in 2015? why did you wait? mr. zuckerberg: that's a great
3:20 pm
question. cambridge analytica wasn't using our services in 2015, as far as we can tell. so this is clearly one of the questions that i asked our team as soon as i learned about this -- why did we wait until we found out about the reports last month to ban them? it's because, as of the time that we learned about their activity in 2015, they weren't an advertiser, they weren't running pages. so we had nothing to ban. ms. feinstein: thank you. mr. grassley: now senator hatch. mr. hatch: well, in my opinion, this is the most intense public scrutiny i've seen for a tech-related hearing since the microsoft hearing that i chaired back in the late 1990's. the recent stories about cambridge analytica and data mining on social media have raised serious concerns about consumer privacy. and naturally i know you understand that. at the same time, these stories touch on the very foundation of the internet economy and the
3:21 pm
way the websites that drive our internet economy make money. some have professed themselves shocked, shocked, that companies like facebook and google share user data with advertisers. did any of these individuals ever stop to ask themselves why facebook and google don't charge for access? nothing in life is free. everything involves tradeoffs. if you want something without having to pay money for it, you're going to have to pay for it in some other way, it seems to me. that's what we're seeing here. these great websites that don't charge for access, they extract value in some other way. and there's nothing wrong with that. as long as they're up front about what they're doing. to my mind, the issue here is transparency. it's consumer choice. do users understand what they're agreeing to when they access a website or agree to terms of service? are websites up-front about how
3:22 pm
they extract value from users, or do they hide the ball? do consumers have the information they need to make an informed choice regarding whether or not to visit a particular website? to my mind, these are the questions that we should ask, or be focusing on. now, mr. zuckerberg, i remember well your first visit to capitol hill back in 2010. you spoke to the senate republican high-tech task force, which i chair. you said back then that facebook would always be free. is that still your objective? mr. zuckerberg: senator, yes. there will always be a version of facebook that is free. it is our mission to try to help connect everyone around the world and to bring the world closer together. in order to do that, we believe that we need to offer a service that everyone can afford, and we're committed to doing that. mr. hatch: if so, how do you sustain a business model in which users don't pay for your service? mr. zuckerberg: we run ads. mr. hatch: i see. that's great. whenever a controversy like
3:23 pm
this arises, there's always the danger that congress' response will be to step in and overregulate. that's been the experience that i've had in my 42 years here. in your view, what sorts of legislative changes would help to solve the problems the cambridge analytica story has revealed, and what sorts of legislative changes would not help to solve this issue? mr. zuckerberg: senator, i think there are a few categories of legislation that make sense to consider. around privacy specifically, there are a few principles that i think would be useful to discuss and potentially codify into law. one is around having a simple and practical set of ways that you explain what you're doing with data. we talked a little bit earlier around the complexity of laying out a long privacy policy. it's hard to say that people fully understand something when it's only written out in a long
3:24 pm
legal document. this stuff needs to be implemented in a way where people can actually understand it, where consumers can understand it, but that can also capture all the nuances of how these services work in a way that's not overly restrictive on providing the services. that's one. the second is around giving people complete control. every piece of content that you share on facebook, you own, and you have complete control over who sees it and how you share it, and you can remove it at any time. that's why, about 100 billion times a day, people come to one of our services and either post a photo or send a message to someone, because they know that they have that control and that who they say it's going to go to is going to be who sees the content. i think that control is something that's important, and it should apply to every service.
3:25 pm
the third point is just around enabling innovation, because some of these use cases are very sensitive, like face recognition, for example. and i think there's a balance that's extremely important to strike here, where you obtain special consent for sensitive features like face recognition, but we still need to make it so american companies can innovate in those areas, or else we're going to fall behind chinese competitors and others around the world who have different regimes for different new features like that. mr. grassley: senator cantwell. ms. cantwell: thank you, mr. chairman. welcome, mr. zuckerberg. do you know who palantir is? mr. zuckerberg: i do. ms. cantwell: some people have referred to them as a stanford analytica. do you agree?
3:26 pm
mr. zuckerberg: i have not heard that. ms. cantwell: do you think they've taught cambridge analytica how to do these tactics? mr. zuckerberg: i don't know. ms. cantwell: do you think that they've ever scraped data from facebook? mr. zuckerberg: i'm not aware of that. ms. cantwell: ok. do you think that during the 2016 campaign, as cambridge analytica was providing support to the trump campaign under project alamo, were there any facebook people involved in that sharing of technique and information? mr. zuckerberg: we provided support to the trump campaign similar to what we provide to any advertiser or campaign who asks for it. ms. cantwell: so that was a yes. is that a yes? mr. zuckerberg: can you repeat the specific question? i just want to make sure i get specifically what you're asking. ms. cantwell: during the 2016 campaign, cambridge analytica
3:27 pm
worked with the trump campaign to refine tactics, and were facebook employees involved in that? mr. zuckerberg: i don't know that our employees were involved with cambridge analytica, although i know that we did help out the trump campaign overall in sales support in the same way we do with other campaigns. ms. cantwell: so they may have been involved and all working together during that time period? maybe that's something your investigation will find out? mr. zuckerberg: i can certainly have my team get back to you on any specifics there that i don't know sitting here today. ms. cantwell: have you heard of total information awareness? do you know what i'm talking about? mr. zuckerberg: no. ms. cantwell: ok. total information awareness was 2003, john ashcroft and others trying to do similar things to what i think is behind all of this: geopolitical forces trying to get data and information to influence a process. so, when i look at palantir and what they're doing, and i
3:28 pm
look at whatsapp, which is another acquisition, and i look at where you are from the 2011 consent decree and where you are today, i'm thinking, is this guy outfoxing the foxes, or is he going along with what is a major trend in an information age, to try to harvest information for political forces? and so my question to you is, do you see that those applications, that those companies, palantir and even whatsapp, are going to fall into the same situation that you've just fallen into over the last several years? mr. zuckerberg: senator, i'm not sure specifically. overall, i do think that these issues around information access are challenging. to the specifics about those apps, i'm not really that familiar with what palantir
3:29 pm
does. whatsapp collects very little information, and i think it is less likely to have the kind of issues because of the way that the service is architected. but, certainly, i think these are broad issues across the tech industry. ms. cantwell: i guess, given the track record of where facebook is and why you're here today, i guess people would say that they didn't act boldly enough. the fact that people like john bolton basically were investors -- a "new york times" article earlier, i guess it was actually last month, said that the bolton pac was obsessed with how america was becoming limp-wristed and spineless on national security issues -- so the fact that there are a lot of people who are interested in this larger effort. and what i think my constituents want to know is: was this discussed at your board meetings, and what are the
3:30 pm
applications and interests that are being discussed without putting real teeth into this? we don't want to come back to this situation again. i believe you have all the talent. my question is whether you have all the will to help us solve this problem. mr. zuckerberg: yes, senator. so data privacy and foreign interference in elections are certainly topics that we've discussed at the board meeting. these are some of the biggest issues that the company has faced, and we feel a huge responsibility to get these right. ms. cantwell: do you believe european regulations should be applied here in the u.s.? mr. zuckerberg: i think everyone in the world deserves good privacy protection. and regardless of whether we implement the exact same regulation, i would guess that it would be somewhat different, because we have somewhat different sensibilities in the u.s., as do other countries. we're committed to rolling out the controls and the
3:31 pm
affirmative consent and the special controls around sensitive types of technology, like face recognition, that are required in gdpr. we're doing that around the world. so i think it's certainly worth discussing whether we should have something similar in the u.s. but what i would like to say today is that we're going to go forward and implement that, regardless of what the regulatory outcome is. mr. grassley: senator thune will chair next. senator wicker. mr. wicker: thank you, mr. chairman. mr. zuckerberg, thank you for being with us. my question is going to be a follow-up on what senator hatch was talking about. let me agree with basically his advice that we don't want to overregulate to the point where we're stifling innovation and investment. i understand with regard to
3:32 pm
suggested rules or suggested legislation there are at least two schools of thought out there. one would be the i.s.p.'s, the internet service providers, who are advocating for privacy protections for consumers that apply to all online entities equally across the entire internet ecosystem. now, facebook is an edge provider on the other hand. it's my understanding that many edge providers, such as facebook, may not support that effort because edge providers have different business models than the i.s.p.'s and should not be considered like services. so, do you think we need consistent privacy protections for consumers across the entire internet ecosystem that are based on the type of consumer information being collected, used or shared, regardless of the entity doing the collecting or using or sharing?
3:33 pm
mr. zuckerberg: this is an important question. i would differentiate between i.s.p.'s, which i consider to be the pipes of the internet, and the platforms like facebook or google or twitter, youtube, that are the apps or platforms on top of that. i think in general the expectations that people have of the pipes are somewhat different from the platforms. so there might be areas where there needs to be more regulation in one and less in the other, but there are going to be other places where there needs to be more regulation of the other type. specifically, though, on the pipes, one of the important issues that i think we face and have debated is -- mr. wicker: when you say pipes, you mean -- mr. zuckerberg: the i.s.p.'s. i know net neutrality has been a hotly debated topic, and one of the reasons why i have been out there saying that i think that should be the case is because i look at my own story of when
3:34 pm
i was getting started building facebook at harvard. i only had one option for an i.s.p. to use. and if i had to pay extra in order to make it so my app could be seen or used by other people, then we probably wouldn't be here today. mr. wicker: but we're talking about privacy concerns. let me just say, we'll have to follow up on this. but i think you and i agree, this is going to be one of the major items of debate if we have to go forward and do this from a governmental standpoint. let me just move on to another couple of items. is it true that, as was recently publicized, facebook collects the call and text histories of its users that use android phones? mr. zuckerberg: senator, we have an app called messenger for sending messages to your facebook friends. and that app offers people an option to sync their text messages into the messaging app
3:35 pm
and to make it so that -- so basically you can have one app where it has both your texts and your facebook messages in one place. we also allow people the option of -- mr. wicker: you can opt in or out of that? mr. zuckerberg: it is opt-in. you have to affirmatively say that you want to sync that information before we get access to it. mr. wicker: unless you opt in, you don't collect that call and text history? mr. zuckerberg: that is correct. mr. wicker: is that true for -- is this practice done at all with minors, or do you make an exception there for persons age 13 to 17? mr. zuckerberg: i do not know. we can follow up on that. mr. wicker: let's do that. one other thing. there have been reports that facebook can track a user's internet browsing activity even after that user has logged off of the facebook platform. can you confirm whether or not this is true?
3:36 pm
mr. zuckerberg: senator, i want to make sure i get this accurate, so it would probably be better to have my team follow up. mr. wicker: you don't know? mr. zuckerberg: i know that people use cookies on the internet, and that you can probably correlate activity between sessions. we do that for a number of reasons, including security and including measuring ads to make sure that the ad experiences are the most effective, which of course people can opt out of. but i want to make sure that i'm precise in my answer, so let me follow up. mr. wicker: when you get back to me, sir, would you also let us know how facebook discloses to its users that it's engaging in the type of tracking that gives us that result? mr. zuckerberg: yes. mr. wicker: and thank you very much. mr. thune: senator leahy is up next. mr. leahy: thank you. mr. zuckerberg, i assume facebook has been served with subpoenas from the special
3:37 pm
counsel's office. is that correct? mr. zuckerberg: yes. mr. leahy: have you or anyone from facebook been interviewed by the special counsel's office? mr. zuckerberg: yes. mr. leahy: you have been interviewed? mr. zuckerberg: i have not. mr. leahy: others have? mr. zuckerberg: i believe so. i want to be careful here, because our work with the special counsel is confidential, and i want to make sure that in an open session i'm not revealing something that's confidential. mr. leahy: i understand. that's why i made clear that you have been contacted and have had subpoenas. mr. zuckerberg: actually, let me clarify that. i actually am not aware of a subpoena. i believe that there may be, but i know we're working with them. mr. leahy: thank you. six months ago, your general counsel promised us that you were taking steps to prevent facebook from serving as what i would call an unwitting
3:38 pm
co-conspirator in the russian interference. but these unverified pages are on facebook today. they look almost like what russian agents used to spread propaganda during the 2016 election. are you able to confirm whether they're russian-created groups? yes or no? mr. zuckerberg: are you asking about those specifically? mr. leahy: yes. mr. zuckerberg: last week we actually announced a major change to our ads and pages policies: that we will be verifying the identity of every single advertiser. mr. leahy: these specific ones, do you know whether they are? mr. zuckerberg: i'm not familiar with those pieces of content specifically. mr. leahy: but if you decided this policy a week ago, you'd be able to verify them? mr. zuckerberg: we are working on that now. what we're doing is we're going to verify the identity of any advertiser who's running a
3:39 pm
political or issue-related ad. this is basically what the honest ads act is proposing, and we're following that. and we're also going to do that for pages. mr. leahy: you can't answer on these? mr. zuckerberg: i'm not familiar with those specific -- mr. leahy: will you find out the answer and get back to me? mr. zuckerberg: i'll have my team get back to you. i do think it's worth adding that we're going to do the same verification of the identity and location of admins who are running large pages. that will make it significantly harder for russian interference efforts or other inauthentic efforts to try to spread misinformation through the network. mr. leahy: it's been going on for some time, some might say it's about time. six months ago i asked your general counsel about facebook's role as a breeding ground for hate speech against rohingya refugees. recently u.n.
3:40 pm
investigators blamed facebook for inciting possible genocide in myanmar and there has been genocide there. you say you used a.i. to find this. this is the type of content i'm referring to. it calls for the death of a muslim journalist. that threat went straight through your detection systems, it spread very quickly. and then it took attempt after attempt after attempt and involvement of civil society groups to get you to remove it. why couldn't it be removed within 24 hours? mr. zuckerberg: senator, what's happening in myanmar is a terrible tragedy and we need to do more. mr. leahy: we all agree with that. mr. zuckerberg: ok. mr. leahy: but u.n. investigators have blamed you, blamed facebook, for playing a role in the genocide. we all agree it's terrible.
3:41 pm
how can you dedicate and will you dedicate resources to make sure such hate speech is taken down within 24 hours? mr. zuckerberg: yes. we're working on this. and there are three specific things that we're doing. one is we're hiring dozens more burmese-language content reviewers. because hate speech is very language-specific. it's hard to do it without people who speak the local language and we need to ramp up our effort there dramatically. second is we're working with civil society in myanmar to identify specific hate figures so we can take down their accounts rather than specific pieces of content. and third, we're standing up a product team to do specific product changes in myanmar and other countries that may have similar issues in the future, to prevent this from happening. mr. leahy: senator cruz and i sent a letter to apple asking what they're going to do about chinese censorship.
3:42 pm
my question, i'll place it for the record. i want to know what you'll do about chinese censorship when they come to you. mr. thune: senator graham's up next. mr. graham: thank you. are you familiar with andrew bosworth? mr. zuckerberg: yes, i am. mr. graham: he said, so we connect more people, maybe someone dies in a terrorist attack, coordinated on our tools. the ugly truth is we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good. do you agree with that? mr. zuckerberg: no, i do not. as context, bos wrote that, that's what we call him internally, he wrote that as an internal note. we have a lot of discussion internally. i disagreed with it at the time he wrote it. if you looked at the comments
3:43 pm
on the internal discussion, the majority of people internally did too. mr. graham: you did a poor job as c.e.o. communicating your displeasure with such thoughts, because if he had understood where you were at, he would never have said it to begin with. mr. zuckerberg: well, senator, we try to run our company in a way where people can express different opinions internally. mr. graham: this is an opinion that really disturbs me. and if somebody worked for me that said this, i'd fire them. who's your biggest competitor? mr. zuckerberg: we have a lot of competitors. mr. graham: who's your biggest? mr. zuckerberg: i think the categories -- did you want just one? i'm not sure i can give just one. can i give a bunch? there are three categories that i would focus on. one are the other tech platforms, so google, apple, amazon, microsoft. we overlap with them in different ways. mr. graham: do they provide the same service you provide? mr. zuckerberg: in different ways, different parts of it, yes. mr. graham: if i buy a ford and it doesn't work well and i don't like it, i can buy a
3:44 pm
chevy. if i'm upset with facebook, what's the equivalent product that i can go sign up for? mr. zuckerberg: there's the second category that i was going to talk about. mr. graham: i'm not talking about categories. i'm talking about real competition you face. car companies face a lot of competition if they make a defective car. it gets out in the world, people stop buying that car, they buy another one. is there an alternative to facebook in the private sector? mr. zuckerberg: yes. the average american uses eight different apps to communicate with their friends and stay in touch with people. ranging from texting apps to email. mr. graham: do they provide the same service you provide? mr. zuckerberg: we provide a number of services. mr. graham: is twitter the same as what you do? mr. zuckerberg: it overlaps. mr. graham: you don't feel like you have a monopoly? mr. zuckerberg: it certainly doesn't feel like that to me. [laughter] mr. graham: it doesn't? so instagram, you bought instagram. why did you buy instagram? mr. zuckerberg: because they were very talented app
3:45 pm
developers who were making good use of our platform and understood our values. mr. graham: a good business decision. my point is that one way to regulate a company is through competition, another is through government regulation. here's the question that all of us have got to answer: what do we tell our constituents, given what's happened here, why we should let you self-regulate? what would you tell people in south carolina, that given all the things we just discovered here, it's a good idea for us to rely upon you to regulate your own business practices? mr. zuckerberg: my position is not that there should be no regulation. i think the internet is increasing -- mr. graham: you embrace regulation? mr. zuckerberg: i think the real question as the internet becomes more important in people's lives is what is the right regulation, not whether there should be -- mr. graham: you as a company welcome regulation? mr. zuckerberg: if it's the right regulation, yes. mr. graham: do you think the europeans have it right? mr. zuckerberg: i think they get things right.
3:46 pm
[laughter] mr. graham: that's true. so would you work with us in terms of what regulations you think are necessary in your industry? mr. zuckerberg: absolutely. mr. graham: would you submit to us proposed regulations? mr. zuckerberg: yes. i'll have my team follow up with you so that way we can have this discussion across different categories where i think this discussion needs to happen. mr. graham: i look forward to it. when you sign up for facebook, you sign up for terms of service. are you familiar with that? mr. zuckerberg: yes. mr. graham: ok. it says the terms govern your use of facebook and the products, features, apps, services, technology, software we offer, the facebook products or products, except where we expressly state that separate terms and not these apply. i'm a lawyer, i have no idea what that means. but when you look at terms of service, this is what you get. do you think the average consumer understands what they're signing up for? mr. zuckerberg: i don't think that the average person likely reads that whole document. but i think that there are different ways that we can communicate that and have a
3:47 pm
responsibility to do so. mr. graham: do you agree with me that you better come up with different ways? because this ain't working? mr. zuckerberg: well, i think in certain areas that is true. and i think in other areas, like the core part of what we do, right, if you think about just the most basic level, people come to facebook, instagram, whatsapp, messenger, about 100 billion times a day to share a piece of content or message with a specific set of people. and i think that that basic functionality, people understand because we have the controls in line every time, and given the volume of the activity and the value that people tell us that they're getting from that, i think that that control in line does seem to be working fairly well. now, we can always do better. and there are other services -- services are complex and there's more to it than just, you know, you go and post a photo. so i agree that in many places we can do better. but i think for the core of the service, it actually is quite
3:48 pm
clear. mr. graham: thank you. mr. thune: senator klobuchar. ms. klobuchar: thank you. i think we all agree what happened here is bad. you acknowledged it was a breach of trust, and the way i explain it to my constituents is, if someone breaks into my apartment with a crowbar and they take my stuff, it's like if the manager gave them the keys or if they didn't have any locks on the doors. it's still a breach. it's still a break-in. i believe we need to have laws and rules that are as sophisticated as the brilliant products that you've developed here. we just haven't done that yet. one of the areas that i focused on is the election. i appreciate the support that you and facebook and now twitter actually have given to the honest ads act, a bill that you mentioned that i'm leading with senator mccain and senator warner. and i just want to be clear, as we work to pass this law so that we have the same rules in place to disclose political ads and issue ads, as we do for tv and radio, as well as disclaimers, that you're going
3:49 pm
to take early action as soon as june, i heard, before this election so that people can view these ads, including issue ads, is that correct? mr. zuckerberg: that is correct. i want to take a moment before i go into this in more detail to thank you for your leadership on this. this i think is an important area for the whole industry to move on. the two specific things that we're doing, one is around transparency. so, now you're going to be able to go and click on any advertiser or any page on facebook and see all of the ads that they're running. so that actually brings advertising online on facebook to an even higher standard than what you'd have on tv or print media, because there's nowhere where you can see all of the tv ads that someone is running. for example, whereas you will be able to see, now on facebook, whether this campaign or third party is saying different messages to different types of people. i think that's a really important element of transparency. the other important piece is
3:50 pm
around verifying every single advertiser who's going to be running political or issue ads. ms. klobuchar: i appreciate that. senator warner and i have also called on google and the other platforms to do the same. memo to the rest of you, we have to get this done or we're going to have a patchwork of ads. i hope that you'll be working with us to pass this bill, is that right? mr. zuckerberg: we will. ms. klobuchar: thank you. now on the subject of cambridge analytica, were these people, the 87 million people, users, concentrated in certain states? are you able to figure out where they're from? mr. zuckerberg: i do not have that information with me. but we can follow up with your office. ms. klobuchar: ok. because we know that election was close and it was only thousands of votes in certain states. you've also estimated that roughly 126 million people may have been shown content from a facebook page associated with the internet research agency. have you determined whether any
3:51 pm
of those people were the same facebook users whose data was shared with cambridge analytica? are you able to make that determination? mr. zuckerberg: we're investigating that now. we believe that it is entirely possible that there will be a connection there. ms. klobuchar: ok. that seems like a big deal as we look back at that last election. former cambridge analytica employee christopher wylie has said that the data that it improperly obtained, that cambridge analytica improperly obtained from facebook users, could be stored in russia. do you agree that's a possibility? mr. zuckerberg: are you asking if cambridge analytica's data could be stored in russia? ms. klobuchar: that's what he said this week on a sunday show. mr. zuckerberg: i don't have any specific knowledge that would suggest that. but one of the steps that we need to take now is to go do a full audit of cambridge analytica's systems to understand what they're doing, to see if they have any
3:52 pm
more data. that audit, we have temporarily ceded that in order to let the u.k. government complete their investigation first. because of course the government investigation takes precedence over a company doing that. but we're committed to completing this full audit and getting to the bottom of what's going on here so we can have more answers to this. ms. klobuchar: you earlier stated publicly and here that you would support some privacy rules so everyone's playing by the same rules here. you also said you should have notified customers earlier. would you support a rule that would require you to notify your users of a breach within 72 hours? mr. zuckerberg: that makes sense to me. and i think we should have our team follow up with yours to discuss the details around that more. ms. klobuchar: thank you. i just think part of this was when people don't even know that their data's been breached, that's a huge problem and i also think we get the
3:53 pm
solutions faster when we get that information out there. thank you and we look forward to passing this bill. we'd love to pass the honest ads act before the election, and we're looking forward to better disclosure this election. thank you. mr. thune: thank you. senator blunt's up next. mr. blunt: thank you, mr. chairman. mr. zuckerberg, nice to see you, i saw you not too long after i entered the senate in 2011. i told you when i sent my business cards down to be printed, they came back from the senate print shop with the message, it was the first business card they'd ever printed a facebook address on. there are days when i've regretted that. but more days when we get lots of information that we need to get. there are days when i wonder if the term facebook friends is a little misstated. it doesn't seem like i have those every single day. but the platform you've created is really important and my son, charlie, who is 13 is dedicated to instagram. so he'd want to be sure i
3:54 pm
mentioned him while i was here with you. [laughter] i haven't printed that on my card yet. i will say that. but i think we have that account as well. lots of ways to connect people. the information obviously is an important commodity and it's what makes your business work. i get that. however, i wonder about some of the collection efforts. maybe we can go through largely just even yes and no and then we'll get back to more expansive discussion of this, but do you collect user data through cross-device tracking? mr. zuckerberg: i believe we do link people's accounts between devices in order to make sure that their facebook and instagram and their other experiences can be synced between their devices. mr. blunt: and this includes offline data? that's linked not necessarily to facebook but some device
3:55 pm
they went through facebook on? mr. zuckerberg: i want to make sure we get this right. so i want to have my team follow up with you on that. mr. blunt: that doesn't seem that complicated to me. you understand this better than i do. but maybe you can explain to me why that's complicated. do you track devices that an individual who uses facebook has, that is connected to the device that they use for their facebook connection but not necessarily connected to facebook? mr. zuckerberg: i'm not sure of the answer to that question. mr. blunt: really? mr. zuckerberg: yes. there may be some data that is necessary to provide the service that we do. but i don't have that sitting here today. so that's something that i would want to follow up on. mr. blunt: the f.t.c. last year flagged cross-device tracking as one of their concerns generally that people are tracking devices that the users
3:56 pm
of something like facebook don't know they're being tracked. how do you disclose your collection methods? is that all in this document that i would see and agree to when i entered into facebook? mr. zuckerberg: yes. there are two ways we do this. one is we try to be exhaustive in the legal documents around the terms of service and privacy policies. but more importantly, we try to provide in-line controls, in plain english, that people can understand. they can either go to settings or we can show them at the top of the app periodically. so that people understand all the controls and settings they have. and can configure their experience the way they want. mr. blunt: so do people now give you permission to track specific devices in their contract? and if they do, is that a relatively new addition to what you do?
3:57 pm
am i able to opt out and say it's ok for you to track what i'm saying on facebook, but i don't want you to track what i'm texting to somebody else off facebook, on an android phone? mr. zuckerberg: oh, ok. yes, senator. in general facebook is not collecting data from other apps that you use. there may be some specific things about the device that you're using that facebook needs to understand in order to offer the service. but if you're using google or you're using some texting app, unless you specifically opt in that you want to share the texting app information, facebook wouldn't see that. mr. blunt: has it always been that way? or is that a recent addition to how you deal with those other ways that i might communicate? mr. zuckerberg: my understanding is that that is how the mobile operating systems are architected.
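(an illustrative aside: a minimal python sketch of the opt-in pattern described in this exchange -- nothing is collected unless the user has affirmatively turned the setting on. the MessagingSettings, DeviceLogs, and metadata_to_upload names, their fields, and the defaults are assumptions made for illustration, not facebook's actual code.)

# illustrative sketch only: an opt-in gate for syncing call/text history.
# the default is "off", so nothing is uploaded unless the user affirmatively opts in.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class MessagingSettings:
    sync_call_and_text_history: bool = False  # opt-in setting, off by default


@dataclass
class DeviceLogs:
    call_log: List[Dict] = field(default_factory=list)
    sms_log: List[Dict] = field(default_factory=list)


def metadata_to_upload(settings: MessagingSettings, logs: DeviceLogs) -> List[Dict]:
    """return call/text metadata only if the user has opted in; otherwise nothing."""
    if not settings.sync_call_and_text_history:
        return []  # user has not opted in: collect nothing
    return logs.call_log + logs.sms_log


if __name__ == "__main__":
    logs = DeviceLogs(call_log=[{"to": "555-0100", "seconds": 62}],
                      sms_log=[{"to": "555-0100", "chars": 42}])
    print(metadata_to_upload(MessagingSettings(), logs))      # [] -- default is off
    print(metadata_to_upload(MessagingSettings(True), logs))  # both records, only after opt-in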
3:58 pm
mr. blunt: so you don't have bundled permissions for how i can agree to what devices i may use that you may have contact with? do you bundle that permission or am i able to individually say what i'm willing for you to watch and what i don't want you to watch? i think we may have to take that for the record based on everybody else's time. mr. thune: thank you, senator blunt. next up, senator durbin. mr. durbin: thank you very much, mr. chairman. mr. zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night? mr. zuckerberg: um -- no. [laughter] mr. durbin: if you messaged
3:59 pm
anybody this week, would you share with us the names of the people you've messaged? mr. zuckerberg: no. i would probably not choose to do that publicly here. mr. durbin: i think that may be what this is all about. your right to privacy. the limits of your right to privacy. and how much you give away in modern america in the name of, quote, connecting people around the world. a question basically of what information facebook's collecting, who they're sending it to and whether they ever asked, in advance, my permission to do that. is that a fair thing for a user of facebook to expect? mr. zuckerberg: yes, senator. i think everyone should have control over how their information is used. and as we've talked about in some of the other questions, i think that that is laid out in some of the documents, but more importantly, you want to give people control in the product itself. so the most important way that this happens across our services is that every day people come to our services to
4:00 pm
choose to share photos or send messages and every single time they choose to share something, they have a control right there about who they want to share it with. that level of control is extremely important. mr. durbin: they certainly know who their friends are but they may not know, as has happened and you've conceded this point in the past, that sometimes that information is going way beyond their friends and sometimes people have made money off of sharing that information. correct? mr. zuckerberg: you're referring, i think, to our developer platform. it may be useful for me to give some background. mr. durbin: i have three minutes left, so maybe do that for the record because i have other questions to ask. you recently introduced an app called messenger kids, to allow
4:01 pm
children as young as 6 to send messages and videos and pictures through their parents' account. the campaign for a commercial-free childhood and others warned facebook, they pointed to a wealth of research demonstrating that excessive use of digital devices is harmful to kids and argued that young children simply are not ready to handle social media accounts at age 6. in addition there are concerns about data that's being gathered about these kids. now there are certain limits of the law, we know, the children's online privacy protection act. what guarantees can you give us that no data from messenger kids is or will be collected or shared with those that might violate the law?
4:02 pm
is we heard feedback from thousands of parents that they want to be able to stay in touch with their kids and call them and use apps like facetime but they want to have complete control over that. i think we can agree, when your kid is 6 or 7, even if they have access to a phone, you want to be able to control everyone who they can contact. there wasn't an app that did that, so we built this service to do that. the app collects a minimum amount of information that is necessary to operate the service, so for example, the messages that people send, is something we collect in order to operate the service. but in general that data is not going to be shared with third parties. it is not connected to the broader facebook. mr. durbin: i picked up on that phrase, "in general." that seems to suggest in some circumstances it will be shared with third parties. mr. zuckerberg: no.
4:03 pm
mr. durbin: should someone who has reached adult age who grew up with messenger kids be able to delete all that data? mr. zuckerberg: absolutely. when you turn 13, our minimum age for facebook, you don't automatically get a facebook account, you have to apply for a new account. mr. durbin: i'll close. illinois has a biometric information privacy act -- our state does -- which is to regulate the commercial use of facial, voice, finger and iris scans and the like. we're now in a fulsome debate on that, and i'm afraid facebook is trying to carve out exceptions to it. fill me in on how that is consistent with protecting privacy. thank you.
4:04 pm
mr. thune: thank you, senator durbin. senator cornyn. mr. cornyn: up until 2014, the mantra or motto of facebook was move fast and break things. is that correct? mr. zuckerberg: i don't know when we changed it but the mantra is currently move fast with stable infrastructure which is a much less sexy mantra. mr. cornyn: during the time it was facebook's mantra or motto to move fast and break things, do you think some of the misjudgments, perhaps mistakes that you have admitted to here, were as a result of that culture or that attitude, particularly as regards to personal privacy of the information of your subscribers? mr. zuckerberg: senator, i do think we made mistakes, and the biggest mistakes we made here are not taking a broad enough view of our
4:05 pm
responsibility. and the move fast cultural value is more tactical around whether engineers can ship things and different ways we operate. i think the big mistake we made looking back on this is viewing our responsibility as just building tools rather than viewing our whole responsibility as making sure that those tools are used for good. mr. cornyn: i appreciate that. previously, or in the past, we've been told that platforms like facebook, twitter, instagram and the like are neutral platforms and the people who own and run those for profit -- i'm not criticizing doing something for profit in this country -- but they bore no responsibility for the content. do you agree now that facebook and other social media platforms are not neutral platforms but
4:06 pm
bear some responsibility for the content? mr. zuckerberg: i agree that we're responsible for the content. but i think there's -- one of the big societal questions that i think we're going to need to answer is, the current framework that we have is based on this reactive model that assumes that there weren't a.i. tools that could proactively tell whether something was terrorist content or something bad so it naturally relied on requiring people to flag content for a company and the company to take reasonable action. in the future we'll have tools that are going to be able to identify more types of bad content. i think there are moral and legal obligation questions that i think we'll have to wrestle with as a society about when we want to require companies to take action proactively on certain of those things and when that gets in the way. mr. cornyn: i appreciate that, i have two minutes left to ask you questions. interestingly, the terms of the,
4:07 pm
what do you call it, the terms of service is a legal document which discloses to your subscribers how their information is going to be used, how facebook is going to operate. and -- but you concede that you doubt everybody reads or understands that legalese, those terms of service. so is that to suggest that the consent that people give subject to that terms of service is not informed consent? in other words they may not read it and even if they read it they may not understand it? mr. zuckerberg: i think we have a broader responsibility than what the law requires. so -- mr. cornyn: what i'm asking about in terms of what your subscribers understand, in terms of how their data is going to be used. but let me go to the terms of service under paragraph 2, you say you own all of the content
4:08 pm
and information you post on facebook. that's what you've told us here today a number of times. if i choose to terminate my facebook account, can i bar facebook or any third parties from using the data that i have previously supplied for any purpose whatsoever? mr. zuckerberg: yes. if you delete your account we should get rid of all the information. mr. cornyn: you should or do you? mr. zuckerberg: we do. mr. cornyn: how about third parties you have contracted with to use that information, do you claw back that information as well or does that remain in their custody? mr. zuckerberg: this is a very important question and i'm glad you brought this up. there's a misperception about facebook that we sell data to advertisers and we do not sell data to advertisers.
4:09 pm
mr. cornyn: but you clearly rent it. mr. zuckerberg: we allow advertisers to tell us who they want to reach and we do the placement. if an advertiser says, i'm a ski shop and i want to sell skis to women, then we might have some sense, because people shared skiing-related content or said they were interested in that, and they shared whether they're a woman, and then we can show the ads to the right people without that data ever changing hands and going to the advertiser. that's a very fundamental part of how our model works and something that's often misunderstood. i appreciate that you brought that up. mr. thune: thank you, senator cornyn. we indicated we would take a couple of breaks and give our witness an opportunity. we have been going now for just under two hours. mr. zuckerberg: we can do a few more? mr. thune: you want to keep going? mr. zuckerberg: maybe 15 minutes? does that work?
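(an illustrative aside: a minimal python sketch of the placement model described above, under the assumption that targeting criteria flow from the advertiser to the platform, user attributes stay inside the platform, and only a placement result flows back out. the USERS records and the place_ad function are hypothetical, not facebook's actual system.)

# illustrative only: targeting criteria come in, ad placements go out,
# and the advertiser never receives the users' attributes themselves.
from typing import Dict, List

USERS: Dict[str, Dict] = {
    "u1": {"gender": "woman", "interests": {"skiing", "travel"}},
    "u2": {"gender": "man", "interests": {"cycling"}},
}


def place_ad(ad_id: str, criteria: Dict) -> int:
    """show the ad to matching users; return only the number of impressions served."""
    matched = [
        uid for uid, attrs in USERS.items()
        if attrs["gender"] == criteria.get("gender")
        and criteria.get("interest") in attrs["interests"]
    ]
    # in a real system the ad would be rendered to these users here;
    # the advertiser gets a count back, not who the users are.
    return len(matched)


if __name__ == "__main__":
    # a ski shop asks to reach women interested in skiing
    print(place_ad("ski-sale", {"gender": "woman", "interest": "skiing"}))  # 1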
4:10 pm
mr. thune: senator blumenthal is up next, we'll continue. mr. blumenthal: you have told us today and told the world that facebook was deceived by alexander cogan when he sold user information to cambridge analytica, correct? mr. blumenthal: i want to show you the terms of service that cogan gave to facebook. facebook was on notice that he could sell that information. have you seen the terms of service before? mr. zuckerberg: i have not. mr. blumenthal: who in facebook was responsible for seeing those terms of service that put you on notice that that information could be sold?
4:11 pm
mr. zuckerberg: our app review team would be responsible for that. mr. blumenthal: has anyone been fired on that app review team? mr. zuckerberg: not because of this. mr. blumenthal: doesn't that term of service conflict with the f.t.c. order that facebook was under at the time these terms of service were provided to facebook, and you'll note that the f.t.c. order requires facebook to protect privacy, isn't there a conflict there? mr. zuckerberg: senator, it certainly appears we should have been aware that this app developer submitted a term that was in conflict with the rules of the platform.
4:12 pm
blindness. it was heedless and reckless which in fact amounted to a violation of the f.t.c. consent decree. would you agree? mr. zuckerberg: no, senator. my understanding is not that this was a violation of the consent decree, but as i've said a number of times today, i think we need to take a broader view around privacy than just what is mandated by law. mr. blumenthal: here's my concern, and i apologize for interrupting you. we've seen the apology tours, you have refused to acknowledge a violation of this f.t.c. consent decree. and we have letters, we had contacts with facebook employees and i'm going to submit a letter for the record from sandy parakilas, with your permission
4:13 pm
that indicates not only a lack of resources but lack of attention to privacy. and so my reservation about your testimony today is that i don't see how you can change your business model unless there are specific rules of the road. your business model is to monetize user information, to maximize private profit over privacy, and unless there are specific rules and requirements, enforced by an outside agency, i have no assurance that these kinds of vague commitments are going to produce action. so i want to ask you a couple of very specific questions and they are based on legislation that i've offered, the my data act, legislation that senator markey is introducing today, the
4:14 pm
consent act which i'm joining. don't you agree that companies ought to be required to provide users with clear, plain information about how their data will be used and specific ability to consent to the use of that information? mr. zuckerberg: senator, i do generally agree with what you're saying and i laid that out earlier when i talked about what we are doing. mr. blumenthal: would you agree to an opt-in instead of an opt-out? mr. zuckerberg: i think that's something to discuss. mr. blumenthal: would you agree users should be able to access all their information? mr. zuckerberg: yes, of course. mr. blumenthal: all the information from purchases from data brokers and tracking them?
4:15 pm
mr. zuckerberg: we already have a download your information tool that allows people to see and take out all the information they've put into facebook so yes, i agree with that, we already have that. mr. blumenthal: i have a number of other specific requests that i hope you'll agree to support as part of legislation. i think legislation is necessary. the rules of the road have to be the result of congressional action. we have -- facebook has participated recently in the fight against the scourge of sex trafficking, and the bill that we've just passed that will be signed into law tomorrow, the stop enabling sex traffickers act, was the result of our cooperation. i hope we can cooperate on this kind of measure as well. mr. zuckerberg: i look forward to having my team work with you on this. mr. thune: senator cruz. mr. cruz: thank you, mr.
4:16 pm
chairman. mr. zuckerberg, welcome, thank you for being here. does facebook consider itself to be a neutral public forum? mr. zuckerberg: we consider ourselves to be a platform for all ideas. mr. cruz: are you a first amendment speaker expressing views or allowing everyone to speak? mr. zuckerberg: here's how we think about this. i don't believe that -- there's certain content we do not allow. hate speech. terrorist content. nudity. anything that makes people feel unsafe in the community. from that perspective, that's why we generally try to refer to what we do as a platform. mr. cruz: it's just a simple question. the predicate for section 230 immunity is that you are a neutral public forum. do you consider yourself a
4:17 pm
neutral public forum? or are you engaged in political speech which is your right under the first amendment? mr. zuckerberg: our goal is not to engage in political speech, i'm not that familiar with the specific legal language of the law that you speak to so i would need to follow up with you on that. i'm just trying to lay out how broadly i think about this. mr. cruz: there are a great many americans who i think are deeply concerned that facebook and other tech companies are engaged in a pervasive pattern of bias and political censorship. there have been numerous instances with facebook. in may of 2016, gizmodo reported facebook routinely suppressed conservative stories in trending news, about cpac, about mitt romney, about glenn beck. in addition to that facebook has initially shut down the
4:18 pm
chick-fil-a appreciation page, blocked posts of a fox news reporter. blocked over two dozen catholic pages and most recently blocked trump supporters diamond and silk's page with 1.2 million facebook followers after determining their content and brand were, quote, unsafe to the community. to a great many americans that appears to be a pervasive pattern of political bias. do you agree with that assessment? mr. zuckerberg: senator, let me say a few things about this. first, i understand where that concern is coming from because facebook and the tech industry are located in silicon valley which is an extremely left-leaning place. this is actually a concern that i have and that i try to root out in the company, making sure we don't have any bias in the work we do, and i think it is a fair concern that people would wonder about. mr. cruz: are you aware of any
4:19 pm
ad or page that's been taken down from planned parenthood? mr. zuckerberg: senator i'm not but let me just -- mr. cruz: how about moveon.org? or any democratic candidate for office? mr. zuckerberg: i'm not specifically aware. i'm not sure. mr. cruz: in your testimony you say you have 15,000 to 20,000 people working on security and content review. do you know the political orientation of those 15,000 to 20,000 people engaged in content review? mr. zuckerberg: no, we do not generally ask people about their political orientation when they're joining the company. mr. cruz: have you ever made hiring or firing decisions based on political positions or what candidate they supported? mr. zuckerberg: no. mr. cruz: why was palmer luckey fired? mr. zuckerberg: that's a specific personnel matter. mr. cruz: you said you don't make decisions on political view.
4:20 pm
mr. zuckerberg: it was not because of a political view. mr. cruz: of the 15,000 to 20,000 people engaged in content review, how many, if any, ever supported financially a republican candidate for office? mr. zuckerberg: senator, i do not know that. mr. cruz: your testimony says it is not enough that we just connect people, we have to make sure those connections are positive. it says we have to make sure people aren't using their voice to hurt people or spread misinformation. we have a responsibility not just to build tools but to make sure those tools are used for good. mr. zuckerberg, do you feel it's your responsibility to assess users whether they are good and positive connections or ones that those 15,000 to 20,000 people deem unacceptable or deplorable? mr. zuckerberg: you're asking me personally? mr. cruz: facebook. mr. zuckerberg: there are a number of things we all agree are clearly bad. foreign interference in elections. terrorism. self-harm. mr. cruz: i'm talking about
4:21 pm
censorship. mr. zuckerberg: you would probably agree that we should remove terrorist propaganda from the service. we want to get that done and we're proud of how well we do with that. what i can say, and i do want to get this in, is i'm very committed to making sure that facebook is a platform for all ideas. that is a very important founding principle of what we do. we're proud of the discourse and the different ideas that people can share on the service. that is something that as long as i'm running the company i'm going to be committed to making sure is the case. mr. thune: thank you, senator cruz. do you want to break now? or do you want to keep going? mr. zuckerberg: sure. that was pretty good. mr. thune: senator whitehouse is up next. if you want to take a five-minute break right now -- we have now been going a good two hours -- we'll recess for five minutes and reconvene.
4:22 pm
[captions copyright national cable satellite corp. 2018] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org]
4:23 pm
4:24 pm
4:25 pm
4:26 pm
4:27 pm
4:28 pm
4:29 pm
4:30 pm
4:31 pm
4:32 pm
mr. thune: the committee will come to order.
4:33 pm
i think we ought to read this. before i call on senator whitehouse, senator feinstein asked permission to put letters and statements in the record and without objection, they will be put in, from the aclu, the electronic privacy information center, the association for computing machinery public policy council and public knowledge. senator whitehouse. mr. whitehouse: thank you, mr. chairman.
4:34 pm
mr. zuckerberg: mr. chairman, i want to correct one thing i said earlier. he asked why we didn't ban cambridge analytica at the time when we learned about them. my answer was that my understanding was they were not on the platform, not an app developer or advertiser. when i went back and met with my team afterwards they let me know cambridge analytica did start as an advertiser in 2015 so we could have in theory banned them then. we made a mistake by not doing so, and i wanted to make sure i updated that because i misspoke, got that wrong earlier. mr. thune: senator whitehouse. mr. whitehouse: on the subject of bans, i want to explore what these bans mean. obviously facebook has been done considerable reputational damage by its association with alexander cogan and cambridge
4:35 pm
analytica, which is one of the reasons you're having this enjoyable afternoon with us. your testimony said that alexander cogan's app has been banned. has he also been banned? mr. zuckerberg: yes, my understanding is he has. mr. whitehouse: if he were to open up an account under a different name and you found out, would it be closed down? mr. zuckerberg: we are preventing him from building any more apps. mr. whitehouse: does he have a facebook account? mr. zuckerberg: i believe not, but i can follow up with you afterwards. mr. whitehouse: and as far as cambridge analytica, you required them to formally certify that the data has been taken
4:36 pm
down. what did that entail? mr. zuckerberg: they sent us an email notice from their chief data officer telling us they didn't have any data, had deleted it and weren't using it. we followed up with a full legal contract where they certified that they had deleted the data. mr. whitehouse: in the legal contract? mr. zuckerberg: i believe so. mr. whitehouse: who exactly is banned? what if they opened up, say, princeton analytica or rhode island analytica, would that also be banned? mr. zuckerberg: that would be the intent. cambridge analytica has a parent company. we banned the parent company. we also banned a company called
4:37 pm
aggregateiq that is associated with them. if we find other firms associated with them we'll block those as well. mr. whitehouse: are individual principals of the company also banned? mr. zuckerberg: my understanding is we are blocking them from doing business on the platform but we are not blocking people's personal accounts. mr. whitehouse: can any customer amend your terms of service or is it a take-it-or-leave-it proposition for any customer? mr. zuckerberg: the terms of service are what they are but the service is defined by people. you get to choose what information you share. mr. whitehouse: senator graham
4:38 pm
held up that big document, it's easy to bury things in a document that later turn out to be of consequence. but that document senator graham held up is not a negotiable thing with an individual customer; that's a take it or leave it proposition for your customers to sign up to or not use the service. mr. zuckerberg: that's right on the terms of service, though we offer a lot of controls so people can configure the experience how they want. mr. whitehouse: last question, on a different subject, having to do with the authorization process you are undertaking for entities putting up political content or issue ad content. you said that they all have to go through an authorization process before they do it. you said here, we will be verifying the identity. how do you look behind a shell corporation and find who is
4:39 pm
really behind it through your authorization process? well, step back, do you need to look behind shell corporations in order to find out who is really behind the content that's being posted, and if you may need to look behind a shell corporation, how will you go about doing that? how will you get back to the true, what lawyers would call, beneficial owner of the site that is putting out the political material? mr. zuckerberg: you're referring to the verification of political issue ads. mr. whitehouse: yes and before that, political ads. mr. zuckerberg: we're going to require a valid government identity and verify the location. we're going to do that so that way someone sitting in russia, for example, couldn't say that they're in america and therefore be able to run an election ad. mr. whitehouse: but if they were running through a corporation domiciled in delaware, you wouldn't know they were a russian owner.
4:40 pm
mr. zuckerberg: that's correct. mr. whitehouse: thank you. my time has expired. i appreciate the courtesy of the chair for the extra seconds. mr. lee: mr. zuckerberg, you said there are some types of content facebook would never want to have any part of and takes extra steps to avoid disseminating, including hate speech, nudity, racist speech. i assume you also meant terrorist acts and threats of violence. beyond that, would you say facebook ought not to be putting its thumb on the scale about the content of speech? mr. zuckerberg: senator, yes. there are generally two categories of content we are worried about. one is things that could cause
4:41 pm
real-world harm. terrorism fits into that. self harm fits into that. i would consider election interference to fit into that. those are things, i don't consider there to be much discussion about whether they're good or bad. mr. lee: what i'm asking is, once you get beyond the categories of things that are prohibited and should be, is it facebook's position it should not be favoring or disfavoring speech based on content, based on the viewpoint of that speech? mr. zuckerberg: in general that's our position. what we -- one of the things that's important, though, is that in order to create a service where everyone has a voice, we also need to make sure that people aren't bullied or basically intimidated, or the environment feels unsafe for them. mr. lee: when you say in general, that's the exception you're referring to, the exception if someone feels
4:42 pm
bullied, even if it's not a terrorist act, nudity or something like that, you might step in there. beyond that, would you step in and put your thumb on the scale as far as the viewpoint of the content being posted? mr. zuckerberg: senator, no. in general our goal is to allow people to have as much expression as possible. mr. lee: so except for the objections we discussed, you'd stay out of that. let me ask you this. isn't there a significant free market incentive that a social media company, including yours, has in order to safeguard the data of your users? don't you have free market incentives? mr. zuckerberg: yes. mr. lee: don't your interests align with those of us here who want to see data safeguarded? mr. zuckerberg: absolutely. mr. lee: do you have the technological means at your disposal to make sure that doesn't happen and to protect
4:43 pm
say an app developer from transferring facebook data to a third party? mr. zuckerberg: senator, a lot of that we do, and some of that happened outside of our systems and will require new measures. so for example what we saw here was people chose to share information with an app developer. that worked according to how the system was designed. that information was transferred out of our system to servers that this developer, alexander cogan, had, and that person chose to sell the data to cambridge analytica. that's going to require much more active intervention and auditing from us to prevent going forward because once it's out of our system it's harder for us to have a full understanding of what's happening. mr. lee: from what you said today and from previous statements by you and your company, data is at the center of your business model. it's how you make money.
4:44 pm
your ability to run your business effectively, given that you don't charge your users is based on monetizing data. and so the real issue it seems to me really comes down to what you tell the public. what you tell users of facebook, about what you're going to do with the data. about how you're going to use it. can you give me a couple of examples, maybe two examples of ways in which data is collected by facebook in a way that people are not aware of, two examples of types of data that facebook collects that might be surprising to facebook users? mr. zuckerberg: i would hope what we do with data is not surprising to people. mr. lee: and has it -- mr. zuckerberg: in this case people didn't expect this developer to sell the data to
4:45 pm
cambridge analytica. in general, there are two types of data that facebook has. the vast majority, in the first category, is content people chose to share on the service themselves. that's all the photos that you share, the posts you make. what you think of as the facebook service. that's -- everyone has control every single time they go to share that. they can delete that data any time they want. full control, the majority of the data. the second category is around specific data that we collect in order to make the advertising experiences better and more relevant and work for businesses. and those often revolve around measuring, ok, if we showed you an ad and you click through and go somewhere else, we can measure that you actually, that the ad worked. that helps make the experience more relevant and better for people who are getting more relevant ads and better for the businesses because they perform better. you have complete control of the second type of data. you can turn off the ability for
4:46 pm
facebook to collect that, your ads will get worse, so people don't want to do that, but you have complete control there as well. mr. schatz: i want to follow up on the questions around the terms of service. your terms of service are around 3,200 words with links. one link is 2,700 words with 22 links. i think the point has been well made that people have no idea what they're signing up for and i understand that at the present time that's legally binding but i'm wondering if you can explain to the billions of users in plain language what are they signing up for? mr. zuckerberg: senator, that's a great and important question here. in general, you sign up for facebook, you get the ability to share the information that you want with people. that's what the service is. you can connect with the people you want and you can share whatever content matters to you, whether it's photos or links or
4:47 pm
posts. and you get control over who you share it with, you can take it down if you want, you don't need to put anything up in the first place if you don't want. mr. schatz: what about the part people are worried about, not the fun part? mr. zuckerberg: what's that? mr. schatz: the part people are worried about, are your d.m.'s informing the ads? are your browsing habits being collected? everybody kind of understands when you click like on something or if you say you like a certain movie or have a particular proclivity, i think that's fair game. everybody understands that. what we don't understand exactly, both as a matter of practice and as a matter of not being able to decipher those terms of service and the privacy policy, is what exactly are you doing with the data, and do you draw a distinction between
4:48 pm
data collected in the process of utilizing the platform and that which we clearly volunteer to the public to present ourselves to other facebook users. mr. zuckerberg: senator, i'm not sure i fully understand this. in general people come to facebook to share content with other people. we use that in order to also inform how we rank services like news feed and ads, to provide more relevant experiences. mr. schatz: if i'm emailing within whatsapp, does that inform advertisers? mr. zuckerberg: no, we don't see anything in whatsapp, it's encrypted. mr. schatz: say i'm emailing about "black panther" in whatsapp, do i get ads for that?
4:49 pm
mr. zuckerberg: facebook doesn't see it. mr. schatz: but do the systems talk to each other without humans touching it? mr. zuckerberg: i think the answer is no, if you message about "black panther" you won't see ads for it. mr. schatz: i can't imagine that we really own that facebook data, because you're monetizing it. it doesn't seem to me that we own our own data, otherwise we'd be getting a cut. mr. zuckerberg: well, senator, you own it in the sense that you choose to put it there, you can take it down any time and you completely control the terms
4:50 pm
under which it's used. when you put it on facebook, you're granting us permission to show it to other people. mr. schatz: so your definition of ownership is, i sign up voluntarily and may delete my account if i wish and that's it. mr. zuckerberg: i think the control is much more granular than that. you can choose each photo you want to put up or each message and you can delete those and you don't need to delete your whole account, you have specific control. mr. schatz: i want to propose something to you. i read an article this week by jack balkin at yale about an information fiduciary. this is about a trust relationship like doctors and lawyers. tech companies should hold in trust our personal data. are you open to the idea of an
4:51 pm
information fiduciary enshrined in statute? mr. zuckerberg: senator, i think it's certainly an interesting idea and jack is very thoughtful in this space so i do think it deserves consideration. mr. schatz: thank you. mr. thune: senator fischer. ms. fischer: thank you for being here today, appreciate your testimony. the full scope of a facebook user's activity can paint a very personal picture. additionally, you have the two billion users that are out there every month and so we all know that's larger than the population of most countries. so how many data categories do you store -- does facebook store -- on the categories that you collect? mr. zuckerberg: can you clarify what you mean? ms. fischer: there's information
4:52 pm
that says facebook collects about 96 data categories for the two billion active users. that's 192 billion data points being generated, i think, at any time from consumers globally. so how many do -- does facebook store out of that? do you store any? mr. zuckerberg: senator, i'm not actually sure what that is referring to. ms. fischer: on the points that you collect information. if we call those categories. how many do you store of information that you are collecting? mr. zuckerberg: senator, the way i think about this is there are two broad categories. this probably doesn't line up with whatever the specific report you're seeing is and i can make sure we follow up with you afterwards to get you the
4:53 pm
information you need. but the two broad categories i think of are information a person has agreed to share, and then the other category is data that is connected to making the ads relevant. you have complete control over both. you can turn off the data related to ads, you can choose not to share any content or take down the content in that category. ms. fischer: does facebook store any of that? mr. zuckerberg: yes. ms. fischer: everything we click on, is that in storage somewhere? mr. zuckerberg: we store data about what people share on the service. and information that's required to do ranking better, to show you what you care about in news feed. ms. fischer: do you store text history, user content, activity,
4:54 pm
device location? mr. zuckerberg: senator, some of that content, with people's permission, we do store. ms. fischer: do you disclose any of that? mr. zuckerberg: yes, senator, in order for people to share the information with facebook, i believe that almost everything you just said would be opt-in. ms. fischer: and the privacy settings, it's my understanding that they limit the sharing of that data with other facebook users, is that correct? mr. zuckerberg: yes, every person gets to control who gets to see their content. ms. fischer: does that limit the ability for facebook to collect and use it? mr. zuckerberg: there are controls that determine what facebook can do as well. so for example, people have control about face recognition. if people don't want us to be
4:55 pm
able to help identify when they're in photos that their friends upload, they can turn it off and we won't store that for them. ms. fischer: and there was action taken by the f.t.c. in 2011. you wrote a facebook post at the time on a public page on the internet that it used to seem scary to people but as long as they could make their page private they felt safe sharing with their friends online. control was key. and you just mentioned control. you were just asked a question and you responded about complete control. you and your company have used that term repeatedly and i believe you use it to reassure users, is that correct? that you do have control and complete control over this information? mr. zuckerberg: well, senator, this is how the service works.
4:56 pm
the core thing that facebook is -- ms. fischer: is this then a question of whether facebook is about feeling safe, or are users actually safe? is facebook being safe? mr. zuckerberg: i think facebook is safe. i use it, my family uses it, people i love and care about use it all the time. it's not just to make people feel safe, it's what people want in the product. if you take a photo, you're not going to always send it to the same people. sometimes you want to text it to one person, sometimes you might send it to a group. i bet you have a page. you'll probably want to put some stuff out there publicly to communicate with your constituents. there are different groups of people someone might want to connect with and those controls are very important in practice
4:57 pm
for the operation of the service, not just to build trust, although i think providing people with control does that, but actually to make it so people can use the service. mr. coons: i think the whole reason we're having this hearing is because of a tension between two basic principles you've laid out. you said about the data users put on facebook, you control the data. you said positive and optimistic things about the data. but facebook is a for-profit company that makes millions targeting ads. you recognize that an ad-supported service is best aligned with your mission. but there's a lot of examples where ad targeting has led to results that i think we would all disagree with or
4:58 pm
dislike or would concern us. you've admitted that facebook's own ad tools allowed russians to target users, voters, based on racist or anti-muslim or anti-immigrant views and that may have played a significant role in an election here in the united states. today, "time" magazine posted a story saying wildlife traffickers are continuing to use facebook tools to advertise illegal sales of protected animal parts and i am left questioning whether your ad targeting tools would allow other concerning practices like diet pill manufacturers targeting teenagers who are struggling with their weight or allowing a liquor distributor to target alcoholics or a gambling organization to target those with gambling problems. i'll give you one concrete example, propublica highlighted in 2016 that facebook allows advertisers to exclude users by race in real estate advertising.
4:59 pm
you could say i only want this ad to be seen by white people, not people of color. you announced that was a bad idea. you were going to change the tools and you would build a new system to spot and reject discriminatory ads that violate our commitment to fair housing, yet a year later a followup story said those changes hadn't fully been made and it was still possible to target housing advertisements in a way that was racially discriminatory. my concern is that this practice of making bold and engaging promises about changes in practices, and the reality of how facebook has operated in the real world, are in persistent tension. several senators asked earlier today about the 2011 f.t.c. consent decree that required facebook to better protect users' privacy and there are a series of examples where things have been brought to your attention, facebook has apologized, said we're going to change our practices and
5:00 pm
policies and yet there doesn't seem to have been as much followup as would be called for. at the end of the day, policies aren't worth the paper they're written on if facebook doesn't enforce them. i'll close with an experience i had today as an avid facebook user. i woke up this morning, notified by a whole group of friends across the country asking if i had a new family or if there was a fake facebook post of chris coons. i went to the one they suggested, there's my picture with senator dan sullivan's family. same schools i went to. >> we're going to break away from this and take you live to the floor of the house. you can watch the hearing live on c-span3. clause 6 of rule 20. the house will resume proceedings on postponed questions at a later time. for what purpose does the gentleman from michigan seek recognition? mr. speaker, i move to suspend the rules and pass the bill, h.r. 4921, as amended.
