Facebook and Election Security at TechCrunch Disrupt San Francisco, C-SPAN, September 24, 2018, 1:34pm-1:57pm EDT
professor christine blasey ford has agreed to testify before the senate judiciary committee about her sexual assault allegation against supreme court nominee brett kavanaugh on thursday. judge kavanaugh will also testify at that hearing. we have live coverage beginning thursday at 10:00 a.m. eastern on c-span3, c-span.org, and the c-span radio app. next, we'll hear from a former top security officer at facebook. he told attendees at techcrunch disrupt in san francisco earlier this month that the upcoming 2018 election is no more secure than the 2016 election. ♪ >> unbrand. >> everything is branded here. absolutely everything. >> excellent. >> welcome to our brand. welcome to the stage.
for anyone who might not know, alex stamos was at yahoo! as the chief information security officer. you went to facebook in 2015. you spent three years at facebook. now you're moving on to different things. i don't know if they're better yet. >> i've been so bored. i wanted to get into the 2:00 a.m. phone calls of academia. >> it's been a dull year. we might have a hard time figuring out what to talk about. it's been quiet at facebook. >> sports. >> that sounds fine. we'll kick it off easy. let's dive right in and talk about russia. it's a light topic. >> the country. it's a beautiful country. i wish i could actually visit it. i have this map. you know how people have maps on their rvs of every state they've been to. i have a map of countries i can never visit for the rest of my life. unfortunately, that's the biggest by land area. >> that's unfortunate. maybe things will change.
>> not the only one. >> probably true. so obviously yesterday your former boss sat in front of the senate select intel committee, testifying about facebook and russia and interference in the 2016 election and solutions moving forward. famously after the election, mark zuckerberg commented that the idea that facebook impacted the election was a crazy idea. i would love if you could take us through the day, a day in the life, when you and your team realized that russia had, in fact, potentially interfered on the platform. >> so there wasn't one day, because we first spotted russian activity in the spring before the election. our company and a bunch of the other big ones have full-time threat intelligence teams, whose entire job is to track persistent government actors, groups that are always involved in doing things on the platform. we already had a team watching the activity of what people call
fancy bear or apt 28, folks who are believed to work for the gru, which is the main intelligence directorate of the russian military. so we saw some stuff from them in the spring before the election. in the end, their actual hacking activity happened off of facebook. at the time, the way this worked is we had a relationship with law enforcement in the u.s., and we informed them of things we found. later on, we heard about the dnc hacks and some other stuff. so we tried to move quickly to shut down their ability to amplify that stuff on facebook. at the time, we didn't have a handle on the activity we found later. there really wasn't one day. after the election, we really dove into the overall fake news problem, and a big question was what is behind this: of all the stuff people call fake news, what is driving it? it turns out that the vast majority of it is actually financially motivated.
the stereotypical macedonian teenagers, who actually exist and are living the good life, as well as folks in romania and pakistan with good english comprehension, technical skills, and a low cost structure, who can run these large fake news farms. they're behind most of it. but then we started to find these chunks of stuff that were obviously not monetizable, leading up to the announcement in september of 2017 of the biggest chunk, which came from the internet research agency. so there wasn't one day. it was kind of this progression where, you know, we first saw a little bit during 2016. we saw the public stuff happen around the gru, the creation of the d.c. leaks personas, and then probably the big chunk for us was the ira cluster we found and all of the advertising that came with it in the summer. >> so that's interesting. the chunks were kind of in a
bucket. you said they stood out because they weren't monetizable, which indicated there was another motive. >> right. that's actually -- if you're looking at yourself and you want to try to determine whether something is financially motivated fake news or it might be an information operation, one of the signs is whether they take you off the social network. the people who want to make money do so by doing arbitrage. they will push a lot of spam on twitter or facebook or any other network, and take you to a website on which they run a bunch of ads. some of those ads, the advertisers know they're there. some of it's ad fraud. they'll run -- if you go to a fake news site and all of a sudden your cpu fan spins up, it's because there might be like a 4k bmw ad. they'll do a bunch of fraudulent stuff. they're basically trying to take the traffic. what the internet research agency and other government trolls want to do is get you to reshare the content on social media. what they especially like to do is image memes. people love to download it,
reshare it as something else. that makes them no money. they can't make money off you sharing this meme over and over again. >> that's really interesting. i definitely, in my personal life, have a hard time communicating to the folks who wouldn't be at this conference how to look out for things like this. facebook is always going through this. they've been really transparent in the last year, to their credit, around that. but it's hard to tell people what to look out for. >> and i think one of the hard parts here is there's a lot of smuggling of the messages they want to push through intermediaries. if you look at the russian campaign against 2016, there's really two different buckets. there's the gru-led work, which is about hacking and leaking. they hacked a bunch of data from the dnc, from john podesta, and they used that hacked information to create the news stories they wanted to see in
the media. they amplified it using their trolls later. in that case, it was the legitimate newspapers and cable news networks and legitimate journalists who were carrying the message of the gru and kind of washing it through the respectability of their outlets, which then changed the entire conversation. that's a very different kind of model than the i.r.a. model, which is to push messages directly to americans. the gru work was specifically targeted at hillary. it is pretty clear the gru's goal was to weaken a future hillary presidency. i think it was less about actually electing trump. i find it unlikely that the russians are better than nate silver at predicting elections. it seems that they were assuming a hillary presidency that they saw as a big threat to them. it's been well documented that putin has a personal antipathy toward her and believes she was behind the protests against him around the 2012 russian election. so the gru activity was focused
on weakening her. the i.r.a. activity, which started well before the election and has lasted well after the election, is really about driving wedges in american society. so it's much more dispersed. but those are kind of direct messages that you're getting straight from them, where they're pretending to be americans or legitimate outlets. they do not seem as good at getting their messages carried by the media. but that's fine, because they're going straight to people's eyeballs via facebook, twitter, and elsewhere. >> do you think that we need to redefine cybersecurity right now? i mean, it's something you touched on in your blog post, which everyone should read if you haven't. it's a well-laid-out argument for what needs to happen to secure elections moving forward. do you think we need to expand the definition of cybersecurity? i feel like that must have been one of the most disorienting things early on. you have to go to your team or whoever at facebook, the board or your bosses, and say, you know, we weren't hacked but this
thing happened. how do you describe that thing? >> right. i think this is -- you're totally right about redefining security. i don't know if we'll end up using cybersecurity as the term. i spent years fighting against the term cyber. i know i'm old and have a gray beard. >> it was a losing battle. >> i say cyber without irony now. i come from a traditional information security background. teenage hacker, started a security consulting group. in my jobs, i had to kind of grow into the realization that the vast majority of harm that is caused by technology does not have any kind of interesting technical component. it is the technically correct use of the products we build to cause harm. and that's not just in the disinformation space. that's in the abuse of children, in the harassment of individuals, in suicide and suicidal ideation.
these are the kinds of things that have no technically interesting component to them yet are incredibly harmful. and i do think we as an industry need to vastly expand how we deal with this, because we're not actually studying those issues, what we call trust and safety issues, or, at facebook, the term is integrity. for those kinds of safety issues, there's not really a field around it. you can't take classes in it. it's not something you normally put on your linkedin resume. how do you hire those kinds of folks? it's very hard to find them. that's actually something, to give you the soft pitch for the pivot, that i'm trying to work on at stanford. if we're going to graduate these students who are going to try to change the world, they should have an understanding of all the ways technology has been misused in the past. and we need to start to build a cross-disciplinary academic center around looking at all the ways technology can be misused that doesn't fall within the
fairly narrow confines of information security. >> can you also build a time machine so we can go back and have that? >> yeah, not so much. and you're right. this is the tough part. it's always been true that our technological achievements outpace the understanding of how they're going to be abused and the fixes. that's true in traditional security and in the trust and safety area. so i do wish that people -- you know, that we had a better understanding of these things. i wish the big companies had a better understanding of these things years ago. unfortunately, that's just not how we've trained people and not how these companies have grown up. >> switching gears a little, i want to talk about your choice to move from yahoo! to facebook. i guess in the twitterverse and the security community, you're widely regarded as a champion of privacy, a user privacy advocate. how did you reconcile concerns about user privacy on facebook and facebook's business, which is predicated on using data to
target ads? >> yeah, i mean, i think this is a tough balancing act anybody has to make. if you want to actually make change, you have to be the man or the woman in the arena. obviously teddy roosevelt wasn't thinking in a very gender-neutral way. you have to put yourself in a position where sometimes you're going to have a fight, and you're going to fight with folks who might disagree with you, and have a chance of changing their minds. one, i think the ad-supported internet is something that's going to last for a long time. we're not going to get away from the model of using data about people to target ads, and therefore supporting these platforms. partially because the truth is that a small number of consumers, mostly in north america and western europe, subsidize the existence of these technologies for everybody else in the world. and finding some kind of way to
build products that require millions of servers and billions and billions of dollars in hardware and lots of professionals to run them, finding a way to support that so the service can be available freely across the world, that's a super hard problem. so if these products are going to exist, you're always going to have these privacy trade-offs. i personally thought it's better to be part of that argument, at these companies, than to just throw, you know, tomatoes from the outside. but that also can be a self-serving argument. hopefully when you're on the inside and you're trying to work on these issues, you have the ability to change people's minds. and it's not like i was working with people who were like, let's go violate everybody's privacy. you're working with people who are well meaning and want to do the right thing. >> so they're not saying that at facebook? like, every board meeting isn't like, let's violate everybody's privacy. >> right. so it's not like you're deciding to work with those people. you're going to an organization
that has a lot of different equities. i think one of the things that's important is to go fight for the equities you believe in, and hopefully that gets balanced out with everything else. >> by any measure, you've had an exceptional career. i mean, you've seen a lot of things go down. >> that could be read different ways. >> it's a technical word. still, at yahoo! and then again at facebook, your name will be attached to at least two of the greatest cybersecurity scandals of all time. arguably. is that good or bad for your career? or your resume. do you skip over a few years on your resume? oh, yeah, i did a little thing before. >> you know, that's -- when you decide to take the title, you decide that you're going to run the risk of having decisions made above you, or issues created by tens of thousands of people, and
that those are going to be stapled to your resume before anybody else's. it's kind of a crappy job in 2018 to be a chief security officer. we're in this time, it's like being a cfo before accounting was invented. it's a profession that's only existed for a couple decades. we don't have the mechanisms necessary to really understand what happens at these companies and to understand the risk and control the risk. >> are you defining the job in a way, like as you're doing the job? is it that kind of thing? >> yeah, and i think there's a bunch of different kinds of jobs. but at the big tech companies, it's again not just about straight-up information security. it's about privacy in a way that does not fall within normal security flaws. it's also about the misuse of the product. and yeah, all of these companies are kind of making it up as they go along. i think that's true for, you know, taking on the position. you take on the responsibility. you get the advantages of having the platform. the downside is when things happen, even if you don't have
the ability to change or control them, that's something you have to take responsibility for. i was the cso when all this stuff happened. it is my responsibility. i'm not going to shirk it. i also hope that i was able to do things to make it better, and i want other people to learn about them. that is one of the reasons i left. there are a lot of things that have happened in the last 4 or 5 years that are universal, and the big companies on the forefront are dealing with them first. there is not a tech company starting up right now that will not have to worry about the
trust and safety and privacy issues, and hopefully that knowledge will spread out more. >> what about the midterms? are we safer? >> if we are, it is because our adversary has given us a forbearance. as a society, we have not responded to the 2016 election in the way that would be necessary to have a trustworthy midterm. the platforms have made changes, like ad transparency. russian interference or not, we don't want a future where campaigns and candidates are cutting up the electorate into smaller and smaller pieces. we need to have legal standards as to the minimum segment size
available to political advertisers. there have been positive changes, but overall the actual security of campaigns is no better. my big fear is that while the 2016 electoral map was well balanced, and in most cases throwing an election one way or the other is difficult, throwing the election into chaos is totally doable, with attacks against the election infrastructure in the swing states. >> that is hard to hear, and it doesn't take that much to throw everything into chaos. >> right, and then both sides will fight to the bitter end, and that leaves us in a bad place. 90 percent of facebook users are outside the united states, and i spent a lot of time dealing with these issues in countries that do not have a peaceful transition of power. that is what a peaceful transition of power does for you. we'll miss it if we don't defend it. >> now that you are transitioning, and i don't know what your desk looks like, would you consider going back to work at a major internet company? >> not for a long time, no. it is a tough job, and impractical with three little kids. by the time they are out, i will be old and irrelevant. that is my plan for a while. it is great to work on that. we have humongous issues on
the planet that we need to work on, and the ability to tackle those is much more interesting. >> thank you. that is all of the time we have. >> thank you.