RAND Discussion on Russia Social Media Influence | C-SPAN | March 18, 2019, 2:00am-2:39am EDT
>> on friday, rand researchers spoke with congressional staffers about countering russian social media influence and disinformation. this is 35 minutes. >> good morning, everybody. or good afternoon. good afternoon. my name is jamie fieldson, i'm the director of relations for the rand corporation. it's my pleasure to welcome you to our briefing called countering russian social media influence. before we start, i'm going to share a few quick items of housekeeping. first, our briefing is being recorded. we will make the full presentation available online on our website about a week after this event. it will be posted on www.rand.org.
today's briefing is also being broadcast live on c-span, so those are the cameras there. next, i want to encourage you to join our conversation online. you can do so using the #russiandisinformation. finally, i want to tell you a little bit about rand. rand is a nonprofit, nonpartisan research institution that conducts research and analysis. our research focuses on a wide set of issues, including education, health care, national security, energy, and a lot more. we disseminate our findings and information as widely as possible to benefit the public good, and we have 9,000 publications available for free on our website. for this audience, i would like you to know that rand research and expertise is available to your staff and to your bosses here on capitol hill. should you have any questions about today's briefing or other issues that you're working on,
you can always contact me or a member of my team and we're happy to connect you to the right experts. we want to make sure you have the research behind the many issues that you are confronting here on the hill. i want to tell you a little bit about why we're here today. russia's disinformation effort starts from the very top, with russian leadership, moving through russian actors and proxies and through channels of amplification such as social media, and finally reaching u.s. media consumers. importantly, we're going to focus on the different steps policymakers can take to help combat this in the united states.
i'm delighted to be joined by elizabeth bodine-baron. she specializes in complex networks and systems. she's the associate director of the force modernization and employment program for project air force, which is a program we run at rand. her research includes network analysis and modeling and system science, and she got her ph.d. from the california institute of technology. and with that, we will let her
start our briefing today. >> thank you very much for coming. can everyone hear me just fine? thank you for taking the time today to come to this briefing. i'll spend about 15 or so minutes talking about the research, and hopefully we should have plenty of time for questions and answers and discussion afterwards. i'm sure you have heard the headlines of russian influence in the united states over the past few years. these are four ads that were promoted on facebook at various times. i'm curious, which ones do you think were actually sponsored by russians? black lives matter? no? blue lives matter? i see you guys are a well-informed audience. it was a trick question. what we saw here was a classic example of the russian strategy of playing both sides against the middle and seeking to widen
existing divisions in our democratic society. so that was just a bit of a test. clearly i'm working with a much more informed audience than i'm used to. russian disinformation, again, we've seen lots of headlines. russia developed it to control internal information and then exported it to broader audiences within the former soviet sphere of influence and most recently within the united states. what is different about what we've seen in the last few years, as noted in the intelligence community assessment, is that this is the most recent expression of a very long-standing desire to undermine western democratic society, in particular as led by the u.s. what we see is a significant escalation in both the tactics and scope of this effort, starting well before the 2016
election and continuing today. russia tends to employ different messages for different audiences for very different strategic goals. when we look at the study we did a few years ago, what we saw was that the overall approach is to exploit very contentious issues and a vulnerable ethnic population to drive wedges between this population and their host governments, and ultimately to push a very pro-russian, anti-u.s., anti-nato message. we see a very different approach when we look at the united states and europe. there the goal is to sow confusion and stoke fears by exploiting existing divides, playing both sides like we saw with the u.s. ads about police forces and minority communities, in order to ultimately erode trust in western democratic institutions. different approaches, different tactics for different audiences. but that's just a basic
background. this briefing and what i'll be talking about today is really not so much about the how. there's a lot of really good news reporting out there, as well as several academic and private sector reports that you can read, several from the rand corporation. what i'm going to focus on today is really more about what we can do about it. what can the public sector, the government working with the private sector, and academics do to combat this threat today? currently, efforts to combat russian disinformation are really fragmented and incomplete. we have efforts by the social media companies themselves, whether it's facebook or twitter, to identify and remove disinformation, to update their terms of service and their user agreements to make sure that this is not allowed, and then actually identify and remove it when they see it. we have efforts within the f.b.i. and the d.h.s. through their countering foreign influence task forces.
there are legislative efforts such as the honest ads act, which ended up not passing, to force social media companies to reveal the funding of political ads that are purchased on their platforms, similar to what's currently required for radio and television ads. interestingly, for the 2018 election facebook voluntarily did this. they required funders to be disclosed for political ads and have made the entire process of purchasing ads more transparent. then there are academic efforts, whether from think tanks or academic institutions, to identify propaganda or inauthentic behavior, things like bots amplifying messages on social media. but there's nothing that is coordinating between them in order to produce an entire suite of approaches to combat this. and as a result, while each may work in certain cases, and certainly none of
them are bad approaches, there's no overarching strategy. so the research that i led with a number of other rand researchers was to look at a whole bunch of news media and academic reports on this phenomenon. we convened a workshop of various experts, including legal experts, representatives from the social media companies themselves, people with experience in media, and people with experience in influence and information operations from the intelligence community and the department of defense, pulling them all together to understand the potential approaches that we could take to combat this threat. and to talk about it, we broke the problem into several steps in a framework that we call a disinformation chain, which allows us to characterize how disinformation works: how does it start, who is the target, and how does it get there? that lets us understand both what russia is doing -- when, where and how -- and also
what we can do about it. i'll note that this disinformation chain is not unique to russian disinformation; with some tweaks it could be applied to any adversary trying to influence the citizens of essentially any country. it's basically just a framework to talk about the problem and to break down solutions so that you understand what aspect of the problem a different solution is targeting. so, starting out with the disinformation chain. first is leadership: someone in the leadership has to make a decision that they are going to perform an influence campaign, and it's going to target this audience with this sort of message. in the example that we're talking about today, we're really talking about russian leadership. then we have russian proxies. in this case we can break them up into two different groups. one is attributed media, state-sponsored domestic media as well as state-sponsored international media; the other is unattributed groups.
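as a quick illustration, the four links of the disinformation chain -- leadership, proxies, amplification channels, and targets -- can be sketched as a simple data structure. this is purely an illustrative sketch of the framework as presented in the briefing, not code from the rand report; the example actors are paraphrased from the talk.

```python
# Illustrative sketch of the four-link disinformation chain described in
# the briefing. The link names and example actors are paraphrased from
# the talk; this is not code from the RAND report.

DISINFORMATION_CHAIN = [
    ("leadership", ["russian leadership deciding to run a campaign"]),
    ("proxies", ["attributed state-sponsored media",
                 "unattributed groups with ties to a nation state"]),
    ("amplification", ["social media platforms", "real and fake accounts",
                       "bots", "u.s. news media", "unaffiliated websites"]),
    ("targets", ["u.s. citizens and decision makers"]),
]

def describe_chain(chain) -> str:
    """Render the chain as a simple arrow diagram of its links."""
    return " -> ".join(link for link, _ in chain)

print(describe_chain(DISINFORMATION_CHAIN))
# prints: leadership -> proxies -> amplification -> targets
```

laying the chain out this way makes explicit which link a given countermeasure is targeting, which is how the briefing organizes its policy options.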
the classic example is the internet research agency, ultimately with ties to a nation state. then we have the amplification channels. the distinction here is not where the message is being generated but where it is being spread. how is it being amplified? this is both platforms and individuals: social media platforms, facebook and twitter, as well as accounts, both real accounts that are actually manned by humans and fake accounts, people pretending to be someone else; bots, particularly used for retweeting and amplifying a message, trying to get it higher on the trending topics on twitter or push it higher in your news feed, taking advantage of the algorithms to grab your attention; and the u.s. news media, which may, without fully realizing it, amplify all of
these messages. unaffiliated websites also fall in here. finally, we have the target of the disinformation campaign. in this example it is u.s. citizens and decision makers, though russia's targets have also included european citizens and nato. so what are our options? starting with leadership, the goal is essentially to shape moscow's decision making, the deterrence approach. we looked at different options for different policy solutions that could be used to shape moscow's decision making. a few are listed here. they're discussed in detail in the report that you all have a copy of. we basically went through different ideas and talked about the pros and cons and the costs of these different approaches. just because i have it listed up here doesn't mean that i recommend doing it. for example, from a defensive perspective you can make it more difficult for them to succeed
and deter russian leadership from engaging in the first place because it's not worth the cost. you could look at taking a more offensive approach, whether that's using sanctions, either economic sanctions or political sanctions, or even going more offensive, looking at things like promoting democracy within russia. this is one of the options that we looked at, and no, it turns out the cons of that are quite large; not a good approach. some things that are kind of more in the middle ground look at enforcing clear norms of behavior on these platforms for nation states. what's allowed? what's not? and perhaps coming to a shared agreement about those norms. we can also -- the air conditioning is quite loud in this room. can y'all still hear me? ok. on limiting these proxies, one of the key things here is you need to be able to detect them and identify them. naming and shaming is a common approach, where you say this is a russian proxy.
they're pretending to be someone else, but pay attention, they're actually sponsored by a particular actor. we can look at deterring or curtailing those activities and at defining and monitoring norms: what are media entities allowed to do on social media platforms? looking at the amplification channels, the goal here is to limit the impact of the spread of disinformation on those channels. part of that is being able to identify and detect it. once it's detected, either remove it or counter it, provide an alternative narrative, something like that. this requires a lot of close cooperation not only among private social media companies but also with the public sector. finally, looking at the last link, we want to improve consumer knowledge and judgment. one of the most commonly heard themes here will be media literacy campaigns.
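on the "identify and detect" step for amplification channels, researchers often start with simple behavioral heuristics before anything more sophisticated. the sketch below is hypothetical: the thresholds, field names, and accounts are invented for illustration, and real bot-detection systems use far richer features.

```python
# Hypothetical heuristic for flagging accounts whose behavior resembles
# automated amplification -- e.g. very high posting rates made up almost
# entirely of retweets, especially from newly created accounts. The
# thresholds and account records here are invented for the example.

from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_per_day: float
    retweet_ratio: float   # fraction of posts that are retweets, 0.0-1.0
    account_age_days: int

def looks_like_amplifier(acct: Account,
                         max_rate: float = 50.0,
                         min_retweet_ratio: float = 0.9,
                         min_age: int = 30) -> bool:
    """Flag accounts posting at bot-like rates, mostly retweets,
    or retweet-only accounts that are very newly created."""
    high_volume = acct.posts_per_day > max_rate
    mostly_retweets = acct.retweet_ratio >= min_retweet_ratio
    very_new = acct.account_age_days < min_age
    return (high_volume and mostly_retweets) or (mostly_retweets and very_new)

accounts = [
    Account("patriot_eagle_1776", 120.0, 0.97, 12),
    Account("local_reporter", 8.0, 0.2, 2400),
]
flagged = [a.handle for a in accounts if looks_like_amplifier(a)]
print(flagged)  # prints: ['patriot_eagle_1776']
```

in practice a heuristic like this is only a first-pass filter; it produces false positives (some humans retweet constantly), which is exactly why the briefing stresses research access to real-world data for validating detection approaches.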
make sure people can distinguish between fact and fiction. probably a good idea. but the question is really, how effective is that? and is it worth the money and the time and the resources that are invested in it? unfortunately, there's a lack of rigorous research in this area. that could be an area to promote: really understanding what makes an effective media literacy campaign. what are the elements of it? how should it be funded and enacted? you can also look at publicizing the impact of disinformation on social media, making people aware that this is a problem, helping them understand from a personal perspective not only how to distinguish it but why it might impact them. why is it something that an everyday citizen should care about? no single solution is perfect. they all have different pros and different cons. what we espouse is a suite of solutions that can target multiple links of the disinformation chain so that you're
coming at it from different perspectives. the first is establishing clear and enforceable norms of acceptable behavior, both for nation states and for the media entities on these platforms. it's possible that existing treaties could serve as a model in this area, looking at the treaties prohibiting the use of chemical and biological weapons, with several nations signing on to them and agreeing this is allowed, this is not allowed. but attribution is going to be critical. how do you establish that this piece of disinformation was pushed by this nation state in violation of that treaty? that's the big challenge here. but nevertheless, it's an area that's worth investigating. we can also look at better coordinating u.s. government activities, from the legislative branch to the executive branch, looking at the task forces already existing within the f.b.i. and d.h.s. and making sure
they're connected to efforts in developing legislation in congress and to some of the foreign policy and foreign affairs efforts being pursued by the department of state, and bringing in both the intelligence community and the department of defense, where a lot of the expertise in terms of understanding information operations lies, making sure that they're sharing that expertise with the other elements of government. on getting people to work together, the question is how do you actually make this happen? is an executive order the right way to go to make these different organizations play well together, rather than having them volunteer to do so on their own? that is an open question. one of the most important recommendations, and the one that has the best potential for succeeding, is to institute a formal organization for information sharing. in the 1980's we had the active measures working group, which exposed russian disinformation
in news media. it was run out of the department of state. that has since been disbanded. there's still a working group and things like that, but it doesn't have the same power and authority that it used to. the key difference between that effort in the 1980's and what we're proposing here is that you have to loop in the private social media companies. they have to be willing to buy in. so this organization will need to have proper authority and expertise and be properly resourced, and, like i said, involve both public and private sector engagement. another thing that could be helpful is trying to increase the transparency of the algorithms and the policies on these social media platforms. how are they detecting disinformation? how are they removing it? how are they defining it? and making that clear not just to the government but also to individual users of those platforms. the question is really, is it even possible to incentivize these private companies to
increase that transparency? these algorithms are what drive their profit models, and they will continue to drive their profits in the near term, determining what you see on your news feed or as a trending topic on twitter. the same goes for the policies in the user agreement. is there a way to increase the transparency of these rules without enabling people to get around those rules? this is an interesting area where more research could be performed. and along those lines, how do we encourage and fund academia to build those tools? this could come from government sources or private sources like foundations, through research grants. i think this is an area that is really ripe for more investment and funding. the twist here is that in order to do this research properly and rigorously, they need
access to real-world data. this is going to require cooperation from the companies. if there's an approach to detect and remove disinformation or provide an alternative narrative, how do we, for lack of a better word, audit that new approach? can we say it is effective and it is worth the investment that we're putting into it? that is going to require some sort of cooperation. so how do you enable access to real-world data while maintaining user privacy and protecting these companies' profits? one of the things that we think is really important in this discussion is essentially don't go overboard: prioritize defensive activities over punishments. we think that the consequences of engaging in overtly promoting democratic activities and democracy within russia have a much larger downside. we run the risk of being called
hypocritical. maybe you're ok with that. but the problem is we would then be engaged in an arms race; it's a free-for-all. if we prioritize the defensive activities, if we make it harder for disinformation to succeed in the united states, that is what has the most promise for actually being able to shape moscow's decision-making. and finally, it's really important to continually assess the impact and cost of these proposed solutions. this list is what we think right now could work. two years from now, 10 years from now, this may not work. new technology may have appeared. it's important not to say we did this thing and we solved it and we're done. you need to continually assess: how effective are these various solutions? are they worth the cost and the resources that we're putting into them? are they actually effective in terms of reducing the overall amount of disinformation and the impact of disinformation on
these various platforms? so in summary, just to give you a quick recap of what we think are some of the top policy actions that could be undertaken now: increasing government coordination; establishing and enforcing norms for nation states and media entities on social media platforms; prioritizing defensive over offensive activities; improving the tools, through funding of academia, private companies, think tanks, whatever it might be, improving our technology to be able to say this is a piece of disinformation and this is how it's impacted someone; standing up a formal information-sharing organization, giving it the appropriate resources, making sure it has access to the right expertise, and making sure it actually has the authority to do something about it. part of the role of this organization would be to figure out a way to incentivize the transparency of the algorithms on these private social media
companies; and finally, assessing and improving the solutions, not just doing a one-and-done approach but actually taking a comprehensive look and asking, is this the right way to go? should we continue down this path or do we need to change direction? i think without engaging in at least some of these activities, this is a problem that's only going to continue to grow. we're going to see more and more erosion of trust in our democratic institutions. this is something that needs action now, not just from the legislative side but from the executive and other branches of government. thank you very much, and i'm happy to take questions. [applause] >> you talked about incentivizing companies to change their algorithms. how do you -- i mean, their algorithms basically are like a competitive
advantage for them. so how can you incentivize private companies to give up their algorithms? >> so i think that's a great question, and i think it's an area that needs more research. just to repeat the question in case you didn't hear it: how do we incentivize these private social media companies to increase the transparency of their algorithms, given that those algorithms give them a competitive advantage? i agree, it's definitely a big challenge. there are advances in technology that could be used and implemented. you see some of the advances in data privacy -- and forgive me if i get a little technical here -- privacy-preserving algorithms, where you can perform computation on an algorithm and assess it to combat disinformation without actually revealing anything about that algorithm that's powering the news feed or something like that. so that area of research has
advanced sufficiently that it could be applied in this area. i think there's also, you know, working through public-private partnerships, where you have representatives from the companies working together with researchers and the government to attack this problem together, so that they're sharing the relevant information, perhaps within that organization, with judicious use of nondisclosure agreements. but there needs to be some sort of audit mechanism, because these companies are doing things to fix this problem, but how do we know that's actually working? how do we know it's not just for show, to prevent them from being dragged before congress over and over again? >> do you have an idea of whether it would be governmental or nongovernmental? >> she asked me to describe where this organization would be housed. so it definitely needs to be a
government-led organization. if you look at the different organizations that are already playing in this area, you know, the department of homeland security really has the mission to protect the homeland, and the f.b.i. has kind of more of an investigative role. so it probably would make sense to expand the role of the countering foreign influence task force that already exists within d.h.s., but most importantly, give them the actual authority and resources, which they don't really have now, to do this coordination and this information-sharing role, and actually require participation from the private social media companies. i am not a legislative expert, so i don't know what the appropriate avenue for requiring that might be, but perhaps that could be some sort of legislation that says this is the organization that has the authority, we've resourced them appropriately, and in order to
have increased regulation we need to have active participation from these private companies. yeah. >> i have some questions. first of all, i'm wondering, you mentioned media literacy at the beginning, but i wanted to know why you didn't develop that further? and the second question is, what do you think the role of news media organizations should be in all of this? >> ok. so i'll talk through the first question first: the role of media literacy. i think media literacy is important. the reason it hasn't risen to our top recommendations is basically because there's a lack of research on literacy campaigns. what elements are really useful in them? i certainly don't think it's a bad idea to have media literacy campaigns. i think they could only improve the situation. but the question is really how much in resources should be put against that versus some of
these other approaches, to understand the ultimate impact of that. and the second question -- media organizations. right. so what is the role of media organizations in this? i think it would be interesting to promote some of the efforts that are already being done by media organizations to promote literacy. some of the various education opportunities that are out there -- there's a lot of really great resources available on the internet that could be promoted by media organizations. but i think it's also a commitment to the quality of journalism, to ensure that they are not inadvertently spreading a piece of disinformation. you know, i don't know if regaining the culture is the right way to put it, but it's a problem in our current news cycle. there's something new coming out every single second, and they have to jump on it, otherwise they lose the story. i think, you know, balancing
that with a commitment to quality and ensuring that something is true and accurate is an important way to go. >> in the category of reinforcing norms, specifically naming and shaming, if that message is not nested all the way up to the top level of government, how can it be effective? >> let me make sure i understand that question: if naming and shaming is not supported all the way through the top level of government -- so i think that is a valid point. it would be better, certainly, if there were consistency in messaging across the u.s. government. but these efforts, i think, do have the potential to succeed in spite of that, if they are appropriately resourced and pursued.
and i think it is not even just the naming and shaming; it is all of these approaches. making sure there is enough coordination happening and enough support and resources behind them that they have the elements to succeed. without a whole-of-government approach it is always going to be fragmented. >> you mentioned, when you were talking about prioritizing defense over offense, it sounded like you were talking about stepping back from promoting democracy in russia, which is something that the russian government very much wants. is there a danger that we are rewarding bad behavior? >> that is an interesting question. the question, in terms of stepping back from promoting democracy within russia, is, is that rewarding bad behavior? i think that is an interesting perspective to take on it, not
necessarily the one that i would have on it. i think that taking this step back and looking at the larger picture, if we are really talking about deterrence, do we want to get into and create a world where we are trying to one-up each other? are we ok with the consequences of that? you will hear people that are ok with that. but i think if you look at the way we have talked about it in our report, our conclusion is that it is not worth the price that we would pay, and we would ultimately be more effective if we generally make disinformation not work against our society. >> is it sufficient to make sure that united states society is prevented from being affected by disinformation, or do you feel there is a need, perhaps not to promote democracy
in russia, but to make sure that in other countries, be it in latin america or europe, russian disinformation is not effective? secondly, could you talk about the role of u.s. partners as well? >> certainly. the focus of this presentation and this research was specifically within the united states. but we have done some research looking at u.s. allies and partners and disinformation around the world. i will call out my colleague over there, who is the author of a great report about understanding russian influence in eastern europe. so the role of partners and allies in that arena is incredibly important, and we should make sure that we are supporting those efforts from both a resources perspective and a foreign policy perspective, whether it is through various efforts with local media organizations and promoting their capabilities. so i would say there is really
no difference in my mind when you are looking at eastern europe or south america. the role of partners and allies is going to be critical to this. and yes, it is important to do it at home, but if we neglect partners and allies, we will end up on our own everywhere. it is of importance, just not so much my focus today. >> any other questions? >> one of the points you made was coming to an agreement with russia. who would work on this? >> she's asking about coming to a shared agreement with russia. that ended up not being one of our top recommendations. when we talked about it, we found that there was a problem: just that effort on its own is not going to be particularly
effective, given the potential for violations, and how do you do attribution and all of that? we couch it more in the recommendation of establishing clear and enforceable norms. looking at the approach of something like the chemical or biological weapons treaties, getting multiple nations to sign on and agree, the key there is agreeing on what is and is not allowed and everyone signing on to that. you can't just say disinformation, because we all have different definitions of what disinformation means. so it means actually being able to say in a concrete manner this is allowed and this is not allowed, and then being able to perform that attribution. i think out of all of our top recommendations that is probably the most challenging one. if it could be executed, if we could figure out a good way to perform that attribution, it is
potentially the most effective one, but it is certainly the one that has the most challenges, because it requires a willing partner. any other questions? well, that solved everything. >> thank you. [applause] [captions copyright national cable satellite corp. 2019] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org.]