
tv   Stanford University - Investigating Facebook  CSPAN  March 18, 2019 10:05am-11:31am EDT

african-americans and southern unionists. speakers included pulitzer prize-winning author jon meacham and elizabeth varon of the university of virginia. the event was co-hosted. watch tonight beginning at 8:00 eastern on c-span3. now, producers of the pbs documentary "the facebook dilemma" talk about their investigation into the social media platform and its impact on global privacy and democracy. stanford university hosted this event. it's an hour and 25 minutes. >> so, tonight's symposium follows the mantra, think globally, act locally. sitting here in encina hall, we're five miles from one hacker
way, the headquarters of facebook. this social media platform has great local impact. bay area students strive to intern there. alums build careers there and faculty offer the company both consulting insights and criticism. the decisions made by facebook in menlo park echo across the world, affecting the platform's more than 2 billion monthly active users. last october, pbs aired a two-hour documentary entitled "the facebook dilemma," which investigated the company's impact on democracy across the globe. tonight, we're very fortunate to have three people involved in that reporting project to explore the story behind the story, including the particular challenges in covering a powerful social media platform. they are, anya bourg, james jacoby, and dana priest. after brief introductions, we'll turn to a panel discussion about
how "the facebook dilemma" came to be and what takeaways these journalists have from their experiences in putting together the documentary. we've provided audience members with note cards and as questions occur, please feel free to write them down. they'll be collected and forwarded to me so that for the last half of the panel, we can focus on questions from you. now, for the introductions. anya bourg is an award-winning producer and journalist who joined frontline's enterprise journalism group in december 2014. it was a return after almost a decade having begun her career as an assistant producer on frontline's "the soldier's heart." in between, she spent nine years at "60 minutes," working on stories that ranged from the violence in mexico to the destruction of coral reefs to the lack of accountability for prosecutors accused of misconduct. bourg has also worked in radio, reporting a one-hour documentary for "this american life." she graduated from the university of california berkeley and columbia university's graduate school of
journalism. james jacoby is a producer for "frontline" where he's a founding member of the enterprise journalism group. in addition to "the facebook dilemma," james recently produced "war on the epa," which investigated how scott pruitt went from fighting the environmental protection agency to running it and rolling back years of policy. his film, "out of gitmo" told the dramatic story of a guantanamo detainee released from the controversial u.s. prison after more than a decade. in collaboration with npr, the film illustrated the struggle over freeing prisoners once deemed international terrorists. before joining "frontline," james worked for "60 minutes," where he produced investigative stories with correspondent steve kroft. his investigations revealed wrongdoing by, among others, major banks, credit reporting agencies, disability lawyers, and arson investigators. prior to joining "60 minutes," he worked for cnbc, dan rather reports, current tv, and "the
nation," reporting on a range of topics from youth politics in pakistan to the european debt crisis to the rebuilding of new orleans after hurricane katrina. james has received several honors for his work, including two globe awards. he's a graduate of the university of pennsylvania. dana priest has been a reporter for "the washington post" for 30 years and is currently the knight chair in public affairs journalism at the university of maryland's merrill college of journalism. she covers mostly national security issues and has been a reporter and contributor to pbs frontline and a contributor to nbc, cbs news, and there's a theme here, "60 minutes." priest has received numerous awards, including the 2008 pulitzer prize for public service for "the other walter reed," and the 2006 pulitzer for her work on cia's secret prisons and counterterrorism operations overseas. she is the author of two best-selling books, "the mission: waging war and keeping peace" and "top secret america." she's also the cofounder of press uncuffed, an online publication of student journalism and research on censorship and press freedom. thank you all and thank the audience for coming here on what has turned out to be a dark and stormy night in palo alto. so let's begin. james, where did the idea for this documentary come from and how did you initially start the reporting process? >> the idea initially came -- anya and i were just trying to remember that this morning, actually. because it was amidst a kind of -- we were working on the epa film and anya lives out here in the bay area and was kind of recognizing that there's a story to tell about -- this was after the 2016 election. you know, there was a lot of talk about russian interference and all sorts of security concerns and basically, what
happened was, our epa film aired in october and in november of 2017, the testimony of the general counsels from facebook and some of the other tech companies appeared in congress. and that was really the moment when anya and i decided just, this is something we need to look into, in part because of the non-answers from the attorneys at that point for the company, about what had happened during the election and we just thought it was good terrain to look into. >> and what was the first thing that you did? >> the first thing that we did, i mean, the first thing you do anytime is just read up as much as you can. and then, i think, talk to people about who are some of the smartest people in the field, to talk to. both critics of these companies, as well as people that have worked for them. and so, i mean, anya's really expert at finding current and former employees to really speak
to. and you really do speak on background at first. and just get the lay of the land. so it was really to create a database of people that we could reach out to and have just on background conversations about their time at facebook or their thoughts about the company. >> so, anya, i know when you walk into a tech firm, you're greeted with a smile and a nondisclosure agreement. and i think employees sign these ndas and company lawyers often police them. how did you get people who signed ndas to speak with you and did lawyers ever get involved? >> it was actually interesting. this was one of the most difficult stories i've ever -- i had such a hard time getting people to speak, even off the record. people were just incredibly nervous. so there was a lot of networking, like james said, a lot of just going around and talking to one person, they would introduce you to somebody else. but people seemed very reluctant to go on camera or even give
their name. i think it's a tight community and people are worried about appearing to be critics of their previous employer, when really, we weren't looking for criticism, necessarily. but i think there was just a lot of reluctance. >> the other reluctance for the ndas was about naming specific people or talking about specific events. so a lot of people that ended up speaking on the record didn't want to talk about anything specific. a specific person, especially. so i think that specificity may have been a trigger for some of the agreements. and then there were lots of people that surprisingly -- not lots of people, but a few people that were kind of unconcerned with the ndas. and thought, well, you know, fine, come after me if you're going to -- if i'm going to tell the truth. there was one -- >> i don't remember -- >> there was one person. >> most people -- >> a few more than -- >> did you ever get the sense that somebody was going to talk to you and then maybe lawyers had contacted them and they --
>> yeah, we did have that experience. >> yes, definitely. or the company had -- someone at the company. not necessarily lawyers. >> so that's a great transition for dana. you've covered intelligence agencies, including the cia, so how did reporting on facebook compare to what you experienced in reporting on intelligence operations? >> well, as a journalist, it rang all the alarms. because, honestly, they were more worried than the cia people or the former people who worked at the agency, who take pledges, you know, and spend their whole lives in a classified arena. and i found that these people who had honestly, i thought, much less to lose, because there was no way that someone who started talking to us would actually be legally prosecuted by them. that would bring so much publicity, but they were culturally so scared. and i started looking -- you
know, i got involved -- in part, i had already started looking at facebook for the post, and talking to an intermediary, phil bennett, who brought -- you know, brought me into the project. and, you know, just from that experience, the comparison between talking to cia people about very classified information, dod people, about operations, and here i was, you know, in civilian america, having this very strange experience. and part of it was that a lot of these people were very young. so they probably were scared to death. but how -- how does a company that claims it is a community based, you know, happy company that's positive, instill that in people. and so, like i said, it rang every alarm bell. and then we knew -- my particular interest was what was
happening overseas, because i had done a little bit of work on that, and also, i followed just some basic, you know, journalism -- there's some basic things. some basic rules for stories. always follow the money, which we didn't do in this film, but that's a tried and true kind of mantra. the other is, who knew what, when. and that is one of the main themes that we started to pursue. once we got a handle on a little bit of what we were going to do as a film, then we wanted to say, well, who knew what, when? and we particularly drew that out in the foreign area. but for people who aren't journalists, the one thing that i would like to explain is even though a film or an article may come off as very -- not authentic, but very authoritative, usually, you do not start with "a" and get to "b." it's like this very squiggly
line to get to the end. and that's like -- that is a process that i think every story follows. and it did in this one, too. >> and when dana joined us, i think in part what dana brought to it was that thinking about the who-knew-what-when aspect of the story and also kind of conceiving of the two-hour film, we knew we were going to have to go chronologically. and one of the interests was kind of, who was talking both internally and externally at the company and kind of warning about all sorts of things that then reared their head and ended up on the front pages after the 2016 election and onward. whether it be privacy concerns or concerns about, you know, malicious actors and disinformation campaigns, the problem of misinformation, all of those problems. it was a really helpful way of us kind of structuring our reporting, to try to fill in
chronologically, what was happening in terms of insiders and outsiders, talking about some of the issues that ended up being major problems for facebook, later on. >> so, tv works best with pictures and pictures need a narrative arc. and i wondered, as a storyteller, what choices you made to visually convey how the company was operating and what its impacts were? >> so we had a phenomenal archive producer, a co-producer on the film, megan robertson, who is just expert at finding footage. and really kind of given the mandate by our executive producer to think of archival footage as really investigative reporting, where you're really digging into the archives to see what it was that the principals, whether it's mark zuckerberg, sheryl sandberg, or others were saying about particular issues that we were then going to see, investigate or test whether that
was actually what was happening internally inside the company. so megan relied at first on something that was from the university of wisconsin, which is called the zuckerberg files. and it was a research professor there who was assembling everything that mark zuckerberg had said, since he basically became a public figure in 2004, 2005, whether in video, audio, or print. and chronicling it. and then megan, there were lots of holes in the archive, but megan would then research that. so basically, that was one way to bring this to life, was with the archival footage and the statements of some of the principals in the film. >> so in setting up new interviews, anya, i know at both "60 minutes" and at frontline, you've been able to figure out some way to get people to talk about extremely controversial topics. and i wondered what happens when they regret what they said.
>> um, i mean, the main way we get people to do interviews is being genuinely interested in hearing their side of the story or their perspective. and we kind of communicate to everyone that it would be better to know your sense of what happened -- if there's something that you're worried about, tell it to us, so we can really understand the full scope. we have had instances where people are not happy with their interviews. >> well, you know, there were a couple of people i can remember during the interviews, they said, you know, i want to tell you something off the record. so either say stop the cameras or something. and even if they didn't say stop the cameras, we would, of course, honor that. so -- and i think that, you know, we always said in the beginning, if there's something you don't want to have on camera, that's fine. because part of the process is figuring out what the story is. and so there's a lot of pieces -- one is figuring out what the story is.
and the second is, what can you get on camera about what you figured out the story is. and they're not always the same. so but knowing what the story is, is the most important thing. because then you can go to -- you can always find ways to get it somehow on camera or in the narrative, even. and so, the first half of the film, for those who haven't seen it, is largely about the business model that changed facebook from something that, you know, we know best, zuckerberg's, you know, connecting you with your kids and family and grandchildren and all of that to a multi-national corporation that is everywhere and so dominant. and we even had to sort of train ourselves and it didn't really work all the time, to call them a corporation. we actually never do that in the
film, although, you know, i remember trying in the script, and it just felt so weird, because they were so good at telling you there was something else. and so for those of you, including myself, who are not huge facebook users, and still didn't, in the beginning, understand the business model, you know, that's what we wanted to explain. and that is a huge -- when you go and you become a public company, no matter what company you are, the dynamic of that company becomes the same, which is, you have to please your stockholders. and you do that by making more money. and that's where the business model starts to change, and does the thing in the end that ends up doing some very controversial and i would say bad things, unethical things, so that's -- >> to go back to the process one more second, we should say, we do like three-hour interviews
with people and they're very meandering and we don't always know what we're looking for and sometimes we stumble on things we weren't anticipating talking about at all and very often in the middle of interviews, people will say, oh, we're running up against something that i feel nervous talking about because of my nda, or we're in an arena where -- so we want to respect people's boundaries. and similarly, when people call us afterward and say, i'm not comfortable with something that i said, we'll always hear them out and try to understand what it was, if they said something that they thought would put them in danger or something that was proprietary, or something they were embarrassed about. in this case -- can we talk about the main regret? someone from facebook was very unhappy with their interview. >> oh, yeah. yeah. >> and that was difficult and we had to really listen to their concerns. that's not something -- it wasn't off the record, right? >> no.
>> but at the end of the day, they felt they weren't properly prepared for the interview and that was on them. >> right. >> because we had properly prepared the public affairs people that were our intermediaries, telling them, basically, what we wanted from these two days at facebook -- >> one last thing i would add to that, "frontline" does something unique, which is what we call the transparency project, so for most, if not all, of the interviews we did for the film, and we did dozens, we published the entire transcript of the interview online. and in both video and text -- you can watch the entire interview and read the entire transcript. and one, we feel like it's a public record and important for the public to know, and also to judge whether we have, as we hope, properly characterized someone's story as he or she told it. and it also defends us, i think, very well against allegations of being fake news in some way or biased. because it's all there, the source material that we're working from and it's a very strange thing to do that, because no one really does that, but it's really important to frontline to do that, as a public service and so we are transparent about that. so people that have been kind of unhappy with how they've been edited and things like that, it's, you know, that happens occasionally. but i think one, generally, we do a good job of that, of being fair and hearing people out and properly characterizing them. that's what we do. but also, their transcripts are out there. >> so there are 29 of those. and one thing i think is very interesting is that you have highlighted in the transcript
and in the interview what you used in the actual documentary. so if there are students looking for a machine learning project that would look at their editorial judgment, you can actually see out of an hour, how did they select 30 or 40 seconds. and the favorite one that i watched was with trump's -- president trump's current campaign manager, brad parscale. and i wanted to ask you, what is it like to interview somebody who says directly to your face that the press are the enemy of the people? >> well, he tempered that to some degree after we had an exchange about it. and one, it's odd that we actually had that exchange in an interview about facebook. that's really what the interview was about, was for a film about facebook, how the trump campaign used facebook as an advertising tool and also what had happened, how he responded to the idea that there was a disinformation
campaign, that may have helped his candidate. but we did get into a discussion about the enemy of the people charge. and then he kind of dialed back a bit and said, you know, said that, not all of the press is the enemy of the people. and he feels as though -- you know, it's a long and laborious exchange that happens to be out there. and what's really weird about it, though, is that when you put your transcripts out there and you are transparent about it, on youtube, the vitriolic comments about the interview, as if it was some sort of battle between me and brad parscale, it's kind of astonishing, where they think that there's a lot of comments, predominantly comments that i'm left-wing media, challenging him on things. it's a strange thing to have your transcripts out there, but i think it's fantastic. i mean, people can judge for themselves.
>> he also tweeted a link to the entire transcript. so he was glad it was out there. >> yeah. >> and speaking of the transcripts, which you can see online, right -- you know, for students looking for papers, the two really interesting people that are not in our film, that are very thought provoking and i would recommend looking at: one is dipayan ghosh, who was the privacy expert at facebook. he had previously worked -- he's a ph.d, you know, in computer science, i think. and his explanation of what he thinks of as privacy is very different than, you know, what probably most of us would think of. his is the right not to be -- the right to not have your data taken in order to manipulate you when you don't know it.
so it's not just not knowing where i live or my social security, but it's this idea of unknown manipulation. it's very -- it's well worth reading. the other one is don graham. so we got don, you know, the former owner of "the post" to -- we tried to get him to talk about zuckerberg, because he was on the board. and he wouldn't really -- he wouldn't really tell us much that was that new or interesting. but what is interesting is what he says about who should be the regulator of speech and, you know, do you really want the government to do this. or do you want private industry that is somewhat reactive to public demand to do it and to their customers. and i think he makes a very good counterargument to those who say that it should be the government. and that same week that he was talking, vice president
pence was dissing google, i think it was, claiming that they were -- their algorithms were tweaked towards the left. and so he pointed to that. anyway, those would be the two that i would recommend that are not in the film. >> well, following on that, james, one person's responsible operation of a social media platform is another person's censorship. so i wonder how you view the role of facebook as a global decision maker about the elevation or suppression of content that could excite violence or create social division? >> it's -- that's the dilemma, right? so we entitled it "the facebook dilemma," and we had a great conversation with alex stamos, who's the former chief of security for the company. and basically, knowing better than anyone, he laid out a lot of the dilemmas about, okay, be careful what you wish for, right?
you have an incredibly powerful internet platform here. you have other incredibly powerful internet platforms. and what happens if you draw out a scenario where they become proactive now in kind of regulating speech and what happens further down the line when there are ai tools to detect speech and take it down before it's even posted. and you could end up with an orwellian scenario when it comes to speech. and also the main question of thinking about leadership with these companies. you know, as critical as the film may have been or comes across when it comes to mark zuckerberg, with someone else in charge further down the line, who may not -- who is under no obligation to keep it a quote/unquote neutral political platform, what could happen if there were internal decisions to be made that were, in fact, biased. and where the algorithms were biased.
so that's a scenario that we, you know -- it's pretty frightful. and i don't know the solution to that. i think that that's something that smart minds at campuses like stanford should be discussing as to what -- where do we want to go with that? where do we draw those lines? i think it's a question that facebook itself takes really seriously and is really thinking about, and they have been reluctant to exercise their power. because once you take responsibility for speech, you kind of own it. and so, taking a much more libertarian approach to what's on the platform was not just good business sense in some way, but also, there was a philosophy behind it. i'm actually quite afraid of the alternative, of what stamos and others talked to us about, about where this could lead, because these are basically unaccountable companies that have tremendous power over the mediation of speech in our
society. and one thing's for sure, there just needs to be like some transparency in order to judge what they're doing and how well they're doing it. >> so it's interesting, you talked about our society. and dana, in the film, you present evidence that facebook's operations destabilized democracies. in myanmar, the philippines, and the ukraine. and i wonder if you've seen evidence that the company has developed the infrastructure and expertise to mitigate that in the future. >> well, they certainly want you to think they have, but i don't -- i don't think that's the case, honestly, because they're both trying to deal with the countries that you named and that are having the worst problems, but they're also expanding at the same time. and they're relying on local partners, they call them, local -- what they are, local small ngos, sometimes --
sometimes they're news organizations that are themselves struggling to be news organizations. and all of a sudden, they get a contract from facebook to be the -- i call them censors. they call them content moderators. and i don't say that in necessarily a bad way, because there are terrible things on the platform that need to be taken down and were not taken down. so recently, i was at a journalism conference in santa fe, new mexico, and it had 28 sort of mid-career journalists from all over the world including one from mongolia. and we were talking about facebook. and this woman from mongolia was telling me her problem with facebook, and what she described is exactly the problem that the ukrainians had on facebook, which had to do with a lot of people who were anti-democratic, trying to suppress the pro-democracy voices by complaining en masse to facebook about, you know,
something that they were saying that was hate speech. and facebook not having the capacity, because of the language problem, to really know if it was hate speech or not. and buckling under the pressure of a lot of complaints. and so in mongolia, this woman described exactly that. so, it continues to expand. and doesn't yet have the capacity -- although -- so one of the things we sat in on, facebook did open up to some extent and let us -- i found the two most interesting things were the meetings. they let us sit in on two meetings. one was a meeting about content moderation, which they now have slur lists, and they recognize that those lists change all the time, given the context of a slur. so a slur today might be -- not
be a slur tomorrow. and there are these, you know, young people who are trying, you know, very earnestly and hard to figure out how to do the right thing. but this is a huge issue. and i'm not sure that we saw that there was a huge amount of resources devoted to it. >> so i'm wondering, anya, you're local. have you seen a reaction from people at facebook to the film? have you talked to the people that you talked to? >> it's a strange experience to work on these projects, because afterward, it's very quiet. and so it's actually nice to be here tonight, because i'm really looking forward to hearing what people here thought and to get questions, because i don't get a lot of reaction, actually. i hear from some of the people who are in the film, from other people that were helpful along the way, but it's kind of -- there's surprisingly little feedback aside from my friends
and family. >> but it has done well in terms of streaming, right? could you share that? >> yeah. and one interesting anecdote, which was an anecdote from someone who's there still, was that a lot of the younger employees, a lot of them take the bus out to menlo park from the city every day and on the buses, people were kind of watching in the days after the film aired. and that there were a lot of questions internally. in part because institutional memory, especially, when you have younger employees is short. so a lot of the younger engineers and product designers and other people that work there were not necessarily familiar with some of the history that we told and that, you know, what was discussed at the company about privacy concerns or other things. so i had heard from someone that there were a lot of people
internally that were watching it and had, you know, serious questions to ask of people internally about what was known and really getting a good sense of -- a better sense, i think, of the history of the company and how it had approached different problems, like the speech problem, for instance -- that in 2008, the company was enormous and having to come up with a whole bunch of rules. so we told the story, the story of people kind of sitting down to come up with what is essentially a constitution for a nation state. you know, what speech will be permitted. that's the first kind of element of creating a way to regulate this community online. and what was the ethos about that? what was the thinking? so i think some of the younger employees didn't know any of that history, and that's great, if they're learning something. >> so since your film, "the new york times" has done additional
reporting, some of which focused on sheryl sandberg, and i wondered for each of you, what would you like to do in your next reporting on facebook? what do you want to know? >> you know, it sounds so simple-minded. i still want to know who knew what, when about some of these crucial issues. because our film did not really -- it's very different from "the new york times" reporting. you know, we did not really get into the leadership and their role in decision making. we -- and that, i think, is critical, because we're at a stage right now where we have big choices to make about not only facebook, but other tech companies that have become so monopolistic that they're crushing any competition and innovation. and you know, to me, it's like the industrial revolution and
the reforms that eventually came in, we are at that phase where we're, you know, maybe we're not quite there yet, but at some point, people are going to have to start deciding how much power they're going to really give to them, now that they know how much power they have. right? because i think the process of knowing how much power they have is just coming to light now. >> so it's interesting. through a dual-class stock structure that gives his shares greater power, mark zuckerberg is the controlling shareholder in facebook. so as a shareholder, he could pursue policies that did not maximize profits, and the annual report reminds people of that. he could favor trading off revenue for doing things such as promoting democratic participation. did you see any evidence of altruism or civic participation in how mark zuckerberg is leading facebook? >> well, i think -- yes, that is
the short answer. i think that there are a lot of kind of orwellian names inside of facebook, but like, there is a department of facebook that's really thinking about how, as a tool, it can help with all sorts of issues of social good, and you know, one of the things we have to remind ourselves of is that, you know, this is still a very good service for people, right? and it's something that we use and many of us do. and i think that in emergencies, it can be very useful. in fund-raising tools, in terms of getting information out very quickly to networks, it's pretty amazing. i think, you know, the astonishing thing, i think, was that when it came to investment in security, when it came to investments in parts of the
company that really deserved a lot more attention and there were people internally that were saying it needed more attention and more resources, i think that they made some really bad decisions there. the interesting thing is they're now saying they are investing in that stuff. what that means when it comes to, you know, protecting elections from disinformation campaigns, bringing down fake accounts, things like that, i mean, we had the midterms, we need to see how things go further into the future. that's one thing we're interested in. and i think the other thing is just basically, you know, whether or not -- yeah, there's major questions still to be asked about the company's size and its data. and i think that the issue of antitrust hasn't been addressed yet in a real significant way. and thinking about data differently and thinking about trust differently and what this
company has and how well it's doing in terms of whether it benefits more than its actual consumers. >> can i just say that one of the most fun things working on -- collaborating with really intelligent journalists is just like the after-hours talk. you know, because we get obsessed with the story. first, we get obsessed with -- do we understand what it really is and all of its elements? and you know, we can talk late at night through -- what's more important? this element, that element? but, also, just, kind of imagining, what are the solutions to these problems. and we don't really address that in the film, but it's something that we talked about, we couldn't help but talk about. and so my fantasy world is, would it be possible to have a non-profit facebook? you know, a facebook -- and what would that look like? would that actually solve the problems? are the problems created because of, you know, the need to
make -- you know, to make the most profit possible? or what if you took money out of the equation? could you actually have what we all like about facebook, which i think is still the idea that you can connect with your friends and your family. >> and anya, any reporting hopes? >> i mean, i'm most interested in the same questions about, you know, what are we all going to do about it? what are the possible solutions? it's hard to make a film about that and i think that's one of the challenges in the process: the things you sometimes get most interested in are in the -- when it comes down to storytelling, it's hard to tell a story about the future. we wind up focusing on the past and the present. but i think those are the things that are the most important and are most interesting. just one more thing, which is just, the scale of the problems is what is so mind-numbing. and we saw that when we went to
those meetings and, you know, my favorite line in the film was, we were filming this -- there's a team that's dedicated to trying to fix the fake news problem. and someone yells out, we need a fact checker for the middle east! and as someone who has fact checked, that is a crazy thing to say. trying to fact check a two-hour film takes weeks. you can't fact check the middle east and there's not one person who can do it. so they do care, i think there are really earnest people there who want to solve these problems. i think there were some people genuinely surprised by the impact the company has had around the world. but trying to solve these problems is overwhelming. >> so if you do have questions, please raise your hand and we'll collect them while i ask one more round. so dana, in your role as the knight chair in public affairs journalism at the university of maryland, you teach classes about misinformation. what lessons from your own
classes do you wish facebook managers would take to heart? >> wow. that they are operating in countries -- they are facilitating the -- they are facilitating in some countries that are not democratic and they are facilitating non-democratic forces, unfortunately, probably as much as in the beginning when they facilitated democratic forces. but what happened after the arab spring was, all the dictators realized, we've got to figure out this thing! you know, that just happened. and so there was a huge wave that extended far beyond the middle east, that said, we're never going to let this happen again. so all of those authoritarian regimes learned what social media, including facebook, could do for them. and i don't think, you know,
necessarily facebook was kept abreast of that. so i have -- i teach a class once a year, an advanced reporting class, where i give every student an imprisoned journalist they have to do an intimate profile of. and they have to find their family and their colleagues and all of that. and so i've learned a lot about the people who are imprisoned all around the world. and the regimes that are imprisoning them, you know, have all the cards. they have the keys to the kingdom. and to the extent that facebook doesn't realize that they empowered them in a way that maybe was inevitable, but they facilitated that. and they have some really interesting decisions to make. in vietnam, if they want -- a small country, right? but it's a -- you know, a
complete authoritarian regime. and if they want to go into vietnam, they're going to have to put -- they have to give them, you know, put the servers there and basically give them the names of the people that are using facebook. and of course, china is -- wants the same thing. and zuckerberg is enamored with going there, apparently. although i haven't followed it recently. and so what will happen there. and what will happen in the places where they know, like the philippines, brazil, hungary, where they know their tools have been used for anti-democratic methods. what are they going to do about that? >> and anya, a significant number of our m.a. journalism students are here tonight, so i wondered what advice you would give to someone who was very interested in accountability reporting, who, because they were trained here, have great
data journalism skills. that's a plug for the program. >> i'm sure you know more than i do, so it's hard for me to give you advice. i think that if you're here, my main piece of advice would be to try to cover tech, because i think this is going to be the beat for years to come. and i wish i was better steeped in doing data mining myself. >> so, james, you've thought a lot about the students going into computer science, working in silicon valley. if our students end up working at a social media company, in menlo park, what would you like them to take away from your reporting? >> i think one thing, right off the bat is just, know your history. and know the history of the company that you're working for, try to understand its prevailing ethos. and i think, you know, one of the -- one of the things that struck me over the course of
this project was how little appreciation for history there was. history of, you know, of authoritarians using tools for bad aims, history, i think, of kind of the mindset of silicon valley and how that's changed as well. i think that in terms of venture capital and, you know, a lot of people talk to us about the earlier days of silicon valley and the ethos, i guess, most embodied in sort of a much truer idealism about the powers of technology, the bicycle of the mind, things like this, and that there have been major shifts toward a much more mercenary aspect of the valley. i think what really struck me
the most in reporting this is how ahistorical a lot of the people we interviewed were that worked in these companies, not understanding security concerns, not understanding regulatory problems of when industries grow enormous, and these are things that are just basic, that i wish more people were trained in. and then i mean we talked about this earlier, before the session here, but that the ethics, the ethical components of product design, whether it be the mental health aspects of it or the security problems that could be present in it, a lot of what we heard over the course of reporting this, is that various divisions of the company weren't speaking to one another, so, you know, product designers who are trained to just keep shipping products and keep creating and keep programming may not be
consulting with other people within the company that actually deal with real world problems. that is something that absolutely has to change. i mean it's something that these companies should be mindful of and it's something young employees going into these companies should be mindful of. am i communicating with all the people i could potentially communicate with inside the company that might understand how something i'm designing could practically go either right or wrong? >> right before we got here when stepping out of our hotel, we were talking about, you know, ethics and what was the analogy. is the defense industry an analogy? what are the ethics that everybody who is a programmer, a developer, should be thinking about when they're building something that could be used for bad. like when you first -- when the first drone makers built the
first drone, i'm sure they had no idea that it could be armed to kill people, you know, pinpoint strikes all over the world. and now it's -- so even those questions, you know, is it like medicine, what are the ethics of a doctor, what are the ethics of a lawyer, what are the ethics of someone who builds a technology? >> so as we talked about before, this winter stanford started a computer ethics class that has 300 students in it taught by a political philosopher, a computer science professor, and a political scientist who served in the obama administration. there's a weekly writing requirement, some weeks that's code, because there's actually a prerequisite to get into it, and it meets four times a week. i think it's in part been driven by people at stanford seeing our alums and thinking how can we broaden the things that they take into
account in the jobs that they hold. so we're going to go to the unfettered lightning round portion of the evening. i really appreciate the questions here. incredibly diverse. i'm going to read the question and not guess who is going to answer it, okay. so you can just volunteer. first question, is there a google dilemma? >> yes. >> yeah. >> sure. i think, again, when it comes to the amount of data that google has on each of us, what they can do with that data, and in terms of algorithms and how -- what we're seeing is huge swaths of our population, what they're seeing, what they're querying, and how, you know -- what
accountability mechanisms exist, there's definitely a dilemma. it's again a service that ostensibly we're getting for free, but we know there's a price and we know there's -- we have to start putting a price on what that is and no one really has figured that out yet. >> next question says, ndas in california are one concern, but isn't the real concern about employment blacklisting? in other words, will you be hired if you go public? >> absolutely. it's -- and that was -- that's a great way to put it because that was always implied by the people that actually talked about an nda, and it wasn't just -- it was definitely blacklisting employment-wise, but blacklisting culturally among the people that they -- among their tribe, the tribe that they respect, they liked the people,
they speak their language, they grew up in the tribe and the idea of being expelled because you've broken a rule was, to me, almost as powerful as, you know, breaking some legalistic term. >> yeah. there were a lot of -- anya can speak to this, a lot of frustrating experiences in the reporting process where we'd speak, for instance, to a former employee at facebook whose real primary concern about not going on the record wasn't his or her nda. it was much more of a kind of -- >> current employer would frown upon it. >> or people would frown upon it if i spoke. however, then in the next breath we'd often hear, but what i'm telling you is really important and you need to understand it. we would often be there saying to these folks, you know, there's no other way for us to report it unless you help us report it and tell it to us.
the bridge to cross was not necessarily an nda problem. it was much more of a cultural problem and i think, you know, there's a reluctance certainly in tech to talk to journalists, understandably, because it's technical and a lot of things are technical, complicated, they're not black and white, and we have to explain to the general public what they're doing so i think that a lot of what people felt, their reluctance to talk, may have been: are we really going to get the nuance of it? but we were kind of perfectly positioned to try to do that more so than any other news organization, in part because we had two hours to do this and we're publishing our transcripts. i think part of it is cultural, the issue in silicon valley of you don't talk, you don't snitch, but also there is something that we need to acknowledge on our end, which is that tech
reporting is often -- it can be wrong or it can be problematic because you're taking really complicated things and you're trying to boil them down. on our part there's certainly something we can do better to convince people to chat with us on record. >> facebook spreads a lot of money around so you will find formers everywhere, former government officials, former facebook officials, you'll find them in think tanks in washington or elsewhere, but those think tanks get a lot of underwriting by facebook so there's another disincentive for them now out in a think tank to talk about it because of the finances. >> i was just going to add that i think there's a lot of criticism of news organizations and media companies right now and a lot of it is warranted. many of the criticisms that we are lodging about facebook, i think we have to, in fairness, say are true about news as well. i mean, if it bleeds it leads. >> right. >> has always been true.
it's true when you click on facebook. it's the same problem. and so i think people are feeling really critical right now of news and i think that it adds to their reluctance to want to go on record because they're afraid they will be a part of -- fake news resonated for a reason. >> one thing that we are interested in as a reporting goal next is that -- and we heard it from a lot of people that were at facebook that worked in news, is that essentially our industry, the journalism industry, operates in facebook world, right. i mean it was sort of the salvation for a lot of the, you know, "the washington post" and certainly commercial enterprises to find an audience. so when you're operating as a journalist or as an institution in that world where you can find your audience, but you also may be playing to their bias to some degree, or that may be what comes up first when you've got the app of some paper, and that
is problematic. we have to do a better job as an industry of journalists and reporters, publishers, in thinking that through, of what it really means to operate when there are a few outlets, a few distributors, facebook being a major one. >> so the next question relates a little bit to audience participation. raise your hand if you use facebook. okay. that applies to the people on the panel too. >> yeah. >> yes. >> so the question is, what responsibility do users have, given the knowledge that you've shared in your film? yeah. >> to get off it? >> certainly to be -- >> it's not me. this is -- this is a question from the audience. >> certainly to be media literate and know what is not
true and not to circulate things that are not true. >> i mean at the end of the film, someone who i think is one of the smart critics of the industry and is taken seriously in the valley for her critiques basically states the dilemma, which is that yes, i've got a problem with their business model, i've got a problem with their data collection, i've got a problem with how they've handled and addressed certain problems, but i've also got a problem because my friends and family are still there. the network effects of this invention are tremendous. as a communication tool, as a place to share things, so i think each of us has that same dilemma because -- but as users, there's a lot of ways to -- that -- i mean, i don't know if user revolt will work. their numbers, whether they're
true or not, are still quite high in terms of how many users they have. >> and the market overseas is 90% of the market. >> yeah. >> so -- >> 90% of facebook's market is overseas. and, you know, in -- that's why the story is so wonderful -- in a wonderful way, in that perverted journalistic way, it went from a closed society, ruled by the military, all of a sudden opened and facebook ends up being the major communications method, but there's no media literacy, there's no tradition of journalism or truth or anything that's approaching a non-governmental truth, and so it becomes just a platform for abuse. how much is facebook really liable for that? you know, it's a much more complicated question. >> so it looks like people are
turning to you for a lot of help because it says, from your research, what would satisfactory change look like at facebook? >> i think there's -- the idea of putting it on the company is not necessarily something that's going to work in a vacuum, with a lack of regulation and a lack of policy. it's kind of like the principle of, you know, of photosynthesis. the company, as incentivized as it may now be to invest in the right things and in making a more secure, safe place for its users, it's going -- it still does have market concerns. it still has to grow. it still has to have new users. i think we need to have a regulatory conversation and i think there's a lot of proposals
out there, but nothing happens in a substantive way without that. >> so that's a great follow-up because the next question is, what are the most important government regulations that you could put into place to curb facebook and these are different people, one is pencil, one is ink, not coordinating. >> there's a lot of really good ideas out there right now, but i think one idea that's particularly interesting to us right now is, thinking about data differently, that -- whether or not, you know, these companies should in some way be compensating users for their data, they need to disclose more what they're doing with their data, and whether or not we can -- in a way, we're talking globally about putting a price on carbon emissions. we also maybe need to start talking about putting a price in some way on data and figuring
out what the cost of it is to us all as a society that these companies have so much data about us and what that means. that's just one idea. it's one -- >> my very local recommendation would be for any of you on facebook to get the profile, get the information that facebook has about you. if you go to settings at the right-hand side, and you click on it, you'll see the drop-down menu and then near the bottom it's, request my profile. you just click on it, fill in the form, used to take days, now it takes hours, and you'll see -- i don't think it's everything, but you'll see what facebook has on you and the associations, you'll see some of the associations that it has on you. it just will give you -- i did this with my students who were
shocked, many of them, at how much it keeps and what associations it makes for you that you don't even make for yourself necessarily. some invitation where you clicked no, thank you, but it knows everybody who got that invitation so now you're vaguely associated with those people, and it's just illuminating. you know, i think the first step is for people to just understand more, which is why the film is good because it does -- it's somewhat of a primer on this, just to understand more what is behind this and what do they know and what are other people doing with that knowledge to manipulate people not just to buy products but to make political decisions. >> i think also the u.s. has the benefit right now that europe has taken the lead for better or worse on their data privacy laws
and i think that that's an experiment. i think, you know, a lot of people would say it's a flawed experiment for all sorts of interesting reasons -- the compliance costs with this law are really high and really favor the larger platforms, which may end up exacerbating the monopoly problem that we have. that's just one of many things that -- so we at least have the ability to study how that's going to play out to some degree and have a more intelligent conversation about what might work here and whether it would, in fact, have to be some sort of global regime that tackles this or whether u.s. laws or regulations would really do -- like make productive changes here. >> so debates about regulating media companies often are fraught because of the first amendment, but facebook famously insists it's a tech company, not a media company. i wonder, did you in talking
with the people for your program, find people who were coming to a point that they would say, we wrapped advertising around content and that makes us a media company? >> there's certainly a lot of people who think facebook is a media company because it's the largest distributor of news now. so, you know, then what do you do with that? do you protect it in the same way? >> well, historically, media has also had a social responsibility, the idea that if you're talking about public affairs, there may be stories that the market doesn't support but that help people in their role as citizens. that's one of the reasons, if you looked in the 80s or 90s, you saw families that had ownership of media franchises. they both got income and did the right thing by contributing. sort of ironically, in silicon
valley, you have companies that are publicly traded but the control rests in one person or in the case of google, three people. so there is a question about whether they would see a sense of social responsibility. >> yeah. >> yeah. a related question was, what are we to make of facebook's $300 million commitment to news that was recently announced? >> well, you know, if you're cynical you'll just say it's just a public affairs thing. i think -- i'm not cynical, so i do think they care. there are a lot of people who don't realize the harm that was created and probably do support real journalism and that 300 -- i mean i know where some of it is going. it's going to report for america, which supports local
news reporters who are going to local community newspapers to be journalists there and are supporting, you know, actually -- the money is going to great things. i've looked at that recently. that's a good thing. it's a good thing they're doing that. i don't know -- it doesn't really change the problems that we are talking about. >> yeah. i think still, there's no remedy for the revenue problem at news organizations. that in part happened because of facebook and others, in that the audience was on facebook and facebook knew more about the users or the readers of various publications or the audiences of news -- television news organizations than those organizations knew themselves, and so the whole revenue model of journalism has been hugely affected by facebook. i don't know much about the details, but this sounds like philanthropy when
what we probably really need more of is -- and i don't know what the percentages are of income sharing from ad revenue on facebook when it comes to articles, say "the washington post" publishes on facebook, i don't know what percentage of every dollar of ad revenue is going to the "post." but i know at first it was minuscule compared to what facebook would take from it. that was their prerogative, but it was also hugely detrimental to the news industry. if this is a band-aid fix on something that had happened, we basically need to address the revenue problem in journalism. that's the important thing. >> can i offer something completely different? why do we care about anti-democracy content if that's the will of the users and it's found on the platform? >> wow. >> i think that's an interesting
question and one that we talked about in various places that, you know, there are a lot of people posting content in places like the philippines who support the president there and that is content that -- it's just an interesting ethical question that -- i'm not prepared to answer, but i certainly think it's part of the dilemma. it's one of the many dilemmas that we were grappling with trying to figure out how to report. >> well, i'm -- i have a much more black-and-white view of this. let's start from the fact that we're a democracy and i think our national security is better served by other democracies, even though we have many alliances that are very tight with authoritarian regimes, and our whole -- not our whole foreign policy but our altruistic foreign policy and our goals as a country and nation have always been to promote the
rule of law and democracy. so if we're saying in that question, no, it's not, you know, and we're willing to change our big, strategic goals to say we don't really care if there are more authoritarian regimes, that's a huge difference from what our values up to this point have been. >> i think that a big part of what our documentary is about and a big part about what we need -- facebook needs to grapple with is the issue of fake accounts and others that are magnifying sentiment or magnifying different messaging. that was the case in the philippines, for instance, where the duterte regime was using a network of fake accounts in order to make it seem as though there was more support for his policies and also to attack critics. i think that the issue of fake accounts is a big one,
especially when it comes, you know, potentially to amplifying anti-democratic messaging in campaigns in particular. >> two related questions. one, does facebook have the capacity to fight state-led propaganda in africa? and the second one is, is facebook actually enabling totalitarian regimes such as in vietnam to suppress their own populace? >> well, they don't have the capacity to regulate or do much about state propaganda on their platform -- >> unless it's a fake account issue. >> but even that is something they're grappling with. i mean, even in myanmar where the u.n. said they facilitated genocide, they are trying to do something that they weren't doing before, but they aren't 100% or even probably 50%
successful yet, so it is very much a work in progress. they have said that they have hired -- they doubled the amount, is that what it is? they've doubled the amount of content moderators and other security people to combat these problems, but the problem is, they grew so fast that it's almost impossible, unless -- it's -- i think it's almost impossible to get your hands around every problem that was created in every country in the world and that's, you know, both a testament to their success and how much people love facebook and how many great things it's empowered people to do, and then all of a sudden the problems came along in every language and every dialect and here they are in palo alto trying to deal with
it. it's a huge -- >> i think that, one, i don't know the specifics in africa or the specifics in vietnam, but i do think that the issue of how either state-run media organizations or authoritarian states are using social media in order to push their messaging or attack critics is going to continue to be an issue for the company and it's going to have to be something that we continue to report on, the ways in which that's morphing and that's something that everyone needs to kind of get ahead of. >> so as you're revealing information and it's becoming part of the public discussion, a person in the audience wants to know, would this make a difference if facebook is involved in paying regulators and elected officials to reduce regulation?
>> well, to pay them directly? >> not directly. that would be another documentary. >> through lobbying, yeah. >> [ inaudible ]. >> yeah. >> i'm assuming that's absolutely happening through the internet, what is it called -- >> internet association. so all the -- most of the tech companies don't do pacs with their names on it or they don't lobby directly. they do it through this group called the internet association which would be a great subject for a documentary itself because it's very cleverly done to be low profile, but they absolutely have everybody covered in the legislature that might have anything to do with regulation. in fact, one of my favorite examples of this is in their washington office, which also keeps a rather low profile, the two, you know, i wish i could remember who -- they have staffers from both the republican and democratic side,
you know, who were key staffers to key members, to key regulators, they now employ them in the washington office, you know, to do anti-regulation. >> i mean in the film we speak to an early lobbyist who was there in the 2008 period who said absolutely it was a part of the company's strategy before it really began writing checks, it was a part of the company's strategy in order to ingratiate itself with politicians, helping them with their campaigns, it's the new place you could connect with the electorate, it's a smart way to do it, so the company did a lot of outreach to politicians to just essentially make friends and then have that leverage to say, well, you know, are you going to regulate us? and the strategy was, it's much less likely if they see facebook as a terrific tool for
campaigning purposes, that they'll regulate them. that was an explicit part of, according to this former employee, an explicit part of the strategy lobbying-wise and i think some of that has changed, but they certainly have been very instrumental or effective in washington. another former employee, who was very technically minded, told us what it was like to go around on the hill, having conversations with politicians and often he would be the guy that facebook or the internet association would turn to in the room and he would kind of -- his line was that wouldn't be technically possible, to which the politician has nothing to say. and so that has been another effective strategy in order to
say, you know, what you're asking about is technically impossible and that's something that was shared with us as just another kind of lobbying tactic. there's all sorts of ways they exert their influence, it's not just in writing checks. but perhaps that's changing. >> one person would like to know, was there ever a worry about a legal breakup of the company when you were interviewing people? >> no. >> they weren't fearful of antitrust? >> no. >> no. there was, strangely, kind of a real sense even though the company has undergone a lot of, you know, really tough reporting and revelations over the past couple years, a little bit of an imperviousness to things. i would say that they really -- i am not sure whether they're really getting a lot of the
message and i think when it comes -- i'm sure that their legal department is very concerned about the new european regulations, i'm sure that they're in full effect thinking through a lot of the proposals out there right now, but from the people we were speaking to, it was generally a sense of no, probably nothing major is going to change any time soon. >> yeah. i mean we had a very interesting two days at facebook and we were very open about what we were looking for, which is basically their story, and having, again, worked with the military and controversial subjects, if you are -- if you disclose what the story is you're working on, it's more likely -- and if it's not like a hit job, you know, you really want to understand the complexities of the story, chances are they'll understand that you are -- if they give you their story, even if you don't -- you have an obligation
11:22 am
to air it. and we went in with, you know, very sincere hope that they would tell us what it was like inside facebook, you know, right when the election happened, right when the russia thing broke, when the u.n. called them out on myanmar -- tell us what it felt like. it wasn't forthcoming. again, going back to the cia and the military, if you know you've been given this opportunity, why aren't you taking it? so there are two answers to that. one is you just don't get it and you're in such a bubble you haven't learned the most basic public affairs lessons, or two, you really have something to hide. we really tried to be -- and i was very impressed, you know,
11:23 am
working on film and realizing how different that is than print -- very impressed with trying to let facebook people tell their case, make their best case for themselves, and if you see the film, you'll see that the best case that they make is not an adequate case. in fact, they end up looking very much like we saw them, you know, and we debated among ourselves, is that even fair? is it fair to put them out like they actually appeared to us? right. because they didn't come off good. and we -- >> they had been over-media-trained, i think, and there's one point in the film where we show a montage of everyone saying the same thing: we were too slow. we really debated about that. is that mean-spirited on our part? they gave us very limited time with people. we typically do two-to-three-hour interviews, and they insisted no more than 30 minutes. not 31 minutes.
11:24 am
30 minutes. they were incredibly disciplined about that. we weren't used to having to conduct interviews like that or to work -- it just made everything very challenging. actually, just in terms of process, when we arrived at facebook to do our interviews, we had only basically edited the first hour of the film, and we were really struggling, because everything is always a struggle and these things are always a crisis. we thought, we don't have a second film, what are we going to do? the whole second hour should just be inside facebook, their story, what's it like inside there. so we went in thinking we have to build a whole hour based on these interviews and what we collect, and then we left totally freaked out, because we didn't get enough material to constitute an entire hour and we didn't get enough insight to create anything that would be in and of itself interesting enough. we had to really rethink the whole way we were going to approach it. >> to me that was yet another indication that they are not worldly, that they are in their
11:25 am
bubble and they don't understand the risks outside their bubble or how people outside their bubble operate. >> so, last question. i have the feeling that you all are going out afterwards and will be talking about this event, and one of the things that often happens is, i wish they had asked me "x." so i would like to ask each of you, what should i have asked you? what would you really like the people to know that we failed to draw out of you? this is the final essay on any exam here. yeah. >> what i really want is not something from us, but i'm curious what we should do next. that's what we're really grappling with. i'm interested afterward in people who want to talk to us and give us ideas. >> that's source solicitation. >> that's right. blatant. >> i think in the spirit of what we do, which is ask critical
11:26 am
questions, i mean, i'm curious what the critique is of the film. i don't know how many people have seen it, but whether it's people that have worked there or people that know it intimately in some way, i'm kind of curious from the audience as to what a potential critique would be, so we can be challenged on that. >> okay. >> you know, i guess it's just a little bit more probing into what it was like in the interviews at facebook, because, you know, it was so fascinatingly different than any other place i've been -- and i've been in so many different places -- so i kept having to check myself. am i really seeing what i feel like i'm seeing? are they really coming off so naive? i mean, why are they coming off so naive?
11:27 am
and, you know, is it because i'm getting older and i feel like they're all so young? but no, i'm around young people, you know, my students, young journalists, my kids, and they don't come off as naive. so could they all be putting up a shell because we're here? i don't really think so. there's one story, unfortunately, i can't tell you because it's a security story, that to me symbolized it all. again, it goes back to -- i don't think they understand the risks in the world that they're operating in. i kind of felt like it's a super government -- not -- that's not really the right term -- it's a super nation state, but it doesn't understand national security,
11:28 am
you know. it doesn't understand its strategic goals and aims, how they interact, and what they produce when it's interacting with different countries, real countries, and i'm still kind of dumbfounded about it, curious about it. >> the good news is, we do have professors grappling with that, including alex stamos, now in the crowd and teaching at stanford. hopefully if you go back five years and talk to alums from our school, they will be able to give you more nuanced answers. thank you so much for your stellar work and for sharing the story behind the story. >> thank you. [ applause ] coming up here on c-span 3, a bipartisan group of u.s. house members talking about infrastructure discussions that are under way in congress.
11:29 am
that's followed by a look at progress in the afghanistan peace process. the lead negotiator provides an update. and a congressional oversight hearing into issues in the u.s. intelligence community. while congress is on break this month, we're showing american history tv programs, normally only seen on the weekends. tonight, the civil war, and presentations from historians at a symposium co-hosted by the american civil war museum and the university of virginia center for civil war history. you'll hear about the causes of the war, the impact of southern unionists and how african-americans dealt with the civil war. we begin tonight at 8:00 eastern with pulitzer prize-winning author jon meacham, here on c-span 3. >> once tv was simply three giant networks and a government supported service called pbs. then, in 1979, a small network with an unusual name rolled out a big idea. let viewers decide all on their
11:30 am
own what was important to them. c-span opened the doors to washington policy making for all to see, bringing you unfiltered content from congress and beyond. in the age of power to the people, this was true people power. in the 40 years since, the landscape has clearly changed. there's no monolithic media, broadcasting has given way to narrow casting, youtube stars are a thing, but c-span's big idea is more relevant today than ever. no government money supports c-span. its nonpartisan coverage of washington is funded as a public service by your cable or satellite provider. on television and on-line, c-span is your unfiltered view of government, so you can make up your own mind. >> president trump's 2020 budget proposal includes a request for $200 billion for infrastructure projects. but it does not specify what the money would be used for. a bipartisan group of house members talked about the need

