
tv   David Kaye Speech Police  CSPAN  August 8, 2019 5:31am-6:46am EDT

5:31 am
check it out. [inaudible conversations]
5:32 am
>> we can get going. we have been looking forward to this, and for those on twitter, we have had a lively sort of pre-publicity. so i'm anne-marie slaughter, the ceo of new america, and i get to moderate this fabulous discussion today. we are here to launch david kaye's book, "speech police: the global struggle to govern the internet," and to celebrate "consent of the networked," which was 2012, which was rebecca mackinnon's book and a new america book. and i was very struck -- i'll start with this. when i was reading through the acknowledgments in "speech police," david, you actually say of rebecca and "consent of the networked" that few people have had as much influence on my
5:33 am
thinking as that book. i thought that was wonderful, and you had to have written that before you knew we were doing this event. so -- we're going to have a discussion and then we'll turn it over to you, but i wanted to begin by framing the debate in terms of these two books, because i think it says a lot about how the internet has evolved. when you look at rebecca's book, "consent of the networked," the subtitle is "the worldwide struggle for internet freedom." so in 2012 the question was internet freedom, and when i was at the state department, working for secretary clinton, i was very involved in her speech at the newseum where she talked about the freedom to connect, and we thought very hard about the
5:34 am
importance of being able to connect to the internet as a basic freedom, a fundamental freedom, and then what you can do online. that remains important, but that was 2012. so then you look at "speech police," and the subtitle is "the global struggle to govern the internet." obviously freedom is still important, as is democracy, but the focal point has shifted from freedom -- being free from restraint -- to governance, which is the wise restraints that we hope make us free. so that to me tells us something about how the internet has evolved in ways that none of us might have predicted, although, as we were just talking about, larry seems to have predicted a lot of it and rebecca predicted some of it, but it's still a very different setting.
5:35 am
the other thing i did want to say in framing is that david's book is "speech police," and in the book, and often online, we encounter these issues as speech issues. in the united states, that means you encounter them as first amendment issues; more broadly -- and i do want to talk about this, too -- you talk about human rights issues, freedom of expression issues, but that's just the portal for the real discussion. the real discussion is the protection of speech, the protection of privacy, the maintenance, preservation, revitalization of democracy. so the second thought i leave you with: we have moved from freedom to governance, and governance is really about
5:36 am
participation in governance and what democracy looks like -- liberal democracy -- in a world that is as much online as offline. so just to frame things, thinking about the evolution and sort of the big themes -- we'll pull these out in conversation, and then, as i said, i'll turn to you. before i do that i want to properly introduce both of our speakers, and i will actually start with rebecca, who is the director of ranking digital rights. ranking digital rights is an index that ranks companies around the world -- and you can see, you should have been able to pick up the foldout on it, and more of it is online -- based on how well they actually protect digital rights. i'm just going to say that in many ways i think that work
5:37 am
exemplifies what new america is. rebecca is a big thinker, a new america fellow when she wrote the book, and was the cofounder of global voices, which is how i first knew of her in policy. she was cnn's beijing bureau chief from 1998 to 2001, living and working in a country where questions of censorship and freedom of speech were daily fare. she is a big thinker, and if you go back to "consent of the networked," it's striking how much she saw. but she strives to connect thought to action, which is really why i say it's where new america strives to be: to take the big ideas about democracy, consent, speech, and then to turn them into something that is more practical, that actually does rank companies on an annual basis. so, we love the work.
5:38 am
and it's important both intellectually and practically. and david kaye is currently u.n. special rapporteur on the promotion and protection of the right to freedom of opinion and expression, and -- you are a specialized audience, but i am willing to venture that many people do not know there is a special rapporteur from the u.n. on the promotion and protection of the right to freedom of opinion and expression. he is a professor at uc irvine, teaches international human rights law, and directs a clinic, as you might expect given his clinical professorship, on international justice. he has lectured around the world, at the u.n., at the international criminal court, and has taught at many other law schools. he cofounded the international human rights program at ucla, and actually began his career as
5:39 am
a lawyer at the u.s. state department practicing international law. one of the things we will be talking about toward the end is how important it is that both of you come out of an international human rights frame, not just a u.s. legal frame. so with that, we'll begin our discussion. >> all right. so, i want to actually start by framing the debate. i just framed it in sort of the broadest way, but there is a quote -- you can see this book is dog-eared already. there's a place where you are talking to the eu commission, and you're talking about the eu code of conduct, and it's interesting
5:40 am
because you're talking to an eu official who in the first place is very down on u.s. companies. that's not surprising in the eu, and we can talk about that. but he's talking about the difference between u.s. approaches to freedom of speech -- of course we have by far the most liberal code of freedom of speech; in other words, we protect the most speech of anyone in the world, we limit the least speech -- and we talk about how words lead to violence and how the europeans understand that much more than americans do. and then he says this: these questions must be decided by democracy, not technology. there's a responsibility beyond the internet. for rights activists it's always just inside the internet, but the politics regarding the internet
5:41 am
today are general societal politics. the bodies of democracy will have to own up to a view for regulating it. so let me start with that and ask you to elaborate: your book is about governing the internet. do you think of it as the internet, or more broadly? >> it's a great question. thank you for hosting me. thanks to new america. it's kind of remarkable to sit between two people who have shaped really major fields, the two major fields i work in, so it's an honor. i'm really glad you pointed that out, because one of the points of the book i'm trying to get across, particularly i think for an american audience, is that the debate over regulation, which is practically nonexistent here in many respects -- with
5:42 am
some new changes around competition policy in the united states -- the debate around regulation is robust in europe, and their approach is that the companies have massive impact on public space. whether we get into the debate over them being publishers, or whether they are the public square, or whatnot, they see what the companies are doing as having massive impact on their public life, on access to information, on the rights of individuals in their space. so from their perspective -- particularly the european commission, and in paris and berlin and london -- the issue is: how do we protect our society? how do we protect individuals? how do we protect public safety, public institutions, and elections? they're having a robust debate, and we're practically excluded from it in a way, even though the impact of the changes that might come from it will absolutely have some effect on us. >> great.
5:43 am
rebecca. >> well, i mean, just coming back to this point, it is about what is best for people. this is the whole point of why we like the internet: because it's supposed to be good for people. it's supposed to help empower people. so, the issue about freedom -- internet freedom isn't freedom as in a state of nature, a free-for-all. it's freedom in the context of human rights and democracy, in which you need governance. because if you don't have governance, life is nasty, brutish and short for everybody but the biggest and the strongest. and that is the case online as it is offline. and so the question really is: okay, we have this resource that's globally interconnected, that's enabling people in communities that used to be very cut off, marginalized, isolated, repressed with no hope of breaking out, to do so, but at the same
5:44 am
time we need to make sure that in this space, as it's been constructed, designed and governed, power is accountable -- that if somebody exercises power over what i can say or do, whether it's a corporate actor or a government, they're held appropriately accountable, there are appropriate constraints, and also that the rights of everybody on the network are protected and respected. this is where we get into, i think, a lot of really tricky debates. david was talking about the debate in europe, and there's another debate going on, reacting to the european debate, happening everywhere else. so, in the middle east, in asia and so on, there's a lot of concern among human rights activists in countries that are not democracies or are kind of
5:45 am
quasi-authoritarian democracies, where leaders are picking up on regulatory debates, misconstruing them, and taking mechanisms that are being developed in europe in the context of much more robust rights protections, and actually using those mechanisms to censor dissent. so one of the big challenges -- david's written about this, too -- that i pointed out in "consent of the networked" is that our legal systems are based on nation states, based on borders, based on media that is largely contained within national cultures and ecosystems, and now we have these globally interconnected systems. our approach to constraining power, our approach to law, and geopolitics, isn't fit for purpose when it comes to really
5:46 am
protecting rights and constraining power and holding power accountable across global networks, and that's a question david is taking further. >> it's not just the internet; it's society. and it's interesting that at some point, david, you refer to companies as custodians of important public spaces, and i thought -- the idea is that this is a mixed reality; there's not that line. and the line also between -- this isn't just one country. this is global. so we're talking global democracy. let's talk about who then participates in governance. let's accept that it needs to be a broader conversation than now. i'm just going to take that as a given.
5:47 am
but what is the role of the company? there's a part of the book where, david, you describe facebook's legislative process, and i was reading it thinking, this is like a civics book where we say you go up to capitol hill and there's a draft of a law and there's markup. well, this is kind of facebook's markup session, and you describe people at facebook, led by monika bickert, who was a law school student of mine -- and i hope i taught her well -- facebook people from around the world very earnestly trying to decide what should be taken down and what should be left up. so maybe you want to describe that a little more, but that has to be part of this, right? as long as the companies are there. who else needs to be in the rooms, or should we displace that
5:48 am
conversation from those rooms entirely? >> so, there is this facebook -- what monika calls this mini-legislative forum, a mini-legislature. our perception, the public perception generally, of the companies is that they're making seat-of-the-pants kinds of decisions, that it's ad-hockery, but the truth of the matter is they are incredibly bureaucratized institutions. they have rules that are -- >> they're all lawyers. >> a lot of lawyers. their content policy team, particularly at facebook, which is probably the most bureaucratized of all of them -- they are basically making the rules. they're evaluating the interplay of the rules with the actual lived experience of people. they're doing all of this, and so they're performing a kind of governance. they've been described as the new governors, and that's an accurate description of them. >> that's going to persist.
5:49 am
no matter what happens with the companies, that is going to continue to be a part of the way content is evaluated, because -- notwithstanding any move to delete facebook, #deletefacebook -- no matter what happens, it's a massive platform, just as youtube and twitter are, around the world, and they're going on. so a major challenge for the companies -- i think as a path of democratization -- is how do they open up their space so that more voices are part of the decision-making around what the rules should be and how the rules are enforced. and so it's one thing to say this group of, like, really extraordinary people in menlo
5:50 am
park makes decisions about what happens in the united states, but when you extrapolate out to 2 billion users around the world, how do they even think through applying their global rules -- which i think we agree should be rooted in human rights law, human rights norms -- and how do they enforce those in all of these different places? so your question is important: how do they go about involving the people who are most affected by the rules? >> rebecca, before you take a crack at that, let me ask a prior question. one clarification: as you describe it, there's somebody from dublin. it's not just americans in the room, at least. it is facebook people from around the world. >> that's true. that's true. their forum -- >> so it's not just the -- >> not just people in menlo park. >> not just first amendment people. >> it is dominated by people who are
5:51 am
marinated in first amendment culture, and so even the non-americans -- the non-americans who are involved in the process, i think, bring that kind of sensibility to the rest of the world. >> that's what i thought. >> that's true. >> really. >> it's always surprising to people, but i still think it's very much focused on -- at the end of the day -- i think this is moving into rebecca's central work around pushing for transparency -- at the end of the day, we don't really have a good sense of who is making the rules and ultimately who in the company is accountable, so it becomes kind of systemic. the opacity of the platform is part of the problem, but at the end of the day maybe mark zuckerberg is the final decision-maker, and how do we think about that in terms of the impact they have all around the world? >> so, that's a terrifying thought, that it all comes down to
5:52 am
mark zuckerberg. but -- so, to rebecca, there are two approaches there. one, which you are working hard to achieve, is that those rooms become glass-walled: that we can see who is there and what decisions they're making, and try to hold them accountable. and the other is to put more people in those rooms. >> well, in my book i call zuckerberg the sultan of facebookistan, because of the sovereignty these companies are exerting over people. unless you think that clicking "i agree" means consent, people haven't really consented to or aren't really participating in what is going on. i think there's a number of issues here, and one of them is one of the things we look at with
5:53 am
ranking digital rights: in terms of applying the u.n. framework to company behavior and policies and practices, we're not just looking for companies to be transparent. we're looking for them to make a commitment and actually provide evidence that they are implementing these commitments in certain ways, and we look at companies' governance, as we call it, around human rights risks. so we're not just looking for them to be transparent about what they're taking down to enforce their content moderation policies, or what they're taking down in response to government demands; we are looking for companies to conduct what we call human rights impact assessments. now, facebook, as a member of something called the global network initiative, which we can talk about later, does make a commitment to conduct human rights impact assessments when it relates to government censorship
5:54 am
and surveillance demands. but based on our evaluation, they provide no evidence that they conduct human rights impact assessments on the content of their terms of service -- on what their rules are, how they enforce their rules, and what impact their rules are having on the human rights of users. they do not do any kind of impact assessment there. so the decision to take down or not take down is way downstream, but on what the rules are and how they should be enforced, they have not done systemic thinking about the implications for the human rights of their global user base. that's a huge problem. not doing human rights impact assessments on the use of a.i., artificial intelligence, and not doing human rights impact assessments on their targeted advertising business model has a lot to do with why we're having speech problems and problems with how to police things, and also in terms
5:55 am
of remedy and grievance mechanisms. in order to do an impact assessment properly -- to understand how is my business model going to affect communities in myanmar and egypt and anywhere else -- you have to be engaging with affected stakeholders and communities around the world to understand that. you have to be engaging with communities around the world to understand how to set up a mechanism so that when people are being harmed, we have an early warning system and we know who will tell us what the problem is and help us fix it, come up with a solution. so, yes, there needs to be more transparency so we understand what is being taken down and how things are being enforced, but there also needs to be a sort of accountability and governance of the risks and impacts to users' human rights. it's just not happening in a systemic way right now. >> david, i want you to respond to
5:56 am
that. i'm trying to get a sense of -- so if they conduct human rights impact assessments, then they're getting more information for how they make those decisions, but they are still making those decisions. >> right. >> is there a way that you can really compel a broader process of public decision-making with these entities, which are public squares but not in the sense that they're like the capitol mall? they have terms of service, so they can restrict speech far more than, say, the u.s. government or a european government. how do we get more people into the actual decision-making process? >> one answer is government regulation. from an american perspective, particularly because we're talking about platforms that are about speech and information, we resist that.
5:57 am
but that's the model people are moving towards, and that's where it becomes extremely problematic for the companies and for individuals. so think of an example. right now i have been thinking about ethiopia. it's a place not that much different than myanmar in one respect: the lid has been taken off an oppressive regime, and there's a lot of new media and a lot of robust debate on the platforms, but also hate speech and ethnic division that has been surfacing. this is where i think rebecca is right. there should be a process -- again, the platforms are the first responders. no matter what kind of regulatory environment the platforms are in, they are in the position to have a first response. the volume is so ridiculous, so huge, that they will always be in that position. but they should be evaluating the impact of facebook on the political
5:58 am
situation and on the public in ethiopia. how do we address that? they need to have a process that not only gives them kind of a playbook for doing that kind of assessment, but also one that is signed off by mark and sheryl, and the same is true for youtube and twitter. they need to have that kind of accountability mechanism in order to say: we're doing this, we're evaluating our impact, and we're going to go forward or not, but it's a company decision, and these people will be held accountable. >> how do they know about ethiopia? >> that's the problem, and this gets to a deeper question of whether it's even solvable, if we don't want to go too dark quite yet. >> quite yet. >> i have a question. we talk right now in the u.s., particularly since the op-ed in "the new york times," about breaking up facebook, but there's also another kind of possibility, and that's breaking down facebook. how do we get it closer to the
5:59 am
communities they're involved in? and it's not just a matter of giving the communities some insight and access to the decision-making and the rule-making; it's also how to do that while preventing government from capturing the process. that's why this is not one-size-fits-all: going to the communities might work in certain ways in, say, democratic europe, but it wouldn't in other countries. in turkey, for example, that wouldn't work as a model. so that's why there's not an overall global answer to the question of how do you involve users. it's not -- mark zuckerberg used to say facebook is a community, and everybody in silicon valley would roll their eyes at that, as would we. it's not one community. there needs to be difference, and i don't know, at the scale of 2.5 billion users -- youtube is close.
6:00 am
... >> i think that's right. they need to find ways to actually have a presence in the places -- they call them markets, but in the jurisdictions, in the countries where they're operating, they need to figure out a way to have a presence. and it's not just language, because language only gets you so far; the codes around language, around social norms, around all sorts of developments -- you can't get all of the way there just by knowing the language, for example.
6:01 am
you need to know a lot more than that. but again there's this problem: if they go to the country, there's also the risk of them being captured -- and actually literally being captured. you know, brazilian judges have ordered facebook executives to be arrested, and so they have been; they've been detained. so it is a very complicated dance, in a way. but so far they have not really been making a full effort to get to local actors, not just at the national level but at the most local levels. >> so rebecca, you're actually doing this, right? ranking digital rights is a way of holding companies accountable that is not government regulation. right? i mean, it's a benchmark; it could help inform government. how do you think about -- you know, if you're wildly successful, companies are paying attention to ranking digital rights. and again, both of
6:02 am
you -- and i'll just say this again -- are using a global human rights framework, not a u.s. constitutional framework, not a european convention framework. a global human rights framework. how do you imagine -- or where do you imagine -- governments and/or watchdog groups participating in that larger project? >> sure. well, you know, ranking digital rights is a tool. it's kind of a set of data in a framework of standards for people to use in a whole ecosystem of efforts, right, that influence regulation, that influence advocacy, that include all kinds of things. so in a way, you see sustainability benchmarks that benchmark companies on, you know, how they're doing with their carbon footprint and so on, that are quite successful and have a huge impact on companies, not only on
6:03 am
practice and policy but on innovation, and help to inform regulation in terms of, you know, what impact regulation is actually having on practices -- which companies are going beyond just compliance with the law, which companies are doing minimum compliance, and so on. so that's what we modeled ranking digital rights on: sustainability benchmarks, or benchmarks that rank companies on labor practices and other things. and you see then that, you know, having a benchmark alone is not going to do all of the work. but we're seeing, for example, with our data this year, that we can make initial observations on what impact privacy regulation in europe has had and what impact it hasn't had, and which companies are going above and beyond regulation and which companies are doing minimum compliance as they interpret it, and not getting
6:04 am
very good scores, you know. and so it's, first of all, a tool to really understand what companies are doing and how they compare. it's also a framework around which to have debate, because it is a living thing: we keep updating the standards that we're using to evaluate companies on what good practice should look like, so we're adding indicators related to targeted advertising and related to artificial intelligence that we didn't have before. >> what should that practice look like, though? >> so it's a lot of consultation and research, you know. the indicators we have right now, they're based on human rights standards, but we have to build a lot of detail into them, and that detail was built through talking to a lot of people in the human rights community, talking to companies, talking to technologists, and working out, okay, what represents kind of a general rough consensus about what good should look like. you know? what it should look like if companies are, in fact, being
6:05 am
accountable and responsible about all of the ways in which people's speech could be manipulated or constrained, and all of the ways in which somebody might be able to know something about their digital activity -- are companies acting with maximum transparency and accountability around that? our indicators are kind of an indication of, you know, what the standards should be. but it is based on -- none of it is really original work; it is all drawing on the work of others that has come to be considered best practice. >> i would just add quickly that i think this is really important, because there needs to be a smart set of regulations. i think that content regulation by government is incredibly risky and tends toward repression of speech. but what rdr is doing, i think, is not only modeling what the companies should be doing, but also, if governments want to think about the kind of regulation that would allow for
6:06 am
a more robust and systemic public debate, they should be adopting these standards. and so i don't have a problem with government basically saying, you know, one requirement for selling ads in our jurisdiction is to be transparent along these lines. >> so that means the mini-forum at new america could become as important as the mini-legislative forum in menlo park. but david, let me push you further on how you think about your vantage point. in the u.s. we are struck by -- you're at the u.n. the u.n. is 194 countries. facebook is the biggest country -- even just by market cap, if we leave aside population, it is way, way up there above most of the countries in the u.n. but how should the u.n. engage with rdr, with article 19,
6:07 am
the many ngos, and with governments? right? because at the u.n. there are plenty of governments who would like to just, you know, regulate it away. how do we think of a global legislative process? and what would you -- if i am antonio guterres and i say to you, okay, so where is the u.n. in the digital age, what's the process -- what do you tell me? >> you should tell him -- [laughter] >> he hasn't asked me to do that -- >> so i do think that the u.n. -- at least in the human rights mechanisms, the human rights council, and the u.n. general assembly third committee to a certain extent -- i think the u.n. serves a really important function in identifying what are the global norms around freedom of expression online. >> have they changed? >> the mantra is that online and offline are the same -- that whatever rights you have offline apply online -- and that should be, and is, true in
6:08 am
principle. but it doesn't go far enough into identifying what that means in terms of how companies and governments should be protecting that, what states should be required to do -- in our wonky language, how do companies and how do governments encourage an enabling environment for freedom of expression. so there's a lot of room for the u.n. to basically say, look, the rules, for example, around transparency -- and transparency can be a meaningless mantra if we don't specify, as rdr does, what are the disclosures that are important so that the public can have a genuine debate about that. and that is also access to information, which is also a human right, right? so how do we construct rules around that? but then also the u.n. can basically say to governments, look, when you are constructing your regulation, you need to be sure not to cross over into space that incentivizes companies to take down more content than they should be
6:09 am
allowed to under human rights law. so that's why i think -- and i think again we're on the same page on this -- if the companies were to essentially revamp their community standards and rules around human rights norms, i think that would give them a stronger tool in dealing with governments that are repressive. so governments come to them now and say, we want you to take down this content because it is inconsistent with your terms of service, and the company can say right now, well, that's actually not inconsistent with our terms of service, so we want to leave it up -- and the country doesn't care. but if the companies are able to go back and say, look, we see ourselves as protecting individual rights, which you yourself, as a party to the international covenant on civil and political rights and under the universal declaration of human rights, are bound to adhere to -- i think it doesn't mean that, you know, turkey is going to say, you're right, we're bound by this norm. but i think in many instances it
6:10 am
will slow down the process, and if it is coupled with transparency around that, it does make it harder for governments to just willy-nilly take down the content. >> so really, as an international lawyer, this is fascinating, because u.s. companies through the 20th century were the vehicles through which essentially u.s. law was applied extraterritorially around the world, right? and now, when you read both of the books, you think: these companies are global, and they are actually bringing global human rights and european law into the u.s., which is just such a flip in terms of the way we've talked about it. i've only got ten more minutes before i turn to the audience, so i would like to get more specific and really look at the disinformation question, right, the fake news question -- which, again, propaganda, lies, the fact that it is called fake news,
6:11 am
it is not new. and it's very much not new in many of the countries that you both are looking at. so, david, talk about how, as u.n. rapporteur, you have seen the disinformation, the market for disinformation, and then the rush to counter that, and what the complications of that are around the world, and then i'll turn to rebecca to comment. >> so we've seen, really since 2016 -- you know, since the election of donald trump, let's say -- a kind of collective freakout -- and that's not a technical term -- but there was kind of a collective freakout, and there was a particular "new york times" op-ed that basically said, facebook, you have a zillion engineers, you can zap fake news and basically save our democracy. and that kind of
6:12 am
idea, i think, has, you know, traveled quite or far around world so the point where, you know, just a few week ago singapore adopted a new law called onis line falsehood prevention of online falsehoods act and basically it criminalizes the false information of fake news online falsehoods but is also basically gis government ministers the authority to make demands of the companies to correct false information. to like label it and maybe take it down. but either way to put up something that says, this information well the government disputes it. sost there a real -- not just a risk. the reality now and there are many false information laws you know that criminalized dissemination of fake news, there are basically used against government. this is right critic -- terrifying and that is the world that we're heading to and one of my fears is that the debate in
6:13 am
Europe has been incentivizing that, giving cover to it. and the point that Rebecca made earlier around this particular area -- particularly the way regulatory discussions travel worldwide -- is really important. it is not to say that Europe shouldn't do something because Russia might take advantage of it; that's not exactly it. it is that the entire conversation around the internet, particularly on disinformation and particularly in Europe, is: the internet is bad. and it doesn't give much room for thinking about the tradeoffs -- the proportionality of restrictions that is part of legal traditions, but also part of what we want to preserve of the good there. so it doesn't get to what the rules should be around information. the last point i would make is that, again, at least the liberal freakout over the Pelosi video was instructive, right? this was not just a question of authoritarian regimes beyond our borders wanting what they call false information struck down; it's also here, and i think we have to think about whether our own debate is giving cover to the illiberal practices you described in "Consent of the Networked," you know, seven, eight years ago.
>> there's a debate going on among people, even just in D.C., who work on digital rights issues, over how to address this question -- with intermediary liability, Section 230 reform, how much, and so on -- but also just this question: a lot of people reacted to the Pelosi video saying, well, it's real easy, if it is a fake, a faked thing, you should have the company take it down. of course, everybody pointed to all kinds of satire
on Trump, on William Barr, and so on -- things that are obviously satire but could be taken down under such a rule. and David had a tweet recently along the lines of: please give me a rule that would prevent manipulation but at the same time not result in overcensorship, and the debate about that was really fascinating. not really answers, but i think this is part of the problem: there's often an approach to regulation of anything related to the internet, or expectations about what companies can do with their platforms, of "just fix it," like you fix a TV -- like a Samsung TV had a bug and you just fix the bug and then it's fixed, right? we're not going to fix this. we're not going to fix the issue of speech and where the line needs to be, and have a perfect rule -- whether it's a company's private rule or a law or whatever -- that's going to satisfy everybody. just as in governing a city, you're never going to have perfect law enforcement that never infringes rights, and services that everybody is really happy with how they're set up. you never will, and nor will you with a digital space in terms of how it is designed and regulated. it is going to be a constant tug-of-war around how speech and data are governed, and that's normal and natural. it is when we get to the point where it is not possible to have that debate that you're really in trouble. so it's about how we ensure
that there's the possibility to continue to identify where the abuse is and adjust, as we do in the governance of our physical spaces. it's just not going to get fixed.
>> i do think it's very important to recognize that you have to be making tradeoffs. if i think about going back to law school, right, it's nothing but hypotheticals about where you draw a line, and the cases on either side of that line don't make much sense, but you have to have a line, and there will always be tradeoffs. but i'm very mindful, David, of what you were saying about if Europe regulates something and says it is fake news -- this is like when the U.S. allowed torture. right? we allowed torture, in a limited way -- as far as i was concerned it should have been no way -- but of course that licensed governments around the world to do things far worse than we were doing. and i remember being in the State Department and human rights activists coming in saying, look, people are pointing to you: the U.S. government is doing this, so we can do this. and it's the same: if Europe, which is obviously a deeply rights-protecting environment, essentially starts to license the censorship of ideas -- you know, under British law you can't say what's not true; in the U.S. it's fine to say what's not true, you just can't defame.
>> one way i would twist that a little bit is that, you know, what happens in Europe doesn't stay in Europe.
>> right.
>> so it's not a question just about whether Egypt or Singapore or somewhere else adopts a rule. it is also about the fact that because the companies operate at scale, if there's a new European law around hate speech or disinformation -- i mean, i can't say for certain, but it is easier for a company to make a decision that, okay, we will just import that into the terms of service. so the rule that is
adopted for Europe will actually be the rule that also frames how Americans get to enjoy the platform. it's interesting -- we as Americans don't often think that we might actually be affected by other people's rules, but looking forward, this is one area where that is absolutely a possibility, particularly because we're not, as a political culture, really having a serious debate on these issues right now. Europe will make the decisions. maybe we in this room would be happy about that, but we're not participating in it.
>> all right, that seems like the moment to reach out there. yes, on the aisle -- wait for a microphone, please, and then introduce yourself. the gentleman in a light suit, halfway down here.
>> thank you. i have a blog on global transparency. David, you spoke about wanting to have disclosure of rules and decisions. you seem to suggest that there should also be disclosure about the process -- a transparent decision-making process. do you have a model for that, for corporate transparency in that realm? and do you have a strategy for getting there?
>> yes -- thanks for the question; this is really an access-to-information question. one way to respond is that i talk about having a kind of case law around platform decisions. right now we get very little out of the companies, at a granular level, as to how they make decisions, and from the platforms' perspective i think that actually harms them. the debate around many of these issues is, well, they're inconsistent -- they look inconsistent, and they certainly seem inconsistent because we don't know about the range of other cases that they've addressed. from time to time these will swirl up, and YouTube has
done this -- put up hypotheticals. but they should be pushing themselves, and we should be pushing them, to provide more and more granularity -- not just transparency, but a debate where we're more or less on the same page, with the same information. it would also give users the opportunity to decide whether they want to be on the platforms -- which, right now, it is very difficult not to be on many of the platforms, particularly outside the United States -- but at least if people have information about the rules, the sharing of data, all of these things, they're in a better position to protect their own rights.
>> we were talking yesterday -- Michael was talking about data portability being an absolutely essential part of any competition and any ability for choice. yes, there in the very back.
>> hi, i'm Mark Nelson. i wonder if you could say a few words about the multistakeholder governance approach on the internet, which gives voice to people but doesn't work very well -- we haven't taken it very seriously as a country or in the international arena -- and whether there's potential to move more toward a multistakeholder approach that would give voice to local media producers and others who are really shut out of a lot of these decisions, within a probably more legalized framework. i wonder if you don't think that has potential.
>> great question.
>> sure, i'm happy to take a stab at it. i think that there are a number of multistakeholder efforts addressing certain problems. you know, i think when people
think about multistakeholder governance, they think about ICANN and that kind of model, which is very specific around names and numbers and internet domains and doesn't deal with other things -- and talk shops that aren't really governing anything. but i think the best work on where we might take multistakeholder governance says that rather than trying to have one multistakeholder body to govern all things and solve all problems, it's more likely to be effective to have different multistakeholder efforts that bring together the people concerned with, involved in, affected by, and expert on one specific issue -- say, content moderation on Facebook -- and try to come up with a mechanism that brings in stakeholders to figure out a better way forward, rather than some big body that governs everything that relates to the internet. and we are starting to see some efforts that are bringing together stakeholders, including governments and NGOs and press organizations and so on.
>> i would only add, following on that point, that the organization Article 19 has been promoting social media councils -- nongovernmental but definitely multistakeholder approaches to dealing with the most difficult content questions. i think that might be one of the approaches that -- who knows if it would work, but i think it has promise, and it goes back to your question about how you
actually open up this space and get the involvement of the community -- particularly if you imagine social media councils in different regions or even in different national jurisdictions. again, there's the problem of government capture, but that might be a model for bringing in different actors as participants in the decision making.
>> i'll just add that the person who used to be the head of ICANN has a proposal at the U.N. for multistakeholder groups rolling up into a network of networks. as somebody who has written extensively on webcraft as opposed to statecraft, i love the idea, but when you actually see it laid out it is horrifically complex. we've got to start somewhere, though. yes -- right there.
>> thank you. i want to turn to the question of violent content online and how to address it, most recently manifested with the gun violence shootings. my question is, how should the U.S. and countries with a human rights framework address things like the Christchurch Call to Action and similar statements at the G20 and G7 that call on social media companies over violent content online?
>> i'm glad you raised that; i didn't have a chance to get to it earlier. i think there are at least a million ways to talk about content. one is -- there's a directive in the European Union, i'm not sure if it was adopted, and it's a legitimate thing for government to be doing --
there's no question about that. but governments often adopt rules -- and this is worse in other parts of the world -- that are vague, that don't provide the precision to ensure that platforms are incentivized to leave up robust debate rather than take down debate that is legitimate but might be edgy or problematic for some government. so one part of it is saying that any regulation of content needs to be precise as to what it is aiming to prevent online. after the Christchurch massacre, we saw Australia propose on a Tuesday and adopt on a Thursday a new law on livestreaming and violent material, as they called it. putting aside the real, deep problem of adopting a law that quickly without public input, there might be room to have a real debate over livestreaming and its impact. i think we need empirical evidence, real research, around the impact of livestreaming and the impact of video on incitement to violence -- i think we need more of that, and we need to see more in that framework. but there also may be possibilities for certain kinds of tradeoffs around certain kinds of content like livestreaming. it gives space for the companies to make decisions that don't require the same kind of timeliness that they did in the Christchurch situation -- so, you know, delays in livestreaming, for example. but there's the tradeoff: we also like to see livestreaming of public protest. we like to see livestreaming of police abuse --
yeah, exactly -- those kinds of things. this is all about tradeoffs when we're talking about this technology.
>> there have been a number of articles from the human rights community about how Syrian groups trying to document war crimes have seen their videos disappear from YouTube, as YouTube reacts to pressure from regulators to take down terrorist content -- and they're getting it wrong as a result. that's one very specific example of what happens when governments impose requirements that you're liable unless you keep everything that fits the bad category off your platform: platforms are going to overcensor in order to avoid the penalty. and that's well documented; it's always what happens. which is why i argue that companies aren't doing enough way upstream. before Facebook rolled out a feature, did they do any assessment other than how it was going to help them commercially? i'm pretty sure they didn't. there should be consultation and, you know, governance before features get implemented, about how to deal with the downsides. there hasn't been enough thought put into it, which means opening up their rulemaking, because that's essentially their legislative process, right?
>> there was a -- right there. yes, sir.
>> thank you. i work on Egypt. David and
Rebecca, you spend a lot of time thinking about how to regulate the big companies -- ranking rights and so on -- and i wouldn't be exaggerating to say that most of the effort is going into the big social platforms. but when we are successful in pushing bad actors out, they go elsewhere, to the open, unregulated internet that i grew up on -- the messy one that we all remember from the old times. extending this, how does this benefit speech if we end up with only the regulated big platforms?
>> when you introduced where you're coming from -- i'm not sure i totally caught all of the questions -- but think about a place like Egypt right now. actually, RDR, i think, started out with much more of a focus, and continues to do this, on government transparency -- that relationship between what a company does and what the government demands, and what companies are sharing about what governments are demanding. even the Global Network Initiative, which Rebecca mentioned earlier, got its start in many ways in thinking about how to deal with governments and what the company response to all of that should be. so, you know, i think in thinking about the issues you're mentioning, it seems to me that -- you know, dealing with the
situation that you're describing means, as Rebecca puts it perfectly, going upstream. this is deciding in advance, so that we're not really guinea pigs of the process, what the situation should look like, and providing tools for individuals -- maybe through decentralization and other tools -- providing those mechanisms for people to enjoy their rights.
>> you refer to a situation where maybe, you know, Facebook and Twitter get cleaned up, but then other spaces are full of nasty people who are basically doing things without any consequence. yeah, and this is huge, and it also relates to offline -- so, just for instance, law enforcement, and a bunch of other issues about these groups and how their online activities connect to who they are offline, and what governments are supporting or not supporting. the other thing -- and there have been discussions with some of the platforms -- is that researchers point out that oftentimes what happens is that violent extremists will start out on fringe sites and kind of organize the strategy for how they're going to then jump onto Facebook and go right up against the terms of service but not cross them. they plan, refine their campaign, and then move over with a very clear strategy to stay within the rules so
they don't get taken down or noticed, that kind of thing. platforms could work with researchers and communities that are tracking extremists of different kinds, who can alert the platforms: this is what they're planning; don't let it happen. there needs to be a lot more conversation and cooperation, it seems.
>> all right, last question -- and then we'll do final closing remarks. hold on for the microphone.
>> hi. my name is David Madden, the founder of an organization called Phandeeyar, which was one of the main groups battling with Facebook about the hate speech problem in Myanmar, and i would make a comment. we were talking earlier about the importance of who is in the room and who is participating; i wanted to make a point about the limitations of that. for various reasons, we and a number of other civil society groups had an unprecedented level of access. i personally went to Facebook headquarters in 2015 and warned them about how the platform was going to be used in Myanmar, the way radio had been used elsewhere. we went back with a long list of very detailed product and policy recommendations, things they needed to do. that was in 2017, before what later transpired there. and i think for us in Myanmar
the counterpoint, of course, is Germany, where for years you would barely find a Burmese-speaking person in the content operation or otherwise -- while in Germany, where they have regulation, there are a thousand people on content, and hate speech inciting violence comes down within 24 hours or the company faces fines. these may be global platforms, but we definitely have multiple tiers of global users. the rules that apply to folks in Germany -- the kind of environment that a Facebook user in Germany enjoys -- are completely different from those for a Facebook user in Myanmar. and i think we just need to recognize the limits, even when you have deep civil society engagement, quite technical, down to product recommendations -- the limitations of that. and i think it's, frankly, sobering.
>> so this was a great last question, the question to end on, and i'll just add to it. you two write about digital colonialism, and it is, as i underlined several times, exactly this: standards set in one place, maybe even with strong engagement, being applied to those who don't have the ability to do anything about it. so i'll let you --
>> that is a really important question, and it makes me think that maybe over the last hour or so we've been too easy on the companies.
>> uh-huh.
>> and i think you're absolutely right. one of the things that i try to write a little bit about in the book is exactly the difference you're discussing -- the difference between the powerful markets, which at the end of the day are seen as markets, and everywhere else. in a powerful market like Europe -- think of the right to be forgotten -- you have an existing rule-of-law structure. you have litigation as a
mechanism. you had the top court in the European Union make a decision, and it forced Google to make decisions and to adjudicate -- weird as that is -- to adjudicate relevance when a link needs to be taken down. that isn't available everywhere. you go to a place -- well, i was recently in Ethiopia, as an example; again, i think Ethiopia is a cautionary tale for Facebook -- and people there have no idea how to reach the company. and putting that aside, your point is right: even if they get access, what's the guarantee that Facebook or YouTube would do the right thing? i think that is a big question, and i do think it is why we should be moving to a place where -- first, we do need transparency, but we also need to talk about remedy, because where a platform facilitates harm there is an absence of compensation or reparation for that kind of damage. maybe if we move towards something like that, making it part of the discussion, we start to at least address it and force the companies to see that it doesn't really matter whether the harm is coming from Germany or from Myanmar; they need to treat it all the same.
>> wow.
>> one can say many things, but we're out of time, so i'm going to end on an optimistic note, which is that we've actually come a long way. you know, ten years ago, the policy discourse was: give everybody the internet and they will be set free; we just need lots of VPNs, and everything will become democratic. we've gotten a lot more sophisticated, both on the policy side and -- you know, while we're
very, very far from solutions -- the point is you never have solutions; look at governance at large in the world -- people are identifying the problems and working on the problems. there are people in the companies who realize that the status quo is completely unsustainable, and so some of the question is how to empower those people and help them move the dial in their companies. it's also an interesting moment in the United States where, right now, there's not a lot of U.S. leadership on what a global internet ought to look like, but maybe there's potential for some leadership in the future. and, you know, what should that look like? so there are some important questions we can work on. we have a long way to go, and it's a long-term effort, but the point is people are trying.
>> well -- both of you have agreed to advance that debate, so i'll close with my own reflection on the companies. what i'm left with after reading "Speech Police" most recently is that these are companies, commercial entities; they think about these as markets. and yet these are the public spaces, and they cannot just be companies. of course, it comes up over and over again that what really drives the response is that advertisers don't want to be associated with genocide or ethnic cleansing or terrorist
videos or the rest. but i was thinking -- when i was a law professor, when Monica and many others were in my class, i would say: as a lawyer you represent your client, but you are also a servant of the court. that is what you must be. you are part of a system of the rule of law, and you have an obligation to uphold that system -- you're a servant of the court, and you are, of course, also representing somebody. and somehow, in this process of talking about new legislative processes, these entities have to be both commercial and public, because governments can't do it alone, but certainly companies can't either. we can't just let these be commercial entities that respond only to commercial incentives. so i thank you both, and i thank all of you for your questions. join me in a round of applause. [applause]