
tv   Inside Story 2017 Ep 281  Al Jazeera  October 10, 2017 3:32am-4:01am AST

3:32 am
…eighty billion dollars from the UK to satisfy commitments made under EU treaties. The German chancellor says a deal limiting the intake of refugees will help forge a new coalition government. She has announced a cap on new arrivals of two hundred thousand a year in an attempt to unite the conservative bloc after heavy losses in last month's election. Immigration is seen as one of the biggest hurdles to forming a government; Germany took in more than one million migrants between 2015 and 2016. Catalonia's high court has requested extra security from Spain's national police on Tuesday in the event its regional parliament declares independence. Officials in the US state of California say ten people are dead as fast-moving wildfires spread across two different regions. Mass evacuations have taken place as the wildfires move through northern California's wine-producing region; flames covered eighty hectares of land in Napa County.
3:33 am
A second fire burned in neighboring Sonoma County. Those are the headlines; the news continues on Al Jazeera after Inside Story. Keep it here.

Spotting and stopping the fake news: Facebook says it's developed a tool to give users more context on what they read, but admits it's not always easy to flush out fiction from fact. So how do we deal with this modern-day reality, and where does it leave social media? This is Inside Story.
3:34 am
Hello, welcome to the program. In Doha, I'm Adrian Finighan. Fake news: there's no doubt it's become one of the key terms of 2017, brought into the spotlight after the US presidential election last November, when it became clear that thousands of people were unintentionally sharing articles that were full of misinformation. The resulting paranoia led people to accuse even once-trusted sources, including the New York Times, of carrying fake news. And social media is a core part of the problem. Facebook has finally decided to do something about it: it's testing a new feature called the 'i' button. When clicked, it gives the reader more information about the article they're reading. The idea is to give people context about their news sources so they can decide if articles are from publishers they trust, and if the stories themselves are credible. Facebook has come under attack for not doing this earlier. Critics say that the company has a responsibility to protect its users from fake news, but Facebook's chief security
3:35 am
officer says it's not so simple. Alex Stamos tweeted on Saturday that lots of journalists have celebrated academics who've made wild claims about how easy it is to spot fake news and propaganda, without considering the downside of training machine-learning systems to classify something as fake based upon ideologically biased training data, and that an understanding of the risks of machine learning drives small-c conservatism in solving some issues.

Facebook isn't the only tech company struggling to contain fake news. YouTube has changed its search algorithm to promote videos from mainstream news outlets instead of accounts that may serve up conspiracy theories, but it hasn't said how it deems an account credible. YouTube's owner Google has started to identify and label fact-checked articles in the US and some European countries. It has also allowed people to report misleading content
3:36 am
that shows up in its autocomplete search function, or in the featured snippet that pops up above all of the other search results. Twitter has also been criticized for allowing the spread of misinformation, but it's taken a very different approach to the issue. Twitter's vice president of public policy wrote in a blog post in June that 'we cannot distinguish whether every single tweet from every person is truthful or not. We as a company should not be the arbiter of truth.'

So let's bring in our guests for today; they're all in London. Alastair Reid is a social media journalist at the Press Association who specializes in the spread of misinformation online. Sebastian Moss is technology reporter for Data Center Dynamics. And Tom Law is the director of campaigns and communications at the Ethical Journalism Network. Gentlemen, welcome to you all. Alastair, let's start with you. What do you make of this Facebook 'i' button initiative? Is it going to be enough to stop the dissemination of fake, malicious news on the platform?

Well, it is an excellent start,
3:37 am
and having a button around stories which provides extra context is something which for a long time has been missing from Facebook. People see stories in their feeds and they're not necessarily sure where a story came from, not necessarily sure who wrote it. So it is an excellent start, but in my view there is the problem that if people want to believe something, then no amount of context will stop them; they will believe it and perhaps dismiss the context that comes with it.

Is it that Facebook has grown too big, too quickly, to be able to keep control of the material that's published on it?

That is a part of it, but it's not just a one-dimensional issue here; there are many factors in play which contribute to the spread of misinformation and false news and disinformation, whatever we want to call it. The platforms themselves, how they work and how we interact with people on them, that's a big part of it. The oversight that goes into these platforms, and how content spreads around them, is part of it. And so are the actions of people outside of the
3:38 am
platforms: media organizations, politicians, individuals. All of these are factors in how this spreads. But it is good, as we've been saying, that this new button has been added to address it.

Sebastian, is it possible to put the fake-news genie back in the bottle now that it's out?

I mean, it's never possible. You just need to look at journalists and how much they struggle to work out what's true and what's not; it's a daily battle. For Facebook, obviously people are going to share what they want to share, and you can do things like the 'i' initiative to raise some awareness, but at the end of the day, if people want to share something, they're going to share it.

So what do you think Facebook should be doing, then, to make sure that this misinformation, this false, fake news, doesn't get out there?

I think the 'i' is a good start, but it's a very small button and, to my knowledge, people are not clicking it. At the end of the day you need a lot of people. Facebook made three point nine billion dollars in profit last quarter, so they can afford a few people to be working at least on the biggest news stories being shared,
3:39 am
looking at them, seeing if they're false or not, and then flagging that, with maybe a bigger, more visible icon.

Alastair, what were you saying there about Facebook having perhaps grown too big, too quickly? Given the number of people that use it (what is it, something like a quarter of the world's population?), surely you can never have enough humans, enough journalists, looking at content to sift out the bad stuff, can you, with the amount of material that's published on the platform?

Facebook is obviously the biggest platform in the world when it comes to information sharing. It has more than two billion monthly active users, and that is a huge proportion of the world's population. As Sebastian said, hopefully Facebook plans to employ more people to be able to oversee this, to oversee how information gets shared. But, as you also mentioned, there is the problem that people are now just believing what they want to believe. As human beings, the way our brains work, we're more biased to believe something if it confirms how we understand the world to be, or how we want the world to be, and those
3:40 am
kinds of stories that fit with our worldview bypass our critical-thinking faculties. So we're often going to believe something no matter what fact-checkers or journalists say. That sounds like a very pessimistic view of all this, but what we need to do, in my view, is yes, have the context, absolutely, but also raise the importance and the value that people place on the truth, on providing proof for what they're saying, and on the negative consequences of lying. Online, especially on social networks, and not just Facebook, a lot of those social factors have disappeared. If I were to come on here and just tell many lies, I hopefully wouldn't be invited back. But on these social networks, people get a thumbs-up or a star as positive reinforcement for spreading lies that fit with people's worldview, and that acts almost as a verification mark. So it's a very multifaceted problem with how those networks allow this kind of false information to spread, one that journalists and real people are going to be able
3:41 am
to help with to a degree, but not in totality.

OK, we'll look at some of the ways in which perhaps that aspect of the problem can be solved in a moment. Sebastian, do you agree with that, that this is perhaps more of a social problem than a technology problem?

Yeah. At the end of the day, you are responsible for what you decide to read, what you decide to share, what you decide to like, and some of the fake news stories that went viral seem quite obviously fake, so we do need to work on ourselves, on how to control what we ingest. But at the same time, Facebook does have a duty as that middleman, as a company that is increasingly trying to get involved in the media sphere, trying to increase the number of publishers and websites that go on it. If it wants to be part of the media industry, it needs to understand that it also has to take part in that struggle to work out what is true and what's not.

Tom Law, from the Ethical Journalism Network, should Facebook and other social media
3:42 am
platforms and search engines be forced to give priority to information that is in the public good?

Well, rather than being forced, it would be much better if they took those reforms upon themselves. What we've seen recently in Germany are laws being created to fine Facebook, Google and others huge amounts of money over hate speech and fake news online, and that will probably result in both of those companies removing content hastily, which will actually have a chilling effect on freedom of speech. So self-regulation, at an industry level and also at the individual level for those companies, is what we need. And so far we've seen that, without public pressure and without scandals such as the misinformation broadcast after the recent Las Vegas shooting, it doesn't happen. We have to find a way to address this now. The argument that this is going to have a chilling effect on freedom of speech, I think, is a bit of
3:43 am
a false argument to make. When people go on Facebook to find out news about a terrorist attack or a breaking news story, they want to find accurate information, and the fact that it took so long for misinformation to be removed from the algorithm is very problematic. But what we need to know is: how is it removed from the algorithm? What choices were made, under what guidelines? We can't labour under the false idea that algorithms were not made by people, with guidelines as to what they should show. So if Google, Facebook, Twitter and others are going to make these adjustments, they have to be more transparent about how they do it, and if they're going to hire human editors to make choices, we have to know the guidelines under which they operate, so that they can be held accountable by their audiences.

So, Facebook cites business reasons to explain why
3:44 am
it doesn't talk about how its algorithms work, but what you're saying is that Facebook's apparent lack of respect for transparency and ethical standards is eroding trust not just in it but in other social media platforms too?

Absolutely. Some of Facebook's guidelines were leaked to the Guardian a few months ago, and they published some great reporting called the Facebook Files, with examples of what Facebook allows and doesn't allow, and some of it just doesn't make any sense. Often in one context publishing a certain image, of an extremist group, say, or of a terrorist, or a terrorism symbol, would be warranted, and other times it wouldn't be. But those choices should be made, as we make them in journalism, on a case-by-case basis, depending on ideas such as what's in the public interest. And what you have is Facebook employing thousands of
3:45 am
people, not directly as Facebook employees but through other companies, who are under extreme time pressure to go through and sift content and hit delete, or raise it to one of their superiors, based on very, very basic criteria. Now, first of all, these guidelines should have been published, so that journalists like us and civil society groups can have a real debate around what content is acceptable and what content isn't, and in what context. Otherwise, eventually all of these social media platforms will have their guidelines leaked and get the exposure that way. I would say it's much better, from a PR point of view, for these companies to get ahead of the game and have much better engagement and a conversation with the journalist community and the media community more widely. The longer they hide behind this idea of 'we can't talk about this because it's intellectual property' or 'you won't understand it because it's coding', and these
3:46 am
kinds of arguments, I don't think it benefits them. I don't think it benefits consumers. I don't think it benefits the people who are their audiences, who don't want clearly false, misleading information put in front of them when they're scrolling through their news feed.

OK, you raised a couple of issues there. Alastair, should there be public oversight of the technology that these companies are using to fill our news feeds and search results? Should algorithms be regulated? This is the big secret that they won't talk about, for business reasons. Is it even possible to regulate an algorithm?

Well, first you need to understand what the algorithm is, and because they've been keeping that secret for so long, it's very difficult to understand when and how it might be regulated. I do agree with Tom that we need that kind of very public oversight: not necessarily being directly involved in what the algorithm
3:47 am
does, in deciding what actions it can or cannot take, but in understanding how it works, understanding the moderation policies, and having a public conversation around that which allows different voices to be heard. Because what we see, not just on Facebook but on all kinds of these online platforms, is that certain pieces of content are removed, like, for example, breastfeeding videos, while hate speech is allowed to stay up, because of the way the moderation terms and content guidelines have been drawn up: certain things seem to take priority which aren't necessarily harmful in a public way, but which offend guidelines against certain types of content. So we still need to have an actual conversation, as Tom says, about what is acceptable and what isn't acceptable, because there are always going to be people who will speak out against censorship, no matter what that censorship is. But having a public conversation about it is how we have had our laws built over many,
3:48 am
many years, decades, centuries. Online, though, we haven't yet been able to apply that same thought process, that same public process, to understanding what is acceptable and what is not, and how it is dealt with.

Sebastian, you were nodding at some of the things we've discussed in the last few minutes. Let's just pick up on one thing that Tom was saying a moment ago. Are the assurances from companies such as Google and Facebook that they're cracking down on fake news only a result of the bad publicity that's been generated since the presidential election last November?

I mean, almost definitely. Right after the election you had Facebook's CEO say it was 'pretty crazy' to think that fake news on Facebook had any impact on the election, and now they've kind of walked back that statement, but only because of the mounting media attention and the mounting political attention. Otherwise they would just want to brush this under the carpet. They don't want to have to deal with this. This is
3:49 am
complicated, it's difficult, it's expensive; there's no easy way out of this, no easy way to do this. They just want to increase profits, they want to increase the number of articles people share. It's preferable to them that people share content they like, which puts them into narrow groups, because then they can be targeted with ads. They don't want to have to deal with this, and they're only dealing with it because of media attention, again and again and again.

So is that suggestive of why companies such as Google and Facebook appear to act quickly in some cases and not in others when journalists, or even just other users, flag articles as fake or malicious?

I think it's a mixture: they don't want to deal with it, they don't know how to deal with it, and they're worried that they'll deal with it wrong and spark regulation. That's the thing they're scared of most: if regulation comes along and destroys their business models, that's the one thing that can actually stop them. So they're very scared, and they're approaching this very carefully.

Tom Law, two-thirds of American adults get their news from social media. Companies like
3:50 am
Google and Facebook dominate public discourse. Should they be designated and regulated as publishers, and made to abide by the laws that traditional media companies, companies like this one for instance, and other publishers have to follow?

Well, they're not traditional publishers, not news organizations like Al Jazeera and others, but I think you can't completely ignore them in that sense. I think Facebook, Google and others would prefer to find a way of doing that themselves, but at the moment they're not meeting that challenge. The fact that they do remove some misinformation, or some deliberate content that's masquerading as news, is fine, but what is the process by which it's removed? What are the guidelines they choose to use to do that? Without publishing
3:51 am
that, so that audiences can hold them to account as they would another news organization (you can look at a news organization's code of ethics; you can complain to its ombudsman), having a much better process around all of this is what's needed. We have to come to terms with the fact that, yes, people are gathering their news from social networks more and more, but at the same time we have to realize that the duopoly of Google and Facebook has completely undercut the finances of the way that journalism used to fund itself, which used to nourish good reporting. And going back to the Las Vegas incident: neither the New York Times nor the Washington Post found their First Amendment rights, or their wish for free speech, in any way inhibited by the values of journalism, by making sure they were accurate, they were independent, they were impartial, and to make sure
3:52 am
that they were correct in what they reported. So I think there's a lot that these tech companies could learn from journalism, and from journalists, if they were willing to listen more and not hide behind these arguments around intellectual property and freedom of speech online.

And who should be accountable, Tom Law, for the lies published on social media? What penalty should be paid, and by whom? I mean, you can't hold an algorithm or a piece of code to account, can you?

No, you can't, but often these algorithms are gamed by people who know exactly how to make their results appear at the top of Google, or to feature in the Facebook news feed. You can do that by paying: adverts have been much in the news recently, about how they
3:53 am
were used during the US election. So the people that publish that content in the first place must be highlighted, and it should remain very clear whether they are purveyors of misinformation or whether they are more reliable outlets. Now, how exactly that's highlighted on Facebook and on Google, I know they are working on. In terms of repercussions, we need to think very carefully; we don't want a chilling effect on freedom of speech and people's ability to communicate online. But I think it's very important that media consumers understand in far greater detail what they're consuming and what they can trust online, and that comes down to efforts at media literacy, news literacy, which I think is essentially important. We also have to ask ourselves, as individuals, what we are doing to make sure that we're not sharing false information: that just because we
3:54 am
agree with something, we don't just click like and click share. So it's a mixture of regulation at the individual level, at the industry level, and at the level of the companies themselves; we need to find answers to all of those.

Alastair, should society as a whole take more responsibility for people's gullibility when it comes to news? Should information literacy be taught in schools, for example, to ensure that people are more discerning and more critical of what they see and read online?

Absolutely, one hundred percent. That is something which should be taught in schools from a very early age: how to assess a piece of information or a piece of text, and understand whether it's been sourced at all. When I was at school, in history lessons, you would look at primary sources and secondary sources to see whether something was accurate, who wrote it, and what influences there might have been on the person when they wrote it. And if history has taught us anything, it's that skilled liars, when given a platform, can be very dangerous. And when people aren't bringing those intrinsic
3:55 am
values of holding truth up, and holding people to account for their words, that can have a very dangerous effect on society. So in terms of individuals taking responsibility, teachers teaching this in schools, politicians, whoever it may be, bringing that kind of education into society as a whole can only be a positive thing, in my view.

Are you saying that basically you have to change the behaviour of the audience to change the behaviour of anyone who's producing the fake news?

Well, if audiences are more discerning in what they view, then the fake news won't get shared as much; it will wither on the vine. If people are more discerning about the information they consume, can spot mistruths more readily, and there are actual social repercussions for certain false news, repercussions which happen in the real world but don't necessarily happen online, then that
3:56 am
misinformation won't travel as much. So yes, there should be repercussions for the people who share it. Not necessarily legal or legislative repercussions, because people can share misinformation accidentally, but it is on people, on individuals, to be responsible for what they share. And, as I said before, that is very difficult online. When it comes to Facebook and Twitter and Google and everything else, there are positive reinforcements for sharing things which might not be true but which chime with people's views; again, it acts as a verification marker. If there are five hundred little hearts next to something which is a blatant lie, people might believe it, because it's been viewed and heard so often. We have to bring some of those social repercussions that happen in the real world back to the online world, and we can only do that together.

OK, Sebastian, very briefly, because we're almost out of time: do you agree with that?

I actually believe in people's right to share things, even things that are stupid, to be honest. I
3:57 am
think if someone wants to share something that's not entirely true, they might share it in the belief that it is true. We shouldn't always expect everyone to have a full understanding of all the rights and wrongs of publishing and news. We can hopefully try to teach them, but at the end of the day, Facebook, in my view, should be paying for a third-party independent body that looks at this, looks at the algorithm, and works out how to at least try to tackle some of the biggest offenders and take on the people that are sharing those stories, I don't know.

OK, really good to talk to you, gentlemen. Thank you very much indeed, Alastair Reid, Tom Law and Sebastian Moss. And, as always, thank you for watching. Don't forget, you can see the program again at any time by going to the website aljazeera.com, and for further discussion join us on our Facebook page at facebook.com/insidestory. You can also join the conversation on Twitter at @AJInsideStory. From me, Adrian Finighan, and the whole team here in Doha, thanks for
3:58 am
watching. I'll see you again; goodbye for now.

Witness: documentaries that open your eyes, at this time on Al Jazeera.
3:59 am
The street is quiet. The signal is given that it's safe to walk to school. Last year there were more than thirty murders in this community in one month. The police say this area is a red zone, one of several in these townships, and children are sometimes caught in the crossfire when rival gangs fight. So parents and grandparents have started what they call a 'walking bus' to try to keep the children safe from the violence. 'I lost my son...' 'I also lost my...' There are more than one hundred and fifty volunteers working for several walking buses. Teachers say it is working: class attendance has improved. The volunteers also act as security guards.

An age-old part of Spanish culture: 'I can't stop thinking of all the bulls... It is a barbaric sport and a symbol of central government. We shouldn't carry on something that goes against
4:00 am
the morals of...' From the Catalan nationalist perspective, they believe... Catalonia's Last Bullfight, at this time on Al Jazeera.

Al Jazeera, wherever you are.

I'm Richelle Carey in Doha. Let's take a look at the top stories on Al Jazeera. Turkey's president has described the US decision to suspend most visa services for Turkish citizens as upsetting. The move came in response to the arrest of a US consulate employee in Istanbul, accused of links to last year's failed coup. Turkey immediately imposed a similar measure.
