Inside Story 2017 Ep 281 | Al Jazeera | October 9, 2017, 8:32pm-9:01pm AST

8:32 pm
... Rakhine state. More than sixty people drowned when two boats capsized last month. The International Committee of the Red Cross says it will reduce its operations in Afghanistan after seven members of its staff were killed there this year. Six died in an attack on an aid convoy in February, and last month a physiotherapist was shot dead by one of her patients. Another four workers have been abducted. At least ten people have been killed in air strikes in Syria's Idlib province, despite it being part of a de-escalation zone. The strikes hit the popular market town of Maarat al-Numan; sources there say the strikes were launched by Syrian government warplanes. Iraqi forces have launched an operation to clear mines left by ISIL in Hawija after the group was driven out of the city last week. During the two-week offensive, thousands of civilians and ISIL fighters fled the city. We'll have more on everything
8:33 pm
coming up in the News Hour. I'll see you for that in twenty-five minutes' time. That's after Inside Story, which starts now. Spotting and stopping the fake news: Facebook says it's developed a tool to give users more context on what they read, but admits it's not always easy to sort fiction from fact. So how do we deal with this modern-day reality, and where does it leave social media? This is Inside Story.
8:34 pm
Hello and welcome to the program in Doha. I'm Adrian Finighan. Fake news: there's no doubt it's become one of the key phrases of twenty seventeen, brought into the spotlight after the US presidential election last November, when it became clear that thousands of people were unintentionally sharing articles that were full of misinformation. The resulting paranoia led people to accuse even once-trusted sources, including the New York Times, of carrying fake news. And social media is a core part of the problem. Facebook has finally decided to do something about it. It's testing a new feature called the "i" button; when clicked, it gives the reader more information about the article they're reading. The idea: to give people context about their news sources, so they can decide if articles are from publishers they trust and if the stories themselves are credible. Well, Facebook's come under attack for not doing this earlier. Critics say that the company has a responsibility to protect its users from fake news, but Facebook's chief security
8:35 pm
officer says it's not so simple. Alex Stamos tweeted on Saturday that lots of journalists have celebrated academics who've made wild claims about how easy it is to spot fake news and propaganda, without considering the downside of training machine-learning systems to classify something as fake based upon ideologically biased training data; an understanding of the risks of machine learning drives small-c conservatism in solving some issues. Facebook isn't the only tech company struggling to contain fake news. YouTube has changed its search algorithm to promote videos from mainstream news outlets instead of accounts that may serve up conspiracy theories, but it hasn't said how it deems an account credible. And YouTube's owner, Google, has started to identify and label fact-checked articles in the US and some European countries. It's also allowed people to report misleading content
8:36 pm
that shows up in its autocomplete search function, or in the featured snippet that pops up above all of the other search results. Twitter has also been criticized for allowing the spread of misinformation, but it's taken a very different approach to the issue. Twitter's vice president of public policy wrote in a blog post in June that "we cannot distinguish whether every single tweet from every person is truthful or not. We, as a company, should not be the arbiter of truth." So let's bring in our guests for today. They're all in London. Alastair Reid is a social media journalist at the Press Association who specializes in the spread of misinformation online. Sebastian Moss is technology reporter for Data Centre Dynamics, and Tom Law is the director of campaigns and communications at the Ethical Journalism Network. Gentlemen, welcome to you all. Alastair, let's start with you. What do you make of this Facebook "i" button initiative? Is it going to be enough to stop the dissemination of fake, malicious news on the platform? Well, it is an excellent start,
8:37 pm
and having a button around stories which provides extra context is something which, for a time, has been missing from Facebook. People see stories in their feeds, and they're not necessarily sure where it came from, not necessarily sure who wrote it. So it is an excellent start, but in my view there is the problem that if people believe something, if they want to believe something, then no matter what amount of context, they're going to believe it, and maybe dismiss the context that comes with it. Is it that Facebook has grown too big, too quickly, to be able to control, or to keep control of, the material that's published on it? That is a part of it as well. I mean, it's not just a one-dimensional issue here. There are many factors in play which contribute to the spread of misinformation and false news and disinformation, whatever we want to call it. The design of the platforms themselves, how they work and how we interact with people on there, that's a big part of it. The oversight that goes into these platforms, and how content spreads around them, is part of it, and also the actions of people outside of the
8:38 pm
platforms: of media organizations, of politicians, of individuals. All of these are factors in how this spreads. But it is good, as we've been saying, that this new button has been added to address it. Sebastian Moss, is it possible to put the fake news genie back in the bottle, now that it's out? I mean, it is never possible. You just need to look at journalists and how much they struggle to work out what's true and what's not; it's a daily battle. For Facebook, obviously, people are going to share what they want to share, and you can do things like the "i" initiative to raise some awareness, but at the end of the day, if people want to share something, they want to share it. So what do you think Facebook should be doing, then, to make sure that a lot of this misinformation, this false, fake news, doesn't get out there? I think, I mean, the "i" is a good start, but it's a very small button, and people are not going to click it, to my knowledge. I think at the end of the day you need a lot of people. They made three point nine billion dollars in profit in the last quarter, so they can afford a few people to be working at least on the biggest news stories that are being shared,
8:39 pm
looking at them and seeing if they're false or not, and then flagging them with maybe a bigger, more visible icon. Alastair, what were you saying there about Facebook having perhaps grown too big, too quickly? I mean, given the number of people that use it, what is it, something like a quarter of the world's population, surely you can never have enough humans, enough journalists, looking at content to sift out the bad stuff, can you, with the amount of material that's published on the platform? Facebook is obviously the biggest platform in the world when it comes to information sharing. It has more than two billion monthly active users, and that is a huge proportion of the whole population. As Sebastian said, hopefully, as Facebook plans, it can employ some more people to be able to oversee this, and to oversee how information gets shared. But as you also mentioned, there is the problem that people are now just believing what they want to believe, and as human beings, the way our brain works, we're more biased to believe something if it confirms how we understand the world to be, or how we want the world to be. And those
8:40 pm
kinds of stories that fit with our worldview bypass our critical-thinking faculties, so we're often going to believe something no matter what fact-checkers or journalists say. And, you know, that sounds like a very pessimistic view of all of this, but what we need to do, in my view, is, yes, have the context button, absolutely, but also raise the importance and the value that people place on the truth, on providing proof for what they're saying, and on the negative aspects of lying. Online, especially on social networks, and not just Facebook, a lot of those kinds of social factors have disappeared. If I was to come on here and just tell many lies, I hopefully wouldn't be invited back, but online, on these social networks, people get a thumbs up or a star as positive reinforcement for spreading lies that fit with people's worldview, and it acts almost as a verification marker. So it is a very multifaceted problem with how those networks work in allowing this kind of false information to spread, one that journalists and real people are going to be able
8:41 pm
to help with to a degree, but not in totality. OK, so we'll look at some of the ways in which perhaps that aspect of this problem can be solved in a bit. But Sebastian, do you agree with that, that this is perhaps more of a social problem than a technology problem? Yeah, I mean, at the end of the day, you are responsible for what you decide to read, what you decide to share, what you decide to like. And there are some fake news stories that went viral which seem quite obvious as fake news stories. So we do need to work on ourselves, and work out how to control what we ingest. But at the same time, Facebook does have a duty as that middleman, as a company that is increasingly trying to get involved in the media sphere, trying to increase the number of publishers and websites that go on it. And if it wants to be part of the media industry, it needs to understand that it also needs to take part in that struggle to work out what is true and what's not. Tom Law, from the Ethical Journalism Network, should Facebook and other social media
8:42 pm
platforms and search engines be forced to give priority to information that is in the public good? Well, I don't think they should be forced; it would be much better if they took that responsibility upon themselves. What we've seen recently in Germany are laws being created to fine Facebook, Google and others huge amounts of money for hate speech and fake news online, and that will probably result in both of those companies removing content hastily, and actually having a chilling effect on freedom of speech. So self-regulation, on an industry level and also on the individual level for those companies, is what we need. And so far, we've seen that without public pressure, and without scandals such as the misinformation broadcast after the recent Las Vegas shooting, they don't act; we have to find a way to address this now. The argument that's made, that this is going to have a censoring effect on freedom of speech, I think is a bit of
8:43 pm
a false argument to make. When people go on Facebook to find out news about a terrorist attack or a breaking news story, they want to find accurate information, and the fact that it took so long for misinformation to be removed from the algorithm is very problematic. But what we need to know is: how was it removed from the algorithm? What choices were made, under what guidelines? We can't labour under the false idea that algorithms were not made by people, with guidelines as to what they should show. And so if Google, Facebook, Twitter and others are going to make these adjustments, they have to be more transparent about how they do it. And if they're going to hire human editors to make choices, we have to know the guidelines under which they operate, so that they can be held accountable by their audiences. So, Facebook cites business reasons to explain why
8:44 pm
it doesn't talk about how its algorithms work. But what you're saying is that Facebook's apparent lack of respect for transparency and ethical standards is eroding trust not just in it, but in other social media platforms too? Absolutely. I mean, some of Facebook's guidelines were leaked to the Guardian a few months ago, and they published some great reporting called the Facebook Files, with examples of what Facebook allows and doesn't allow, and some of it, looking at it, just doesn't make any sense. Often, in one context, publishing a certain image of an extremist group, or of a terrorist or a terrorism symbol, would be warranted; at other times it wouldn't be. But those choices should be made, as in journalism we make them, on a case-by-case basis, depending on ideas such as what's in the public interest and everything else. And, you know, what you have is Facebook employing thousands of
8:45 pm
people, not directly as Facebook employees but through other companies, who are under extreme time pressure to go through and sift through content and hit delete, or raise it to one of their superiors, based on very, very basic criteria. Now, first of all, these should have been published, so that journalists like us and others, and civil society groups, can have a real debate around what content is acceptable and what content isn't, and in what context. Otherwise, eventually, other social media platforms will have their guidelines leaked, and get the exposure that way. I would say it's much better, from a PR point of view, for these companies to get ahead of the game and have a much better engagement and conversation with the journalist community, and the media community more widely. The longer that they hide behind this idea of "we can't talk about this because it's intellectual property", or "you wouldn't understand it because it's coding", and these
8:46 pm
kinds of arguments, I don't think it benefits them. I don't think it benefits consumers. I don't think it benefits the people who are their audiences, who don't want to have clearly false, misleading information put in front of them when they're scrolling through their news feed. OK, you raised a couple of issues there. Alastair, should there be public oversight of the technology that these companies are using to fill our news feeds and search results? Should algorithms be regulated? This is the big secret that they won't talk about, for business reasons. Is it even possible to regulate an algorithm? Well, first you need to understand what the algorithm is, and because they've been keeping that secret for so long, it's very difficult to understand when and how it may be regulated. I do agree with Tom that we need that kind of very public oversight: not necessarily being directly involved in what the algorithm
8:47 pm
does, in deciding its terms, in deciding what actions it can or cannot take, but in understanding how it works, understanding the moderation policies, and having a public conversation around that, which allows different voices to be heard. Because what we see, not just on Facebook but on all kinds of these online platforms, is that certain pieces of content are being removed, like, for example, breastfeeding videos, while hate speech is allowed to stay up, because of the way the terms of moderation and the content guidelines have been drawn up. Things seem to be taking priority which aren't necessarily harmful in a public way, but which may offend guidelines against certain types of nudity or other things. So there needs to be an actual conversation, as Tom says, about what is acceptable and what isn't acceptable, because there are always going to be people who will speak out against censorship, no matter what that censorship is. But having a public conversation about it is how we have had our laws built over many,
8:48 pm
many years, decades, centuries. But online, we're not ready for that. We haven't actually been able to apply that same thought process, that same public process, to understanding what is acceptable and what is not, and how that is dealt with. Sebastian, you're nodding at the things that we've discussed in the last few minutes. Let's just pick up on one thing that Tom was saying a moment ago. Are the assurances from companies such as Google and Facebook that they're cracking down on fake news only a result of the bad publicity that's been generated since the presidential election last November? I mean, you know, almost definitely. Right after the election, you had Facebook's CEO say it was a "pretty crazy" idea that Facebook fake news had any impact on the election, and now they've kind of walked back that statement, but only because of the mounting media attention and the mounting political attention. Otherwise, they would want to just push this under the carpet. They don't want to deal with this. This is
8:49 pm
complicated, it's difficult, it's expensive. There's no easy way out of this, no easy way to do this. They just want to increase profits; they want to increase the number of articles people share. It's preferable to them that people share content they like, and that puts them into narrow, narrow groups, because they can then target them with narrow ads. They don't want to have to deal with this, and they're only dealing with it because of media attention, again and again and again. So is that, Sebastian, why companies such as Google and Facebook don't appear to act quickly when journalists, or even just other users, flag articles as fake or malicious? I think it's a mixture of: they don't want to deal with it, they don't know how to deal with it, and they're worried that they'll deal with it wrong, and then they'll spark regulation. That's the thing they see and are scared of the most. If regulation comes along and destroys their business models, that's the one thing that can actually stop them. So they're very scared, and they're approaching this very carefully. Tom Law, two thirds of American adults get their news from social media. Companies like
8:50 pm
Google and Facebook dominate public discourse. Should they be designated and regulated as publishers, and made to abide by the laws that traditional media companies, companies like this one, for instance, and other publishers have to follow? Well, they're not traditional publishers, as news organizations like Al Jazeera and others are, but I think that you can't completely ignore them in a publishing sense. I think it would be much better, and I think Facebook and Google and others would prefer, for them to find a way of doing that themselves. But at the moment, they're not meeting that challenge. And the fact that they do remove some misinformation, or some deliberate content that's masquerading as news, is fine, but, you know, what is the process by which that's removed? What are the guidelines they choose to use to do that? And without publishing
8:51 pm
that, how can audiences hold them to account, like they would another news organization? I mean, you can look at a news organization's code of ethics; you might have a complaints body that you can complain to. So having a much better process around that is what's needed. And we have to come to terms with the fact that, yes, people are gathering their news from social networks more and more and more, but at the same time we have to realize that the duopoly of Google and Facebook has completely undercut the finances of the way that journalism used to fund itself, which used to nourish good reporting. And, you know, going back to the Las Vegas incident, no one from the New York Times or the Washington Post found their First Amendment rights, or their wish for free speech, in any way inhibited by the values of journalism: by making sure they were accurate, they were independent, they were impartial, and making sure that they
8:52 pm
were correct in what they reported. So I think there's a lot that these tech companies could learn from journalism, and from journalists, if they were willing to listen more, and not hide behind these arguments around intellectual property and freedom of speech online. And who should be accountable for the lies, Tom Law, lies published on social media? What penalty should be paid, and by whom? I mean, you can't hold an algorithm, a piece of code, to account, can you? No, you can't, but often these algorithms are gamed by people who know exactly how to make their results appear at the top of Google, or to feature in the Facebook news feed. You can do that by paying, you know; adverts have been much in the news recently, in terms of how they
8:53 pm
were used during the U.S. election. So the people that publish that content in the first place, you know, must be highlighted, and it should remain very clear whether they are purveyors of misinformation or more reliable outlets. Now, how exactly that's handled on Facebook and on Google, I know they are working on that. In terms of, you know, repercussions, we need to think very carefully about that. We don't want to have a chilling effect on freedom of speech and people's ability to communicate online, but I think it's very important that media consumers understand, in far greater detail, what they're consuming and what they can trust online. And that comes down to efforts of literacy, for me, media literacy, which I think is, you know, essentially important. We also have to ask ourselves, as individuals, what are we doing to make sure that we're not sharing false information; that just because we
8:54 pm
agree with something, we don't just click like and click share. So it's a mixture of regulation on the individual level, on the industry level, and at the companies themselves that we need to find answers to. Alastair, should society as a whole take more responsibility for people's gullibility when it comes to news? Should information literacy be taught in schools, for example, to ensure that people are more discerning and more critical of what they see and read online? Absolutely, one hundred percent. It's something which should be taught in schools from a very early age: how to assess a piece of information, or a piece of text, and understand whether it's been sourced at all. When I was at school, in history lessons, you would look at primary sources and secondary sources and things like that, to see who wrote something and what influences there might have been on the person when they wrote it. And if history has taught us anything, it's that skilled liars, when given a platform, can be very dangerous. And when people aren't bringing those intrinsic
8:55 pm
values of holding truth up, and holding people to account for their words, that has a very dangerous effect on society. So in terms of individuals taking responsibility, teachers teaching this in schools, politicians, or whoever it may be, bringing that kind of education into society as a whole can only be a positive thing, in my view. Are you saying that, basically, you have to change the behavior of the audience to change the behavior of anyone who's producing the fake news? Is that what you're saying? Well, I mean, if audiences are more discerning in what they view, then the fake news won't get shared as much; it will wither on the vine. If people are more discerning about the information they consume, and they can spot mistruths more readily, and there are the kind of extra social repercussions for certain false news which happen in the real world, which don't necessarily happen online, then that
8:56 pm
misinformation won't travel as much. So, yes, there should be repercussions for the people who share. I don't necessarily mean, you know, legal or legislative repercussions, because people can share misinformation accidentally, but it is on people, on individuals, to be responsible for what they share. And as I said before, that is very difficult online. When it comes to things like Facebook and Twitter and Google and everything else, there are positive reinforcements around sharing things which might not be true but which chime with people's views, and that acts, again, as a verification marker. If there are five hundred thumbs up, or five hundred little hearts, next to something which is a blatant lie, people might believe it because it's been viewed and heard so often. But we have to bring some of those social repercussions that happen in the real world back to the online world, and we can only do that together. OK, Sebastian, very briefly, because we're almost out of time, do you agree with that? I actually believe in people's right to share things, even things that are stupid, to be honest. I
8:57 pm
think if someone wants to share something that's not entirely true, they might share it in the belief that it's true. We shouldn't always expect everyone to have a full understanding of all the rights and wrongs of publishing and news. We can hopefully try and teach them, but at the end of the day, Facebook, in my view, should be paying for a third-party, independent body that looks at this, looks at that algorithm, and works out how to at least try and tackle some of the biggest offenders, to take out the people that are sharing these stories. OK, really good to talk to you, gentlemen. Thank you very much indeed: Alastair Reid, Tom Law and Sebastian Moss. And, as always, thank you for watching. Don't forget, you can see the program again at any time just by going to the website, aljazeera.com, and, to further discuss, join us on our Facebook page, facebook.com/insidestory. You can also join the conversation on Twitter; our handle is @AJInsideStory. From me, Adrian Finighan, and the whole team here in Doha, thanks for
8:58 pm
watching. I'll see you again; bye for now. [Channel promos follow.] "... we were in hurricane winds for almost thirty-six hours; these are the things that New Yorkers have to address ... if you join us on set ... but we struck up a relationship. This is a dialogue: tweet us with the hashtag AJ Stream, and one of your pitches might make it to air." Join the global
8:59 pm
conversation, at this time, on Al Jazeera. "The street is quiet; the signal is given that it's safe to walk to school. Last year there were more than thirty murders in this community in one month. The police say this area is a red zone, one of several in some townships in Cape Town. Children are sometimes caught in the crossfire when rival gangs fight, so parents and grandparents have started what they call a walking bus, to try to tackle the violence. 'I lost my family ... years ago I also lost my ...' There are more than one hundred and fifty volunteers working on several walking buses. Teachers say it is working; class attendance has improved. The volunteers also act as security guards." "We are witnessing, around the world, this hungry money, which is only looking at how to make the next profit: devastating economies, devastating ecosystems, putting a price on the protection of nature. Green economy sounds good, but is it all
9:00 pm
about privatization of nature? Should our environment be for sale? 'What we're trying to do is encourage people to stabilize the country; we're giving them a financial incentive to do that.' Pricing the Planet, at this time, on Al Jazeera." Witness: documentaries that open your eyes, at this time, on Al Jazeera. This is Al Jazeera. Hello, I'm Maryam Nemazee. This is the News Hour, live from London. Coming up: Turkey arrests a second U.S. consulate official, as both countries suspend visa services.
