
Inside Story 2017 Ep 281 | Al Jazeera | October 10, 2017 10:32am-11:01am AST

10:32 am
and Romanian troops will be supported by personnel from nine other NATO countries. Moscow has accused NATO of trying to encircle it and threaten stability in eastern Europe, which NATO denies: "We are not mirroring what Russia does. We're not mirroring Russia play by play, shoulder by shoulder, exercise by exercise. We are responding in our own way, and we do it in a proportionate, measured and defensive way. And we have to remember that of course you also always need to have follow-on forces; we have more forces than the NATO Response Force. But the whole idea of NATO is to be so strong, so united, and so clear and coherent in messaging that no potential adversary tries to attack NATO." And polls are due to open in Liberia, where in less than half an hour voters will begin electing a new president after twelve years under Africa's first elected female president,
10:33 am
Ellen Johnson Sirleaf. Twenty candidates are in the running to replace her. Those are the headlines. Coming up next on Al Jazeera, it's Inside Story, so stay with us.

The news has never been more available, but the messaging is simplistic and misinformation is rife. The Listening Post provides a critical counterpoint, challenging mainstream media narratives, at this time on Al Jazeera.

Spotting and stopping fake news: Facebook says it's developed a tool to give users more context on what they read, but admits it's not always easy to separate fiction from fact. So how do we deal with this modern-day reality, and where does it leave social media? This is Inside Story.
10:34 am
Hello and welcome to the program. In Doha, I'm Adrian Finnegan. Fake news: there's no doubt it's become one of the key terms of 2017, brought into the spotlight after the US presidential election last November, when it became clear that thousands of people were unintentionally sharing articles that were full of misinformation. The resulting paranoia led people to accuse even once-trusted sources, including the New York Times, of carrying fake news, and social media is a core part of the problem. Facebook has finally decided to do something about it. It's testing a new feature called the "i" button: when clicked, it gives the reader more information about the article they're reading. The idea is to give people context about their news sources, so they can decide if articles are from publishers they trust and if the stories themselves are credible. But Facebook has come under attack for not doing this earlier. Critics say that the company has a responsibility to protect its users from fake news, but Facebook's chief security
10:35 am
officer says it's not so simple. Alex Stamos tweeted on Saturday that lots of journalists have celebrated academics who've made wild claims about how easy it is to spot fake news and propaganda, without considering the downside of training machine-learning systems to classify something as fake based upon ideologically biased training data; an understanding of the risks of machine learning, he said, drives small-c conservatism in solving some issues. But Facebook isn't the only tech company struggling to contain fake news. YouTube has changed its search algorithm to promote videos from mainstream news outlets instead of accounts that may serve up conspiracy theories, but it hasn't said how it deems an account credible. And YouTube's owner, Google, has started to identify and label fact-checked articles in the US and some European countries. It's also allowed people to report misleading content that shows up in its autocomplete search function or in the featured snippet that pops
10:36 am
up above all of the other search results. Twitter has also been criticized for allowing the spread of misinformation, but it's taken a very different approach to the issue. Twitter's vice president of public policy wrote in a blog post in June that "we cannot distinguish whether every single tweet from every person is truthful or not. We, as a company, should not be the arbiter of truth."

So let's bring in our guests for today. They're all in London: Alastair Reid is a social media journalist at the Press Association who specializes in the spread of misinformation online; Sebastian Moss is technology reporter for Data Center Dynamics; and Tom Law is the director of campaigns and communications at the Ethical Journalism Network. Gentlemen, welcome to you all. Alastair, let's start with you. What do you make of this Facebook "i" button initiative? Is it going to be enough to stop the dissemination of fake, malicious news on the platform?

Well, it is an excellent start,
10:37 am
and having a button around stories which provides extra context is something that for a long time has been missing from Facebook. People see stories in their feeds and they're not necessarily sure where they came from, not necessarily sure who wrote them. So it is an excellent start, but in my view there is the problem that if people want to believe something, then no matter what amount of context they're given, they're going to believe it, and maybe dismiss the context that comes with it.

So has Facebook grown too big too quickly to be able to keep control of the material that's published on it?

That is a part of it as well. It's not just a one-dimensional issue; there are many factors in play which contribute to the spread of misinformation, and false news, and disinformation, whatever we want to call it. The nature of the platforms themselves, how they work and how we interact with people on there, that's a big part of it. The oversight that goes into these platforms and how content spreads around them is part of it. And there are also the actions of people outside of the platforms: of media organizations, of politicians, of individuals. All of
10:38 am
these are factors in how this spreads, but it is good, as we've been saying, that this new button has been added to address it.

Sebastian Moss, is it possible to put the fake-news genie back in the bottle now that it's out?

I mean, it's never fully possible. You just need to look at journalists and how much they struggle to work out what's true and what's not; it's a daily battle. For Facebook, obviously, people are going to share what they want to share, and you can do things like the "i" initiative to raise some awareness, but at the end of the day, if people want to share something, they're going to share it.

So what do you think Facebook should be doing, then, to make sure that a lot of this misinformation, this false, fake news, doesn't get out there?

I think the "i" is a good start, but it's a very small button and people might not click on it. At the end of the day, you need a lot of people. They made 3.9 billion dollars in profit last quarter, so they can afford a few people to be working at least on the biggest news stories being shared, looking at them and seeing if they're false or not, and then flagging that with
10:39 am
maybe a bigger, more visible icon.

Alastair, what about Facebook having perhaps grown too big too quickly? Given the number of people that use it, what is it, something like a quarter of the world's population, surely you can never have enough humans, enough journalists, looking at content to sift out the bad stuff, can you, with the amount of material that's published on the platform?

Facebook is obviously the biggest platform in the world when it comes to information sharing. It has more than two billion monthly active users, and that is a huge proportion of the world's population. As Sebastian said, hopefully Facebook plans to employ some more people to be able to oversee this and oversee how information gets shared. But, as you also mentioned, there is the problem that people are now just believing what they want to believe. As human beings, the way our brains work, we're more biased to believe something if it confirms how we understand the world to be, or how we want the world to be, and those
10:40 am
kinds of stories that fit with our worldview bypass our critical-thinking faculties. So we're often going to believe something no matter what fact-checkers or journalists say. That sounds like a very pessimistic view of all this, but what we need to do, in my view, is, yes, have the context, absolutely, but also raise the importance and the value that people place on the truth, on providing proof for what they're saying, and on the negative aspects of lying. Online, especially on social networks, a lot of those kinds of social factors have disappeared. If I were to come on here and just tell many lies, I hopefully wouldn't be invited back. But online, on these social networks, people get a thumbs up or a star as positive reinforcement for spreading lies that fit with people's worldview, and that also works as a verification marker. So it is a very multifaceted problem with how those networks allow this kind of false information to spread, which journalists and real people are going to be able to help with, to
10:41 am
a degree, but not in totality.

OK, so we'll look at some of the ways in which perhaps that aspect of this problem can be solved in a bit. Sebastian, do you agree with that, that this is perhaps more of a social problem than a technology problem?

Yeah. At the end of the day, you are responsible for what you decide to read, what you decide to share, what you decide to like, and there are some fake news stories that went viral which seem quite obviously fake. So we do need to work on ourselves, and work out how to control what we ingest. At the same time, Facebook does have a duty as that middleman, as a company that is increasingly trying to get involved in the media sphere, trying to increase the number of publishers and websites that go on it. If it wants to be part of the media industry, it needs to understand that it also needs to take part in that struggle to work out what is true and what's not.

Tom Law, from the Ethical Journalism Network, should Facebook and other social media platforms and search engines be forced to give priority to information that
10:42 am
is in the public good?

Well, I think it would be much better if they took that responsibility upon themselves. What we've seen recently in Germany are laws being created to fine Facebook, Google and others huge amounts of money for hate speech and fake news online, and that will probably result in both of those companies removing content hastily and actually having a chilling effect on freedom of speech. So self-regulation, on an industry level and also at the individual level for those companies, is what we need. And so far we're seeing that, without public pressure and without scandals such as the misinformation broadcast after the recent Las Vegas shooting, we haven't found a way to address this. Now, the argument that this is going to have a censoring effect on freedom of speech is, I think, a bit of a false argument to make. When people go on Facebook to find out news about
10:43 am
a terrorist attack or a breaking news story, they want to find accurate information, and the fact that it took so long for misinformation to be removed from the algorithm is very problematic. But what we need to know is: how is it removed from the algorithm? What choices were made, and under what guidelines? We can't labor under the false idea that algorithms were not made by people, with guidelines as to what they should show. So if Google, Facebook, Twitter and others are going to make these adjustments, they have to be more transparent about how they do it. And if they're going to hire human editors to make choices, we have to know the guidelines under which they operate, so that they can be held accountable by their audiences.

So, Facebook cites business reasons to explain why
10:44 am
it doesn't talk about how its algorithms work. But what you're saying is that Facebook's apparent lack of respect for transparency and ethical standards is eroding trust not just in it, but in other social media platforms too?

Absolutely. Some of Facebook's guidelines were leaked to the Guardian a few months ago, and they published some great reporting called the Facebook Files, with examples of what Facebook allows and doesn't allow, and some of it, looking at it, just doesn't make any sense. Often, in one context, publishing a certain image of an extremist group or a terrorist or a terrorism symbol would be warranted; at other times it wouldn't be. But those choices should be made, as we make them in journalism, on a case-by-case basis, depending on ideas such as what's in the public interest. And what you have is Facebook employing thousands of people, not directly as Facebook employees but through other companies, who are under
10:45 am
extreme time pressures to go through and sift through content and hit delete, or raise it to one of their superiors, based on very, very basic criteria. Now, first of all, these guidelines should have been published, so that journalists like us and others, and civil society groups, can have a real debate around what content is acceptable and what content isn't, and in what context. Otherwise, eventually, other social media platforms will have their guidelines leaked too, and get the exposure that way. I would say it's much better, from a PR point of view, for these companies to get ahead of the game and have much better engagement and a conversation with the journalist community and the media community more widely. The longer that they hide behind this idea of "we can't talk about this because it's intellectual property" or "you wouldn't understand it because it's coding", and these
10:46 am
kinds of arguments, I don't think it benefits them. I don't think it benefits consumers. And I don't think it benefits the people who are their audiences, who don't want to have clearly false, misleading information put in front of them when they're scrolling through their news feed.

OK, you raised a couple of issues there. Alastair, should there be public oversight of the technology that these companies are using to fill our news feeds and search results? Should algorithms be regulated? This is the big secret that they won't talk about, for business reasons. Is it even possible to regulate an algorithm?

Well, first you need to understand what the algorithm is, and because they've been keeping that secret for so long, it's very difficult to understand when and how it might be regulated. I do agree with Tom that we need that kind of very public oversight: not necessarily being directly involved in what the algorithm does, or deciding what actions it can or cannot take, but in
10:47 am
understanding how it works, understanding the moderation policies, and having a public conversation around that, which allows different voices to be heard. Because what we see, not just on Facebook but on all kinds of these online platforms, is that certain pieces of content are being removed, like, for example, breastfeeding videos, while hate speech is allowed to stay up, because of the way the terms of the moderation and the content guidelines have been drawn up: certain things seem to take priority over content which isn't necessarily harmful in a public way but which may offend their guidelines. So it's about having an actual conversation, as Tom says, about what is acceptable and what isn't acceptable, because there are always going to be people who will speak out against censorship, no matter what that censorship is. But having a public conversation about it is how we have built our laws over many,
10:48 am
many years, decades, centuries. But online we haven't actually been able to apply that same thought process, that same public process, to understanding what is acceptable and what is not, and how it should be dealt with.

Sebastian, you're nodding at the things that we've discussed in the last few minutes. Let's just pick up on one thing Tom was saying a moment ago. Are the assurances from companies such as Google and Facebook that they're cracking down on fake news only a result of the bad publicity that's been generated since the presidential election last November?

I mean, you know, almost definitely. Right after the election, you had Facebook's CEO say it was "pretty crazy" to think that fake news on Facebook had any impact on the election, and now they've kind of walked back that statement, but only because of the mounting media attention and the mounting political attention. Otherwise, they would want to just sweep this under the carpet. They don't want to have to deal with this. It's
10:49 am
complicated, it's difficult, it's expensive. There's no easy way out, no easy way to do this. They just want to increase profits; they want to increase the number of articles people share. It's preferable to them that people share content they like, which puts them into narrower and narrower groups, because then they can be targeted with ads. They don't want to have to deal with this, and they're only dealing with it because of media attention, again and again and again.

So is that suggestive of why companies such as Google and Facebook don't appear to act quickly when journalists, or even just other users, flag articles as being fake or malicious?

I think it's a mixture: they don't want to deal with it, they also don't know how to deal with it, and they're worried that they'll deal with it wrong and then spark regulation. That's the thing they're scared of the most. If regulation comes along and destroys their business models, that's the one thing that can actually stop them. So they're very scared, and they're approaching this very carefully.

Tom Law, two-thirds of American adults get their news from social media. Companies like
10:50 am
Google and Facebook dominate public discourse. Should they be designated and regulated as publishers, and made to abide by the laws that traditional media companies, companies like this one, for instance, and other publishers have to follow?

Well, they're not traditional publishers in the way that news organizations like Al Jazeera and others are, but I think you can't completely ignore them in that sense. I think Facebook, Google and others would prefer to find a way of doing this themselves, but at the moment they're not meeting that challenge. The fact that they do remove some misinformation, or some deliberate content that's masquerading as news, is fine, but what is the process by which that's removed? What are the guidelines they choose to use to do that? Without publishing those,
10:51 am
audiences can't hold them to account as they would with another news organization. You can look at a news organization's code of ethics; you might have something like an ombudsman that you can complain to. Having a much better process around that is what's needed. And we have to come to terms with the fact that, yes, people are gathering their news from social networks more and more, but at the same time we have to realize that the duopoly of Google and Facebook has completely undercut the financial model by which journalism used to fund itself, which used to nourish good reporting. And, you know, going back to the Las Vegas incident, neither the New York Times nor the Washington Post found their First Amendment rights, or their wish for free speech, in any way inhibited by the values of journalism, by making sure they were accurate, they were independent, they were impartial, and
10:52 am
that they were correct in what they reported. So I think there's a lot that these tech companies could learn from journalism and from journalists, if they were willing to listen more and not hide behind these arguments around intellectual property and freedom of speech online.

Tom, who should be accountable for the lies published on social media? What penalty should be paid, and by whom? I mean, you can't hold an algorithm or a piece of code to account, can you?

No, you can't, but often these algorithms are gamed by people who know exactly how to make their results appear at the top of Google, or to feature in Facebook's news feed. You can do that by paying for adverts, as has been much in the news recently, about how they
10:53 am
were used during the US election. So the people that publish that content in the first place must be highlighted, and it should remain very clear whether they are purveyors of misinformation or more reliable outlets. Now, how exactly that's highlighted on Facebook and on Google, I know they are working on. In terms of repercussions, I think we need to think very carefully. We don't want to have a chilling effect on freedom of speech and people's ability to communicate online, but I think it's very important that media consumers understand in far greater detail what they're consuming and what they can trust online, and that comes down to efforts at news literacy, which I think is essentially important. We also have to ask ourselves, as individuals, what are we doing to make sure that we're not sharing false information, and that just because we
10:54 am
agree with something, we don't just click like and click share. So it's a mixture of regulation at the individual level, at the industry level, and within the companies themselves that we need to find answers to.

Alastair, should society as a whole take more responsibility for people's gullibility when it comes to news? Should information literacy be taught in schools, for example, to ensure that people are more discerning and more critical of what they see and read online?

Absolutely, one hundred percent. That is something which should be taught in schools from a very early age: how to dissect a piece of information or a piece of text and understand whether it's been sourced at all. When I was at school, in history lessons you would look at primary sources and secondary sources and things like that, to see who wrote something and what influences there might have been on the person when they wrote it. And if history has taught us anything, it's that skilled liars, when given a platform, can be very dangerous. When people aren't bringing those intrinsic
10:55 am
values of holding truth up, and holding people to account for their words, that has a very dangerous effect on society. So in terms of individuals taking responsibility, teachers teaching this in schools, politicians, or whoever it may be, bringing that kind of education into society as a whole can only be a positive thing, in my view.

Are you saying that, basically, you have to change the behavior of the audience in order to change the behavior of anyone who's producing the fake news?

Well, I mean, if audiences are more discerning in what they view, then the false news won't get shared as much; it will wither on the vine. If people are more discerning about the information they consume, and can spot mistruths more readily, and there are the kinds of extra social repercussions for certain false news which happen in the real world but don't necessarily happen online, then
10:56 am
that misinformation won't travel as much. So yes, there should be repercussions on the people who share it, not necessarily, you know, legal or legislative repercussions, because people can share misinformation accidentally, but it is on people and on individuals to be responsible for what they share. And as I said before, that is very difficult online. When it comes to things like Facebook and Twitter and Google and everything else, there are positive reinforcements around sharing things which might not be true but which chime with people's views. Again, those act as a verification marker: if there are five hundred little heart notes next to something which is a blatant lie, people might believe it, because it's been viewed and heard so often. We have to bring some of those social repercussions that happen in the real world back to the online world, and we can only do that together.

OK, Sebastian, very briefly, because we're almost out of time: do you agree with that?

I actually believe in people's right to share things, even things that are stupid, to be honest. I think if
10:57 am
someone wants to share something that's not entirely true, they might share it in the belief that it is true. We shouldn't always expect everyone to have a full understanding of all the rights and wrongs of publishing and news. We can hopefully try to teach them, but at the end of the day, Facebook, in my view, should be paying for a third-party independent body that looks at this, looks at that algorithm, and works out how to at least try to tackle some of the biggest offenders, rather than taking it out on the people that are sharing the stories.

All right, really good to talk to you all. We'll have to leave it there. Thank you very much indeed, Alastair Reid, Tom Law and Sebastian Moss. And, as always, thank you for watching. Don't forget, you can see the program again at any time just by going to the website, aljazeera.com, and for further discussion join us on our Facebook page, facebook.com/AJInsideStory. You can also join the conversation on Twitter, @AJInsideStory. From me, Adrian Finnegan, and the whole team here in Doha, thanks for
10:58 am
watching. I'll see you again; bye for now.

In the heart of the Amazon, a Bolivian family has put their lives in peril to harvest Brazil nuts. Crossing the Congo to the capital is an even more dangerous challenge. Risking It All, at this time on Al Jazeera.

US and British companies have announced the biggest discovery of natural gas in West Africa, but what to do with these untapped natural resources is already a source of heated debate. Nothing much has changed: they still spend most of their days searching for water in dry river beds like this one. Five years on, the
10:59 am
Syrians still feel battered, and even those who managed to escape their country haven't truly been able to escape the war.

Heat waves threaten the rainforests of the sea: "If we continue on our current way, we won't have reefs within twenty, thirty, forty years from now." "So you're essentially trying to recreate the ecosystem, but under controlled conditions?" "The main goal is so that the coral reef still has a pulse, resilience to climate change. The Great Barrier Reef is still sizeable, but we've got to start now, and we need to get everyone behind the solution." Techknow, at this time on Al Jazeera.

"What are you doing here?" "They suspected a terrorist attack." People of all faiths fell victim to a suicide bomber in Manchester, but if the bomb was indiscriminate, was the placing of blame? "This is nothing to do with us. This is about an individual who is a psycho. You
11:00 am
know, nobody could do this unless they were completely unhinged." How Manchester's Muslims responded to challenging questions in the aftermath of a deadly attack. People & Power: Manchester United, at this time on Al Jazeera.

Al Jazeera: where every story matters. Reports that ten thousand Rohingya Muslims crossed the border into Bangladesh in a single day; we report from Cox's Bazar.
11:01 am
I'm Folly Bah Thibault.


