
tv   Inside Story 2019 Ep 234  Al Jazeera  August 23, 2019 2:32pm-3:01pm +03

2:32 pm
algorithms that match people to resources. And the reason I think of them as a digital poorhouse is that the decision we made in the 1820s to build actual poorhouses was a decision that public service systems should first and foremost be moral thermometers: that they should act to decide who is most deserving of receiving their basic human rights. Virginia Eubanks's study of the automation of public services in the United States points to developments in the late sixties and seventies. Along with the civil rights movement came a push for welfare rights. People are forced to live in the most inhuman situations because of poverty. African-Americans and unmarried women, who were previously barred from receiving public funds, could now demand state support when they needed it. While technology was touted as a way to distribute financial aid more efficiently, it almost immediately began to serve as
2:33 pm
a tool to limit the number of people getting support. So you have this moment in history where there's a recession and a backlash against social spending, and a social movement that's winning successes against discriminatory treatment, and there really is no way to close the rolls. They can't close the rolls the way they had in the past, which is just to discriminate against people. And that's the moment we see these tools start to be integrated into public assistance. I think it's really important to understand that history. Too often we think of these systems as just simple administrative upgrades, sort of natural and inevitable, but in fact these are systems that make really important, consequential political decisions for us, and they were from the beginning supposed to solve political problems, among them the power and the solidarity of poor and working people. In the early 1970s, close to
2:34 pm
50 percent of those living below the poverty line in the United States received some form of cash welfare from the government; today it's less than 10 percent. In public assistance, the assumption of many folks who have not had direct experience with these systems is that they're set up to help you succeed. They are not, in fact, set up to help you succeed. They're very complicated systems that are very diversionary, that are needlessly complex, and that are incredibly stigmatizing and emotionally very difficult. So it shouldn't then surprise us that a tool that makes that system faster, more efficient and more cost-effective furthers that purpose of diverting people from the resources that they need. Having algorithms make decisions such as who gets financial aid, or who has to pay money back to the government, has caused concern among many different groups. But what's causing a full-on panic for some is the fact that algorithms are being used to actually
2:35 pm
make predictions about people. One of the most controversial examples is the Correctional Offender Management Profiling for Alternative Sanctions. It's a bit of a mouthful, but for short it's COMPAS, and it's an algorithm that's been used in courtrooms across the country to assist judges during sentencing. Now, of course, algorithms can't weigh up arguments, analyze evidence or assess remorse, but what they are being used for is to produce something known as a risk assessment score, to predict the likelihood of a defendant committing another crime in the future. This score is then used by judges to help them determine who should be released and who should be detained pending trial. Now, the judge has to consider a couple of factors here: there's public safety and flight risk on the one hand, but then there are the real costs, social and financial, of detention on the defendant and on their family on the other. Now, historically, what happens is a judge looks into this defendant's eyes and tries to say: OK, you're
2:36 pm
a high-risk person, or you're a low-risk person; I trust you, or I don't trust you. Now, what algorithms are helping us to do is make those decisions better. The COMPAS algorithm was brought in to offset and balance out inconsistencies in human judgment, the assumption being, of course, that a piece of code would always be less biased and less prone to prejudice. However, COMPAS has faced several criticisms, primarily accusations of racial bias, inaccuracy and lack of transparency. In 2016, a man named Eric Loomis, sentenced to six years in prison, took his case to the Wisconsin State Supreme Court. His allegation was that the use of COMPAS violated his right to due process: it made it impossible for him to appeal his sentence, since the algorithm is a black box, impenetrable, unquestionable. Eric Loomis didn't get very far. The Supreme Court ruled the use of COMPAS in his sentencing was legal. The verdict arguably
2:36 pm
revealed the ways in which the ever-increasing use of algorithms is being normalized. The court had a funny argument, saying that nobody knows where these decisions are coming from, and so it's OK: it's not that the state has a particular advantage over the defendant, but that everyone is on this sort of equal playing field, and there's no informational advantage for one side or the other. To me, I find that somewhat dissatisfying. I do think that in these high-stakes decisions, particularly in the criminal justice system, we don't just want an equal playing field where no one knows; I think we need an equal playing field where everybody knows, because we need to have this transparency built into the system. For the record, Equivant, the company that sells COMPAS, has defended its algorithm. It points to research it commissioned showing that the company meets industry standards for transparency and accuracy. Whether COMPAS or most of the privately developed algorithms meet acceptable standards for transparency is another question.
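The mechanics of a risk assessment score like the one described above can be sketched with a toy statistical model. To be clear, the weights and inputs below are invented for illustration; the real COMPAS model is proprietary, and its internals are exactly the black box at issue in the Loomis case.

```python
# Toy illustration of a recidivism "risk score" (NOT the real COMPAS model,
# which is proprietary): a hand-weighted logistic model over a few inputs.
import math

def risk_score(prior_arrests: int, age: int, failed_to_appear: int) -> int:
    """Return a risk decile from 1 (low) to 10 (high)."""
    # Hypothetical weights; a real tool would fit these to historical data.
    z = 0.35 * prior_arrests + 0.40 * failed_to_appear - 0.05 * (age - 18) - 1.0
    p = 1.0 / (1.0 + math.exp(-z))           # estimated probability of re-arrest
    return min(10, max(1, 1 + int(p * 10)))  # bucket into deciles 1..10

print(risk_score(prior_arrests=0, age=45, failed_to_appear=0))  # low decile
print(risk_score(prior_arrests=6, age=19, failed_to_appear=2))  # high decile
```

The point of the sketch is that the output handed to a judge is just a number derived from weighted inputs; which inputs and weights were chosen, and what data they were fitted on, is where the transparency question lives.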
2:38 pm
Even when they are used in the provision of public services, algorithms are often closed to the public; they cannot be scrutinized. Regardless of that, Sharad Goel says that in certain cases he would still be comfortable being judged by a robust algorithm. So I do think it's true that many of the people in the criminal justice system are the most disadvantaged, and the reality is they probably don't have a lot of say in their futures, in their fates, and in how these algorithms are going to evaluate them. Whether this would happen if more powerful people were being judged by these algorithms, I don't know. Now, me personally, I would rather be judged by a well-designed algorithm than a human, in part because I believe that statistical methods for assessing risk are, in fact, better than humans in many situations, and they can at least, when
2:39 pm
well designed, eliminate a lot of these biases that human decision-makers often exhibit. The United States has a massive racial discrimination problem in public services; that's real. So it is really understandable when agencies want to create tools that can help them keep an eye on front-line decision-making, in order to identify discriminatory decision-making and correct it. The problem is that that's not actually the point at which discrimination is entering the system, and this is one of my huge concerns about these kinds of systems: they tend to only understand discrimination as something that is the result of an individual who is making identifiable, actionable decisions, and they are not as good at identifying bias that is systemic and structural. Now, the promise of algorithms is that we can mitigate the biases that human decision-makers always have. You know, we
2:40 pm
are always responding to the way somebody looks, the way somebody acts, and even if we try as hard as we can, and even if we really have these good intentions to try to just focus on what matters, I think it's exceptionally difficult. Now, again, that is the promise of algorithms; the reality is much more complicated. The reality is that algorithms are trained on past human decisions, and they're built by fallible humans themselves, and so there's still this possibility that biases creep into the development and application of these algorithms. But certainly the promise is that we can at least make the situation better than it currently is. One of the things I'm really concerned about with these systems is that they seem to be part of a philosophy that increasingly sees human decision-making as a black box, as unknowable, and computer decision-making as transparent and accountable. And that to me is really frightening, because of course computer
2:41 pm
decision-making is not as objective and is not as unbiased as it seems at first glance. We build bias into our technologies just like we build it into our children: we teach our technologies to discriminate. But on the other hand, people's decision-making is actually not that opaque. We can ask people why they're making the decisions they're making; that can be part of their professional development. And I think this idea that human decision-making is somehow unknowable is a sort of ethical abandonment of the possibility to grow and to change that we really, really need as a society to truly address the systemic roots of racism and classism and sexism in our society. So it feels to me like we're saying we'll never understand why people make discriminatory decisions, so let's just let the computer make them, and I think
2:42 pm
that's a mistake. I think that's a tragic mistake that will lead to a lot of suffering for a lot of people. So, going back to the question that started us on this journey: can we trust algorithms? The biggest thing I've learned from speaking with Eubanks and many others is that I hadn't actually got the question right. It isn't really so much about whether algorithms are trustworthy; it's more about the quality of the data that feeds into them, and the intentions of those designing them. Human biases, human imperfections: that's what we see reflected in our algorithms, and without better oversight we risk reinforcing our prejudices and social inequalities. What we have to ask is whether the past is the future that we want as well, and by the past, that's
2:43 pm
often things like the fear of stigma and bias and stereotypes and rejection and discrimination, and really what we need is to create systems that allow for a future scenario that is different from the old. Of course we can build better tools; there are people out there making better tools, and I see them everywhere that I go. But what makes the difference between good tools and just tools is building those tools with a broader set of values from the very beginning: so not just efficiency, not just cost savings, but dignity and self-determination and justice and fairness and accountability and fair process. All of those things that we really care about as a democracy have to be built in at the beginning, from step one, in every single tool.
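The call to build fairness and accountability in "from step one" can be made concrete. One standard audit, popularized by ProPublica's 2016 analysis of COMPAS, compares a tool's false-positive rate across demographic groups. Here is a minimal sketch of that check; the records are made up for illustration:

```python
# Minimal sketch of one accountability check: comparing a risk tool's
# false-positive rate across demographic groups. Data is invented.
from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("A", True,  False), ("A", False, False), ("A", True,  True),
    ("A", True,  False), ("B", False, False), ("B", True,  True),
    ("B", False, False), ("B", False, True),
]

def false_positive_rates(rows):
    fp = defaultdict(int)   # flagged high risk but did not reoffend
    neg = defaultdict(int)  # everyone who did not reoffend
    for group, flagged, reoffended in rows:
        if not reoffended:
            neg[group] += 1
            if flagged:
                fp[group] += 1
    return {g: fp[g] / neg[g] for g in neg}

print(false_positive_rates(records))
```

If the rates differ sharply between groups, the tool is flagging harmless people in one group more often than in another, which is systemic bias of exactly the kind an individual-decision audit would miss.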
2:44 pm
We're actually getting our hands on the data; we're analyzing the data. Now, one thing that we've done is we've tried to make as much of the data available as possible, to encourage people to look at it. One of our projects is called the Stanford Open Policing Project: we release lots of data on the criminal justice system, we release code for people to play with the data, and I encourage everyone to look at that and try to understand what's going on. You know, maybe they'll discover a pattern that we didn't discover ourselves. My biggest piece of advice is to never underestimate your influence. You may be fighting some machine, some computer system that you've never been able to meet, let's say, that has inflicted huge, enormous suffering; but your words can make governments scared, and your voices combined can make systems sit up and pay attention. Together we can shape the way these tools are created and the ways that they
2:45 pm
impact us as a political community. If we want better outcomes from these systems, we have to claim our space as decision-makers at these tables. And we can't do that if we think that these technologies are somehow gods. They're built: just the way we raise our kids, we build these technologies, and we have a right to be in dialogue with them. Think of some of the biggest companies in the world today: all of them run on algorithms and the data that we produce. We're in the midst of a great race for data, and big tech companies are capitalizing on a wealth of information.
2:46 pm
In this five-part series, Ali Rae examines the power of big tech.
2:47 pm
Millions of people across India miss out on medical care, but a hospital train is delivering doctors to those most in need. 101 East, on Al Jazeera. Building a new life on an isolated beach, living off the sea: a dream shared by so many, but so few make it a reality. A family business led by a remarkable woman with a flair for cooking, coming up
2:48 pm
on Al Jazeera. We understand the differences and the similarities of cultures across the world, so no matter where you call home, Al Jazeera International brings you the news and current affairs that matter to you. Al Jazeera. This is Al Jazeera, with a check on your world headlines. Japan has described South Korea's decision to withdraw from an intelligence-sharing agreement as regrettable. The two countries are locked in a bitter row over trade and Japan's wartime atrocities in the last century. Rob McBride
2:49 pm
has more from Seoul. This deepening row is having an increasing impact on trade and businesses here in South Korea. Already underway is the so-called "no" campaign, a campaign to boycott anything Japanese: cancel your holidays to Japan, and so on. And starting today, Friday, we have the "yes" campaign; this is in support of South Korean businesses that are increasingly being affected by this trade dispute with Japan, telling people to basically buy Korean. Obviously this dispute has now been worsened by the cancellation of this intelligence-sharing pact, but South Korea says it had no option, blaming Japan for the breakdown and saying there has been a grave change in security cooperation. Japan has said that it is increasingly unacceptable for South Korea to link security with trade, but then South Koreans
2:50 pm
would say it is Japan that first linked trade with this, after a controversial court decision finding in favour of the victims of forced labour from the Second World War, opening up all the old animosities about Japan's record during the Second World War. Decades on, watching anxiously from the sidelines is the United States, calling for calm between two of its most important regional allies. And choosing this moment to up the pressure has been Ri Yong-ho, North Korea's foreign minister, saying that the continued sanctions by the United States are a miscalculation and that North Korea is prepared for either dialogue or confrontation. The Amazon rainforest in Brazil is on fire, burning at a record rate, and pressure is growing on the country's president, Jair Bolsonaro, to tackle it. Tribes have joined the international chorus to save what they consider to be sacred land from the devastating fires. They say they will fight
2:51 pm
until their last drop of blood to protect their home. Meanwhile, the Brazilian president continues to blame environmental groups for the fires. Now, the Amazon is bigger than Europe. How can you fight criminal fires in such an area? It is clearly criminal. How can you do it? You need to catch them in the act; otherwise there is nothing you can do. Now non-governmental organisations are losing money, money that came from Germany and Norway. They are unemployed now, so they are trying to get back at us. UN investigators say the scale of sexual violence against the Rohingya shows genocidal intent. In a report on sexual and gender-based violence, they found that Myanmar's military routinely used rape as a weapon against ethnic minorities. The UN says those responsible should face prosecution. Russia and the US accused each other of risking a new arms race at the UN Security Council. The meeting was requested by Russia to
2:52 pm
discuss Washington's testing of a land-based, nuclear-capable missile. The US Secretary of State says American officials are working on securing the release of two Canadians held in China. He made the comments after meeting Canadian Prime Minister Justin Trudeau. The two Canadian men were detained in China eight months ago on spying charges; their arrest was seen as retaliation against Canada's decision to detain an executive from Chinese telecom giant Huawei on a US warrant. She is not being used as a bargaining chip; it is a legal process by the United States Department of Justice, designed to bring someone who we believe we have sufficient information on back to the United States under the agreement between the two countries. It's very straightforward. Another round of talks between the US and the Taliban has begun in Qatar. The sides are discussing the US military presence in Afghanistan; negotiations have been ongoing since October
2:53 pm
in a bid to end the 18-year-long conflict. The US envoy is due to visit Kabul next week to meet the Afghan government. Those are the headlines; Al Jazeera World is coming up next. Do stay with us.
2:54 pm
[The remainder of the recording is an Arabic-language Al Jazeera World documentary; the automated English captions for this segment are unintelligible and have been omitted.]

