tv Bloomberg Technology Bloomberg March 15, 2019 11:00pm-12:00am EDT
♪ >> i am emily chang in san francisco and this is "bloomberg technology." mass shootings at two mosques in new zealand have left 49 people dead. the attacker live streamed the massacre on facebook. how can the biggest internet players stop the spread of violence? u.s. state officials are in the early stages of a probe into google focused on antitrust and privacy, the biggest effort to take on big tech since the 1990's.
and a win for qualcomm, but first the story that has gripped us. a mass shooting has left 49 people dead. the terror was live streamed by the suspect on facebook for 17 full minutes before facebook took it down. it still went viral on facebook and twitter, leaving tech platforms scrambling to keep up with their users after the massacre. it is not the first time that violent incidents have been live streamed. in august, a shooting at an e-sports tournament in florida was live streamed. but in this particular case, it was not the moderators that flagged the stream but police.
facebook made the statement, police alerted us to a video after the live stream commenced and we quickly removed the suspect's video. youtube stated, our hearts go out to the victims of this terrible tragedy. shocking, violent, and graphic content has no place on our platforms. we are working cooperatively with the authorities. joining me in seattle is david kris, founder of a consulting firm. and ambassador mark wallace. here with me in the studio, tom giles. ambassador wallace, i want to start with you. the shooter knew how this would play out online before this
happened, and the new york times put it this way: this was a mass murder for the internet. how much more terrifying does that make the situation? ambassador wallace: the social media companies knew how this would play out. this is not a new problem. for years, the social media companies have been aware of the proliferation, the weaponization of their platforms and the fact that people can recruit, propagandize, call to action, and there have been various right wing extremists, islamic extremists that have propagated online. what is shocking is that it took them 17 minutes to take it down, and technology exists, they claim, to remove any uploads of this instantaneously.
so it should be that it prevents any of the proliferation of this material going up online. i think there are some really hard questions that have to be posed to social media companies. they have a real responsibility. i am not accusing them of perpetrating these acts, but these platforms have been weaponized, they have known about them being weaponized, and they are not doing enough to prevent the proliferation. >> the technology did not work, and the human moderators did not catch it. 17 full minutes. why not? >> without getting too much into the technicalities of it, a lot of these failsafes they put in place are meant for videos that have already been recorded and posted. it is less effective when it is live video.
the idea is they want -- there have been so many ways that social media has enabled citizen reporters to be out there with their cameras, capturing really important video, and you think about the role that live video has played in helping us think differently about violence against young black men, for instance. it is just one way where there has been some kind of a redemptive social good here. what happens is, it is harder for the social media companies to vet that video as it is being streamed live. part of what has happened is you can actually repost live video to make it look like it is live. you are reposting captured video, but it makes it look like it is live. in a sense, all of these failsafes do not work in these instances. social media has more work to do. emily: david, you are working with technology companies on these issues, but when it comes to these human moderators and it
has been documented how difficult this job is -- there is a new report out that some of these moderators can press snooze -- despite the redemptive social good that tom talked about, should they be allowing live streaming at all? david: the internet is a force for good in a lot of ways, and it carries with it a lot of risks, including with respect to live video. this is a new challenge for companies and society as a whole. i think we are going to have to use technology here to fight technology, and as new techniques emerge, as opportunities emerge to misuse, we are going to have to come up with technical solutions. the volume, velocity, and veracity of digital information out there is very taxing on the
human moderators. these are not easy problems to solve, but they are vital problems. we are going to need technology, but we also cannot subordinate ourselves to our machine overlords either. we are going to have to have smart ethics to police us. emily: the reposting of these videos for hours after this happened is almost as disturbing as the initial video. at youtube, we are getting new details of how they have handled this. they have removed thousands of videos related to this incident. think about all of those users out there who are perpetrating this. is this something at this point that these companies can solve? >> let me respectfully disagree with tom and david. the technology exists here and now. the san bernardino terror
attacks were the first time that social media companies acknowledged that they had begun monitoring. since that time, they say they have employed thousands of moderators to find this kind of live content. they knew about the exploitation of live platforms; the risks were there. but because it is a moneymaker, they pushed it. let's be clear -- there is no excuse for any video or photograph of this incident to be repeatedly uploaded and to be viewed by anyone. the technology, whose genesis is photodna, which came out of the child exploitation and abuse context, can extract dna-like signatures from video and photographs. if i take a picture of you and i try to re-upload that, we can immediately and instantaneously have that removed before anyone views it. the companies claim that they
have this, and this is not new technology, and they are not implementing it, or they are not being transparent about the implementation. any statement to the contrary is an excuse. emily: if they have the technology and it is available, why would they not implement the technology? >> remember, the model of social media companies is about our data and content. anything that serves to filter data or minimize data or content, the social media companies have been historically resistant to. anything that tries to limit that is not good for the bottom line. they have been slow to act and they do not want to be transparent. ask the social media companies the number of videos, extremist videos and content they have removed. i guarantee it is in the tens of thousands. we at the counter extremism project have highlighted thousands of pieces of that content and had it removed, but why are they not systematically doing it? the technology exists, and there should be
no excuse for having any of this video up on the internet. it can be removed instantaneously if the platforms are screening this content, and it is time to ask hard questions. this is not advanced technology. rest assured, they know what you want to buy, what your buying habits are, but when it comes to removing content, they do not want to do that because it is not good for business. >> we do not know exactly how many videos facebook has removed. david, do you want to respond? david: i think the problems are harder than that. there is a reference to hashes on videos, but that has been a problem in regards to child exploitation and obscenity. nobody wants that stuff up, and yet, it is not totally suppressed and removed from the internet. i do think we have to ask hard questions.
i do think we have to confront this problem head on, and we all have to do more, including the social media companies, but i think it is not as simple and easy as all of that, and these technologies, including ai and ml, are still developing. we have to understand them and use them wisely and in a proper framework. we do not just want to give ourselves over to our machine overlords either. we have to be smart and thoughtful about this. >> tom, we are being careful not to repeat anything from the manifesto, but there is one thing we cannot ignore, which is a shout out that he gave to youtube's biggest star. the star made a statement saying, i feel absolutely sick at having my name uttered by this person. this is a controversial internet character, somebody who has been accused of spreading hate,
anti-semitism -- >> and graphic images. >> and graphic imagery. what about the bigger problem of the hate, the white supremacy that lives online where there is no bright line to remove it? it is clear when you need to remove a violent video of a shooting, but these companies are saying it is less clear when it is characterized as an issue of free speech. tom: how do you define it? it is very cut and dried in my view when you have a live shooting. this is a terrible, horrible event in christchurch. but there are others where it is much more of a gray area. you know that people would be crying foul if there starts to be this, and they have in fact done so when you have a sense that maybe facebook, google, or other companies were acting as an arbiter of political views,
censoring views -- they are a little bit further right of me, we do not want them taking that content down. or farther left, we do not want them taking that content down. so we do not trust google and facebook to be the arbiters of that. who do we want to come in and do that? the state, the chinese model where the state is the arbiter? so it is a big conundrum. >> what do we want? these platforms have been designed for maximum engagement, for public discourse that prioritizes free speech. you can imagine if, from the beginning, they had been designed for a more curated experience. do we need government intervention, and if so, what does that look like? >> this has nothing to do with artificial intelligence. this is humans deciding we do not want to allow content on the internet and being ready to remove it instantaneously.
and let's just talk about the first amendment. when you have a social media company citing the first amendment to you, it means it is on the defensive. if you go on facebook and want to look at good old-fashioned pornography, you can't do it. facebook limits all sorts of speech through its terms of service. it is not the first amendment that governs it, it is the terms of service. we as a society conclude that there are all sorts of types of speech that we do not agree with. child exploitation, shouting fire in a crowded theater -- whenever the social media companies start mentioning the first amendment, that is the time to be suspicious. their terms of service, they do not want to enforce them, they want to have an out, and when they start getting criticized,
they say first amendment. it is not true. >> there is a point that these companies have been built and designed by humans, and there have been choices made based on certain values by human beings, and the platforms could have been designed differently. it has nothing to do with technology. >> there was a failure to see a lot of the possible outcomes, a lot of the ways that people would use and misuse that. i think it was a failure of imagination on the parts of some of the creators. there was a belief that people would use it for social good. this would connect the world. it would help you stay in touch with your grandparents and your children, and it was very much a failure of imagination. my question comes back to who then do we expect to enforce that. >> we are going to talk about that and the role of possible
government intervention. tom giles, ambassador mark wallace, and david kris -- a debate we could continue to have for hours and will have over the next days and weeks. coming up, apple is fighting back against spotify. why apple says the app store has contributed to the streaming site's success, next. this is bloomberg. ♪
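the "dna signature" matching the panel described -- photodna-style perceptual hashing of images and video frames -- can be sketched in miniature. this is an illustrative toy only, not photodna itself or any platform's real system: the function names (ahash, hamming, is_blocked), the tiny pixel grids standing in for decoded frames, and the distance threshold are all invented for this example.

```python
# illustrative sketch of perceptual-hash matching for re-uploaded media.
# real systems use far more robust signatures; everything here is a toy.

def ahash(pixels):
    """average hash: one bit per pixel, set if the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_blocked(frame, blocklist, max_distance=3):
    """flag a frame whose hash is near any known-bad hash.

    small pixel edits (re-encoding, brightness shifts) change only a few
    bits, so a distance threshold catches near-duplicates, not just
    byte-identical re-uploads.
    """
    h = ahash(frame)
    return any(hamming(h, bad) <= max_distance for bad in blocklist)

# a known-bad frame, a slightly altered re-upload, and unrelated content.
original = [10, 200, 30, 180, 50, 220, 40, 190, 60]
reupload = [12, 198, 30, 182, 50, 219, 41, 190, 58]   # minor pixel noise
unrelated = [200, 10, 190, 20, 180, 30, 170, 40, 160]

blocklist = {ahash(original)}
print(is_blocked(reupload, blocklist))   # near-duplicate -> True
print(is_blocked(unrelated, blocklist))  # different content -> False
```

the point of the sketch is the one made on air: matching a re-upload of known material is a solved-looking problem, while classifying a brand-new live stream with no prior signature is the genuinely hard case.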
joining me now is my guest, but let's talk about what apple is saying now, because it took a couple of days to respond, but we have a lengthy response. >> yeah, from apple, attacking spotify for being anti-artist and invoking some of the other disputes between spotify and the music community. apple and spotify are disputing what the role of the app store is in promoting the different apps. apple is saying that spotify benefited from being on it. spotify argues that it would be doing better if not for this fee that apple charges. there is no question that the iphone and the app store created this economy, but it comes down to whether you think apple's fee is justified. apple thinks it is. >> here are some quotes from apple's statement. the majority of customers use the free version of spotify, which makes no contribution to the app store.
spotify is asking for that number to be zero. developers can rest assured that everyone is playing by the same set of rules. that is how it should be. we want more app businesses to thrive. daniel ek told you that spotify could not quantify the impact on the business, but what customers have told them -- it does not work on my apple tv and apple watch -- has them believing that it is having a big business impact. >> spotify has quantified it, but they have just declined to elaborate to us. now it is up to regulators whether they decide to investigate. i imagine spotify will try to rally support from other services, whether it is a european music service or other companies that feel wronged by apple.
meantime, the ethiopian flight recorders indicate that the plane was configured to dive. joining us to discuss is george ferguson. what do we know about the software update and will it be enough? george: the software update i heard about is that they are going to take information from two sensors on the airplane rather than just one. there is an attitude sensor that they were taking data from previously, and they will use both to indicate whether the nose needs to be pushed over to prevent a stall. there still seems to be a lot of confusion among pilots about how to shut off this stall inhibitor once it starts, and that is the problem they had on the ethiopian air flight. it is not convincing that this is the final fix, but it is good to see them making progress. >> if indeed there were lives at stake because of this software issue, why, why, why did it take so long for boeing to roll this
out? it has been months since the last crash. george: i totally agree. after the last crash, the lion air crash, the thought was that the sensor on the airplane might have been most of what was wrong. with the ethiopian air flight, we do not think that there is a sensor at fault, but we do not even know yet -- the concern could be that it was more software driven than sensor driven. >> when will we have more definitive answers about the crash? >> as soon as we get the black box opened, we will see more definitive answers about whether all of the systems on the plane were working. what i have heard through the field about the push of the stall inhibitor is that it can be bulky, hard to use, and hard to figure out how to shut off. i wonder if boeing will get in the cockpit and figure out how to shut it off. that is the problem here: the pilots did not know how to get the pushover system shut off. >> george ferguson at bloomberg
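the two-sensor fix george describes -- reading a second angle-of-attack sensor and standing down the automatic pushover when the readings disagree -- can be sketched roughly as below. this is a hypothetical illustration of sensor cross-checking in general, not boeing's actual mcas logic; the thresholds, names, and command strings are all invented.

```python
# hypothetical sketch of a dual-sensor cross-check, not real avionics code.

DISAGREE_LIMIT_DEG = 5.5   # assumed max allowed spread between the two sensors
STALL_AOA_DEG = 14.0       # assumed angle of attack that triggers the pushover

def pitch_command(aoa_left, aoa_right):
    """return an automatic pitch command from two angle-of-attack readings.

    with a single sensor, one bad reading can command nose-down on its own.
    with two, a disagreement disables the automation and leaves the pilots
    in control instead of acting on data that cannot be trusted.
    """
    if abs(aoa_left - aoa_right) > DISAGREE_LIMIT_DEG:
        return "disabled"            # sensors disagree: trust neither
    aoa = (aoa_left + aoa_right) / 2
    return "nose_down" if aoa > STALL_AOA_DEG else "neutral"

print(pitch_command(15.0, 15.4))   # both read high -> "nose_down"
print(pitch_command(15.0, 2.0))    # one faulty reading -> "disabled"
print(pitch_command(3.0, 3.2))     # normal flight -> "neutral"
```

the design choice being illustrated is redundancy: a single-sensor system has a single point of failure, while a voting or cross-check scheme converts a sensor fault into a safe fallback rather than an erroneous command.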
♪ emily: this is "bloomberg technology." i'm emily chang in san francisco. let's return to the top story. tragedy struck christchurch, new zealand when an armed terrorist attacked two mosques, killing 49 people and leaving dozens wounded. those responsible posted a live view of their attack on facebook. even after it was taken down, that video was posted thousands of times online, raising the question if tech cannot censor itself, what does the government need to do? i want to bring in jamil jaffer, the vice president for strategy and partnership at ironnet cybersecurity, and gerrit de vynck. we are getting new information
about google and youtube and how they handled this behind the scenes. the most stunning is that youtube itself removed thousands of re-postings of this video. tell us more about what youtube is saying. gerrit: we are not getting much more from youtube. their initial reaction was that they were trying to find any postings and take them down. now they say they have removed it thousands of times. the criticism here is why was it going up in the first place? if we knew this video existed, why couldn't they teach the algorithms quickly to recognize new posts and take them down before they showed up? it seems the technology to do that quickly is not there yet. emily: there are so many disturbing things about this story. the video going online even once, and thousands of people reposting it. i mean, can these companies even handle this, or has the monster already been created -- or the genie, whatever -- that you cannot put it back
in the bottle? jamil: they don't want this stuff on their platforms. they don't want terrorist videos, whether it is of this variety or isis beheading people. at the end of the day, they depend on user eyeballs. people are not going to come back if there are videos of people being murdered or beheaded in the middle east. these platforms have a desire, an economic desire, to get this stuff off their platforms, and they are working on it. have they done enough? no. can they do more? absolutely. is the right answer regulation? we know how slow federal regulation and laws are. it is kind of silly in my mind. emily: let's talk about the execution. murders and suicides have been livestreamed on facebook before. maybe the problem is not with the technology. maybe it is the fact that facebook allows livestreaming at all. you could easily go to youtube and see plenty of things that
cross the line that have not been taken down, with regards to this or other issues. many conspiracy theories have proliferated on youtube for years and the company has not done enough about it. jamil: i think there is a line between conspiracy theories and things -- people have different ideas about the world and want to express these views. there is a line between that and what you call hate speech or incitement or murders and suicide. i suppose you could have laws that police that line. in the u.s., our philosophy is to let people step out of line, and we only bar the most extreme
things. we let the markets sort the rest of that out. i think you are right, there have been challenges. there have been things you and i may not want to see, or ideas we think are silly. 9/11 conspiracy theorists. that being said, it does not mean the platform has a responsibility to take those things down. what responsibility do users who retweet and repost this have? i think everyone needs to take a close look at what they are doing. if you repost it, you are part of the problem, not the solution. emily: yes, but also maybe the way the platform was designed to allow the reposting to happen in the first place is also part of the problem, right? gerrit: i suppose you could argue that. emily: talk to us a little bit -- we had tom wheeler, the former fcc chair, talking about the very line that you are trying to draw. take a listen to what he had to say. tom: clearly, there are first amendment rights and there are rights of speakers.
but, we need to figure out just where do you draw the line? what is the equivalent to shouting fire in the crowded theater in the internet age? emily: jamil, where do you draw that line? jamil: i think the line is where it has always been: shouting fire in a crowded theater. this is not that. this is something that is vile and disturbing. although, this might be designed to incite. it might cross the line now that i am thinking about it, because it is designed to cause terror and further hate. maybe it does cross that line, but the question is, do we want the government regulating that, or do we want these platforms to do what they are going to do anyway? it is clear what is happening -- they are not good at it. can the law make it better? that seems unlikely, and that is where i worry about regulation getting in the way of changing technology in ways that would be productive, which is rapid removal of this bad stuff. emily: if the government can't
do it, or should not do it, or the law, you think, should not necessarily apply here, and companies are not doing it, then where does that leave us? how do we keep it from happening again? jamil: i think people can vote with their feet. if you are seeing stuff on social media platforms that you do not like, you can spread that. you can make that clear by getting off the platform. i think these companies will realize -- i think they are already realizing since the last election that they are not benefited by having this material on there. there was a huge rebellion online against terrorist views overseas. you are seeing the same thing happen here with this particular video. real reaction. there are always going to be people at these extremes that want this stuff. we are going to push those people off the platforms, to the margins. the vast majority think that stuff is inappropriate. i think the platforms are responding, and they will respond faster the more those voices are heard. emily: in this case, we are talking about thousands and thousands of times this video has been reposted. it is not necessarily a super tiny, isolated voice. jamil, we could talk about this for a long time. i appreciate you being here and sharing your thoughts. jamil jaffer of ironnet. gerrit, i want to talk about another story. we learned on wednesday that the u.s. justice department has a probe into facebook's data practices and that probe is widening. we also know that google is the focus of a group of u.s. attorneys general. they are investigating consumer protection violations by google. tell us more about the scope of this case. gerrit: this is a story that was broken by our colleague. the scoop is that although there has been a lot of conversation about
antitrust, this is a moment where we have learned and can report that several attorneys general have begun an investigation into google, and it will focus on antitrust and consumer protection. it is a very early step, but it is a more concrete step than just talking about it and meeting about it and having congressional hearings about it. emily: how could this play out? this is the first time there has been such a wide-ranging probe since the microsoft antitrust issues back in the 1990's. gerrit: i think we still need to see where this is going, but i think the way one should read this is that it is just another note in this ongoing story of changing the way american regulators and the american public think about antitrust. taking a really hard look at what is going on with google, amazon, facebook, apple and maybe even microsoft again, and asking: have these companies grown too big and are they
stifling competition? elizabeth warren, running for the democratic nomination for president, has come out very strong and specific on this issue, going so far as to say which specific companies she would break up and how she would do it. she has brought a conversation that has been going on in some political circles for a couple of years now into the mainstream. i think this story of the attorneys general is another note in that story. emily: thank you so much for weighing in. i know you will continue to report on that for us. coming up, elon musk has unveiled the tesla model y, but a delayed release has rekindled concern about the company's cash position. we will have all the details next. this is bloomberg. ♪
♪ emily: qualcomm won the first trial in its global dispute with apple. a federal jury awarded qualcomm $41.6 million in damages on its patent infringement claim, boosting its contention that its technology provides significant value to smartphones. the verdict means qualcomm can ask the judge for a sales ban, but the decision does not cover apple's most recent smartphone models. tesla's elon musk has unveiled the electric carmaker's newest vehicle, the model y, a cheaper suv that will be available in spring 2021 for $39,000. a longer-range version will cost $47,000. musk spent much of his presentation talking about the company's struggles with manufacturing. i want to bring in gene munster.
i know much of what was announced yesterday did not live up to your expectations. how so? gene: i think there are three different pockets here, and i want to be clear about each of them. there is the timing of the model y. i incorrectly published something, and later put out a correction on this, saying that they are going to be late getting the model y out. elon musk has always said they would have the model y out sometime late in 2020, next year. i want to be clear that piece was on track despite my mistake. the actual car itself, i think, actually delivered exactly on expectations. all the specs, the price point. one piece that gets missed is that the price point really is a surprise when you compare it to other suv's and crossovers. it is basically half of what they cost.
at $39,000 -- if you look at the mercedes crossovers, they are all around $70,000. that was the spot they had in mind. the one piece that was a change is that we feel there will be a greater chance they will raise money in 2019. i think the street is generally accepting of the belief that they will in fact need to raise money. part of the reason is this loss in the first quarter. another part is the logistics challenges they have had getting cars to europe and china, which impacts the cash flow. they have this $566 million debt payment in the month of november. it is a mixed bag here, but i want to backtrack and clear up some of my own mistakes and give people a good snapshot of what happened yesterday.
emily: thank you for being straight up about that. i want to talk about what the street is thinking. i have a chart on the bloomberg showing the number of weeks of selloffs versus the number of weeks of gains. so far in 2019, there is more red than green. so, how concerned are you about the cash position -- the manufacturing issues that musk spent so much time talking about? gene: it is a concern because this is a company that is ramping several cars, the model 3 and the model y. they have been open about their troubles in manufacturing. it is a concern for me. when i put that into context and add the higher probability that they raise money this year, i believe that would potentially be negative for the stock in the near-term. ultimately, the concern about raising money and production is near-term. i would quantify that as a one
or two year phase of this investment. the good news for tesla -- and this is the reason we still believe in this story -- is that the curve from gas to electric will be so dramatic in the next decade. we are going from 1% to eventually 100%. i think tesla will survive and prosper in this. the simple take is this is quite a roller coaster, and it will continue to be a roller coaster in 2019. i think investors who have a longer-term focus should continue to believe in this. emily: i'm curious about your thoughts on another story, the apple-spotify dispute. apple has released its response to spotify's antitrust complaint. elizabeth warren is saying these companies are due for a breakup and have too much power. how does this play into the issue, and who is in the right? gene: i think apple is in the
right. i think elizabeth warren is not in the right, probably not a surprise to hear me say that because i believe in tech. i don't think tech should be broken up, but i understand the basic concept of when tech gets too much power. i think in apple's case, apple does not really make any revenue from the apps available on the app store. they make some revenue if you look at apple music. emily: that is really what spotify is talking about. spotify claims apple got more hostile after apple released its own competing music product. gene: i think there is spotify and then there is the competitive issue with apple. apple is free to continue to use its own platform to its own good. i don't consider that anti-competitive when you look at the market share being global
in terms of smartphones, 18%. this is very different from microsoft and internet explorer back 15, 20 years ago. i understand why they would be upset on the music side, but if you look at the platform, what is more important is the platform overall. i think the platform overall is almost entirely for the good of getting apps out there, with apple taking their cut of it. emily: what about this developing apple-qualcomm situation, where a judge has ruled in qualcomm's favor and there are ongoing suits around the world between apple and qualcomm? in this particular case today, the ruling favored qualcomm. gene: this has been a war of words. a legal war too. big judgments, lots of appeals on those judgments. hard to say how much money has changed hands through all of
this. i think the simple takeaway is this -- the language is very strong between the two companies. ultimately, i think apple will find a way to work with qualcomm on the next 5g radio chip. down the road, apple will want to do exactly what they did with the a4 chip and their a-series of chips. they will want to vertically integrate this. from my perspective, there will be a few billion dollars changing hands here. this is giving apple even more confidence and motivation to vertically integrate around a chip competitive with qualcomm's. emily: well, qualcomm touted this as a victory. apple, meantime, continued to say that qualcomm's ongoing campaign is nothing more than an attempt to distract. we will continue to follow those other suits happening around the world. gene munster, good to have you with us. thank you.
♪ emily: former twitter cfo mike gupta has joined plenty, a softbank-backed indoor farming startup. plenty is creating vertical farming technology that could produce more in any given area than conventional farms with only a fraction of the water. gupta previously helped to take gaming company zynga and twitter public before leaving twitter in 2015
and joins plenty as cfo as the company plans to grow in china and the middle east. let's turn to another story we have been following all week. tpg has fired bill mcglashan, founder of the firm's social impact fund, after he was charged in a wide-ranging college admissions scandal. "bill mcglashan has been terminated for cause from his position with tpg effective immediately. after reviewing the allegations in the criminal complaint, we believe the behavior described is inexcusable." mcglashan, who said he sent a letter of resignation, led the business focused on social good and also recruited several high-profile investors to be part of this effort, including u2's bono and john kerry. sabrina willmer joins us from new york to discuss. there were some interesting back-and-forths here where mcglashan resigned and tpg said he
was fired. mcglashan released emails about the back-and-forth. walk us through why this is happening. sabrina: mcglashan basically preempted tpg's firing, if you look at the email exchange his p.r. sent us. mcglashan sent his resignation at 1:02 p.m. yesterday, and then tpg's co-head of the firm sent an email at 2:03 p.m. that basically said, on behalf of tpg, i acknowledge receipt of the note below. see the attached notice of termination of employment, which we were preparing to send you. we will be in touch to talk about the economic consequences of your termination. he responded, i am perplexed by your attempt to terminate me because, as you acknowledge, you
had already received my resignation. emily: mcglashan is already in a lot of trouble here. why is he doing this? is he trying to build some sort of case that he deserves some sort of financial compensation despite being terminated for cause? sabrina: it is hard to tell since i have not talked to him. usually, for cause means you probably won't get certain compensation when you leave. i guess it would make sense from his point of view to try to preempt it. emily: what does this mean for tpg and tpg's investors? like, how big a blow is this given the size of the fund? sabrina: i think it is more an optical thing, because a lot of tpg's largest investors are big public pension plans, and they hate negative news and headline risk. tpg brought in jim coulter, the
co-head of the firm, to take the place of mcglashan in the interim. i think investors feel ok about that. they do have a really big team -- more than 100 people on the growth equity and social impact team. i think that that is fine. it's just that, optically, it does not look good. so, they are in the middle of raising this social impact fund. it will be interesting to see how pension plans will react, whether they will pull their investment. usually money gets locked up for 10 years and lp's cannot get out of the investment. it is an unusual move for tpg to offer this. emily: we will see how many investors decide to stay. sabrina willmer, thank you. that does it for this edition of "bloomberg technology." we are live streaming on twitter.