
Boom Bust, RT, November 30, 2019, 9:30am-10:01am EST

9:30 am
If you've applied for an insurance policy, will something that occurred in your background influence that decision? In my opinion, what they're building is what I call a lifestyle index: they take your medical records and integrate them with all the other information Google has ever collected on you, and determine whether you're a good insurance risk or not, which really goes beyond what that information was meant for. So not only is there a civil rights component to it, there's also the question of whether it violated HIPAA. I say yes. No one knew about this project, Project Nightingale, until a whistleblower within the project basically went ahead and dumped this information on Daily Motion. So individuals had no information that their records were being used in this manner, and no consent, which violates HIPAA. And I would also say that the way the information was used and stored, and the security protocols around it, certainly did not meet HIPAA either. That was the question going
9:31 am
to follow up on. The big question has to be whether or not consent plays a role here. Exactly, and that's the other big issue: a company like Google having access to so many records, connected to social media, health care history, and fitness data, for all of these patients. So are we looking at a potential future where insurance companies can access these data and analytics and determine whether or not people can get coverage, or deny claims? Well, let's make this as simple as we can for everybody trying to understand. What is incumbent upon a doctor or a provider when it comes to medical records, the data collected from patients? Yes, doctors or doctors' offices are allowed to share certain data, but according to the 1996 law which established how that information is shared, it has to be for a medical function. In other words, Christy, you go to see your doctor, she gives you a referral to see another doctor, and then your doctor sends your data, your file,
9:32 am
over to that doctor, because there's a reason for them to have it. I see no reason in the world why someone would be able to use that provision of that law to then say, well, we're going to share it with Google, which has nothing whatsoever to do with medicine. So that's the real problem here: this information is sacrosanct, and it's being shared with a company whose business is sharing and collecting data. But is it really illegal, when they are structured as a business associate? Ascension has a business associate agreement with Google: Google provides the technology and it assists doctors with the information, so they're not some independent third party. That is the claim, and I get that, and I think you make a very valid point there, but the point is this: the way this '96 law is structured, just because they're your business partner doesn't mean you have the right to share information. They have to be a business partner in a way that allows you to share because there is a specific medical function that's going to be served for the patient, not for the
9:33 am
business. For the patient. And I think Miles would probably agree with me that that seems to be what's missing here. Well, Miles, to that point, isn't part of the issue here the fact that we're looking at Google not just delving into the part of health care that they're in right now, but developing AI with this? I don't want this to be missed in this discussion, because the AI component goes far beyond just collecting and processing information; they're actually building a new business model, new business tools of the future. So here's the question: with this AI, do we get to the point in the future where machine learning is telling doctors what they can and cannot do? Let's say the doctor doesn't follow that, and now health insurance companies say, we don't have to pay on that claim because you didn't follow what the machines told you to do. Absolutely. There are not enough health care professionals to cover the medical needs in the
9:34 am
United States. In fact, health information data doubles every 73 days. Are you going to use AI to do diagnostics, to develop tools to help doctors? Absolutely. But what's going to drive it is this information, this lifestyle index, and it's going to determine who gets treatment and who doesn't, who gets insurance and who doesn't. And that is the frightening point about it. Not only do I believe that Google has violated HIPAA, but they have probably violated several state privacy laws as well by going ahead with Project Nightingale. And can I say something here that sometimes gets lost in this discussion? The expectation of privacy that a patient has when he goes to a doctor is at a premium; it is so important. The reason I would go to see my doctor and tell him something very private about myself, that I wouldn't even want my family to know
9:35 am
about me, is because I have an expectation that that information is never going to get out. The damage that we do to the patient and to our medical system if we allow this type of thing to happen is unbelievable in terms of what it could do down the line, because it would mean that in the future I may not want to go to my doctor; I'm going to get on a plane and go to Mexico the next time I need a checkup, because you don't have that connection anymore. And then lastly, and I know we're short on time here, but something Miles said about state regulatory laws. Rick, I want to ask about this. Illinois has this biometric law that essentially everyone is getting sued under, from Vimeo to Dr Pepper, because they don't get consent. No one gave consent for this; it's been going on for over a year, and there's been no consent. How do you get around that? As Miles will tell you, in most practices, if you're a provider, even if you just have somebody answering phones, for example, that person has to be checked out to make sure that they're not
9:36 am
accidentally sharing information, even if it's just wellness stuff. So most doctors go through this very rigorous process to make sure something like this doesn't happen, and they usually get fined $4,000 for one violation. We're talking about 10,000,000 people here whose data was already shared. This is a major violation; I can't see it any other way. And Miles, very quickly, go ahead. Yeah, I just want to say there's no private right of action under HIPAA, but you can get a lawsuit under state privacy laws, either for negligence or breach of an implied contract. So what I'm going to start seeing is class action suits happening at the state level for negligence against Google. You can have the state attorneys general come in as well, and they're going to pursue this very vigorously, as they have been pursuing Facebook for the past several years. Real quick, Miles, before we go, I know we're running over here, but did Google know they
9:37 am
were doing wrong here, do you think? Simply by the fact that they kept it a secret: they didn't tell anyone, they didn't even tell doctors that they were doing this. Absolutely. Isn't that your answer, the very fact that they didn't tell them? There's a reason you don't say something sometimes; I think it kind of implies guilt. But then again, I'm not a lawyer. You may have thought it was convenient to have a voice assistant in your home, but it might be time to think again. A group of researchers in Japan and at the University of Michigan found a way to take over some of your home's personal assistants from hundreds of feet away. This included Google Home, Amazon's Alexa, and Apple's Siri devices, all by simply shining laser pointers, and even flashlights, at the devices' microphones. The researchers even opened a garage door by shining laser beams at a voice assistant that was connected to it.
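The laser trick works because the MEMS microphones in these devices respond to rapid changes in light intensity much as they respond to sound: ride a voice command on the brightness of a laser beam, and the microphone "hears" the command. Below is a minimal sketch of just the modulation step; the function name and parameters are hypothetical, and this is a block-diagram illustration, not the researchers' actual code.

```python
import math

def am_laser_drive(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's drive level with an audio waveform.

    Sketch only: `bias` keeps the beam switched on (light intensity cannot
    go negative), and `depth` sets how strongly the audio rides on the
    intensity. A MEMS microphone hit by the beam converts the intensity
    variations back into an electrical signal, as if spoken aloud.
    Input samples are in [-1, 1]; output drive levels stay in [0, 1].
    """
    return [min(1.0, max(0.0, bias + depth * s)) for s in audio]

# Toy "command": a 1 kHz tone, 64 samples at a 16 kHz sample rate.
rate, freq = 16000, 1000.0
tone = [math.sin(2 * math.pi * freq * n / rate) for n in range(64)]
drive = am_laser_drive(tone)  # values stay within the laser's valid range
```

The published attack also required precise aiming (telephoto lenses and tripods) and per-device tuning of the modulation depth; none of that is captured in this sketch.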
9:38 am
A recent report shows at least 26.6 percent of US households own at least one smart speaker, and as we enter the holiday season, even more will likely have them. RT America's hosts filled in on the show and spoke with tech executive and author Alex Salkever about the possible vulnerabilities of these systems and what responsibility tech giants have to lock down their hardware. So, it actually comes out of a long-running pattern of people testing out things that are popular. You can go way back to the day when you had people using Cap'n Crunch whistles to trick the circuitry of pay phone booths into giving free long-distance calls. Now it's a little more industrialized, where mainstream research teams are checking: okay, does this new device work or not, how can I crack it? Because all of these pieces are actually really critical to our overall security as a society going forward. Well, you mention society: as we move toward a more connected society, with all of these devices in every home and office, how
9:39 am
can companies like Amazon and Google put these devices out that are susceptible to these types of attacks? So, I think it's a really, really hard problem, because if you think about it, there are almost infinite ways to attack and test these devices, and it's really hard for someone like a Google or an Amazon to defeat every single one of them; human creativity is kind of infinite. Pointing laser pointers at it, or flashing lights at it: these are just part of the development of technology. I think for us to expect that these things will go out perfect is not realistic. That said, when we see these sorts of vulnerabilities pop up, we should expect to see them fixed fairly quickly. Now, there's a whole other line along that, too, which is that behind these things is invariably artificial intelligence, and artificial intelligence is still quite brittle, so it's still fairly easy to fool in a lot of ways. But on that point you just made: artificial intelligence is all new, obviously, and we can't, like you said, expect them to go out perfect, but there has to be some sort of security measure in place, correct, Alex?
9:40 am
Because what we're looking at is, you could open a garage door, which means your house is now vulnerable to somebody entering it. So obviously I know you're not saying they don't have any responsibility, but as a former tech executive who has seen the rollout of products like this, what are they doing, especially the big players like Amazon and Google, to prevent these types of things before they go to market? Well, I think pretty clearly they're not doing enough; I agree with you on that. I think they probably are testing some vulnerabilities in their skunkworks or their labs before they send them out. What I think is missing is this: when you look at commercial software, for example, Google has an entire team dedicated to what they call zero-day vulnerabilities in software, looking for things that could bring a server to a halt that aren't necessarily known out in the wild. Apple has
9:41 am
a bug bounty program, and various software companies have bug bounty programs, so they've put things in place to try to mitigate or spot these problems in advance: different incentive structures, different testing programs. Now, things like voice assistants are in very early days, so I think we still haven't built that out yet, and I think it's incumbent upon Google and Amazon and all these other folks to now go and say, okay, we need to put in place more robust internal and external testing programs to incentivize and create a market for finding these types of flaws. Well, Alex, these flaws, like you said, are inevitable; are we ever going to really be able to contain 100 percent of this? Over 25 percent of US households own one of these smart speakers. Is this something we should really be relying on, or is this something that's going to become more common? Like I said, is it ever going to be 100 percent containable? So no, it will never, ever be 100 percent containable, just as software bugs have not been
9:42 am
100 percent containable. Over time you will see systems that hopefully will be more resilient and more secure; I think that's a reasonable ask. But in terms of saying we want 100 percent, it's never going to happen, in the same way you're never going to have a car that never breaks down, or never have a plane crash ever. Things will happen; that's just part of complex systems. Now, that being said, I think there are lots and lots of areas where we should build in better ways to absorb failures, as we start to see more of these devices in US households become a more important part of our infrastructure, of how we interact with finances, of how we do all these things in our lives. I think the other part of it, which we should just get used to, is that failure will tend to be more catastrophic: when things go really wrong, they're going to go really, really wrong. You've seen almost the same thing in airline crashes, where now, when you see an airline crash, almost everybody dies. As we get toward things like automated cars, where the systems are driving or have more control in
9:43 am
our lives, or these speakers, the vulnerabilities will be worse and the information exposed will be worse. And I think that's part of the tradeoff of heading into a world where we allow these systems to take more and more responsibility and more control. Time now for a quick break, but hang here, because when we return: as technology continues to speed forward, should society tap the brakes long enough to sort out the developments? We spoke with Jowan Österlund, out of Copenhagen, Denmark, about the growth of microchipping in humans and what role the controversial practice could play in a capitalist society. So what we've got to do is identify the threats that we have. It's crazy to let it be an arms race of often-spiraling dramatic development; mostly I'm going to resist. I don't see how that strategy will be successful. It's a very critical
9:44 am
time to sit down and talk. The whole industry is based on greed, and that greed is based on this rush to accumulate as much paper wealth as possible, even though it's not genuine wealth, not actual money, not gold. Warren Buffett just hoards money, like an old granny hoarding phone books, and that does incredible damage, because of what it inflicts on the population, this notion of hoarding.
9:45 am
9:46 am
Facebook is under fire yet again, this time for invading its members' privacy. I think we might sense kind of a theme with Facebook and privacy. It turns out the social media giant could have been sneakily using your camera while you were using the Facebook app, at least if you're using it on an iPhone. RT's correspondent filed this report to break it down. It appears that a system bug is allowing Facebook to access users' iPhone cameras while they're using the Facebook app and scrolling through their feed, and the news is now sparking fierce reactions. Numerous Facebook users took to Twitter saying that their phone cameras were activated automatically whenever they were using the Facebook app. "That's an invasion of my privacy." "This is social media; we don't need that." One user, Joshua Maddux, tweeted out a video showing that when he scrolled through his social media feed, the app would actively use the camera. He said he had found a Facebook security and privacy issue: when the app is open, it actively uses the
9:47 am
camera. "I found a bug in the app that lets you see the camera open behind your feed. Note that I had a camera pointed at the carpet." Earlier this month, several other users spotted the bug. One user posted this, and said: "Today while watching a video on Facebook, I rotated to landscape and could see the Facebook/Instagram story UI for a split second. When rotating back to portrait, the story camera UI opened entirely. A little worrying." Facebook confirmed a bug in the latest version of the iOS app. The company said in a statement: "We recently discovered that version 244 of the Facebook iOS app would incorrectly launch in landscape mode. In fixing that issue last week in v246, launched on November 8th, we inadvertently introduced a bug that caused the app to partially navigate to the camera screen adjacent to News Feed when users tapped on photos." Although the company submitted the fix to Apple, users are still outraged, some even considering closing their Facebook
9:48 am
accounts. "Yeah, I might be thinking of closing my account. I don't like my rights being violated." "It makes me want to shut off Facebook. I don't want anybody getting at my private information like that; that's invasive. Absolutely not." Meantime, until the fix is completely approved, experts are warning users to revoke camera access for the Facebook app until the update is available. This is just the latest scandal Facebook is battling as it tries to win back the trust of its users after a series of privacy scandals in recent years, including allegedly eavesdropping on its users through phone microphones in order to better target advertising, although the company has repeatedly denied those allegations. Reporting for RT. Imagine showing up for work, and instead of swiping a key card to get into the building, you simply wave your hand, and
9:49 am
a microchip inserted under your skin gives you access. For some people in Sweden, they don't have to imagine this life; it is their life, thanks to a company called Biohax. The biochip has taken off as an increasingly common alternative to the use of key cards and tickets. After being injected with a syringe between the thumb and the forefinger, these chips, roughly the size of a grain of rice, are designed to render users' lives more convenient by replacing the need for cumbersome amounts of cards, passcodes, and signatures. We were joined by the founder and CEO of Biohax, Jowan Österlund, all the way from Copenhagen, Denmark, to break this down. We started the conversation off by asking just how common it is to see microchipping in Sweden. So, I'd say it's per capita more common than any other place in the world. Most people use it for single sign-on on computers, to rid themselves of passwords, for gym access, train tickets, payments, smart office, smart home, personalization. I mean, we're
9:50 am
launching payments in the next couple of weeks, which is going to relieve anyone from having to handle money. I mean, paper notes are extremely cumbersome for any nation-state, and just clutter. So, it's pretty incredible technology. Can you explain a little bit about how it works? Because obviously people lose their jobs, they move to new apartments, they change their gym memberships. What is the process for updating that microchip once it's already inserted in the skin? So, it would be kind of bad if you had to replace the chip every once in a while, every time you move or whatever. Instead, you become the root key, rather than having to handle ten, fifteen, twenty keys and disconnected tokens. So the only thing you do is, instead of getting their
9:51 am
card, you supply them with your key, and they facilitate that in their system for the duration of your being a member. It's the same thing with a job: today you load credentials onto a card via a system, so if someone changes jobs, you just remove them from the system. But here we make the user the root key, so the user is the distributor of their keys, rather than the other way around. So you're basically flipping the entire model, which is very interesting. Now, in 2017 there was an attempt by a Wisconsin company to implant these chips in its workers; the company, Three Square Market, didn't receive a very positive reaction. So have you found, in other countries and specifically in the US, that there is a picture of the lowly worker being chipped, monitored, and tracked? I mean,
9:52 am
you can't escape the fact that perceptions lean toward 1984, an Orwellian dystopia. And who wouldn't think so? Because every time we've ever seen a microchip or implant featured in a movie, or anything else, it's either a covert tracking device or explosives. That's all the information we have, or had. This kind of flips it, by putting you in control and making this tiny thing a part of you. You dictate preferences, you dictate what you share, you dictate everything about it. So it's not the employer saying, employee, get this chip; it's like 99 percent the other way around, an employee saying, why don't we use
9:53 am
this badass technology in our workplace? I mean, the media does portray it in all the different ways, but I'd say it's a good thing to be scared of technology, because if you are, you get informed. So I'd rather scare people into informing themselves about what it actually is and how it works, than the other way around. It is interesting how the user becomes, as you said, the root of all of this, and shares that identifier. But what about the issue of hacking? I'm sure a lot of people are concerned about this as well, because these chips do emit a signal, essentially, an NFC signal I believe. Is it possible for hackers to read those chips separately from their intended use, to pull information from them, to change them? How do you avoid that issue of hacking?
9:54 am
So, nothing is unhackable, let's get that straight. Our primary goal is basically making everything so stupendously boring to try to get at that the hack would never reward itself: once you got to the goal, you'd get an extremely compartmentalized set of bits. So unless it's a nation-state, or a couple of nation-states banding together to try to pwn a blockchain, there would be no reward at the end of it. You just make it boring, with no big reward at the end of the road, and no one is going to hack it. So what do you think of this? Because this interview completely changed my mind about what I thought it was all about. I was compelled to
9:55 am
change my mind. I was completely against it before, because I thought it was such an invasion of privacy, that companies would be able to track you and everything like that. But the way that he's implemented it, with you being the root of it all, actually makes it so that you are the one giving access to all of these people, not the other way around. Yes, and I do like the fact that it flips that whole idea of being microchipped on its head. However, it doesn't eliminate the reality of being tracked. Everything he talked about, which was very interesting, doesn't change the fact that, because it is emitting a signal, and because you are the root, that's not to say employers can't still use it to track that identifier, or governments track that identifier, to find out where you are and what you've accessed. So again, he used the term Orwellian, and I think if you look at it from an Orwellian point of view, if you have the technology, governments will always look to find a way to use it for their benefit, not for yours. So what governments do you think would actually be able to use it? Because every single government here,
9:56 am
especially in Europe, or in the United States, has extremely different laws concerning user data and user privacy. So who do you think would be more accepting of this kind of technology? From your example before, I don't think any government: keep government out of my microchips, at least. Either way, we've just got to keep an eye on it. Well, that's it for this time. You can catch Boom Bust on DirecTV channel 321 and Dish Network channel 280, or as always hit us up at youtube.com/boombustRT. We'll see you back here next time.
9:57 am
Is your media a reflection of reality? In a world transformed, what will make you feel safe: isolation or community? Are you going the right way, or are you being
9:58 am
led? What is true? What is fake? In a world corrupted, you need to descend, to join us in the depths, or remain in the shallows. Young elephants have come to us after basically brutal poaching incidents, because sadly the baby elephants often see their mothers not only killed, but also cut up and butchered. I do believe elephants mourn; I see it in the little ones, and they all surely express some change.
9:59 am
10:00 am
A man suspected of stabbing two people to death in the British capital is revealed to have had a previous conviction for terrorism, as the suspect in another stabbing incident, in the Netherlands, remains at large. US Democrat strategists who once worked for Barack Obama are accused of creating fake local news in a bid to sway the 2020 election in crucial swing states. And Germany's chancellor declares the migrant crisis officially over, though surveys suggest integrating newcomers to the country hasn't been the greatest of successes.



Uploaded by TV Archive on