tv Press Here NBC July 8, 2018 9:00am-9:31am PDT
part by barracuda networks, cloud-connected security and storage solutions that simplify it. scott mcgrew: this week, we delve into car technology. what safety device makes your car the most valuable on trade-in? we'll ask the guy who literally writes the "kelley blue book." plus, christy wyatt tries to find some good in all that data that corporations collect. all that, and venture capitalist hemant taneja, with reporters laura mandaro of "usa today," and joe menn from "reuters," this week on "press:here." ♪ scott: good morning, everyone, i'm scott mcgrew. we're gonna start with the story of a close call.
this is bill ley who was driving his car you see there, when a pedestrian suddenly jumped out in front of him. bill ley: then the car flashed two beeps in front of me, and then it just applied the brake for me. scott: now, the car stopped itself, nobody hurt, nothing happened, it's really the least exciting story ever. that's kind of the point. the lack of an accident, thanks to the computer in the car, kept two people's lives from changing for the worse. now, this idea, to prevent accidents before they happen, also saves money. an insurance group took a look at 327,000 hondas equipped with collision warning and automatic braking, and discovered insurance costs fell by 11%, injuries fell by 28%, and those are relatively inexpensive cars. this technology is now affordable. karl brauer knows everything there is to know about cars, he's the executive publisher of "kelley blue book" here to
answer all of our questions about cars this morning, joined by joe menn of "reuters," laura mandaro of "usa today." for the people who possibly don't know "kelley blue book," what is "kelley blue book?" karl brauer: so, "kelley blue book" is a consumer website and car-buying resource. anything you would ever need to do to buy a car and have knowledge to do it with confidence, "kelley blue book" will provide. scott: so, when people say the blue-book value, they're talking about the thing you write? karl: exactly, and it's a 90-year-old brand that's been around and helping people for a long time, but now it's a full-service website, too, that helps you do everything from finance and test drive, to locate the dealer, and know exactly what you should pay. scott: alright, "kelley blue book" man, tell me about-- let's talk about safety features. what's the one thing i should get in the car that's gonna increase its resale value? i'm guessing it's emergency braking. karl: you know, emergency braking is gonna absolutely help you in terms of resale value, it's also just a good thing to have, like you said, to keep your life and other people's lives from being changed. scott: okay, you're right, i did start it with, you know,
what's gonna be best for my wallet as opposed to what's best for humanity, but also, is it going to be pretty good for my wallet? is that the feature that maybe i should, when i want to sell it, show off? karl: it is, it is, and isn't it nice when those two things are the same thing, right? but i think the important thing to keep in mind is that this is coming rapidly. we're seeing this become affordable, as you mentioned, and we're seeing it become very effective and kind of strung together among multiple types of technology. it's not just emergency braking, it's lane-keeping assist, it's collision warning, these are things that can help anyone drive more safely, and when you look at the statistics, they're going the wrong direction. we're seeing them go up in terms of fatalities and injuries, and when you quote the numbers you just quoted, honda's actually seeing them go down for their own cars, so it's pretty impressive to see a company coming down in an environment where it's going up, and it really speaks to the effectiveness of this technology. laura mandaro: so, i had a question about, sort of, the consumer perception of these driver-assist, or it could be,
like, called auto-pilot, or pro-pilot. consumers are sitting there in their living room looking at these ads where the driver takes their hands off the wheel, maybe looks over here, it seems like the car is driving itself, or really in some tricky situations. then if you talk to the company, or look at the manual, or look at the fine print, it's all, like, driver has to be monitoring, and hands near the wheel, and they seem to be sending two different messages. do you think that these auto makers aren't really going far enough, or actually, sort of, putting people in somewhat more dangerous situations than they should be because the advertising is giving this image of a partially self-driving car? karl: well, it's a great question, and it's true that you're seeing this kind of dichotomy between what the manufacturers want to say, which is, "our cars are super-safe, and they can keep you safer, but still drive safely and don't assume the car's gonna save you no matter what happens." and it's a fine line to walk, and we've seen, in the last
couple years, a lot of manufacturers have had to change either terminology, or change the nature of a commercial. there was a mercedes commercial that actually had to get changed kind of last minute because i think the broadcasters and the safety advocates looked at it and said, "this is sending a bad message. you're telling people the car is gonna do things that, even if it can do, it shouldn't be encouraged to do, and consumers shouldn't expect it to do, that's dangerous in and of itself." joe menn: is this why the injury rates are going up even though the technology is improving? because we're in that awkward transition phase where we're taking too much for granted, we think it can do too much. karl: i think the injury rates are going up because of just general distraction. i think we are in an awkward period where there is information being sent to the car that used to never come into the car, and there's all these opportunities to be distracted that we didn't have even five years ago, let alone ten or 15 years ago. i don't think these technologies are really at fault. there may be some percentage of people who are a little more careless than they would have otherwise been, because they think about this technology, but i think that's the minority.
i think the bulk of the rise is just general distraction, in which case these technologies are doing more help than harm, because in many cases, they can take a distracted driver and give them an extra second to think about something, or an extra nudge, literally, to keep 'em in the lane when they're about to drift out of one. scott: as a car buyer, i don't wanna buy technology too early. the first time i got a plug-in hybrid, the ford fusion, which i really thought was cool, i leased it, because i thought to myself, "you know, if i buy this thing, by the time i have it paid off, people are gonna roll their eyes at the technology, 'cause it will have gotten so much better." what technologies, perhaps plug-in hybrids, i don't know, are a little bit too early to necessarily jump right on right now? karl: you know, that's a good question. i mean, i think we see evs, for instance, that struggle. they are one percent of the market, and they have a terrible resale value, as someone from kelley can tell you, and it's because things are changing so quickly
in that market. and you know, if you have an ev now with an 80-mile range, you're already old news, because you should have at least 150, that's pretty much the new standard now. so, it makes it tough for ev buyers to want to buy a car and not feel like they're gonna end up having a bad long-term investment. that's why so many are leased. the overwhelming majority of evs are leased, and really, that's the smarter way to go. but then you look at something like honda sensing, where it has this kind of network of safety features strung together, and no matter what happens the next ten years, those things are better than not having them. it's gonna always be better to have something that might stop the car and avoid hitting a pedestrian than if it hadn't had that technology. scott: is the best ev to buy, if one can afford it, with resale value in mind, the tesla? karl: resale value on tesla is better than the other evs, so, at this point at least, that would look like the better one to go with. you gotta start at $70,000 plus. scott: you know, if you're buying a $70,000 car, maybe resale value is not-- laura: if you can get one. scott: right, if you can get one. karl: if you can get one, exactly.
joe: well, i'm curious about the volatility. i mean, normally you wouldn't think of, you know, "kelley blue book" value of a car, you know, being one of these jagged lines like the stock market chart. tesla, who knows, i mean, it could, you know, whatever accident was in the news last week might impact what people are willing to pay for it. have you actually seen it go up and down as public perception waffles? karl: believe it or not, the stock moves around much more than the resale value, which is good for people who own teslas, or are thinking of buying one. i think you have a high demand for that vehicle, and, you know, supply and demand will always make that decision. and honestly, i think if tesla had something drastic happen tomorrow and they were really reduced in their ability to produce cars, or if something even worse happened to the company, i think the cars would still have value because they're these attractive cars that got a long range, and there's a huge fanbase out there. scott: isn't the other possibility with tesla, and i don't know, but the appeal of tesla is mostly range. i mean, they're cool looking cars, and they've got cool technology, it's the range, you mentioned, you know, 80 miles on a leaf, and 300 on a tesla, or whatnot. how soon until bmw, or mercedes,
or one of those tesla competitors catches up on ev range? karl: twenty-twenty, in the next two years, you're gonna see about four or five, minimum, new cars on the market that are gonna have at least 200-plus mile range, and they're all gonna be from well-known brands with big dealer networks. scott: can tesla survive that? karl: we'll see. [laughing] scott: alright, i got one more question, i'm up against the commercial, but i got one more question for you. karl brauer of "kelley blue book," what was your first car? karl: my first car was a 1969 plymouth gtx with a 440 in it. i grew up a muscle-car guy. laura: do you use something like an auto-pilot? do you let the car take over in that sense? karl: i trepidatiously will use auto-pilot. i will use auto-pilot, and i sometimes struggle with whether it's actually reducing or helping my driver stress, because using it causes me a certain amount of stress just like it's supposed to reduce. scott: karl brauer is a muscle-car fan, and the executive publisher of "kelley blue book"
data has become a major topic for consumers and professionals in silicon valley, but it's not just facebook that knows about us, google, obviously, but also a company that makes toasters. i looked at their website once, and they will now not stop trying to sell me a toaster on every webpage i visit. christy wyatt's company, dtex systems, looks at data and how it's used as part of its duty as a security company. she joins us this morning. i think what people don't realize is data follows them around all of the time. we are constantly generating data from what light switches we turn on, if we have a networked home, to when we turn up our nest thermostat, to when we plug in our ev. christy wyatt: you know, i meet someone, i remember their name, i write their name down, i put their name in a system, now it's data, right? and so, it's happening constantly, all the time. the thing that's changed is that data has become a form of currency, and so, with currency comes rules,
and regulations, and risks, and concerns. so, it's a new world. laura: i have a question, since some privacy rules are changing in europe, and some of the u.s. companies are changing their u.s. policies as a result. the one part of these privacy policies that we constantly find covers sort of a wide range of activity, and that, for the consumer and a reporter, is always hard to figure out, is the one where it says we may share your information with third parties to make the service better, for marketing, and it's just this kind of blanket statement, and hides a nest of things. and do you find working with companies that, i mean, what is the gamut there, and as a consumer, i mean, it kind of seems like almost any app has that, you're sort of out of luck. christy: yeah, and, you know, i think the world is changing, right? so, it used to be that was the fine print at the bottom of every application.
if you use an android device, you click through it really quickly to get past it, and then you kind of get into what you were trying to get done. when i say the world is changing, i think people are becoming a lot more aware of what those entities mean by "we'd like to share your information." what it means is, we'd like to instrument you and let people target you with advertising. and so, now we're starting to see sort of an evolution of people becoming more aware of where their data's going, having--forcing some level of accountability, both to the company as well as the third parties that are accessing that information. and whether we're talking about the regulatory landscape that's emerging with things like gdpr, and there's others, it's happening in the u.s. as well, or whether we're just talking about the consumer conversation, how people are responding when they see their data going in a direction that they didn't expect, i mean, this is all a part of the maturation of this data currency that's happening around us right now. joe: it is still early days with gdpr, which seems to be, like, the big gorilla here. it's hard for me to imagine that that's gonna turn off
the 20 or 30 tracking bugs on every website i visit. is it going to, sooner or later? is that really gonna get whittled down, or is this just gonna be sort of blanket check boxes, yes, i understand, you know, once, and then i don't say it again? christy: so, that's complicated. first of all, i'm loving getting all of these emails asking me to opt back in, because it means that the ones i don't respond to will hopefully go away, but you actually have to be very, you know, pay very close attention. you have to read them, they're already starting to embed fine print at the bottom that says, "and if you continue using our service, then we'll just assume that you just opted in," and so, you know, they're going to continue to try to be clever. i think what's going to be really interesting to watch is how-- where the accountability comes in. gdpr is really the broadest, because it's across europe, right? it's really the broadest implementation of a regulation in this space, and it actually has teeth to it, right? there are actual financial penalties for companies that don't protect your data, that don't protect your privacy.
laura: if you're in the e.u., though, i mean-- christy: it doesn't just apply to--if you are a u.s. business, and you have employees that came from the e.u., if you have customers in the e.u., if people are taking your products to europe, or you have europeans coming into the region and using your service while they're here, so i think it would be a mistake for a u.s. business to look at gdpr and think that that doesn't apply to them. in fact, i think they should see it as an early warning for probably one of the more mature frameworks we've seen. because a lot of different countries have experimented in this area, but you're going to see more, and you're going to see them get tougher. scott: even small companies, small websites that do sell into europe, those sorts of things. mom and pop operations need to follow these new rules, or they're gonna be in trouble. christy: absolutely, and you have to be very careful. i think if anyone went to the security show, rsa, there were thousands of companies claiming they could make you gdpr compliant out of the box. gdpr is actually complicated, and it actually takes some work, and there's multiple paths, right?
it's about protecting privacy, it's about being held accountable for protecting the data, so if i have users' data, am i taking appropriate precautions to secure it, to be able to know where it went, who touched it? so, it's much more complicated. joe: are any companies, in security or elsewhere, just collecting a lot less because of this? or is that just a fairytale? christy: i haven't necessarily seen a change in behavior from what people are collecting. i do think what we'll see first is a change in behavior around transparency. i think that the tolerance of end users and consumers is going to be far lower for the fine print at the bottom that just sort of says, "well, i put in a check box, and you opted in." i mean, everything that entities did with facebook data, or that they do with google data, or apple data, people have given permission for, to a certain extent. but you have to dig around to find it, and that's the part that needs to change. scott: i'm gonna jump in front of you,
because i can squeeze in one more question. [laughing] evaluate one idea for me, that is i know facebook has my data, i know google has my data, what if there were a neutral repository of my data? and much like, you know, you look on facebook, and it says, you know, "the following apps can access your facebook," or access your twitter. the following people can access your data. i typed it in, i'm this old, i live at this place, and this zip code, i'm interested in that, and i can revoke that at any time, or i can go in and change it, or just say, you know what, i don't want anybody to know my age anymore. is that a thing we could do? christy: so, i'll break that into a couple of different pieces. some of those facilities exist within facebook, within twitter, if you haven't already done it, it would be fascinating-- scott: within twitter, but i'm talking about like a, you know, everybody in the world just accessing scott mcgrew's data point. christy: there's a huge market opportunity there, and i think there's a number of different companies that are trying to figure that out, not just because it protects your privacy, but because, also, these companies are making billions of dollars on monetizing your data,
shouldn't some percentage of that be going back to you? and so, i think that there's an interesting solution there that we haven't yet found. scott: christy wyatt is with dtex systems, we appreciate you being with us this morning. well, up next, a silicon valley venture capitalist says his whole investing philosophy is based on one thing, when "press:here" continues. ♪
scott: welcome back to "press:here," if you have listened to a song on spotify, or booked a ride on lyft, or rented a room on airbnb, you've been using the same computer system all along. all of these companies depend on amazon to run the backend instead of relying on their own computer systems. amazon web services, aws, means any small developer can access the computing power of a company the size of netflix. netflix uses aws to send you movies. meanwhile, if you want to build something, you don't need a factory, companies like flextronics will build it for you, you just send them the design. microsoft uses flextronics to build its xbox. the idea here is we're at an inflection point. the little guy can rent his way to the top. hemant taneja is the guy who brought that to my attention, he's general partner at general catalyst, an early investor in stripe, and snapchat, only guy i know with five degrees from mit. he's got a new book, "unscaled, how ai and a new generation of upstarts are creating the economy of the future."
i have written down what you have said, and that is, "my entire investment philosophy is centered around unscaled." so, you better explain unscaled. hemant taneja: sure, so, i moved to silicon valley in 2011, and the pattern i started noticing was that founders are going after reimagining major pillars of our society, education, finance, healthcare, insurance, you name it. they used to sell software to these industries, they could never actually take on the idea of actually replacing major industries that have been around for a hundred years. so, the core thesis behind this book is that scale has actually run its course. are we happy with our healthcare system? are we happy with our education system? are we happy with our banking system? two thousand eight will tell you we weren't. are we happy with the government? i'll let you guys ponder that. so, the reality is, all these systems that have been-- that have done great service to society actually are reaching a breaking point. meanwhile, you've got all these founders that are reimagining
these industries and succeeding. i think they are succeeding for a simple reason, which is data and ai, and as you were saying earlier, because they can rent scale. and that is the pattern that i saw that led to writing the book. laura: so, what's your take? i mean, these days, you could say big data and ai, there's kind of this--you know, nervousness because we assume that a couple big players, just like we've seen with the internet, are gonna win that game, own, you know, have all the data, be able to kind of profile, surveil. joe: it's just a different kind of--somebody else will have the scale, but there'll be scale, and it might be scary. hemant: so, i think there's two core points in there, one is, you know, this technology is so profound, do we use it in a positive way or a negative way? most technologies are neutral. if you were staring at the internal combustion engine 150 years ago and somebody said this is gonna cause climate change, how would you have used it differently? i think ai and data is at that same kind of moment in time, but we're looking at just profound technology which has fast feedback loops. we've already seen it create mishaps in elections and other areas, right? how do we want to use it responsibly? i think the second point is, is it going to be a technology that's only available for these big companies and expand on their monopolistic advantages, or are we gonna make it available for the larger ecosystem? so, i do think those are the issues, where are we gonna self-regulate, or are we gonna regulate our way to solving for those, but those are two fundamental issues in front of us, around how data and ai are gonna be used. scott: hemant, a lot of ai companies come to me and say, "hey, we want to talk about it," and i'm sure they come to you, "hey, we want funding." how do you know, is there a way to know, without five degrees from mit, what company really is in ai? i mean, because ai has become this thing, like big data, like the blockchain, where everybody wants to be the blockchain of big data and ai, how do i know a company has got ai and it hasn't just got a computer?
the word ai in your company? because a lot of folks that have been around for a while, you know, were using traditional data technologies, but ai's a hot buzzword so it ends up getting used. so, that's my first filter. for new companies, i think it's very much about how are they using the data that their application or solution is generating, and how is the system learning to hopefully service the customer better and better in a more responsible way? so, that's what we look at to say what are the underlying technologies that they're applying the data to, to get to the bottom of that question? laura: well, i mean, if--do you see ai being something that's more off the shelf, that, you know, any small company, i mean, even like salesforce is using, or touting ai in its software, so that's opening it up to lots and lots of companies that use salesforce. so, i mean, do you think it's, you know, likely that a couple years, this is gonna be pretty standard, everybody's gonna be using it to some degree if they're running a business,
and the horse has already left the stable at that point, who is setting these standards for some kind of responsibility about all that data, how it gets synthesized? hemant: i think that's a great question. there's only two ways to do this, either these large technology platforms that have all the data, and frankly are using that to develop ai, are going to embrace self-regulation and demonstrate transparency, accountability, and explainability of ai, or you're gonna get regulated, in which case, by the way, as you saw in the senate hearings, our government isn't well equipped to regulate ai either, so where is the department of artificial intelligence that's going to actually then come and regulate? so, it has to be one of those two models where we have to head. scott: hemant, with 30 seconds left, let me ask you just real quickly, now that you don't need to own the stack, you can rent the computing power, you can rent the manufacturing, what's the number one industry or thing a young entrepreneur should be chasing?
because he or she doesn't have to own all that stuff. hemant: to me, a large portion of our economy, 20% of our economy is healthcare, that is where application of ai is going to make the most profound difference in the near term. scott: sure, hemant taneja with the new book, "unscaled," and we appreciate you, a friend of the program who's been on several times. thanks for coming back. "press:here" will be back in just a minute.
to "comunidad del valle." i'm damian trujillo. and today, a look back. our sit-down interviews with george lopez, vicente fox, and little joe, on your "comunidad del valle." male announcer: nbc bay area presents "comunidad del valle" with damian trujillo. ♪ damian: we're going to start today with our exclusive sit-down interview with former mexican president vicente fox. this interview was a few years ago, but it's timely given mexico has just elected a new president. vicente fox: i worked 15 years for the coca-cola company. i became president, ceo. i worked 15 years for private business, and i learned there about accountability, honesty, and hard work. and this is the way i have acted in politics.