
The Communicators: Data Privacy | CSPAN | August 19, 2019, 3:45pm-4:20pm EDT

3:45 pm
>> host: This week on "The Communicators" we want to introduce you to Daniel Castro. He's the vice president of a group called the Information Technology and Innovation Foundation. Mr. Castro, what is that group and what do you do?
>> guest: I'm with a think tank. We focus on innovation generally. We're interested in seeing innovation move forward and in how you can have policies that support that.
>> host: Can you give an example of an innovation you are in favor of?
>> guest: Ten years ago no one was talking about artificial intelligence and all the benefits that were about to come with this new wave of automation. We were thinking about that, and thinking about the policies we would need to have in place to get there. We try to help policymakers get ahead of the curve.
>> host: Who funds ITIF?
>> guest: We have a whole set of funders, from corporate donors to foundations and individuals who support a think tank. We work on a lot of different
3:46 pm
issues, so we get support from those that are interested in everything from IP issues to biotech.
>> host: For our purposes on "The Communicators," is it fair to say large Silicon Valley companies are part of your funding operation, the Googles, Apples, and Facebooks of the world?
>> guest: Absolutely. These companies were early supporters because they're interested in this idea of how we proceed quickly with innovation.
>> host: You also run, you are the director of, ITIF's Center for Data Innovation, which is what?
>> guest: We have a research center focused on issues around data. For a long time policymakers realized they had a few different levers in government: they could tax things, they could spend money, they could regulate things. Part of our idea was to say you should also think about how you collect and use data within the government. You should have smart data policy to drive the different kinds of goals you might have.
3:47 pm
If you want to see cures for cancer, you should have a strategic policy around data.
>> host: Do you find that agencies are well staffed when it comes to data protection and data officers?
>> guest: We are getting there. One of the first issues our center focused on was the Open Government Data Act, which we had been working on for five years. It finally passed just this year. Part of the Open Government Data Act requires all federal agencies to have in place a chief data officer. They had a requirement for doing this by July and reporting out to OMB who they selected by August 2. When I last checked, there were about four agencies that still had not done it, but most of them had, and that's a significant amount of progress. You have agencies paying attention to what data they are releasing, but also to what data they collect and how they manage it through the entire lifecycle.
3:48 pm
>> host: So the purpose of the Open Government Data Act is?
>> guest: It's to make government data available for use by the public, by corporations, and by individuals, for innovation and other purposes. It also requires agencies to be strategic in how they manage it.
>> host: The other half of that is the data they collect.
>> guest: That's right.
>> host: What are they collecting about us? What do federal agencies know about us?
>> guest: The Open Government Data Act applies to all government data, whether it is weather data, corporate data, or individual data. If it's individual data, personally identifiable, they're likely not going to be releasing it, but they are going to track it. Different agencies do different things. Some collect everything from health information on veterans, to educational data about individuals who are applying for grants, to information about commercial transactions that
3:49 pm
still have personally identifiable information in there.
>> host: Do you believe, and maybe this is a remote question that doesn't matter, but should those agencies be allowed to share data between themselves, such as the TSA sharing with Social Security, et cetera? Or should it be stovepiped information?
>> guest: There's certain data we do want to protect and keep confidential. So, for example, one of the reasons people generally trust the IRS, even though they might not like it, is they know the IRS isn't going to take that data and turn it over to the Department of Justice just to start some fishing expedition. Some of those privacy safeguards are important. That said, we do see a lot of problems with stovepiping in government. So, for example, there are about half a dozen or more statistical agencies in the United States that are trying to figure out
3:50 pm
how the economy is working and to answer some basic questions about that. Those agencies aren't able to share data. They end up with different answers. They are not able to combine the data for better analyses. They face significant challenges. That's a problem because it's wasting government resources and taxpayer dollars, and it's creating less optimal outcomes. One of the other challenges in this space is that government agencies are starting to think about how we can get data from the private sector. Sometimes the private sector has much better data, so how can we use that data in helpful ways but still treat it confidentially, or treat it confidentially but still share it across some agencies for very specific purposes?
>> host: What do you think the issues are that people would be concerned about with the government getting data that's currently held by a private entity?
>> guest: A lot of people have rightful concerns about government intrusion into their personal lives. We've had very strong privacy
3:51 pm
safeguards in the Privacy Act that protect what government can do in that space. That said, as we enter this new era of much more private sector data collection, there's a question of can we do more? Let me give you a concrete example. You have a company like ADP that does a lot of data processing for payroll across America. They're going to know, every time a company submits its payroll, what the state of the economy is. They can see what's changed from the weeks before. They can see if there are fewer workers out there. They can see these types of changes in real time. That's information that can be useful for policymakers as they are trying to respond to a potential downturn in the economy, or respond when they're thinking about what monetary policy should be. It's a very legitimate question to say, can we continue to have the long-established protections of how we want to treat citizens while recognizing that the government doesn't always have the best data and maybe
3:52 pm
sometimes we need to go to the private sector for that?
>> host: On a different note, and perhaps a darker note, should Equifax, a company like Equifax, be allowed to share their data with the federal government? Some people would be very uncomfortable with that.
>> guest: Equifax is, I think, an example of a company that has had a lot of challenges, and a lot of Americans are probably upset with them. Probably a lot of Americans didn't know that company a year or two ago, and then they got this announcement that not only has there been this massive data breach, but their data was held by a company they had never heard of. That's a problem. It's a problem for a lot of reasons. One, a lot of what we rely on for companies to have good data practices is market behavior; companies basically respond to the market. If I'm unhappy when there's a Target data breach, I can no longer shop at Target.
3:53 pm
If I'm unhappy that there's an Equifax breach, there's not a lot I can do about that. That's a significant problem. There are certain companies that are collecting data about individuals where consumers don't have a significant amount of control, because they don't have a direct commercial relationship with them. There's a legitimate question to ask about what kind of government oversight is appropriate, and even when that data should be available to the government or anyone else.
>> host: What does a company like Equifax currently know about us?
>> guest: They are trying to collect data on people's credit history. They will collect personally identifiable information: where you live, your Social Security number, your credit card history, any loans you've taken out, any mortgages you've had, that kind of information. They will compile it in a large database and make it available to other companies that are looking to assess your credit.
>> host: In other words, they are selling our information.
3:54 pm
>> guest: They are monetizing it, which is, I guess, the reason I would be hesitant to use the word selling. When most people think of selling, think of selling your car: at the end of the transaction, if I sold my car, I don't have a car anymore and you have that car. When these companies are monetizing the data, as I said, they are not turning that data over to somebody else. They are just giving you an answer about it. They are saying this person has good credit, or this person is a high risk or a low risk. They are not necessarily sharing that banking information with the other entity.
>> host: Is it a good system?
>> guest: There are parts of it that work really well, right? The part that works well is that we get credit. It's very easy to go and open a new line of credit. It's easy to go buy a car from a dealer because you have this information that's available to you. We also have some pretty good protections in place with the Fair Credit Reporting Act: if
3:55 pm
there's wrong information, we can get a correction made to it. I think the problems we have in this space, there's a few. One is that each state sets its own laws around some of these requirements, about things like credit freezes. There are mechanisms in place to make this world safer. You can freeze your credit. You can unlock it. In some states that's expensive to do. That's a problem. Basically you have to pay these companies to secure your information. I think that system is fundamentally wrong and should be changed, and that's something we need to change either state by state or by getting a federal law that would fix it.
>> host: We Americans tend to be trusting people, in a sense, until we're not, and then when a breach like the Equifax breach happens, or the recent Capital One breach, we get a little antsy about our personal information being out there, don't we?
>> guest: I think we do.
3:56 pm
>> host: Is there a solution? Is it a fine? Is it new legislation? I mean, where do we go?
>> guest: What we have now is not working, and people are getting increasingly fed up with the announcements of here's another data breach. Sometimes there is no penalty at all. As we saw with the Equifax breach, there was an announcement that you could get ten years of free credit monitoring, or you could get $125. It turned out that if you asked for that $125 and everyone else asks too, there's only a small pot of money, and you might end up with five dollars or something less. I don't think the system is working today. There are ways to change it. One way we can change it is by looking at what people are going after. The reason there are all these data breaches is because attackers are going after certain types of information: the valuable information, like Social Security numbers. That is only valuable because
3:57 pm
you can use it to commit fraud. The question we can ask is, can we make that data less valuable? One thing we could do is make it illegal to use Social Security numbers for identification and verification purposes outside of Social Security. This is something Social Security numbers were never intended to do. For a long time it said on the card, this is not for identification purposes. They stopped printing that, but that's something that could be done. That could be a requirement, that nobody could ever open an account using a Social Security number; you have to prove your identity through other means. If we did that, just to be clear, the reason for breaking into all these systems to get all this information would go away. You don't have attacks on data if the data isn't valuable anymore. Something else we could do is fix what happens after a data breach. Right now you get this offer of free credit monitoring. I've had probably five or six
3:58 pm
offers of free credit monitoring. I don't need more free credit monitoring. In fact, there are services now that offer free credit monitoring. Capital One, in fact, offered a free credit monitoring service before the hack. When they say they're giving you free credit monitoring after the fact, they are not doing anything different. No one needs, actually, more free credit monitoring. What we need are other things. One recommendation I have is that after a data breach, instead of offering free credit monitoring, consumers are offered a whole menu of options they can pick from. For example, they might get a free year of a password management service so they can have better password management. They might get a secure token so that when they want to log into an account they have better security, multifactor authentication. They might be able to get a secure electronic ID, and we can create a whole new market for
3:59 pm
security services that right now doesn't exist, because people rightly don't want to spend money in this space, and it's not a market until people are willing to do that. If we make it so that whenever there's a data breach we take one big step forward in securing Americans' online identities, that would mean we get closer to something more secure each time, instead of this situation we're in now where we have a new data breach, people roll their eyes, and we wait six months for the next one.
>> host: Mr. Castro, we talked to Kate Fazzini of CNBC on this program. She has a new book out called Kingdom of Lies. It's about hacking, and the way she writes, it doesn't sound like sitting behind our passwords on our personal computers is really a very good defense.
>> guest: It's true, absolutely. One of the things that I think
4:00 pm
is shocking to a lot of people is that the security they use to log into their bank account is often less secure than what they're using for their email. I know a lot of people who use two-factor authentication for their email. They get a notice on their phone and they have to prove it's them before the login. When you log into the bank, it's just typing in "password123" and they are in. That's a huge problem.
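The phone-prompt step Castro describes is typically a second authentication factor, such as a time-based one-time password (TOTP). The sketch below is purely illustrative and not from the program; it assumes the third-party pyotp library, and the secret is generated on the spot rather than tied to any real account.

```python
# Minimal TOTP sketch: a password alone is not enough; the login also needs
# the current code derived from a secret shared with the user's phone.
# Assumes `pip install pyotp`.
import pyotp

secret = pyotp.random_base32()   # created at enrollment, stored in the authenticator app
totp = pyotp.TOTP(secret)

code_from_phone = totp.now()     # the 6-digit code the authenticator app displays
print(totp.verify(code_from_phone))  # True while the code is within its time window
print(totp.verify("000000"))         # almost certainly False: guessing a code fails
```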
4:01 pm
... It seemed like it was a configuration error on their end. They made a mistake, a mistake that could have been caught. And it was a mistake. Mistakes happen, right? That's not an excuse, but at the end of the day these things do happen. It shouldn't have happened, they should have had better oversight, but it did. They were doing a lot of things right. For example, they had a bug bounty program, which is one of the best things companies can do: actively saying we will pay anyone who finds a problem with our system. If you find a problem, let us know and there's money in it for you. We want to encourage people to find these problems and bring them to us, and that helps us track down that particular problem and solve it. And they were doing other things that were right. They had updated systems; they'd done a lot of things right. They just had one big mistake. So that's why there's a lot of, I think, analysis in that
4:02 pm
particular one, to see what went wrong. And there are other companies, I think, that just never invested in security, and that's why they're not getting things right. Capital One probably made a reasonable investment; they just made mistakes, and that's something consumers are going to have to recognize. Data breaches of this size will continue to happen. The thing is, what can we do about the data so it's less valuable?
>> host: How did you get into this line of scholarship?
>> guest: My background is in information security, so I've been interested in these cases for a while, but I recognized that you need to have policymakers understand the issues very well too, otherwise you don't always end up with good outcomes.
>> host: Are the threats, the sophistication of the attacks, and our protection systems growing exponentially?
>> guest: I don't know if I'd say
4:03 pm
exponentially, but they're definitely growing. The sophistication of these attacks shows that the attackers are bringing significant resources, and they're very complex; a lot of these involve significant amounts of dedication to find the problem and exploit it. But the problem is it's just really easy, once you find that way into a system and get all that data, to start making a lot of money off of it on all these kinds of black market sites where you can sell people's identities and credit cards. That's also part of the problem. We have to have good cyber law enforcement for these types of crimes, to make it so that if you commit these crimes, you'll go to jail. You have foreign attackers getting away with these things very easily, and that's a problem too. Capital One happened to be someone in the United States, but that's not always how it
4:04 pm
works out.
>> host: That said, in a digital world, borders are meaningless.
>> guest: They are, and that's a huge problem, and that's one of the reasons these are international issues, these are global issues. We need to move away from this idea that we can secure just the United States, or just US consumers or US businesses. We want to address information security and the data issues. It's a global problem and we need to think about global issues as well. It's not enough to think that we're going to have this relative security where the US will be safe and we're going to be able to take down our adversaries. We need to think about raising the security of all of those.
>> host: In a recent article on the Information Technology and Innovation Foundation website, you co-authored an article, "The Costs of an Unnecessarily Stringent Federal Data Privacy Law."
4:05 pm
Here's just one of the key takeaways, and I want you to expound on this if you would: federal legislation mirroring key provisions of the European Union's General Data Protection Regulation or California's Consumer Privacy Act could cost the US economy approximately $122 billion a year, or $483 per US adult.
>> guest: Yeah, so right now we're in the midst of this huge conversation about whether we have new federal data privacy legislation, and it's being brought about by the fact that Europe passed a law and a lot of people are saying we should copy them, and California passed a law that might set the rules for the rest of the United States. So the question is, are we going to look to Europe and copy them, or let California set the rules of the road, or do something different? And the challenge in this space is that it can be very costly to do data privacy. That doesn't mean we shouldn't do it. It means we should be strategic about how we do it.
4:06 pm
The report was to start teasing apart the different components of what to do in legislation, and talk about where the value-add is for the different ones and how we could construct something that provides significant protection for consumers but keeps the price down. The problem with Europe is they moved forward with the data protection regulation and they don't have the same Silicon Valley the United States has, so they weren't interested in keeping costs down for companies and keeping costs down for consumers. They just wanted maybe the best privacy that money could buy, or where money was no object. In the US we need to be thinking about how we can get privacy regulation at a good value, not at any cost, because when you think in terms of any cost, you end up lowering consumer welfare. What we want to see is consumers coming out ahead, because they have better privacy but they still have access to innovative products and services and they're not cut off from all the things they like
4:07 pm
to use today.
>> host: What did you mean when you said lower consumer welfare?
>> guest: If you look at some of these proposals, they would fundamentally change the way the internet ecosystem works today, which is that we see a lot of ads in exchange for free services. If you change that, then one outcome is you'll see even more ads, because each ad will be worth less and they will be less targeted, or you've got to start paying for more services. You might start paying a nominal fee for your email service, or start paying more for some free video streaming service; you might pay more for the apps you download, where a lot of them are free right now. And that's why if you start asking consumers survey questions about privacy, if you ask them just in general, would you like more privacy, we all say yes across the board. Everyone wants more privacy, as they should, given the environment. But when you start asking how much they are willing to pay for privacy, you see the
4:08 pm
answers vary a lot. Some people have a lot of money, they care about privacy, and they'd spend $100 a month, and some do. But the vast majority of consumers want to see some kind of trade-off in this space. They want more privacy, but they don't want to pay that much for it. Maybe they're willing to pay a little bit more, maybe they're willing to see a couple more ads, but they don't want to see a significant shift in what's being done today. They want to see a shift on issues like data breaches, but when you talk about the types of ads, most people are pretty okay with that.
>> host: Mr. Castro, algorithms come into play at this point, okay?
>> guest: There's a lot of discussion about algorithms and the transparency of algorithms, and what kind of oversight exists in this space. This is where, I think, this is an emerging area.
4:09 pm
It's also old; we had debates about algorithms about 20 years ago around the old systems that were used for flight reservations. There were questions about how you decide which flight comes up first, because the one that comes up first is the one all the travel agents are going to book. There were a lot of discussions about this. And we're having some of those debates now, kind of again, in a different context.
>> host: What is coming up?
>> guest: One relates to things like facial recognition, and questions about how accurate these algorithms are. How much insight do we have into how different types of algorithms, especially artificial intelligence, work? How do we know they're working correctly? How can we explain the decisions that are made by them? How much transparency is there? And what do you do when algorithms are very accurate but we don't know
4:10 pm
why, and how do we manage that kind of trade-off? That's something where I think the context matters. In some cases, accuracy is paramount. If I'm going to the doctor and I'm having an algorithm diagnose me, oftentimes doctors don't know why they're making decisions, and algorithms often don't either. I personally would rather have an algorithm that is 99 percent accurate and can't necessarily explain why than one that is 80 percent accurate and can give me the reason. In another context, an explanation is necessary. So, for example, in certain contexts we want to make sure that factors like race aren't being used to discriminate against people, and there we need to have more insight and transparency into how those systems work.
>> host: There's been a little controversy about race and facial recognition. What's your view?
>> guest: What's interesting about that conversation is that
4:11 pm
it's been conflating two different technologies for the most part: facial recognition and facial analysis. Facial recognition is a system that takes a picture of your face and your ID and says these are the same person, or takes a picture of somebody's face, searches a database, and finds a match. Facial analysis is taking a picture and saying this person has a beard, this person is smiling or not smiling, this person is male or female, those types of distinguishing characteristics. A lot of the research on race has been on facial analysis, not facial recognition. There have been a few reports that have looked at race in facial recognition, and what we see is there are differences between demographics. So with light skin or dark skin there can be a difference in accuracy, and if you are male versus female, there can be differences in accuracy.
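As a rough, illustrative contrast between the two technologies Castro distinguishes (this sketch is not from the program), facial recognition compares a face embedding against an enrolled database to find an identity, while facial analysis only describes attributes of a single image. The embeddings, names, threshold, and attribute classifier below are hypothetical stand-ins, not a real model.

```python
# Toy contrast between facial recognition (identity lookup) and facial analysis
# (attribute description). Embeddings are made up; a real system would compute
# them with a trained face model.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Facial recognition: match a probe embedding against enrolled identities.
database = {
    "person_a": np.array([0.9, 0.1, 0.3]),
    "person_b": np.array([0.1, 0.8, 0.4]),
}
probe = np.array([0.88, 0.15, 0.28])
best = max(database, key=lambda name: cosine(probe, database[name]))
match = best if cosine(probe, database[best]) > 0.95 else None  # threshold is an assumption
print("recognition:", match or "no match")

# Facial analysis: describe the face in the picture, no identity lookup.
def analyze(face_embedding):
    # Stub standing in for attribute classifiers (beard, smiling, etc.).
    return {"beard": True, "smiling": False}

print("analysis:", analyze(probe))
```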
4:12 pm
But it depends on the specific implementation, so some algorithms are worse for black females, some are worse for black males or white males, and you see the performance vary based on these contextual factors. What people are concerned about, of course, is it really being used by police in ways that end up over-policing minority communities, which is important given the debates we're having about the role of policing. I think the problem in this space, as we've seen, is cities moving quickly to ban the technology before they try to figure out how to set up guardrails to use it, because there are many different types of uses of the technology. I think a lot of people, when they think about facial recognition, immediately think of this kind of China-like, real-time, pervasive surveillance, security cameras watching your every move, when in reality a lot of the uses are things like somebody comes in and they're simply trying to identify a person who doesn't have ID on them,
4:13 pm
or they have a suspect or a witness and they're trying to match a name, and this is something that police officers are doing manually, and doing slowly, and this is a way to use technology to speed it up. I don't think most people would have an objection to those types of uses, so they should definitely be on the table. And there are other kinds of, I think, more serious uses where people have concerns about the privacy implications of this kind of surveillance, and that's where we need to be having the debate. But we can't have those debates if you just have cities banning the technology without even seeing what's possible.
>> host: Final question. On your Twitter feed you identify yourself as pro-copyright and anti-SESTA, which is the sex trafficking carve-out to Section 230.
>> guest: There was a new law that came out recently that
4:14 pm
created a carve-out from Section 230 of the Communications Decency Act, which provides liability protection to third parties and intermediaries. If somebody loads up content on Craigslist, Craigslist isn't responsible for what's up there; the user who uploaded it is responsible. The law carves out content that is related to sex trafficking, kind of broadly defined, and for that content the platforms are responsible. So as a result of this law, we saw basically every internet platform across the board ban anyone who might have any type of relationship to sex work, not necessarily sex trafficking, because it was so broadly defined, and the critique among sex workers has been that this has significantly hurt them. Instead of being able to, for example, vet customers and arrange transactions online, they're walking the streets and subject to attack
4:15 pm
or worse. And I think in the past we saw the benefit of online platforms and online communities enabling people, empowering individuals and new communities. This particular law, I think, is really a setback, and it's because a lot of groups have been trying to, I think, make online intermediaries more responsible for the content users post, and there are some cases where that's very appropriate. I think in this case they got it wrong.
>> host: Daniel Castro is vice president of the Information Technology and Innovation Foundation. We appreciate your spending a few minutes with us here on "The Communicators." This "Communicators" and all others are available as podcasts.
4:16 pm
>> Weeknights we are featuring Book TV programs, showcasing what's available every weekend on C-SPAN2. Tonight the theme is science. Journalist Andrew Blum explores the resources used to develop the daily weather forecast, Nobel Prize-winning biologist Venki Ramakrishnan discusses DNA, breaking down the molecule and the science, and journalist Jon Gertner explores the history of ice in Greenland. Watch tonight beginning at 8:30, and enjoy Book TV this weekend and every weekend on C-SPAN2.
>> Tonight on "The Communicators," Daniel Castro, vice president of the Information Technology and Innovation Foundation, on data privacy and whether enough is being done to protect Americans.
>> One thing we could do is make it illegal to use Social Security numbers for identification and verification purposes outside of
4:17 pm
Social Security. This is something the Social Security number was never intended to do. For a long time it even said on the card, this is not for identification purposes. But that's something that could be done; that could be a requirement, that nobody could, for example, open an account using a Social Security number. You have to prove your identity through other means.
>> Watch "The Communicators" tonight at eight Eastern on C-SPAN2.
>> Tonight on C-SPAN at nine Eastern, millennial journalists on the future of journalism. They talk about industry changes and the news business. One reporter describes what it's like trying to be the first to break news in today's news cycle.
>> We move really fast and it can be an asset, but on the flip side it can be a liability. Sometimes we don't have what ABC News has, maybe fact-checkers checking stories; we have an editor or a copy editor who checks our work. So with the Parkland
4:18 pm
shooting, when it was moving very quickly, we misidentified the shooter just based on what some teenagers had told us: it's that kid, and then tracking down his photo and thinking, okay, yes, that looks like him. But we misidentified that kid. So especially in breaking news, with the rush of that and trying to be first, to step back and take a deep breath and question and ask and verify more, especially now with the pace of news, is more important than ever.
>> See the entire discussion tonight at nine Eastern on C-SPAN, as millennial journalists talk about how media is changing and the impact of those changes on news coverage.
4:19 pm
>> When Congress is on break, we're showing key hearings from the last session. The House Oversight and Reform Committee heard from patients on prescription drug prices and how they're dealing with rising costs. This hearing is about two and a half hours.
