
tv | The Communicators: Data Privacy | C-SPAN | August 19, 2019, 8:00pm-8:34pm EDT

>> c-span, created by cable in 1979, is brought to you by your local cable or satellite provider. c-span, your unfiltered view of government. >> host: we want to introduce you to daniel castro, vice president of a group called the information technology and innovation foundation. mr. castro, what is that group and what do you do? >> guest: itif is a think tank. we focus on innovation -- we're interested in innovation and how you can have policy that supports it. >> host: can you give an example of that innovation? >> guest: ten years ago we were talking about artificial intelligence and its benefits. we were thinking about that and about the policies we would need to have in place to get there, and so we try to help policymakers get ahead of the curve on these issues. >> host: who funds itif? >> guest: a bunch of funders, companies and individuals, that support our think tank. we work on a lot of different issues, so we get support from those interested in everything from it issues to
biotech engineering. >> host: for our purposes here on "the communicators," is it fair to say that large silicon valley companies are part of your funding operation -- the googles, apples, and facebooks of the world? >> guest: absolutely. those companies were early supporters because they were interested in this idea of how we proceed quickly with innovation. >> host: you are also the director of itif's center for data innovation. >> guest: we have a research center focused on all these issues around data. i think for a long time policymakers realized they had a few different levers in government: they could tax things, spend money, regulate things. part of our argument was to say that government should also think about how it collects and uses data, and have smart policy around data to help drive the different goals you might have. if you want to see cures for cancer or improve education, one way to do that is through strategic policy on data. >> host: do you find the federal agencies are well staffed when it comes to data protection and data offices? >> guest: we are getting there. one of the first issues the center for data innovation worked on was the open government data act, which we had been working on for about five years and which finally just passed this year. part of the open government data act requires all federal agencies to now have in place a chief data officer. they had a requirement to do this by july, and they were to report to omb who they had selected by august 2. when i last checked, about four agencies still had not done it, but most of them had, and that's significant, because they are paying attention not only to what data they are releasing but also to what they collect and how they manage it, and that's a big improvement for everyone.
>> host: the purpose of the open government data act is -- >> guest: to make data available for use by the public, corporations, and individuals for innovative purposes. it also requires agencies to be strategic in how they manage data. >> host: the other half of that is the data they collect. >> guest: that's right. >> host: what are they collecting about us? what do they know about us? >> guest: under the open government data act, all government data -- whether it's weather data or corporate data or data about individuals -- has to be tracked, though if it's personally identifiable data they are likely not going to release it. different agencies do different things. some collect everything from health information on veterans, to educational data about individuals applying for grants, to information about commercial transactions that are personally identifiable. >> host: do you believe -- maybe this is a remote question that doesn't matter -- but should those agencies be allowed to share data among themselves, such as the tsa sharing with social security, et cetera, or should it be siloed? >> guest: i think there is certain data we want to protect and keep confidential. for example, one reason i think people generally trust the irs, even though they might not like it, is that they know the irs will not take their data and turn it over to the department of justice just to start a fishing expedition. some of those protections are incredibly important. that said, we do have a lot of problems with stovepipes in government. for example, there are a half dozen or more agencies in the united states trying to figure out how the economy is working, to answer basic questions. those agencies are not able to share data, so they end up coming up with different answers, are not able to combine the data for better analysis, and they face
significant challenges. that's a problem because it's a waste of government resources and taxpayer dollars, and it creates less optimal outcomes. one other big challenge in this space is that government agencies are starting to think about how they can get data from the private sector, because sometimes the private sector has much better data, and how they can use that data in helpful ways while still treating it confidentially, or treating it confidentially but sharing it across some agencies for various purposes. >> host: what do you think the issues are that people would be concerned about with the government getting data that is currently held by a private entity? >> guest: i think a lot of people have rightful concerns about government intrusions into their personal lives. we have very strong safeguards in the privacy act that limit what government can do in that space. that said, as we enter this new era of much more private sector data collection, there's the question of whether we can do more. let me give you a concrete example. take a company like adp, which does payroll processing: every time a company runs payroll, they can see what the state of the economy is and what changed from the weeks before. they can see if there are fewer workers out there; they can see these types of changes in real time. that is information that can be useful for policymakers as they respond to a potential downturn in the economy, or when they think about what monetary policy should be. it's a very legitimate question to ask how we can continue to have the long-established protections for how we want to treat citizens while recognizing that the government does not always have the best data, and maybe sometimes we need the private sector's. >> host: on a different note, and perhaps a darker note, should
equifax -- a company like equifax -- be allowed to share their data with the federal government? some people would be very uncomfortable with that. >> guest: equifax is, i think, an example of a company that has had a lot of challenges, and americans are probably upset with them -- many americans did not even know the company a year or two ago. then they get this announcement that not only has there been a massive data breach, but they have been exposed by a company they had never heard of. it's a problem for a lot of reasons. a lot of what we rely on right now to get companies to have good data practices is market behavior -- the market rewarding companies that behave responsibly. if i'm unhappy after a target data breach, i can stop shopping at target; but if i'm unhappy that equifax was breached, there's not a lot i can do about that. that is the significant problem. there are certain companies collecting data about individuals where consumers don't really have a significant amount of control, because they don't have a direct commercial relationship with them. i think there is a legitimate question to ask about what government oversight is appropriate and when that data should be available to the government. >> host: what does a company like equifax currently know about us? >> guest: they're trying to collect data on people's credit history: personal information like where you live, social security information, credit card history, and any loans you have taken out or mortgages you have had. they compile that information in a large database and make it available to other companies that are looking to assess your credit. >> host: in other words, they are selling our information. >> guest: they are monetizing it. the reason i
would be hesitant to use the word selling is that when most people think of selling, you talk about selling your car: at the end of the transaction, if i sell you my car, i don't have my car anymore and you have it. when these companies monetize the data, they're not necessarily turning the data over to someone else; they're just giving you an answer -- saying this person has good credit, or this person is high risk or low risk -- without necessarily sharing all of the underlying information. >> host: is it a good system? >> guest: parts do work really well, and the parts that work well are how we get credit: it's very easy to open a new line of credit to buy a car at a dealer because this information is available. we also have pretty good protections in place with the fair credit reporting act, so we can get corrections for wrong information. but i think the problems we have in this space are a few, and one is that each state makes its own laws around some of these requirements, on things like [inaudible]. so there are mechanisms in place to make this world safer -- you can freeze your credit or lock it -- but in some states that's hard to do. i think that's the problem: basically you have to pay these companies to secure your own information. i think that system is fundamentally wrong and should be changed, either state by state or with a federal law that would fix it. >> host: daniel castro, we americans tend to be trusting people, in a sense, until we are not. then a breach like the equifax breach happens, or the recent capital one breach, and we get a little antsy about our personal information being out there, don't we? >> guest: i think we do. >> host: is there a solution? is it a fine or new
legislation -- where do we go? >> guest: i think what we have now is not working. people are getting increasingly fed up with announcements of here's another data breach, and sometimes there is no penalty at all. as we saw with the equifax breach, there was an announcement that you could get ten years of free credit monitoring or $125, but as it turns out, if you asked for that $125 and everyone else asked for it too, there's only a small pot of money, and you might end up with $5 or something less. i don't think the system is working today, and i think there are ways to change it. one way is by looking at what attackers are going after. the reason there are all these data breaches is that hackers are going after certain types of valuable information, like social security numbers. that information is only valuable because you can use it to commit fraud, and so the question we could be asking is: can we make that data less valuable? for example, we could make it illegal to use social security numbers for identification or verification purposes outside of social security. this is something social security numbers were never intended to do; for a long time the card even said it was not for identification purposes. that could be a requirement: no bank can ever open an account using just a social security number -- you have to prove your identity through other means. if we do that, to be clear, the reason for breaking into these systems would go away: you don't have attacks on data if the data is not valuable. we can also fix what happens after a data breach. right now you get this offer of free credit monitoring -- i've probably had five or six offers of free credit monitoring, but i don't really need free credit monitoring. in fact, there are services now that offer free credit monitoring, and capital one
offered free credit monitoring services before the hack, so when they offer it after the attack they're not doing anything different. veterans, because of a new change in policy, will get free credit monitoring as of october. no one needs more free credit monitoring, but we do need other things. one recommendation i have made is that after a data breach, instead of the company offering free credit monitoring, consumers are offered a whole menu of options so they can fix something from there. they might get a free year of a password management system so they have better password management, or a secure token so that when they want to log into an account they have better security. they might be able to get a secure electronic id. that would create a whole new market for security services, which right now does not exist, because people really don't want to spend a lot of money in that space, and there's not going to be a market unless people are willing to pay. if we make it so that whenever there's a data breach we take one big step forward in securing americans' online identities, we would get closer and closer to something more secure each time, instead of the situation we are in now, where there's a new data breach, people roll their eyes, and they wait six months for the next one. >> host: mr. castro, we recently talked to [inaudible] of cnbc on this program; she has a new book out called "kingdom of lies." it is about hacking, and the way she writes it, it doesn't sound like sitting behind our passwords on our personal computers is a very good defense. >> guest: well, it is absolutely true. one of the things shocking to a lot of people is that the security for logging into their bank account is often less
secure than what they were using for their e-mail. a lot of people use two-factor authentication for their e-mail, so they get a notice on their phone and have to approve it before they log in; but when they log into the bank, it's typing in "password123" and they are in. that's a huge problem, and that's where i think we can make progress: by making it so consumers have more of these options, and by setting requirements in these regulated industries. for example, banks -- if you're a financial institution, you need to be moving much faster toward better security. >> host: when you saw and read about what happened with capital one, were you surprised at the scenario? >> guest: we're still getting the details of the actual attack, but it seems like it was at least partly a configuration error on their end. they made a mistake. it was a mistake that could have been caught, but mistakes happen. that's not an excuse, but at the end of the day these types of things do happen. they should have had better oversight, but what is interesting to me about capital one is that they were doing things right. for example, they had a bug bounty program, which is one of the best things companies can do: they actively say, we will pay anyone who finds a problem with our system -- if you find a flaw in our system, let us know -- because we want to encourage people to find these problems and bring them to us. that helped them track down this particular problem. they were doing other things right, too. they did not have antiquated systems; they had moved forward and done a lot of things right, and they still made a big mistake. that is why there will be a lot of analysis, i think, going into this particular incident to see what went wrong. with other companies you think they just never invested in security and that's why they were breached, but capital one, i think,
probably made a reasonable investment but just made mistakes. that's something consumers will have to recognize: data breaches are going to continue to happen, but we can do something about the data to make it less valuable than it is. >> host: how did you get into this line of scholarship? >> guest: my background is in information security. i've been interested in these issues for a while, but i recognized that you need policymakers to understand these issues very well too; otherwise you don't always end up with good outcomes for consumers. >> host: are the threats, the sophistication of the attacks, and our protection systems growing exponentially? >> guest: i don't know if i would say exponentially, but they are growing. the sophistication of these attacks shows that the attackers have significant resources and the attacks are complex. a lot of them really involve a significant amount of dedication to find the problem and exploit it. but the problem is that once you find that way into a system, it's really easy to get all that data and make a lot of money off of it. there are black markets where you can sell people's identities and credit cards, and that is also part of the problem. we need really good enforcement of these types of crimes, to make it so that if you commit these crimes you will go to jail. we have a lot of foreign attackers getting away with these things very easily, and that is a problem, too. the capital one attacker happened to be someone here in the united states, but that is not always how it plays out. >> host: that said, in a digital world, borders are muddier. >> guest: they are. that's a reason these are
international issues, global issues. we need to move away from the idea that we can secure just the united states or just u.s. consumers. if we want to address information security and these data issues, it's a global problem, and we need to think about global solutions as well. it is not enough to think we will have some relative security where we are able to take down our adversaries; we need to think about raising all boats in this scenario. >> host: in a recent article on your website -- the information technology and innovation foundation's -- you co-authored a piece, "the costs of an unnecessarily stringent federal data privacy law," and this is a key takeaway i want you to expound on if you would: federal legislation mirroring the general data protection regulation or the california consumer privacy act could cost the u.s. economy approximately $122 billion a year, or $483 for every u.s. adult. >> guest: yeah, right now we're in the midst of this huge conversation about whether we will have new federal data privacy regulation. it's been brought about by the fact that europe passed their law, and people are saying we should copy them, and california passed a law that might set the rules for the rest of the united states. the question is: will we look to europe and copy them, or let california set the rules of the road? the challenge in this space is that data privacy can be very costly to do. that doesn't mean we shouldn't do it, but we should be strategic about how we do it. the point of the report we put out is to start teasing apart the different components of legislation, talk about where the value is in each of them, and look at how we can construct
something that provides significant protections to consumers but keeps the price down. the problem with europe is that they moved forward with a data protection regulation and -- they don't have the same silicon valley the united states has -- they weren't interested in keeping costs down for companies or keeping costs down for consumers. they just wanted the best privacy that money can buy, as if money were no object. in the u.s. we need to think about how we can get privacy regulation at a good value, not at any cost. when you pursue privacy at any cost, you end up lowering consumer welfare. what we want to see is consumers coming out ahead because they have better privacy but still have access to innovative products and services -- they are not cut off from all those things. >> host: when you say lower consumer welfare, what do you mean? >> guest: if you look at some of these proposals, they would fundamentally change the way the internet ecosystem works today, which is that we see a lot of ads in exchange for free services. if you change that, then one outcome is that you will see even more ads, because each ad will be worth less since ads won't be targeted; or you will have to pay for more services -- you might start paying a nominal fee for your e-mail, or paying for some video streaming services or apps that are free right now. if you ask consumers survey questions about privacy -- in general, would you like more privacy? -- we all say yes across the board. everyone wants privacy, as they should, given this environment. but then you start asking how much they are willing to pay for privacy, and you see the answers vary a lot. some people have a lot of money and care about privacy and would spend $100 a month, and some do buy such services today.
the vast majority of consumers want to see a trade-off in this space. they want more privacy, but they don't want to pay that much for it. maybe they're willing to pay a little more, or willing to put up with a couple more ads, but they don't want to see a significant shift in what is being done. they want to see significant action on data breaches, but when you talk about the types of ads they see online, most people are pretty okay with that. >> host: mr. castro, algorithms come into play. >> guest: they do. there are a lot of discussions about the transparency of algorithms and what oversight exists in this space. this is an emerging area, though some of it is old. we had a debate about algorithms about 20 years ago, when there were questions about the old reservation systems used for flights and how you decide which flight comes up first, because the one that comes up first is the one all the travel agents will book. there were a lot of discussions about that; we got into those debates, and now it's happening again in a different context. >> host: how is it coming up? >> guest: well, one way relates to things like facial recognition: questions about how accurate these algorithms are and how much insight we have into how different types of algorithms -- especially artificial intelligence -- work. how do we know if they are working correctly, how can we explain the decisions they make, and how much transparency is there for consumers? what can we do when algorithms are very accurate but we don't know why? how do we manage that trade-off? that ends up being context specific. in some cases accuracy is
paramount: if i'm going to the doctor and an algorithm is helping diagnose me -- oftentimes doctors don't know exactly why they're making their decisions, and algorithms often don't either -- i personally would rather have the algorithm that is 99% accurate but can't necessarily explain itself than the one that is 80% accurate and can give me a reason. in other contexts, explanation is necessary. for example, when we talk about lending, in certain contexts we want to make sure that factors like race or religion have not been used to discriminate against people, so we want more insight and transparency into how those systems work. >> host: there's been a little bit of controversy about race and facial recognition. what is your view? >> guest: what is interesting about that conversation is that it's been conflating different technologies for the most part: facial recognition and facial analysis. facial recognition is a system that can take a picture of your face, compare it to your id, and say whether it's the same person; or take a picture of someone's face, scan it through a database, and find a match or say there's no match. facial analysis is taking a picture and saying this person has a beard, this person is smiling or not smiling, this person is male or female -- those types of distinguishing characteristics. a lot of the research on race has been on facial analysis, not facial recognition. there have been a few reports that have looked at race in facial recognition, and we do see differences between demographics -- light-skinned versus dark-skinned, and differences in accuracy for male versus female -- but it depends on the specific implementation. some algorithms are worse for black females, and some are worse for black males or white males, and you see the performance vary
based on these contextual factors. what people are concerned about is, of course, that it will be used in policing in ways that end up over-policing minority communities. that's a legitimate concern, given the debates we're having about the appropriate role of policing. the problem in this space is that we're seeing a couple of cities move quickly to ban the technology before they even try to pilot it or figure out how they can set up guardrails to use it effectively, because there are many different uses of the technology. when a lot of people think about facial recognition, they think of china-like, real-time, pervasive surveillance, with security cameras watching your every move. in reality, a lot of the uses we are seeing are things like somebody has been brought in and the police are simply trying to identify a person who does not have id on them, or they have a suspect or witness and are trying to match a name -- something that police officers are doing manually, inefficiently, and slowly today. this is a way to use technology to speed that up. i don't think most people have an objection to that, and those uses should be on the table. it's the other, more serious uses, where people have concerns about the privacy implications of pervasive surveillance, where we need to have the debate -- but we can't have the debate if cities ban the technology without us even seeing what's possible. >> host: final question. on your twitter feed you identify yourself as pro-copyright and anti-sesta -- which is a sex trafficking act related to section 230. >> guest: yeah, there was a new law that came out recently that created a carveout in section 230 of the communications decency act, which provides liability protection to third-party intermediaries, so if someone
uploads content onto craigslist, craigslist is not responsible for what's up there; the user who uploaded it is. this law created a carveout: if the content is related to sex trafficking, broadly defined, the platforms are responsible. as a result of this law, you saw basically every internet platform, across the board, ban anyone who might have any type of relationship to sex work -- not necessarily sex trafficking -- because it was so broadly defined. the critique among sex workers has been that this has actually hurt them: instead of being able to, for example, vet customers or arrange transactions online, now they are walking the streets and subject to attacks and worse. in the past these communities did not have the benefit of online platforms, and online communities were enabling and empowering them; this particular law really took a step back. it's because groups have been trying to, i think, make online intermediaries responsible for the content users post, and there are some cases where that's appropriate, but in this case, i think, they got it wrong. >> host: daniel castro, vice president of the information technology and innovation foundation. we appreciate your spending a few minutes with us here on "the communicators." this program and all other "communicators" programs are available as podcasts. >> for 40 years c-span has been providing america unfiltered coverage of congress, the white
house, the supreme court, and public policy events from washington dc and around the country, so you can make up your own mind. created by cable in 1979, c-span is brought to you by your local cable or satellite provider. c-span: your unfiltered view of government. >> starting now, it's book tv on c-span2. >> coming up tonight: >> next, journalist andrew blum on his book "the weather machine," in which he explores the technology used to develop daily weather reports. from politics and prose bookstore in washington dc, this runs one hour. >> good evening, everybody. i'm the co-owner of politics and prose, along with my wife, and on behalf of everyone here, welcome. thank you for coming, and congratulations on braving the weather to get to your weather book. what a great day to have a book about the weather, isn't it? if you have been following the weather -- and who doesn't these days -- you will have noticed at least two things. first, that it is getting
