
The Communicators: Data Privacy (C-SPAN, August 19, 2019, 8:00am-8:33am EDT)

>> Join us again next Saturday at 8:00 a.m. Eastern for the best of nonfiction books. >> guest: ...and how you can craft policies to support that. >> host: Can you give an example of that innovation you are in favor of? >> guest: Ten years ago we were not talking about artificial intelligence and all the benefits brought about with automation. We were thinking about that, and thinking about the policies we
would need to have in place to get there. We try to help policymakers get ahead of the curve. >> host: Who funds ITIF? >> guest: We have a whole set of funders, from corporate donors to foundations and individuals who support the think tank. We work on a lot of different issues, so we get support from those who are interested in everything from IP issues to biotech. >> host: For our purposes on "The Communicators," is it fair to say large Silicon Valley companies are part of your funding operation, the Googles, Apples, and Facebooks of the world? >> guest: Absolutely. These companies were early supporters because they're interested in this idea of how we proceed quickly with innovation. >> host: You are also the director of ITIF's Center for Data Innovation, which is what? >> guest: We have a research center focused on issues around data. For a long time policymakers
realized they had a few different levers in government. They could tax things. They could spend money. They could regulate things. Part of our point was to say you can also think about how you collect and use data in the government. You should have smart data policy to drive the different goals you might have. If you want to see cures for cancer, you need strategic policy around data. >> host: Do you find that agencies are well staffed when it comes to data protection and data officers? >> guest: We are getting there. One of the first issues our center focused on was the OPEN Government Data Act, which we had been working on for five years. It finally passed this year. Part of the OPEN Government Data Act requires all federal agencies to have in place a chief data officer. They had a requirement for doing this by July and reporting out to OMB by August 2.
8:03 am
when i last checked this, there were about four agencies that still had not done but most of them had and that's significant amount of progress. you have agencies paying attention to what data they are releasing but also to what data they collect and how they manage it through the entire lifecycle. >> host: so the purpose of the open the government data act is? >> guest: its to make government gave a payable for use by the public, by corporations, but individuals, for innovation and other purposes. it is also required agencies to be strategic and how they manage it. >> host: the other half of that is the data they collect. >> guest: that's right. >> host: what are they collecting about us, what do federal agencies know about it? >> guest: the open data company act applies to all government data whether it is the weather data, corporate data or individual data. if it's individual data, personally identifiable, they're likely not going to be releasing it but they are going to track
it. Different agencies do different things. Some collect everything from health information on veterans, to educational data about individuals who are applying for grants, to information about commercial transactions that still has personally identifiable information in it. >> host: Do you believe, and maybe this is a moot question, that those agencies should be allowed to share data between themselves, such as the TSA sharing with Social Security, et cetera? Or should it be stovepiped information? >> guest: There's certain data we do want to protect and keep confidential. So, for example, one of the reasons people generally are trusting of the IRS, even if they might not like it, is they know the IRS isn't going to take that data and turn it over to the Department of Justice just to start some fishing expedition. Some of those privacy safeguards are important. That said, we do see a lot of
problems with stovepiping in government. So, for example, there are about half a dozen or more statistical agencies in the United States that are trying to figure out how the economy is working and answer some basic questions about that. Those agencies are not able to share data. They end up with different answers. They are not able to combine the data for better analyses. They face significant challenges. That's a problem because it's wasting government resources and taxpayer dollars, and it's creating less optimal outcomes. Another challenge in this space is that government agencies are starting to think about how they can get data from the private sector. Sometimes the private sector has much better data. How can we use that data in helpful ways but still treat it confidentially, or treat it confidentially but still share it across some agencies for very specific purposes? >> host: What do you think the issues are that people would be
concerned about when the government is getting data that's currently held by a private entity? >> guest: A lot of people have rightful concerns about government intrusion into their personal lives. We've had very strong privacy safeguards in the Privacy Act that protect what government can do in that space. That said, as we enter this new era of much more private sector data collection, there's a question of whether we can do more. Let me give you a concrete example. You have a company like ADP that does a lot of data processing for payroll across America. They're going to know, every time a company submits its payroll, what the state of the economy is. They can see what's changed from the weeks before. They can see if there are fewer workers out there. They can see these types of changes in real time. That's information that can be useful for policymakers as they are trying to respond to a potential downturn in the
economy, or respond when they are thinking about what monetary policy should be. It's a very legitimate question to ask: can we continue to have the long-established protections of how we want to treat citizens, while recognizing that the government doesn't always have the best data and maybe sometimes we need to go to the private sector for that? >> host: On a different note, and perhaps a darker note, should a company like Equifax be allowed to share its data with the federal government? Some people would be very uncomfortable with that. >> guest: Equifax is, I think, an example of a company that has had a lot of challenges, and a lot of Americans are probably upset with them. Probably a lot of Americans didn't know that company a year or two ago, and then they got this announcement that there had been this massive data breach, and they were put at risk by a company they had never heard of. That's a problem. It's a problem for a lot of reasons. One, a lot of what we rely on
for companies to have good data practices is market behavior; companies basically respond to the market. If I'm unhappy when there's a Target data breach, I can stop shopping at Target. If I'm unhappy when there's an Equifax breach, there's not a lot I can do about that. That's a significant problem. There are certain companies collecting data about individuals where consumers don't have a significant amount of control, because they don't have a direct commercial relationship with them. There's a legitimate question to ask about what kind of government oversight is appropriate, and even whether that data should be available to the government or anyone else. >> host: What does a company like Equifax currently know about us? >> guest: They are trying to collect data on people's credit history. They will collect personally identifiable information: where you live, Social Security number, credit card history, any
loans you have taken out, any mortgages you've had, that kind of information. They will compile it in a large database and make it available to other companies that are looking to assess your credit. >> host: In other words, they are selling our information. >> guest: They are monetizing it, which is, I guess, the reason I would be hesitant to use the word selling. When most people think of selling, they talk about selling your car: at the end of the transaction, if I sold my car, I don't have a car anymore and you have that car. When these companies are monetizing the data, as I said, they are not turning that data over to somebody else. They are just giving you an answer about it. They are saying this person has good credit, or this person is a high risk or a low risk. They are not necessarily sharing that underlying banking information with the other entity. >> host: Is it a good system? >> guest: There are parts of it that work really well, right?
The part that works well is we get credit. It's very easy to go and open a new line of credit. It's easy to go buy a car from a dealer, because you can have this information made available. We also have some pretty good protections in place with the Fair Credit Reporting Act: if there's wrong information, we can get corrections made to it. The problems we have in this space, there are a few. One is that each state sets its own laws around some of these requirements, around things like credit freezes. There are mechanisms in place to make this world safer. You can freeze your credit. You can unlock it. In some states that's expensive to do. That's a problem. Basically you have to pay these companies to secure your information. I think that system is fundamentally wrong and should be changed, and that's something we need to change either state by state or with a federal law that would fix it. >> host: We Americans tend to
be trusting people, in a sense, until we're not, and then when a breach like the Equifax breach happens, or the recent Capital One breach, we get a little antsy about our personal information being out there, don't we? >> guest: I think we do. >> host: Is there a solution? Is it a fine? Is it new legislation? I mean, where do we go? >> guest: What we have now is not working, and people are getting increasingly fed up with the announcements of, here's another data breach. Sometimes there is no penalty at all. As we saw with the Equifax breach, there was an announcement that you could get ten years of free credit monitoring or you could get $125. It turned out that if you asked for that $125, like everyone else asked, there was only a small pot of money and you might end up with five dollars or something less. I don't think the system is working today. There are ways to change it.
One way we can change it is by looking at what people are going after. The reason there are all these data breaches is that attackers are going after certain types of information, the valuable information, things like Social Security numbers. That is only valuable because you can use it to commit fraud. The question we can ask is, how do we make that data less valuable? One thing we could do is make it illegal to use Social Security numbers for identification and verification purposes outside of Social Security. This is something Social Security numbers were never intended to do. For a long time it said right on the card, this is not for identification purposes. They stopped printing that, but that's something that could be done. It could be a requirement that nobody could ever open an account using just a Social Security number; you would have to prove your identity through other means. And if we did that, just to be clear, the reason for breaking into all these systems and stealing this information
would go away. You don't have attacks on data if the data isn't valuable anymore. Something else we could do is fix what happens after a data breach. Right now you get this offer of free credit monitoring. I've had probably five or six offers of free credit monitoring. I don't need more free credit monitoring. In fact, there are services now that offer free credit monitoring; Capital One, in fact, offered a free credit monitoring service before the hack. When they say they're giving you free credit monitoring after the fact, they are not doing anything different. And veterans, because of a recent change of policy, as of October will have free credit monitoring. No one needs, actually no one needs, more free credit monitoring. What we need are other things. One recommendation I have is that after a data breach, instead of offering free credit monitoring, consumers should be offered a whole menu of options they can pick from. For example, they might get a free year of a password management service so they can
have better password management. They might get a secure token so that when they want to log into an account they have better security, multifactor authentication. They might be able to get a secure electronic ID. We can create a whole new market for security services that right now doesn't exist, because people rightly don't want to spend money in this space and there's not a market in which people are willing to do that. If we make it so that whenever there's a data breach we take one big step forward in securing Americans' online identities, we would get closer to something more secure each time, instead of this situation we're in now where we have a new data breach, people roll their eyes, and we wait six months for the next one. >> host: Mr. Castro, we talked to Kate Fazzini of CNBC on this program. She has a new book out called Kingdom of Lies. It's about hacking, and the way she writes, it doesn't sound
like our setting some little password on our personal computers is really a very good defense. >> guest: It's true, absolutely, and one of the things that is, I think, shocking to a lot of people is that their security for logging into their bank account is often weaker than what they're using for their email. I know a lot of people use two-factor authentication for their email: they get a notice on their phone and have to prove it's them before they're logged in. When they log into the bank, it's just typing in "password123" and they are in. That's a huge problem. That's where I think we can start making progress, by making it so that consumers have more of these options, and by setting requirements in some of these regulated industries. Banks and financial institutions, for example, need to be moving much faster toward better security.
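That two-factor flow, a short-lived code on a phone proving it's you, is commonly implemented as a time-based one-time password (TOTP). As a rough sketch only, using the published RFC 6238 test secret rather than anything from a real bank's system, the core of the scheme fits in a few lines of Python:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 time-based one-time password using HMAC-SHA1."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)  # 30-second window
    msg = struct.pack(">Q", counter)                          # counter as big-endian uint64
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at t=59 seconds
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # -> 94287082
```

The authenticator app and the server share the secret and independently compute the same code for the current time window, so a password stolen in a breach is not enough on its own to log in.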
>> host: When you see and read about what happened at Capital One, were you surprised at the scenario? >> guest: Well, with the actual attack that happened, we're still getting all the details, but it seemed like it was, to put it bluntly, a configuration error on their end. They made a mistake, one that could have been caught. Mistakes happen, right? That's not an excuse, but at the end of the day these types of things do happen. It shouldn't have, and they should have had better oversight, but it did. What was interesting to me about Capital One is that they were doing a lot of things right. For example, they had a bug bounty program, which is one of the best things companies can do: actively saying, we'll pay anyone who finds a problem with our system. You find a problem, let us know, and there's money in it for you. We want to encourage people to bring that to us. That helped them in tracking down this particular problem and
resolving it. They were doing other things that were right. They did not have outdated systems. They had moved forward. They had done a lot of things right. They made one really big mistake, and that's why there's a lot of analysis, I think, that will have to go into that particular one to see what went wrong. There are other companies that never invested in security, and they didn't get things right. Capital One probably did a reasonable amount of investment. They just made a mistake, and that's something consumers are also going to have to recognize: these kinds of data breaches will continue to happen. The question is what we do about the data so it is less valuable when they do. >> host: What is your background? >> guest: Information security. I've been interested in these types of issues for a while. I recognized that you need policymakers who understand these issues very well, too; otherwise you don't always end up with
good outcomes for consumers. >> host: Are the threats, the sophistication of the attacks, and our protection systems growing exponentially? >> guest: I don't know if I'd say exponentially, but they are definitely growing. The sophistication of these attacks shows that the attackers are using significant resources, and the attacks are very complex. A lot of these really involve a significant amount of dedication to find the problem and to exploit it. But the problem is, once you find that way into a system and get all that data, it's really easy to start making a lot of money off of it. On these kinds of black markets you can sell people's identities and sell credit cards, and that's also part of the problem. We need to have really good cyber law enforcement for these types of crimes, to make it so that if you commit these crimes you
actually go to jail. You have a lot of foreign attackers who are getting away with these things very easily, and that's a problem. The Capital One attacker happened to be someone who was here in the United States, but that's not always how it plays out. >> host: That said, in a digital world, borders are moot, are they not? >> guest: They are, and that's a huge problem and one of the reasons these are international issues, global issues. We need to move away from this idea that we can secure just the United States, or just U.S. consumers, or just U.S. businesses. If you want to address information security and these data issues, it's a global problem and we need to be thinking about global solutions as well. It's not enough to think we're going to have this relative security where the U.S. will be safe and we will be able to take on our adversaries. We need to be thinking about raising the bar in all of those scenarios. >> host: In a recent article on your website, the Information Technology and Innovation
Foundation, you co-authored an article, "The Costs of an Unnecessarily Stringent Federal Data Privacy Law." Here is one of the key takeaways, and I want you to expand on this if you would: federal legislation mirroring key provisions of the European Union's General Data Protection Regulation, or the California Consumer Privacy Act, could cost the U.S. economy approximately $122 billion per year, or $483 per U.S. adult. >> guest: Yes. So right now we're in the midst of this huge conversation about whether we will have new federal data privacy legislation. It's being brought about by the fact that, one, Europe passed a law and people are asking, should we simply copy them? And California passed a law that might set the rules for the entire country. The question is, are we going to look to Europe and copy them, are we going to let California set the rules of the road, or do we do
something different? The challenge in this space is that it can be very costly to do data privacy. That doesn't mean we shouldn't do it; maybe we should just be strategic about how we do it. The point of the report we put out was to start teasing apart the different components of what we could do in legislation, talk about where the value-adds are in the different ones, and show how we could construct something that provides significant protections to consumers but keeps the price down. The problem with Europe is that they moved forward with the data protection regulation and, first of all, they don't have the same Silicon Valley that the United States has. They were not interested in keeping costs down on companies and keeping costs down on consumers. They just wanted maybe the best privacy money could buy, where money was no object. I think in the U.S. we need to be thinking about how we can get privacy regulation at a good value, not at any cost. When you pursue privacy at
any cost, you end up lowering consumer welfare. What we want to see is consumers coming out ahead, because they have better privacy but still have access to innovative products and services and are not cut off from all the things they like using today. >> host: What did you mean when you said lower consumer welfare? >> guest: If you look at some of these proposals, they would fundamentally change the way the internet ecosystem works today, which is that we see a lot of ads in exchange for free services. If you change that, then one outcome is you'll see even more ads, because those ads will be worth less since they won't be targeted. Or you have to start paying for more services. You might pay a nominal fee for your email service. You might start paying for video services that are free today. You might pay more for the apps you download. If you start asking consumers in surveys questions about privacy,
if you ask them in general, would you like more privacy, we all say yes. Across the board everyone wants more privacy, as we should. But then you ask, how much are you willing to pay for privacy? You see the answers vary a lot. Some people have a lot of money and really care about privacy; they would spend $100 a month, and some do. Some buy services today. But the vast majority of consumers want to see some kind of trade-off in this space. They want more privacy, but they don't want to pay that much for it. Maybe they're willing to pay a little bit more. Maybe they are willing to see a couple more ads. But they don't want to see a significant shift from what's being done today. They want to see a shift on issues like data breaches, but when you talk about the types of ads online, most people are pretty okay with that. >> host: Algorithms come into play at this point, don't they? >> guest: Definitely.
There are a lot of discussions about algorithms, the transparency of algorithms, and what kind of oversight exists in this space. This is an emerging area, though some of it is old. We had debates about algorithms about 20 years ago, when there were questions about the old reservation systems used for flights: questions about how you decide which flight comes up first, because the one that comes up first is the one the travel agents are going to book. There was a lot of discussion about those issues. So we got into some of those debates then, and now it's coming up again in different contexts. >> host: How is it coming up? >> guest: One way relates to things like facial recognition, and questions about how accurate these algorithms are and how much insight we have into different types of algorithms, especially those that use artificial intelligence. How do we know if they are
working correctly? How can we explain the decisions that are made by them? How much transparency is there for consumers? And what do we do when algorithms are very accurate but we don't know why? How do we manage that kind of trade-off? That's something that ends up being context specific. In some cases accuracy is paramount. If I'm going to the doctor and I'm having an algorithm help diagnose me, oftentimes doctors don't know why they are making decisions, and algorithms often don't either. I personally would rather have an algorithm that is 99% accurate and can't explain why than one that is 80% accurate and can give me the reason. In other contexts, an explanation is necessary. So, for example, when we're talking about lending in certain contexts, we want to make sure factors like race or religion have not been used to
discriminate against people. We need more insight and transparency there. >> host: There's been a lot of controversy about race in facial recognition. What is your view? >> guest: What's interesting about that conversation is it's been conflating two different technologies for the most part. There's facial recognition and there's facial analysis. Facial recognition is a system that can take a picture of your face and compare it to your ID and say these are the same person, or take a picture of someone's face, scan through a database, and find a match or say there's no match. Facial analysis is taking a picture and saying this person has a beard, this person is smiling or not smiling, this person is male or female: those types of distinguishing characteristics. A lot of the research on race has actually been on facial analysis, not facial recognition. There have been a few reports that looked at race in facial recognition. What we've seen is that there are differences between demographics.
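The recognition half of that distinction can be made concrete. In a typical one-to-one verification pipeline, a model (assumed here, not shown) converts each face image into a numeric embedding, and two embeddings are compared with cosine similarity against a threshold. A minimal sketch, where the embeddings and the 0.8 threshold are made-up toy values purely for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(emb_a, emb_b, threshold=0.8):
    # Facial recognition as 1:1 verification: are two face embeddings
    # close enough to be the same person? The embedding model itself
    # (image -> vector) is assumed and not shown here.
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy embeddings standing in for real model output:
id_photo = [0.9, 0.1, 0.3]
live_photo = [0.85, 0.15, 0.35]
stranger = [0.1, 0.9, -0.4]
print(same_person(id_photo, live_photo))  # True
print(same_person(id_photo, stranger))    # False
```

Facial analysis, by contrast, is a classifier over a single image (has a beard, is smiling, and so on), which is why accuracy findings for one task do not automatically carry over to the other.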
If you're light-skinned versus dark-skinned, there's a difference in accuracy, and if you're male versus female there can be differences in accuracy, but it depends on the implementation. Some algorithms are worse for black females. Some are worse for black males or white males. You see the performance vary based on these contextual factors. People are worried, though, about it being used by police and whether it will be used to over-police minority communities. There are very legitimate concerns and debates about the appropriate role of policing. The problem in this space is that a couple of cities moved quickly to ban the technology before they even tried to pilot it or looked at how they could set up guardrails to use it effectively. There are many different uses of the technology. People who think about facial recognition and policing imagine this kind of China-like
real-time, pervasive surveillance, security cameras watching your every move, when in reality a lot of uses are something like: someone has come in, and police are trying to identify this person who doesn't have an ID on them, or they have a suspect or witness photo and they're trying to match it. This is something police officers are doing manually, and they are doing it slowly and not always effectively. This is a way to use technology to speed that up. I don't think most people would have an objection to those instances. Those should be on the table. Then there are these other, more serious uses, where people have concerns about the privacy implications of pervasive surveillance. That's where we need to have the debate, but we can't have those debates if we have cities banning the technology without us even seeing what's possible. >> host: Final question. On your Twitter feed you identify yourself as
pro-copyright and anti-FOSTA, which is the sex trafficking act that amended Section 230, right? >> guest: That's right. So there's a new law that came out recently that created a carveout to Section 230 of the Communications Decency Act, which basically provides liability protection to third-party platforms and intermediaries. If someone uploads content onto Craigslist, Craigslist isn't responsible for what was uploaded; the user who uploaded it is responsible. This law created a carveout that said if content is related to sex trafficking, which it defined quite broadly, those platforms can be held responsible. As a result of this law you saw, across the board, basically every internet platform ban anyone who might have any type of relationship to sex work, not necessarily sex trafficking, because it was so broadly
defined. The critique among sex workers has been that this significantly hurt them: instead of being able to vet customers and arrange transactions online, they are now walking the streets and subject to attacks and worse. In the past we saw the benefit of online platforms and online communities enabling people, empowering individuals and new communities. This particular law, I think, really took a step back. It came about because a lot of groups have been trying to, I think, make online intermediaries more responsible for the content that users post, and there are some cases where that's very appropriate. I think in this case they got it wrong. >> host: Daniel Castro is vice president of the Information Technology and Innovation Foundation. We appreciate you spending a few minutes with us here on "The
Communicators." This edition of "The Communicators" and all others are available as podcasts. >> For 40 years, C-SPAN has been providing America unfiltered coverage of Congress, the White House, the Supreme Court, and public policy events from Washington, D.C. and around the country, so you can make up your own mind. Created by cable in 1979, C-SPAN is brought to you by your local cable or satellite provider. C-SPAN: your unfiltered view of government.
>> Next, former congressman Beto O'Rourke talks about the recent mass shooting in his hometown of El Paso, Texas. He also announces he will continue his run for the Democratic presidential nomination in 2020 after temporarily suspending his campaign in the days following the El Paso shooting. This is just over 35 minutes. [inaudible conversations]

