Large technology and social media companies are at a crossroads, finding themselves under scrutiny for tracking consumers and mining user data. Industry giants, governments, global trends and the COVID-19 pandemic are reshaping consumer views on the value of personal data privacy.
Debbie Reynolds is a technologist and advisor on global data privacy, cyber data breach response and complex cross-functional data-driven projects. With her knowledge of global data privacy, data protection and technology issues, Reynolds has served on over 17 advisory boards and received over 15 honors and awards in higher education and the data privacy industry.
We spoke with Reynolds about the importance of data privacy and individual consumer rights. In this conversation, Reynolds shares her views on the role digital identity plays in consumer privacy.
With recent California legislation on consumer data protection, as well as public efforts from technology giants to ensure transparency in data collection, Reynolds offered her insight on how the rest of the tech industry will be reshaped in the near future and how consumer opinion has shifted.
Where is the data privacy space going?
Reynolds: In terms of corporations, I think people are looking very closely at what Apple is doing with iOS 14. In part, they’re trying to give individuals more control over their data, or more information about the apps that they use. For example, they’re releasing features that tell you if an app you’re using is recording what you’re doing, sending data to other apps, or looking at other apps that are on your phone, so that you can decide whether or not you want to use the app and whether you want to do that sharing. Before, you would download the app with little transparency into what it was doing.
Most corporations have tried to do every other thing besides get consent from individuals. So the fact that Apple is leading with this will make more people want to use Apple products and services. Apple is going to make money from this. So, once other companies see that they can actually profit from it, they’re going to try to emulate Apple, whether it be by cleaning up their act on data privacy or being more transparent.
I call it a trust war. I think individuals, as they get more educated about their data and what’s happening with it, are going to start sharing less or being more cautious about how they share their data. So, there will be a couple of trusted companies that we want to share our data with, or may share more with than we did before, because we trust them more. Other, smaller players that can’t earn our confidence won’t benefit as they have in the past, because we won’t want to share with them.
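As an editorial aside, the iOS 14 controls Reynolds describes are exposed to developers through Apple’s App Tracking Transparency framework, which requires an explicit opt-in prompt before an app may track users across other companies’ apps and websites. The Swift sketch below is a minimal illustration of that prompt; the function name requestTrackingConsent is ours, and a real app must also declare NSUserTrackingUsageDescription in its Info.plist.

```swift
import AppTrackingTransparency

// Illustrative only: ask the user for permission before any cross-app tracking.
// A real app must also declare NSUserTrackingUsageDescription in its Info.plist.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user explicitly opted in; the advertising identifier is available.
            print("Tracking allowed by the user")
        case .denied, .restricted, .notDetermined:
            // No opt-in: the app may not track the user across other companies'
            // apps and websites.
            print("Tracking not allowed")
        @unknown default:
            print("Unhandled authorization status")
        }
    }
}
```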
Reynolds: Consumers are getting more educated. Something enacted this year that I think makes people think about privacy more is the California Consumer Privacy Act (CCPA). When it was enacted, it was eye-opening to people in other states. From a business perspective, with the United States not having any type of federal data privacy regulation, companies are now trying to align with it. I’m seeing companies that aren’t in California try to emulate the CCPA in their applications. For example, the CCPA requires sites to have a button that says, “Do not sell my data.” Companies in other states are not legally obligated to do that, but they’re following along with California on the technology side. California has become the de facto standard, because there is no other standard.
Do you have privacy concerns that have arisen with COVID-19?
Reynolds: With COVID, there’s much more medical information that people need to be concerned about. Before, medical information was separate from employment information and anything else you did in your life. But now, especially because of practices like contact tracing, a lot of personal health information that would typically have been held or protected in a medical setting is now put in a consumer setting, and it’s not protected the same way.
Reynolds: In terms of apps, I’m not a fan of contact tracing. There are professionals who do contact tracing through a whole process and program that must abide by HIPAA rules for medical data in the US, so through those traditional means, consumer data is protected by law. But let’s say you download an app and you’re giving information about your health. That’s not protected by these medical laws, because you’re not interfacing with a medical professional. So that information becomes consumer data, which has less protection. It can be sold or used in commercial ways that would not be allowed with medical information in a patient-provider setting, which has huge implications for things like insurance.
Privacy concerns regarding sensitive personal health information have been voiced in the cybersecurity and digital identity industries. Equally important is understanding that the implementation of pandemic-control technologies can put consumers at an unknown disadvantage post-pandemic.
What are the implications of a post-COVID world in terms of how consumer data is protected or shared?
Reynolds: Right now, Apple and Google have introduced contact tracing capabilities on their devices through proximity-based information sharing, but they are reserving that capability for sanctioned contact tracing apps that certain governments use. However, there is no guarantee that the capability won’t be used for other purposes in the future. That means that later, they’ll be able to collect more information on you, and also on the people in your proximity.
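As an editorial aside, the proximity sharing Reynolds mentions works by having devices broadcast short-lived, rotating identifiers over Bluetooth rather than any stable ID. The Swift sketch below is a simplified, hypothetical illustration of that rotation, assuming a per-device daily key; it is not the actual Apple/Google Exposure Notification implementation, which defines its own key schedule and cryptography.

```swift
import CryptoKit
import Foundation

// Simplified sketch: derive a short-lived proximity identifier from a device's
// daily key and the current 10-minute window. Nearby devices only ever see the
// rotating identifier, which cannot be linked back to the daily key without it.
// (Illustrative only; the real Exposure Notification spec differs in detail.)
func rollingProximityIdentifier(dailyKey: SymmetricKey, at date: Date = Date()) -> Data {
    let window = UInt32(date.timeIntervalSince1970 / 600)   // 10-minute intervals
    var message = Data("proximity-id-sketch".utf8)          // hypothetical label
    withUnsafeBytes(of: window.littleEndian) { message.append(contentsOf: $0) }
    let mac = HMAC<SHA256>.authenticationCode(for: message, using: dailyKey)
    return Data(mac.prefix(16))                             // truncate to 16 bytes
}
```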
Will this continue to happen after COVID-19?
Reynolds: Oh, absolutely. I mean, just think of your bank. You have an app for your bank, and let’s say they have the capability to use this proximity data. This means that in addition to knowing what you buy because you use their card, they know who you associate with. So they have the capability to collect a lot more information about you. All that these banks and insurers want to do is reduce their risk. So, if they feel like there’s something about you or your information that makes you more risky, they can change how they treat you as a consumer—maybe you have a higher interest rate, or maybe you don’t get a loan, or maybe you don’t get insurance. There’s a lot that can happen here, especially if you don’t know that the collection is happening. Transparency is an issue.
On Consumer Education
Reynolds: I’ve been in data privacy for over 20 years and I’ve never seen privacy be so mainstream. Even my friends and family call me up now and ask me about it. That never happened. So I feel like people are really thinking more about it. Like, thinking about Facebook and disinformation campaigns, or how algorithms work and how they push data to you. There’s more awareness, and people are starting to ask more questions.
Contact tracing is not the only technology whose use has increased during the COVID-19 pandemic. The use of biometrics for identification and verification has also raised concerns in the data privacy and digital identity industries.
Reynolds: Let’s say someone hacks your account and you change your password. You can’t change your face, your fingerprint or your iris scan. So, anyone who’s able to get that type of information about an individual can impersonate them in the world and do almost anything, because things like these are held in high regard in terms of authenticity. They’re only as authentic as your ability to determine that they belong to a real human, and there are a lot of false identities out there.
You can create 50 Google email accounts, but that doesn’t mean that you’re 50 different people—you’re one person. But people use these mechanisms to create identities, and those identities get registered in systems as though they are real people when they’re not. So being able to really tell who’s real and who’s not is going to be very important. If someone with a fake identity gets a credit card, others may take that as pretty good evidence that it’s a real person, because the identity is interacting with a financial institution, even though that may not be true.
Is there a happy medium for biometrics and facial recognition without violating people’s privacy?
Reynolds: It’s a tough problem, but I think there are a lot of companies trying to solve it. The key, really, is that at some point, you have to be able to authenticate that you are who you say you are, and it must be done in a way that can’t be replicated or stolen. We’re doing smaller things now, like two-factor authentication, but we’ve seen hackers who have been able to get around that. There has to be a way to tell a fake identity from a real one anywhere. If you can make sure someone is who they say they are and their credentials or biometrics can’t be stolen, reused or repurposed, that will stop a lot of identity theft in its tracks.
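As an editorial note on the two-factor authentication Reynolds mentions: most authenticator apps use time-based one-time passwords (TOTP, RFC 6238), in which the service and the user’s device both derive a short code from a shared secret and the current 30-second window. The Swift sketch below is a minimal illustration of that scheme, not a production implementation.

```swift
import CryptoKit
import Foundation

// Minimal TOTP sketch (RFC 6238): derive a 6-digit code from a shared secret
// and the current 30-second time step, so a stolen password alone is not
// enough to sign in. Illustrative only; real systems add drift windows,
// rate limiting and secure secret storage.
func totpCode(secret: Data, at date: Date = Date(), digits: Int = 6) -> String {
    let counter = UInt64(date.timeIntervalSince1970 / 30).bigEndian
    let counterData = withUnsafeBytes(of: counter) { Data($0) }
    let mac = Array(HMAC<Insecure.SHA1>.authenticationCode(for: counterData,
                                                           using: SymmetricKey(data: secret)))
    let offset = Int(mac[mac.count - 1] & 0x0f)              // dynamic truncation
    let truncated = (UInt32(mac[offset] & 0x7f) << 24)
                  | (UInt32(mac[offset + 1]) << 16)
                  | (UInt32(mac[offset + 2]) << 8)
                  |  UInt32(mac[offset + 3])
    let code = truncated % UInt32(pow(10.0, Double(digits)))
    return String(format: "%0\(digits)d", code)
}
```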
Are there better ways to validate an identity than biometrics that you’ve seen in the industry recently, or that you think will come in the future?
Reynolds: That’s a good question. I’m not sure I’ve seen anything better than biometrics. I am seeing wacky stuff. I’m an expert evaluator on things like virtual reality and augmented reality, and what they’re finding by observing people using or wearing a physical apparatus of some sort is that it creates a unique pattern of movement that could almost be considered a biometric identifier. There are companies that are looking at gait for identification, for example. Those are things I’ve seen cropping up in the industry, but they all rely on some type of biometric or physical attribute.
As countries have different regulations and norms, understanding the way privacy is viewed and valued across the globe is also a point of interest, especially considering the effects of the current global digital transformation.
What is happening in data privacy across the globe?
Reynolds: There’s a battle over privacy as a human right. The example I gave to someone is—say, you and your grandma take a picture and you, as a Facebook user, put it on Facebook. Facebook tags you and your grandma. She’s not on Facebook, but she’s associated with you, and since Facebook has a photograph of her, they know that you two are related. In the US, if she decided that she didn’t want her picture on Facebook, she would have no standing to have it removed unless she was a user of Facebook. You could request that, but she could not.
The EU is different: there she would have some standing, because her biometric data or her image is on Facebook, and she could petition them to take it down. In order for her to take some type of action against it in the US, she would have to become a Facebook user, which means she would have to give them more data just so she could tell them to take her data out of Facebook.
What implications has the GDPR had for differences in privacy across countries?
Reynolds: The GDPR is the most comprehensive privacy framework in the world. So as it has rolled out, countries have been looking at certain parts of the GDPR and have adopted a lot of its vernacular in their privacy laws, including controller, processor and data subject. Obviously, with the CCPA in the US, our states are beginning to adopt part of that language as well. I think the GDPR is very influential in that way. We can’t really even talk about privacy laws without talking about the GDPR. It’s setting a larger global standard for privacy regulations.