How are privacy laws evolving and impacting the workforce?
Dylan Gilbert, Privacy Policy Advisor at NIST, shares his insights on recent developments in privacy laws and outlines how both domestic and global companies can adopt these frameworks.
You’ve spoken at events on the topic of ‘Caterpillars to Butterflies: Transforming the Privacy Workforce’. Can you go into detail about what you mean by that analogy?
The privacy workforce is a topic that our stakeholders articulated as a key challenge going all the way back to the development of the NIST Privacy Framework. It's something that has been top of mind for us as a privacy engineering program, as we try to figure out ways we can address those challenges.
When we thought about what that might look like in terms of where things are now within the privacy workforce and where we want to be, the immediate thought that sprang to mind was the caterpillar-to-butterfly analogy. In many ways we’re still in the early stages of privacy in general, especially in terms of it being an actual profession and discipline. It’s something that is still in its nascent stages and not fully mature, at least when we compare it to cybersecurity, for example.
We thought about why the privacy workforce might look like a caterpillar. So there are things like the privacy team being out of sync with the cybersecurity team. Or there may be a focus solely on compliance. There may be insufficient knowledge and skills within a privacy team, or it could be under-resourced.
I spent my first year at NIST working on resources for small and medium businesses, and when I was talking to stakeholders, there were large companies that still only had a couple of folks working on privacy. So privacy is perennially something that organisations find to be under-resourced. We wanted to say, “Okay, what kind of resources can we build out at NIST to help you blossom into the beautiful butterfly, where you’ve got the cybersecurity team in sync with the privacy team, where you’re building trust with your customers, your users and your business partners?”
By doing privacy risk management, you’ve got sufficient knowledge and skills to manage privacy risk, and then hopefully you get better resourcing as well.
We’re not short of examples when it comes to privacy breaches, especially in the last few years around data. Do you think that’s going to be the catalyst for getting momentum behind privacy? Are we going to see a lot more butterflies sooner rather than later?
I think that’s right. There are certainly things that will happen out in the market, as well as the day-to-day operations of organisations that have been and will continue to drive this. So part of it will be data breaches – absolutely.
But then you’ve got laws; we can’t ignore them. There’s a reason why organisations are often focused on compliance. In the US alone, we’ve got state privacy laws that are coming online seemingly every month. We’ve had new laws over the past couple of years in Utah, Virginia and Colorado, in addition to the first big prime mover, the California Consumer Privacy Act.
You’ve got innovations in data-processing activities and organisations that want to compete on privacy. You’ve got the DuckDuckGos of the world and various companies that make privacy preservation their calling card – they didn’t really exist when I was first getting into this.
So we’ve got privacy being top of mind for a variety of reasons. And while I do think it’s causing demand for the workforce to skyrocket, the supply hasn’t caught up.
In terms of how much technology opportunity there is within the privacy sphere, can you talk about homomorphic encryption and multiparty computation?
Homomorphic encryption and secure multiparty computation are two examples of privacy-enhancing technologies. There’s a lot of ongoing research into these technologies, and a lot of interest in figuring out how they can be used together to solve important problems in data sharing and analytics.
If you take a look at the US–UK prize challenges that were announced recently, one of them is around coming up with a solution for training a machine-learning model on multiple datasets – such as financial datasets, in order to combat fraud – but doing it in such a way that you’re protecting the privacy of those whose information may be encapsulated in those datasets.
So there may be a need to combine various solutions together – like federated learning along with differential privacy and others – to really make sure we’re getting the maximum utility out of these datasets while also trying to protect privacy in the best possible way.
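To make one of those building blocks a little more concrete: the minimal sketch below illustrates the core idea behind differential privacy, namely adding calibrated noise to an aggregate query so the result stays useful while no single individual’s record can be pinned down. The dataset, threshold and epsilon value here are hypothetical illustrations, not part of the prize challenges or any NIST resource.

```python
import numpy as np

def dp_count(values, threshold, epsilon=1.0):
    """Differentially private count of records above a threshold.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the count by at most 1), so Laplace noise with scale
    1/epsilon provides epsilon-differential privacy for this query.
    """
    true_count = sum(v > threshold for v in values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical transaction amounts held by one institution
transactions = [120.0, 990.0, 45.5, 3050.0, 77.0, 1500.0]

# A smaller epsilon means more noise: stronger privacy, less accuracy
print(dp_count(transactions, threshold=1000.0, epsilon=0.5))
```

In a federated setting, each institution might release only noisy aggregates like this rather than raw records, which is the kind of utility-versus-privacy trade-off Gilbert describes.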
There’s a lot of interesting and great work going on on the technology side, both in research and development, and it’s going to have policy implications as well. If there’s a new law around de-identification, for example, you’ll want to avail yourself of the right technology to comply. That’s why we have a Disassociated Processing category in our Privacy Framework: if you have some sort of obligation, you can look to implement various technological or policy solutions to get where you need to be.
As an expert in this field, but also from a personal perspective, why do you think it’s important that these privacy initiatives continue to have such an impact on the way we do things?
It’s what got me into privacy in the first place. Privacy is really important. It safeguards fundamental values like human autonomy and dignity – these are really critical values to protect, and ones that have a connection with trust when it comes to technology.
What was endlessly fascinating to me, and why I think this is so important, is that we want to make sure we are identifying all the ways we can harness the power and the value of data. It can drive innovation for the economy at large, but also things like public health and other areas that benefit the public interest.
But when it involves information on individuals and groups, there could potentially be issues for the privacy of those folks. So we need to strike the right balance to make sure we are getting as much as we can out of this data while also being sensitive to the fact that the privacy of individuals and groups is critical and must be protected. And the best way to do that is through privacy risk management.
Want more insight into the world of cybersecurity, digital identity, biometrics and more? Get your fix with the IDentity Today podcast, hosted by Daltrey CEO Blair Crawford. Listen via Apple Podcasts, Spotify or your favourite podcast app.