We sit down with leading technology lawyer Michael Montgomery to discuss what organisations are doing to protect themselves and what we as a society need to consider about the digital footprint we’re leaving.
Michael Montgomery has operated as a trusted advisor, law firm partner and in-house counsel for some of the largest technology firms in the world. Here’s a snippet of our conversation from Episode 2 of the IDentity podcast, which you can listen to in full via Apple Podcasts, Spotify, your favourite podcast app or online here.
After the hack of Clearview, a facial-recognition company, they released a statement essentially saying: “Hacks are part and parcel of living in this world. It’s about what you do after it happens.” What are your thoughts on that?
I completely agree. There’s no doubt that nothing is invulnerable. I think that’s one of the biggest issues as a lawyer – when you see a large financial institution or healthcare company come in, they want assurances that you will ensure invulnerability and that there will be all these levels of data security. But you can hack the Pentagon; it’s impossible for anything to be invulnerable.
Clearview’s statement and post-hack intentions are all about remediation, which covers the sort of steps you should be taking. But I’m just not sure whether I would have packaged it up in the same way. The fundamental point is: yes, breaches are a fact of life, and the security standards that exist are there to try and minimise harm.
Do you think that’s the right way to go about things? Does that change how people are thinking about hacks from an accountability perspective?
It does, and the law is starting to catch up with it in terms of accountability. Ultimately, there are various levels of accountability, such as between Clearview and its third-party providers. There would be a contract which sets out the responsibility matrix – who’s liable for what? Clearview will attempt to pass as much of that across to its hosting provider, but the hosting provider will naturally only be able to absorb so much liability.
But it’s the consumer – the end customer – whose personal information, potentially sensitive information and health information is being compromised and potentially monetised by a third party. That’s really what privacy laws are looking to address.
So in Australia, we don’t have a personal right for an individual to bring action against an entity, in this case Clearview. You can complain to the Office of the Australian Information Commissioner (OAIC), which can itself bring a court action and seek to impose fines. Most commonly in Australia there are enforceable undertakings, and there are numerous examples on the OAIC’s website where large companies give undertakings not to repeat that behaviour and set out what they’re going to do to address it.
Your main area of speciality is supporting clients in the cloud, focusing on software as a service (SaaS). Thinking about that as well as the topic of data security, there’s a conversation at the moment around legal obligations, as driven by legislation and regulation, versus contractual obligations, which are obviously defined by the supplier–customer relationship. What is the approach to all of that right now?
I think that approach is evolving. For years I’ve seen – particularly from the bigger end of town and the more well-resourced customers of IT services – some fairly draconian provisions about what you have to do, and more recently data privacy and security addenda. So you might have an existing contract and suddenly you’ll get a whole new layer they’ve asked you to sign up to. The issue with some of those documents – and I understand completely what they’re looking to address – is the way they approach it and how prescriptive they are: “This is how you must deal with data, and these are the processes and procedures.”
The problem with that for providers is that they need to do things in the same way for every customer – and that consistency is actually important, because you want them to do things in the same way. That’s where the evolution of certification standards, common practices and industry best practices has made some of those issues go away. So when someone is ISO 27001-certified and can demonstrate how they protect data, that’s where there’s often a ‘meeting of minds’. People understand what’s going on and they get comfortable.
In terms of the approach to contract creation, is there a risk that when you require the supplier to have a specific certification, the certification won’t necessarily be applied in the right way to achieve the required outcome?
Absolutely. But it depends on what’s happening. For example, where you’re doing something bespoke for someone, you need to marry up to what their expectations and their needs are. Where you’re offering a platform or an infrastructure as a service, I think there needs to be a recognition by customers that you want the supplier to do things in the same way, and that if they do something unique and special for you, then that’s going to deviate from their processes. That’s going to be harder for them to do, and they’re more likely to breach.
Staying within the realm of data security, whose responsibility is it to draft the approach to the protection of that data? Should it be something the supplier puts forward as their method, or should it be the buyer who already has it in their agreement for the supplier to sign?
I think the onus is on the supplier to demonstrate how they do things. An informed buyer can then look at that alongside their own processes and expectations and conduct their due diligence. They can ask: “Is there a gap here? Is there something we specifically want addressed?” And that gets you to the heart of the matter.
If you have specific needs and they’re not being addressed, it’s better to have that conversation than to say: “Look, here are all of our requirements – put those in your contract.” The supplier then has to go back and say: “Well, it’s priced at this, it’s an off-the-shelf product, and it does it the same way for everyone else. We can do it your way, but it’s going to cost you a hell of a lot more money.”
That’s a negotiation, but – and salespeople hate this – if the products are more or less homogenised and doing things in the same way, then there’s a great benefit because they’re able to report in the same way.
Looking at the landscape as it is now, taking into account privacy, legislation, regulation, customer sentiment and the way people are operating their businesses, what’s next? What’s happening in the world right now as far as all of that goes?
Well, I haven’t really unpacked the new ISO Privacy Standard, but I think that, together with a Data Security Standard, it’s providing an easy way out. I know it’s very expensive to get certified, and I think any emerging suppliers should look at those standards alongside what they do and what their suppliers do, so they can move in that direction and say: “Well, that may be part of our journey, but this is how we protect your data and these are the relationships we have.”
It’s about being as transparent as you can, without giving anything away, so the customer can get comfortable quickly. You need to convey to them: “We are the equivalent of those standards”, so the customer can say: “We don’t need a long and detailed unpacking of everything you do, because we’ve got a contractual promise from you that you will stump up to this set of standards.”
Michael delves into data security issues and ‘digital time capsules’ at a more granular level in the podcast. Listen to Episode 2 of IDentity Today via Apple Podcasts, Spotify, your favourite podcast app or online at Omny.