Trust, biometrics and identity in the digital age

Rachel Botsman – author, speaker and Oxford University’s first Trust Fellow – invited Daltrey’s managing director Blair Crawford to present a guest lecture to her class at Oxford’s Saïd Business School. What followed was an illuminating discussion about trust, biometrics and identity. 
Under an overarching theme of trust in the digital age, the conversation touched on common misconceptions around biometrics and how the media and some concerning applications by foreign governments have contributed to distrust and uncertainty; self-sovereign identity (SSI) and the opportunity to force big tech to ‘pay to play’ with our data; the balance between rights, privacy and context; and the responsibilities of users, vendors and regulators.

With such a diverse range of perspectives from around the world, the questions being asked were insightful and probing – more so than many of those currently being asked in the market. Some standouts included:

On the Indian government’s Aadhaar biometric program…

“It feels like a billion people have been forced into giving up their biometric information and it’s not necessarily clear what it’s for. How much of this has to do with country-by-country privacy laws or data protection laws?”

If it’s not clear how your biometric information is being used, and a carrot-and-stick approach is being employed to incentivise adoption – for example, you can’t get a driver’s licence or pay your taxes without a biometric identity card – it can cause widespread disenfranchisement. But the issue here is really about consent, because organisations already have our biometric information (whether that’s a passport office, a social media platform and so on). It’s when they hold your biometric information alongside your credit score, your vaccination status or other identity attributes – and you have no control over how they use it – that it becomes problematic.

A person should have the right to give out this information based on the use case and the exchange of value. And although data privacy laws differ across regions, Daltrey has adopted GDPR policies as a global benchmark. Ultimately, we all need to focus more consistently on the idea that we should be in control of the attributes that make up the digital representation of ourselves.
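As a purely illustrative sketch of that per-use-case idea – with hypothetical use cases, attribute names and policy, not Daltrey’s implementation – the principle is that the person, not the organisation, decides which identity attributes each use case may see:

```python
# Hypothetical sketch: the person decides which identity attributes
# each use case may receive; anything not consented to is withheld.

CONSENT_POLICY = {
    # use case              -> attributes the person has agreed to release
    "open_bank_account":   {"full_name", "date_of_birth", "face_template"},
    "prove_age_at_venue":  {"over_18"},   # minimal disclosure: a yes/no claim only
    "marketing_profile":   set(),         # nothing released without a clear value exchange
}

def release_attributes(use_case: str, requested: set) -> set:
    """Return only the attributes the person has consented to for this use case."""
    allowed = CONSENT_POLICY.get(use_case, set())
    return requested & allowed

# Example: a venue asks for more than it needs; only the consented claim is released.
print(release_attributes("prove_age_at_venue", {"over_18", "full_name", "face_template"}))
# -> {'over_18'}
```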

On the use of biometric technology in the criminal justice system…

“What are your thoughts on different strategies for earning trust in different use cases? For example, identification in the criminal justice context, where people are being identified and their rights taken away without consenting.”

While acknowledging that there’s a public safety requirement for identification in some cases, we need to reframe the discussion around criminal justice and the better applications of biometric and identity technology. In addition to streamlining the experience for the workforce (removing the need for passwords and access cards, for example), there’s also an opportunity for biometric credentials to be used in custodial environments to better support release and reintegration when it comes to parole reporting.

For example, biometrics can be used to give rights and control back to a person who has ‘done their time’ by removing the stigma of wearing ankle bracelets and having to report in person. Instead, people could report on their mobile using facial or voice recognition, with GPS and biometric technology combining to prove they’ve met the conditions of their parole – rebuilding trust in a more humane way.
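To make the idea concrete, here is a minimal, purely hypothetical sketch (assumed field names, area and threshold, not any real parole system) of how a mobile check-in could combine a GPS fix with a biometric match score to confirm reporting conditions:

```python
from dataclasses import dataclass

# Hypothetical parole check-in: a GPS fix plus a biometric match score from the phone.
@dataclass
class CheckIn:
    latitude: float
    longitude: float
    face_match_score: float  # similarity score from on-device face recognition (0.0-1.0)

# Assumed reporting conditions: stay within an approved area and pass a face match.
APPROVED_AREA = {"lat_min": -33.95, "lat_max": -33.80, "lon_min": 151.10, "lon_max": 151.30}
MATCH_THRESHOLD = 0.90

def meets_conditions(check_in: CheckIn) -> bool:
    """Return True if the check-in satisfies both the location and identity conditions."""
    in_area = (APPROVED_AREA["lat_min"] <= check_in.latitude <= APPROVED_AREA["lat_max"]
               and APPROVED_AREA["lon_min"] <= check_in.longitude <= APPROVED_AREA["lon_max"])
    identity_confirmed = check_in.face_match_score >= MATCH_THRESHOLD
    return in_area and identity_confirmed

# Example: a check-in from inside the approved area with a strong face match.
print(meets_conditions(CheckIn(latitude=-33.87, longitude=151.21, face_match_score=0.97)))  # True
```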

On regulation and standards…

“Do you negotiate the space between your customers and their end users, your business and other institutions in a way that attempts to pre-empt regulation? How do you think about filling the gap in regulation that exists – as you pointed out – between your customers and the end users?”

There will always be problems with technology if there are no regulations, standards or controls, which is why it’s a priority for Daltrey to have direct input into the standards being written. There has to be a common baseline that is understood across all stakeholders, particularly when it comes to balancing user privacy with organisational security.

Click here for more information about Daltrey’s involvement in the development of ISO standards on biometric verification and identification systems.

On passive trust…

“Even if we choose not to be on social media, even if we choose not to give our data away… enough people around me have Facebook or Instagram on their phone where my pictures and my data are going. I’m always near an Alexa or Google home… Isn’t it better to have control over what you are giving away and choose to opt in, instead of just passively giving information through other people?”

The reality is that we’ve already committed so much of our data to various organisations over time, but most of us don’t have a problem if we’re receiving value in return – like letting Facebook and Google leverage our data for marketing purposes because we’re using their services for free. Where things get complicated is when the goalposts move and we don’t have control over what our data’s being used for. Case in point: the Cambridge Analytica scandal. Ultimately, despite the fact that so much of our information has already been pumped into artificial intelligence engines, it’s important not to be fatalistic. We need to start somewhere when it comes to controlling our data and our identity.

The students’ insightful questions showed a sophisticated understanding of the space. As a sector, it’s incumbent on us to follow their lead and better educate those around us so they can ask more and better questions of vendors and service providers. By elevating everyone’s knowledge, we elevate the security posture of people, organisations and government.

For more information on Daltrey’s biometric-based identity solution, click here.