Griffin on Tech: Biometrics code needs work but remains vital

When Judith Collins released a cabinet paper last month outlining the Government’s position on artificial intelligence, she said a “light touch, proportional, and risk-based” approach to regulating AI would be pursued.

Existing laws and regulations would be drawn on where appropriate, but no dedicated AI laws were in the works. Only where a gap in regulation emerged would new laws be considered, she concluded.

The paper really amounted to a placeholder, indicating that the Government’s Chief Digital Officer would be tasked with pulling together a work programme on AI. In contrast to other countries, no firm areas of focus were identified.

That split commentators into two camps - those who thought it a pragmatic approach that would encourage innovation in AI, and those who saw it as a cop-out.

The likes of Dr Andrew Lensen, a senior lecturer in artificial intelligence at Victoria University, thought the Government should go further and regulate specific uses of AI.

“There needs to be regulation on what is and isn’t allowed to be automated with AI. For example, should the government automate benefit eligibility decisions or should the justice system use AI for low-level sentencing?” Lensen wrote in The Conversation.

Light-touch or lightweight?

I too was disappointed by the paper, but not surprised. I expected the light-touch regulation decision - that has been the mantra of Judith Collins since before the election. But I expected more of a plan to have been formulated by now to encourage the responsible uptake of AI across the economy.

Through that lens, one of productivity-enhancing economic development, we are rudderless compared to other nations that already have strategies in place to bring AI research, infrastructure and economic initiatives together.

The one area where work is actually reasonably advanced on a complex and challenging issue applicable to AI is the Office of the Privacy Commissioner’s proposed Biometrics Code.

If there’s one area that needs specific, fit-for-purpose regulation it is around biometrics and how they intersect with AI. The use of facial recognition technology for law enforcement and retail security is a powder keg in most countries, including here with the controversial trial of the technology across stores owned by Foodstuffs North Island.

A round of submissions on the draft Biometrics Code yielded fairly predictable responses - individual members of the public who submitted (180 of the 250 responses received) are worried about biometrics like facial recognition and want the technology adequately regulated.

Businesses, on the other hand, think the code is too onerous, broad in scope, ambiguous in parts, and likely to stifle innovation in AI. Graeme Muller, the chief executive of industry body NZTech, went so far as to write an open letter to a range of ministers, Collins included, venting about the draft code.

“Unfortunately in this case the level of engagement and listening from the Office of the Privacy Commissioner has been close to zero,” Muller wrote on LinkedIn.

“They appear to be charging ahead with a Biometrics Code that makes little sense no matter who you talk to,” he added.

A mixed bag of changes needed

I’ve had a read of the code and the summary of the submissions on it. It’s clear that the Biometrics Code, which would modify some of the privacy principles in the Privacy Act 2020 if implemented, needs some additional work.

The six-month grace period for organisations already gathering and using biometric material to come into compliance is way too short - 12 months makes more sense. Better provisions for Māori data governance are also needed.

The transparency obligations are a bone of contention. Clearly, if facial recognition, fingerprint or gait-analysis data is being gathered about individuals, they have a right to know. But it needs to be explained clearly and concisely enough to avoid “notification fatigue”.

Screen-scraping is a tricky one. It’s a quick way to gather biometric data by copying photos of people, but it’s also a recipe for trouble. Meta has just written a cheque to the state of Texas for US$1.4 billion over its unauthorised gathering of facial recognition data on its platform, which involved scanning photos posted to Facebook and identifying Facebook users’ friends in them. It has since deleted its database of biometric profiles.

If screen-scraping is allowed under the Code, it should come with the caveat that the data can only be used in a specific context, and that users need to know it is happening and have the option to opt out.

The real crux of the matter comes down to “fair processing limits”. The draft code details four restrictions on using biometrics, banning the use of biometric classification “to collect information about people’s health, inner state (personality or mood), physical state (attentiveness, fatigue) or their demographic information like gender or ethnicity (protected categories in the Human Rights Act)”.

AI startups have already developed biometric-based applications involving all of the above. Muller and the tech businesses he represents make a valid point that effectively banning these uses cuts out swathes of potential applications for biometric data.

Huge potential in healthcare

For instance, by linking a patient's biometric data to their medical records, hospitals and medical clinics can ensure that the right person receives the right treatment, preventing errors as well as medical and insurance fraud.

Biometric screening by employers can help identify illnesses or behaviours that may prevent workers from safely doing their jobs. For instance, a camera mounted in a truck could monitor the driver to measure their attentiveness to the road and check for signs of fatigue.

All of this is legitimate use of biometrics, as far as I’m concerned. But it needs to be proportional, restricted to its context, explained transparently and, crucially, opt-in. What happens if a truck driver opts out? Will she receive less pay or face disciplinary action? There’s a danger the Code could create inequities as the use of biometrics becomes more pervasive in our lives.

The Privacy Commissioner will now incorporate the feedback into a new draft and seek additional feedback. He needs to stay the course and arrive at something under the banner of the Privacy Act that protects New Zealanders from overreach and misuse of their data, while enabling the innovation that could make a real difference in areas like healthcare.
