Flawed Facial Recognition Technology Accuses Māori Woman of Theft in Rotorua Supermarket

Māori woman wrongly accused of theft due to biased facial recognition tech in NZ supermarket, highlighting need for stronger regulations on biometric data use.

Mazhar Abbas


A Māori woman was wrongly accused of theft at a New World supermarket in Rotorua, New Zealand, due to flawed facial recognition technology being trialed at 25 of Foodstuffs' North Island supermarkets. The woman was trespassed from the supermarket despite offering multiple forms of identification to prove her innocence.

Māori AI and data experts, such as Karaitiana Taiuru, were not surprised by this incident. They point out that the facial recognition systems used in the trial were trained on international datasets that do not adequately represent the New Zealand population, particularly Māori and Pasifika people. This bias in the technology makes it more likely to misidentify individuals from these ethnic groups.

Experts say the bias in facial recognition systems could be reduced with staff training tailored to the New Zealand context, but no out-of-the-box training is currently available for businesses using the technology. Staff often rely too heavily on the AI's matches rather than their own judgment when handling potential theft cases.

Why this matters: This incident highlights the reliability issues and potential for racial bias in facial recognition technology, especially when it is deployed in areas with large Māori and Pasifika populations. It underscores the need for stronger regulations and oversight of biometric data use in New Zealand to protect individuals' rights and prevent discrimination.

The Privacy Commissioner is currently developing a code of practice for the use of biometric data, as New Zealand has no specific rules governing such technologies. Advocates say this wrongful accusation is a clear example of why facial recognition may not yet be reliable and why its use should be closely scrutinized.

Facial Recognition Bias: "Multiple forms of identification were offered, and they were still trespassed from the store," said Karaitiana Taiuru, a Māori AI and data expert, regarding the incident in Rotorua. He emphasized that facial recognition systems are often biased against non-European individuals due to the datasets they are trained on, making them unreliable in a New Zealand context with its diverse population.

Key Takeaways

  • Māori woman wrongly accused of theft due to flawed facial recognition tech
  • Bias in facial recognition systems due to lack of diverse training data
  • Experts say bias could be reduced with customized staff training
  • Incident highlights need for stronger regulations on biometric data use
  • Privacy Commissioner working on code of practice for biometric data use