The stores have eyes: CCTV, biometric information and consumer privacy
For shoppers entering bricks-and-mortar retail spaces, the presence of security cameras has long been the norm. But some CCTV systems do more than just “watch.” Technological advances allow in-store systems to collect and analyse biometric information from individual customers – and it is this activity by retailers that is now attracting headlines. Biometric information collected via CCTV – such as electronic copies of faces, fingerprints and voices – can be used by retailers for many purposes, including to build profiles of the individuals entering their stores, identify returning shoppers, and flag specific individuals who have previously been removed from their premises. But the technology also raises privacy and other ethical concerns.
Biometric information used for automated biometric verification or biometric identification, or to create biometric templates, is classed as “sensitive information” under the Privacy Act 1988 (Cth). This can include the use of CCTV systems to identify specific individuals, whether or not an individual is named. The collection, use and disclosure of sensitive information must only occur where it is reasonably necessary for the collecting entity’s functions or activities and (for the initial collection) with the consent of the individual to whom the information relates.
Sentiment among Australian consumers about the collection of biometric data in retail settings is generally negative. For example, the federal privacy regulator, the Office of the Australian Information Commissioner (OAIC), found in its Australian Community Attitudes to Privacy Survey 2020 that 66% of Australians were reluctant to provide biometric information to businesses – higher than their unwillingness to provide medical or health information (60%) or even location data (56%).
In line with these sentiments, the OAIC has conducted high-profile investigations of retailers using CCTV to collect biometric information:
- In 2021, the OAIC made a determination against a multinational convenience store operator regarding its large-scale collection of sensitive biometric information. The organisation captured images of customers’ faces via tablets provided for customers to complete surveys about their in-store experience. The OAIC determined that this collection was not reasonably necessary for the purpose of understanding and improving customers’ in-store experience, and that the organisation had collected the information without consent. This amounted to two breaches of the Privacy Act.
- In 2022, an independent investigation by consumer advocate group Choice led to major national retailers Kmart, Bunnings and the Good Guys being referred to the OAIC over their alleged use of facial recognition technology in their in-store CCTV systems. Choice considered the use of such technology to be “disproportionate” to the legitimate business functions of those retailers. The OAIC has since opened investigations into Bunnings’ and Kmart’s use of facial recognition technology (with the Good Guys having paused their use of the technology). These investigations are ongoing.
In 2020, it was reported that the number of CCTV cameras in the UK had reached 5.2 million (one camera for every 13 people). The UK’s data protection regulator, the Information Commissioner’s Office (ICO), has published guidance on video surveillance, available on its website, which covers CCTV and other systems that make use of AI. The guidance also provides checklists for businesses to ensure their use of video surveillance complies with UK data protection law.
One use case some retailers have for facial recognition technology is identifying “problem” customers. A regional consumer co-operative used the technology to add customers to a blacklist, alerting staff when those customers entered its stores, without being transparent about this processing. The biometric data was not passed to police but was instead retained for up to two years. Whether this is a proportionate response to shoplifting in high-risk stores is now under investigation by the ICO.
Echoing the Australian findings on public sentiment, a 2019 study in the UK found that, while 82% of respondents supported the use of facial recognition technology by law enforcement agencies, there was less support for its use by retailers – only 30% believed this was acceptable. As for the regulator’s view, the ICO’s video surveillance guidelines note that, given the potential intrusion on individual rights and freedoms, “it is therefore important that the use of surveillance is not seen as the cure to the problems that organisations may face. But instead, a helpful supporting tool where lawful, necessary and proportionate in the circumstances.” It is therefore important that any use of the technology is justifiable, that retailers have carried out data protection impact assessments, and that visible CCTV signage explains the processing.
Conclusion

Businesses operating in the retail sector should take stock of their activities in this space. This includes reviewing in-store monitoring practices to identify whether biometric data is being collected, how that data is being used and otherwise processed, and to what extent that processing aligns with local data protection laws. Retailers should also stay alert for further developments in this area from local regulators, including the outcomes of investigations into the use of facial recognition technology and updated guidance.