Bunnings’ appeal against the OAIC’s finding that it breached privacy law by using facial recognition technology (FRT) in its stores was successful (at least in part). This is a win for the use of FRT, but businesses should not rush in: good governance and careful planning are required.
Background
The facts are set out in the OAIC’s initial decision (available here).
On 11 July 2022, the OAIC opened an investigation under s40(2) of the Privacy Act into the FRT practices of Bunnings.
Between January 2019 and November 2021, Bunnings used an FRT system in 62 of its stores across NSW and VIC.
The FRT system was provided to Bunnings by a third party. The FRT system enabled Bunnings to create and maintain a repository of individuals who Bunnings considered posed a risk to its operations and detect when those individuals entered a store where the FRT system was in use. The FRT system used an AI algorithm to match images against the repository. Where no match was detected, the images were automatically deleted within 4 milliseconds.
Bunnings argued that it did not in fact collect personal information of non-matched individuals because the activity lacked the necessary purposive character of collection. Even if it did, the activity fell within a permitted general situation, and it took steps to notify individuals by displaying posters in store and implementing privacy by design principles.
The OAIC decision
The OAIC found the following circumstances relevant:
The nature of the entity. Bunnings is a large corporation.
The number of individuals affected. Probably hundreds of thousands of individuals’ images were collected.
The length of time personal information was held.
The type of personal information. Sensitive information was collected.
Whether consent was obtained. Sensitive information was collected without consent.
The consequences of collection. There were some false positives.
The OAIC found that individuals did not consent to the collection of their personal information, and that a permitted general situation (i.e. a serious threat or unlawful activity) did not exist; therefore, Bunnings breached APP 3.3.
The OAIC found that Bunnings breached APP 5.1 by failing to properly notify individuals that their sensitive information was being collected as part of FRT.
The OAIC found that Bunnings breached APP 1.2 by failing to take steps to implement practices and procedures to ensure compliance with the APPs.
The OAIC found that Bunnings breached APP 1.3 by failing to include information in its privacy policy about the personal information it collected via FRT and how that information was used.
The OAIC made declarations (under s52(1A)) that Bunnings must not repeat or continue the acts or practices that were an interference with privacy, and must publish a statement about this on its website.
The appeal - the ART decision
The decision of the ART is available here.
Section 96 of the Privacy Act provides that a party can apply to the ART to have a privacy determination reviewed.
The ART found that Bunnings did not breach APP 3.3 because a permitted general situation existed i.e. preventing a serious threat or unlawful activity (retail crime).
So essentially, the ART found that Bunnings was allowed to use FRT for the limited purpose of combatting very significant retail crime and protecting its staff and customers from violence. The ART was also persuaded by the technological features of the FRT system, which automatically deleted footage of non-matched individuals within 4 milliseconds and included features to prevent cyber attacks.
Key takeaways
The case highlights the tension between protecting the privacy of individuals and the interests of entities in carrying out their functions and activities.
Businesses already using FRT should review their current practices, which may include performing a privacy impact assessment (PIA) and implementing good AI and cyber security governance.
Businesses considering using FRT should undertake a PIA first, use privacy by design principles, and put in place good governance practices and procedures.
The case could be seen to highlight the limitations of current privacy laws, and to lend weight to calls for dedicated AI regulation in Australia.
Disclaimer: None of this is legal or professional advice. It is general information only and should not be relied on.