By Felicity Le | 16 July 2024
Smart city technology not only promises to improve convenience and the standard of living for civilians but also assists government agencies in their executive functions.[1] The main smart city infrastructure enabling this objective is internet of things (IoT) technology, whereby internet-enabled devices transfer information to other systems using sensors, monitors and video surveillance.[2] This is especially true for law enforcement agencies (LEAs) with the rise of artificial intelligence (AI), particularly in crime prevention and counterterrorism, where surveillance is a major aspect.[3] However, one IoT technology that is the subject of controversy is automated facial recognition technology (AFRT). One criticism asks whether such advanced technology risks being misappropriated for menial purposes not originally intended and thus disproportionate to the AI’s powerful capabilities – a phenomenon known as function creep. This is especially concerning in Australia, where AFRT-policing by LEAs is unregulated.[4] The problem is exacerbated when the counterterrorism and surveillance capabilities of federal LEAs are compared against those of state police agencies (SPAs). It is this research problem that is the contextual focus of this article.
It is important to explore this problem as there is no state legislation regulating AFRT-policing in Queensland, for counterterrorism or otherwise.[5] Such legislation is necessary for two reasons. Firstly, the Australian Government intends to proliferate IoT technologies to enhance infrastructure capability and capture volumes of data.[6] As an IoT technology, AFRT deployment is likely: the Government intends to implement nation-wide identity verification services upon the assent of the Identity Verification Services Bill 2023, despite the lack of legal frameworks to regulate this service. Secondly, and importantly for Queensland citizens and the Queensland Police Service (QPS) as the local LEA, the population and economic activity of Queensland are predicted to increase exponentially over the next decade, particularly with upcoming major events such as the 2032 Olympic Games. Given that terrorists target densely populated areas, the QPS must capitalise on the latest technologies to protect public safety without misappropriating their new powers.[7]
This situation raises the question: how can AFRT-policing be regulated in Queensland to ensure the QPS does not misappropriate the technology, and acts within the capabilities of its counterterrorism powers?
Answering this involves identifying whether AFRT-policing for counterterrorism is within the scope of the QPS’s capabilities, and if not, how it can be regulated to mitigate the risk of function creep. To achieve this aim, the counterterrorism powers conferred on the QPS pursuant to the Police Powers and Responsibilities Act 2000 (Qld) (PPRA) and the Public Safety Preservation Act 1986 (Qld) (PSPA) will be examined to ascertain their adequacy to regulate AFRT-policing. Additionally, factors underpinning future frameworks will be examined to assist policymakers in drafting effective regulations that limit function creep.
To assist in answering the research question, three contextual matters will first be explained: AFRT fundamentals, policing and counterterrorism fundamentals, and the differences between state and federal LEAs.
These contexts clarify the extent of the research problem and the need for fit-for-purpose AFRT regulations. This need will be articulated in three arguments. Firstly, the current counterterrorism powers conferred on the QPS are inadequate to regulate AFRT-policing because the definitions of the relevant devices do not capture the capabilities of AFRT, and the situations in which the devices can be deployed do not align with those of AFRT. Secondly, the lack of regulation for AFRT-policing increases the risk of function creep because it gives police broad discretion when using the technology and wide access to extensive databases. Thirdly, to address these inadequacies, policymakers should consider enacting fit-for-purpose regulations that utilise community engagement to maintain civilian-police relationships whilst ensuring security objectives are still met.
To ascertain and analyse the required information, three research methods will be used. Firstly, doctrinal research will be used to analyse both federal and state legislation pertaining to counterterrorism powers conferred on LEAs. Following this, interdisciplinary research supplemented with socio-legal doctrines will be used to examine the effects of AFRT and AFRT regulations on the wider community. Finally, as recommendations must be made for policymakers to draft AFRT regulations, reform-oriented research will be conducted to identify optimal approaches.
2.0 Context
2.1 AFRT
AFRT is a type of AI that automatically extracts, digitises, and compares facial features for identification or identity verification.[8] The technology uses three components: cameras for photographing, a database storing the images, and, importantly, algorithms comparing the collected facial images.[9] The algorithms achieve automation in two steps: first, the algorithm detects a face within the stored images, develops a facial map, and converts it into a digital template; second, it compares that template against images stored in other databases.[10] This capability distinguishes AFRT from traditional facial recognition, where LEAs physically examined sketches, photos, or CCTV footage.
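To make those two automated steps concrete, the following sketch illustrates in Python how a detected face might be reduced to a digital template and compared against a stored gallery. It is a toy illustration only: the linear ‘encoder’ stands in for the trained deep-learning models real AFRT systems use, and every function and dataset in it is hypothetical.

```python
# Toy sketch of the two-step AFRT pipeline: (1) reduce a detected face
# to a digital template, (2) compare that template against a database.
# The linear "encoder" below is a stand-in for a trained deep network.
import numpy as np

rng = np.random.default_rng(0)
ENCODER = rng.standard_normal((128, 64 * 64))  # stand-in facial-map encoder

def embed_face(image: np.ndarray) -> np.ndarray:
    """Step 1: convert a face image into a fixed-length template,
    normalised so that a dot product equals cosine similarity."""
    template = ENCODER @ image.ravel()
    return template / np.linalg.norm(template)

def match(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.9):
    """Step 2: compare the probe template against every stored template
    and return the indices of candidate matches above the threshold."""
    scores = gallery @ probe  # one similarity score per enrolled face
    return np.flatnonzero(scores >= threshold), scores

# Usage: enrol 1,000 synthetic "faces", then probe with a noisy copy of one.
faces = rng.standard_normal((1000, 64, 64))
gallery = np.stack([embed_face(f) for f in faces])
probe = embed_face(faces[42] + 0.01 * rng.standard_normal((64, 64)))
hits, scores = match(probe, gallery)
print(hits, scores[hits])  # expect index 42 with a near-1.0 score
```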
As an identity verification tool, AFRT is most applicable in the public services facet of a smart city because it streamlines identity document verification, thereby reducing administrative burdens.[11] For example, citizens applying for driver’s licences can verify their identity instantly, with immediate access to other health or legal records.[12]
However, what differentiates AFRT for administrative services from AFRT-policing for counterterrorism is that the former requires consent whilst the latter uses the technology for covert mass surveillance in public places, forgoing such requirements: seeking consent to be surveilled would defeat the purpose of a secretive operation.[13] This distinction is important because AFRT-policing for covert mass surveillance without consent is potentially beyond the scope and capabilities of SPAs.
AFRT’s advanced functionalities present its handler with unique capabilities, including identifying multiple people simultaneously from afar, identifying individuals in crowds, and executing these functions continuously and autonomously.[14] Additionally, these capabilities are achieved at a scale and speed beyond human capabilities.[15] AI-enabled AFRT can sort datasets with millions of images and measure more facial features than are humanly observable to obtain a single match within seconds.[16] The common theme amongst these capabilities is that they intend to increase policing efficacy. These autonomous functions eliminate the need to station officers in various areas for surveillance, freeing resources for higher-order duties. This raises the issue of whether AFRT-policing for the sake of convenience justifies mass surveillance of a largely innocent public and intrusion into their personal information regardless of criminal history or affiliations.
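As a rough illustration of that scale claim, the following snippet times an exhaustive comparison of one probe template against one million stored templates. It assumes 128-dimensional templates and a single CPU; production systems add indexed search and hardware acceleration to go faster still.

```python
# Hedged benchmark sketch: exhaustive one-vs-a-million template comparison.
# Assumes 128-dimensional unit-length templates; real systems add indexed
# (approximate nearest-neighbour) search and hardware acceleration.
import time
import numpy as np

rng = np.random.default_rng(1)
db = rng.standard_normal((1_000_000, 128)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)  # store unit templates

probe = (db[123_456] + 0.01).astype(np.float32)  # perturbed stored face
probe /= np.linalg.norm(probe)

start = time.perf_counter()
scores = db @ probe               # one million comparisons in a single call
best = int(np.argmax(scores))
elapsed = time.perf_counter() - start
print(f"best match: row {best}, score {scores[best]:.3f}, {elapsed:.3f}s")
# On commodity hardware this typically completes in a fraction of a second.
```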
2.2 Policing and counterterrorism fundamentals
The main principle justifying special police counterterrorism powers that would otherwise breach civil liberties is threefold: there must be an operational need for the power, it must be exercised only in cases of need, and it must be proportionate to its impact on the relevant civil liberties.[17] Whether these elements are satisfied when granting police powers to use AFRT is questionable. This was evident in a 2021 study into public perceptions of AFRT-policing, where the authors noted three main concerns from their analysis of public commentaries: the inaccuracy of AFRT, abuse of authority, and exploitation of human rights.[18]
Privacy is the main human right civilians sacrifice as subjects of AFRT because biometric information is considered sensitive information.[19] This is especially problematic when AFRT is used to identify unknown persons, as is often the case in counterterrorism surveillance where mass crowds are monitored for suspicious behaviour. Furthermore, the algorithm can return several possible matches, most of which are innocent, unassuming civilians whose biometrics and other confidential data are viewable not only to the on-duty officer, but also to the SPA as a whole and the private third party supplying the technology.[20] Consequently, what was intended to identify suspected terrorists amongst the crowd unavoidably captures data of the whole population, thereby granting local LEAs liberty-depriving powers.
This is problematic for the QPS because a strong police-citizen relationship is necessary in a democracy, where SPAs are not authoritarian figures controlling civilian life.[21] Rather, democratic police are answerable to the public and operate consistently with human rights principles.[22] These fundamentals are integral to the QPS given Australia’s democratic system of government.
Moreover, whether there is an operational need further depends on the capabilities of the LEA exercising that power. In Australia, the Commonwealth and the states each employ their own LEA, which is responsible for its own law enforcement framework.[23] However, this separation means each entity is responsible for its own resources, leading to differing capabilities between state and federal LEAs.[24]
Given these differences, whether the sacrifice of privacy is justified, and whether there is an operational need for the QPS, as a SPA, to use AFRT, becomes a further issue. This is discussed below.
2.3 Differences between state and federal LEAs
The overarching difference between SPAs and federal LEAs is that SPAs operate from policing perspectives whilst federal LEAs are intelligence or military based.[25] Therefore, not only do the two entities operate with different objectives, instructions, and strategies, but they are also subject to different legislative requirements and standards of practice.[26] This is especially true for counterterrorism, which concerns national security as a whole rather than a localised phenomenon.
This was evident in a 2008 study analysing Victoria Police (VP) counterterrorism policing. The VP officers expressed unfamiliarity with federal counterterrorism frameworks, and only a minority of them considered themselves sufficiently trained in counterterrorism operations.[27] A similar but more recent study from 2016 explained that the difference in counterterrorism capabilities was due to the uniqueness of such investigations and that state police were not equipped to engage in them.[28]
Another interview study exploring counterterrorism policing asserts that the main difference between counterterrorism and conventional policing is that the former utilises pre-crime frameworks.[29] Comparatively, police investigations occur after the fact and use crime prevention frameworks, as opposed to pre-crime.[30] Crime prevention frameworks focus on reducing opportunities for individuals to commit crime or address the broader social and environmental factors that make a criminal. Pre-crime frameworks disregard these factors and focus on identifying a potential terrorist amongst the masses rather than addressing root causes – identification being precisely the purpose of AFRT.[31] Furthermore, pre-crime measures are not specifically concerned with prosecution, evidence gathering, convictions, and punishment – the functions with which SPAs are most familiar.[32]
The importance of this is that, given the gap in capabilities between state and federal LEAs, it is risky to allow the QPS, a SPA with underdeveloped counterterrorism capabilities, to use a technology as advanced as AFRT.
2.4 Relevance to AFRT
Together, these matters establish the need for fit-for-purpose legislation regulating AFRT-usage exclusively for counterterrorism to protect public safety. Without it, the QPS cannot operate seamlessly in a smart city: AFRT is an IoT technology significant to upgrading public safety, which is precisely what a smart city promises.
The need for such legislation is twofold: first, the QPS cannot rely on the counterterrorism and surveillance powers currently conferred on it, because those frameworks are inherently inadequate given the historically limited capabilities of SPAs; and second, the absence of regulation increases the risk of function creep towards purposes disproportionate to AFRT’s powerful features.
3.0 Inadequacies of current legal frameworks
3.1 Current statutory powers
There are two Queensland Acts conferring counterterrorism powers upon the QPS: the PPRA and the PSPA.[33]
The PPRA provides three relevant provisions: Chapter 9, which regulates covert searches; Chapter 13, which regulates surveillance device usage; and s 609A, which regulates body-worn camera (BWC) usage. Given that Chapter 9 regulates the power of covert searches (not surveillance), it will not be discussed.[34]
The PSPA provides two relevant provisions: Part 3B which regulates surveillance device usage in emergencies, and s 8PAA which regulates biometric information collection.
The above provisions are inadequate to regulate AFRT-policing by the QPS for three reasons. Firstly, none expressly mentions AFRT. It can, however, be argued that prima facie, AFRT is a surveillance device. Section 322 PPRA defines a surveillance device inclusively to include an optical surveillance device, being a device capable of recording an activity.[35] Therefore, AFRT may be considered an optical surveillance device because it uses a camera to photograph civilians.
However, this legislative definition does not capture AFRT’s advanced capabilities, including the surveillance of mass crowds, the autonomous sorting and comparing of any citizen’s facial features against other databases, and the ability to do so continuously. Furthermore, Chapter 13 PPRA does not list these capabilities as actions the legislation authorises pursuant to s 322. The definitions in s 322, in their plain meaning, refer to standard cameras without the deep-learning capabilities that distinguish AFRT from other surveillance devices. This definitional issue is also reflected in s 609A PPRA, where a BWC is defined as a device worn or secured on a person and used to record footage.[36] Again, AFRT does not fit within this definition because it does more than merely record images. Therefore, Chapter 13 and s 609A PPRA cannot be used to regulate AFRT-policing because the definitions of a surveillance device and a BWC do not capture the advanced capabilities of AFRT as a novel technology.
Secondly, even if the s 322 or s 609A PPRA definitions did include AFRT as a surveillance device or a BWC, the provisions enabling their usage within Part 3B and s 8PAA PSPA would not apply to AFRT-policing.
The power cannot be invoked under Part 3B PSPA as this provision only authorises surveillance device usage in declared emergencies. It is therefore a power used in response to an emergency event, rather than the ongoing surveillance AFRT entails. This limitation is also evident in s 8PAA PSPA, which only allows biometric information to be collected from a person who is in, is about to enter, or has recently left a declared area of a terrorist emergency.[37] This provision therefore regulates biometric information collection in response to a terrorist emergency event, rather than the ongoing data collection AFRT involves through storing facial features within its database.
Thirdly, the police commissioner is required to destroy the biometric information once it is no longer required for the emergency.[38] AFRT, however, relies on databases of pre-stored information to enable ongoing matching, given the technology would be used year-round rather than only for specific events.
3.2 Inadequacies of other legislation
In fact, there is no legislation in Australia specifically regulating AFRT-usage.[39] Rather, the development and use of AFRT is loosely governed by existing privacy, anti-discrimination, and human rights laws.[40] The Privacy Act 1988 (Cth) (PA) is the main legislation relevant to biometrics and is based on 13 Australian Privacy Principles (APPs).[41] Three APPs are relevant to AFRT: APP 3.3 (collection, notification, and use of sensitive information without consent), APP 3.5 (collection of personal information by fair means), and APP 5.2 (compliance with notification requirements).[42]
These principles were a key issue when the Office of the Australian Information Commissioner (the Commissioner) investigated AFRT service provider Clearview AI (the Clearview AI decision).[43] Clearview offered trials to some Australian police agencies, including the QPS, for identification purposes.[44] The Commissioner determined that APPs 3.3, 3.5, and 5.2 were breached, thereby interfering with the right to privacy.[45] Ultimately, the Commissioner ordered Clearview to cease collecting biometric data and to destroy the data already collected.[46]
The significance of this decision was that it was the service provider that was investigated and penalised, not the LEAs as users. In fact, it is unclear whether the LEAs continue to use the service.[47] The reason the LEAs were not investigated is unclear, and commentary on this issue is limited. However, an analysis of the oversight of Australian police intelligence suggested that the powers of the Commissioner are limited because the body is acutely under-resourced.[48] Furthermore, s 70(1)(g) PA prevents the Commissioner from requiring persons to give information concerning the matter at hand if doing so would prejudice the effectiveness of the operations, investigative practices or techniques of agencies. Here, as the LEAs used the AFRT Clearview supplied them for identification purposes in active investigations, the Commissioner would have been unable to commence an action against the impugned LEAs because doing so would have disrupted the operations of those agencies.
The limited power of the Commissioner suggests the QPS may be able to utilise novel technologies that are not expressly regulated, including AFRT, to assist their policing operations without penalty. Consequently, despite the PA being the overarching legislation regulating the collection of biometrics, it is not sufficient to regulate AFRT-policing.
This is problematic if AFRT were deployed in smart cities, where it would continually monitor the population and identify any civilian regardless of criminal affiliations.[49] If there is continual surveillance, and police have broad discretion to use AFRT with no limits and no risk of penalisation for breaches of privacy, there is a significant risk of the QPS misappropriating the technology for purposes disproportionate to AFRT’s capabilities. This consequence is known as function creep and is explained below.
4.0 Function creep (FC)
FC is the notion that a product used for a specific purpose will come to be used for purposes not initially intended, particularly without authorisation.[50] The main principle of FC is that it is a gradual and pervasive creep into other functions.[51] This creep occurs not only due to a lack of regulation, but also because of the various databases to which the AFRT handler will have access.[52]
Firstly, the lack of regulation means police will have unlimited discretion when using AFRT. This was evident in the English case of R (Bridges) v Chief Constable of South Wales Police and Others, where the Court of Appeal was not satisfied that the Data Protection Act 2018, the Surveillance Camera Code of Practice, and the respondent’s internal policies could sufficiently regulate AFRT-usage.[53] Rather, the framework conferred excessive discretion upon officers to determine how the technology is deployed because it did not sufficiently specify the terms on which the power could be exercised.[54] This was a stark overturn of the original decision, in which the court held that the privacy infringement of the plaintiff was legitimate because the police’s authority to prevent crime outweighed such liberties.[55] Whilst the latter reasoning has some validity, the issue was that the police justified their AFRT use, and thereby their infringement of privacy, by relying on the statutory authorisation for overt photography, despite AFRT being an advanced, novel technology that necessarily requires greater safeguards.[56]
Secondly, access to various databases is necessary for AFRT because it enables the deep-learning capability to evolve and become accustomed to varied facial representations in varying environments.[57] In most jurisdictions, police can access law enforcement databases, including publicly operated CCTV systems and mugshots.[58] However, AFRT in smart cities, to enhance its deep-learning capabilities, requires access to secondary sources including driver licence databases, passport photos, miscellaneous government identification records, and even records from private organisations such as private surveillance cameras.[59] Furthermore, this access extends to a third source of information: social media, regardless of one’s privacy settings.[60]
These two factors enable function creep in two ways: the purpose for which biometric information is acquired widens to include other purposes, or the newly collected biometric information of an individual is used to gather further information about that individual by applying the deep-learning capability to the various datasets available.
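To illustrate the second pathway concretely, the following sketch shows how a single matched facial template could key a join across separate databases, aggregating records that were each collected for a different purpose. The dataset names, fields, and values are entirely hypothetical.

```python
# Hedged illustration of the second pathway: one matched facial template
# keying a join across databases collected for unrelated purposes.
# All dataset names, fields, and values here are hypothetical.
template_id = "face-7f3a"  # stands in for a matched facial template

licence_db = {"face-7f3a": {"name": "J. Citizen", "address": "12 Example St"}}
passport_db = {"face-7f3a": {"passport_no": "PA1234567", "nationality": "AU"}}
social_db = {"face-7f3a": {"handle": "@jcitizen", "last_seen": "2024-07-01"}}

profile: dict = {}
for source in (licence_db, passport_db, social_db):
    # Each record was collected for a different original purpose;
    # aggregating them widens the use well beyond any single purpose.
    profile.update(source.get(template_id, {}))

print(profile)  # one consolidated dossier assembled from three sources
```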
For example, in China, AFRT was initially used to prevent violent crime but was subsequently deployed at stoplights to identify and fine civilians for jaywalking, and in luxury shops to track big spenders.[61] Hence, it is likely AFRT will not be limited to CCTV cameras in populous areas, but will be attached to BWCs, affixed onto police cars, and deployed in streets, restaurants, and parks in the name of protecting the peace. If so, this erodes the communal element of policing that is fundamental to SPAs in democracies.
It can be argued that, in a smart city, AFRT should be encouraged in all policing aspects. Prima facie, the convenience of AFRT’s deep-learning and autonomous capabilities appears to integrate seamlessly into a smart city. Given that smart cities purport to improve the efficiency of services, including law enforcement, using AFRT in traffic control, street patrol, and other miscellaneous policing duties improves policing efficiency because it reduces the administrative burden of identity verification. Therefore, it is arguable that even though the creep into other low-level policing duties is unintended and unauthorised, particularly by the public, it is acceptable because it coincides with the smart city objective of improving service efficiency, which improves the standard of living and safety for civilians.
Despite this integration with smart city objectives, there are significant issues that negate these benefits. That is, FC leads to mass surveillance, which erodes the communal policing crucial to a democracy, and police over-reliance on AFRT erodes the human traits of policing that a computer cannot replicate.
Firstly, mass surveillance transforms AFRT from a mechanism protecting public safety into a weapon of social control. Whilst AFRT for counterterrorism is covert to maintain operational confidentiality, AFRT for miscellaneous purposes does not require such secrecy because the stakes are much lower. However, public awareness of mass surveillance does not necessarily indicate that democratic standards are still being valued. Rather, one article claims that people tend to act differently if they know they are being watched.[62] For example, citizens may feel dissuaded from practising their religion or participating in peaceful protests for fear of having their collected information used against them.[63] This occurs because mass surveillance indiscriminately monitors the public without any reasonable suspicion and without the public knowing how their biometrics will be used.[64]
Secondly, the convenience of AFRT risks police over-relying on the technology, eroding human elements of policing that distinguish SPAs from federal LEAs and that a computer cannot replicate. These elements include human qualities such as trust, empathy, and rapport.[65] Importantly for accountability, they also include bias checks, resolving technical flaws, and contextualising collected data.[66] An autonomous machine that continuously photographs, searches, and compares as much information as possible as quickly as possible is incapable of these elements because it is programmed for data efficiency, not community engagement.
For counterterrorism in particular, these human elements of policing are necessary to avoid tensions between certain members of the community and the police. This issue is compounded by the fact that, despite its advanced capabilities, AFRT consistently misidentifies people of colour more frequently than others.[67] Such tensions were observed in the US, where AFRT mistook an innocent African-American man for another man involved in a theft.[68] This led to a false arrest, and it was not until the officers met the arrestee in person that they realised the two were different people.[69]
A leading study on ethnic discrimination arising from inaccurate machine-learning algorithms revealed that dark-skinned females have the highest error rates across all gender classifiers, and dark-skinned males are misidentified more often than light-skinned males.[70] The paper further explained that pose, illumination, and expression greatly impact accuracy, and that default camera settings expose light skin better than dark skin.[71] Furthermore, another article explained that algorithms drawing on historical databases that over-represent ethnic minorities further entrench this discriminatory practice.[72]
What is significant about these two issues is that they are counter-productive to the smart city objective of improving the standard of living. The erosion of communal policing and other human elements through mass surveillance marks AFRT’s decline from a tool by which government serves its people into one that encourages authoritarian control. Such a society does not enjoy a high standard of living if it lives under constant intimidation. If left to govern themselves, the QPS is at risk of this function creep, which is detrimental not only to their relationship with the community, but also to their skills as community law enforcers. Consequently, AFRT cannot be said to be a smart city technology that improves public services if it sacrifices societal values for the sake of convenience.
4.1 Banning AFRT
With the issues surrounding AFRT-policing, it is reasonable that stakeholders have reservations about the technology, with some advocating for a complete ban.[73] However, what these reservations have in common is that they do not criticise AFRT itself, but rather fear how it will be used and who the users are. This indicates that whilst most literature emphasises the negative effects of AFRT, there is still a place for it in counterterrorism. This is especially the case in smart cities, where its advanced capabilities allow police to operate with increased efficiency at decreased cost.[74]
Furthermore, the negative discussions of AFRT overlook the advantages it offers, including identifying missing persons and finding exploited children.[75] Thus, for AFRT-policing to operate seamlessly in a smart city, SPAs should utilise AFRT to protect public safety without misappropriating their powers. This delicate balance can only be achieved with fit-for-purpose legislation that directly governs AFRT-policing.
5.0 Future legal frameworks
To be effective, future frameworks should address the concerns associated with conferring power upon SPAs to use AFRT.[76] According to critiques of the shortfalls of AFRT regulation, the most prominent issues are the risk of misappropriation of the technology, and the erosion of the civilian-police relationship upon deployment, including unjustified breaches of privacy.[77] The mitigation of these issues will be discussed in turn.
With regard to FC, the risk arises because, in the absence of express AFRT-policing legislation, current legal frameworks are inadequate to regulate AFRT, and because the QPS will have access to databases beyond those to which SPAs are usually privy. These shortfalls give the SPA broad discretion to use AFRT beyond its originally intended scope because no express limits are placed upon the power. Therefore, the straightforward approach is to narrow the scope of usage. Importantly, the scope must account for the aforementioned contextual matters distinguishing SPAs from federal LEAs; ignoring them risks the former engaging in operations over which they have no jurisdiction.
Firstly, the prospective framework must precisely define AFRT, with its AI ability to deep-learn and autonomously collect biometric information through facial recognition and database analysis, to distinguish it from conventional surveillance devices.[78] Secondly, the scope should be limited to counterterrorism purposes in extreme circumstances only. To ascertain what is an extreme circumstance, one jurisdictional analysis of EU law regulating AFRT-policing asserted that defining the situations where AFRT can be used should consider two criteria: a legitimate purpose principle and a necessity principle.[79] The author explained that legitimacy and necessity are satisfied when other, less intrusive measures have been exhausted and the investigation cannot proceed any further.[80] This means AFRT would be used only as a last resort, strictly limiting police usage. Given that acts of terrorism are amongst the most serious displays of public violence, this justifies conferring on SPAs powers to use AFRT, as counterterrorism is one of the few situations significant enough to warrant such advanced technology.
By limiting the scope of usage, the risk of mass surveillance through FC decreases, which in turn maintains the communal efficacy of state policing that a democracy requires, rather than remodelling SPAs into authoritarian figures.
With regard to eroding civilian-police relationships, these fears arise from the limited transparency, accountability and consent crucial to democratic policing, especially when using technology that intrudes upon one’s liberties.[81] Prima facie, the common solution would be to consult the community directly. This solution was abundant in the literature, where the most common reform approaches included a calibrated trust-based approach, where the perspectives of stakeholders are actively monitored and considered to maintain trust between AFRT users and AFRT subjects,[82] and a public co-constructed policy-making approach, where public engagement is sought specifically from civilians and incorporated into police decisions pertaining to AFRT.[83]
Though these approaches address transparency, accountability and consent through public engagement, they disregard the essential element of confidentiality required for counterterrorism. Indeed, LEAs insist upon minimising the three aforementioned democratic elements to avoid disclosing sensitive information to hostile actors.[84] Therefore, an optimal framework must balance confidentiality with transparency, accountability, and consent.
One article suggested an incremental approach to regulation rather than sweeping reforms.[85] Incrementalism is a law reform theory endorsing the progressive adjustment of legal frameworks in light of their inadequacies.[86] The theory was recommended because its components suit adjustments to AFRT-policing specifically: sectoralism, where adjustments are applied to a specific jurisdiction; the identification of shortfalls and solutions within current regulations; evidence-based regulation, where adjustments are based on evidence of actual harm rather than opinion; and flexibility, where regulations can react to the needs of the time.[87] These components are particularly applicable to QPS AFRT-policing: the prospective frameworks will be implemented only under Queensland legislation, the adjustments will be based on the shortfalls of the PPRA and PSPA and consider the harms caused in precedent AFRT cases, and the framework processes are flexible enough to keep pace with Queensland’s continuously evolving society.
6.0 Conclusion
The lack of frameworks regulating AFRT gives rise to the risk of FC, whereby the QPS may misappropriate the technology for menial purposes other than counterterrorism – the only situation that warrants such novel usage. The question of how AFRT can be regulated to mitigate FC was answered with consideration of essential contextual information: the definition of AFRT, policing fundamentals with a focus on counterterrorism, and the differences in capabilities between SPAs and federal LEAs. These contexts built the arguments supporting the enactment of AFRT regulation. The first argument explained the inadequacies of current legal frameworks to regulate AFRT. The second advised of the risk of function creep should AFRT remain unregulated. The third advised of factors policymakers should consider to draft effective legislation with the aforementioned matters in mind.
Policymakers must ensure these frameworks are enacted prior to any official deployment of AFRT in Queensland, lest society suffer the consequences of being subject to authoritarian SPAs.
[1] Elham Farazdaghi, Mojtaba Eslahi and Rani El Meouche, ‘An Overview Of The Use Of Biometric Techniques In Smart Cities’ (2021) The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 41, 42; Li Yanchi, ‘Research on the Application of Face Recognition Technology in Public Service of Smart City’ (2020) 2020 International Conference on Intelligent Transportation, Big Data & Smart City 167, 167.
[2] Commonwealth of Australia, Report 462: Commonwealth Infrastructure Spending (Report no 462, June 2017).
[3] Adam Fletcher, ‘Government surveillance and facial recognition in Australia: a human rights analysis of recent developments’ (2023) 32(1) Griffith Law Review 30, 30.
[4] Lauren Wilson et al, ‘Australian biometric system to meet national security objectives – Part II legislation and policy’ (2020) 54(1) Australian Journal of Forensic Sciences 1, 136.
[5] Adam Fletcher, ‘Government surveillance and facial recognition in Australia: a human rights analysis of recent developments’ (2023) 32(1) Griffith Law Review 30, 31.
[6] Commonwealth of Australia, Report 462: Commonwealth Infrastructure Spending (Report no 462, June 2017).
[7] Meng Xi, Nie Lingyu and Song Jiapeng, ‘Research on urban anti-terrorism intelligence perception system from the perspective of Internet of things application’ (2019) 58(2) International Journal of Electrical Engineering & Education 248, 249; Marcus Smith and Seumas Miller, ‘The ethical application of biometric facial recognition technology’ (2022) 37 AI & Society 167, 167.
[8] Marcus Smith and Seumas Miller, ‘The ethical application of biometric facial recognition technology’ (2022) 37 AI & Society 167, 168; Marcus Smith and Seumas Miller, Biometric Identification, Law and Ethics (Springer, Cham, 1st ed, 2021) 22.
[9] Marcus Smith and Seumas Miller, ‘The ethical application of biometric facial recognition technology’ (2022) 37 AI & Society 167, 167.
[10] Marcus Smith and Seumas Miller, ‘The ethical application of biometric facial recognition technology’ (2022) 37 AI & Society 167, 168; Giuseppe Mobilio, ‘Your face is not new to me – Regulating the surveillance power of facial recognition technologies’ (2023) 12(1) Internet Policy Review 1, 4.
[11] Li Yanchi, ‘Research on the Application of Face Recognition Technology in Public Service of Smart City’ (2020) 2020 International Conference on Intelligent Transportation, Big Data & Smart City 167, 167.
[12] Ibid; Elham Farazdaghi, Mojtaba Eslahi and Rani El Meouche, ‘An Overview Of The Use Of Biometric Techniques In Smart Cities’ (2021) XLIV-2/W1-2021 The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 41, 42.
[13] Li Yanchi, ‘Research on the Application of Face Recognition Technology in Public Service of Smart City’ (2020) 2020 International Conference on Intelligent Transportation, Big Data & Smart City 167, 168; Dallas Hill, Christopher D O’Connor and Andrea Slane, ‘Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making’ (2022) 24(3) International Journal of Police Science & Management 325, 328.
[14] Harry Aniulis, ‘Facial recognition technology, privacy and administrative law’ (2022) 45(4) UNSW Law Journal 1513, 1515.
[15] Elizabeth E Joh, ‘Reckless Automation in Policing’ (2022) Berkeley Technology Law Journal 1, 117.
[16] Zhilong Guo and Lewis Kennedy, ‘Policing based on automatic facial recognition’ (2023) 31 Artificial Intelligence and Law 397, 401.
[17] Rebekah Quinlan and Zin Derfoufi, Stop and Search: The Anatomy of a Police Power (Palgrave Macmillan London, 1st ed, 2015) 123.
[18] Adelaide Bragias, Kelly Hine and Robert Flet, ‘‘Only in our best interest, right?’ Public perceptions of police use of facial recognition technology’ (2021) 22(6) Police Practice and Research 1637, 1643.
[19] Privacy Act 1988 (Cth) s 6(1).
[20] Ibid.
[21] Adelaide Bragias, Kelly Hine and Robert Flet, ‘‘Only in our best interest, right?’ Public perceptions of police use of facial recognition technology’ (2021) 22(6) Police Practice and Research 1637, 1637.
[22] Ibid 78.
[23] Darren Palmer and Chad Whelan, ‘Counter‐terrorism across the Policing Continuum’ (2006) 7(5) Police Practice and Research 449, 450.
[24] Adrian Cherney, ‘Police community engagement and outreach in a counterterrorism context’ (2018) 13(1) Journal of Policing, Intelligence and Counter Terrorism 60, 64.
[25] Nicolas Johnston, ‘Considering military involvement in Australia's domestic counter-terrorism apparatus’ (2019) 15(2) Australian Army Journal 95, 96.
[26] Sam Mullins, ‘Counter-terrorism in Australia: practitioner perspectives’ (2016) 11(1) Journal of Policing, Intelligence and Counter Terrorism 93, 98.
[27] Sharon Pickering, Jude McCullock and David Wright-Neville, ‘Counter-terrorism policing: towards social cohesion’ (2008) 50 Crime, Law and Social Change 91, 98-106.
[28] Ibid 96-97.
[29] Jude Mcculloch and Sharon Pickering, ‘Pre-Crime and Counter-Terrorism: Imagining Future Crime in the ‘War on Terror’’ (2009) 49(5) The British Journal of Criminology 628, 629.
[30] Ibid 629.
[31] Ibid.
[32] Ibid 631.
[33] Counter-Terrorism and Other Legislation Amendment Bill 2017.
[34] Police Powers and Responsibilities Act 2000 (Qld) s 212.
[35] Police Powers and Responsibilities Act 2000 (Qld) s 322.
[36] Police Powers and Responsibilities Act 2000 (Qld) s 609A(5).
[37] Public Safety Preservation Act 1986 (Qld) ss 5-6.
[38] Public Safety Preservation Act 1986 (Qld), s 8PAA(1).
[39] Harry Aniulis, ‘Facial recognition technology, privacy and administrative law’ (2022) 45(4) UNSW Law Journal 1513, 1516.
[40] Nicholas Davis, Lauren Perry and Edward Santow, ‘Facial Recognition Technology: Towards a model law’ (Research Paper, Human Technology Institute, September 2022) 37.
[41] Lauren Wilson et al, ‘Australian biometric system to meet national security objectives – Part II legislation and policy’ (2020) 54(1) Australian Journal of Forensic Sciences 1, 137; Adam Fletcher, ‘Government surveillance and facial recognition in Australia: a human rights analysis of recent developments’ (2023) 32(1) Griffith Law Review 30, 41.
[42] Sarah Whitfield-Meehan, ‘Privacy: Biometric recognition technology and the 'Clearview AI' decision’ (2022) (86) LSJ: Law Society Journal 85, 85.
[43] Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021).
[44] Ibid; Sarah Whitfield-Meehan, ‘Privacy: Biometric recognition technology and the 'Clearview AI' decision’ (2022) (86) LSJ: Law Society Journal 85, 85.
[45] Ibid.
[46] Ibid [240]–[242].
[47] Stephen Gray and Yee-Fui Ng, ‘Taming the electronic genie: Can law regulate the use of public and private surveillance?’ (2022) 48(3) Monash University Law Review 113, 132.
[48] Lyria Bennett Moses, ‘Oversight of Police Intelligence: A Complex Web, but Is It Enough?’ (2023) 60(2) Osgoode Hall Law Journal 289, 319.
[49] Stephen Gray and Yee-Fui Ng, ‘Taming the electronic genie: Can law regulate the use of public and private surveillance?’ (2022) 48(3) Monash University Law Review 113, 132.
[50] Adam Henschke et al, Counter-Terrorism, Ethics and Technology (Springer Nature, 2021) 94; Monique Mann and Marcus Smith, ‘Automated facial recognition technology: Recent developments and approaches to oversight’ (2017) 40(1) The University Of New South Wales Law Journal 121, 133.
[51] Bert-Jaap Koops, ‘The concept of function creep’ (2021) 13(1) Law, Innovation and Technology 29, 33.
[52] Peter Dauvergne, ‘Facial recognition technology for policing and surveillance in the Global South: a call for bans’ (2022) 43(9) Third World Quarterly 2325, 2327.
[53] Joe Purshouse and Liz Campbell, ‘Automated facial recognition and policing: a Bridge too far?’ (2021) 42(2) Legal Studies 209, 216.
[54] Ibid.
[55] Zubair Ahmed Khan and Asma Rizvi, ‘AI Based Facial Recognition Technology And Criminal Justice: Issues And Challenges’ (2021) 12(14) Turkish Journal of Computer and Mathematics Education 3384, 3387; R (Bridges) v CCSWP & SSHD [2019] EWHC 2341 (Admin).
[56] Joe Purshouse and Liz Campbell, ‘Automated facial recognition and policing: a Bridge too far?’ (2021) 42(2) Legal Studies 209, 215-216.
[57] Zhilong Guo and Lewis Kennedy, ‘Policing based on automatic facial recognition’ (2023) 31 Artificial Intelligence and Law 397, 402.
[58] Peter Dauvergne, ‘Facial recognition technology for policing and surveillance in the Global South: a call for bans’ (2022) 43(9) Third World Quarterly 2325, 2327.
[59] Zhilong Guo and Lewis Kennedy, ‘Policing based on automatic facial recognition’ (2023) 31 Artificial Intelligence and Law 397, 405.
[60] Ibid.
[61] Halie B Peacher, ‘Regulating Facial Recognition Technology In An Effort To Avoid A Minority Report Like Surveillance State’ (2021) 25(1) Marquette Intellectual Property & Innovation Law Review 21, 27.
[62] Ibid.
[64] Vera Lúcia Raposo, ‘The Use of Facial Recognition Technology by Law Enforcement in Europe: a Non‑Orwellian Draft Proposal’ (2023) 29 European Journal on Criminal Policy and Research 515, 515.
[65] Agnė Limantė, ‘Bias in Facial Recognition Technologies Used by Law Enforcement: Understanding the Causes and Searching for a Way Out’ (2023) Nordic Journal of Human Rights 1, 14.
[66] Ibid.
[67] Scott Robbins, Counter-Terrorism, Ethics and Technology (Springer, Cham, 2021) 92.
[68] Sidney Perkowitz, ‘The Bias in the Machine: Facial Recognition Technology and Racial Disparities’ (2021) MIT Case Studies in Social and Ethical Responsibilities of Computing 2, 2.
[69] Ibid.
[70] Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018) 87 Proceedings of Machine Learning Research 1, 11.
[71] Ibid 11-12.
[72] Sidney Perkowitz, ‘The Bias in the Machine: Facial Recognition Technology and Racial Disparities’ (2021) MIT Case Studies in Social and Ethical Responsibilities of Computing 2, 2.
[73] Peter Dauvergne, ‘Facial recognition technology for policing and surveillance in the Global South: a call for bans’ (2022) 43(9) Third World Quarterly 2325, 2326.
[74] Brenda Leong, ‘Facial recognition and the future of privacy: I always feel like … somebody’s watching me’ (2019) 75(3) Bulletin of the Atomic Scientists 109, 113.
[75] Li Yanchi, ‘Research on the Application of Face Recognition Technology in Public Service of Smart City’ (2020) 2020 International Conference on Intelligent Transportation, Big Data & Smart City 167, 168; Dallas Hill, Christopher D O’Connor and Andrea Slane, ‘Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making’ (2022) 24(3) International Journal of Police Science & Management 325, 326.
[76] Hannah Harris and Andrew Burke, ‘Artificial Intelligence, Policing and Ethics – a best practice model for AI enabled policing in Australia’ (2022) AI-PLE 2021 International Workshop on AI-enabled Policing and Law Enforcement IEEE EDOC Conference 2021 – Proceedings 1, 54.
[77] Gary K Y Chan, ‘Towards a calibrated trust-based approach to the use of facial recognition technology’ (2021) 29(4) International Journal of Law and Information Technology 305, 306.
[78] Claire Poirson, Katharina Miller and Karen Wendt, The Fourth Industrial Revolution and Its Impact on Ethics (Springer Cham, 1st ed, 2021) 294; Dallas Hill, Christopher D O’Connor and Andrea Slane, ‘Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making’ (2022) 24(3) International Journal of Police Science & Management 325, 326.
[79] Vera Lúcia Raposo, ‘The Use of Facial Recognition Technology by Law Enforcement in Europe: a Non‑Orwellian Draft Proposal’ (2023) 29 European Journal on Criminal Policy and Research 515, 520.
[80] Ibid.
[81] Dallas Hill, Christopher D O’Connor and Andrea Slane, ‘Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making’ (2022) 24(3) International Journal of Police Science & Management 325, 328.
[82] Gary K Y Chan, ‘Towards a calibrated trust-based approach to the use of facial recognition technology’ (2021) 29(4) International Journal of Law and Information Technology 305, 323.
[83] Dallas Hill, Christopher D O’Connor and Andrea Slane, ‘Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making’ (2022) 24(3) International Journal of Police Science & Management 325, 325.
[84] Hannah Harris and Andrew Burke, ‘Artificial Intelligence, Policing and Ethics – a best practice model for AI enabled policing in Australia’ (2022) AI-PLE 2021 International Workshop on AI-enabled Policing and Law Enforcement IEEE EDOC Conference 2021 – Proceedings 1, 54.
[85] Asress Adimi Gikay, ‘Regulating the Use of Live Facial Recognition Technology by Law Enforcement Authorities: An Incremental Approach’ (2023) 82(3) Cambridge Law Journal 414, 414.
[86] Ibid 415.
[87] Ibid 437-441.