Published 17 August 2020
In the context of global climate change and the concurrent need for increased sustainability, adaptability and resiliency, ‘smart city’ technologies are hailed as “intrinsically sustainable” urban development and management solutions.1 Moreover, cities are eager to present such solutions as win-win, improving “quality of life for all residents” while addressing climate change. At present, the City of Darwin is in the second phase of its $10 million ‘Switching on Darwin’ project. The project—billed as the largest and fastest implemented of its kind in Australia—will utilise Big Data technology and infrastructure, transforming Darwin into a ‘smart city’.
Although explicit definitions are rare, a smart city is typically identified by the use of Big Data technologies to address social, environmental and economic challenges of an urban environment.2 Indeed, Darwin’s new technologies include environmental and noise sensors to monitor weather and climate changes, microclimate sensors to collect data on CO2 levels, and water and energy monitoring sensors to improve efficient resource use. However, other technologies include free public WiFi, advanced CCTV capabilities, video analytics, artificial intelligence (AI) and machine learning, some of which will be used to track mobile phone signals to “manage crowds, analyse people movement patterns and detect abnormal occurrences”.3 This data will then be aggregated on a centralised platform capable of analysis and visualisation to inform decision-making.4
“Darwin’s new technologies include environmental and noise sensors to monitor weather and climate changes. However, other technologies […] will be used to track mobile phone signals to manage crowds, analyse people movement patterns and detect abnormal occurrences.”
As is common with smart city projects, marketing materials enthusiastically presume broad public demand for more efficient products and services, improving ‘quality of life’ while sustainably managing city resources.5 Critics point out, however, that such technologies carry many risks including increased surveillance facilitated by opaque and unaccountable actors, impacts on civil liberties and human rights, and the deepening of discrimination and marginalisation of disadvantaged communities.6 Indeed, it has been further argued that there simply is no evidence to support “untested” smart city claims. Nevertheless, Lord Mayor Kon Vatskalis dismissed concerns, claiming they were merely from “conspiracy theorists”.
In light of high-profile cases uncovered by the 2013 Snowden leaks and China’s emerging social credit system, this remark is particularly revealing—indicating a serious lack of consideration for the genuine and well-founded concerns of critics.7 Indeed, a 2017 Privacy International report illustrates many examples of how smart city technologies have been used in improper and abusive ways. In Beijing, for example, data collected by the city’s central database has been analysed using AI to map social unrest.8 As a result, administrators are alerted to areas of the city with higher levels of protest, leading to policy adjustments in the form of extra policing. The use of similar tactics in Australia may have “chilling effects for democracy”.9 One may argue that this example is context-specific, that Australia is different. They would be right, but not necessarily for the reasons they may first suppose.
“In Beijing, data collected by the city’s central database has been analysed using AI to map social unrest […]. The use of similar tactics in Australia may have ‘chilling effects for democracy’.”
In Australia, recent Black Lives Matter protests have put bias and racism back in the national spotlight. Time and again, police surveillance and harassment are skewed disproportionately towards Aboriginal and Torres Strait Islander peoples. Data from the Australian Bureau of Statistics show that while Aboriginal and Torres Strait Islander people make up 25.5% of the Northern Territory’s population, they account for 84% of the adult prison population—12 times the national non-Indigenous incarceration rate. There are many factors contributing to this over-representation. However, the Australian Law Reform Commission notes that the Northern Territory has been particularly zealous in “over-policing of public order and criminal infringement offences [and] ‘proactive’ policing in relation to bail and residential checks.” This is further reinforced by a recent Northern Territory Anti-Discrimination Commission report showing that discrimination based on race is the most complained about issue, with the vast majority of complainants identifying as Aboriginal.
This raises two fundamental problems with smart city technology: first, the belief that technology is neutral and eliminates human error; and second, the tendency for cities to become more surveilled than smart.10 In the first instance, belief in neutrality often leads to a “one-size-fits-all” approach to technology application, ignoring local social and cultural specificities.11 Consequently, the use of poor or incomplete data embeds structural inequalities into data systems, yielding results that reflect and reinforce existing biases.12 This relates directly to the second problem of surveillance. Big Data-driven policing based on unrefined crime statistics reflects policing patterns rather than underlying crime patterns, reifying discriminatory police practices.13 Finally, privately sourced technology is often protected by proprietary secrecy; the project’s algorithmic functions therefore remain hidden from public scrutiny.14 For example, the use of facial recognition by police agencies in the US has led to more wrongful arrests of African Americans because the software differentiated white faces more accurately than black faces—misidentifying black women 35% of the time while identifying white men with near-perfect accuracy. Moreover, studies from both the UK and US have shown that increasing police powers while decreasing police oversight and accountability leads to a greater risk of racial profiling.15
“The use of poor or incomplete data embeds structural inequalities into data systems, yielding results that reflect and reinforce existing biases.”
Yet, concerningly, a central theme of Darwin’s project is the problematically vague policing of “anti-social behaviour”. Considering the history of racial discrimination by police agencies across Australia, in conjunction with current examples of gross misuse of data-based ‘preventative policing’ methods, data surveillance systems will likely be coloured by “algorithmic suspicion”.16 As such, algorithms that may or may not function as claimed allow space for constant surveillance and racial profiling without any prior reasonable grounds to do so—all with relative ease, anonymity and unaccountability.
Therefore, it is completely reasonable—far from conspiratorial—to suggest that Darwin’s smart city project will have deep negative implications for the Indigenous community. The likelihood of such an impact is compounded by how uncritically these capabilities are viewed by policy and decision-makers such as Lord Mayor Vatskalis. As such, it is more than justified to seek a larger public review of both the functioning of these technologies and the transparency and accountability measures that should accompany them—certainly a reassessment of the basic assumption that such capabilities improve “quality of life for all residents”.
1. Himesh, S. in Kim, K. (2018). Low-Carbon Smart Cities: Tools for Climate Resilience Planning. Springer International Publishing, pp. 77-79.
2. Smart Cities: Utopian Vision, Dystopian Reality. (2017). Privacy International, October 31, 2017. p. 5.
3. Making Our City Smarter. (n.d.). The City of Darwin.
5. Making Our City Smarter, op. cit.; Kim, op. cit., pp. 77-79; Clarke, R. (2019). Risks Inherent in the Digital Surveillance Economy: A Research Agenda. Journal of Information Technology, 34(1), pp. 61, 66; Galloway, K. (2017). Big Data: A Case Study of Disruption and Government Power. Alternative Law Journal, 42(2), p. 93; Heeks, R. & Shekhar, S. (2019). Datafication, Development and Marginalised Urban Communities: An Applied Data Justice Framework. Information, Communication & Society: Data Justice, 22(7), pp. 993-994.
6. Clarke, op. cit., p. 67; Heeks & Shekhar, op. cit., pp. 993-994.
7. Couldry, N. & Mejias, U. A. (2019). Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject. Television & New Media, 20(4), p. 337; Galdon-Clavell, G. (2013). (Not so) Smart Cities? The Drivers, Impact and Risks of Surveillance-Enabled Smart Environments. Science & Public Policy (SPP), 40(6), p. 717.
8. Privacy International, op. cit., p. 20.
10. Taylor, L. (2017). What is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally. Big Data & Society, 4(2), pp. 1-2.
11. Galdon-Clavell, op. cit., p. 718.
12. Taylor, op. cit., p. 4.
13. Ferguson, A. (2017). Black Data: Distortions of Race, Transparency, and Law. In The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement. New York: NYU Press, pp. 131-136; Taylor, op. cit., pp. 2-4.
14. Ferguson, op. cit., pp. 136-140; Taylor, op. cit., pp. 2-4.
15. Hopkins, T. (2017). Monitoring Racial Profiling: Introducing a Scheme to Prevent Unlawful Stops and Searches by Victorian Police. Police Accountability Project, Flemington and Kensington Community Legal Centre. August 2017. p. 15.
Barton Quilkey is an undergraduate student at the University of Sydney in the Faculty of Arts and Social Sciences. He is currently finishing a Bachelor of International and Global Studies majoring in International Relations with a strong focus on environmental justice and political ecology.
The Sydney Environment Institute Student Series features original content written by Honours, Masters and PhD students at the University of Sydney who are undertaking environment-related research. If you are a current postgraduate student who would like to participate in the series, find out more here.