Back in 2016, my screen froze while filling out the Census online. I felt a sense of unease, mainly because I knew that time and time again we have seen data collected with good intentions later exploited in disturbing and unintended ways.
I didn’t know whether my data might be taken, by whom, or what it might be used for. But I did know that if history has taught us anything, it’s that data in the wrong hands can be used in a myriad of harmful ways.
The first poverty maps, created by Charles Booth, stemmed from his concern for the plight of the urban poor. Forty years later, Booth’s maps likely played a role in planning where the London Metropolitan Police would place police phone boxes. Booth likely never considered that his maps would be used this way. His project was designed to give charity workers and officials visibility of the level and location of poverty. But the richness of the data he collected also meant it was used to increase police surveillance of marginalised communities, as well as for other applications such as planning rubbish collection points.
More recently, the UK Home Office was found to be using sensitive data, collected to protect rough sleepers, to target vulnerable individuals for deportation.
These uses of data led me down a path of questioning how we can prevent data and digital systems from being used for unintended purposes – especially against vulnerable members of society.
The concerning element isn’t the movement of data you have consented to share, but rather its exploitation by companies that abuse legal gaps to obscure how your personal data flows. Take the case of Clearview AI, which continues to use data defined by the law as public in order to create a facial recognition tool for law enforcement. Examples such as this are a reminder of how vulnerable we are.
Allowing data to flow is of critical importance for furthering humanity through research and social initiatives. However, we need a degree of consent so that individuals are aware of how their data moves, handing the responsibility for steering that data back to the individual.
Part of the solution for ensuring safe and secure data flows requires the law to satisfy community expectations. Encouragingly, many experts are applauding the recommendations in the Australian Government’s Privacy Act Review Report, released in February 2023. However, while the government focuses on enacting data protection laws, it may miss the opportunity to raise awareness – among law enforcement, data specialists, and the general public alike – of why data must be allowed to flow if we are to collectively benefit from it. People are increasingly demanding the right to steer the flow of their data.
The Australian National University’s School of Cybernetics is investigating and prototyping cybernetic approaches to help data users, collectors, and custodians anticipate potentially harmful future data flows within our interconnected systems.
Cybernetics is concerned with the study of communication and control within purposeful systems that rely on technology. Just as a ship’s captain weighs many factors in steering toward a destination, so too do cyberneticists weigh the factors that shape a system’s course. In a cybernetic world, the focus is on tools for steering data in motion, working in conjunction with data laws that protect data at rest.
While history tells us one narrative, we are beginning to move in the right direction. Australian government agencies are increasing the availability of complex datasets for researchers and social good initiatives via programs such as the Multi-Agency Data Integration Project (MADIP), an inter-departmental data sharing platform that allows government programs and researchers to use these data flows for the benefit of all Australians.
Companies like Gener8 and Brave seek to empower people anywhere in the world to steer the flow of their web browsing data and advertising revenue. In pursuit of better outcomes, they enable individuals to steer their data toward collective services that improve human flourishing. This is similar to purposefully designed platforms such as Solid – an open web platform created by the inventor of the World Wide Web, Sir Tim Berners-Lee.
Solid allows individuals to store their data in what is called a Personal Online Data Store – or Pod. This means individuals can retain full sovereignty over their personal data, granting and revoking consent for it to flow through the government, business, education, and services sectors. Approaches like these are more likely to convince individuals to confidently engage in data collection activities – something that remains of crucial importance to advancing humanity.
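To make this concrete, here is a minimal sketch of what granting and revoking that consent can look like in practice, assuming Inrupt’s open-source JavaScript client libraries for Solid. The identity provider, Pod resource URL, researcher WebID, and client credentials below are placeholders for illustration, not a definitive implementation.

```typescript
// A sketch of consent management against a Solid Pod, assuming Inrupt's
// @inrupt/solid-client and @inrupt/solid-client-authn-node libraries.
// All URLs, WebIDs, and credentials are hypothetical placeholders.
import { universalAccess } from "@inrupt/solid-client";
import { Session } from "@inrupt/solid-client-authn-node";

async function grantAndRevokeConsent(): Promise<void> {
  // Authenticate as the Pod owner with credentials registered at an identity provider.
  const session = new Session();
  await session.login({
    oidcIssuer: "https://login.example.org",   // placeholder identity provider
    clientId: "<registered-client-id>",
    clientSecret: "<registered-client-secret>",
  });

  const resourceUrl = "https://storage.example.org/mypod/health/records.ttl"; // placeholder resource in the Pod
  const researcherWebId = "https://example.org/people/researcher#me";         // placeholder agent

  // Grant the researcher read-only access: the data can flow, but only with consent.
  await universalAccess.setAgentAccess(
    resourceUrl,
    researcherWebId,
    { read: true, write: false },
    { fetch: session.fetch },
  );

  // Later, the owner can revoke that consent just as directly, stopping the flow.
  await universalAccess.setAgentAccess(
    resourceUrl,
    researcherWebId,
    { read: false, write: false },
    { fetch: session.fetch },
  );

  await session.logout();
}

grantAndRevokeConsent().catch(console.error);
```

The point of this design is that the consent record lives with the data in the Pod rather than in each organisation’s database, so the person it concerns can inspect, change, or withdraw it at any time.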
A cybernetic approach to data management focuses our attention on the path we take and the questions we ask along the way. Cybernetics can be our night-vision goggles, helping us see through the darkness toward a preferred data-driven future that is safe, sustainable, and responsible.