Apps marketed as safety tools are often used for everyday logistics – and they normalize covert surveillance.
Researchers at Stockholm University have examined 48 Swedish safety apps and warn that the technology is packaged as care while ethical questions disappear.
In two research projects at Stockholm University, researchers are investigating various safety technologies in Sweden – everything from digital safety maps and security sensors to apps marketed as tools for creating safer communities. But rather than measuring whether the technology works, the researchers critically examine its consequences.
— It’s important to ask what kind of safety we’re after, and for whom. What is worth calling safety? Which actors and interests get to determine what counts as safety in a society? The project on safety apps shows, among other things, how surveillance becomes normalized when we use this technology, says Katarina Winter, associate professor and senior lecturer in criminology at Stockholm University, who holds a doctorate in sociology.
She leads the projects, which are conducted in collaboration with researchers from the University of Gävle and Södertörn University. The researchers have mapped 48 Swedish safety apps and interviewed both developers and users, including parents who use apps to keep track of their children.
“The technology is so kindly framed”
A central finding is how normalized it has become to monitor children, often without their knowledge.
— One example is how normalized it has become to monitor your children without their knowledge, although some parents do have an agreement with their children. Because the technology is framed so kindly – as being about protecting the children – it isn’t something you have to justify as a parent. The normalization can therefore happen under the radar. When technology is packaged as care, we easily lose sight of the ethical questions, she explains.
The surveillance also affects family relationships.
— Many use the apps to avoid nagging their children, and in the short term that may be convenient and simplify family logistics. But something happens on an interpersonal level: we cut off part of the interaction with each other. Not wanting to share your location comes to be seen as deviant behavior, which I think is a negative development.
Contradictory alerts during adult education center shooting
The researchers see a clear discrepancy between developers’ ideals about a safer society and how the apps are actually used. For private individuals, it’s often about completely different things than safety.
— In a way, these parents reproduce a societal insecurity tied to crime and vulnerability when they justify their use of an app. But in reality, it’s often tightly connected to everyday logistics – when should I start cooking the pasta, depending on where my child is? explains the criminologist.
The researchers have also examined the school safety app CoSafe, which was in use during the shooting at Campus Risbergska, an adult education center in Örebro in central Sweden. The app was criticized for sending contradictory alerts calling both for evacuation (leaving the building) and for lockdown (staying inside and seeking shelter). Of the eleven people killed, two were students who had followed the instruction to evacuate instead of sheltering indoors.
— The Risbergska case demonstrates the complexity of technical solutions for crisis situations. While the app may have helped some seek shelter, the incident raises important questions about responsibility distribution and technical reliability when it comes to life and death, Winter notes.
Private actors profit from insecurity
The researcher also sees how private companies use the public debate about insecurity to sell their solutions, particularly to municipalities.
— We have both a political landscape that focuses on insecurity and a market that capitalizes on that focus. It’s logical that entrepreneurial opportunities arise from the societal debate we’re in, but the dynamic becomes harsher when safety is the product than with other phenomena. Partly because actors profit from portraying society as unsafe, and partly because companies are generally interested in specific user groups that may not face many safety problems to begin with.
She calls for a critical attitude toward technological optimism.
— It’s important to pause on these questions, which otherwise tend to rush ahead on a kind of faith that ‘now everything will be better because we have new technology’. When safety is the overarching word, questions about surveillance and privacy risk being deprioritized.