The lasting effect of digital surveillance at Black Lives Matter protests

Protesters around the world have – wittingly or unwittingly – put themselves squarely within the lens of a surveillance state. Here's how unregulated and untested facial recognition software is disproportionately affecting the black community.

Salem had a tough decision to make, but decided it was important that they attend the Black Lives Matter protests in Miami and Orlando, Florida.

“I’m black, so police brutality and their abuse of power is something that affects my family and I directly,” they say.

“The idea of attending the protests was terrifying” – not least given the potential for violence, and the danger of contracting coronavirus when gathered in a group. There was also a third issue weighing on the 21-year-old’s mind: the likelihood of ending up on a monitoring list for simply turning up to speak their mind.

Surveillance of Black Lives Matter protests has become a common sight, with police officers carrying camcorders alongside riot shields, batons and guns in the US and elsewhere. Some protests have even been watched from above, with drones circling the scenes where protesters congregate.

“I’m immediately a target regardless of what I may be doing, even if it weren’t for the protests. I walk into a store and eyes are on me, especially if I don’t seem to buy anything because people assume I’m stealing,” Salem says. “The risk has always been present for black people, so surveillance because of the protests is just another day for us.”

“I know that there are risks, but I’ve accepted them and even if I do get arrested or get put into a database, it’s for doing the right thing,” they say.

Despite a raft of companies, including IBM and Amazon, saying they’d stop supporting police forces using their facial recognition software, those attending protests around the world have – wittingly or unwittingly – put themselves squarely within the lens of a surveillance state.

“We’re here protesting the fact that black people are twice as likely to die in police custody, and they’re using software that’s probably erroneous to take these people protesting against police custody into custody,” says Charlene Prempeh of campaign group A Vibe Called Tech, which considers the impact of technological developments on the black community and the ways in which technology contributes to systemic racism. “It’s perpetuating the issue.”

Part of the problem is that there are few rules regulating the use of facial recognition or other surveillance technology in public places. Trials by London’s Metropolitan Police began with little consultation and barely any notice, and a University of Essex review of those trials found the software’s matches were accurate in just 19% of cases.

“If there isn’t a solid governance framework, there’s no limit on how it can be used,” explains Lofred Madzou, artificial intelligence lead at the World Economic Forum. “Facial recognition technology must be regulated to ensure it is being used ethically.”


It’s not just top-down surveillance that puts protesters at risk. In the US, the Dallas Police Department was quick to roll out its iWatch app when the Black Lives Matter protests began, encouraging members of the public to upload footage they had recorded on their phones identifying illegal behaviour in order to start prosecutions. It took the collective will of an unusual set of supporters – fans of K-pop groups – bombarding the system with irrelevant fancam footage of their favourite pop stars to swamp it and stop it from functioning.

But it’s not just being caught on camera, whether by the police or by passersby, that could put protesters at risk. “Various surveillance technologies are out there beyond facial recognition. They include drones, RFID chips, mobile location-tracking data, and so on,” says Madzou. Even organising lists of charities to donate to in support of the Black Lives Matter movement using tools like Google Docs could theoretically put participants at risk of being caught up in the surveillance dragnet.

“These technologies are made more powerful by the rapid development of artificial intelligence,” says Madzou. “While the collection of data has been made ubiquitous with the development of the smart city and CCTV, the processing of this data for facial recognition purposes has become almost seamless.”

And the law hasn’t yet fully caught up with the realities of the situation. “Protesters taking part in the marches currently occurring across the globe shouldn’t simply be arguing for racial equity in their lives. They need to actively advocate for the enforced regulation of surveillance technologies,” says Madzou. If left unregulated, we could see “feature creep”, with facial recognition tracking our every move without justification, infringing on our right to privacy.

“The lack of transparency is always problematic,” says Prempeh. “The issue is it’s not just about the people at protests – when you think about the locations where facial recognition is being deployed, it is disproportionately targeting people of colour.” From police trials to the active use of similar technology by shops and supermarkets, the places where it is rolled out – most often poorer, working-class neighbourhoods – are those where it is most likely to reinforce the inequalities people are currently protesting against. “When they’re looking at facial recognition software, they’re looking at Barking, not the middle of Chelsea,” she adds.

The solution put forward by big tech firms, advocates of facial recognition technology and the surveillance state is to make the systems better by training them on a broader range of people. If AI misidentifies people of colour because it hasn’t encountered many before, feed it a load more, they argue. But that approach just compounds the problem, bringing more people under surveillance and handing their digital footprints to Big Brother. In order to create an equal future, this problem needs to be tackled now, before it’s too late.

“We have issues beginning here that are going to be problematic in the future,” says Prempeh. “Building awareness of that, making people realise the issues exist, are important so we can start to lobby.”
