The future of your face

Isobel Thompson traces the backlash against facial recognition. From police biometrically mapping faces to nationwide watch-lists, will the balance between citizens and the state soon topple?

In December 2017, office worker Ed Bridges was Christmas shopping during his lunch break when he noticed a police van. By the time he came close enough to see its “facial recognition” sign, he believes his image had already been captured by the technology, then being trialled by South Wales Police in Cardiff city centre. A few months later, the former Liberal Democrat councillor spotted the cameras again: this time at a peaceful protest against the arms trade. Unsettled, he launched a crowdfunding campaign and, alongside the human rights organisation Liberty, brought a landmark case against the police, claiming the technology – which biometrically maps faces and compares them with images on a watch-list – breaches data protection and equality laws. “It just struck me as wrong that something that instinctively feels so authoritarian was being used against me, and had been rolled out without public debate or consultation. It felt like a police state,” he says.

Bridges’ case chimes with a mounting unease about facial recognition and its corresponding watch-lists, which can contain images of individuals scraped from social media, or lifted from the vast custody images database – composed of people who have come into contact with the police, including thousands who are innocent. The technology is developing faster than the law, and so operates in a legal and policy vacuum: parliament has never passed a law enabling its use. Significantly, multiple high-profile claims that it is dangerous have come from Silicon Valley circles.

Recently, Amazon’s shareholders unsuccessfully tried to stop the behemoth selling its controversial surveillance software, Rekognition, to government agencies. Microsoft researcher Luke Stark wrote an essay comparing the technology to plutonium. “Simply by being designed and built, [it] is intrinsically socially toxic, regardless of the intentions of its makers; it needs controls so strict that it should be banned for almost all practical purposes,” he argued. And in May, San Francisco – a global symbol of the micro-dosing, millennial-fronted tech boom – subverted assumptions that mass acceptance of the technology is inevitable, becoming the first major American city to ban the police and other authorities from using it. “I think part of San Francisco being the real and perceived headquarters for all things tech also comes with a responsibility for its local legislators,” Aaron Peskin, the city supervisor who sponsored the bill, said. “We have an outsize responsibility to regulate the excesses of technology precisely because they are headquartered here.”

“…behind the uniform veneer of tech neutrality, the algorithms struggle to recognize women and people of colour.”

For stretched British police forces, suffering austerity-era budget cuts, facial recognition is ostensibly an efficient, innovative way to fight crime. During the Bridges hearing, South Wales Police defended their use of the cameras, saying that if no match is made between scanned faces and watch-lists, the data is deleted in milliseconds. “We have sought to be proportionate, transparent and lawful in our use of AFR (Automated Facial Recognition) during the trial period,” the force said in an emailed statement.

The system they use, NeoFace Watch, can scan and identify 18,000 faces a minute. Trialled by forces including Leicestershire Police and the Metropolitan Police, it has been used to scour crowds at festivals and football matches and, in 2016 and 2017, the Notting Hill carnival. In 2018, Greater Manchester Police scanned roughly 15 million people over a period of six months at the Trafford Centre before the Surveillance Camera Commissioner intervened, citing concerns about the trial’s proportionality. “Compared to the size and scale of the processing of all people passing a camera, the group they might hope to identify was minuscule,” he wrote on a government blog.

Critics believe facial recognition poses two leading risks. The first is that it fortifies bias around race and gender. In short, behind the uniform veneer of tech neutrality, the algorithms struggle to recognize women and people of colour. In 2016, the founder of the Algorithmic Justice League, Joy Buolamwini, gave a TED talk explaining how algorithms tend to echo, and then entrench, the bias of their creators, a phenomenon she has dubbed the “coded gaze”. During the talk, Buolamwini showed a video clip of an algorithm failing to recognize her face (she is black) until she donned a white mask.

“Joy’s work is making a difference not just in the world of computation, but in the wider world, showing how much bias is a part of our lives every day,” says Suzanne Livingston, guest curator of the Barbican’s exhibition AI: More Than Human, which features Buolamwini’s poetic presentation, A.I., Ain’t I a Woman? “Some benign uses of facial recognition tech are in public libraries (speeding up the book borrowing process), or even, potentially, helping to identify missing pets. But the uses of it which are more worrying are in relation to police records, at passport controls, or in smart cars. In these scenarios, the technology has to be accurate and able to recognise and respond to individuals from the full spectrum of society. This is where the work needs to be done.”

“The greatest flaw is that it erodes public freedoms. Even if the technology is improved it will remain the fact that it poses too great a threat to people’s rights and freedoms, creating a dangerous imbalance of power between citizens and the state.”

Flawed software leads to flawed policing: an investigation by civil liberties group Big Brother Watch found that the automated facial-recognition system used by the Metropolitan Police had a false-positive rate of ninety-eight per cent. “It is most likely to misidentify women and people of colour – so they are more likely to be stopped by police and forced to account for themselves as they try to go about their everyday lives. This bias is ingrained both in how the technology has been trained and how it’s deployed,” explains Hannah Couchman, Policy and Campaigns Officer at Liberty.
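That ninety-eight per cent figure is less mysterious than it sounds: when a tiny watch-list is checked against an enormous crowd, even a fairly accurate system will produce alerts that are mostly wrong. The sketch below is purely illustrative – the accuracy and prevalence figures are assumptions chosen for the example, not published specifications of any police system – but it shows the base-rate arithmetic at work.

```python
# Illustrative base-rate arithmetic only: every figure below is an assumption
# made for this sketch, not a published specification of any police system.
crowd_size = 1_000_000          # faces scanned at a large event
watchlist_prevalence = 0.0001   # assumed fraction of the crowd on a watch-list
true_positive_rate = 0.90       # assumed chance a listed face triggers an alert
false_positive_rate = 0.001     # assumed chance an unlisted face triggers an alert

on_list = crowd_size * watchlist_prevalence
not_on_list = crowd_size - on_list

true_alerts = on_list * true_positive_rate        # correct identifications
false_alerts = not_on_list * false_positive_rate  # innocent people flagged

wrong_share = false_alerts / (true_alerts + false_alerts)
print(f"{wrong_share:.0%} of alerts point at the wrong person")
# With these assumed numbers, roughly 92% of alerts are misidentifications,
# even though the system is wrong about any individual face only 0.1% of the time.
```

Nudge the false-alarm rate up, or shrink the watch-list relative to the crowd, and figures in the region of Big Brother Watch’s ninety-eight per cent fall out of the same arithmetic.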

The second charge against facial recognition is that, as it becomes plaited into policing and public life, it will shift the balance of power from the individual towards authorities. “The greatest flaw is that it erodes public freedoms. Even if the technology is improved it will remain the fact that it poses too great a threat to people’s rights and freedoms, creating a dangerous imbalance of power between citizens and the state,” says a spokesperson for Big Brother Watch. In 2015, China announced plans to build an integrated citizen-monitoring system by 2020, overseen by “omnipresent, fully networked, always working and fully controllable” cameras. Once the project is fully implemented, citizens will be assigned “social credit” ratings, informed by their day-to-day activities. Facial scanning software has also played a powerful role in the Chinese government’s mass surveillance and detention of Muslim Uighurs: as well as being used to track and reward model citizens, facial recognition can be abused to repress poor and marginalized communities, who are often disproportionately targeted by state surveillance anyway.

This is obviously extreme territory. But the link between facial recognition and subsequent reward or retribution has played out on a smaller scale in the UK: when police were trialling the software in south-east London, they fined a man £90 for refusing to show his face as he passed. This slanting of power raises broader questions. What happens to the presumption of innocence when not wanting to give your biometric data to police as you walk down the street becomes a source of suspicion? If we know we are being watched, and the police are granted heftier powers to punish us if we don’t play along, will that impact the way we interact with public spaces? Will we start to self-censor, and collectively contain our behaviour? Take any nerves you’ve had about your internet history or WhatsApp messages being made public, and then transplant those fears – of being exposed, misunderstood, categorised – to a nondescript Christmas shopping session in Cardiff.

“What people tend to forget, as they thrash out the ethical implications of facial recognition, is the extent to which it has already slipped, seamlessly, into our everyday lives.”

It’s not just states that are trialling facial recognition. Private companies – subject to less accountability – are starting to rapidly roll out the technology, often working in concert with authorities. “The power that live facial recognition cameras give to an organisation can so easily be abused. We’re already hearing about private companies creating ‘blacklists’ of individuals they don’t want in their shops, bars or businesses, which people can find themselves on without having done anything unlawful,” adds Big Brother Watch.

Last year, residents of Atlantic Plaza Towers, a rent-stabilised apartment block in New York, discovered their landlord planned to replace their key fob system with facial recognition technology. The apparent aim was to modernise the building’s security system, but, as The Guardian reports, some residents suspected the move was linked to gentrification, and an attempt to lure wealthier white residents to the block, whose inhabitants were largely black. More than 130 tenants have filed a complaint with the state to try and block the move. “We do not want to be tagged like animals,” Icemae Downes, who has lived in the block for 51 years, told the paper. “We are not animals. We should be able to freely come in and out of our development without you tracking every movement.”

What people tend to forget, as they thrash out the ethical implications of facial recognition, is the extent to which it has already slipped, seamlessly, into our everyday lives. Scanning other people’s faces is central to human relationships – we’re expert at detecting fear, forgiveness, boredom or disappointment from the minute widening, softening, glazing or narrowing of an eye. The success of this loosely regulated, opaque, billion-dollar industry, though, rests on its ability to get us addicted to our own faces and, at the same time, convince us to hand over troves of data. And it has worked; we’re hooked. The shock-horror of selfies quietly melted into dopamine-laced Instagram posts. iPhone X users unlock their phones via a software system that connects more than 30,000 invisible dots to create a facial depth map. Experimenting with makeup on an app, Snapchatting with a snuffling bunny nose filter, fortifying homes with a doorbell that recognizes families: these innovations are meant to help us live a simpler, more connected life, seductively spliced with a hit of narcissism.

But what are the stakes of apparently biased algorithms flattening our nuance and using surface scans of our faces as a resource? What kind of face will be deemed, and commodified, as aspirational or authentic? In contrast, what features will be deemed suspicious? Obviously, clichés already exist: but could facial recognition magnify such categorisations?

“The UK’s intelligence agency, GCHQ, collected images from millions of internet users’ webcams between 2008 and 2012, and used them to create and test facial recognition technology.”

There are instances when the public might be uncomfortable to learn that the technologies they are using for ease and entertainment could be driving the concerns raised above. Although there is a difference between the police scanning hundreds of faces in a crowd, and a person, say, choosing to unlock their phone with their face, engagement with technology doesn’t always translate to robust consent, or a comprehensive understanding of how it works.

A key point here is that companies need billions of images to feed and train their algorithms. So where do those images come from? According to Big Brother Watch, one source has been our webcams. “The UK’s intelligence agency, GCHQ, collected images from millions of internet users’ webcams between 2008 and 2012, and used them to create and test facial recognition technology (OPTIC NERVE). People should be aware how any photos they use online or on phone apps might be used by the website or app provider.” And then there is the case of Ever, a cloud storage app that marketed itself as “helping you capture and rediscover your life’s memories.” The company did not advertise that the millions of photos users uploaded were used to train its facial recognition software, which it aimed to sell on to the military, law enforcement and private companies.

Despite decades’ worth of debate about technology and surveillance – 1984 was published in 1949 – facial recognition has crept up fast and loose. “The current narratives around AI are often about fear and control but this isn’t the whole story. If we let these dominate too much, we are likely to become passive or antagonistic to the very technology which we play a large part in creating,” says Livingston. “As a species, we’ve been living in a tight box for a long time.” But when the technology is used without debate, transparency, or even a skeletal framework, it is hard to see how it can equate to freedom rather than fear.

