
Test facial recognition

Jefferson Chase
August 24, 2017

Privacy activists hate the idea of using biometric ID software in public places. But DW's Jefferson Chase says that it's better to test whether it can help prevent terrorism and major crime than to reject it out of hand.

https://p.dw.com/p/2ilDB
Image: Face-tracking software (NDR)

There's a part of me that would love to join in the chorus of those who want to scupper the tests of facial-recognition security software at a Berlin train station. There's something seductive about identifying a conspiracy theory in which malevolent forces "from above" seek to control people's lives. Only in this case there's no conspiracy. How could there be? Everything in this process is out in the open.

Earlier this year, with no shortage of publicity, Berlin police found volunteers to participate in a test of a prototype facial-recognition system at Südkreuz station. The system seeks to match images of people on CCTV cameras with pictures of the volunteers in a test database. Volunteers also wear transponders providing information about their whereabouts. Comparing the two sets of data will give a good indication of whether the technology is of any use.

DW's Jefferson Chase says facial-recognition software should be given a fair chance

Criticism has arisen because the transponders apparently collect a bit more information than originally intended, but there's been no suggestion that authorities are trying to use any of this data, and none of the volunteers has withdrawn from the project. I covered this story when the project was launched a few months back and asked a group of volunteers whether they had privacy concerns. None of them did. On the contrary, they were curious about whether the technology would work and eager to contribute to what they see as a good cause.

Naïve, the activists might say, but is it? Most of us are completely unaware of the types and volume of data that mobile phones and social media collect and share about us. Yet how many of us think twice about using GPS to find our way, or about checking messages on Facebook and email accounts hosted by private companies? Compared to that, participating in a highly public test monitored by journalists, politicians and watchdog organizations seems pretty safe to me.

Intelligent video surveillance

And for a pair of reasons, it's important that the tests continue. After the terrorist attack on a Christmas market in Berlin last December and the subsequent manhunt for the attacker, right-wing populists accused the German government of failing to do everything in its power to prevent terrorism. A rejection of biometric identification technology on general principle would feed such accusations. Testing the technology, on the other hand, shows that the government is trying out various ways to protect people against terrorism.

Experiments like the one at Südkreuz station are also in the public interest because they reveal the limitations and potential abuses of facial recognition systems. Critics say the technology won't work and is prone to mistaken identifications. If that's so, isn't it better to find out and confirm that in a test environment?

When asked about the technology while touring the station on Thursday, German Interior Minister Thomas de Maizière spoke about it in terms of an experiment, saying that if the technology worked, it would be "an unbelievable plus for security in Germany."

I take de Maizière at his word. This experiment may fail. The technology may prove useless or incompatible with individuals' right to privacy in Germany. If so, let's find out. I'm very glad that watchdog groups are keeping close tabs on what's going on at Südkreuz, and I look forward to the debate once the results of the experiment are in. Canceling it would only deprive the public of the sort of information it needs to come to a consensus about this potentially beneficial, potentially harmful technology.