Thursday, December 17, 2015

At the station, software to detect “suspicious behavior” – Le Monde

Soldiers at Austerlitz station in Paris, November 14th.

The SNCF has announced that it is experimenting with technologies to detect suspicious behavior through its CCTV cameras, while calling for new powers to be assigned to its security agents, as provided for in a bill currently being debated in the National Assembly.

Faced with the “exceptional character” of the terrorist threat after the attacks in Paris, the SNCF is testing behavioral analysis software that could be integrated with its 40,000 surveillance cameras, the public company’s general secretary, Stéphane Volant, told AFP. Mr. Volant also detailed these announcements in a message posted on SoundCloud:

The software is based on various criteria, such as “changes in body temperature, a raised voice, or the jerky character of gestures, which can indicate a certain anxiety,” and also provides for the detection of abandoned luggage or parcels.

Technically complex

Software of this type has existed for several years and is used mainly in airports, in North America but also in Europe, for example at Schiphol (Netherlands). It usually combines several criteria, ranging from body temperature, analyzed by infrared cameras, which rises with nervousness, to the analysis of facial expressions associated with stress or anger. The gait of passersby, their posture, the way they nod their heads, or the movement of their eyes can also flag “suspicious behavior.”
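As a rough illustration of how such multi-criteria fusion might work, here is a minimal sketch in Python. Every name, weight, and threshold below is a hypothetical assumption chosen for the example; none comes from the SNCF trial or any deployed product.

```python
from dataclasses import dataclass

@dataclass
class FrameSignals:
    """Hypothetical per-person scores in [0, 1] from separate analyzers."""
    body_temp_anomaly: float  # infrared camera: deviation from a baseline temperature
    facial_stress: float      # expression analysis: likelihood of stress or anger
    gait_anomaly: float       # posture/gait analysis: jerky or unusual movement
    gaze_anomaly: float       # head/eye movement: rapid scanning, avoidance, etc.

# Illustrative weights (assumed); a real system would tune these per site.
WEIGHTS = {
    "body_temp_anomaly": 0.30,
    "facial_stress": 0.30,
    "gait_anomaly": 0.25,
    "gaze_anomaly": 0.15,
}

def suspicion_score(signals: FrameSignals) -> float:
    """Fuse the individual criteria into a single weighted score in [0, 1]."""
    return sum(getattr(signals, name) * weight for name, weight in WEIGHTS.items())

def should_alert(signals: FrameSignals, threshold: float = 0.6) -> bool:
    """Flag the person for review by a human operator when the score is high."""
    return suspicion_score(signals) >= threshold

# Example: a person with elevated temperature and visible stress gets flagged.
person = FrameSignals(body_temp_anomaly=0.8, facial_stress=0.7,
                      gait_anomaly=0.5, gaze_anomaly=0.2)
print(suspicion_score(person), should_alert(person))
```

The key design point is that no single criterion triggers an alert on its own; the system only flags a person when several weak signals coincide, and even then the decision is handed to a human.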

Read also: The harsh reality of security officers at Roissy airport

But these signals are fuzzy, complex to analyze, and highly variable from one person to another. These detection systems therefore produce a very large number of “false positives”: behavior flagged as abnormal that is in fact entirely ordinary. Detection algorithms are also quickly thrown off by changes in brightness or by a large number of stimuli, notably in heavily frequented public places such as airports or train stations. They can therefore only be used under the supervision of human security officers, who “sort” the machine’s alerts. A 2001 study from the British university Queen Mary (pdf) summed it up:

Existing technologies suffer from a high false-positive rate, an oversensitivity to changes in the visual context because of very rigid rules, and poor adaptation to crowded environments.

Recent technical progress, particularly in facial recognition, has lessened some of these problems but is far from having solved them.
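A quick back-of-the-envelope calculation shows why the false-positive problem dominates in a busy station. All of the figures below are assumptions chosen for the example, not measurements from any real system:

```python
# Illustrative base-rate arithmetic; every number here is an assumption.
passersby_per_day = 100_000   # footfall of a large station
true_threats = 1              # genuinely dangerous individuals among them
false_positive_rate = 0.01    # system wrongly flags 1% of ordinary people
true_positive_rate = 0.90     # system catches 90% of real threats

false_alerts = (passersby_per_day - true_threats) * false_positive_rate
true_alerts = true_threats * true_positive_rate

print(f"False alerts per day: {false_alerts:.0f}")   # ~1,000
print(f"True alerts per day:  {true_alerts:.1f}")    # under 1
print(f"Share of alerts that are real: {true_alerts / (true_alerts + false_alerts):.4%}")
```

Even with an optimistic 1% false-positive rate, roughly a thousand ordinary travelers would be flagged for every real threat, which is why such alerts only make sense as leads for human officers to triage.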

“Emotion detection”

Above all, these particularly intrusive “emotion detection” techniques worry defenders of human rights and privacy. In Britain, where the number of CCTV cameras has exploded over the last decade, the official in charge of overseeing surveillance tools expressed concern, in January 2015, about the development of these technologies, which are increasingly used in the private sector.

“I’m not against CCTV,” he told the Guardian.

“But the lack of public information on these issues troubles me. When people say that the public loves CCTV, do they really know what these cameras are and what they are capable of? Do they know that, with the development of technology, they are beginning to predict behavior?”

Many observers also point out that while video surveillance contributes significantly to solving investigations, it remains ineffective at deterring crime or detecting it before it happens. Multiplying detection algorithms also runs the risk of sliding into a system of “preventive justice,” leading to the questioning of people who have committed no crime, with attendant risks of discrimination and error. In Britain, a facial recognition program called “Facewatch” shares CCTV footage from 10,000 shopkeepers with the police. “This idea that you can stop a crime before it is committed is worrying. Are we not all innocent until proven guilty?” worries Big Brother Watch, an NGO that defends privacy.

A test intended to continue over time

The SNCF, which is running the test under the oversight of the National Commission on Informatics and Liberties (CNIL), has already announced its intention to continue the test over time, and not to limit it to the risk of attack. “We are testing whether it can identify people with negative intent, an attacker or a ‘jobber,’ but also its social acceptability”: whether travelers are willing to accept such technology once the state of emergency is lifted. In the spring, an application should be launched allowing travelers to raise an alert from their smartphones in the event of suspicious behavior.

The SNCF is also considering the possibility of equipping its agents with portable cameras. These could serve both to identify fraud or suspicious behavior and, if necessary, to verify after the fact that agents’ actions comply with the SNCF’s code of ethics and professional conduct, and with the law.

Such measures are already in place in the UK, where more and more police and security officers, in transportation, in stores, or on university campuses, are equipped with “bodycams,” GoPro-style cameras fixed to their uniforms. “If more and more people are walking around with surveillance equipment, they must have a compelling reason to do so,” the British surveillance commissioner argued, also in the Guardian.

“This changes the nature of our society and raises moral and ethical questions about the kind of society we want to live in.”

Furthermore, the SNCF supports the bill on transportation security currently under consideration in the National Assembly. The text would allow SNCF and RATP security officers to conduct security pat-downs or luggage searches with passengers’ consent.

The Defender of Rights has expressed “serious reservations about some key provisions” of this bill, which, in his view, entrust “public security missions to private security agents” of the SNCF and RATP. “The Defender of Rights is playing its role,” reacted Stéphane Volant, emphasizing the SNCF’s code of ethics and the importance of the partnership established with SOS Racisme to oversee any such searches and pat-downs.

