TOKYO -- East Japan Railway Co.'s (JR East) temporary use of a system enabling its security cameras to recognize faces, including those of individuals released from prison, has caused a stir. Although the system contributes to greater public safety and assists criminal investigations, it raises privacy concerns, and there are fears it could have a chilling effect on society.
In July, ahead of the Tokyo Olympic and Paralympic Games, JR East installed security cameras capable of detecting suspicious behavior in major stations and other locations, and announced that it would inspect people's belongings when deemed necessary.
But it did not reveal that the system would also be programmed to detect individuals wanted by police on suspicion of crimes, as well as people on full or temporary release from prison over terrorist acts and other serious incidents affecting JR East or its passengers.
The information on individuals released from prison was obtained from the Public Prosecutors Office under the Victims of Crime Notification System. If a person matching a registered individual were captured by the AI-equipped cameras, the system would automatically detect them. In the end, however, the plan to register this information was scrapped before it was implemented.
Regardless, detection of people behaving suspiciously and of those wanted by police is ongoing. Of the 8,350 cameras mounted in major stations and elsewhere, JR East has not revealed how many have the AI system installed or which ones use it.
How to ensure safety on public transportation has become an urgent issue. In August, 10 passengers were injured, some seriously, in a knife attack on an Odakyu Line train in Tokyo. Damage avoidance and reduction plans compiled by the Ministry of Land, Infrastructure, Transport and Tourism in September included efforts to make security cameras' ability to detect suspicious individuals and objects more sophisticated. Even so, in October another 17 passengers were hurt in an attack on a Keio Line train in Tokyo. No solution that ensures both convenience and safety has yet been found.
Despite this, railway firms' expectations for the technology remain high. On Sept. 21, JR East revealed that it would narrow the range of individuals subject to its crime prevention measures using face-recognition security cameras. But the company has also said, "It is possible we will review this again in response to changes in the state of society."
Central Japan Railway Co. (JR Central), operator of the Tokaido Shinkansen, said of such measures: "We intend to continue with careful investigations while taking privacy and other issues into consideration."
These security cameras translate the characteristics of people's faces into data and compare that information against facial data already registered in the system. The system can identify an individual from a distance without their knowledge. But because facial recognition can also be used to track people's interactions, movements and purchasing history, there are concerns that, regardless of whether someone has been released from prison, it could infringe on people's rights and have a chilling effect on society.
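In broad terms, such systems reduce a face image to a numeric feature vector (an "embedding") and compare it against registered vectors. The sketch below is purely illustrative, assuming hypothetical embeddings and a made-up similarity threshold; it reflects the general technique, not any detail of JR East's actual system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_watchlist(probe, watchlist, threshold=0.8):
    """Return IDs of registered faces whose similarity to the probe
    embedding meets or exceeds the threshold (threshold is illustrative)."""
    return [face_id for face_id, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy 3-dimensional "embeddings"; real systems use vectors with
# hundreds of dimensions produced by a neural network.
watchlist = {
    "registered_A": [1.0, 0.0, 0.0],
    "registered_B": [0.0, 1.0, 0.0],
}
probe = [0.9, 0.1, 0.0]  # a camera capture resembling registered_A
print(match_watchlist(probe, watchlist))  # prints ['registered_A']
```

The threshold embodies the system's trade-off: lowering it catches more true matches but also flags more innocent passengers, which is part of why such deployments raise the rights concerns described above.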
The Act on the Protection of Personal Information does not require an individual's consent for the acquisition of data describing their facial features. Still, the government's Personal Information Protection Commission, which oversees the law's implementation, states on its website that "there must be notifications and announcements on the aim of use" when security cameras with facial recognition capabilities are installed.
Is JR East's response appropriate? Regarding JR East notifications at stations that read "Security System (Facial Recognition System) in Operation," professor Taro Komukai, a specialist in information law at Chuo University, told the Mainichi Shimbun: "There needs to be a clearly defined aim such as 'We may detect people who have caused problems at stations.'"
Regarding information on people released or temporarily released from prison, the professor was of the view that "if it can be limited to use as a means of early prevention of incidents to protect the lives of passengers and station facilities as assets, then it may be a legal exception that is unavoidable." But he added, "If it goes beyond that aim and begins sharing information with investigative institutions, then it becomes problematic."
In the U.S. and Europe, the acquisition of facial recognition data is handled notably strictly. The EU's General Data Protection Regulation forbids the use of data on a person's body, including their facial characteristics, without the individual's consent. According to reports, in Spain this year, retailers and other entities were fined for identifying customers under surveillance by matching biometric data against information on people released from prison.
(Japanese original by Ken Aoshima and Shotaro Kinoshita, Tokyo City News Department)