China: Police ‘Big Data’ Systems Violate Privacy, Target Dissent
Human Rights Watch, 19 November 2017
By Human Rights Watch – The Chinese government should stop building big data policing platforms that aggregate and analyze massive amounts of citizens’ personal information, Human Rights Watch said today. This abusive “Police Cloud” system is designed to track and predict the activities of activists, dissidents, and ethnic minorities, including those authorities say have “extreme thoughts,” among other functions.
China has no enforceable protections for privacy rights against state surveillance.
“It is frightening that Chinese authorities are collecting and centralizing ever more information about hundreds of millions of ordinary people, identifying persons who deviate from what they determine to be ‘normal thought,’ and then surveilling them,” said Sophie Richardson, China director at Human Rights Watch. “Until China has meaningful privacy rights and an accountable police force, the government should immediately cease these efforts.”
The Chinese government has a long track record of amassing large amounts of information about citizens, and it is now actively exploring new technologies, such as big data analytics and cloud computing-based systems, to more efficiently aggregate and mine personal information. Authorities aspire to connect disparate databases to better enable data sharing and analysis across government departments, national and local levels, and from private sources.
Chinese police are using various applications to analyze large volumes and varieties of data, including text, video, and pictures. These applications can deliver useful analytics in real or near-real time, such as monitoring traffic patterns. Chinese police have said the use of big data will improve the police force’s ability to search for suspects, predict crime, and respond efficiently.
But some of these systems also enable the police to arbitrarily gain unprecedented information about the lives of ordinary people, including those who have no connection to wrongdoing.
One of the Ministry of Public Security’s (MPS) most ambitious and privacy-violating big data projects is the “Police Cloud” (警务云) system. The system scoops up information ranging from people’s medical histories to their supermarket memberships and delivery records, much of which is linked to individuals’ unique national identification numbers. This allows the Police Cloud system to track where individuals have been, whom they are with, and what they have been doing, as well as to make predictions about their future activities. It is designed to uncover relationships between events and people that are “hidden” from the police by analyzing, for example, who has been staying in a hotel or travelling together. It can also alert the police to activity that might seem unusual – such as when someone who has a local residence frequently stays in a local hotel.
The fact that these systems are designed in part to track groups the authorities deem politically or socially threatening raises serious concerns about social and racial profiling. Through predictive policing, these platforms promise to analyze such groups’ past patterns of activity to “alert and warn” the police about their future activities so as to “more effectively intercept” them. Meng Jianzhu, the former Minister of Public Security and current Secretary of the Communist Party Political and Legal Committee, which oversees the Party-state’s police, procuratorates, and courts, said in 2015 that big data is important to “find order… in fragmented information” and “to pinpoint a person’s identity.”
Human Rights Watch has analyzed a number of tender documents from police bureaus in Shandong and Jiangsu Provinces, and Tianjin Municipality, as well as academic and press reports. Human Rights Watch has primarily focused on these three regions because these documents are publicly available; Shandong and Jiangsu Provinces also claim to have some of the most established Police Clouds in the country. The tender documents reviewed for Shandong include those published by police bureaus in the major cities of Jinan, Tai’an, and Weihai; for Jiangsu, Yancheng City; for Tianjin, the document was published by the Tianjin Municipal Public Security Bureau.
These tender documents were published between 2015 and September 2017. The Tianjin Police Cloud – at around US$4 million (27 million RMB) – is the most expensive.
The Police Cloud system appears to be a national project. In 2015, the MPS issued a regulation on information sharing (公安机关信息共享规定), ordering aggregation of data and the construction of provincial-level Police Clouds, which form the basis of a national Police Cloud database.
“As the Police Cloud soaks up ever more data about citizens, a perfect storm is on the horizon,” said Richardson. “With authorities increasingly able to track everyone’s every move, what’s at stake across China isn’t just people’s privacy – it’s also many of the rights they hold.”
Aggregation of citizens’ data from government and business sources
The Police Cloud system aims to integrate different types of information, including data routinely gathered by China’s police, such as residential addresses, family relations, birth control methods, and religious affiliations. The cloud platforms will also integrate hotel, flight and train records, biometrics, CCTV footage, and information from other government departments and even private companies.
In Weihai City, Shandong Province, the Police Cloud aims to integrate 63 types of police data and 115 types of data from 43 other government departments and industries. Among the data the government collects are patient records – including names and illnesses – obtained from the National Health and Family Planning Commission; the names and grievances of petitioners – individuals who complain to the government, usually about official abuses – from the State Bureau of Letters and Visits; and the names and addresses of individuals convicted of crimes from the Bureau of Justice. The Police Cloud will also aggregate company data, including user names and IP addresses from telecom companies; usernames of social media accounts (WeChat, Weibo, QQ, and email) from internet forums; and senders’ and receivers’ names, phone numbers, and declared package contents from delivery companies.
In Xuzhou City, Jiangsu Province, a state press article explained that the police purchase company data from third parties. The information purchased includes “navigation data on the internet, [and] the logistical, purchase and transaction records of major e-commerce companies.” Some of this data is collected in real-time, according to the article: “In the past police officers would go door to door to collect data, and at most they could collect information from 40 to 50 households; now every day the machines collect data continuously.” This data includes MAC addresses (a unique hardware identifier of a computer or other networked device) and router information of Internet users.
Big data systems with intrusive ‘insights’ and predictive policing
The Police Cloud system aims to enable police to “visualize” (可视化) hidden trends and relationships between people in the sea of data. These systems provide interfaces called “Police Qiandu” (警务千度, a play on China’s popular search engine, “Baidu”) in Shandong, and “Sky and Earth E-Search” in Tianjin, which allow officers to search for and monitor individuals, vehicles, and cases of concern. The system can also be trained to alert police to people, relationships, and events of interest based on the data and patterns it analyzes. For example, the Jinan system will alert the police to what seems to them like unusual activity – such as when someone who has a local residence frequently stays in a local hotel – by analyzing collected data on hotel and hostel lodging, vehicle movements, and express deliveries.
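The kind of rule described above can be sketched as a simple flag over lodging records. The sketch below is purely illustrative: the record format, field names, and threshold are invented assumptions, not details from the tender documents.

```python
from collections import Counter

# Hypothetical lodging records: (guest_id, guest_home_city, hotel_city)
checkins = [
    ("p1", "Jinan", "Jinan"),
    ("p1", "Jinan", "Jinan"),
    ("p1", "Jinan", "Jinan"),
    ("p2", "Jinan", "Qingdao"),
    ("p3", "Weihai", "Jinan"),
]

# Count how often each guest stays at a hotel in their own city of residence.
local_stays = Counter(
    guest for guest, home, hotel_city in checkins if home == hotel_city
)

# Flag guests whose local stays exceed an (assumed) threshold, mirroring the
# "local resident frequently staying in a local hotel" rule.
THRESHOLD = 2
flagged = [guest for guest, n in local_stays.items() if n > THRESHOLD]
# flagged == ["p1"]
```

A real system would presumably run such rules continuously over live registration feeds; the point of the sketch is only that a single join between residence and lodging data is enough to generate this kind of alert.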
A key feature of these systems is to discover relationships not otherwise apparent to the authorities. The big data system in Jinan will allow police to search for those who “are closely related to persons of concern.” This means, for example, finding out who “has gone to internet cafes together more than twice, or has travelled together twice” with the persons of interest. Once these relationships are found, police “can further mine and analyze [information about] this related individual,” and the system can also display this data as relationship maps. Similarly, the Tai’an Police Cloud allows the police to visualize relationships by analyzing those “who travel, who live, who work together; who go on the internet; who share the same hukou [China’s household registration system]; who share the same family members; and who are involved in the same case.”
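The “travelled together twice” criterion described above amounts to counting pairwise co-occurrences across events. Here is a minimal sketch of that idea, using invented events and person identifiers; nothing about the actual systems’ data model is known beyond the tender descriptions.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical co-presence events: (event_id, event_type, people present)
events = [
    ("e1", "internet_cafe", {"A", "B"}),
    ("e2", "internet_cafe", {"A", "B", "C"}),
    ("e3", "train", {"A", "C"}),
    ("e4", "train", {"A", "C"}),
]

# Count how many times each pair of people appears in the same event.
pair_counts = defaultdict(int)
for _, _, people in events:
    for pair in combinations(sorted(people), 2):
        pair_counts[pair] += 1

# Pairs seen together at least twice, echoing the "gone to internet cafes
# together more than twice, or has travelled together twice" criterion.
linked = {pair for pair, n in pair_counts.items() if n >= 2}
# linked == {("A", "B"), ("A", "C")}
```

The same pair counts, drawn as edges weighted by frequency, would yield the “relationship maps” the tender documents describe.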
The Police Cloud system is also designed for surveilling groups of people the police are most concerned about, such as those the government considers to be most threatening to regime stability. The MPS defines these “seven categories of ‘focus personnel’”: petitioners, those who “undermine stability,” those who are involved in terrorism, major criminals, those involved with drugs, wanted persons, and those with mental health problems who “tend to cause disturbances.”
In effect, local police can decide that virtually anyone is a threat and requires greater surveillance, especially if they are seen to be undermining stability. There are no legal avenues for people to be notified of this designation, or to contest it. A tender document from Tianjin describes its Police Cloud as capable of monitoring “people of certain ethnicity,” “people who have extreme thoughts,” “petitioners who are extremely [persistent],” and “Uyghurs from South Xinjiang.” Xinjiang is a region with 11 million minority Muslims called Uyghurs, most of whom live in the south of the region and whose rights are heavily repressed. The Tianjin document says the system can pinpoint the residences of these individuals and track their movements on maps. In Yancheng City, Jiangsu Province, the Police Cloud can “give rapid feedback” on the trajectory of “vehicles driven by focus personnel, by those involved in drugs… and vehicles [registered in] Xinjiang” on a map.
Another feature of these systems is so-called “predictive policing.” By analyzing past crime data, behaviors, and movements, the system purports to predict future criminal activities. Police may want to use these systems, for example, to target certain locations at the times when the behaviors of interest are most likely to take place.
The tender document of the Jinan Police Cloud says it will “analyze the focus personnel who have come to Jinan and their cases…watch them and warn [the police] of [their presence] according to combinations of characteristics including ethnicity, criminal offense records, and others.” On the basis of such analysis, this system will generate and send analytics to handheld devices of police officers on a daily or weekly basis.
Officers reportedly receive some of these analytics from a Shandong Police Cloud every morning at 8:00 a.m. According to an article in Shandong Legal Daily:
“…The ‘Morning at 8’ system… aggregates information from the area from the day before, including information about cases, ethnicity, hometowns, as well as information from police intelligence, hotel and hostel [registration], Internet cafes, civil aviation and other systems. The cloud system then analyzes this information for abnormality and trends… and sends them to the police officers’ mobile phones every morning at eight o’clock.”
This system is designed to catch “focus personnel.” The article quotes an officer as saying:
“Every day at eight o’clock… based on our location and subscription options, the system sends us targeted messages; in particular, it alerts us to individuals who are involved in terrorism and in [undermining] social stability who’ve entered our jurisdictions.”
The article illustrates such predictive policing with one example:
“At 12:54 on September 17, 2015, a drug user named Mai (买某) went to stay at… a hotel in Dongying District, and officers from the Yellow River Police Station went to check on him after receiving an SMS alert from the ‘Morning at 8’ system. After investigation, [the police discovered] that Mai’s sister, named A (阿某), had been punished for ‘endangering state security.’ According to the relevant mechanism, police then subjected Mai to key [personnel] monitoring.”
The system seems to have alerted the police because Mai had been listed as having a record of drug use. The article goes on to say that the system can “find out where ethnic minorities and those with a criminal record gather and stay for the long term,” information which guides police patrols.
The predictive policing programs also purport to track people involved in terrorism. Human Rights Watch has criticized the Chinese authorities for using terrorism allegations to justify the suppression of peaceful dissent, particularly against Uyghurs.
As the Chinese police continue to build these systems, there are many challenges to their efficacy and functionality. According to academics, the police have encountered resistance from other government agencies when collecting information; frontline officers have not been diligent enough in collecting useful or complete information; the information collected under various programs is inconsistent and difficult to reconcile; and few officers have the skills to conduct big data analysis for their work.
Two big data experts who reviewed this press release also suggested that China’s police may not yet have enough data points to track large numbers of people in real time. Chinese police have access to all hotel, flight, and train records. According to available evidence, China is developing facial recognition and automatic number plate recognition on CCTV footage to improve its ability to track people. The police also have access to stored location data held by telecom, mobile, and internet providers, but it is less clear whether they have access to ongoing data streams. An added challenge is that this kind of tracking imposes heavy demands on data storage, computing, and analytical resources.
Big data policing in Chinese law and international law
In recent years, the Chinese government and the Communist Party have issued a number of directives and regulations on the collection, integration, and sharing of data to improve “social stability” – a euphemism for creating an outward appearance of calm by suppressing crime as well as political dissent. In 2014, the MPS issued a notice (关于做好公安“十三五”规划编制工作的补充通知) about the construction of a smart personal data collection system between 2016 and 2020 that can “strengthen the ability to issue early warning concerning the abnormal behavior of key personnel.” In 2015, an MPS leadership meeting adopted a set of principles on “Strengthening the Collection of Basic [Policing] Information” (关于大力推进基础信息化建设的意见), and vowed to increase the use of big data and cloud computing in policing. Also in 2015, the Office of the Central Committee of the CCP and the General Office of the State Council issued an “Opinion on the Strengthening of the Construction of Social Security Prevention and Control Systems (关于加强社会治安防控体系建设的意见),” which includes using technological means including cloud computing and big data to achieve “social stability.” There are also provincial-level directives; Shandong police issued several directives in 2014 on integrating data and building a Police Cloud.
Current Chinese laws also do not meet international privacy standards enshrined in the International Covenant on Civil and Political Rights, which China has signed but not ratified. Those standards require that the collection, retention, and use of individuals’ personal data for policing purposes be both a necessary and proportionate means of addressing a genuine threat to a public interest such as national security or public order, in the sense that it is the least intrusive measure available to accomplish that end.
China does not have a unified privacy or data protection law to protect personally identifying information from misuse, especially by the government. The police do not have to obtain any sort of court order to conduct surveillance, or provide any evidence that the people whose data they are collecting are associated with or involved in criminal activity. Police bureaus are not required to report surveillance activities to any other government agency or to publicly disclose this information. In practice, there are no effective privacy protections against government surveillance. It is very difficult for citizens to know what personal information the government collects, and how the government uses, shares, or stores their data. There is no way for citizens to know if they are being classified as “focus personnel,” much less to challenge their treatment if so classified, or if associated with people designated “focus personnel.” Those who try to investigate government surveillance are vulnerable to being charged with crimes including “stealing state secrets.”
The government, however, has a number of laws that empower state agencies and private companies to collect and use information concerning citizens, and government departments as well as local governments have issued numerous directives, rules, and regulations to collect and use miscellaneous information. State security-related legislation, such as the State Security Law, invests police and other state security agents with the broad power “to collect intelligence involving state security.” The Cybersecurity Law, while imposing requirements on network operators to keep user data confidential and to get consent before collecting it, also compels internet companies to store user data in China and provide undefined “technical support” to security agencies to aid in investigations.
The government’s use of big data and predictive policing exacerbates already widespread violations of the right to privacy in China. Policing algorithms and big data analytics rely on large datasets. As more police departments build cloud-based policing systems, they collect more and more personal data, including through their own increased surveillance activities and through cooperation with the private sector. As conceived, the Police Cloud system will lead to enormous national and regional databases containing sensitive information on broad swaths of the population, which could be kept indefinitely and used for unforeseen future purposes. Such disproportionate practices will intrude on the privacy of hundreds of millions of people – the vast majority of whom will not be suspected of crime.
Also directly at risk are the rights to be presumed innocent until proven guilty, and freedom of association. By the government’s own description, these systems are designed to track, monitor, and potentially detain and prosecute “focus personnel” and anyone who may travel, meet, or communicate with them. Such predictive policing systems will thus place individuals under suspicion and surveillance merely because they have associated with “persons of concern.”
Big data analytics that rely on social media monitoring and the aggregation of online activity can also further chill freedom of expression. If users fear that their every weibo post or chat will be used to determine whether they are a threat to security, it may increase self-censorship online.
These systems are also likely to have a discriminatory impact on ethnic minorities and other groups. This is in part by design: the government severely represses the ethnic minority Uyghur population as part of its counterterrorism campaign. Tender documents show that the Police Cloud system is in part specifically designed to monitor Uyghurs and “people of certain ethnicities.”
In addition to the human rights concerns mentioned above, serious questions remain as to whether predictive policing tools can reliably direct police attention to areas at highest risk for crime. In the US, for example, various police departments have begun using predictive policing systems to locate crime “hotspots” or individuals who are most likely to become involved in crime. But these systems are trained on data from past police reports, which may not reflect patterns of actual risk. The result is that they simply predict the targets at which police enforcement actions have typically been directed, not the places or people actually most involved in crime.
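This feedback loop can be made concrete with a toy calculation. The numbers below are entirely invented: assume crime is equally likely in two districts, but past patrols concentrated on one, so recorded incidents (crime is only recorded where police are present) skew toward it.

```python
# Hypothetical ground truth: crime is equally likely in both districts.
true_crime_rate = {"north": 0.5, "south": 0.5}

# Historical patrol allocation (assumed): most patrols went north.
patrol_share = {"north": 0.8, "south": 0.2}

# Recorded incidents are roughly proportional to crime times patrol presence,
# since police mostly record what happens where they already are.
recorded = {d: true_crime_rate[d] * patrol_share[d] for d in true_crime_rate}

# A "predictive" model trained only on records sends future patrols wherever
# past records are densest, reproducing the old allocation, not the true risk.
predicted_hotspot = max(recorded, key=recorded.get)
# predicted_hotspot == "north", even though both districts are equally risky
```

Under these assumptions the model confidently names a “hotspot” that reflects nothing but where police looked before, which is the core of the critique above.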
Preventing crime is a legitimate state interest, but predictive tools often point to the same old patterns, making it likely that policing will replicate old mistakes and biases, such as the targeting of people of lower socioeconomic status. This throws into doubt whether the use of these predictive tools adds much that is new, and whether they are either a necessary or a proportionate intrusion on the rights of individuals.