‘Xinjiang is the first great model in the era of digital mass surveillance. Nothing like it has ever been seen’
Professor Darren Byler has been investigating China’s treatment of the Uyghur minority in Xinjiang for more than a decade. Mobile phones and facial recognition have become tools of widespread state surveillance
Vera Zhou was crossing a street in mid-2019 in Kuitun, a small city in the Xinjiang region of northwest China, when she felt a tap on her shoulder. It was a police officer. The man took Zhou to the station, where she saw a screen showing a high-definition shot of the city crowd, her face framed by a yellow rectangle. All the other faces were outlined in green. Zhou had strayed outside the area she was confined to as a former detainee of a re-education camp, and a facial recognition system had picked her out among thousands.
It was the second time that Zhou had been stopped for a “precrime,” on the suspicion that she might end up committing a real crime. The first was in 2017, when she returned to China to see her boyfriend while studying geography at the University of Washington. That time the police arrested her for using a VPN: from China, she could only check her university email or social media through the service, which simulates a connection from another country. VPNs are illegal in China, but the authorities had never punished their users, until then.
Professor Darren Byler recounts Zhou’s story in his book “In the Camps.” Byler has traveled to Xinjiang several times over the last two decades and has watched the repression of Uyghurs in China intensify. In the last five years, technology has come to play a central role in the persecution. “It is the first great model of the era of digital mass surveillance. Nothing like it has ever been seen,” he told EL PAÍS in a video call from Vancouver, Canada, where he teaches at Simon Fraser University.
The book offers a look inside a regime that forces inhabitants to carry a device that spies on what they read, what they search for, what they look at, what they say and where they are. The level of control is extraordinary: surveillance is no longer just a matter of elaborate algorithms that predict future suspects. Citizens become suspects simply for downloading files related to religion or for using WhatsApp, an application barely used in China, where the local alternative is WeChat.
“There are 75 signs, but the 75 signs are quite broad. Possessing religious material or possessing political material is one of the signs, and that could be lots and lots of different things,” Byler explains. “They are looking for between 50,000 and 70,000 different markers. But my sense is that they’re looking for probability or partial matches as well. So it would extend much beyond that, to 75,000 to millions in some cases.”
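To make that logic concrete: a marker scan of the kind Byler describes, tens of thousands of signs checked with partial matches that accumulate into a suspicion score, might look roughly like the sketch below. The marker names, weights and threshold are invented for illustration and are not drawn from the actual system.

```python
# Illustrative sketch only: a toy version of the "marker scan" Byler describes.
# Marker names, weights and the flag threshold are invented for illustration.

from dataclasses import dataclass


@dataclass
class Marker:
    name: str      # e.g. a banned app name or a keyword found in a file name
    weight: float  # how much this marker adds to a cumulative suspicion score


# Hypothetical marker list; the real list reportedly runs to tens of thousands.
MARKERS = [
    Marker("whatsapp", 3.0),
    Marker("vpn", 2.5),
    Marker("religious_text", 2.0),
]


def scan_device(file_names: list[str], threshold: float = 4.0) -> tuple[float, bool]:
    """Return a cumulative score and whether it crosses the (hypothetical) flag threshold.

    Partial, case-insensitive matches count, mirroring Byler's point that the system
    looks for "probability or partial matches" rather than exact hits only.
    """
    score = 0.0
    for file_name in file_names:
        lowered = file_name.lower()
        for marker in MARKERS:
            if marker.name in lowered:  # substring match = partial match
                score += marker.weight
    return score, score >= threshold


# Example: traces of two "markers" on one phone cross the toy threshold.
score, flagged = scan_device(["WhatsApp.apk.bak", "notes_religious_text.pdf"])
print(score, flagged)  # 5.0 True
```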
Such broad searches have contributed to the fact that there are now more than 1.5 million Uyghurs in re-education camps. Byler collects testimonies about the horrifying life inside those camps: unsanitary conditions, abuse, torture and inexplicable punishments. The surveillance network includes cameras and microphones capable of picking up whispers.
“The relative of someone I know was detained because they had pictures of young Muslim women wearing hijabs. It was like a meme that she sent during the celebration of Ramadan about how you should pray and be devout or fast during Ramadan. Very innocuous, something that millions of Muslims everywhere in the world would share,” Byler says. The dangers are so pervasive that sometimes they depend simply on who previously used a phone. Device scans detect old or deleted files: “They criminalize past behavior, things people did years before, or even previous owners of the phone. If you buy a second-hand phone, you don’t know what the owner was doing before. So if a previous owner had installed WhatsApp, it will show up in one of these scans,” he says.
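How could a scan surface apps that were deleted or that belonged to a previous owner? One plausible, simplified mechanism is looking for the folders such apps leave behind, as in the hypothetical sketch below; the leftover paths and the second package identifier are invented stand-ins.

```python
# Illustrative sketch only: flagging traces of apps that are no longer installed,
# for example on a second-hand phone. Residual paths here are hypothetical examples.

import os

# Folders that some apps are assumed to leave behind after being uninstalled
# (the second package id is a made-up placeholder for a file-sharing app).
RESIDUE_HINTS = {
    "com.whatsapp": ["WhatsApp", "Android/media/com.whatsapp"],
    "example.filesharing.app": ["FileShare", "Android/data/example.filesharing.app"],
}


def find_residual_apps(storage_root: str) -> set[str]:
    """Return package names whose leftover folders still exist under storage_root.

    The point Byler makes: even if the app is gone, its traces remain, so a scan
    can flag the current owner for a previous owner's activity.
    """
    found = set()
    for package, hints in RESIDUE_HINTS.items():
        for hint in hints:
            if os.path.isdir(os.path.join(storage_root, hint)):
                found.add(package)
                break
    return found


# On a second-hand phone this might print {"com.whatsapp"} even though the app is gone.
print(find_residual_apps("/sdcard"))
```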
During Byler’s first visits, when mobile phones and 3G had only just become widespread, the checks were manual: an officer opened a user’s applications and looked at what was there. It was ineffective. The authorities then moved to a USB cable-based system: phones are plugged in at police checkpoints around the city and their storage is combed with mass data-analysis tools.
“Over a million and a half people are detected to have used some kind of app that was now illegalized. The most prevalent kind of app was called Zapya, which on an Android phone is like the AirDrop you have with an Apple phone: phone-to-phone file transfer using Bluetooth,” explains Byler. “It doesn’t go through the internet, which means that the state can’t control what people were sharing. So file sharing is something that they’re really interested in. Having that is a marker of suspicion. It wouldn’t necessarily mean that you’d be sent to the camp immediately or only for that. But it’s one of the markers. And if there are other things that show up, either in your digital history or your social history in general, those things cumulatively could lead to being labeled untrustworthy.”
The government also uses an application similar to parental control programs, which sends the phone’s data to government servers. “It is called Clean Net Guard, and it is linked directly to the police database. They’re taking your data from your phone as well. We don’t know for sure if all people have to use that app or not, but there’s a lot of evidence that at least a couple of million people have that kind of app installed on their phone. It’s a spyware system that is permanently installed on your phone,” he adds. During his visit in 2018, Byler saw that some citizens had their devices analyzed via USB while others were only checked to make sure they had the application installed.
One of the tasks of the low-level officers who staff these checkpoints is to leave alone the millions of non-Muslim Chinese who live in Xinjiang. “If you appear to be non-Muslim, it’s just racial profiling based on how you look. They’re just going to wave you through and they won’t check your phone,” he says.
These controls also enable another, more sophisticated level of surveillance: facial recognition is used to verify that each citizen is carrying their own phone, which is identified by its MAC address, a unique hardware identifier. “It lets them know that this person was in this place at this time. And so when they get to the next checkpoint, they’re able to track movement in those ways in a very concrete way,” he says.
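The cross-referencing Byler describes, matching the recognized face with the phone detected beside it and logging each sighting, could be sketched roughly as follows; the registry, identifiers and timestamps are all invented for illustration.

```python
# Illustrative sketch only: pairing a recognized person with a phone's MAC address
# at successive checkpoints, to confirm they carry their own phone and to log movement.
# The registry, IDs and data below are invented.

from collections import defaultdict

# Hypothetical registry linking a person to the MAC address of the phone registered to them.
REGISTERED_PHONE = {"person_042": "AA:BB:CC:DD:EE:01"}

# Movement log: person -> list of (checkpoint, timestamp) sightings.
movements: dict[str, list[tuple[str, str]]] = defaultdict(list)


def record_checkpoint(person_id: str, seen_mac: str, checkpoint: str, timestamp: str) -> bool:
    """Log the sighting and return True if the detected phone matches the registered owner."""
    movements[person_id].append((checkpoint, timestamp))
    return REGISTERED_PHONE.get(person_id) == seen_mac


# Two checkpoints later the trail shows where this person was, when, and whether
# they were carrying someone else's phone.
ok_first = record_checkpoint("person_042", "AA:BB:CC:DD:EE:01", "market_gate", "09:14")
ok_second = record_checkpoint("person_042", "FF:FF:FF:00:00:99", "bus_station", "10:02")
print(movements["person_042"], ok_first, ok_second)
```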
This all-seeing system poses a problem even for China: it takes an enormous amount of work and effort. Its sheer scale gives Byler hope that it won’t be replicated. “There are a number of places in the world that are in the position the Uyghur region was in 2015, where they were just starting to do some scans. You see it in Russia, police checking people’s phones to see what apps they have installed; they’re doing it in Belarus, Egypt, Israel, Palestine, Kashmir,” he says.
China has poured immense resources into creating the Xinjiang system. “It takes a lot of physical infrastructure, checkpoint systems, all those sorts of things. You’d have to hire an army of low-level police to scan people. Even elsewhere in China, I think it would be difficult to implement the system at this scale,” explains Byler.
Because it is such an intrusive system, with so many suspects, sustaining it is complex. China’s justification is its own war on terror, which in the collective imagination began after the 2001 attacks in the US. “They invested around $100 billion in this region. They call it the people’s war on terror. It’s a whole-society approach to counterterrorism,” he says.
How to lead a normal life
Citizens struggle to lead relatively normal lives under the pressure of surveillance. Part of the Chinese government’s goal is for citizens to feel the paranoia that their every move is being tracked; from there, they may come to feel that even their thoughts are under control. Uyghurs must also use their devices to follow nationalist directives and endorse statements from government leaders: they must demonstrate that they are actively pro-government.
“You have to participate in WeChat groups that are oriented towards political loyalty, political ideology. You should document your political activity. For each week, there should be a certain minimum number of posts that you do. There’s a political performance that they’re doing online as a way of showing loyalty. Sometimes they leave their phones behind or turn them off and go to a park near their home, or go to a sauna because it’s not a space where you can bring a phone. There’s also Han people, not Muslims, that live in the region. Some are really opposed to the control, and they help Uyghurs to get information out, let them use their phones because their phones aren’t being tracked in the same way,” Byler explains.
The technology companies that have received this Chinese investment profit economically from their work, but they also benefit from the data: it allows them to train their systems at an unimaginable scale. “For the tech companies, it’s a sort of win-win because they’re getting all of this investment and they have access to all of this data. It’s hard to know exactly what the capacities of these systems are. Obviously they’re sending people to camp or to prison for things that are not crimes. Some of it seems to be sort of accelerated by the demands or stress that’s placed on the local authorities to detain people. They need to find terrorists in order to show that they’re doing a good job, and so they’re manufacturing them,” he says.
In his book, Byler explains that some US companies have also taken advantage of this wealth of systems trained by tracking Uyghurs under threat. Since the book’s publication, he has met with large technology companies such as Google and Microsoft, which have taken measures to avoid contributing to the repression.