As I See It: Orwellian
May 24, 2021 Victor Rozek
In last month’s article, I expressed ambivalence about our increasing reliance on technology and its growing dominance even as it slips ever further from our control. The challenge presented by a host of computer technologies is how to maximize their benefits while minimizing their potential for harm. And the more powerful the technology, the greater the temptation to weaponize it.
I closed the article with the following statement:
We are already using artificial intelligence on an enterprise scale. It won’t be long before it’s used on a planetary scale. Computer intelligence is evolving much quicker than human intelligence. According to IEEE Spectrum, “A quantum computer with 300 mutually entangled qubits could theoretically perform more calculations in an instant than there are atoms in the visible universe.”
What could possibly go wrong?
Well, it turns out we didn’t have to wait long for that answer. Things are already going horrifically wrong in China, where a surveillance state of unimaginable proportions is being assembled.
Ross Andersen, a deputy editor of The Atlantic, traveled to China to research an in-depth article on artificial intelligence. In recent years, China has invested heavily in an attempt to catch and surpass America's lead in developing AI. Andersen's concern is that, depending on who, if anyone, comes to dominate the technology, the entire planet will have to eat whatever is baked into the AI pie.
The core of what he discovered is that Xi Jinping, China's current dear leader, has authoritarian ambitions far beyond veining his country with CCTV-type monitoring. Xi, Andersen asserts, “wants to use AI’s awesome analytical powers to push China to the cutting edge of surveillance. He wants to build an all-seeing digital system of social control, patrolled by precog algorithms that identify potential dissenters in real time.”
Under the system everyone runs the risk of being labeled an enemy of the state simply because an algorithm determines they are. Not for some overt action or public protest, but because of a book they’ve purchased, or a person they were seen with, or the fact that an app on their phone did not register sufficient praise for Xi’s latest proclamation.
“In the near future,” Andersen writes, “every person who enters a public space could be identified, instantly, by AI matching them to an ocean of personal data, including their every text communication, and their body’s one-of-a-kind protein-construction schema. In time, algorithms will be able to string together data points from a broad range of sources – travel records, friends and associates, reading habits, purchases – to predict political resistance before it happens. China’s government could soon achieve an unprecedented political stranglehold on more than 1 billion people.”
China is already testing early versions of such a system in the northwestern region of Xinjiang on the hapless Uighurs, a Muslim minority, more than a million of whom have been imprisoned and subjected to physical and psychological abuse. Their every action is recorded by a system known as Sharp Eyes that begins with ubiquitous face recognition surveillance. In fact, Chinese companies are developing systems they claim can identify people even when they are wearing masks.
Uighurs are forced to install nanny apps on their phones that use algorithms to detect “ideological viruses” or, for that matter, any deviation from normal behavioral patterns. Are you getting to work by a different route? Are you spending less time with neighbors? Are you leaving your house by the back door at night? Has your electricity use spiked, perhaps indicating an unregistered guest? Are you deliberately avoiding social media?
Resistance need not be overt. “The government,” writes Andersen, “could use emotion-tracking software to monitor reactions to a political stimulus within an app. A silent, suppressed response to a meme or a clip from a Xi speech would be a meaningful data point to a precog algorithm.”
Uighurs are forced to give blood and DNA samples. Their travel is restricted and monitored through checkpoints. The system can identify if the car they are driving is their own. The various data points can be time-stamped and geo-tagged. If the precog algorithm determines they are a potential threat – either because of something they did or didn’t do – they can be denied access to plane or rail travel. And, of course, they may be visited by local authorities.
It has the makings of a perfect digital dictatorship, and pieces of it are now being exported around the world. Dictators and aspiring autocrats in Asia, Africa, and Europe have already purchased elements of this system, and China is now hawking what Andersen calls “plug and play” surveillance systems in Latin America, to Bolivia, Ecuador, and Venezuela. Eventually, it is thought, these individual systems could be linked into a single global surveillance monolith.
Of course, the United States has similar technology although it has so far not been deployed for purposes of social control. But Andersen’s concerns are valid because these things happen incrementally.
There are other problematic technologies coming online that are poised to spawn the next tech revolution. Neurotech, or Brain Tech, a stepping stone to Kurzweil’s wet dream of the singularity, promises brain/computer interfaces that will allow mental control of everything from gadgets to cyborg soldiers.
Video game aficionados can already manipulate avatars by directing their attention to a specific part of the screen. And Facebook recently announced that it wants to interpret your intent before your finger presses a key – so you don’t have to.
The military already uses specially equipped helmets that sense and transmit neural signals to control drones. In the near future a computer will be able to capture what you think, display it, or communicate it to another person. Keyboards may become obsolete as people become accustomed to typing with their thoughts, or surfing the Internet by just thinking about where they want to go.
But it’s a small step from reading your thoughts, to collecting your thoughts, to implanting thoughts in your brain. These things happen because they can, not because they should. It’s not hard to imagine a million Uighurs being herded through indoctrination centers, forced to put on helmets which will seed their brains with politically correct beliefs and render them fully compliant.
Futurist Gray Scott put it this way: “The real question is, when will we draft an artificial intelligence bill of rights? What will that consist of? And who will get to decide that?”
It may be the question for our age.