The must-read article of the week is Mara Hvistendahl’s Wired investigation of the development of “social credit” in China. The terror is in the (seemingly) independent development of different tracking and rating systems. On the one hand, the Chinese people are much further along than Americans in using smartphone apps to pay for things and build credit. On the other hand…
For the Chinese Communist Party, social credit is an attempt at a softer, more invisible authoritarianism. The goal is to nudge people toward behaviors ranging from energy conservation to obedience to the Party. Samantha Hoffman, a consultant with the International Institute for Strategic Studies in London who is researching social credit, says that the government wants to preempt instability that might threaten the Party. “That’s why social credit ideally requires both coercive aspects and nicer aspects, like providing social services and solving real problems. It’s all under the same Orwellian umbrella.”
What is developing is a system in which every aspect of a person’s life feeds into an algorithm:
The algorithm behind my Zhima Credit score is a corporate secret. Ant Financial officially lists five broad categories of information that feed into the score, but the company provides only the barest of details about how these ingredients are cooked together. Like any conventional credit scoring system, Zhima Credit monitors my spending history and whether I have repaid my loans. But elsewhere the algorithm veers into voodoo, or worse. A category called Connections considers the credit of my contacts in Alipay’s social network. Characteristics takes into consideration what kind of car I drive, where I work, and where I went to school. A category called Behavior, meanwhile, scrutinizes the nuances of my consumer life, zeroing in on actions that purportedly correlate with good credit.
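The quoted passage describes a score blended from five categories. As a purely illustrative sketch of how such a weighted multi-category score might work, here is a toy model in Python — every category weight and the formula itself are invented for illustration, since the actual Zhima Credit algorithm is, as the article notes, a corporate secret (the 350–950 range mirrors Zhima Credit's published score range):

```python
from dataclasses import dataclass

# Purely illustrative: the category names mirror the article's description,
# but the weights and the blending formula are invented for this sketch.
@dataclass
class Profile:
    payment_history: float   # 0..1, repayment reliability
    connections: float       # 0..1, standing of contacts in one's social network
    characteristics: float   # 0..1, car, employer, education, etc.
    behavior: float          # 0..1, consumption patterns
    identity: float          # 0..1, verified personal information

# Hypothetical weights; they sum to 1.0 so the blend stays in [0, 1].
WEIGHTS = {
    "payment_history": 0.35,
    "connections": 0.10,
    "characteristics": 0.20,
    "behavior": 0.25,
    "identity": 0.10,
}

def toy_score(p: Profile, lo: int = 350, hi: int = 950) -> int:
    """Map a weighted blend of category signals onto a 350-950 score range."""
    blend = sum(getattr(p, name) * w for name, w in WEIGHTS.items())
    return round(lo + blend * (hi - lo))
```

The point of the sketch is structural, not numerical: whoever sets `WEIGHTS` — and decides what counts toward `behavior` or `connections` — decides what the score rewards, which is exactly the concern the rest of the piece develops.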
If you do the sorts of things that the people who control the scoring system like, you find yourself receiving special treatment across society. If you miss some payments (or maybe refuse to pay for something for justified reasons), have disfavored friends, or run afoul of the government, you may find yourself with a score low enough to be all but banned from modern transportation and tolerable accommodations, and even to find educational and occupational doors closed.
Worse, much of the work of oppression happens socially. Already, the system encourages people to socialize only with those who have good scores. When fully implemented, the system could make life with a low score simply impossible.
Short of that outcome, one can still see the allure of the system — even imagine the sorts of arguments progressives would make on its behalf. After all, if your score isn’t purely a financial question, people who start adulthood off with financial disadvantages could gain access to favorable credit and special treatment by other means. Moving up the economic ladder could become a possibility for those who are willing to behave according to social ideals in their purchases and activities. In a way, this merely automates and makes more efficient the social processes by which we already interact and assess each other.
The problem is that somebody owns the algorithm. Somebody decides for everybody what counts as good or bad behavior. Somebody decides whether dedication to the powerful gets a big bonus or helping the wrong kinds of people results in demerits. Contrary to the apparent belief of the West’s more politically correct citizens, none of these answers are self-evidently correct, much less certain to be recognized by whoever happens to be in power at a particular time.