5 Comments

Nice piece, Henry--and I also liked the Foreign Affairs piece.

Yang's research goes at one of the enormous weaknesses of *all* social science research that seeks to mine social media for an indication of what people really think and, even more importantly, what they are prone to *do* about what they think. In a comparative sense, this would be like European ancien regimes mining the print culture of the 18th Century and concluding that the texts which most directly attest to political sentiments and call for political action are leading indicators of the possibility of unrest or dissent, whereas if you buy even part of Darnton's argument in The Forbidden Bestsellers of Pre-Revolutionary France, the more important texts that worried the regime were pornography, dreamy utopian narratives, and vicious attacks on minor public figures--though sometimes *all* texts worried it. But the real argument is that it wasn't until the French ancien regime identified print culture *as* a problem (and *as* containing information that the regime needed to know) that those texts started mobilizing rebellious sentiments among readers. I think you could add layers to Darnton's take--18th C. print culture was a complicated product of individual authors, the chaotic improvisations of publishers, and the constantly protean social worlds where print was kept, read, and talked about, and beyond those worlds lay the complicated social ties between readers and non-readers and the ways that what was read was communicated and re-circulated.

Contemporary social media is a product of platforms and their cyborg-making interfaces, of the temporary relations that exist inside platforms, and of the ways what is said inside platforms gets communicated onward to people who aren't there. (Am I exactly the sum of my likes and dislikes on Facebook, or is that man a creature who exists only inside Facebook's media ecology? Can you learn anything of what I might do out here in my daily material existence from it? Something, but not even close to everything.)

Garbage in, garbage out--in terms of *whether* what citizens say and do in texts is what they're going to say and do in material reality--is a problem older than AI, and it's where a humanistic understanding of representation becomes really important. (Because it's not just a problem for governments; it's a problem for social scientists who are looking for data that will tell them what people "really" think and how what they "really" think will govern what they concretely *do*.)

But I'd also work it from the other end of the problem. (Forgive me if Jeremy's new book does this at length, as I haven't had the chance to read it yet.) Do authoritarian regimes really want to know what their people want? In ethnographic and historical terms, I'd say there's a fair amount of evidence that they don't, at least not in the upper reaches of their hierarchies. In fact, I'd say that a good deal of the time, the people who are ostensibly charged with making decisions that are supposed to be guided by accurate information don't want information that will complicate, deflect, or outright invalidate what they ideologically or predispositionally want to do, and that authoritarian states are sometimes more inclined to act according to some theory of power (or some whim of the authoritarian) and then to try to force social reality to *cohere to the action*, or to suppress (with various degrees of violence or coercion) any attempt to dissent from that re-alignment. Harari's piece presupposes a kind of evolutionary race between authoritarian and non-authoritarian regimes in which accurate data mining of what the people really think and are really inclined to do will make authoritarian regimes more successful and powerful, because they will more accurately anticipate and neutralize popular feelings. That seems to misunderstand actually-existing authoritarian regimes, or else it takes contemporary China as defining bleeding-edge authoritarianism, which I think is at least worth debating.

This might be a point that applies more generally to large-scale modern governments, period. E.g., the people at the top of governmental hierarchies often have potent reasons not to know what they could know, or to discount an accurate source of information in favor of an inaccurate one, simply because the accurate information would force decision-makers to move in a direction they're politically or ideologically indisposed to take, whether that's invading Ukraine or putting out a 'mini-budget' of dramatic tax cuts with no budgetary audit of the consequences. I hate to invoke "deep states", but it does seem fair to say that if there's an appetite for copious amounts of deeply accurate information about the population of a country, it sits lower down the pyramid of power *within* states.


What do 'authoritarian' governments actually look like?

Do they give their leaders the sole right to:

* Hire and fire the country's 5,000 top officials.

* Declare war. Frequently.

* Issue 300,000 national security letters (administrative subpoenas with gag orders that enjoin recipients from ever divulging they’ve been served).

* Control information at all times under his National Security and Emergency Preparedness Communications Functions.

* Torture, kidnap and kill anyone, anywhere, at will.

* Secretly ban 50,000 citizens from flying--and refuse to explain why.

* Imprison 2,000,000 citizens without a court trial.

* Execute 1,000 citizens each year prior to arrest.

* Kill 1,000 foreign civilians every day since 1951.

* Massacre its own men, women and children for their beliefs.

* Assassinate its own citizens abroad, for their beliefs.

* Repeatedly bomb and kill minority citizens from the air?

If so, then the USA is the most authoritarian government on earth.


Quite helpful. If frustrated minorities within the ruling class have access to data similar to the dominant group's, would they not be able to use it to organize disaffection?


Excellent. Tiny caution: is it possible that the Twitter data is not, or not all, from outside China, but from VPN users inside? That could bias the sample in some way (toward people who can afford to use VPNs).


"Do authoritarian regimes really want to know what their people want?" There's a parallel version with politicians in non-authoritarian regimes: they want information in very specific forms: actionable, credible, politically useful. In education policy, David R Garcia's book Teach Truth to Power discusses the literature on this (in the US) along the way to his recommendations to researchers.
