14 Comments
User's avatar
erymanthian bore's avatar

your second point underscores the broader phenomenon that the state tends to turn upon itself those same tools and weapons which it previously exerts upon others. perhaps a trite point but your example reminded me of the militarisation of US police departments post-War on Terror, as another instance

Expand full comment
Wheatpaste's avatar

I think this is an extraordinarily valuable point: it’s both a reason we should be personally interested in what our government does (what tools it develops) throughout the world, and an antidote to the planned confusion of calling chaos: order, or weapons: protection, or dismantling: efficiency…along with much of the playbook of 1984.

Expand full comment
mike harper's avatar

Describe what is going on as "Cooking the books". I think a lot of folks know what that is.

Expand full comment
Nurit HaDov's avatar

Someone recently asked GROK what it (or other AI) could do with the Treasury Data it received before the federal court decision. It’s absolutely Orwellian. Here’s GROK’s response as posted on Xitter:

"The data from the U.S. Treasury's payment systems could be used in negative or harmful ways against individuals if not managed with ethical considerations. Here are some potential negative uses:

1. Discrimination and Profiling

Targeting for Scams: With access to personal financial data, malicious actors could target individuals for sophisticated scams or phishing attacks, knowing exactly when and how much money they receive.

Creditworthiness Manipulation: If this data were used by entities outside the intended government scope, it could lead to discriminatory lending practices or affect credit scores based on government payment status.

2. Surveillance and Control

Behavioral Manipulation: Knowing when and how much money people receive could be used to influence behaviors, perhaps through timed interventions or offers that exploit financial vulnerability.

Political Suppression: In extreme scenarios, data could be leveraged to control or suppress individuals based on their financial dependency on government programs, potentially affecting free speech or political activity.

3. Financial Exploitation

Predatory Marketing: Companies could use this data for targeted marketing of high-cost, unnecessary products or services to individuals based on their payment patterns.

Debt Traps: Knowledge of exact payment timings could lead to predatory lending practices, where offers are timed with government disbursements to trap individuals in cycles of debt.

4. Privacy Invasion

Stigmatization: Even if anonymized, if data were to be re-identified or linked to individuals somehow, it could lead to stigmatization based on reliance on government aid, affecting social standing or employment opportunities.

Data Leaks: Any data breach would expose individuals to risks like identity theft, where their financial details could be used for fraudulent activities.

5. Psychological Impact

Power Dynamics: The knowledge that one's financial details are being analyzed could create a sense of powerlessness or paranoia among recipients, affecting mental health or trust in government.

Manipulation via Fear: Using data to predict and manipulate when individuals might be most financially vulnerable could be used for coercive practices.

6. Legal and Illegal Targeting

Legal Actions: If the data were used by law enforcement or other agencies without proper oversight, it could lead to unjust surveillance or targeting for minor infractions related to financial status.

Blackmail or Extortion: In illegal contexts, this data could be used to blackmail individuals, especially those who might be embarrassed or fearful of their reliance on government aid being exposed.

7. Misuse in Al Development

Biased Al Models: If Al were trained on this data without considering demographic or socio-economic biases, it could perpetuate or worsen existing inequalities when applied in decision-making processes.

Unintended Consequences: Al might make decisions or predictions that could negatively impact individuals if the models are not carefully designed to consider ethical implications.

Even though we're assuming no privacy concerns, the ethical handling of such data is crucial. The potential for misuse necessitates strong safeguards, ethical guidelines, and transparency to prevent harm to individuals. This underscores the importance of data governance, ethical Al use, and the protection of personal welfare in any data utilization strategy."

Expand full comment
Wheatpaste's avatar

Reports are that DOGE has installed some AI-interfacing software. Private data, once digested by a generally-available LLM, cannot be put back in the toothpaste tube. So we may be past the point of managing their leaks; we don’t have reliable methods to “unteach” and barely the ability to censor data…all of which is reliant on private companies to hold the line.

Expand full comment
Sean Coley's avatar

Analysing data with an LLM does not ‘teach’ that data to it. You have to train the LLM with it for it to retain the information, and given the scale of the privacy breach that making a so-trained LLM publicly available would be, I don’t think you need to worry about it.

Expand full comment
Wheatpaste's avatar

Sure, let's use the term 'train'. I really don't have any clarity that the data is limited to analysis versus training. So I'll remain worried.

There's also documentation that LLMs can be subject to phishing https://arxiv.org/html/2403.00871v1 - especially concerning if it's widely known what type of data to look for with deceptive instructions.

Expand full comment
Wheatpaste's avatar

I mean, the largest leaker of my PII is Equifax, fintech companies that my health care providers have to use to bill my insurance, and other groups I'm required by law to provide this data to in order to meet basic needs. So the threat of an embarrassing "privacy breach" is practically meaningless: there are no consequences for leaking troves of SSNs, bank account numbers, health conditions, etc. other than having to send me a chagrined letter via US Mail.

Expand full comment
Nurit HaDov's avatar

Congress did not authorize this. The People did not consent to this. The President does not have the authority to violate our data privacy rights. And the AI and DoGe platform are both not secure.

Expand full comment
Sam Pooley's avatar

It interesting to think how AI works in the sense of a simple prompt into a database that only needs “Read Access” but then exports the information gained from that prompt. For those of us who grew up programming/coding, trying to do this would be incredibly dense: if AI makes it easy, the Muskrats can do it once they get access to the Treasury or other government agency data base PLUS the ability to send their prompt “into” that data base (I don’t know the AI vocabulary for this yet). The simplicity is key: loyal followers don’t need to be smart. They just need to be loyal.

Expand full comment
Alex Tolley's avatar

This seems more like a detail of the main purpose - getting control of the government through its finances. Trump and Musk are effectively bypassing the treasurer and writing checks to themselves and the people and organizations they like, not what was budgeted. This is pretty standard authoritarian MO. Trump's actions mirror Hitler's takeover of Germany, the main actions are very similar, but the details differ. While Hitlar, and Organ in Hungary, were able to change their nations' constitutions to stay in power, Trump will have to do the same thing by different means, extra-legally. I would watch for how he intends to do this.

Expand full comment
Lee A. Arnold's avatar

"Elon Musk's DOGE is Feeding Sensitive Federal Data into AI to Target Cuts" --Washington Post, yesterday

In other words, he is downloading everybody's information into his own system.

Democratic Party should say loud and clear that any government employee who becomes a whistleblower, or breaks a confidentiality agreement by going public with information, will be given a pardon by the next Democratic president, and reinstated with full back pay.

Expand full comment
Courtney's avatar

Thank you! More of this please.

Expand full comment
eyebones's avatar

What is 'weaponized'? The word could be globally replaced with 'evil' without changing the content of the article.

Expand full comment