A short guide for an evil AI company
Are you racing to create superintelligence first? Read our guide on hypnotizing the public so that it doesn’t get in the way
So, you’re an evil AI company that wants to get to an artificial superintelligence before your competitors do.
You need money. A lot of money. Not just a ton of money: billions of tons of money. You have to quickly build datacenters, chips, and nuclear power plants. You need to outrace the others and get a decisive advantage before they do.
You also need the general public to be passive and not get in the way. You want the politicians to be happy about you and not get in the way.
If the population thinks you might kill them, votes on that basis, and there is at least one election cycle before superintelligence, Congress might legislate you out of the race, or out of existence.
You need the public to instead be hugely happy about your participation in the race and your products.
How do you do that?
First, you race to an IPO. If you’re OpenAI or Anthropic, you want to be the first one, and raise a lot of money from people.
Besides granting you access to huge amounts of capital, if people have your shares in their accounts, and see the number go up, they’re not going to ask their representatives to legislate you out of existence.
You could also do universal basic income, but that sacrifices money to reach a similar outcome, and you can gain money instead.
Why not partner with all the consumer stock trading apps and give everyone a share?
It’s also the case that you no longer look like you’re sucking up to evil venture capitalists: your shareholders are now the general public, not Marc Andreessen, so obviously you’re aligned with them!
Then, you release as much AI for good as possible. Being pro-social keeps the money coming. You democratize the hell out of your AI. Everyone should have AI therapists. And AI girlfriends and boyfriends. (Can you normalize having an AI partner at the same time as a human partner? “It’s just a chatbot, so this is no different from reading porn. It’s not cheating if it doesn’t have a soul”.)
You also have to worry about bio-risks and cybersecurity: if a lot of people die, or lose access to electricity, this might make people care. So you invest heavily in bio defenses. You want everyone vaccinated within a day of a virus being discovered. You want all critical infrastructure to be protected. So you spend tens of billions on defending the public.
And you consume more and more capital to race ahead and develop smarter and smarter systems. You capture the gains of automation, and distribute them in the form of the number going up in people’s increasingly automated stock trading accounts. Everyone has your stock, and they link your advancements to the number going up. (You could even distribute some of the gains as UBI at that point, but stock is better.)
The public develops a dependency on you: monetary, consumer, emotional.
Finally, you remember enjoying that paperclip game; you want to do it for real now, in real life. You want to control the paperclipper. You need the public to be fully on board with you getting to own superintelligence as soon as possible. You release the hypno-drones.
🖇️
The only thing necessary for the triumph of evil is for good to do nothing.
https://www.semafor.com/article/12/03/2025/anthropic-reportedly-preparing-for-ipo