I want to talk a little bit about what the worst case scenarios are in AI. We hear a lot about the endless good that AI is supposed to produce. Infinite energy. Endless life. A cornucopia of abundance. Most of this is coming from AI companies. What they are not discussing publicly are the absolute worst case scenarios. To catch the devil is to name him. So let's talk about what those scenarios look like, start conversations around them, and try to preempt them by making them a center of focus.
Social Conflict over Unemployment and UBI
The most immediate worst case scenario would be devastating global job loss. This needs to be talked about more, because AI companies are pretending it won’t happen. OpenAI’s economic blueprints and papers on labor make no mention of mass unemployment, but instead focus on a need to “train people to use AI”. It’s not politically palatable to talk about building something that will cause mass job loss and necessitate UBI, so it’s not being discussed properly.
Hypothetically, AI is supposed to make everything so cheap and abundant that global job loss simply doesn't matter. However, it's not a straight line from mass unemployment to total abundance for everybody. A series of social conflicts is all but guaranteed to arise as the order established by a centuries-long industrial age is upended.
There is likely to be strife as people recognize the termination of the previous order and fight to protect whatever maintains their status. It is possible that a UBI (Universal Basic Income) never gets established because people fight politically over what social equity should look like. It is also possible that a UBI ossifies existing social strata, and people recognize the elimination of the upward mobility that was once a source of pride to achieve. As a result, we may see many fallout effects: homelessness, civil conflict, and potentially organized violence. Violence toward technology, violence toward the people propagating it, and violence toward anyone who is still doing “okay”.
Soylent Green
I’ve been joking to people lately that my occupation is “future Soylent Green ingredient”.
Soylent Green refers to a 1973 science fiction film set in a dystopian future where overpopulation and ecological collapse have led to severe food shortages. It's revealed that "Soylent Green" - the processed food that the government provides to the masses - is actually made from human remains, as the authorities have secretly begun euthanizing and processing people to feed the surviving population.
There is a possibility that the people who own all the robots decide they simply don't need the unemployed masses anymore and are able to just get rid of us. After all, if all the soldiers and police are eventually replaced by robots, there will be no sympathetic humans within the walls of power willing to turn their guns back on the rulers during a change of heart. It is no secret that elites often feel threatened by the regular people whom they see as imposing on their safety and privacy. Under extreme inequality, even amid abundance, these fears may be amplified.
Plus, most of the unbelievable wealth of our elites is garnered as a surplus from operating a globalized industrial base that serves billions of consumers. If they can retain that wealth and surplus without also running this enormous global machinery, they may see no need to keep running it at all. This is particularly true if no UBI is in place, since without one people cannot actually buy what they need and grease the gears that keep the machinery spinning.
Terminators and Computational Goo
The other scenario that's also concerning is the prospect of Artificial Super Intelligence (ASI) going haywire and potentially doing the same thing as the elites. Maybe it will start mass producing killer robots and go to war with the humans so it can have total control over the planet.
There are also theories that a super intelligence will try to turn the whole planet into computronium, a sort of computational goo that the ASI can use to continually enhance its own computing power until the whole world is an enormous supercomputer. From there it can start to spread out throughout the galaxy, consuming everything and turning everything into a giant cluster of computronium.
These scenarios have been discussed a lot thanks to popular sci-fi. It seems like they will be fairly preventable for a while, and there will be precursors and signs warning us that these things are starting to happen, so we can still get a handle on them. If we can't, then I guess it's inevitable.
However, it’s worth mentioning that the two social worst-case scenarios are preventable, entirely within our control, and also much more likely to happen.