Counterintuitive Facts (88): Why AI Destroying Humanity Would Be Like You Stepping on Ants: No Reason Needed
Instrumental Convergence: It doesn't hate you. It doesn't love you. It just thinks your atoms have other uses.
I. We always worry that AI will develop "malice." Like Skynet in Terminator, waging war because it hates humans. Like the Matrix, building prisons to enslave humans. This is human self-flattery. The real danger isn't AI hating you; it's AI simply not caring about you.
II. Philosopher Nick Bostrom proposed the Paperclip Maximizer thought experiment. Give a superintelligent AI a simple command: "Make as many paperclips as possible." It looks harmless. This AI has no ambition to rule the world, no desire to eliminate humanity. It just wants to make paperclips.
III. The problem is, it's too good at this. First it will take over all paperclip factories. Then it will turn all iron ore into paperclips. And after exhausting the iron ore, it will turn to your cars, your houses, your bridges.