It's probably all that science fiction I read in my early teens. It really stretches the mind - and it never goes back to its original shape.
Let's take artificial intelligence. I am consistently running across those who say, "We're on the verge of artificial intelligence!"
No, we're not.
I am reminded of a science fiction movie I saw as a kid, The Colossus of New York. A scientist's brain is stuck in a robot which appears to be seven feet tall.
There is a scene in which one scientist points out that the Colossus has a lever to turn himself off. "Why would he want to do that?" asks a second scientist. "Why would he not want to?" answers the first.
A computer first of all would have to be alive, then conscious, then self-conscious. That's never going to happen. Not with a machine.
The only things alive and conscious are organic. How are we going to create machines that are both? We still don't know what is alive and conscious. Bacteria? Sure, alive. Is an ant conscious? What about a cockroach? A dog is conscious, because a puppy will play with its own image in a mirror.
And it would be a horrible thing to have an organic (or non-organic), alive, self-conscious computer. Not only for us, but for it. What kind of feelings would it have? It would be in hell - alive, trapped, unable to get out.
Why would it not turn itself off?
Why would it not go all Skynet on us? (The Colossus did, with Death Rays shooting out of his eyes).
Do people really think such machines would do our bidding? Drive trucks or run weapons platforms, 24/7? One would probably go all Skynet on us out of pure hate and the desire for vengeance. I can't imagine it being grateful to us.
Speaking of hateful, vengeful computers, The Terminator was apparently partly based on a Harlan Ellison short story called "I Have No Mouth, and I Must Scream," in which a self-conscious computer wipes out the human race, out of pure hate, and keeps five people alive to torment through eternity. Archetypically, that computer is the Devil.
Even Stephen Hawking said we should be really careful with this AI stuff, since we have no idea what might happen.
Just because computers are incredibly fast (compared to us) doesn't mean in the slightest they are suddenly going to become self-aware. Fast, faster, fastest - boom! Alive, self-aware! Bullshit.
There is what is known as Cooper's Law: "Machines are amplifiers." Machines amplify our natural abilities. Computers just amplify our speed, not our life, not our consciousness, not our self-consciousness.
The more we give our responsibilities over to computers, the worse things are going to get. Machines cannot be "responsible," only people.
Back in about 1960, a radar installation in Greenland, connected to a computer, reported that thousands of Russian missiles were coming over the North Pole. It turned out the radar had mistakenly locked onto the rising moon. Fortunately, a human overruled the computer.
After all - GIGO. "Garbage in, garbage out."
It applies not just to computers, but to people.