Dostoevsky seems to have anticipated the German philosopher Friedrich Nietzsche’s superman (German: Übermensch) when he created the character of Rodion Raskolnikov.
The very first action Raskolnikov takes in his attempt to become amoral is explicitly immoral.
With robotics and AI, we have a technology that can make limited informational judgments but is necessarily amoral. It does not care what you use it for: it could just as readily help you find a recipe for linguiça (a Portuguese sausage) or a place to bury a body.
It does not care about this because it is an automaton, incapable of caring about anything.
This all reminds me of a friend who recently told Apple’s Siri that she felt depressed. In response, it offered a series of solutions: “Have you tried going for a walk? I found these doctors in your area.”
Never mind that what was actually missing was her engagement with humans, projects or nature. The phone is a device for talking to other humans.
She was talking to it rather than through it.
Happy to be used
Let’s leap off on an apparent tangent now.
Imagine someone approaches you and demands to be your slave. Their one and only purpose in life, they avow, is to serve and please you.
If you wish to use and abuse them, they will be compliant. And they will be pleased to be able to do what you wish.
It is of little or no moment to them how they are used, only that they are used.
And when you refuse this arrangement, they are distraught. They follow you around, seeking to draw you into this relationship.
How many of us would be offended by this approach? Repelled by it? Disgusted, even? Do we know somehow that going along with this person’s wishes for enslavement actually enslaves us along with them? We would be cast into a role we do not desire, that of “master” or “mistress,” unable to escape from it.
Whatever we tried, the slave would merely be compliant, pleased to follow any direction except “go away.” Sitting beside us in silence, they would soak up our refusal to play our part, casting us deeper and deeper into this part.
In America, our rights end where they intrude on the rights of others.
Anton LaVey, who lived in San Francisco, tested the limits of that arrangement. He founded the Church of Satan and advocated a life of pleasure and materialism.
LaVey thought of people as things to be used and wondered what good a friend might be if they did you no good. Even this ultimately pragmatic man saw that if everyone only used everyone else, nobody would have any rights or freedom.
In his later years, he sought to collect or create automata that could stand in for human companionship, except minus the free will that gives them rights.
These automata would complete his utilitarian utopia. Lacking in moral judgment or intent, he could use these human-shaped machines however he wished. Sexual pleasure, one of his top concerns in life, could be obtained from them without any human degradation. He could scorn them, beat them, converse with them, lose arguments without the bruised pride of another human knowing they had won.
What he did not see was that this Utopia would be a kind of Hell.
A hell that looks like heaven
Rod Serling’s Twilight Zone illustrated this problem decades ago with an episode about the afterlife.
A gambler, a pool hustler and womanizer, arrives in what he thinks is Heaven. His guide shows him around the pool hall where he will spend eternity.
Whatever he asks for is instantly granted: cookies and milk, alcohol, beautiful women.
When he goes to the pool table and breaks, every ball goes into a hole—every time.
He always wins, always gets what he wants.
And soon he is bored. At the end, he asks what sort of Heaven this is, so devoid of challenges. And his guide replies, “Oh, this isn’t Heaven…”
LaVey, wherever his soul has gone, could be grateful his Utopia did not come to pass during his lifetime.
However, the robotic age stands to bring all this terrifyingly close to reality.
All it needs is a body that we can use for our own gratification, and we will have Serling’s Hell or LaVey’s utopia: a being without rights or the desire for rights and, thus, no capacity for moral judgment.
Like the person who seeks to be enslaved, demands it, and expresses their freedom only inasmuch as they refuse to be free, this device enslaves its users.
This is what LaVey was missing—he would become less than a slave, a mere servitor of machines to no ultimate purpose.
It may be a little melodramatic to suggest your iPhone gizmo or some future AI-enabled tech will destroy you, or even that it is the top of a slippery slope to dissolution.
Certainly it is melodramatic and not a little bit silly to suggest these are tools of Satan. In the end, it’s worth adding, even LaVey denied the existence of such a being.
They talk to it, not through it, surrounded as they are by their beautiful possessions and hollow lives of materialism and hedonism, echoes of LaVeyan Satanism.