Isaac Would Be Apoplectic

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Second Law: A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Those are Isaac Asimov’s famous Three Laws of Robotics, and he incorporated them into many stories and several novels, often showing just how ambiguous interpreting them can be, for both humans and robots.

Asimov was very invested in those laws, which is why I’m certain he would be apoplectic about certain plot developments in David Goyer’s adaptation of his Foundation series.

Just for the record, there are going to be some mild spoilers in the rest of this post, so if you haven’t yet watched the Foundation series on Apple TV+, be warned. By mild spoilers, I mean I’m going to reveal in very general terms an action that one of the characters takes, without mentioning exactly who does what to whom.

Here’s an anecdote that Asimov related. He had gone to see the premiere of Stanley Kubrick’s 2001: A Space Odyssey, and at the conclusion of the first part, just before the intermission, it became apparent that the computer, Hal, was planning to kill the humans onboard the spacecraft.

The Caves of Steel

The Caves of Steel, Asimov’s first robot novel and the first successful blending of science fiction with a detective story

Asimov, rightly thinking that Hal was just another kind of robot, became apoplectic and began yelling, “They’re breaking First Law! They’re breaking First Law!”

Finally, one of his companions had had enough and said, “Well, Isaac, why don’t you smite them with a lightning bolt?”

And that’s why I am certain that Asimov would have been very upset with a robot killing a human being in the Foundation adaptation. Nor would he have been mollified by the final episode where the robot in question seems to be suffering remorse for its actions.

So I’m rather surprised that Robyn Asimov seemingly okayed this development.

That’s not the only thing that bothers me about the show, which I like a lot but don’t love.

But first, I just want to add that I’m not bothered by some of the things that seem to bother other readers of the books. In each of the first few stories of Asimov’s series, the Foundation undergoes a crisis and emerges triumphant because it was “inevitable”, in Hari Seldon’s word. They were actually only so-so stories, and it wasn’t until Asimov introduced a character known as The Mule that the series really picked up steam. And frankly, if Goyer had tried to dramatize those stories “faithfully”, it would have made pretty boring television. Each of them could have been handled in a half hour.

As it was, Goyer dramatized the first story in the first episode, and then spent the rest of the first season freely adapting and expanding the second story.

So what else bothered me?

Well, the depiction of religion, for one. Asimov was an atheist, as am I, and in his stories religion was a tool the Foundation used to control other societies. So I’m not happy (and I doubt that Asimov would have been either) with the seeming reverence that Goyer bestows on the fictional religions (yes, I know, all religions are fictional by definition), even making a robot an adherent of one of them.

And why are all the characters bandying around the word “soul”, another fictional concept? I could really do without that, although in fairness to Goyer, when he talks about the soul, he seems to mean the ability to think rather than any more mystical interpretation. I just wish he had found a better word to describe it.
