Last night I couldn’t sleep because I was thinking about writing a paper on morality and robots. Are you allowed to write about Asimov in law school?

Powell’s radio voice was tense in Donovan’s ear: “Now, look, let’s start with the three fundamental Rules of Robotics — the three rules that are built most deeply into a robot’s positronic brain.” In the darkness, his gloved fingers ticked off each point.
“We have: One, a robot may not injure a human being, or, through inaction, allow a human being to come to harm.”
“Right!”
“Two,” continued Powell, “a robot must obey the orders given it by human beings except where such orders would conflict with the First Law.”
“Right!”
“And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”
“Right! Now where are we?”
“Exactly at the explanation. The conflict between the various rules is ironed out by the different positronic potentials in the brain. We’ll say that a robot is walking into danger and knows it. The automatic potential that Rule 3 sets up turns him back. But suppose you order him to walk into that danger. In that case, Rule 2 sets up a counterpotential higher than the previous one and the robot follows orders at the risk of existence.”
“Well, I know that. What about it?”
“Let’s take Speedy’s case. Speedy is one of the latest models, extremely specialized, and as expensive as a battleship. It’s not a thing to be lightly destroyed.”
“So?”
“So Rule 3 has been strengthened — that was specifically mentioned, by the way, in the advance notices on the SPD models — so that his allergy to danger is unusually high. At the same time, when you sent him out after the selenium, you gave him his order casually and without special emphasis, so that the Rule 2 potential set-up was rather weak. Now, hold on; I’m just stating facts.”
“All right, go ahead. I think I get it.”
“You see how it works, don’t you? There’s some sort of danger centering at the selenium pool. It increases as he approaches, and at a certain distance from it the Rule 3 potential, unusually high to start with, exactly balances the Rule 2 potential, unusually low to start with.”
Donovan rose to his feet in excitement. “And it strikes an equilibrium. I see. Rule 3 drives him back and Rule 2 drives him forward–”
“So he follows a circle around the selenium pool, staying on the locus of all points of potential equilibrium. And unless we do something about it, he’ll stay on that circle forever, giving us the good old runaround.” Then, more thoughtfully: “And that, by the way, is what makes him drunk.
At potential equilibrium, half the positronic paths of his brain are out of kilter. I’m not a robot specialist, but that seems obvious. Probably he’s lost control of just those parts of his voluntary mechanism that a human drunk has. Ve-e-ery pretty.”
“But what’s the danger? If we knew what he was running from–”
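What Powell describes is almost algorithmic: two opposing potentials, one fixed by the emphasis of the order and one growing with proximity to danger, with Speedy stuck at the distance where they cancel. A toy sketch of that balance might look like the following (the potential functions here are invented for illustration; the story never quantifies them):

```python
# Toy model of the Rule 2 / Rule 3 potential balance from "Runaround".
# The formulas are made up for illustration only.

def rule2_potential(order_emphasis: float) -> float:
    """Forward drive, fixed by how emphatically the order was given."""
    return order_emphasis

def rule3_potential(distance: float, danger_strength: float) -> float:
    """Aversion that grows as the robot nears the danger (inverse law)."""
    return danger_strength / distance

def equilibrium_distance(order_emphasis: float, danger_strength: float) -> float:
    """Radius where the two potentials exactly cancel:
    danger_strength / d == order_emphasis  =>  d = danger_strength / order_emphasis."""
    return danger_strength / order_emphasis

# A casual order (weak Rule 2) against a strengthened Rule 3 keeps the
# robot circling far from the selenium pool:
print(equilibrium_distance(order_emphasis=0.5, danger_strength=10.0))  # 20.0
```

A more emphatic order raises the Rule 2 term, which shrinks the equilibrium radius — which is exactly the fix the characters eventually need: strengthen the order, and the circle collapses toward the pool.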