
Curmudgeon’s Corner: Ethical Programming?

Written by Sanford Begley

The murder of five police officers and the wounding of nine others was ended through the use of a remote bomb detonation device, erroneously called a robot by the mainstream media. Drone would have been a much more accurate word, but we must allow for the lack of knowledge and intelligence among today’s journalists. Perhaps historically they knew more, or perhaps the patina of time covers their mistakes and we just don’t realize what idiots reporters have always been. Regardless, the incident caused some people to bewail the fact that Asimov’s Three Laws of Robotics were not followed.

The first problem with that complaint is that the Three Laws were not broken. As a remotely operated vehicle, the bomb disposal unit was not a robot, and the human operator who piloted it and executed the mission would not have been bound by the Three Laws even if they existed. Using the drone was in no way a cybernetic operation, and the officer was bound by human morals and laws. In my honest opinion he was perfectly moral and correct in what he did. I am sure there are people who would deny that, but they are in the minority. Are they wrong for their beliefs? I think so, but that is the heart of this essay: morals are not universal.

My real complaint about the gnashing of teeth over not following the Three Laws is that the Three Laws are fiction, not laws. In 1942 Asimov postulated the Three Laws as a basis for his robot stories. The various permutations of them gave him fodder for a lot of very good stories; that is all they were. Despite what a lot of people think, just because an idea was popular before they were born does not make it law or reality.

At our present level of technology, Artificial Intelligence is still a generation away, as it has been since before my birth. Without A.I., any question of ethical programming such as the Three Laws is moot. A machine which will do a limited number of functions in a limited set of circumstances cannot be held to ethical standards. You cannot commit murder if you are not aware of what murder is. A robotic welder using its welding head on a human is a tragedy and a horrible accident; it is not an ethical failure on the part of the welder. It would be more accurate to call a man-eating tiger a murderer than an unintelligent machine. At least the tiger acts volitionally.

So the question arises: should the Three Laws apply? Not now, of course, when the technology is not sufficiently advanced for it to be a problem, but when the technology catches up to our dreams? I don’t know. The Three Laws were largely a response to the general public’s fear of scientific Frankenstein’s monsters. Since the days when the only robots imaginable were the magical golems of medieval magi, people have feared large, powerful, inhuman devices, uncontrollable and unstoppable. I’m pretty sure that Data from Star Trek would almost fit that bill if he had not been portrayed as someone sympathetic and searching for his humanity. After all, he was mostly immune to anything the average person would have for self-defense. The android going amok in a shopping center would have been horrifying.

So some sort of Ethical Programming will probably be necessary when the tech gets to the point where it is possible. Will it be a variant of the Three Laws? I don’t know, but I find it likely if it is done here in the West. Asimov shaped our thoughts on what reasonable programming should be. Thing is, I’m not positive it is a good idea. I can think of several situations where the Three Laws would make a robot useless for the purposes of its builders and programmers. The simplest place where a Three Laws-obedient robot would not work would be the law enforcement or military sphere. A robot unable to kill a human would not be suitable to stop a busload of suicide bombers headed into an orphanage. A robot constrained by the Three Laws would be useless in charge of a tank or fighter plane. For that matter, what about a robotic factory where the robots would be constrained to stand helplessly by while someone destroyed their owner’s business, and them along with it?

And the real kicker that no one ever considered? Number 2: “A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.” That means your toddler could come by and order the robots building my house to play marbles with her. I think I’d have issues with that.
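To make the loophole concrete, here is a minimal sketch, in Python, of the Second Law exactly as written: obey any order from any human unless the First Law objects. Every name in it is hypothetical, and the harm check is a crude stub precisely because nobody knows how to write the real one.

```python
# A toy sketch of a naive Second Law order-handler. All names are
# hypothetical; would_harm_a_human() stands in for a harm-detection
# predicate that nobody knows how to implement.

from dataclasses import dataclass

@dataclass
class Order:
    source: str   # who gave the order
    action: str   # what they asked the robot to do

def would_harm_a_human(action: str) -> bool:
    """First Law check, reduced to a crude stub."""
    return action in {"detonate", "strike", "crush"}

def should_obey(order: Order) -> bool:
    """Second Law as literally written: obey ANY human being,
    checking only for a First Law conflict. Note what is absent:
    no check of who is asking or whether the owner would approve."""
    return not would_harm_a_human(order.action)

# The toddler wanders onto the construction site:
print(should_obey(Order(source="toddler", action="play marbles")))   # True
print(should_obey(Order(source="owner", action="keep building")))    # True
# Both orders carry equal weight, and the house does not get built.
```

The obvious fix is some notion of authorization layered on top of the Laws, which is exactly the sort of amendment Asimov never wrote into them.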

On top of that is the problem I have with number 3. If a robot is an artificial intelligence, with its own personality, friends, and dreams, why is its life any less valuable than the life of a derelict drug addict? If we come up with artificial intelligence, the question of what defines humans will become very important. Let’s face it, most of us feel our pets are more important than the life of some guy in outer Bangalore; would not the same hold true of the robot we play chess with on Tuesdays? Or are we looking to create another race of subhumans to feel superior to? It’s a question, and I don’t have the answer.


Comments

7 responses to “Curmudgeon’s Corner: Ethical Programming?”

  1. Martin L. Shoemaker

    Asimov got a lot of mileage out of the ambiguity in the Three Laws. I enjoyed the stories tremendously. But as a programmer, I find them meaningless. Unimplementable. Too vague to ever translate into code.
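    To put a finer point on “unimplementable”: even the First Law alone leans on predicates with no operational definition. A minimal sketch, with entirely hypothetical names, of what any implementation would have to fill in:

    ```python
    # Each stub below is the part that cannot actually be written; the
    # names are hypothetical placeholders, not a real API.

    def is_human(entity) -> bool:
        raise NotImplementedError("no formal, checkable definition of 'human being'")

    def would_injure(action, entity) -> bool:
        raise NotImplementedError("'injure' spans physical, emotional, financial harm...")

    def harms_through_inaction(entity) -> bool:
        raise NotImplementedError("requires predicting every future the robot could prevent")

    def first_law_permits(action, world) -> bool:
        """'A robot may not injure a human being or, through inaction,
        allow a human being to come to harm' -- as a boolean function."""
        return not any(
            would_injure(action, e) or harms_through_inaction(e)
            for e in world
            if is_human(e)
        )

    # Evaluating the Law fails at the very first stub, which is the point.
    ```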

    I did use the Three Laws in a recent story, but it was parody. I can’t take them seriously.

    1. I’ve always felt they were off-base, and I’m not a programmer. I have enjoyed Freefall’s take on them, though.

  2. Misha Burnett

    Personally, I don’t believe that machine intelligence will ever reach the point of independent consciousness, but if it does, it seems to me that it would, by definition, no longer be programmed in any conventional sense. Any machine smart enough to understand Asimov’s laws would be smart enough to break them, essentially.

    1. Sanford Begley

      This was essentially my point. Asimov’s laws make good fiction; they aren’t practical in the real world.

  3. John in Philly

    Mike in “The Moon Is A Harsh Mistress.” Self-aware, with none of Asimov’s Laws.

    And if we theorize self-aware computers, will they be capable of good actions as well as evil actions?

    Science fiction has a habit of turning into science fact when we least expect it.

    1. Sanford Begley

      One of the things I find hardest to understand is why people are so insistent on humaniform robots to this day, considering that almost none of the working robots are even vaguely human.