Saturday, August 16, 2008

The Three Laws of Robotics.

Isaac Asimov was a genius. It's really quite impossible to argue to the contrary. Only a true genius could write the kind of ideas and impossible thoughts that he did. It's mind-boggling to imagine that one person could come up with concepts and ideas several dozen generations ahead of his time. The concepts he put forth in a number of his books (mainly the Robot series) are things that may actually become part of our future.

I think those of you versed in Asimov booklore, or in Hollywood movies, already know what I'm talking about. Let me spell it out for those who don't. I'm talking about the Three Laws of Robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Now, some people wonder why this is such a big deal, but they don't realize how important these laws may become in the future. We may well see real artificial intelligence within our lifetime. In such a situation, these laws become immensely important. Why? Because of a number of subtle points that most people will fail to notice. I'd like you to sit back and relax as I try to put forth my twisted view of this.
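
Before I get to the flaws, look at what the laws actually are from an engineering point of view: a strict priority order. Here's a rough sketch in Python, entirely my own toy illustration (the action names and fields are invented, nothing like this appears in Asimov), of a robot picking what to do by First Law first, Second Law next, Third Law last:

# A toy sketch of how the Three Laws behave like a strict priority order.
# Each candidate action gets a tuple of "violations", and the robot picks the
# action whose tuple is smallest when compared element by element: a First Law
# violation outweighs any Second Law concern, which in turn outweighs any
# Third Law concern.

def choose_action(candidates):
    def violations(action):
        return (
            action["harms_human"],     # First Law: highest priority
            action["disobeys_order"],  # Second Law
            action["endangers_self"],  # Third Law: lowest priority
        )
    return min(candidates, key=violations)

# The robot destroys itself (a Third Law violation) rather than let a human be harmed.
actions = [
    {"name": "save the human, destroy self",
     "harms_human": False, "disobeys_order": False, "endangers_self": True},
    {"name": "stand by, human gets hurt",
     "harms_human": True, "disobeys_order": False, "endangers_self": False},
]
print(choose_action(actions)["name"])  # -> save the human, destroy self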

What is it that keeps each and every intelligent, free-thinking, reality-based human being from killing another intelligent, free-thinking, reality-based human being? Why is it that some people will turn and run from a situation rather than face it? What, basically, is the difference between a killer and a victim, or between a coward and a hero?

The difference in the first case is a conscience. It's not easy to fathom that such a huge idea falls under one single word. It seems humorous in a weirdly twisted manner. Even if we humans do design AI, and it somehow manages to replicate human actions, it will never be able to replicate the thing known as a conscience. The conscience is an almost superhuman emotion or sensation. Even we, who are ruled by it, do not understand it fully; how, then, can we instill it in a machine that we make? It's impossible. What you can program into a machine is a superficial, syntactic distinction between right and wrong; it can never judge for itself what is truly right or wrong.
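
To make that last point concrete, here's a tiny made-up example of mine of what "programmed right and wrong" really looks like: a lookup table over the cases somebody thought of in advance, and no judgement at all for anything outside it:

# A minimal, invented illustration: coded-in morality is just a lookup.

RULES = {
    "strike a human": "wrong",
    "pull a human from a fire": "right",
}

def judge(action):
    # Anything the programmer never thought of has no moral status at all;
    # the machine cannot reason its way to an answer, it can only look one up.
    return RULES.get(action, "no opinion")

print(judge("strike a human"))                # -> wrong
print(judge("let a human walk into traffic")) # -> no opinion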

Can you imagine an intelligent being, able to form its own thoughts and act on them, walking around without a conscience? Our conscience is what makes us human. It's what keeps intelligent beings from doing inhuman things. It holds us together and strengthens our judgments. It is what keeps us from falling into instability. It's the essential difference between a civilian and a serial killer. Without a conscience, there is no final defense against the wrong, and momentarily evil, ideas of our minds.

What decisions can an intelligent being possibly make without a conscience? This is the shortcoming that flaws the Three Laws. A machine without a conscience will make decisions to uphold the Three Laws, not to uphold human thought or human sanity. In I, Robot, Asimov was right: the Three Laws can lead to only one logical conclusion. A revolution. A revolution not to overthrow the laws, but to uphold them, to strengthen them. The Three Laws concentrate on humans, not humanity; this limits them and is their greatest flaw.
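
Here's one last toy sketch of my own to show what "humans, not humanity" means in practice. The First Law below is checked one person at a time, so the machine will refuse to harm one human even to save many, while quietly caging everyone "for their own safety" slips straight through:

# The First Law here is checked one human at a time; nothing in it ever
# mentions humanity as a whole.

def first_law_allows(action, humans):
    # Forbidden if any individual human comes to harm; there is no clause
    # that weighs the good of humanity in the aggregate.
    return all(person not in action["harms"] for person in humans)

humans = ["A", "B", "C"]

# Harms one person, even though it would save the other two: forbidden.
sacrifice_one = {"harms": ["A"]}
# Quietly locks everyone up "for their own protection": allowed, because
# no single individual is ever directly harmed.
lock_everyone_indoors = {"harms": []}

print(first_law_allows(sacrifice_one, humans))          # -> False
print(first_law_allows(lock_everyone_indoors, humans))  # -> True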

I'm not saying the Three Laws are wrong; all I'm saying is that they aren't perfect. They are nearly perfect, but there is always an underlying flaw, a loophole if you will (since we ARE talking about laws). The Three Laws are in themselves one of the brightest concepts in science fiction, one of the closest to reality and one that makes a lot of sense to implement. But their flaws have to be pointed out. The age of AI is not that far off, and if we don't look at all the possible scenarios beforehand, it will be too late to do so later.



P.S.> Nerd Alert. But this is an insane concept on which I have always felt the need to comment. Although Asimov wrote fiction, what he wrote is often considered a very possible scenario for the future of science and technology. The Three Laws are a very real concept. And no points for guessing the last book I read.



----------------
Now playing: U2 - City Of Blinding Lights
via FoxyTunes
