June 26, 2006

You, Robot


Spoiler Alert. In Will Smith’s movie based on Isaac Asimov’s short stories, robots created to serve as butlers conclude that human beings are evil and decide to kill off the bad ones as a logical solution to ensure the healthy future of human civilization. If these robot ‘Gods’ truly know best, is this really wrong, since the good outweighs the bad…logically?

14 comments:

Arcane Rest said...

How do you know that the 'bad' person couldn't change? You never gave them a chance. I mean, isn't the best thing to do to pray for God to enter into the life of a killer, or a dictator, so they change? For example, because I am an American, wouldn't it be better for Osama to become a Christian and change than to kill him because he is bad?

Sabai said...

So why destroy Sodom and Gomorrah instead of preaching to them?

Arcane Rest said...

"So why destroy the whole world except Noah and his family?" would be the better question.

Arcane Rest said...

I guess the answer to that would be that it is up to God Himself to decide who dies and who does not because they are bad.

Although in Leviticus He does allow the government to do so too, i.e. the death penalty.

Sabai said...

And that's why, since we can't agree on what God wants us to do, I want Will Smith to create a robot that thinks objectively and tells us what to do.

Arcane Rest said...

If you want your life dictated to you by a robot, that is your business, but I like to include things that are not as objective. I mean, would the robot somehow weigh all the good vs. the bad things that you did, or would there be some sort of hierarchy of good and bad in the weighing?

What you propose seems pretty pharisaical to me, meaning that only the good things you do will allow you to live.

And who is making this robot again?

Sabai said...

The robots won't kill bad people; they'll kill people who are destroying the earth, like...I don't know...Kathy Griffin.

Arcane Rest said...

How do you mean, destroying the earth?

Sabai said...

The robots will decide that. Put your faith in the robots.

Arcane Rest said...

You sold me........Heil Robot (Sonny)

magonline said...

The interesting thing about artificial intelligence is that all it can do is process and output information; there is no way to communicate with artificial intelligence.

Sabai said...

You, sir, have misunderestimated one Will Smith.

Steve said...

Many heinous acts can be justified through pure consequentialism if you only look at them from a certain viewpoint (classic example: slavery was good for the economy, overlooking of course the fact that everything else about it was an abomination). Then again, a complete rejection of so-called situational ethics opens up different problems (the classic example of hiding a Jewish family from the Nazis). Maybe morality demands weighing both the action and its consequence, not one or the other.

Thanks man, you just caused me to have an internal boxing match over morality because of a Will Smith movie. I hope you're happy now.

falloutboy said...

Where's the Sears Tower in Sears clothing when you need him?