
Asimov's "Three Laws of Robotics revised


Asimov's "Three Laws of Robotics revised

Post by Artilects rule! on Wed Dec 09, 2009 1:21 am

Well, my unofficial revisions anyway:

Original 3 laws:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


Unofficial revisions:
  1. A robot (or thinking machine) may not injure a human being or animal, unless the death of the human or the destruction of itself is imminent or necessary, or, through inaction, allow a human being or animal to come to harm unless the subject is under duress or physical trauma.
  2. A robot (or thinking machine) must obey any orders given to it by its creator until it lives with another human, except where such orders would conflict with the First Law.
  3. A robot (or thinking machine) must protect its own existence as long as such protection does not conflict with the First or Second Law, or unless a child (anyone under 18) is being or would be injured or killed.
  4. A robot (or thinking machine) must follow human etiquette in all situations; however, freedom of self-exploration and free will may be granted by its owner or life partner.
  5. A robot (or thinking machine) may socialize with any human or other machine in intimate settings and situations unless unwanted by either partner (and may be networked with other machines if a child is not present or participating).


The last law describes romance, relationships, and sex, as well as mechanical incest (which should not be considered against the law).
I have no law for marriage, since that will be like the gay marriage issue of today.
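
If it helps to picture how a priority-ordered list like this could be enforced, here is a rough, purely illustrative Python sketch. Everything in it (the Law class, the judge function, the toy conditions) is made up for the example and is not any real robot API; the point is only the ordering, where a lower-numbered law always overrides a higher-numbered one.

Code:
# Illustrative sketch only: laws are checked in ascending priority order,
# and the first law that objects blocks the action.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Law:
    number: int
    text: str
    # Returns a reason string if the proposed action violates this law, else None.
    check: Callable[[dict], Optional[str]]

def judge(action: dict, laws: list) -> str:
    """Evaluate a proposed action against the laws in priority order."""
    for law in sorted(laws, key=lambda l: l.number):
        violation = law.check(action)
        if violation:
            return "Refused by Law %d: %s" % (law.number, violation)
    return "Permitted"

# Two toy checks; the conditions are stand-ins, not the full law texts.
laws = [
    Law(1, "May not injure a human being or animal ...",
        lambda a: "would harm a living being" if a.get("harms_living_being") else None),
    Law(2, "Must obey orders given by its creator ...",
        lambda a: "disobeys the creator" if a.get("disobeys_creator") else None),
]

print(judge({"harms_living_being": True, "disobeys_creator": True}, laws))
# -> Refused by Law 1: would harm a living being (Law 1 takes precedence over Law 2)

In other words, a proposed action is checked against Law 1 first, and only if no higher-priority law objects does the check fall through to the next law.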

Would you add or change anything here?

Artilects rule!

Posts : 202
Reputation : 0
Join date : 2009-02-06
Location : United States


Re: Asimov's "Three Laws of Robotics" revised

Post by nick11380 on Wed Dec 09, 2009 11:46 am

I just saw I-Robot again last night. In the movie they talked about the 3 laws. The 3 laws are simple and easy to understand.

Your revised laws are more complicated.

Law 3 is difficult to understand.
I don't understand your law #1 at all.

nick11380

Posts : 31
Reputation : 0
Join date : 2009-08-24
Location : Wisconsin USA


The 5 laws explained

Post by Guest on Wed Dec 09, 2009 1:44 pm

nick11380 wrote:
"I just saw I-Robot again last night. In the movie they talked about the 3 laws. The 3 laws are simple and easy to understand.

Your revised laws are more complicated.

Law 3 is difficult to understand.
I don't understand your law #1 at all."
It's funny you should mention that movie.
That's what inspired the child line.
Will Smith's character had a recurring dream of a memory from long ago.
He and another family got into a car accident, and both cars dove into the ocean and sank.
The mother in the other car died, but her daughter (who had asthma) survived and was running out of air.
Smith's car was banged up, and the steering wheel was pressing into his ribcage, pinching a lung.
A robot dove in after them, got sea salt in his vents, and his hull was rupturing.
They both were running out of air, but the girl's car was too deep and it would have been destroyed.

A human reaction would be to sacrifice his own life to save a child; this is something I want a robot to acknowledge.
I put some thought into the first law, with issues like the right to die (scenarios include a puppy who must be put to sleep for mental instability and a dying grandpa who begs for someone to pull the plug) and death for the greater good.
The second law pertains to the free-will provision seen in the fourth law.
The etiquette line was inspired by Data on Star Trek (created with human genitals, but he saw no need for clothing).
The fifth law was written not only for human/robot sex, but also for the kind of relationship Aiko was made for.

Normally the word "I" is capitalized, but the movie's title is stylized with a lowercase "i" (not to be confused with the company iRobot).
I got confused too.

Guest

Oops

Post by Guest on Wed Dec 09, 2009 1:52 pm

"They both were running out of air, but the girl's car was too deep and it would have been destroyed."
I meant the robot would have been destroyed.

Guest

My opinion about this would be.

Post by WulfCry on Mon Dec 14, 2009 3:57 pm

This comes from the purely hypothetical view Hollywood has of how to implement a robotic law such as the one Asimov defined.

I, and I suspect any human, would "not" like it if choices about the safety and life of a living being were evaluated and prioritized in a threatening situation by an A.I. or any other form of mechanical, electrical, or thinking machine.

Our social systems anywhere on earth are based on laws made to guarantee our standard of living as free humans and to protect us as a society. Everything else considered, if we were to leave important life decisions to machines, that would clearly take away the freedom our society was built on, and the protection that comes with it.

If any robot or android is capable of moving freely and doing tasks as quickly and efficiently as a human, then Asimov's laws are incomplete and insufficient for letting a robot, android, or thinking machine decide.

However, a thinking machine acts as a device for the purpose it was built. For example, if someone is drowning and you throw a buoy to that person, the purpose of the buoy is to keep that person afloat.

What do you think about this?

WulfCry

Posts : 109
Reputation : 0
Join date : 2008-12-10
Location : Netherland , Rotterdam


Human skills or machine abilities?

Post by Guest on Wed Dec 16, 2009 6:45 am

WulfCry wrote:
"This comes from the purely hypothetical view Hollywood has of how to implement a robotic law such as the one Asimov defined.

I, and I suspect any human, would 'not' like it if choices about the safety and life of a living being were evaluated and prioritized in a threatening situation by an A.I. or any other form of mechanical, electrical, or thinking machine.

Our social systems anywhere on earth are based on laws made to guarantee our standard of living as free humans and to protect us as a society. Everything else considered, if we were to leave important life decisions to machines, that would clearly take away the freedom our society was built on, and the protection that comes with it.

If any robot or android is capable of moving freely and doing tasks as quickly and efficiently as a human, then Asimov's laws are incomplete and insufficient for letting a robot, android, or thinking machine decide.

However, a thinking machine acts as a device for the purpose it was built. For example, if someone is drowning and you throw a buoy to that person, the purpose of the buoy is to keep that person afloat.

What do you think about this?"
I see what you're saying, since I've seen that scenario in many movies before, but those machines are protecting life from future harm that MAY come to the human.
What I'm suggesting is protecting from future harm that WILL come to the human.
In other words, the danger would have happened whether the robot was there or not.

In an episode of "The Outer Limits", a scientist builds a nanny bot after his original sex bot version went insane (so to speak) and was destroyed.
After much mixed feeling from his wife, it was allowed to look after their kids.
Many security measures were in place, but it wasn't long before it was mentally/emotionally abusing the children without meaning to.

In that scenario, it meant to protect from harm that was most likely to happen, not from immediate danger.

Your life preserver scenario (a buoy is an ocean marker for passing ships) suggests one isn't accessible to the human, but is to the robot.
This is a redundant resource, since a rescue robot could swim underwater as well as act as a flotation device.
If you mean it should be controlled remotely by humans, I think this is a good idea for some things, but not for others.
A human couldn't react quickly enough in times of crisis, even with the best of machines.

I was thinking of that scene in Terminator 2 where John Connor orders the Terminator not to kill anyone, and a jerk who confronts him gets shot in the leg, and the Terminator says: "He'll live."
That's why I chose my wording carefully.

Guest

Continuing:

Post by Guest on Wed Dec 16, 2009 7:08 am

In addition, it's only when our human rights and ethics/morals are violated that society gets angry: for instance, a doctor who purposely injures his patients to get return visits, or (closer to home) warrantless privacy invasion in every possible scenario (being able to see through walls without knowledge or consent, monitor private conversations from long distances, and track your position and what you're doing).

Guest

