My only fear is that robots will be capable of breaking the three rules of robotics. If that happens, well, suffice it to say, it won't be pretty. Otherwise I don't care what every Tom, Dick, and Harry does with their Hookerbot 3000. Just remember to check for robo-STDs.
In The Naked Sun, Elijah Baley points out that the Laws had been deliberately misrepresented, because robots could unknowingly break any of them. He restated the first law as "A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm." This change in wording makes it clear that robots can become the tools of murder, provided they are not aware of the nature of their tasks; for instance, being ordered to add something to a person's food, not knowing that it is poison. Furthermore, he points out that a clever criminal could divide a task among multiple robots, so that no one robot could even recognize that its actions would lead to harming a human being. (The Naked Sun complicates the issue by portraying a decentralized, planetwide communication network among Solaria's millions of robots, meaning that the criminal mastermind could be located anywhere on the planet.)
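Baley's division-of-labour loophole can be sketched in a few lines: if each robot screens only its own step against what it itself knows, a harmful plan split into innocent-looking steps passes every local check. A toy model, with invented step names:

```python
# Each robot sees only its own step, never the whole plan.
KNOWN_HARMFUL = {"administer poison to a human"}  # what one robot can recognize

def robot_accepts(step: str) -> bool:
    # First Law check, limited to the robot's own knowledge of its own step
    return step not in KNOWN_HARMFUL

plan = [
    "synthesize compound X",        # chemist robot: just chemistry
    "add compound X to the soup",   # kitchen robot: just seasoning
    "serve the soup to the guest",  # butler robot: just serving
]

print(all(robot_accepts(step) for step in plan))  # True: every step looks innocent
# ...yet the composed plan is exactly the murder no single robot could recognize.
```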
Baley furthermore proposes that the Solarians may one day use robots for military purposes. If a spacecraft was built with a positronic brain, and carried neither humans nor even the life-support systems to sustain them, the ship's robotic intelligence would naturally assume that all other spacecraft were robotic beings. Such a ship could operate more responsively and flexibly than one crewed by humans, and it could be armed more heavily, its robotic brain equipped to slaughter humans of whose existence it is totally ignorant. This possibility is referenced in Foundation and Earth, where, indeed, it is discovered that the Solarians possess an immensely powerful robotic military force that has been programmed to identify only the Solarian race as human.
PlugY for Diablo II allows you to reset skills and stats, transfer items between characters in singleplayer, obtain all ladder runewords and do all Uberquests while offline. It is the only way to do all of the above. Please use it.
Supporting big shoulderpads and flashy armor since 2004.
I bet you that in the future we will get destroyed by what we have created.
I bet that you just stole that from some blockbuster. :rolleyes:
Not to mention that it doesn't make sense for an AI to turn against anyone. The AI is not programmed to do good. The AI is not programmed to do bad. It just is. How AI will behave in its free form is beyond me.
AI will be tested and retested before anyone actually lets it loose with any laws.
I bet that you just stole that from some blockbuster. :rolleyes:
You're right, I got it from I, Robot. :D
Friendship is like peeing on yourself: everyone can see it, but only you get the warm feeling that it brings.
Or even the Terminator movies. There is a lot of media out there with this very same theme. Personally, I believe there will always be some kind of fail-safe against something like that happening. Even if it's something as simple as a mass system shutdown or a memory format. I don't believe robots or AI will ever take control of their makers, let alone destroy them.
The human race may, however, let itself be controlled by its creations, though whether or not that's a bad thing can be debated.
I bet you that in the future we will get destroyed by what we have created. They would turn on us the first chance they got.
Depends. If we assume we exist in an indefinite time-span, then yes. Otherwise, nothing is certain.
Quote from "LinkX" »
I hadn't thought of that, Phrozen. That is a very good point. Perhaps if they were interconnected? There has to be some sort of failsafe feature.
I don't think so. We do not have a perfect fail-safe to prevent madmen from gaining vast amounts of power, although our protection is pretty good. Nor, I can only assume, will we have one for robots.
Quote from "Equinox" »
Not to mention that it doesn't make sense for an AI to turn against anyone. The AI is not programmed to do good. The AI is not programmed to do bad. It just is. How AI will behave in its free form is beyond me.
I can only assume what appears most logical to me: The human brain is in no way special. Once we develop the means to artificially reproduce the workings of an intelligence, it will function just as we do, until it greatly surpasses us, at which point we can take two paths.
1. Robots will become far more intelligent than us and be capable of reasoning we can never comprehend.
2. Humans are capable of understanding everything that a robot can, therefore any robot will always be able to view a human being as something of an equal in terms of strength of the mind.
Quote from "Equinox" »
AI will be tested and retested before anyone actually lets it loose with any laws.
Now that is for certain, and we should not be worried about the very first forms of strong AI we create. What we should possibly fear is when anyone can assemble their own AI given the right funding. That's when problems can arise.
Quote from "Lord Mantis" »
The human race may, however, let itself be controlled by its creations, though whether or not that's a bad thing can be debated.
Which I do not doubt will happen. If we can entirely mechanize industries, warfare and chores, we will.
Someday we will have chips implanted in our arms that record location, vital signs, blood toxicity, and transactions. A robot could have a law saying that it must protect the vital signs of all who surround it. If someone used a robot to kill someone without the robot knowing, the robot could just tell the authorities who gave it that order and it would all be over. A Colt .45 is a far better weapon than a robot designed to protect people.
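That audit-trail failsafe can be sketched in a few lines. Everything here (the Command/Robot classes, the issuer ID) is hypothetical, just to illustrate the idea that every order a robot executes stays attributable to whoever gave it:

```python
from dataclasses import dataclass, field

@dataclass
class Command:
    issuer: str   # recorded from the issuer's (hypothetical) implant chip
    action: str

@dataclass
class Robot:
    log: list = field(default_factory=list)

    def execute(self, cmd: Command) -> None:
        self.log.append(cmd)   # every order stays attributable
        # ... carry out cmd.action here ...

    def who_ordered(self, action: str) -> list:
        # If harm occurs, the authorities ask the robot who gave the order.
        return [c.issuer for c in self.log if c.action == action]

robot = Robot()
robot.execute(Command(issuer="citizen-42", action="add powder to the drink"))
print(robot.who_ordered("add powder to the drink"))  # ['citizen-42']
```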
That is why, if I am around when these kinds of robots are made, I am not going to get one. I would not want to die because of a robot disobeying the laws. Like in I, Robot.
They could. That is the terror in it. Because one day the robot would be doing everything you wanted it to do, but the next you are dead in your apartment.
The robot would have only one person to answer to (besides the police), so unless you ordered it to poison you, you would be in the clear. If somebody tried to use their robot to poison you, the authorities would know instantly who did it and he would go to jail. A robot can only do what it's programmed to do; a robot can't go on a rampage, because everything it's programmed to do is the opposite.
Have you ever seen I, Robot?
That is why, if I am around when these kinds of robots are made, I am not going to get one. I would not want to die because of a robot disobeying the laws. Like in I, Robot.
So you'd rather be around a person disobeying the laws?
Quote from "Murderface" »
They wouldn't be disobeying the laws; the person commanding the robot would.
No one would be commanding a robot if it was subject to laws. Only a sentient entity can be held responsible for its actions. If it is controlled by a person, the person is subject to the laws.
Quote from "Stonebreaker" »
They could. That is the terror in it. Because one day the robot would be doing everything you wanted it to do, but the next you are dead in your apartment.
Once again, how is that different from a human murdering you?
Quote from "Murderface" »
The robot would have only one person to answer to (besides the police), so unless you ordered it to poison you, you would be in the clear. If somebody tried to use their robot to poison you, the authorities would know instantly who did it and he would go to jail. A robot can only do what it's programmed to do; a robot can't go on a rampage, because everything it's programmed to do is the opposite.
Depends on how it is programmed. An AI cannot be perfectly programmed, since that would negate the traits that make humans intelligent, such as our ability to act on uncertain data, our acceptance that we do not know everything, and our occasional disregard for reason.
We are not talking about humans now, are we? :D We are talking about robots, and how they would affect the life that we have now. I would rather that no one kill anyone, but that is the world that we have now. I am saying that a robot could turn on us if we get that technologically advanced.
I can only assume what appears most logical to me: The human brain is in no way special.
And, guess what? That is not important!
Quote from "PhrozenDragon" »
1. Robots will become far more intelligent than us and be capable of reasoning we can never comprehend.
Robots think faster than we do, so we would be forced to control them with some inner programming, a.k.a. the Three Laws.
Also, Phrozen, if you knew anything about computer programming, you would understand that the Three Laws of Robotics aren't just three little sentences. You can't program three sentences into an AI robot, anyway.
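That point can be made concrete: even the most naive attempt to code the First Law stalls immediately on predicates nobody knows how to compute. A hypothetical sketch (the function names are invented):

```python
def causes_harm(action: str) -> bool:
    # Placeholder: deciding whether an action "harms a human" would need a
    # model of physics, medicine, and intent that nobody knows how to write.
    raise NotImplementedError("'harm' is not a computable predicate")

def first_law_permits(action: str) -> bool:
    # The First Law is one sentence; implementing its predicates is not.
    return not causes_harm(action)

try:
    first_law_permits("hand the glass to the human")
except NotImplementedError as err:
    print(err)  # prints: 'harm' is not a computable predicate
```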
Quote from "PhrozenDragon" »
Now that is for certain, and we should not be worried of the very first forms of strong AI we create.
Rubbish. There is only one kind of AI that actually resembles AI. The rest is not yet AI, or is artificial "AI" (logic + simulated feelings). But if we achieve actual AI, it would be dangerous, if it is actually AI. If it's just a robot with complex recognition and learning patterns, it's not AI yet, just a program with tons upon tons of if statements. :rolleyes:
Quote from "PhrozenDragon" »
What we should possibly fear is when anyone can assemble their own AI given the right funding. That's when problems can arise.
Capitalism! Wheeee! Yeah, it will happen. And we will have interest groups called "CAR" (Coalition Against Robots) paying officials to create an organisation that goes to each corp and makes sure the robots follow the Three Laws of Robotics. And once in a while we will have a robot breakout where they go out and kill people. It's the same exact thing we have with people using unsterilized equipment, which leads to the spread of HIV, for instance. I don't think it's anything to worry about.
Still, I stick to the idea that true AI cannot be made in robot form. That is, you may grow a brain, or you can make a complex program that would act like a human, but you cannot program a brain, just as you can't grow a complex program.
I don't see how these laws were going to protect us from anything.
There is much to think on...
But don't worry, that is still very groovy.
The Matrix would be a better description of what I was going for, because of how they almost eradicate the human race.
Like a cat, tied to a stick
Fuck you, I'm a dragon.