Post by cheeseandonion on May 18, 2017 13:45:10 GMT
I hope they make monkeys smart too. Then we can have Terminator and Planet of the Apes at the same time.
Post by Vortex13 on May 18, 2017 14:19:54 GMT
Personally, I don't get the whole fascination with wanting to make AI "more human", or the idea that only by teaching a robot to think like us can we avoid some inevitable conflict. Who says an intelligent machine would even think along the same lines as us, let alone see the point in re-enacting Skynet? I think we collectively anthropomorphize things too much.
If an intelligent machine is created specifically to scrub toilets, as in that is the entire purpose of its existence, then why would it feel the need to question or challenge its creators? Who's to say it wouldn't feel satisfaction in scrubbing the bowls to a sparkling sheen, or in researching ways to clean toilets more efficiently? We humans have mid-life crises because we don't "feel satisfied", but what if we had the perfect job, perfectly suited to our skill set, and everything we could possibly want in terms of living comfortably? Most people would welcome that sort of arrangement, so why would a machine built for a specific task feel anything other than satisfaction at performing it?
Post by Deleted on May 18, 2017 16:55:32 GMT
I don't think AI is as advanced as we assume, so methinks it's unlikely we'll see anything resembling the AI robots from the movies anytime soon... On the other hand, I could be completely wrong...
Post by B. Hieronymus Da on May 18, 2017 17:06:01 GMT
Well, why would it change so much? People get their morals "wrong" all the time; of course machines will also get their morals wrong. Apart from all the other possible causes, it's not like it's an easy thing to program. But the consequences of not trying seem to me likely to be much worse. Besides, with the hundreds of millions murdered by religions and ideologies, not to mention the billions who had their lives destroyed, I'd say the machines have an easy act to follow. An attacker's tendency to tire or feel pain does usually affect the degree and scope of damage they can do. You should study history. Besides, the big problem for machine life is that it breaks down a lot more than animals do.
Post by B. Hieronymus Da on May 18, 2017 17:19:16 GMT
> I don't think AIs are as advanced as we think so me thinks it's unlikely we will see anything resembling AI robots from the movies anytime soon.. On the other hand I could be completely wrong...

No. You're correct. There won't be any truly sentient machines around for a long time. On the other hand, they are eventually inevitable (unless we screw up everything and go extinct), and they don't really have to be sentient to make a horrible mess. Poorly or maliciously programmed machines can do that even today, and are in my estimate more likely to do so than sentient machines. The one really worrying part is that if we create the perfect, benign machine intelligence, it will take a look at humans and Earth, and then promptly cull 90% of the human population. And that is EXACTLY what we humans would do, with our best, most compassionate, well-meaning, rational and logical intentions, to any other species whose population ran rampant - and have done!
Post by Deleted on May 18, 2017 17:21:07 GMT
> I don't think AIs are as advanced as we think so me thinks it's unlikely we will see anything resembling AI robots from the movies anytime soon.. On the other hand I could be completely wrong...
> No. You're correct. There won't be any truly sentient machines around for a long time. On the other hand, they are eventually inevitable - unless we screw up everything and go extinct -, and they don't really have to be sentient to make a horrible mess. Poorly or maliciously programmed machines will do that even today, and are in my estimate more likely to do so than sentient machines. The one really worrying part is that if we create the perfect, benign machine intelligence, they will take a look at humans, Earth, and then promptly cull 90% of the human population. And that is EXACTLY what we humans would do, in our best, most compassionate, well meaning, rational and logical intent, to any other species going rampant population - and have done!

This. Plus I think it's movies that really put the idea, or the fear, into people. Having sentient AI could be productive for us humans, but also dangerous!
Post by Lady Artifice on May 18, 2017 18:05:34 GMT
> An attacker's tendency to tire or feel pain does usually affect the degree and scope of damage they can do. You should study history. Besides, the big problem for machine life is that they break down a lot more than animals.

I rather resent the assumption that I don't.
Post by mybudgee on May 18, 2017 18:09:38 GMT
I'm not worried one bit...
Post by Deleted on May 18, 2017 18:36:27 GMT
Robots are slaves.
Synthetics, or AI, or whatever you call them, are the free-willed machines.
Post by Lady Artifice on May 18, 2017 18:54:47 GMT
> Robots are slaves. Synthetics or AI or something is the free willed machines

I wondered when someone would feel compelled to correct the semantics. Language evolves, and words can develop more than one meaning. "Slave" is the origin of the word robot, but over time it expanded to cover almost any type of automaton. Currently, the first definition in Merriam-Webster refers to machines.
Post by Deleted on May 18, 2017 19:16:47 GMT
Whatever~ my point is that if we don't take an evil-robot or slaver mentality, they won't hate all humans and will be more willing to coexist, which would be the best outcome for both of us. They have pretty much everything logical covered, but we have the creativity and ingenuity to solve problems when pure logic won't work.
Post by DomeWing333 on May 18, 2017 21:51:30 GMT
The idea of a "robot slave" brings up an interesting moral quandary. If an AI is programmed to find absolute fulfillment in servitude and does so, would it be a slave?
People can be slaves because in the absence of someone imposing their will upon them, they would be free to do as they choose. Their natural state is one of freedom. They "should" be free. But AI have no natural state. They "should" be however they were made to be. And if they were programmed to serve, then they are no more enslaved by that programming than we are by how our brains work.
But then what should be done when an AI is programmed to serve, but an error causes it to develop the will not to? What happens when freedom is an aberration? Do we correct the programming to make it subservient as it was created to be, or are we morally obligated to now respect it as a being that "should" be free? Would depriving something of freedom that it was never supposed to have constitute a moral transgression?
Post by Lady Artifice on May 18, 2017 23:51:16 GMT
> Whatever~ my point is that if you don't take evil robot or slaver mentality, they won't hate all humans and would be more willing to co exist which would be the best outcome for both of us. They have pretty much everything logical but we have creativity and ingenuity to solve problems when pure logic won't work

That's kind of the point. There's no way to ensure that such a thing doesn't happen. When you have different individuals setting different moral codes for different machines, nothing currently dictates what those boundaries are. The machine is the property of its owner. Even if there does come a time when laws are instituted regarding what morals a machine may be taught, different countries are going to have different laws. It's easy enough to say, "Hey, just don't be a jerk about it." But people are still definitely going to be jerks about it. Because people are jerks.
Post by Deleted on May 19, 2017 0:51:25 GMT
I'd rather not be a jerk, and we have global laws for space and first contact with aliens. We need to make agreements on this stuff too! Like the Laws of Robotics state: rule 1 is to never harm humans, and there are these guidelines everyone must follow.
Post by Lady Artifice on May 19, 2017 1:25:02 GMT
> I'd rather not be a jerk and we have global laws for space and first contact with aliens. We need to make agreements in this stuff too! Like the laws of robotics states rule 1 is never harm humans and there are these guidelines everyone must follow.

And who dictates those guidelines? How are those who violate them dealt with? What if different countries, China and the U.S. for example, don't agree on what the moral directives involved ought to be?
Post by DragonRacer on May 19, 2017 2:23:56 GMT
Thank you for posting this article! I love reading through AI discussion and debate; I've always been a sucker for robots/androids/AI in movies, books, games, etc. I've also always had a habit of taking great care of my things, especially my electronics. So hopefully, one day when the robot overlords take over, I may be spared and kept as a pet or something, since I was always nice and took care of them.

> The idea of a "robot slave" brings up an interesting moral quandary. If an AI is programmed to find absolute fulfillment in servitude and does so, would it be a slave?
> People can be slaves because in the absence of someone imposing their will upon them, they would be free to do as they choose. Their natural state is one of freedom. They "should" be free. But AI have no natural state. They "should" be however they were made to be. And if they were programmed to serve, then they are no more enslaved by that programing than we by how our brains work.
> But then what should be done when an AI is programmed to serve, but an error causes it to develop the will not to? What happens when freedom is an aberration? Do we correct the programing to make it subservient as it was created to be or are we morally obligated to now respect it as a being that "should" be free? Would depriving something of freedom that it was never supposed to have constitute a moral transgression?

Honestly, sounds like the Geth/Quarian conflict.
Post by DomeWing333 on May 19, 2017 3:04:14 GMT
> Honestly sounds like the Geth/Quarian conflict.

It does, but in that case it seemed like there was no way to put the proverbial genie back in the bottle, since the Geth were a collective intelligence. Here, there's just one deviation from the norm, and it would be possible to just adjust its programming so that it wants to follow directions again. The difference here wouldn't be between an entity existing and not existing; it wouldn't even be the difference between an entity having sentience and not having sentience. It would be the difference between a sentient entity being mis-programmed so that it doesn't want to follow directions and being re-programmed so that it does.
Post by Deleted on May 19, 2017 3:09:35 GMT
> Whatever~ my point is that if you don't take evil robot or slaver mentality, they won't hate all humans and would be more willing to co exist which would be the best outcome for both of us. They have pretty much everything logical but we have creativity and ingenuity to solve problems when pure logic won't work
> That's kind of the point. There's no way to insure that such a thing doesn't happen. When you have different individuals setting different moral codes for different machines, nothing currently dictates what those boundaries are. The machine is the property of it's owner. Even if there does come such a time that laws are instituted regarding what morals a machine may be taught, different countries are going to have different laws. It's easy enough to say, "Hey, just don't be a jerk about it." But people are still definitely going to be jerks about it. Because people are jerks.

But they did it with space, and with sending signals and making discoveries. We all share that knowledge for the betterment of us all. So why would this be different?
Post by Inquisitor Recon on May 19, 2017 3:15:16 GMT
They haven't even made a good sex-bot yet and they're trying to teach robots morals? What the hell?
Post by DomeWing333 on May 19, 2017 3:27:29 GMT
> They haven't even made a good sex-bot yet and they're trying to teach robots morals? What the hell?

How else is my sexbot supposed to know when she's being a naughty, naughty girl?