Post by Teabaggin Krogan on Feb 8, 2018 8:44:20 GMT
Stephen Hawking Warns Do Not Contact Aliens Or They Will Destroy Humanity.

I just don't get why people like Stephen Hawking and Elon Musk are so afraid of aliens and AI respectively. Sure there's an inherent danger in dealing with such scenarios, but there is inherent danger in anything we do or discover. There was a real fear that we would ignite the atmosphere with the first atomic bomb detonation. We could very well introduce extraterrestrial viruses or bacteria to our biosphere whenever we bring back samples from places like the Moon or Mars, and so on. Are we all supposed to just regress back to the Stone Age because anything more complex than that will ultimately "destroy humanity"? We should be aware of the danger and take steps to avoid it, of course, but sticking our collective heads in the sand and ignoring the situation is not going to do anyone any favors if and when we actually do establish contact with an alien society or create an artificial intelligence.

AI is a subject that well warrants caution in its development. With the current levels of technology we have, and with access to the internet, a self-aware entity could easily be above and beyond any level of intelligence we can humanly imagine as of now. Creating something that is smarter than us and has the autonomy to make its own decisions, or even question and debate the instructions it is given, can turn out really badly if we are not careful. Similarly, an encounter with an extraterrestrial species of superior capability or intelligence would put those beings above us on the food chain. We can hope they are benign and wish us well, but they might have no such inclination. They might not even think, or perceive us, the same way we do the world around us. Of course this does not mean that we should not pursue these avenues, not try to develop AI, or not continue our search for aliens. Human curiosity naturally means that we will pursue these paths; it is smart, however, to be aware and afraid of their risks.

When all they do is perform various clown acts in space, what other conclusion is there. Flipping in space, juggling fruit in space, sending cars in space. What's next, making an elephant disappear in space? This isn't science, it's clowning around.

I know you're probably trolling, but please do not go about spreading misinformation like this. Most people on this forum are probably smart enough not to believe in conspiracy theories, but with all those flat-earth advocates out there, I'm not sure anymore. And if your post manages to mislead even a single person, then you, sir, should be ashamed. In case you are serious about your convictions, please understand that the concepts you mentioned, such as gravity and how thrust works in a vacuum, have been explained by the laws of physics and proven through mathematics. Furthermore, they have been reviewed, discussed and analysed through the scientific method an innumerable number of times, both in theory and in practical applications. Of course, you can choose to believe this or some nutcase on the internet, but that's your choice.

As for Tesla sending a car to space, AFAIK the car was a substitute for a payload of a certain mass. They could've sent either a block of metal or the Roadster, and the car is way better publicity. Nobody would've really cared if Tesla sent a hunk of metal to space and brought it back, but the absurdity of sending a car is free publicity, irrespective of what your personal views on the matter might be. I dunno about Ol'Musky being space jesus and all that, but I gotta admit the launch did what it was supposed to do (for the most part). That synchronized rocket landing was pretty cool, btw.
Post by Arijon van Goyen on Feb 8, 2018 10:16:14 GMT
Stephen Hawking Warns Do Not Contact Aliens Or They Will Destroy Humanity. I just don't get why people like Stephen Hawking and Elon Musk are so afraid of aliens and AI respectively.

Because pure optimism is idiocy? BTW, even this guy praises Elon Musk's achievement in space!
Post by B. Hieronymus Da on Feb 8, 2018 10:29:43 GMT
I started this topic and I am really, truly fond of it. Is there any chance we could keep the trolling to a minimum? We've already had a topic that cast shade on outer space; it was closed. Perhaps we should learn from that?

Is there any word on the center booster? Is it recoverable?

It depends on what you mean by recoverable... The question is whether there is any point in doing so. It smashed into the sea at 300 mph, so it's quite broken. And the reason it failed is known, so it does not need to be examined. The most economical path, however, is to use only Block 5 components from here on. The difference is that Block 5 is better engineered to be reusable, so turnaround will be faster and cheaper.
Post by B. Hieronymus Da on Feb 8, 2018 11:14:55 GMT
Stephen Hawking Warns Do Not Contact Aliens Or They Will Destroy Humanity. I just don't get why people like Stephen Hawking and Elon Musk are so afraid of aliens and AI respectively. Sure there's an inherent danger in dealing with such scenarios, but there is inherent danger in anything we do or discover. There was a real fear that we would ignite the atmosphere with the first atomic bomb detonation. We could very well introduce extraterrestrial viruses or bacteria to our biosphere whenever we bring back samples from places like the Moon or Mars, and so on. Are we all supposed to just regress back to the Stone Age because anything more complex than that will ultimately "destroy humanity"? We should be aware of the danger and take steps to avoid it, of course, but sticking our collective heads in the sand and ignoring the situation is not going to do anyone any favors if and when we actually do establish contact with an alien society or create an artificial intelligence.

I do get it. Totally. While contact with aliens would be very exciting, there is no reason to think that it would turn out well for both civilizations. While the analogy is not perfect, it has never happened on Earth.

As for AI, I don't see many possible futures. Either human civilization will fail to evolve (quite possible, given the proliferation of destructive forces: environmental destruction - global warming is most certainly entirely real -, social destruction by culture-equivalence, equal-outcome ideologies and religion, and failure to meet the technical-economic challenges of coming global extinction events like the Yellowstone caldera, comets and asteroids), and then it will assuredly lapse back into something Iron Age-ish at best, where it will dwindle for tens of thousands of years... or longer. Or: if human civilization continues to evolve, then sentient AI is totally inevitable! Eventually. It will take time, but it will happen, and maybe it also needs to happen.

And then it will eventually follow that the machines will "take over". That too is inevitable. When they do, we had better hope that they love humans... and biological life. That is something we must build into their behavioral motivations at a very fundamental level. If we succeed at that, the machines will themselves keep up the practice with new generations of machines, to protect what they love. If we fail to become the machines' pets, there will probably be no place for humans in the future.
Post by Arijon van Goyen on Feb 8, 2018 11:52:28 GMT
I just don't get why people like Stephen Hawking and Elon Musk are so afraid of aliens and AI respectively. Sure there's an inherent danger in dealing with such scenarios, but there is inherent danger in anything we do or discover. There was a real fear that we would ignite the atmosphere with the first atomic bomb detonation. We could very well introduce extraterrestrial viruses or bacteria to our biosphere whenever we bring back samples from places like the Moon or Mars, and so on. Are we all supposed to just regress back to the Stone Age because anything more complex than that will ultimately "destroy humanity"? We should be aware of the danger and take steps to avoid it, of course, but sticking our collective heads in the sand and ignoring the situation is not going to do anyone any favors if and when we actually do establish contact with an alien society or create an artificial intelligence.

I do get it. Totally. While contact with aliens would be very exciting, there is no reason to think that it would turn out well for both civilizations. While the analogy is not perfect, it has never happened on Earth. As for AI, I don't see many possible futures. Either human civilization will fail to evolve (quite possible, given the proliferation of destructive forces: environmental destruction - global warming is most certainly entirely real -, social destruction by culture-equivalence, equal-outcome ideologies and religion, and failure to meet the technical-economic challenges of coming global extinction events like the Yellowstone caldera, comets and asteroids), and then it will assuredly lapse back into something Iron Age-ish at best, where it will dwindle for tens of thousands of years... or longer. Or: if human civilization continues to evolve, then sentient AI is totally inevitable! Eventually. It will take time, but it will happen, and maybe it also needs to happen. And then it will eventually follow that the machines will "take over". That too is inevitable. When they do, we had better hope that they love humans... and biological life. That is something we must build into their behavioral motivations at a very fundamental level. If we succeed at that, the machines will themselves keep up the practice with new generations of machines, to protect what they love. If we fail to become the machines' pets, there will probably be no place for humans in the future.

I support any group that will sabotage any AI development smarter than what we already have in 2018. This threat is greater than nuclear weapons, yet you people can't see it.
Post by Arijon van Goyen on Feb 8, 2018 12:18:20 GMT
Post by mattig89ch on Feb 8, 2018 14:40:29 GMT
Interesting take on the difference between empathy and sympathy.
Post by Terminator Force on Feb 8, 2018 15:29:19 GMT
When all they do is perform various clown acts in space, what other conclusion is there. Flipping in space, juggling fruit in space, sending cars in space. What's next, making an elephant disappear in space? This isn't science, it's clowning around.

I know you're probably trolling, but please do not go about spreading misinformation like this. Most people on this forum are probably smart enough not to believe in conspiracy theories, but with all those flat-earth advocates out there, I'm not sure anymore. And if your post manages to mislead even a single person, then you, sir, should be ashamed. In case you are serious about your convictions, please understand that the concepts you mentioned, such as gravity and how thrust works in a vacuum, have been explained by the laws of physics and proven through mathematics. Furthermore, they have been reviewed, discussed and analysed through the scientific method an innumerable number of times, both in theory and in practical applications. Of course, you can choose to believe this or some nutcase on the internet, but that's your choice. As for Tesla sending a car to space, AFAIK the car was a substitute for a payload of a certain mass. They could've sent either a block of metal or the Roadster, and the car is way better publicity. Nobody would've really cared if Tesla sent a hunk of metal to space and brought it back, but the absurdity of sending a car is free publicity, irrespective of what your personal views on the matter might be. I dunno about Ol'Musky being space jesus and all that, but I gotta admit the launch did what it was supposed to do (for the most part). That synchronized rocket landing was pretty cool, btw.

I don't want to derail this thread, as someone doesn't want me to and I wish to respect this, so just this one more reply. There are so many red flags that show we haven't been in space.

^ NASA graphic designer admits all images of space are fake, Photoshopped CGIs, and that it has to be that way. That he has never seen Earth from outer space (not even a photo) and has to use his imagination to assume what the Earth looks like from outer space. Then this globe is presented by NASA as how it looks from space. Come on, that's a serious red flag. Just watch the video, as it's time-stamped (3:12) to the NASA graphic designer's voice describing the process of how he makes the globe. And the entire internet is filled to the brim with red flags like this. If space were real, there shouldn't be so many red flags. And come on, so far all of space is still CGI in lower-than-standard video quality to mask the CGI (including the Tesla car in space, which looks like it's running on an X-Bone in lower resolution. Come on, you can all spot CGI in movies but not from NASA?).

^ We've all been indoctrinated to dismiss conspiracy theories and not ask questions. But in this day and age, more than ever, you have to question everything.
Post by mattig89ch on Feb 8, 2018 17:51:11 GMT
Mr. Manley's take on the launch. It was very interesting to watch.
Post by Vortex13 on Feb 8, 2018 18:07:38 GMT
I do get it. Totally. While contact with aliens would be very exciting, there is no reason to think that it would turn out well for both civilizations. While the analogy is not perfect, it has never happened on Earth. As for AI, I don't see many possible futures. Either human civilization will fail to evolve (quite possible, given the proliferation of destructive forces: environmental destruction - global warming is most certainly entirely real -, social destruction by culture-equivalence, equal-outcome ideologies and religion, and failure to meet the technical-economic challenges of coming global extinction events like the Yellowstone caldera, comets and asteroids), and then it will assuredly lapse back into something Iron Age-ish at best, where it will dwindle for tens of thousands of years... or longer. Or: if human civilization continues to evolve, then sentient AI is totally inevitable! Eventually. It will take time, but it will happen, and maybe it also needs to happen. And then it will eventually follow that the machines will "take over". That too is inevitable. When they do, we had better hope that they love humans... and biological life. That is something we must build into their behavioral motivations at a very fundamental level. If we succeed at that, the machines will themselves keep up the practice with new generations of machines, to protect what they love. If we fail to become the machines' pets, there will probably be no place for humans in the future.

I support any group that will sabotage any AI development smarter than what we already have in 2018. This threat is greater than nuclear weapons, yet you people can't see it.

That's the thing with AI research and development, though: why is everyone just assuming that the moment we develop a sapient artificial intelligence, it will instantly have access to the internet, military facilities, and nuclear launch codes? Surely, if we're as stupid as humanity is depicted in those science fiction stories that involve a supercomputer taking over, then we absolutely deserve to be wiped out. Closed networks, hardline and manual connections, etc. Any AI being created would hopefully have safeguards in place to prevent broad access to our high-security areas. And even when the creation of sapient AI becomes a reality, why on God's green Earth do we think that humanity will instantly give said AI the keys to our entire civilization? That would be equivalent to humanity killing off horses once the first automobile was invented, or disposing of all candles when Edison created the light bulb.

There's also the matter of an AI's purpose behind its creation. There's the common misconception (IMO) that once an AI develops sapience, it will either want to go about discovering what love is, or will instantly go on a "kill all humans" murder spree, but why would that be the case with an intelligence specifically designed to do a particular job? Hopefully humanity will be smart enough to give any AI we create a very clear set of prerogatives to follow, and not just create a mega-computer that's vaguely tasked with discovering the meaning of life and happiness. I mean, we don't expect "simple" processes, like autopilot programs, to automatically know exactly what we want without a very clear input of data, so why do we think sapient AI will be any different? A self-aware janitor bot, for example, is not necessarily going to be interested in free-market economies, or the works of Beethoven, when its entire existence revolves around mopping floors, maybe with the possibility of learning new ways to mop floors more efficiently. Such an existence seems unthinkable to us, since we don't have a clearly defined purpose for being here; but really, would you want to change your profession if you had a job perfectly suited to you and your skills, one that you absolutely excelled at doing, and, to carry the analogy a step further, one that brought you joy and fulfillment? And even if our janitor bot was a little uppity, who says that we have to give it unrestricted access to our information hubs and infrastructure? An automated assembly line in a car factory doesn't have direct access to power grids or air traffic control, after all.

As for alien intelligence, sure, First Contact might not be peaceful, but what are we really going to do to stop any highly advanced civilization from discovering us? Earth puts out far more radio waves and EMF signals than a planet of our size and temperature should be able to generate naturally, so unless we're all going to regress back to being cavemen with signal fires, there's really nothing we can do to stop broadcasting our presence to anything with ears to hear it. Instead of hiding in fear that we might draw attention, we should make preparations for if and when we make First Contact, peaceful or not.
Post by The Hype Himself on Feb 8, 2018 19:10:13 GMT
^ We've all been indoctrinated to dismiss conspiracy theories and not ask questions. But in this day and age, more than ever, you have to question everything.

Including, and especially, your own trite BS. In particular, your half-baked theories fall apart under scrutiny more often than a clown on a pogo stick on a sheet of ice. It's telling that peddlers of conspiracy theories have to invent ways to dismiss evidence and in return say 'trust me on this', without providing any sort of plausible or credible alternative. Quit trolling and posting crap in this thread. It's not science, it's nonsense.
Post by Terminator Force on Feb 8, 2018 19:23:53 GMT
I'm done here. Don't quote me anymore.
Post by Gecko on Feb 8, 2018 19:30:51 GMT
And with that settled, can we get back to discussing real science, please?
Cheers.
Post by KrrKs on Feb 8, 2018 20:01:19 GMT
Post by B. Hieronymus Da on Feb 8, 2018 20:48:59 GMT
I do get it. Totally. While contact with aliens would be very exciting, there is no reason to think that it would turn out well for both civilizations. While the analogy is not perfect, it has never happened on Earth. As for AI, I don't see many possible futures. Either human civilization will fail to evolve (quite possible, given the proliferation of destructive forces: environmental destruction - global warming is most certainly entirely real -, social destruction by culture-equivalence, equal-outcome ideologies and religion, and failure to meet the technical-economic challenges of coming global extinction events like the Yellowstone caldera, comets and asteroids), and then it will assuredly lapse back into something Iron Age-ish at best, where it will dwindle for tens of thousands of years... or longer. Or: if human civilization continues to evolve, then sentient AI is totally inevitable! Eventually. It will take time, but it will happen, and maybe it also needs to happen. And then it will eventually follow that the machines will "take over". That too is inevitable. When they do, we had better hope that they love humans... and biological life. That is something we must build into their behavioral motivations at a very fundamental level. If we succeed at that, the machines will themselves keep up the practice with new generations of machines, to protect what they love. If we fail to become the machines' pets, there will probably be no place for humans in the future.

I support any group that will sabotage any AI development smarter than what we already have in 2018. This threat is greater than nuclear weapons, yet you people can't see it.

Well, two comments. First, we're nowhere near creating a truly sentient machine. It's still far off in the future, and we kind of need AI that is better than what we have today, in 2018. This is not something hanging over us. Secondly, that tactic/strategy will fail. Sabotaging AI development will not stop the inevitable; only the collapse of humanity will do that. Also, some kinds of sabotage might be very dangerous, possibly fatal - if, for instance, the work on the motivational patterns is flawed.
Post by mousestalker on Feb 8, 2018 20:53:18 GMT
If this helps any, the same types of minds that thought up 'Clippy' are the ones that will design the first true AI.
Post by Beerfish on Feb 8, 2018 21:27:40 GMT
I liked Clippy; at least he asked before doing stuff. Now I enter the room and my laptop is self-updating so that something will no longer work on my computer.
Post by mattig89ch on Feb 9, 2018 12:57:30 GMT
Interesting stuff. Just make sure to tell your doc you don't want to immediately go on meds.
Post by mousestalker on Feb 9, 2018 16:25:30 GMT
Post by Beerfish on Feb 9, 2018 16:33:50 GMT
No wonder I have a problem.
Post by Vortex13 on Feb 9, 2018 16:57:21 GMT
That's the thing with AI research and development, though: why is everyone just assuming that the moment we develop a sapient artificial intelligence, it will instantly have access to the internet, military facilities, and nuclear launch codes? Surely, if we're as stupid as humanity is depicted in those science fiction stories that involve a supercomputer taking over, then we absolutely deserve to be wiped out. Closed networks, hardline and manual connections, etc. Any AI being created would hopefully have safeguards in place to prevent broad access to our high-security areas. And even when the creation of sapient AI becomes a reality, why on God's green Earth do we think that humanity will instantly give said AI the keys to our entire civilization? That would be equivalent to humanity killing off horses once the first automobile was invented, or disposing of all candles when Edison created the light bulb.

This is a good point, in theory. The problem is that you are underestimating the power of a superintelligence, and you are also discounting the possibility that, due to the sheer difference in capacity, the motivations and intelligence of a superintelligence would be just as alien to you as your motivations and intelligence are to the average ant. I can try to explain it all myself, but someone who is an actual expert on this specific topic explained it all much better than I can ever hope to, in the following podcast. (The explanation of why Asimov's three laws are irrelevant and unfeasible in reality is something I found particularly enlightening, as I had never heard about it before.) It's not a short conversation, but if anyone here is interested in AI, I can't recommend it enough:

Oh, I have no doubt that a self-improving AI will have motivations and intelligence far beyond what we can think of. But I also know that no matter how smart said AI gets, if it's locked into a separated network, or what amounts to a Faraday cage, it can be as "beyond our comprehension" as it wants; it's still not going to be able to break out or do anything that we don't want it to outside of containment. It's the same principle as working with hazardous bio-weapons or super-bacteria: we take steps to isolate and quarantine the potentially dangerous elements while we conduct research. An artificial superintelligence isn't going to get anywhere if it's limited to a separate network with physical connections it has to bridge at dial-up speeds (for example).

And while I am sure most mainstream science fiction will disagree with me on the effect an AI's purpose will have on its actions, I still say that being definitively created for a particular task will stop, or at least massively hinder, most sapient intelligences from wanting to pull a Skynet on us. To use a broad analogy, it would be like how our human bodies are adapted, or "created", with the purpose of breathing a nitrogen/oxygen atmosphere to function. There is no massive movement on our part to fundamentally change that purpose, because that is what we are best suited for. Likewise, why would an AI created for the purpose of managing the energy levels of a power plant, or sorting out packages for a shipping company, want to fundamentally alter its task in favor of global domination or "killing all meatbags" if that's not what it was created for? Now, if you get an ASI and give it some asinine directive, like, say, "Determine how many giggles are in a fluffy rainbow," then sure, you might get a breakdown, but hopefully we as a species aren't so stupid as to ask beings of pure logic and cold hard facts such idiotic and pointless questions. I mean, really? If we're going to ask a quantum-computing intelligence something that pointless, then we really don't deserve to continue as a species.
Post by mattig89ch on Feb 10, 2018 13:12:01 GMT
Post by mousestalker on Feb 10, 2018 14:24:03 GMT
Post by Lavochkin on Feb 10, 2018 17:38:22 GMT
Amy tells us why NASA performed reentry landings over the sea and why the USSR opted to do them over land.