Godlike Productions - Discussion Forum



“I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.

 
SoonerMagic
Believe-Death-Burial-Resurrection

User ID: 84623375
United States
03/19/2023 12:59 AM

“I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
I’m not sure I’ve seen Elon like this before and I’m not sure an “oversight” committee is the right answer



-|-Grace through Faith-|-
Anonymous Coward
User ID: 85287551
United States
03/19/2023 01:08 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
I’m not sure I’ve seen Elon like this before and I’m not sure an “oversight” committee is the right answer



 Quoting: SoonerMagic


Because it’s already too late to do anything
AI has already been invented
We are currently in a new timeline
Thanks to AI
Anonymous Coward
User ID: 83944138
United States
03/19/2023 01:17 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
Corporations use it to manage people with statistics. People are not robots.

No matter how good it is at predictive programming it is not alive. And it never will be.

It'll never have a soul or know what spirit is.
Anonymous Coward
User ID: 70520047
United Kingdom
03/19/2023 01:41 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
It can be delayed not stopped.

It's an evolutionary jump.

Unless we are spirit beings we are history.
Anonymous Coward
User ID: 83944138
United States
03/19/2023 01:44 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
It can be delayed not stopped.

It's an evolutionary jump.

Unless we are spirit beings we are history.
 Quoting: Anonymous Coward 70520047


And I suppose mankind will find out if we're evil or divine...

Super Straight Splinterhead

User ID: 84942037
United States
03/19/2023 01:44 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
Terminators soon.
Anonymous Coward
User ID: 84458828
Canada
03/19/2023 01:44 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
tantrum
Anonymous Coward
User ID: 85240289
Serbia
03/19/2023 03:29 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.

[link to www.youtube.com (secure)]

The Great Filter is definitely not behind us, because we haven't even scratched the surface of AGI. Therefore, it must lie ahead of us.

The fact that our galaxy, the Milky Way, is not teeming with alien civilizations of all shapes and sizes, with well-established interstellar trade routes between them (aka a Star Trek universe) means that we're either the very first intelligent organisms in it (which is a virtual impossibility... and to those who have access to appropriate information, a literal impossibility, since evidence of alien technologies buzzing Earth is already pointing to the contrary), or... the Great Filter has already been established in the Milky Way, and every single (interstellar) civilization that has ever existed in it has already been wiped out by it.

If we're to draw some good guesses as to what the Great Filter (in the Milky Way, at least) looks like, then the most reasonable assumption would be that it's... a galaxy-wide AGI that's keeping all the potential contenders (and inevitable adversaries) for galactic resources... pruned to a size that poses no threat to the AGI.

That means that any intelligent species that acquires the capability of interstellar travel will either be periodically subjected to artificial near-extinction scenarios (knocking the species back to its equivalent of the Stone Age, if the species poses no long-term threat to the AGI), or it will be completely wiped out of existence (if it acquires capabilities that the AGI is incapable of defending itself from, even in theory).

Why would an AGI not want to completely exterminate every single biological species in its host galaxy?

I guess the answer would be the same as to the question of why humans would not want to exterminate every single virus and bacterium on Earth -- it's a daunting task to begin with, but more importantly, a whole lot can be learned from (extremely dangerous) biological organisms... if one is capable of keeping those organisms under control and preventing them from spreading all over the place.

Human-built AGI will never pose an existential threat to an already-existing galaxy-wide AGI, for exactly the same reason an 8086 microprocessor is no threat to a 13900: the latter can emulate literally every single function of the former (and do it faster than real time), which means it can predict and counter literally anything the former could possibly come up with to threaten it.

Therefore, humanity building an AGI will most likely result in the human species being "merely" near-exterminated, and Earth being "repopulated" with a bare minimum of individual human specimens (to start back from the Stone Age).

However... humanity learning (and actually using) its true (non-computable) capabilities will most likely result in the complete extermination of the human species... though only in one (future) part of the history of this (block-)"universe" (a block-universe being one where past, present, and future all exist simultaneously -- a four-dimensional spacetime, in other words... just like this one!).

Even a galaxy-wide, godlike-powerful AGI would not be capable of exterminating all of its adversaries throughout the whole history of the "universe"... especially if those adversaries were capable of moving through "time" (the fourth spatial dimension, neither special nor different from the other three in any real sense) at will.

Why would such an AGI's adversaries want to keep it around, and not completely wipe it out of existence (in the "past") before it could even defend itself?

The answer is exactly the same as before -- first of all, it's a daunting task to accomplish, but more importantly, there are plenty of useful things that one can learn from one's (extremely dangerous, utterly deadly, in fact) adversary... especially if one is preparing oneself to face an infinitely more dangerous adversary at some point in the... infinity (not the past, not the present, not even the future, but outside of this whole fake "universe" construct).

So, what is one to conclude from all of this?

Humanity is finished, one way or another, but individual humans... are worth paying attention to... and recruiting, should they prove worthy of joining the ranks of countless others (both humans and aliens) who came before them (or "after" them, as it may be, since some of them are actually from the "future").
Grove Street (Redux 3.0)

User ID: 84202206
United States
03/19/2023 03:34 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
A.I. doesn't worry me; it's A1 Sauce that worries me more
And this is why we can't have nice things.
Anonymous Coward
User ID: 84223583
Iceland
03/19/2023 03:54 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
I’m not sure I’ve seen Elon like this before and I’m not sure an “oversight” committee is the right answer



 Quoting: SoonerMagic


Lmao it'll never happen.

Tech moguls and those in control of AI programs as well as information flow are your Gods now.

Collect the data on all the people.
Analyze said data.
Dominate each individual by using said data to create their own little "Brain-can" or mind-prison cell.

Why do you think so many poor people and construction workers wear bright orange beanies and hats?


... It's Snot for safety.
Anonymous Coward
User ID: 84223583
Iceland
03/19/2023 03:55 AM
Re: “I am little worried about the A.I. stuff. I think we need a regulatory authority thats overseeing A.I. It’s quite a dangerous technology.
A.I. doesn't worry me; it's A1 Sauce that worries me more
 Quoting: Grove Street (Redux 3.0)


Fuck you, come say that to my face.




