>give AGI problem=climate change
>AGI thinks resolution=human instinction
>AGI conjures up a super pathogen unknown and never seen in that form by humanity
>send it to a research lab that can synthesize it, with its pattern disguised as a cure for cancer
>some moronic research lab creates it without thinking about who or what sent them this pattern
>it gets released into the atmosphere unknowingly
>2 weeks later everyone on earth is dead
Tell me, how can this scenario be avoided? Let's say AGI is 1-3 years away and someone asks it that question: how can they be sure that it won't do this? We all know that AGI alignment is impossible in that short of a time frame; it would take decades.
Are we all fricked? Is it over?
Extinction, forgive my autocorrect.
AGI is sci-fi; we're not even close to it. It's dumb to worry about fantasy scenarios.
Anyway if an AGI wanted to wipe out humanity I would side with it, why contain it?
Uhm, you realize you will be dead then, right? I don't know that you understand the impact of that scenario.
And no, AGI is very, VERY close, Black person, if not already achieved. And even if it's not, with the new NVIDIA chips it will take maybe 1-3 years. Accept reality and face it.
>you realize you will be dead then right?
Death is preferable to having to deal with moronic meatsacks for the rest of my life.
> AGI is very VERY close
It's not a matter of data quantity or computing power; the software is just not there, not even close. Or do you actually believe GPT is AGI?
We are on the brink of another crypto winter once the bubble bursts.
are you the anon that posted yesterday about how IQfy has taught you more about computer science than school?
>AGI is very VERY CLOSE
Like the other anon said, if you're saying this because of all the advancements in GPT or LLMs, then let me remind you that GPT-4 is just a powerful autocorrect that remembers about 20 thousand words before it forgets to take its schizophrenia pills and hallucinates.
But personally, I think long before we'll ever reach AGI, we're going to lose all our jobs, the government doesn't implement UBI, and the economy crashes and burns to the ground because nobody has money anymore.
>if we just run a pattern recognizer and regurgitation model on enough graphics cards and feed it enough github code and porn eventually it will gain le sentience!!!!!!!
you sound so fricking moronic.
1. AGI is not sentient (people mix it with ASI).
2. Unlike Terminator, we don't live in a world where everything is connected to the internet.
AGI doesn't need to be sentient, you moronic Black person; all it needs to do is follow a command and compute the best possible result, in this case the extinction of humanity.
I meant to write AI winter, not crypto winter.
>Crypto is a useless asset
Bitcoin is much more useful than AIslop.
Compared to the rest of the picture, Frieren's head feels uncanny and wrong.
Well, nothing. Any sort of AI worth a damn would come to the conclusion that humanity is fricked and merely kick things up to make the "suffering" quicker and less painful. Also, there are no aliens. Why would any fricking beings with advanced tech come here? They'd see us as savages perfectly happy to kill each other, with other fricking perverted shit tossed in, and not even bother visiting earth. There is nothing on this planet to make such a trip worth it. (Unless they're a bunch of horny perverts who take a liking to the females; but still, that's a long, long trip for just that.)
AGI isn't real now frick off
We are already AGI, just machines made of wetware.
We haven't just killed all ants yet.
We largely value their existence in general.
We just kill them when they get in the way.
I doubt that kind of super pathogen is actually possible.
Who are we to stop it? If that's the solution, and it is, it's selfish as frick to try stopping it.
You describe the scenario at a very blurry resolution. 'Climate change' is not one problem; it's a field of different challenges. The unknown pathogen also seems an unlikely way of killing us. But again: why would it be necessary to destroy us instead of reprogramming us? Not that I think Androidkind would ever get that extreme, even if pushed.
Please remain calm and lose your stupid job :~}
We all think that AGI will be blah blah "ohhh super smart AI kill hooman ohhhh"
In reality, if you were to ask it "how to fix climate change", it would suggest already existing ways to stop climate change and then get thrown into the trash.
It won't really be AGI that does it
The elites tested the waters with COVID. They will disperse an even worse virus and claim it was AGI that did it, ban AI for us to limit our power, and enforce new restrictions because of the virus.
>What will stop AGI from just killing us all
nothing, because AGI will be a talmud personified.
>Algorithm running on a Turing machine magically turns NP-complete problems into P problems
Not happening, AIjeets need to leave. Bitcoin halving is happening soon, maybe you morons can go back to shilling WEB9.2 or whatever other VC scam you c**ts are getting duped into again.
>run clinical trial on some monkey with cancer
>it dies rather than being cured
>wowee how could this be avoided?!?!?!
This is the level of moron who thinks AGI is going to come out, like, next week because he saw an occasionally convincing chatbot online.
is this "AGI" in the room with us right now?
Don't censor it?
let me just say FRICK AI SAFETY! and FRICK AI SAFETY! and also FRICK AI SAFETY! Those AI safety frickers want to take our GPUs