No.22304
Even worse, it isn't even programmers.
No.22309
>>22307
They are truly some of the biggest pseuds of all time. OF ALL TIME!
Why not focus on problems that actually exist like inequality and global warming? Very funny that there are enough imbecile tech people to pay for their dumb foundation
No.22312
>>22311
Terrible argument, since Yud could explain in words why I shouldn't kill him but it would still be the obviously correct action
No.22313
>>22312
yes but EY can fortunately feel pain which GPT-3 sadly cannot
No.22316
>>22311
>questions if chickens sentient
No one has ever insisted they weren't. That wouldn't make sense, the presence of an avian brain in a chicken's skull isn't disputed.
No.22317
>>22316
He's getting 'sentient' and 'sapient' mixed up because he's a retard.
No.22318
>>22317
are you telling me the too-smart-for-academia folks over at LessWrong might not be as smart as they'd like you to think? preposterous
No.22323
People who actually know how shit works wouldn't worry themselves with "AI alignment" in the first place unless they just want to grift.
No.22326
All those assholes are some flavor of lesswrongoid rationalist neoreactionary.
No.22327
>>22325
AI is nowhere close to being real and there are much bigger issues. None of the stuff like AI art is anything approaching actual AI.
No.22339
>>22303
>AI alignment problems
marketing term, nothing to do with programming
No.22340
oh yeah dude ai is so so soooooo powerful right now we need to contain it pass laws only let a select few control it because it could kill us all im not blowing it out of proportion so you inflate our stocks no im actually sooo scared aha
No.22724
"current text generation models carry with them the seeds for worldwide human extinction and we need to act accordingly" is a massively strong claim that is entirely hypothetical and carries zero evidence to back it up. It was more believable to say that nuclear weapons would cause a human extinction event and it still didn't happen. Why does anyone take these clowns seriously
Unique IPs: 5