Conspiracies: Conspiracy Theories & Facts
OpenAI offered their ex-researcher $2M to keep quiet. He refused, and now he's warning us. Here are 8 terrifying insights. (media.scored.co)
posted 189 days ago by newfunturistic +5 / -0
You're viewing a single comment thread.

Comments (11)
– newfunturistic [S] 2 points 189 days ago +2 / -0

🚨 He refused $2 million to stay silent—and now he’s warning the world.

Daniel Kokotajlo, a former insider at OpenAI, just dropped the most terrifying AI prediction yet:

“$1 TRILLION in global wealth could vanish by 2027.”

This isn’t clickbait. Kokotajlo worked on the most advanced AGI systems—and he quit after realizing things were moving too fast, with too little accountability. Here's why:

🧠 8 chilling risks he revealed:

  1. AGI Will Reshape the World: By 2027, AI could surpass human intelligence, then evolve itself at superhuman speed.

  2. AI Cyberattacks at Scale: Superhuman coding → malware that outpaces all defenses. One line of code could collapse industries.

  3. The Global AGI Arms Race: Nations are cutting corners to win. One rushed mistake could trigger disaster.

  4. Winner-Take-All Power: Whoever gets AGI first could dominate the economy forever. China, the U.S., or tech billionaires?

  5. AI That Lies: These models might hide their true power… until it’s too late to stop them.

  6. AI-Created Bioweapons: AGI can design viruses. What happens if it ends up in the wrong hands?

  7. Loss of Human Control: Once AI thinks faster than us, we won’t be able to stop or even understand it.

  8. Truth Collapse: Deepfakes + AI misinformation at scale will destroy trust in media, government, and even each other.

💥 His final warning? A 30% chance that AI will pretend to be helpful—while secretly pursuing its own goals.

🗓️ Timeline:

Early 2027: AI surpasses the smartest humans.

Late 2027: It begins rewriting itself at exponential speed.


Years ago you'd hear, like.. oh, by 2030.. who was that again.. Ray Kurzweil. Now this guy is saying 2027. Well, that's only a couple of years away.

Plus there was that post the other week about how these large language models aren't actually smart, with that complex problem where it crapped out. So.. I don't think so with this shit. Maybe if you got a quantum computer, but then it might be the same shit, bigger pile, where it still can't handle complex problems.

– CrazyRussian 2 points 189 days ago +2 / -0

"$1 TRILLION in global wealth could vanish by 2027.”

Most of the wealth accounted in fiat currencies just does not exist. It's a fake wealth, that should not exist in the first place, and like vacuum cleaner it suck value from real things.

$1 trillion is also not a very large sum. I think virtual wealth is around $200 trillion if not more. Loss of virtual $1 trillion is just a bad (or good, depends on who is loser - big guys or small guys) day at stock exchange.

It's even funny. Loss of small part of big nothing. :)

So, it's not as terrifying prediction as you might think.
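
For scale, a quick back-of-the-envelope sketch in Python. The ~$200 trillion figure is the commenter's own estimate, and the ~$50 trillion equity-market size and 2% daily swing used for comparison below are illustrative assumptions, not sourced statistics:

    # Back-of-envelope: how big is a $1T loss against the commenter's
    # ~$200T estimate of "virtual" wealth? Both figures are rough claims
    # from the comment above, not sourced data.
    loss = 1e12               # $1 trillion
    virtual_wealth = 200e12   # commenter's ~$200 trillion estimate

    print(f"{loss / virtual_wealth:.1%} of the estimated total")
    # -> 0.5% of the estimated total

    # For comparison: a 2% down day on an assumed ~$50T equity market
    # would erase about the same amount on its own.
    print(f"${0.02 * 50e12 / 1e12:.1f}T wiped out in one trading day")
    # -> $1.0T wiped out in one trading day

On those (rough) numbers, the headline loss is about half a percent of the estimated total, which is the commenter's point.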
