Physics sure made accelerating progress before stagnating in the 20th century. Same thing with chemistry. And biology is still accelerating in my opinion. AI is still young and likely has a lot more acceleration left in it. Besides, it's already close to the singularity in that we now have AIs that can do general tasks with some level of competency. All that's left is to get the competency consistently above average human levels. I don't think that's far away at all. Even if it is, we still have to deal with it happening whenever it does.
The AI industry has already proven itself to be full of smoke and mirrors. It's at Elon Musk levels of BS, so I'm putting it in the "just another 10 years, bro" category.
So far the experiments where they used coding agents to try to build something serious have been predictably awful. They had agents try to autonomously build a web browser, and they just ripped off code from a known open source browser and made a mess of it. They regurgitate, they can follow old instructions (imperfectly), they can summarize consensus views, but they don't think.
The singularity, if it were ever to exist (which is not guaranteed), would be able to think and reason.
EDIT: And speaking of Elon Musk, he's at the forefront of pushing the autonomy hype. Do I even need to list his many lies? Why should I trust these charlatans when they say "we're so scared of the AI we've created!"?
Every year they claim to be freaked out over the latest tool. A couple of years ago it was "Devin, the autonomous software engineer". What happened there? Right, it was a bunch of hype that didn't work.
But every year I'm expected to accept that "this time is different, we're telling the truth now."
Physics stagnated because of Bohr, Heisenberg, and Feynman.