You don't say! (media.scored.co) posted 200 days ago by RealWildRanter +7 / -2 28 comments
The chat only proves definitively that AI has a bias and that it lacks the ability to discuss abstract subjects involving logic, as it was trained on data from users who are unable to reach such levels of cognition.
It's just trained as a word predictor; there is no logic happening.
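The "word predictor" claim can be sketched with a toy example. This is a bigram counter, nowhere near how production LLMs actually work (they use neural networks over tokens), but the training objective is the same shape: given what came before, predict what comes next. The corpus and function names here are made up for illustration.

```python
from collections import Counter, defaultdict

# Tiny "training" text for the sketch.
corpus = "the model predicts the next word and the next word again".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word seen after `word`, or None if unseen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "next" follows "the" twice, "model" only once -> "next"
```

Nothing in this loop evaluates the truth of a statement; it only tracks frequencies. Whether scaling that idea up produces something deserving the word "logic" is exactly what this thread is arguing about.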
If logic implies circular thinking, then predicting the course of a circle implies waiting for what comes around again, and again, and again...
What if few tempt many into logic to establish a predicament (predictable mind)?
https://www.cloudflare.com/learning/ai/what-is-large-language-model/
I'm not sure whether that's supposed to be a refutation, but LLMs do not use logic.
That was a refutation of your oversimplification. LLMs can process logical statements, as proven by the chat above.