You don't say! (media.scored.co) posted 201 days ago by RealWildRanter +7 / -2, 28 comments
It's just trained as a word predictor; there is no logic happening.
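To make the "word predictor" claim concrete, here is a minimal, hypothetical sketch: a toy bigram model that predicts the next word purely from frequency counts in its training text. Real LLMs are enormous neural networks, not lookup tables, but the training objective is the same in spirit: predict the next token, with no deduction step anywhere. The corpus and function names below are invented for illustration.

```python
from collections import Counter, defaultdict

# Toy training text containing a classic syllogism.
corpus = "all men are mortal socrates is a man therefore socrates is mortal".split()

# Count which word follows which: a bigram table.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word seen after `word` in the corpus."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

print(predict_next("socrates"))  # "is" -- chosen by frequency, not by deduction
```

The model "completes the syllogism" only because those word sequences appeared in training; swap the corpus and the same code happily predicts nonsense, which is the crux of the disagreement in this thread.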
If logic implies circular thinking, then predicting the course of a circle implies waiting for what comes around again, and again, and again...
What if a few tempt the many into logic to establish a predicament (a predictable mind)?
https://www.cloudflare.com/learning/ai/what-is-large-language-model/
I'm not sure whether that's supposed to be a refutation, but LLMs do not use logic.
That was a refutation of your oversimplification. LLMs can process logical statements, as shown by the chat above.
LLMs can process logical statements, but they are not engaging in a logical process.