It's a statistical language model used to generate fluff, and it only becomes meaningfully useful via an insane amount of reinforced, human-supervised feedback (aka "learning", which it doesn't actually do; it only calculates probabilities over token sequences on directed graphs).
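A minimal sketch of that core mechanism, stripped of all the scale: at each step the model just samples the next token from a learned probability distribution conditioned on the context. Every name and number below is made up for illustration; the tiny hand-written table stands in for billions of learned parameters.

```python
import random

# Hypothetical next-token distributions, keyed by a two-token context.
# A real model derives these probabilities from learned weights.
NEXT_TOKEN_PROBS = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "slept": 0.1},
    ("cat", "sat"): {"on": 0.8, "down": 0.2},
}

def sample_next(context, rng=random.random):
    """Pick the next token by weighted chance, not by understanding."""
    dist = NEXT_TOKEN_PROBS[context]
    r = rng()
    cumulative = 0.0
    for token, p in dist.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # fall through to the last token on rounding error

print(sample_next(("the", "cat")))
```

No meaning or intent anywhere in there, only weighted dice; whatever sense the output makes is supplied by the reader.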
It has no understanding of humour, meaning, or intent, whatever stupid wiki articles may claim.
Readers give its text meaning by trying to interpret it favourably.
It's a good reminder of how little actual factual, accurate information is conveyed even in most human utterances.
So, perhaps it is doing a good job of copying human drivel?
What did you expect?
AI it is not. That's for sure.
I miss Tay.