A useful way of thinking about it is that "training a model" means "building a statistic." A system like ChatGPT is, in essence, an enormous statistical summary of which words followed which words in the 35 TB of text it was trained on.
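
To make the "which words followed which words" framing concrete, here is a minimal toy sketch in Python (the corpus, names, and bigram sampling here are my own illustration; a real model like ChatGPT learns a compressed neural approximation of far richer statistics, not a literal lookup table):

```python
from collections import Counter, defaultdict
import random

# Count, for every word in a tiny corpus, how often each other word followed it.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

follow_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follow_counts[current][nxt] += 1

def generate(start, length=8):
    """Emit up to `length` more words, sampling each follower in proportion to its count."""
    word, out = start, [start]
    for _ in range(length):
        followers = follow_counts.get(word)
        if not followers:  # no recorded follower: stop generating
            break
        choices, weights = zip(*followers.items())
        word = random.choices(choices, weights=weights)[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the rug"
```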

It doesn't plan or learn or understand, and when it appears to do so in response to what people write, that appearance is an illusion produced by loosely similar sequences of words in the training texts.

GPT will fundamentally never scale past that. But that doesn't mean other algorithms won't.

1 year ago
1 score