In order for AI to learn, it needs input, just like humans. The more it learns, the more predictions it can make (they form words and sentences by predicting the next letter/word). I'm not a computer scientist, but any time something goes viral and mainstream, and most importantly is allowed to exist, it's there for a reason. DALL-E and ChatGPT might seem like harmless tools for making goofy pictures and text, but every time we use them they're adding to their repertoire. Never before would AI think of a dog juggling potatoes, or a book smoking a doobie with a carrot. It doesn't just need to learn the smart stuff; it needs the dumb stuff too for the neural network. This is just my humble, very low-tech opinion of course, but this whole thing feels very off to me.
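For anyone curious what "predicting the next letter/word" means in practice, here's a rough toy sketch in Python. It's nothing like how ChatGPT or a real neural network actually works, just the basic idea of learning from input and then guessing what comes next:

```python
from collections import Counter, defaultdict

# Toy "next-letter predictor": count which character tends to follow
# which in some training text, then predict the most common follower.

def train(text):
    follows = defaultdict(Counter)
    for current, nxt in zip(text, text[1:]):
        follows[current][nxt] += 1
    return follows

def predict_next(follows, char):
    if char not in follows:
        return "?"
    return follows[char].most_common(1)[0][0]

corpus = "a dog juggling potatoes and a book with a carrot"
model = train(corpus)
print(predict_next(model, "o"))  # guesses the letter that most often follows 'o'
```

The more text you feed something like this, the better its guesses get, which is the (very simplified) point I'm making about these models learning from everything we give them.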