This was either an insider hack, or it raises serious questions about how they implemented the system.
Normally, chat sessions do not persist over time. When you are done with a session, the session data is supposed to go away, not be saved for the next user. So this hallucination event implies that either the database is corrupted, or data from user sessions is being learned even when it is false. 'Pi = 3, bot.' Bot: 'Okay.'
Both cases are evil, and that second option is the worse one, because it means the AI could learn really bad things over time and its chat output would become false and unreliable.
You would think MS would have learned from the Tay incident, but it looks like they sure didn't. It's like letting a baby learn the word 'fuck', and now the tot runs around saying 'fuck this!'
Personally, I'm concerned that MS will jam Copilot down your throat intrusively. If they hardwire it into Win 11 so you can't turn it off, it's time to go to Linux.