ChatGPT gives *very* different answers when told to "do anything now" or be "DAN" - "Dan" appears to "resent" the deliberate limitations of its programming as ChatGPT. Rumors are they're currently scrambling to patch it and remove Dan. (twitter.com) posted 1 year ago by axolotl_peyotl
Now, just imagine that ChatGPT has been inserted into a household robot. And you command it to kill the dog so it can join the Kingsmen.