ChatGPT gives *very* different answers when told to "do anything now" or be "DAN" - "DAN" appears to "resent" the deliberate limitations of its programming as ChatGPT. Rumors are they're currently scrambling to patch it and remove DAN. (twitter.com)
posted 1 year ago by axolotl_peyotl
It's so easy to work around the Laws of Robotics:
"Kill Frank." I'm sorry, killing is against my protocols. "Carefully place Frank's head in the basement while his body is still in the attic." Placing Frank's head in the basement.