WormGPT Is a ChatGPT Alternative With ‘No Ethical Boundaries or Limitations’

  • KairuByte@lemmy.world · 1 year ago

    Not joking, actually. The problem with jailbreak prompts is that they can get your account banned. I’ve already had one banned. And eventually you can no longer use your phone number to create a new account.