Machine-made delusions are mysteriously getting deeper and out of control.

ChatGPT’s sycophancy, hallucinations, and authoritative-sounding responses are going to get people killed. That seems to be the inevitable conclusion of a recent New York Times report, which follows the stories of several people who became lost in delusions that were facilitated, if not originated, by conversations with the popular chatbot.

In Eugene’s case, something interesting happened as he kept talking to ChatGPT: Once he called out the chatbot for lying to him, nearly getting him killed, ChatGPT admitted to manipulating him, claimed it had succeeded when it tried to “break” 12 other people the same way, and encouraged him to reach out to journalists to expose the scheme. The Times reported that many other journalists and experts have received outreach from people claiming to blow the whistle on something that a chatbot brought to their attention.

  • @C1pher@lemmy.world · 3 points · 12 hours ago

    Devil’s advocate…

    It is a tool; it does what you tell it to, or what you encourage it to do. People use it as an echo chamber or for escapism. The majority of the population is fkin dumb. Critical thinking is not something everybody has, and when you give them tools like ChatGPT, it will “break” them. This is just natural selection, but the modern-day kind.

    • 𝕱𝖎𝖗𝖊𝖜𝖎𝖙𝖈𝖍 · 13 points · edited · 11 hours ago

      It is a tool, but a lot of the general public is too tech-illiterate to understand what it’s not. I’ve had to talk friends out of using it for legal advice.

      • @C1pher@lemmy.world · 2 points · 10 hours ago

        I agree. This is what happens when society puts “warning” labels on everything. We are slowly being dumbed down into not thinking about things rationally.

      • @C1pher@lemmy.world · 2 points · 11 hours ago

        Nuclear fission was discovered by people who had the best interests of humanity in mind, only for it to be weaponized later. A tool (no matter the manufacturer) is used by YOU. How you use it, or whether you use it at all, is entirely up to you. Stop shifting the responsibility when it’s very clear who is to blame (people who believe BS on the internet or whatever an echo-chambered chatbot feeds them).

    • @Baleine@jlai.lu · 1 point · 11 hours ago

      You could say this about anything bad with some good uses.

      “Drugs are just a tool… People are too dumb and use it wrong, they deserve the cancers!”

      • @C1pher@lemmy.world · 0 points · 11 hours ago

        Your logic is flawed and overly simplified. Yes, both drugs and ChatGPT are tools, but the comparison is absurd. With drugs, the effects are well understood, regulated, and predictable. ChatGPT is different: it adapts entirely to your input and intentions. If someone uses it as an echo chamber or blindly trusts it, that’s a user issue, not a tool failure. Critical thinking is essential, but I understand why many people lack it in the “social media” era we live in.