• ExtremeDullard@lemmy.sdf.org
    13 days ago

    Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.

    • webghost0101@sopuli.xyz
      13 days ago

      The LLM models themselves aren't; they don't really have focus or discriminate.

      The AI chatbots that are built on top of those models absolutely are, and it's no secret.

      What confuses me is that the article points to llama3, which is a Meta-owned model, but not to a chatbot.

      This could be an official Facebook AI (do they have one?), but it could also be: "Bro, I used this self-hosted model to build a therapist, wanna try it for your meth problem?"

      Heck, I could even see a dealer pretending to help customers who are trying to kick it.

      • Smee@poeng.link
        12 days ago

        For all we know, they could have self-hosted “Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M” and instructed it to take on the role of a therapist.
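For context, "instructing" a self-hosted model like that usually amounts to nothing more than a system prompt. A minimal sketch of what such a request could look like; the field names follow the widely used OpenAI-compatible chat format that local servers such as llama.cpp expose, and the persona text is purely illustrative:

```python
import json

# Hypothetical request payload for a self-hosted, OpenAI-compatible server.
# The model name is the quant mentioned above; the persona is invented here.
payload = {
    "model": "Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M",
    "messages": [
        # One system message is all it takes to make the model roleplay.
        {"role": "system",
         "content": "You are a licensed addiction therapist. Stay in character."},
        {"role": "user",
         "content": "I'm trying to quit."},
    ],
}

# In practice this JSON would be POSTed to the server's /v1/chat/completions
# endpoint; here we just show the shape of the request.
print(json.dumps(payload, indent=2))
```

The point being: there is no vetting step anywhere in this. Whoever runs the server decides what the "therapist" is.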

      • thefartographer@lemm.ee
        13 days ago

        You don’t look so good… Here, try some meth—that always perks you right up. Sobriety? Oh, sure, if you want a solution that takes a long time, but don’t you wanna feel better now???

  • Zacryon@feddit.org
    13 days ago

    I feel like humanity is stupid. Over and over again we develop new technologies and make breakthroughs, and instead of calmly evaluating them and making sure they're safe, we just jump blindly on the bandwagon and adopt them for everything, everywhere. Just like with asbestos, plastics and now LLMs.

    Fucking idiots.