• leadore@lemmy.world · 2 months ago

    She has since jumped ship to become a director at another trendy company, PsyMed Ventures, which Newsweek described as a VC fund investing in mental and brain health. Many of the companies PsyMed invests in feature AI tools — which Ner says she still uses, albeit with a newfound sense of respect.

    She hasn’t learned a thing. She should have learned she’s susceptible and needs to stay as far away from the shit as possible.

      • hendrik@palaver.p3x.de · 2 months ago

        I mean, I know a fair share of people who did manage to quit smoking, or cut down on weed or alcohol, especially after things went sideways for them… So both are possible, and being demonstrated by people every day. I mean, you’re certainly right, it ain’t easy to do some self-reflection and stay away from addictive things. It requires motivation, mental strength and a good amount of effort. Just saying… because it’s good not to portray it as if resignation and drugs were people’s only options…

  • wavebeam@lemmy.world · 2 months ago

    I have significant problems with AI, particularly around its reckless deployment in tasks where it is simply incapable of providing real value (most of them), but I struggle to see these kinds of articles as anything but the journalism version of the same lazy application.

    Blaming AI for a mental health issue is like blaming alcohol for making someone an asshole. They were an asshole before they got drunk, it just became more obvious while drunk. Same thing here, AI is not causing psychosis, it’s just revealing it in a place that we’re not used to seeing stuff like this come from: a computer.

    This article likely wasn’t written with AI, but making AI the subject matter and tying this person’s crisis to its use seems lazy at best, negligent at worst.

    • _g_be@lemmy.world · 2 months ago

      Yeah, AI isn’t causing psychosis; it’s amplifying and enabling people who might already have problems.

      What is your argument here? That since AI is not the direct cause, these articles are pointless? I think we’d want to know if, to use your example, a commonly available thing like alcohol were giving people psychosis.

      • wavebeam@lemmy.world · 2 months ago

        I think my concern is with it being called “AI psychosis.”

        It doesn’t seem like an effective call to action for treating the individual if we simply blame the AI for not being “safe enough” or something.

        • queermunist she/her@lemmy.ml · 2 months ago

          If we bring this back to alcohol, the alcohol absolutely is to blame for worsening symptoms. There’s even the term “alcohol-induced psychosis” (or “alcohol-related psychosis”) to describe the effect: without the alcohol they are fine; with it, they enter psychosis.

          If someone is symptom-free without AI and experiences symptoms with AI, then calling it “AI psychosis” would be reasonable.