• 0 Posts
  • 103 Comments
Joined 7 months ago
Cake day: June 30th, 2025

  • Humans will anthropomorphize damn near anything. We’ll say shit like “hydrogen atoms want to be with oxygen so bad they get super excited and move around a lot when they get to bond”. I don’t think characterizing the language output of an LLM using terms that describe how people speak is a bad thing.

    “Hallucination,” on the other hand, doesn’t come close to distinguishing the “incorrect” bullshit that comes out of LLMs from the “correct” bullshit. Using “hallucination” to describe the output of deep neural networks kind of started with the early image generators. Everything they output was a hallucination, but eventually these networks got so believable that they could sometimes produce realistic, even factually accurate, content. So the people who wanted these neural nets to be AI started calling only the bad, unbelievable, and false outputs hallucinations. That doesn’t just anthropomorphize the model; it implies it actually does something like thinking and has a state of mind.











  • hobovision@mander.xyz to 196@lemmy.blahaj.zone · Holy rule!
    20 days ago

    If they couldn’t get the engines stable enough to fly, did they really make a moon rocket? They certainly built a really big rocket-shaped building filled with rocket fuel.

    It’s like saying SpaceX Starship is an orbital vehicle. Sure, it will be once it actually reaches orbit, but if SpaceX abandoned the project before achieving orbit, they couldn’t claim to have made the biggest orbital vehicle.