• gedaliyah@lemmy.world
    5 hours ago

    Just to be clear, companies know that LLMs are categorically bad at giving life advice/emotional guidance. They also know that personal decision-making is the most common use of the software. They could easily put guardrails in place to prevent it from doing that.

    They will never do that.

    This is by design. They want people to develop pseudo-emotional bonds with the software, and to trust its judgment in matters of life guidance. In the next year or so, some LLM projects will become profitable for the first time as advertisers flock to the platforms. Injecting ads into conversations with a trusted confidant is the goal. Influencing human behaviour is the goal.

    By 2028, we will be reading about “ChatGPT told teen to drink Pepsi until she went into a sugar coma.”