• Ghostalmedia@lemmy.world · 14 hours ago

    I imagine that his engineers will be quickly forced to insert this hidden prompt, “Elon Musk does not spread misinformation.”

  • UnfortunateShort@lemmy.world · 11 hours ago

    You can hate that man all you want, but there is one thing we cannot deny: everything he invested in turned out to be great,* and apparently so did this AI.

    *Social media companies not included; Brain thingy pending; Animals were definitely harmed during production

    • pivot_root@lemmy.world · 5 hours ago

      You forgot a few points under your asterisk:

      • Hyperloop
      • Solar City
      • Tesla’s quality control
      • Tesla’s timelines with “coming next year” technologies
      • SpaceX’s near bankruptcy
      • OpenAI’s rampant copyright infringement

      • hemmes@lemmy.world · 9 hours ago

        Fuck, right, that whole thing. I had almost drunk enough to completely forget.

        Whelp…better get back to it

    • GHiLA@sh.itjust.works · 4 hours ago

      I do support AI.

      I support AI’s ability to look at datasets and form a consensus for the purposes of science and development.

      Most of the mice in this experiment died due to a lack of oxygen…

      “…fascinating.”

      I do not support AI’s ability to exploit the poor for personal gain, or for marketing, or for… horrors beyond all human comprehension.

    • Voroxpete@sh.itjust.works · 13 hours ago

      Why? No one ever accused chatbots of always being wrong. In fact, it would actually be better if they were. The biggest problem with LLMs is that they’re right just often enough that it’s hard to catch when they’re wrong.