• simple@lemm.ee

    Last time they hyped up their AI, it was really inferior to ChatGPT. Lots of companies claim to be on par with GPT-4 and still end up with terrible reasoning, so I doubt it.

    • Dojan@lemmy.world

      I think there’s room for more models anyway. Sometimes accuracy isn’t as important as running costs and such. When proofing translations, I’ve found that GPT-3.5 works better for me than GPT-4.

      • ddtfrog@lemm.ee

        OP isn’t saying there isn’t room. He’s simply pointing out that non-Chinese companies also claim to rival GPT-4, and those claims fall through.

        Knowing this is heavily China-backed, there is going to be a ton of misinformation around it, not to mention lies about how smart it is.

        • orclev@lemmy.world

          So, over/under on how long it takes someone to convince it to talk about Tiananmen Square before the CCP nukes it from orbit? Bonus points if you can get it to say Taiwan is an independent country.

          • GenderNeutralBro@lemmy.sdf.org

            I’d be surprised if this were possible without explicitly describing those topics in your prompt. I imagine they carefully scrubbed its training data of non-approved subject matter.

            It’ll be interesting to see what happens when China decides they need to ban something new, though, like Winnie the Pooh. You can’t easily remove training data after the fact, so in that case they’d probably band-aid it with some kind of output filter, and you could likely work around that.