As part of its Scalable Video Technology (SVT) initiative, Intel had been developing SVT-HEVC, a BSD-licensed, high-performance H.265/HEVC video encoder optimized for Xeon Scalable and Xeon D processors. The company has now changed course, and the project has been officially discontinued.

  • demesisx@infosec.pub · 3 months ago

    Intel might not even be around in 10 years unless they can suddenly release revolutionary chips that are reliable and efficient yet capable enough to compete with ARM. If I were in charge over there, I’d switch course to RISC-V and skate to where the puck is headed. Perhaps I’m a fool for that idea, but the future certainly ain’t x86 …that’s for SURE.

    • BarbecueCowboy@lemmy.world · 3 months ago

      I do think they’re on a decline, but enterprise moves SLOW and that’s big money. ARM is going places, but the x86 market could almost just freeze entirely and still be worthwhile for legacy applications for a very long time.

      • BearOfaTime@lemm.ee · 3 months ago

        And from what I’ve read, ARM has a way to go to best x86 in all-out performance, which is the primary concern for servers. Reduced power consumption is nice, but we’re already maximizing clock-cycle usage (power utilization) with virtualization. If you have to install even 5% more servers to meet demand, there’s no value in it.

        • BrikoX@lemmy.zip (OP) · 2 months ago

          It’s not that black and white. In cloud computing, ARM already beats Intel and AMD in single-core workloads and is more price-competitive even with higher RAM requirements. It also beats Intel in multi-core workloads, but AMD is still far ahead there.

      • demesisx@infosec.pub · 2 months ago (edited)

        That’s fair. Please elaborate. I think others have made valid points refuting my hypothesis so please, if you will, pile on. I really don’t mind being wrong.

        The point someone else made about servers was an excellent one. That was a blind spot in my hypothesis.

        • Lumisal@lemmy.world · 2 months ago

          If I’m not mistaken, isn’t the N100 chip actually pretty good? It uses like 5 more watts than similar ARM equivalents while still having plenty of power.

          At least when it comes to mini PCs.

    • FizzyOrange@programming.dev · 2 months ago

      “I’d switch course to RISC-V”

      And give up their only advantage? That would be insane. RISC-V isn’t quite mature enough to replace x86 anyway.

      • demesisx@infosec.pub · 2 months ago

        My point was that they could perhaps start investing in the maturity of RISC-V (a pipe dream, I know). It’s not there yet, but that’s simply because there’s so much money to be made licensing closed architectures like x86 and ARM. If RISC-V reached parity with x86 or ARM, it could best them someday, IMO.