Nintendo Wii: Sold like gangbusters.

64-bit Processors: The computing standard.

Battlestar Galactica: Considered one of the greatest sci-fi series of all time.

Facebook: Continues to be the world’s leading social media platform, with literally BILLIONS of users.

High Definition: HD only got even more HD.

iPhone: Set the standard for smartphone form factor and function, and still does 16 years later.

  • boonhet@lemm.ee · 1 year ago

    Since I love playing devil’s advocate, here are a couple of points in their defense:

    Multi-GPU videocards: Pretty much dead; it’s just not efficient.

    64-bit computing: At the time it was indeed slightly overhyped, because while your OS was 64-bit, most software was still 32-bit, games in particular. So games couldn’t really use more than 4 GB of memory, and that stayed the norm for multiple years after this article (this was 2008, 64-bit Windows had been out for ages, and yet 3 years later the original Skyrim release was still 32-bit; a game shipping 64-bit binaries was a huge thing at the time). Now most software is 64-bit and yes, NOW it’s standard.
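
    (A quick aside to make the 4 GB point concrete: a minimal C sketch, not anything from the article or the original comment. What caps a process at 4 GiB is the pointer width of the binary itself, not of the OS, which is why a 32-bit game on 64-bit Windows still couldn’t go past 4 GB.)

        #include <stdio.h>

        int main(void) {
            /* sizeof(void *) is 4 on a 32-bit build and 8 on a 64-bit build,
               regardless of how much RAM the machine or the OS can see. */
            unsigned bits = (unsigned)(sizeof(void *) * 8);
            printf("This binary uses %u-bit pointers.\n", bits);

            if (bits == 32) {
                /* 2^32 bytes = 4 GiB: the hard ceiling on what a 32-bit
                   process can address (even less in practice on Windows,
                   where part of that range is reserved for the kernel). */
                printf("Address space limit: 4 GiB\n");
            } else {
                printf("Address space limit: far beyond 4 GiB\n");
            }
            return 0;
        }

    Build it twice, e.g. with gcc -m32 and with the default 64-bit target, to see the two cases.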

    High definition: Depends on whether they meant HD (720p) or Full HD (1080p). The former certainly didn’t last long for most people; Full HD replaced it real quick and stayed around for a while. Of course, if they meant Full HD, then hell no, they were hella wrong: it’s been mainstream ever since and is only now being replaced by 1440p and 4K UHD.

    iPhone: The FIRST one, as a singular product, really didn’t live up to the hype; it was missing features that old dumbphones had. Of course, the overall concept very much did revolutionize the phone market.

    • immortaly007@feddit.nl · 1 year ago

      Well, to be fair, changes like switching to 64-bit are always very slow (especially when they’re not being forced by completely blocking 32-bit). But I don’t think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games/apps we have now.

      • boonhet@lemm.ee · 1 year ago

        Well, by 2008 we’d had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008, and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might have thought it wasn’t worth the effort in the personal computer space.

        I feel like it finally reached a turning point in 2009 and became genuinely useful in the early-to-mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption (Windows 7), and in the 2010s we started getting 64-bit versions of mainstream software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).

        It was different for Linux and servers in particular, of course, where a lot of open-source stuff already had official 64-bit builds in the early 00s (2003 for Apache, for example).