Nintendo Wii: Sold like gangbusters.
64-bit Processors: The computing standard.
Battlestar Galactica: Considered one of the greatest sci-fi series of all time.
Facebook: Continues to be the world’s leading social media platform, with literally BILLIONS of users.
High Definition: HD only got even more HD.
iPhone: Set the standard for smartphone form factor and function to this day, 16 years later.
Dang I had no idea .LGBT was a top level domain
It’s a mastodon instance. Calm down.
…so?
Sorry I’m confused at your comment there. Care to elaborate?
All I’m saying is I think it’s cool ‘lgbt’ can be used in this way
Sorry if I was unclear
My mistake. The internet has programmed me to take any reply in context of lgbt to be with hostile intent. ❤️
No worries!
Just to clarify their point, they weren’t pointing out the mastodon instance, they were pointing out the LGBT TLD (top level domain). It’s interesting, and not widely known, that anyone can make whateverwebsite.lgbt now. You could own the africangrey.lgbt domain for $11.99/yr if you wanted. Nothing to do with mastodon.
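If you want to verify it yourself, IANA publishes the authoritative list of delegated TLDs. A minimal Python sketch (stdlib only; the URL is IANA’s real published list, the rest is just illustrative):

```python
# Confirm that LGBT is a delegated top-level domain by checking
# IANA's published root-zone TLD list.
from urllib.request import urlopen

TLD_LIST_URL = "https://data.iana.org/TLD/tlds-alpha-by-domain.txt"

with urlopen(TLD_LIST_URL) as resp:
    lines = resp.read().decode("ascii").splitlines()

# The first line is a version/date comment; the rest are TLDs in uppercase.
tlds = {line.strip().upper() for line in lines if not line.startswith("#")}
print("LGBT" in tlds)  # True as of this writing
```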
To be fair, a lot of these are accurate, or at least were at the time.
-
Multi-GPU just never caught on. There’s a reason you don’t see even the most hardcore gaming machines running SLI today.
-
The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.
-
Spore is largely forgotten, despite the enormous hype it had before release. It’s kind of the Avatar of video games.
-
It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn’t notice or care: basically no apps average people cared about were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program (the arithmetic behind that 4GB ceiling is sketched after this list).
-
Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.
-
Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-Ray disc rather than download from iTunes (can you even still do that with modern shows?)
-
I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-Rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.
-
The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.
-
The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier… the iPhone family did make a huge impact in the long run, but it wasn’t until the 3GS that it was a true competitor to something like a Symbian device.
The only entry on this list that’s really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck’s guts and has proudly never had a Facebook account.
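As a footnote to the 64-bit entry above, the 4GB ceiling is just pointer arithmetic. A minimal sketch (plain Python; the 2 GiB per-process figure is the classic 32-bit Windows default split):

```python
# Why 32-bit software tops out around 4 GB: a 32-bit pointer can
# only name 2**32 distinct byte addresses.
GIB = 2**30

print(2**32 / GIB)   # 4.0 -> total 32-bit address space in GiB
# On 32-bit Windows, half of that was reserved for the kernel by
# default, leaving a single process roughly 2 GiB to work with.

print(2**64 / GIB)   # ~1.7e10 GiB -> the 64-bit ceiling is astronomically higher
```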
Multi GPU video cards (not multiple video cards) might be making a comeback.
Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple’s “UltraFusion” and AMD’s “Infinity Fabric” to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.
As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback… at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
Are you joking? I thought the Wii was a wild success, I remember it being very popular.
Not only was it popular, but it outsold the PS3 and Xbox 360. On top of that, Nintendo made a profit on every Wii sold, while Sony and Microsoft lost money on each console and made it up with software sales.
-
The Wii was extremely popular. For years, it outsold every other console by a wide margin.
Netbooks absolutely were overhyped, and the market for them died really quickly. They were barely usable, and by 2010 when tablets really started hitting the market, there wasn’t a space for them anymore.
HDTVs weren’t overhyped, they were just expensive, and in 2008 there wasn’t that much content to take advantage of them. I had a 32" 720p TV that I paid nearly $700 for in 2007. Now, you can get a 40-something-inch 4K TV for a little over $200, and there’s plenty of content to make it worthwhile (though the real-world benefit of 4K on such a small set is debatable).
The first iPhone was so incredibly polarizing at the time. The hype machine leading up to that announcement was unlike any other product launch I can recall, so it was never going to live up to that kind of hype. And while it was limited in features for its time, it was clear more was on the horizon. And given how it revolutionized not only the phone market but also the web as a whole, we know how it all ended up.
The Wii was overhyped though. Most players never bought any game other than Wii Sports. I had an unlocked Wii and played all the good titles, and there are no more than ~10 of them. Most Wii games (looking at you, NFS) felt like half-baked mobile ports.
And the Wii U sales showed that. Yeah, the Wii sold to tons of casuals, but hardly any of them upgraded, even though the Wii U was a much more capable system.
The most frequent question I hear to this day when talking with former Wii owners is “What’s the benefit of the Wii U and why would I need to upgrade?” That’s a question I have never heard in relation to any other game console. Or have you ever heard the sentence “What’s so special about a PS3 if I already have a PS2? Why would I need to upgrade?”
And this set up the Wii U to be such a huge commercial flop that Nintendo effectively cancelled their home console line.
I would say it was seriously overhyped, similarly to the Netbooks. It was a fad, it was cool, boatloads of non-techy-people bought them, and none of them bought the successor so it all died quickly.
For the rest I agree though.
God, GoldenEye online was so amazing though for a long time. Only reason I got a Wii.
The Wii had a ton of great games outside of the Nintendo-specific ones. The Conduit 1 and 2, GoldenEye, tons of fighting games, and it gave us No More Heroes. The Force Unleashed arguably had its best edition on the Wii (this is mostly subjective, but there’s a strong consensus that the Wii’s version held up). Its main appeal over other consoles, I think, was how diverse the games could be: silly games like Boom Blox and De Blob, and niche ones like Endless Ocean for all the marine biologist kids.
Granted, I grew up with some of these games, and I’m not trying to say that the Wii’s extensive library is all stellar. But there are many gems among it. The Wii’s popularity drew a lot of attention to games that would just be scrolled past as shovelware on other online stores (Xbox Live, mostly). On Xbox, few of these would have made it outside Xbox Live Arcade (or whatever it was called), but on the Wii they would be digital and sometimes even get physical editions. Also, because of how wide its demographic was, it had a few surprisingly decent Barbie-esque and horse care games. I mean, it had so many games made for it that it only stopped getting new releases in 2020.
The Wii U was an attempt to bridge the gap between the success of their portable line, the DS, and the Wii. Growing up, all any kid ever wanted was to get their consoles connected. But then when the Wii U finally came out and was marketed, its main selling point was that you could play your game on the tablet while someone else was using the family TV. I mean really, it was exactly what every 10-14 year old into Nintendo had been talking about, right up until Nintendo actually made it.
Part of it was marketing; I remember a lot of people being surprised that the gamepad wasn’t the thing being sold, but a whole console that came with it.
It’s crazy that it failed, honestly, but at the same time it’s totally understandable. You can’t try to be both a home console and a “portable” one when the portable part has to stay tethered to the console. It was the genetic blueprint for everything the Switch became.
They got Spore right.
The only entries on this list that look wrong to me are the iPhone, the Wii, and downloading movies from the internet. The rest are dead on. HD literally did exactly what they said: as soon as it got popular, the industry was already on to 2K, then 4K, then 8K.
Their 64-bit computing entry explicitly called out the need for more 64-bit apps. Dead on as well. When Apple went 64-bit only there was a huge uproar, even from me, because even in 2018 that was a huge problem.
On Facebook, they literally agree it’s fun and distracting, but argue it’s not revolutionary, so there’s no reason for the hype. It’s the same thing as with TikTok.
I’m really surprised with how accurate it is honestly.
I disagree. HD lasted a super long time. That there would be a new standard after HD was never in question. As far as standards go, it lasted a very long time and did about as well as any standard could.
64 bit was an absolute necessity. That it was a lot of work to switch to does not mean it was overhyped.
I don’t like Facebook, but that doesn’t mean its success can be ignored. It became the biggest social network and was regularly mentioned in the same breath as Google and Microsoft, so I can’t see how it was overhyped, however much I dislike them.
The point is you should read the article rather than going off of headlines. Each thing in the list states as much.
The HD paragraph literally states exactly what happened. As soon as HD made it to the mainstream (actual TVs, laptops, monitors), it was already outdated. They were saying not to overhype it because this would keep happening. And they were completely right.
64-bit they were complaining about being overhyped because it was. Until you could get almost any app in 64-bit, it was useless for all but the most tech-savvy.
No one is ignoring the success of Facebook. They’re saying that Facebook as a social network was overhyped. It wasn’t the first, and there was nothing remarkable about it. Just because Cavendish bananas are the most popular and most successful bananas of all time doesn’t mean other, very good, very tasty bananas didn’t exist before them. Cavendish bananas are just successful.
Nah, the iPhone is also overhyped and overpriced.
HD is still the standard for most? I don’t know of a single person who uses a 4K TV. 4K is still in the early-adopter phase.
To be fair, Spore was overhyped: it was fun enough, but not the total gamechanger that it was forecast to be. Will Wright had two amazing hits with SimCity and then The Sims, followed by a whole pile of very middle-of-the-road simulation games, so it wasn’t that hard to foresee.
And Eee PCs occupied the uncomfortable niche where they didn’t do a lot that your phone couldn’t, while being extremely limited compared to a £300 ‘proper’ cheapo laptop. That’s not really a business model.
So yeah, that’s two things anyone could have seen coming, versus eight where they’re so completely wrong they couldn’t have failed harder if they tried. They’d have done better calling this list ‘things which are not massively overhyped’.
Honestly feels like satire reading this today
They were right about Facebook.
Eee PC!!! I miss the age of netbooks - I had a similar one, the MSI Wind - my favorite computer ever :')
I had an HP Mini 311: Atom CPU, 11.6" screen, 3GB of RAM, wifi/bluetooth, and a dedicated NVIDIA GPU, which was impressive at the time in 2008, plus an HDMI port; it could play 1080p video using the GPU alone. I loved the form factor. Since then I favour laptops with 13" or 14" screens; I don’t want/need a 17.3" laptop with a DVD drive and all!
> HD only got even more HD.
That’s exactly what they were saying, no? “even more HD” = UHD.
> Downloading movies from the internet (is wildly overhyped)
lol what does that even mean
They’re probably referring to streaming. I don’t know when this article came out, but considering they’re talking about literally the first iPhone I guess we can assume it’s 2007 or 2008, and Netflix started streaming back in 2007.
Apart from that, perhaps they’re referring to when Amazon started offering movies to buy or rent online, but I don’t know when they started doing that.
[amazing, everything you said was wrong](https://youtu.be/2sRS1dwCotw)
I was going to say that I agree iPhones and smartphones are of no benefit and don’t do anything (I’m being sarcastic), but looking closer at the list, how did they get it all so very wrong?
Don’t need 64-bit to address more than 4GB? Every new computer should have 32GB, and 64GB is not unreasonable.
Don’t need full HD? How does 8K resolution sound with 16K being developed?
I question their basic knowledge of and experience with technological advancement: higher demands, more complicated workloads, and advances in security protections like 64-bit memory address randomization that can’t exist on 32-bit hardware.
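For what it’s worth, you can check your own machine’s word size from any Python prompt. A trivial stdlib sketch (purely illustrative):

```python
# Report the word size this interpreter was built for.
import struct
import sys

pointer_bytes = struct.calcsize("P")   # size of a C pointer: 8 on 64-bit, 4 on 32-bit
print(f"{pointer_bytes * 8}-bit build")

# The documented portable check for a 64-bit Python:
print(sys.maxsize > 2**32)             # True only on 64-bit builds
```

A 64-bit address space is also what gives address randomization enough entropy to matter; a 32-bit layout simply has very few places to hide things.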
Consumerism, it seems, is always overhyped.
While I don’t think the iPhone is overhyped, I’ve started fantasizing about going back to a dumb phone. The LG enV2 is still the best phone I ever owned.
- replaceable battery that lasted 3 days
- headphone jack
- physical keyboard
- fit in shirt pocket
- no bloatware
There’s definitely something to be said for classic simplicity!
You might want to take a look at what I use: https://github.com/Dakkaron/Fairberry
If I still needed to SSH into servers on my phone I’d be all over that.
Yeah, it’s really nice for working on a terminal. But it’s nice for writing text as well.
Since I love playing devil’s advocate, here are a couple of points in their defense:
Multi-GPU videocards: Pretty much dead, it’s just not efficient.
64-bit computing: At the time, this was indeed slightly overhyped, because while your OS might be 64-bit, most software was still 32-bit, games in particular. So games couldn’t really use more than 4 GB of memory, and that stayed the norm for years after this article (this was 2008; 64-bit Windows had been out for ages, and yet three years later the original Skyrim release was still 32-bit. Games shipping 64-bit binaries was a huge deal at the time). Now most software is 64-bit, and yes, NOW it’s standard.
High definition: Depends, did they mean HD or Full HD? The former certainly didn’t last long for most people; Full HD replaced it real quick and stayed around for a while. Of course, if they meant Full HD then hell no, they were hella wrong: it’s been mainstream for ages and is only now being replaced by 1440p and 4K UHD (the pixel math is sketched after this list).
iPhone: The FIRST one as a singular product really didn’t live up to the hype. It was missing features that old dumbphones had. Of course the overall concept very much did revolutionize the phone market.
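To put numbers on the HD entry above, the pixel math alone shows how far apart those tiers are. A quick sketch (the resolution figures are just the standard ones):

```python
# Pixel counts for the resolution tiers discussed above.
resolutions = {
    "HD (720p)":       (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "QHD (1440p)":     (2560, 1440),
    "4K UHD (2160p)":  (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:18} {w * h / 1e6:4.1f} megapixels")

# 720p -> 0.9 MP, 1080p -> 2.1 MP, 1440p -> 3.7 MP, 4K -> 8.3 MP:
# Full HD is 2.25x plain HD, and 4K is 4x Full HD.
```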
Well, to be fair, changes like switching to 64-bit are always very slow (especially if they’re not forced by completely blocking 32-bit). But I don’t think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games and apps we have now.
Well by 2008 we’d had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008 and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might think that it’s not worth the effort in the personal computer space.
I feel like it finally reached a turning point in 2009 and became useful in the early to mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption, and in the 2010s we started getting 64-bit software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).
It was different for Linux, and servers in particular, of course, where a lot of open source stuff had official 64-bit builds in the early 00s already (2003 for Apache, for example).