Star rating systems don’t accurately convey opinions. The majority of reviews will be either 5* or 1*, with only a few wannabe critics voting in between, each applying their own arbitrary scale.
If Amazon is going to change things, why not adopt something more meaningful? Simple up/down votes for things that actually matter:
Was this product as described: 👍/👎
Are you satisfied with the quality: 👍/👎
Are you satisfied with the value for money: 👍/👎
Then a few optional questions for things that aren’t about the product itself, such as postage/packaging.
Star ratings are very broken, because everyone seems to think of the rating differently. IMO the criteria should be like this:
⭐️ ⭐️ ⭐️ ⭐️ ⭐️ = The best thing ever.
⭐️ ⭐️ ⭐️ ⭐️⚫️ = Above average.
⭐️ ⭐️ ⭐️ ⚫️ ⚫️ = It’s ok. Gets the job done. Not great, not terrible. The usual. Nothing special. Totally average.
⭐️ ⭐️ ⚫️ ⚫️ ⚫️ = Below average. I’m disappointed.
⭐️ ⚫️ ⚫️ ⚫️ ⚫️ = Worst thing ever. Crime against humanity. Ban this product and burn the remaining stock immediately.
However, in reality people tend to use it like this.
⭐️ ⭐️ ⭐️ ⭐️ ⭐️ = It’s ok. I don’t have anything to complain about. Could be ok, could be great or anything in between.
⭐️ ⚫️ ⚫️ ⚫️ ⚫️ = I’m not happy. Minor complaints, big complaints, and anything in between.
When it comes to book reviews, the five-star reviews tend to be useless. Especially with self-help books, it seems like those reviews were written by people who are completely incapable of noticing any flaws in the book. I’m inclined to think those people shouldn’t review a book at all if they can’t think about it critically. On Amazon there are always lots and lots of fake reviews produced in a click farm, but elsewhere you’ll also find genuinely incompetent reviewers.
My favorite is the one star review when it’s a shipping issue. “Look at this, the box is ripped!”
LOL, if only you could rate the shipping company… those reviews could be fun to read.
Even better: the one-star review on the pre-order page complaining that it’s not out yet!
Five stars also means “this was just delivered to me and I haven’t even used it yet”.
Oh, totally forgot that one. The box exists → instant 5 stars.
I think the problem is partly the fault of companies that insist, at least where rated interactions with employees are concerned, that every interaction should be five stars. In a system where the stars all carry meaning like the scale above, that simply isn’t realistic. It gives people the general sense that rating anything you don’t completely dislike at less than 5 stars is a bad thing to do, because it risks hurting some employee somewhere who doesn’t deserve it.
Tbh, that different understanding doesn’t matter much if you have enough reviews since it averages out.
If you compare two products with one review each, then yes, it hugely matters whether the one reviewer considered 5 stars to mean “expectations fulfilled” or “the best thing that ever happened to me”.
But if you’ve got >1k reviews, both products will get roughly equal shares of both reviewer groups, and it will average out so that the products are comparable based on their stars.
That’s a big misunderstanding many people have in regards to reviews. Many people are also afraid that unfair reviewers will skew the score. But since these unfair reviewers are usually spread equally over all products, it averages out as soon as you have a couple hundred reviews.
And that’s also what the article criticises. How many reviews a product has matters much more than the exact value. It’s easy to get a straight 5-star rating with only a single review; it’s much harder with 10k reviews.
So the information value is: below ~100 reviews, the rating means little to nothing; above ~1,000 reviews it can usually be trusted.
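The averaging-out claim can be sanity-checked with a quick simulation. Everything here is made up for illustration: half the simulated reviewers rate literally, half give 5 stars to anything they don’t dislike, yet two identical products converge to the same average once the review count is large.

```python
import random

random.seed(42)

def review(true_quality):
    """Simulate one reviewer for a product of quality 0.0-1.0.
    Half rate literally, half rate generously."""
    if random.random() < 0.5:
        # Literal reviewer: maps quality linearly onto the 1-5 scale.
        return round(1 + 4 * true_quality)
    # Generous reviewer: 5 stars unless clearly unhappy.
    return 5 if true_quality > 0.4 else 1

def average_rating(true_quality, n_reviews):
    return sum(review(true_quality) for _ in range(n_reviews)) / n_reviews

# Two products of identical quality: with 1 review each their scores can
# differ wildly; with 1000 reviews each the averages converge.
for n in (1, 10, 1000):
    a = average_rating(0.7, n)
    b = average_rating(0.7, n)
    print(f"{n:>5} reviews: product A {a:.2f}, product B {b:.2f}")
```

With a single review, each product’s score is whichever reviewer type happened to show up; at a thousand reviews both mixes are nearly identical, so the averages match.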
What the star rating is really useful for, however, is filtering which reviews to read.
People don’t buy 4-star reviews, and angry people don’t give them. The reviews in the middle, 2-4 stars, are the ones worth reading.
Can confirm. Especially when reading book reviews, I always skip the 5-star reviews and focus on reading the others.
To be fair, I think it is reasonable to rate things you have no complaints about as high as possible. If I see a rating with three stars, I assume that it was okay with a few rough spots. I like the idea that all products start out as five stars unless there is something really wrong, and you start knocking points for problems.
Then how do you tell the difference between a good product and a great product? For example, imagine you’re comparing an electric screwdriver from Lidl (Parkside) with one from Bosch. Should they both get 5 stars because nothing is broken and they get the job done?
Lidl screwdriver = adequate for occasional DIY, 3 stars
Bosch screwdriver = can be used as a substitute chisel or crowbar every day for a lifetime, 5 stars
This is good. I would consider adding “Would you recommend to a friend” but that may be redundant given what you have already.
I also found that an odd question.
“You ordered a two-pack of Durex extra small and a packet of Maltesers. Would you recommend them to a friend?”
No. Because who the hell recommends stuff…? Unless it’s something truly unique, I’m not going to recommend it.
I get where you’re coming from, but my wife did really appreciate a recommendation for a feminine hygiene product (menstrual cup) from a family member. We rarely recommend products of any sort, much less inexpensive products, but my wife has already recommended that product multiple times.
That said, I think it’s a poor question. I wouldn’t recommend a feminine hygiene product to my single, male friends, nor would I recommend that point and click adventure game to coworkers who mostly play competitive FPSs. I’ll even recommend “bad” products if I think it’ll fit whatever niche my friend wants to fill (e.g. that crappy screwdriver may not be good for screws, but maybe it’s decent for an art project).
Maybe a better question is: would you buy this again if it was lost/stolen? As in, did this fill your need enough that you wouldn’t look at other products?
“Would you recommend this to a parallel universe instance of yourself in a similar situation, who is not yet decided on a purchase?”
Haha you’re right, it probably doesn’t really apply to most things. I do get asked what I recommend more than the average person because my friends and family know I research purchases too much. I hate buying junk.
Yeah, I was using the Microsoft 365 suite for work and it asked if I’d recommend it to a friend. No, it’d be a weird thing to bring up in conversation with a personal friend.
Ah yes, Amazon stars. Constantly gamed by companies.
Amazon thinks useless ratings are what makes Aliexpress successful.
This is the best summary I could come up with:
Amazon is testing a new way to show star ratings for products in search results that is more difficult to parse at a glance, as reported by Android Police.
I don’t like this change because it gets rid of the easily glanceable stat of the volume of ratings in favor of the percentage of how many are five stars.
A statement from Amazon didn’t confirm that it’s making the changes shown in Android Police, but it did leave open the possibility that it might.
“We are always innovating on behalf of customers to provide the best possible shopping experience,” Amazon spokesperson Maria Boschetti said in an email to The Verge.
(Note that Amazon’s star ratings aren’t a simple average but are calculated using “machine-learned models,” according to a support page.)
While writing this article, I’ve warmed up to my colleague Sean Hollister’s proposal that Amazon get rid of star ratings altogether.
I just noticed this.
As others have mentioned, the stars have been largely useless for a while now, so to be honest I’m not sure this has any impact. Even sites that try to give a rating adjusted for fake reviews aren’t helpful, because so many reviews are faked. The only helpful part is to read the negative reviews.
I imagine this star fiasco is something that’s easy for browser plugins to reverse.
I would love to see AI and Machine Learning used to filter out fake reviews. This would actually be useful.
Use LLMs and machine learning to detect the reviews created by LLMs and machine learning.
True, but that’s also a well-known machine learning technique called adversarial training, often used in Generative Adversarial Networks (GANs) or when teaching a model to play games like chess or Go.
They removed the number of ratings.
I know most of the high counts are bots, but a 5-star score means very different things depending on the count of ratings.
I find it a bit difficult to take this seriously when that page is literally 70% ads.
Tbh if I’m making a quick purchase that doesn’t matter much to me, I just get the one with the highest purchase number or whatever is cheapest. If I’m purchasing something pricey or that I care about… I’m looking for actual reviews elsewhere before buying anyways.
It’s still worth a glance at some reviews, as I’m finding the popular well-rated products sometimes have reviews for different products entirely. Like they took over a genuinely good listing and replaced the item with a different one.
Yup, and that’s essentially what I’m looking for. I read critical reviews, positive reviews that don’t seem like shilling, and maybe a few recent reviews.
It would be even better if Amazon got their act together and penalized companies that introduce significant changes to the same product listing. For example, Wi-Fi dongles sometimes change the Wi-Fi chip, which changes the performance and compatibility. I’ve read reviews on exactly that type of product and chosen a different one because the replacement was worse than the original.
I just don’t trust Amazon listings, and it has saved me enough times that taking the extra effort is worthwhile.
It’s why I think companies need to pick up the Valve model of reviews: splitting the overall score from a recent-review average. If something about the product changed for the worse, the recent average rates poorly even while the overall score stays high.
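A minimal sketch of that split (the function name and data here are hypothetical, not Valve’s or Amazon’s actual implementation): keep the all-time average, but also average only the reviews inside a recent window, so a swapped or degraded product surfaces quickly.

```python
from datetime import date, timedelta

def split_scores(reviews, recent_days=30, today=None):
    """reviews: list of (date, stars) tuples.
    Returns (overall_avg, recent_avg); recent_avg is None
    when no review falls inside the recent window."""
    today = today or date.today()
    cutoff = today - timedelta(days=recent_days)
    overall = [stars for _, stars in reviews]
    recent = [stars for d, stars in reviews if d >= cutoff]
    overall_avg = sum(overall) / len(overall)
    recent_avg = sum(recent) / len(recent) if recent else None
    return overall_avg, recent_avg

# Hypothetical listing: great for years, then the seller swapped the product.
reviews = [(date(2023, 1, 1), 5)] * 900 + [(date(2024, 6, 1), 2)] * 30
overall, recent = split_scores(reviews, today=date(2024, 6, 15))
print(f"overall {overall:.2f}, recent {recent:.2f}")
# overall stays high, recent tanks
```

The 900 old 5-star reviews keep the overall score near 5, while the recent window contains only the 2-star reviews of the replacement product.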
Absolutely. That simple change would do wonders.
This.
Sellers are able to entirely replace the product page with a completely different product. Give the reviews a once-over so you don’t end up with something that only looks good quality.
I tend to read the 1-star complaints, with their dates, instead of anything the 5-stars say, to see whether a product is actually any good or has recurring problems, particularly recently. Anyone can buy good reviews, but complaints are free.
ReviewMeta can also help filtering out fake reviews.
I like to focus on reviews with pictures.
Looking at the screenshot in the article, putting the numeric score next to the star doesn’t seem more difficult to read to me.
They removed how many ratings the product has received. A 4.5-star rating from 100,000 people means that a good few people had issues or didn’t like it.
A 4.5-star rating from 3 people makes you suspicious of the seller.
losing the view-at-a-glance quantity of reviews was really annoying when i had to shop for some things last week.
ended up chasing me to newegg for the first time in forever. was done in 5 minutes (including running to a different pc and booting it up, to complete an unexpected 2fa), and no iffy marketplace sellers getting in the way. got exactly what i was looking for from who i wanted to buy from (the site, not a third party). a split shipment all delivered in 2 days to the boonies, cheaper and faster than amazon.
the difference in experience sent me back to newegg–this time as the first choice, for a little part order i would have normally got from whatever highly-reviewed amazon marketplace seller was cheapest.
So 4.5 is bad in both of those scenarios? Then how does the number of people who rated it matter?
Personally I always look further than the average number anyway. As other commenters have said, none of this means anything if you don’t check negative reviews for recurring issues.
5.0 ⭐️ (100% 5 stars)
Could be:
5 ⭐️ (1 review)
Or
5 ⭐️ (12,345 reviews)
Knowing the volume provides extra insights. If nothing else, at least when glancing at the list of search results, knowing the volume can allow a slightly faster zero in on the items to dig deeper.
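One common way to fold volume into a single sortable number is a Bayesian average that shrinks low-volume listings toward a prior. To be clear, this is a generic ranking technique and not how Amazon computes its score (which, per the article, uses “machine-learned models”); the parameter values below are arbitrary.

```python
def bayesian_average(avg_stars, n_reviews, prior_mean=3.0, prior_weight=20):
    """Blend the raw average with a prior of `prior_weight` phantom
    3-star reviews. Low-volume listings move a lot toward the prior;
    high-volume listings barely move."""
    return (prior_mean * prior_weight + avg_stars * n_reviews) / (
        prior_weight + n_reviews
    )

print(bayesian_average(5.0, 1))      # one lone 5-star review: pulled down to ~3.1
print(bayesian_average(5.0, 12345))  # 12,345 reviews: stays essentially 5.0
```

Under this scheme the 1-review listing and the 12,345-review listing no longer look identical, which is exactly the distinction the plain average throws away.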
I agree, it’s just that the comment before read as “I don’t trust 5 stars whether it’s a lot of votes or very few votes”, so I wasn’t sure why it mattered how many votes it had.