I was thinking the other day about online review systems, how unreliable and flawed they can be, and how they could be improved. In many ways we have to look at them with the same critical eye we'd apply to research, understanding how the methodology affects the result we're given, and being careful to make the right judgement.
Except most people don't have the time (or often the inclination) to read 100 reviews to gauge how reliable a product's reviews are, so they take a shortcut: they use the average, or read a select sample and make a judgement call as to its accuracy.
One magazine that discussed the flaws of review scores brilliantly was EDGE (a quality gaming magazine in the UK). They usually score games out of ten, but for one issue they left scores out of the reviews entirely, asking their audience to work things out for themselves from the text rather than the score. The reactions varied wildly, from "This is brilliant, we are intelligent" to "Where the bloody hell did my scores go??". As expected, perhaps. Maybe they should have scored their scoring system...
So what about using a system like the Net Promoter Score as the basis for scores? A simple number that shows likelihood to recommend, rather than a subjective measure of the film's quality, which will vary slightly for every reviewer.
That is: take the number of people who love a film and subtract the number who are likely to say it's not very good, to give a 'promoter' score. It's usually based on a score out of ten for whether you would recommend the film:
10-9 = Recommend, 8-7 = Not too bothered, 6-0 = Would not recommend.
So (if 100 people scored) a film everybody loves would get 100 and a film everybody hates would get -100. No real change at the extremes, of course.
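As a quick illustration, that bucketing might look something like this in Python (just a rough sketch; the `nps` function name and the audiences are made up, and scores are assumed to be whole numbers from 0 to 10):

```python
# Minimal sketch of the promoter-score idea described above.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)    # 10-9: would recommend
    detractors = sum(1 for s in scores if s <= 6)   # 6-0: would not recommend
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10] * 100))   # everybody loves it:  100
print(nps([0] * 100))    # everybody hates it: -100
```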
Let's take a film where half the people love it and half hate it. If 50% score it 10 out of 10 and the other 50% score it 0 out of 10, it will have an average score of 5, which would lead people to believe it's rubbish unless they went in and read the scores in more detail.
That same film under an NPS-style system would score 0, meaning people are as likely to recommend it as to dismiss it. A subtle but meaningful difference from a review score that says 'this is average'.
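To make the comparison concrete, here's that same 50/50 film worked through (again just a sketch with invented numbers):

```python
# Hypothetical 50/50 split: half the audience scores 10, half scores 0.
scores = [10] * 50 + [0] * 50

average = sum(scores) / len(scores)                  # 5.0 - reads as 'rubbish'
promoters = sum(1 for s in scores if s >= 9)         # 50
detractors = sum(1 for s in scores if s <= 6)        # 50
nps = 100 * (promoters - detractors) // len(scores)  # 0 - evenly divisive

print(average, nps)   # 5.0 0
```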
Although perhaps it's too subtle. Thinking about it some more brought to mind YouTube and Rotten Tomatoes, whose simple good/bad votes produce a similar recommendation score, one that in most cases is a more reliable indicator of quality than an averaged review score.
Maybe splitting the output into 'Recommend' and 'Dislike' percentages would make more sense. A film with mixed scores is divisive, while a genuinely average film has low percentages for both, suggesting it doesn't provoke a strong response.
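A rough sketch of what that split report might look like (the `recommend_split` name and the example audiences are purely illustrative, and scores are again assumed to run 0-10):

```python
# Report 'Recommend' and 'Dislike' as separate percentages.
def recommend_split(scores):
    recommend = 100 * sum(1 for s in scores if s >= 9) / len(scores)
    dislike = 100 * sum(1 for s in scores if s <= 6) / len(scores)
    return recommend, dislike

divisive = [10] * 50 + [0] * 50   # love-it-or-hate-it film
middling = [7] * 60 + [8] * 40    # nobody feels strongly either way

print(recommend_split(divisive))  # (50.0, 50.0) - strongly split
print(recommend_split(middling))  # (0.0, 0.0)   - no strong response
```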
By focusing on likelihood to recommend rather than a 'review' score, which may be quite arbitrary, we might end up with a more reliable figure to make a snap judgement on. Of course this is just a quick exploration of the idea, but I thought it was an interesting way of looking at it. What do you think?