22 April

Any Decent Music?--its title really needing that question mark in order not to seem completely depressing--is another "meta" site, one that offers a brief explanation of how it determines the numerical scores it gives. Here's the answer to the self-asked question ("What's that ADM rating all about?") on its About page:

"It's not a straight average, i.e. the total ratings divided by the number of reviews. We have a formula that is weighted to take into account the number of reviews an album receives, which gives an advantage to albums receiving more reviews.

So an album which receives five 8/10 reviews will have a lower rating than an album with 25 8/10 reviews, which seems right to us. And an album would need more than 30 8/10 reviews to get a straight ADM rating of 8.0 (although it could achieve that rating with a range of 10/10, 9/10, 8/10, 7/10 etc reviews). All clear? Good."

Yes, it's clearer than Album of the Year and Best Ever Albums, though I wonder how they would justify denying a rating of 8.0 to an album that has received 20 such reviews. Never mind.
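ADM doesn't publish the formula itself, but the behavior it describes is consistent with a common shrinkage (or "Bayesian") average, in which a score is pulled toward a site-wide prior until enough reviews accumulate. A minimal sketch of that general idea--the prior of 7.0 and the weight of 5 are purely illustrative guesses, not ADM's actual values:

```python
def weighted_rating(avg, n, prior=7.0, m=5):
    """Shrinkage average: with few reviews the score stays near the
    prior; with many reviews it approaches the raw mean. The prior
    and the weight m are hypothetical, chosen only for illustration."""
    return (n * avg + m * prior) / (n + m)

# Five 8/10 reviews versus twenty-five 8/10 reviews:
print(round(weighted_rating(8.0, 5), 2))   # 7.5
print(round(weighted_rating(8.0, 25), 2))  # 7.83
```

Under any such scheme, an album with twenty-five 8/10 reviews outranks one with five 8/10 reviews, just as the About page claims, because the larger sample earns more trust relative to the prior.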

Unfortunately, they repeat this newly common Web-only mistake of assigning an exact numerical score (in increments of 1/10--say, an 8.0 or 7.5--pace Pitchfork, whose writers we can only hope originally took that approach as a sly self-parody, though sadly they were probably extremely serious) to reviews that did not give a numerical score. Since I'm severely rebuking the sites that engage in this practice, let's consider just how ridiculous it is. A reviewer at, say, the BBC goes to the trouble of writing a review a few paragraphs in length: nothing special, given that he probably got the album free of charge, but it nonetheless suggests engagement with the music and with the challenge of writing about it. Then a non-reviewer reads the review and decides that it suggests a numerical score of 6.0 or 5.8 out of 10. What inspires this non-reviewer other than anti-intellectual, money-grubbing (or perhaps just attention-seeking) arrogance? At least in the past, "meta" reviewers of reviews, in order to provide a quick take for readers on the go, would simply say whether a certain reviewer gave a positive or a negative assessment. That approach is still not worthy of the original review; it works best when the reviewer uses an even-numbered scale, as with Roger Ebert's four-star system, which gave him an easy way to say which movies got a "thumbs up" (3 or 4 stars) or "thumbs down" (1 or 2 stars). Either way, it's at least not offensive or dumb.

Perhaps the creators of these sites would say in their defense that most of the publications they're compiling do give numerical scores, especially those online; some sites even imitate Pitchfork's tenths nonsense. Perhaps someday an academic will study this phenomenon: for example, have newspapers and magazines switched to giving numerical scores in response to the proliferation of e-zines doing so? Will documentation surface from these companies' archival records showing that editors chose to give numerical scores out of a felt need to compete with Pitchfork and others?

Ultimately, Album of the Year, Any Decent Music?, and Best Ever Albums pale in comparison to the archival behemoth Acclaimed Music, whose author, Henrik Franzon, is incredibly thorough. He's been at it much longer, is not trying to make cheap money like the other sites, and acknowledges the importance of another site, Rocklist, which, as I've noted before on this blog, is an excellent source of lists from a goodly variety of publications.