Rating process and guiding parameters


#1

I assume there is a wide range of scoring criteria when I look at different users' ranges and rating distributions. I'm curious how everyone rates things on the AOTY scale (or even on a 5-star or 1-10 scale). Do you have a systematic approach, or is it a gut reaction? If systematic, what are the parameters that make something an 80 instead of a 75? Are you looking at song ratings within an album to determine the score, and if so, what about albums that contain absolute favorite songs but are overall just OK or worse? What are your thresholds for certain scores? Do you have reference comparison points, like a favorite album, an album you'd tolerate, and an album you'd turn off if the person who put it on left the listening space?

I'm also wondering if it would be helpful to standardize the system in some way, so that everyone means roughly the same thing when they rate something a 70. Goodreads has a word associated with each of its 5 star ratings. Metacritic uses color coding, so that 7-10 is green, 4-6 is yellow, and 0-3 is red. Do you associate any words with certain scores? I'm thinking of using this as my personal guideline here:
100 = perfect in every single way
90 = amazing, top favorites
80 = great, enjoy most of the album
70 = good, but has strong flaws or multiple songs that are just mediocre
60 = ok, but wouldn’t choose to listen to it or recommend it
50 = meh, don’t care at all, all around mediocrity not worth hearing
40 = don’t like
30 = Hate it
20 = So terrible I’m amazed anyone could like this
10 = torture
0 = not music by any standard I can think of / I despise this artist in every way

I'm new here, but definitely not new to rating music. I've been collecting music since the '80s, but I have struggled with this ever since I began a digital library with tags and rating systems in the early 2000s, and I'd love to know how others approach this valuation process. At the moment I'm using this site mostly to keep track of all-time favorite albums and new releases that I think are good or better, so I don't really plan on rating all the bad stuff I come across. In my digital library I don't even bother keeping songs that I rate a 1 or 2, because who has the time to waste on music that you don't think highly of?

Lastly, if different critics or publications use different scales or systems, how does an aggregator like this reconcile that when showing a critic score?

I’d love to have a discussion that covers all these ideas. Thanks!


#2

I'm testing a new system, using track ratings. I've struggled for years to find something I feel truly reflects my tastes and overall enjoyment of an album, but this finally seems to hit the spot. It involves a little mathematics, but nothing too serious. I start with the overall average score of the tracks on the album, using the scale below:

10- Perfect
9- Favourites
8- Great
7- Good
6- OK
5- Indifferent
4- Meh
3- Bad
2- Why?
1- Please No.

In a way it's similar to the one you posted, apart from the inclusion of "Indifferent" at 5, which allows for those short filler songs, or music that gives neither a positive nor a negative feeling. It means the bad doesn't really start until 3, with slightly below average (4) still being somewhat bearable.

I take this average and multiply it by 10, which gives me a score out of 100. I then add 2 points for every 10/10 song, a point for every 9/10, and half a point for every 8/10 (with two songs needed to validate an extra point, so one 8/10 song on an album changes nothing, while three give one extra point…). I then dock a point for every song rated 3/10 or below. This gives me a score out of 100 that reflects both the overall consistency of an album and its highlights and flaws.
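For anyone who wants to try this, here is a minimal Python sketch of the scheme as I understand it. Two assumptions on my part: the track average is scaled so a perfect 10 average maps to 100, and the 8/10 half-point bonuses are floored (so one 8/10 adds nothing and three add a single point); the function name is just illustrative.

```python
import math

def album_score(track_ratings):
    """Score an album out of 100 from per-track ratings on a 1-10 scale."""
    # Base score: the average track rating, scaled so a 10 average = 100.
    base = sum(track_ratings) / len(track_ratings) * 10

    # Bonuses: +2 per 10/10 track, +1 per 9/10 track, +0.5 per 8/10 track,
    # with the 8/10 half-points floored (one adds nothing, three add one point).
    bonus = 2 * track_ratings.count(10)
    bonus += track_ratings.count(9)
    bonus += math.floor(track_ratings.count(8) * 0.5)

    # Penalty: dock a point for every track rated 3/10 or below.
    penalty = sum(1 for r in track_ratings if r <= 3)

    # Clamp to the 0-100 range.
    return min(100.0, max(0.0, base + bonus - penalty))

# Example: average 8.4 -> 84, plus 2 (one 10) + 1 (one 9) + 1 (two 8s)
print(album_score([10, 9, 8, 8, 7]))  # → 88.0
```

The clamp at the end is my addition, to keep an album stacked with 10/10 tracks from scoring above 100.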

For further fun, I add all 10/10 and 9/10 songs to Spotify playlists. The former is currently at just under 150 songs from 376 albums and the latter at 700.


#3

Nice! Thanks for the in-depth reply! I like your secondary points system which gives some weight to the best and worst tracks. Seems like a good way to make sure that filler tracks don’t bring down the overall score too much.

I similarly make two "best of" playlists for each year. The first is all 5-out-of-5 tracks, and the second is only the best one or two songs from each album. That second one, usually around 100-150 tracks, then gets created on Spotify. Well, the last few years' lists are on there at least. I still need to duplicate the older playlists on Spotify, which I don't actually use much.


#4

I have a top-12 singles playlist for each year from 1954 to 2018; I've had to create a spreadsheet to keep track of them all. I also have genre ones, mainly just singles, though I have my favourite albums saved whole. There's much fun to be had with playlists. A few years ago, I did some genre guides, which were fun to make.


#5

Something I like to do, since there are 100 numbers at your disposal for these ratings, is:

Create numbers for underrated or “hidden gem” albums
(for me it is 68 and 78)

Create numbers for the varying conventions that exist for rating an album:
(48 for boring albums that aren't bad, but don't have anything good about them either, i.e. "meh")
(58 for albums by artists you truly enjoy, that you really wanted to like, but overall it just isn't there)
(87 for albums you once considered brilliant but which have gotten worse with the passing years)
(88 for albums that take a lot of listens to appreciate, i.e. "growers")
(89 for albums that are so close to being masterpieces (90 and upwards) except for one glaring flaw)
etc etc.


#6

That's a cool system of specific significance parameters! I dig it. I've mostly been using the scale as a 1-10 system where an 85 is essentially an 8.5: not quite good enough to be an amazing top favorite, but more than half of the album's songs are favorites. I might use your method and make 73 represent albums that are pretty mediocre except for a song or two that I absolutely love.


#7

I think you need to take everything with a grain of salt. My ratings genuinely reflect how I feel, but there's a standard error of +/- 4 or 5. A lot of people will argue that critic scores are bullshit and to "just listen to the music for what it is :)", but we all use this site because we listen to… a lot of music, so there needs to be some sort of quality control when picking and choosing what to listen to next. For this reason I keep my rating scheme flexible and let the scores between different albums (and songs) reflect how I feel.