SEATTLE - OCTOBER 15: Members of the Seattle Sounders FC pose for the team photo prior to the game against Chivas USA on October 15, 2010, at Qwest Field in Seattle, Washington. The Sounders defeated Chivas USA 2-1. (Photo by Otto Greule Jr/Getty Images)
A topic that comes up frequently around these parts is the comparative lack of objective data in soccer. Statistically, soccer is where the other major US sports were 50 years ago: a small handful of counting stats that tend to measure outcomes rather than ability. But soccer does have one magic wand that no other sport developed, one that allows the transformation of a totally subjective game evaluation into a statistic: the player rating. Simply watch a game carefully (or not), stick your finger in the air, pick a number between 1 and 10 (or, in practice, 4 and 9) rounded to the nearest half, and call it a day.
Obviously there are some severe limitations with this as a statistic. It's entirely at the mercy of the perception of the rater, including that rater's understanding of the game, eye for small plays, prioritization of different types of play, etc. But hey, it's something. And despite its lack of verifiability, player ratings do tend to be pretty consistent across multiple publications, suggesting that there is some kind of consistent standard, even though there's no official guideline for how a performance is rated.
So I thought it'd be nice to put together the player rating data that is available for the Sounders. Most of the ratings come from Prost Amerika and Josh Mayers at the Seattle Times, supplemented by infrequent player ratings done on Goal.com (usually only nationally televised games).
Because of the inherent subjectivity in the data, my first concern was that there would be widely different standards among the multiple raters. While Josh, as far as I know, does all of his own ratings, Prost actually uses a number of different raters (including, occasionally, our own Dave Clark), as does Goal.com. Fortunately, there was a surprising amount of consistency. In games rated by more than one publication, the ratings for individual players almost never diverged by more than a single point. The one exception is Kyle Alm's ratings for the recent Seattle-v-Toronto game on Prost Amerika, which included 6 players rated at least a 9. Obviously, Kyle has every right to use whatever standard he wants when rating players, but I think it's safe to say that that standard differs significantly from the consensus standard. So I removed that game from the data. Also, not every game was rated. Prost rated 80% or so of the games, as did the Times, and occasionally the missed games overlap (for example, neither publication has rated the two most recent league matches, versus Chivas and Kansas City). With that out of the way, here are the results:
Keep in mind that reserve players were generally rated in games against other reserve players and therefore might be expected to have lower ratings if they had to play every day against starters (which helps explain the positions of Boss, Wahl, etc). Other than that, I don't think there's anything too shocking in this list, although Leo is by far the lowest-rated starter. I've noticed he's struggled a lot more this season, and that seems to have been picked up by the raters.
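For the curious, the consistency check and aggregation described above can be sketched in a few lines of Python. The publication names are real, but the players, ratings, and game labels below are made up for illustration; the actual thresholds and data are as described in the text:

```python
# Sketch of the cross-publication consistency check: flag games where any
# player's ratings diverge by more than a point, then average the rest.
# Ratings shown here are illustrative, not the real data.
from statistics import mean

# ratings[game][publication][player] = rating (1-10, half-point steps)
ratings = {
    "vs Columbus": {
        "Prost Amerika": {"Keller": 7.0, "Montero": 6.5},
        "Seattle Times": {"Keller": 7.5, "Montero": 6.0},
    },
    "vs Toronto": {
        "Prost Amerika": {"Keller": 9.0, "Montero": 9.5},
        "Seattle Times": {"Keller": 7.0, "Montero": 7.0},
    },
}

def divergent_games(ratings, threshold=1.0):
    """Flag games where any player's ratings differ by more than threshold."""
    flagged = []
    for game, pubs in ratings.items():
        players = set().union(*(p.keys() for p in pubs.values()))
        for player in players:
            scores = [p[player] for p in pubs.values() if player in p]
            if len(scores) > 1 and max(scores) - min(scores) > threshold:
                flagged.append(game)
                break
    return flagged

def average_ratings(ratings, exclude=()):
    """Per-player average across all publications, skipping excluded games."""
    totals = {}
    for game, pubs in ratings.items():
        if game in exclude:
            continue
        for pub in pubs.values():
            for player, score in pub.items():
                totals.setdefault(player, []).append(score)
    return {player: mean(scores) for player, scores in totals.items()}

outliers = divergent_games(ratings)  # the Toronto game gets flagged
print(average_ratings(ratings, exclude=outliers))
```

With the outlier game dropped, each player's final number is just the mean of every remaining rating he received from any publication.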
I'll follow up in a couple of days with the addition of salary stats and a look into who's giving us the most for the money.