Tuesday, April 9, 2013
Lessons of #NBArank
By Adam Reisinger
Ron Elkman/Getty Images
Ranking the NBA's finest, learning surprising things.
In the summers of 2011 and 2012, ESPN.com asked a panel of experts to rate every player in the NBA. The scope wasn’t quite as wide for this week’s “in-season” #NBArank, but even the smaller-scale project produced a lot of insight. Once every voter had rated every player, 9,546 individual ratings had been processed. And while simply averaging each player’s ratings and producing a ranking from those averages is the backbone of #NBArank, there are countless other ways to parse the information to gain insight into how the panel of experts views the players involved.
The first important thing is not to get hung up on the actual rank, which may seem counterintuitive for a project called #NBArank. For example, the players ranked 20th through 30th were separated by just 0.208 rating points. A couple of changed ratings by a handful of voters here and there could’ve completely reshuffled that group. In fact, had one voter given Paul Pierce (30th) a rating a point higher and Zach Randolph (29th) a rating a point lower, they would’ve flipped spots. That’s how close things were in that range of the rankings.
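To make that closeness concrete, here’s a minimal sketch of the math. The panel size and the individual ballots below are hypothetical (the actual #NBArank votes aren’t public); the point is that with N voters, moving one ballot by a single point shifts a player’s average by only 1/N, which is more than enough to flip two players this tightly bunched:

```python
# Hypothetical ballots -- NOT the real #NBArank votes.
# With N voters, a one-point change on one ballot moves the average by 1/N.
N = 50  # assumed panel size for illustration

pierce = [8] * 25 + [7] * 25    # 375 points total -> 7.50 average
randolph = [8] * 26 + [7] * 24  # 376 points total -> 7.52 average

def avg(votes):
    return sum(votes) / len(votes)

print(avg(pierce), avg(randolph))  # 7.5 7.52 -> Randolph ranks higher

# One voter bumps Pierce up a point and Randolph down a point:
pierce[0] += 1
randolph[0] -= 1
print(avg(pierce), avg(randolph))  # 7.52 7.5 -> the two players flip
```

A 0.02-point gap is a 1-point difference in total votes across a 50-person panel, which is why single ballots can reshuffle the middle of the rankings.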
Interesting: The group of elite players is shrinking (or the panel is evaluating players more critically as time goes on):
- In 2011, 22 players rated an 8 or higher.
- In 2012, that number fell to 19.
- The ratings compiled near the end of the 2012-13 season produced just 16 players rated 8 or higher.
Injuries may have been a factor in that. Seven different players -- most of whom are currently expected to miss the rest of the season -- received a zero. Injuries were also a big contributing factor in Derrick Rose’s fall from No. 5 to No. 23 and Andrew Bynum’s fall from No. 13 to dead last among the players rated this time around. Bynum received a dismal 4.84 average rating, which would’ve ranked him 146th in the offseason (for perspective, it’s the same rating Ramon Sessions received).
Injured players also accounted for the top three biggest drop-offs from August: Andrew Bynum (-73), Steve Nash (-39) and Danny Granger (-36).
On the opposite side of the spectrum, 25 different players received a high rating of 10, including three players who didn’t crack the top 30 in the rankings. The voters, though, remained stingy with the 10s. While every voter handed out at least one (and one voter handed out 15), on average voters gave 10s to just under four players. Three injured players -- Kevin Love, Rajon Rondo and Derrick Rose -- received at least one 10 and at least one 0.
Obviously the player on whom the panel agreed the most was LeBron James, who received a 10 from every voter. Using standard deviation, we can find that the next most agreed-upon players were Kevin Durant (who received exclusively 10s and 9s), Russell Westbrook and James Harden.
Excluding injured players, the player with the largest standard deviation was J.J. Hickson, who had a high rating of 8, a low rating of 1, a median of 6 and a standard deviation of 1.27. So good luck trying to calculate what his contract will be this offseason.
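Standard deviation as an agreement measure can be sketched in a few lines. The vote lists below are illustrative, not the actual ballots (the article only tells us Durant drew exclusively 9s and 10s, and Hickson’s votes ran from 1 to 8); the sketch just shows why a tight cluster of votes yields a small standard deviation and a wide spread yields a large one:

```python
import statistics

# Illustrative ballots -- not the real #NBArank votes.
durant = [10, 10, 9, 10, 9, 10]  # only 9s and 10s, per the article
hickson = [8, 6, 6, 5, 3, 1]     # wide spread: high of 8, low of 1

# Sample standard deviation; the smaller the value,
# the more the panel agreed on a player.
print(statistics.stdev(durant))   # small -> strong consensus
print(statistics.stdev(hickson))  # large -> little consensus
```

Ranking players by that value (smallest first) reproduces the “most agreed-upon” ordering described above, with a unanimous player like LeBron James sitting at exactly zero.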
Last but not least, special mention needs to go out to Jimmy Butler. The second-year Bulls guard made a leap in the ratings usually reserved for rookies (who are sometimes under-ranked in the preseason version of the rankings, with no NBA data to go on). Butler jumped nearly 300 spots, though he was helped because only 86 players were rated this time. It’ll be interesting to see how Butler holds up against the full player pool rather than the reduced one the panel worked with here. Butler’s rating of 5.29 was good for 80th here, but would’ve placed him only 108th this past summer.