Subjectivity Objectified: Measuring Fans’ Biases with All-Star Votes

It doesn’t take a hardcore sabermetrician to realize that the All-Star vote is a sham. After all, the undisputed best catcher in the game received only the 11th-most votes at his position, and Omar Infante made the cut while MVP candidate Ryan Zimmerman had to sit at home (not the fans’ fault, but still).

But even if it’s impossible to distinguish the game’s best players by looking at the vote totals, I wondered if it would be possible to gather some more unorthodox information from the results: namely, the impact of fans’ biases on their ballots.

I quickly scratched out an equation for a statistic I made up, called “All-Star Score,” to measure how deserving a player is of fans’ votes for the Midsummer Classic:

All-Star Score = (Wins Above Replacement* + 2) ^ 2

*—numbers as of the All-Star Game

I calculated the All-Star Score for each player listed on the ballot and summed them. I then summed the total All-Star votes cast (Major League Baseball releases vote totals only for the top 25 outfielders and the top eight vote-getters at each other position per league, so I used 300,000 votes as a baseline for players whose totals were not released) and divided that sum by the composite All-Star Score to find the value of a single All-Star Score point: just under 74,000 votes.

Finally, I calculated each team’s votes-per-All-Star Score ratio, then divided it by the league average to estimate what proportion of votes each team’s players received relative to what they deserved. The numbers below show each team’s relative figure as a percentage: a “Bias Score” of 100 would mean the team received exactly the right amount of support (of course, no club came out at exactly 100).
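The calculation described above can be sketched in a few lines of Python. The team names, WAR values, and vote totals below are purely illustrative, not the actual 2010 data:

```python
# Sketch of the Bias Score pipeline: All-Star Score per player,
# league-wide value of one score point, then each team's ratio
# relative to the league average, expressed as a percentage.

def all_star_score(war):
    """All-Star Score = (WAR + 2) ** 2."""
    return (war + 2) ** 2

# Hypothetical ballot data: {team: [(player_war, votes), ...]}
ballots = {
    "Team A": [(4.0, 2_500_000), (1.0, 300_000)],
    "Team B": [(3.0, 900_000), (2.0, 300_000)],
}

# League-wide value of a single All-Star Score point
total_votes = sum(v for players in ballots.values() for _, v in players)
total_score = sum(all_star_score(w) for players in ballots.values() for w, _ in players)
votes_per_point = total_votes / total_score

# Bias Score: a team's votes-per-point ratio over the league average, times 100
bias = {
    team: round(
        100
        * (sum(v for _, v in players) / sum(all_star_score(w) for w, _ in players))
        / votes_per_point
    )
    for team, players in ballots.items()
}
print(bias)  # Team A's voters over-reward their players; Team B's under-reward
```

With these made-up numbers, Team A lands well above 100 and Team B well below, illustrating how a score of 100 means a fanbase voted in exact proportion to its players’ on-field value.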

I’m fully aware of the flaws in my experiment: the statistics used were compiled after the voting, not during it; I’m sure my 300,000-vote estimate for the lower-tier players is extremely generous to some and a big low-ball to others; and, of course, there’s no guarantee that my little equation represents the ideal proportion of All-Star votes a candidate should receive.

Nonetheless, I think the results are both somewhat meaningful and interesting:

Tier 1: The Unloved (79 and below)

1 White Sox 47
2 Royals 47
3 Athletics 48
4 Padres 49
5 Giants 50
6 Cubs 56
7 D-Backs 57
8 Blue Jays 59
9 Indians 59
10 Nationals 59
11 Orioles 60
12 Rockies 66

If you look at the vote totals, seeing the Royals and A’s at the top of the list shouldn’t come as a surprise: they’re two of the three miserable teams that didn’t get a single player on the voting leaderboards. Meanwhile, the starting nine for the Orioles—the only other club to be completely neglected—have been so bad that Baltimore landed in the middle third of the Bias Scores despite having the absolute minimum number of votes. Ouch.

It’s no surprise to see struggling teams like the Indians and Diamondbacks fall this low, but I would have expected Padres, Blue Jays, and Nationals fans to show their favorite players a little more love in light of their teams’ expectations-beating early performances. And I’m shocked that the Rockies haven’t been able to generate more excitement, what with their recent string of comeback wins in playoff races.

However, I’d say the biggest upsets here are the teams from Chicago—particularly the Cubs. North Side fans have a reputation of being among the most loyal and passionate in baseball (after more than a century without a championship, they’d have to be). It’s a telling sign that something is very wrong in Wrigleyville.

Tier 2: The Average (80 to 120)

13 Marlins 80
14 Pirates 81
15 Reds 84
16 Red Sox 90
17 Astros 102
18 Mariners 114
19 Rangers 120

The first team that jumps out at you here is Boston: how can Red Sox Nation be classified as a relatively unbiased fanbase? Take a look at the leaderboards and it becomes clear. Adrian Beltre finished behind Michael Young, Kevin Youkilis got barely half the votes of scuffling Mark Teixeira, and even local hero David Ortiz fell behind the anemic Hideki Matsui. Derek Jeter has been better than Marco Scutaro, fine, but does he really deserve six times as many votes?

Two teams in this grouping redefine pathetic. A 20th-place finish for Andrew McCutchen is enough to put the Pirates squarely in the middle of the pack because their eight candidates have combined to be of less value than Dan Uggla. Astros fans, meanwhile, turn out to have a positive bias because of Lance Berkman’s eighth-place finish at first base. That’s what happens when your team has a negative composite WAR.

The two AL West teams are both interesting cases. The Mariners don’t have much of a reputation for a strong fan base, but people love Ichiro and the now-retired Ken Griffey Jr. raked in over a million votes. Given that the Rangers have the third-highest team vote total in the game, you might expect them to have a far higher Bias Score. But you might not realize that Texas also has the third-highest composite WAR.

Tier 3: The Coddled (121 to 150)

20 Tigers 126
21 Angels 129
22 Dodgers 129
23 Cardinals 134
24 Brewers 138
25 Mets 146

Most of these names were pretty predictable. The Brewers are probably the most surprising team to be ranked this far up. Their high score is entirely the fault of Ryan Braun, who led all outfielders with just under 3 million votes despite a significant offensive dropoff and horrific defense, even by his standards.

Tier 4: The Overindulgent (151 to 190)

26 Braves 159
27 Rays 163
28 Twins 171
29 Phillies 181

Eight years ago, the Twins were on the verge of falling victim to contraction. Three years ago, the Rays had never finished a season with more than 70 wins. If you’d said then that both teams would soon have some of the most passionate fans in baseball, you would have been laughed out of the room.

Tier 5: The Insane (191 and up)

30 Yankees 199

I’m sure some commenter will accuse me of writing this article for the sole purpose of blasting the Yankees. I’ll say here for the first and only time that, while their coming out on top was somewhat predictable, this is just how it happened.

Just look at the vote totals. A-Rod over Beltre two-to-one, Curtis Granderson over Alex Rios by a nearly three-to-one margin, Teixeira over Paul Konerko almost five-to-one, Jeter over Cliff Pennington by over ten-to-one. Is there any logical explanation for that? And this isn’t even taking into consideration Nick Swisher’s Final Vote victory over Youkilis.

I’ll be the first to admit that this isn’t a definitive study—the rankings would surely be shuffled around if the full, precise vote totals were available (especially towards the lower end), and I don’t think anyone believes for a second that fans in Houston are more loyal than their counterparts in Boston. But I still think the results are somewhat telling, so in the future, fans in Minnesota and Wisconsin might want to think twice before complaining about East Coast bias.

Lewie Pollis is a recent high school graduate from outside of Cleveland, Ohio. He will be attending Brown University starting in Fall 2010. For more of his writing, click here.




This isn’t particularly complicated. Put the Red Sox’ anomalous finish aside, and there are two factors at work here. The one that everyone would like to point out is the population density of the Northeast, and that’s all well and good, although I’d like to point out that Dallas is the fourth largest metro area in the U.S.; and then after Philly at number five, the next four are Houston, Miami, Washington, and Atlanta, before Boston comes in. Generally, the Rangers, Astros, Marlins, Nationals, and Braves aren’t considered “big market” teams. The much more important point is the relative popularity… Read more »

Turd Ferguson

The NFL is more popular than the MLB, even in the Northeast. That’s not even up for debate.


JimNYC, while I agree with your sentiments, it should be noted that the Florida game featured Strasburg and Nolasco. Still, that’s no reason for 10,000 empty seats.


One idea that I had to limit the bias of specific fans: have fans vote for the best players on each team in the league. If these players were all taken, that is 16 spots in the NL and 14 in the AL. Let the players still vote on the top position players. Add those who were not already included. Then allow the coaches of the team to fill out the rest of the roster. Another thought on the coaches for the all-star game. I think that the 3 division leading teams should have at least one of their coaches on… Read more »


Turd, remember when the Patriots won their first Superbowl? Remember what the crowd was chanting during the victory parade? Yankees Suck. When the Giants won the Superbowl a couple of years ago, and they had their tickertape parade down Broadway, “Red Sox Suck” banners were hanging out of the windows of the office buildings along Broadway. During football season, the back pages of the papers in NY are more interested in potential MLB signings than what’s actually going on in NFL games happening at the time. Trust me, the NFL isn’t a patch on MLB in the Northeast.


JimNYC, Passion and popularity are two different things.


I don’t think popularity of a sport relative to other sports can be well measured by anecdotes.

That being said, I live in New Hampshire, and it seems to me that overall people are more into football than baseball here (i wish it were the other way around).

I enjoyed the article Buizly