
Mechanics of the Shift

Earlier this week, 538 put out an article on Ryan Howard, arguing the shift had killed his career…

Rather than the fact that he was 37 years old and could no longer hit or field.

The article paints a picture of a stubborn player who refused to adapt when the league had figured him out:

While some hitters try to overcome the shift with well-timed bunts or tactical changes, Howard always stubbornly refused. “All you can do is continue to swing,” Howard said in a 2015 interview with MLB.com.

Howard’s stubbornness is contrasted, via a link to an ESPN article, with a similar slugger (David Ortiz) who learned to adjust, and the piece imagines an alternate, shift-free universe where Howard remains an MVP threat and Hall of Fame material.

This is crap.

Ortiz did not “figure out” the shift. He is a good hitter who ran a 13% strikeout rate last year; Howard’s is over 28% for his career. I’m sure the shift hurt him to some extent, but he and Ortiz both ran career BABIPs around .300. Howard could make that work when he was hammering 40-plus homers, but take that away and there’s not much left. My guess: old age is what did him in. But this led me to wonder: how does the shift actually work?
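For the uninitiated, BABIP is batting average on balls in play: how often a batted ball that stays in the park falls for a hit. Here’s the standard formula as a quick Python sketch; the stat line in the example is made up purely for illustration, not Howard’s or Ortiz’s actual numbers:

```python
def babip(hits, hr, ab, k, sf):
    """Batting average on balls in play.

    Home runs leave the park, so they come out of both the numerator
    and the denominator; strikeouts never put a ball in play, so they
    come out of the denominator; sacrifice flies are balls in play,
    so they get added back in.
    """
    return (hits - hr) / (ab - k - hr + sf)

# Hypothetical slugger season (made-up numbers):
# 150 hits, 25 HR, 550 AB, 120 K, 5 sac flies.
print(round(babip(150, 25, 550, 120, 5), 3))  # 0.305
```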

Many people treat the shift like some mystic boogeyman, out to ruin either the game as a whole or certain players in particular unless they “adjust.” As a Twins fan, I know many people who blame Joe Mauer’s decline on the shift.

Personally, I would like to just throw this chart out there:

Groundball BABIP
Season  BABIP
2017 0.240
2016 0.239
2015 0.236
2014 0.239
2013 0.232
2012 0.234
2011 0.231
2010 0.234
2009 0.232
2008 0.237
2007 0.239
2006 0.236
2005 0.233
2004 0.235
2003 0.215
2002 0.224
Average 0.234

This is the MLB-wide BABIP on ground balls over the last 16 seasons. Notice that it didn’t go down at all. I don’t have the numbers to prove it, but I think we all know shift usage has exploded since 2002, and yet there’s no real change in ground-ball outcomes. So where has the shift changed the game? Line-drive BABIP has declined over time. Counteracting that, though, is the fact that fly-ball BABIP has gone up.
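If you want something sturdier than eyeballing to back up “didn’t go down at all,” here’s a quick least-squares trend fit to the table above, in plain Python with no libraries:

```python
# Season-by-season ground-ball BABIP, straight from the table above.
gb_babip = {
    2002: .224, 2003: .215, 2004: .235, 2005: .233,
    2006: .236, 2007: .239, 2008: .237, 2009: .232,
    2010: .234, 2011: .231, 2012: .234, 2013: .232,
    2014: .239, 2015: .236, 2016: .239, 2017: .240,
}

years = sorted(gb_babip)
vals = [gb_babip[y] for y in years]

# Ordinary least-squares slope: average change in GB BABIP per season.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(vals) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, vals))
         / sum((x - mean_x) ** 2 for x in years))

print(f"average GB BABIP: {mean_y:.3f}")    # ~0.234, matching the table
print(f"trend per season: {slope:+.4f}")    # ~+0.0008 per season: flat, if anything rising
```

Again, to the charts!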

Season  LD BABIP  FB BABIP
2017 0.675 0.126
2016 0.682 0.127
2015 0.678 0.129
2014 0.683 0.123
2013 0.683 0.149
2012 0.682 0.152
2011 0.695 0.143
2010 0.719 0.124
2009 0.722 0.138
2008 0.698 0.150
2007 0.732 0.129
2006 0.713 0.138
2005 0.700 0.126
2004 0.709 0.117
2003 0.743 0.095
2002 0.733 0.083
Average 0.703 0.128

I wondered if some “line drives” of the past were simply fly balls that happened to land for hits, while the outs got labeled “flies.” I don’t actually know whether that’s true, or whether the process for classifying line drives versus fly balls has changed, but I decided to look at combined “air-ball” BABIP to see whether it has moved over time. So here is the BABIP on all non-ground balls:

Season  BABIP
2017 0.324
2016 0.335
2015 0.339
2014 0.335
2013 0.338
2012 0.339
2011 0.331
2010 0.332
2009 0.340
2008 0.339
2007 0.335
2006 0.343
2005 0.350
2004 0.332
2003 0.349
2002 0.330
Average 0.337
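As a back-of-the-envelope check that all these numbers hang together: combined air-ball BABIP should be the weighted average of the line-drive and fly-ball rates, with the weights being each bucket’s share of balls in play (not a simple mean of the two). Using the roughly 36% line-drive share I mention below:

```python
# Recombining the two air-ball buckets from the earlier table.
# Rates are the 2002-2017 averages quoted in this post.
ld_share = 0.36    # share of air balls that are line drives
ld_babip = 0.703   # average line-drive BABIP
fb_babip = 0.128   # average fly-ball BABIP

combined = ld_share * ld_babip + (1 - ld_share) * fb_babip
print(f"{combined:.3f}")  # ~0.335, right in line with the 0.337 average above
```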

2017 is pretty clearly an outlier, but considering less than half the season is in the books so far, and I have no idea how air-ball BABIP moves over the course of a season (do more hits find grass when the weather is warmer? no idea), I wouldn’t put too much stock in that just yet. Another possibility I considered was that the breakdown of line drives versus fly balls has changed over time. It hasn’t, really: since 2002, 36% of air balls have been line drives, and while some years run higher and some lower, there doesn’t seem to be any particular trend in that number; the first eight years average 36%, and so do the last eight.
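The same first-eight/last-eight comparison works on the air-ball BABIP series itself, using the table above:

```python
# Air-ball BABIP by season, 2002 through 2017 (2017 is partial).
air_babip = [
    .330, .349, .332, .350, .343, .335, .339, .340,   # 2002-2009
    .332, .331, .339, .338, .335, .339, .335, .324,   # 2010-2017
]

print(f"2002-2009: {sum(air_babip[:8]) / 8:.3f}")   # ~0.340
print(f"2010-2017: {sum(air_babip[8:]) / 8:.3f}")   # ~0.334
```

That’s a six-point gap, much of it owed to partial 2017: not nothing, but hardly a sea change.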

I know the shift has some impact on run scoring in aggregate. But in my opinion, skyrocketing strikeouts and the home-run explosion are the markers of the modern version of this nation’s pastime, not which side of second base the shortstop stands on.