

Lies, Damn Lies, and Defensive Meltdowns

The Vancouver and Colorado games were unprecedented, team-wide failures by the Sounders defense. Outside of such extreme circumstances, though, characterizing defense in the numbers is often difficult.


Roles and Numbers, redux

I recently argued that the rate at which a player accumulates defensive stats may be a useful measure of ability or style. Reasonably, better positioning, range, or ability in the tackle should be reflected in the numbers. I also pointed out that the opportunity and role given to an individual player strongly influence the numbers - and that accounting for role takes more care than simply naming the player's position on the field. Take Servando Carrasco as an example: he played only in central midfield for the Sounders, but Osvaldo Alonso wasn't always his CM partner in those games.

[Chart: Def1_medium]

Carrasco's defensive rates and passing rates were marginally higher in games where Alonso was not his primary partner, translating to about 4 extra defensive actions per 90 minutes. I doubt that Carrasco was a superior player when playing without the Honey Badger. More likely, he took on more responsibility on both sides of the ball in that situation. Defense is a team activity, and the numbers accumulate on particular roles through strategic design as well as skill. If we look at defensive statistics on a team basis instead of at the trends of individual players, can we learn anything from these numbers in a way that isn't confounded by role?
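For concreteness, here is a minimal sketch of the per-90 rate calculation behind that comparison, in Python with pandas. The column names and every number below are hypothetical placeholders, not the actual match data behind the chart.

```python
import pandas as pd

# Hypothetical per-game log for Carrasco: minutes played, total defensive
# actions, and whether Alonso was his primary central-midfield partner.
games = pd.DataFrame({
    "partner_alonso": [True, True, False, False],
    "minutes":        [90, 72, 90, 84],
    "def_actions":    [8, 6, 11, 10],
})

# Aggregate first, then normalize: total actions per 90 minutes played,
# split by whether Alonso was the partner.
rates = (
    games.groupby("partner_alonso")[["def_actions", "minutes"]].sum()
         .assign(per90=lambda d: d["def_actions"] / d["minutes"] * 90)
)
print(rates["per90"])
```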

Team Defense, or its Absence

To put it another way, Seattle has just suffered two of the worst defensive performances in its history. Can we tell from the numbers?

[Chart: Def2_medium]

There is a relationship between team defensive rates and goals conceded - but it is largely driven by a few extreme performances, and it is a fairly poor fit to the data (r² = 0.134). Indeed, if one were to cut the worst 3 performances from the graph, the trend line would slope upward instead of down. The Vancouver and Colorado games show a defense that had lost the game completely and was unable to make plays on the ball, but the next two most severe defeats - 0-4 at L.A. and 1-3 at Houston - register about 130 defensive actions, well within the middle half of the data range.
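A rough sketch of how that r² and its sensitivity to the extreme games could be computed, assuming a simple per-game table of team defensive actions and goals conceded; the values below are invented for illustration, not the real season data.

```python
import numpy as np

# Hypothetical per-game totals: team defensive actions and goals conceded.
def_actions   = np.array([128, 131, 140, 119, 152, 160, 97, 88])
goals_against = np.array([  1,   0,   1,   2,   1,   0,  5,  4])

def r_squared(x, y):
    """Square of the Pearson correlation between x and y."""
    r = np.corrcoef(x, y)[0, 1]
    return r ** 2

# Fit over the full sample...
print(r_squared(def_actions, goals_against))

# ...and again with the three worst results (most goals conceded) removed,
# to see how much the extreme games drive the relationship.
keep = np.argsort(goals_against)[:-3]
print(r_squared(def_actions[keep], goals_against[keep]))
```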

[Chart: Def3_medium]

This could be considered a distressing analysis. Indeed, the median number of defensive actions in wins is lower than in losses or draws, and the median in shutouts is lower than in dirty sheets... and the ranges clearly overlap. The worst team defensive numbers may indicate failure as we expect them to, as we've seen the past couple of games, but an extra "clearance" or "recovery" on the stat sheet may have a completely negligible impact on goal prevention and team success - even if it looked like a goal-saving intervention. Team defensive statistics do not correlate well with goal prevention. Neither does any individual component of that total (e.g. tackles, clearances, or even "duels won"). How, then, are we going to evaluate team-wide defensive prowess, let alone individual defensive statistics? Goals are "outliers" both in creation and completion, isolated opportunities that make up an extremely small portion of a game's total actions. I'll show you a strong correlation for goals allowed, and it will be the most predictable graph you ever see:

[Chart: Def4_medium]
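As an aside, the win/loss and clean/dirty sheet medians discussed above could be computed along these lines; the per-game table and its numbers are illustrative assumptions, not the real data behind the box plot.

```python
import pandas as pd

# Hypothetical per-game table: team defensive actions, result, goals against.
games = pd.DataFrame({
    "def_actions":   [120, 135, 128, 142, 150, 118, 160, 132],
    "result":        ["W", "W", "L", "D", "L", "W", "L", "D"],
    "goals_against": [0, 1, 2, 1, 3, 0, 4, 1],
})
games["clean_sheet"] = games["goals_against"] == 0

# Median defensive actions by result, and for clean vs. dirty sheets.
print(games.groupby("result")["def_actions"].median())
print(games.groupby("clean_sheet")["def_actions"].median())
```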

When my father coached my intermediate-level youth baseball team, I teased him endlessly for a tongue-not-enough-in-cheek pep talk in which he pronounced that winning depended on scoring more runs than the other team. Getting a shot on target should be correlated with scoring, since the main complications are the skill of the keeper and the difficulty of the save. How does a team prevent dangerous shots on goal, and do defensive statistics measure that ability to any extent?

[Chart: Def6_medium]

This parsing of the data is a crude way of identifying where on the field a defensive action takes place. If the opponent does not regularly come within range of the goal, scoring becomes fairly rare.
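The chart doesn't spell out exactly how the field was partitioned, but one crude version of that location split might look like the sketch below. It assumes each defensive action carries an x-coordinate on a 0-100 scale with the defending goal at 0; the thresholds, orientation, and event list are all assumptions for illustration.

```python
import pandas as pd

# Hypothetical event list: each defensive action with an x-coordinate on a
# 0-100 scale, defending goal at 0.
actions = pd.DataFrame({
    "x":    [8, 22, 35, 55, 71, 90, 12, 40],
    "type": ["clearance", "tackle", "recovery", "interception",
             "tackle", "recovery", "clearance", "interception"],
})

# Split the pitch into thirds and count the actions landing in each zone.
bins   = [0, 33, 66, 100]
labels = ["defensive third", "middle third", "attacking third"]
actions["zone"] = pd.cut(actions["x"], bins=bins, labels=labels, include_lowest=True)

print(actions.groupby("zone", observed=False)["type"].count())
```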

Still, in the end, does this exercise bring us any closer to measuring defense objectively? Defenders clean up myriad passes, clearances, and speculative plays, which adds an extreme amount of noise to the numbers. Stylistic differences between players may be fairly clear (the earlier examples attest to that), but the raw data is very, very noisy.

A contentious, active defensive midfield is invaluable to both the offense and defense.

The past two games represent a stunning, exceptional failure by the Sounders to interfere with their opponents.

Postscript (only read this if you heart Sounder at Heart graphs)

[Chart: Def5_medium]

I do not believe this graph demonstrates any trend of declining performance among Sounders personnel - at least, not relative to the league. Rather, it's consistent with some trends that aolsh of tempofreesoccer has identified (here at TFS and here at shinguardian). The strong similarity of the midfield and defender lines is interesting, and supports the idea that this is a systemic trend in play over the season.

Oh - and that March midfield outlier? There be Honey Badgers.
