Another Look at Team MoSSBODS Ratings History
The MoSSBODS Team Rating System, while far from perfect, seems, based on its margin-predicting performance across VFL/AFL history, to be capturing something useful about the underlying abilities of teams. Which is good, because that's what it was designed to do ...
Today I want to use the System to explore, firstly, recent AFL history and then, later, the entire history of the competition.
TEAM RATINGS FROM 2000 TO THE PRESENT
We'll start today with a tile-map of teams' Combined Rating history using a scale where darker green colourings denote lesser ability and darker red colourings denote greater ability.
(I spent not inconsiderable time finalising the colour palette for this and the following charts, in the end using some palettes suggested by the ColorBrewer site. Along the way I was pleasantly side-tracked into the world of colour distances, since I hoped to make the darkest green in the charts as perceptually distant from the middle yellow as the darkest red was.)
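For the curious, that side-track can be sketched in a few lines. A common way to measure perceptual colour distance is to convert colours to CIE Lab space and take the Euclidean (CIE76) distance there. The sketch below assumes sRGB inputs and a D65 white point; the hex values are illustrative endpoints in the style of a ColorBrewer diverging palette, not necessarily the exact colours used in the charts.

```python
# A minimal sketch of perceptual colour distance via CIE Lab (CIE76),
# assuming sRGB inputs and a D65 white point. The hex values used below
# are illustrative, not necessarily those used in the charts.
import math

def hex_to_srgb(h):
    """Convert '#rrggbb' to an (r, g, b) triple in [0, 1]."""
    return tuple(int(h[i:i + 2], 16) / 255 for i in (1, 3, 5))

def srgb_to_lab(rgb):
    """Convert an sRGB triple to CIE Lab (D65 reference white)."""
    # Undo the sRGB gamma to get linear RGB
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
           for c in rgb]
    r, g, b = lin
    # Linear RGB -> XYZ (standard sRGB/D65 matrix)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # Normalise by the D65 white point, then apply the Lab transfer function
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def lab_distance(hex1, hex2):
    """Euclidean (CIE76) distance between two hex colours in Lab space."""
    l1, l2 = srgb_to_lab(hex_to_srgb(hex1)), srgb_to_lab(hex_to_srgb(hex2))
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(l1, l2)))

# Compare how far each palette extreme sits from the middle yellow
print(lab_distance("#006837", "#ffffbf"))  # darkest green vs middle
print(lab_distance("#a50026", "#ffffbf"))  # darkest red vs middle
```

Matching the two printed distances as closely as possible is what makes the two halves of a diverging palette feel symmetric to the eye.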
This view of the data best highlights the temporal trends in each team's Rating history - you can use it to compare teams at any point in time, but only to reach broad conclusions - revealing extended periods of above- or below-average ability and, within those periods, providing some indication of the depths fathomed or heights attained.
You can see, for example, the extended period of below-average ability displayed by the Melbourne team from about 2007 onwards, bottoming out in 2013, and you can also see the emergence of the exceptional teams of the era: Essendon in 2000, Geelong in 2007 and 2008, St Kilda in 2009, Collingwood in 2010 and 2011, and Hawthorn in 2012.
These Combined MoSSBODS Ratings comprise separate Offensive and Defensive Ratings, which we explore in the charts that follow.
The Offensive view provides a slightly different perspective for some teams. For example, we see the Offensive strength of the Western Bulldogs across the six seasons from 2005 to 2010, and we see that Melbourne's Offensive weakness persisted through 2013 and 2014. It's interesting to note the below-average Offensive profile for the Fremantle team of 2015, which finished as minor premiers but failed to make the Grand Final.
We see too that Richmond has enjoyed only a relatively small window of Offensive excellence, roughly across the 2011 to 2013 period, while Hawthorn has been above-average since about half-way through season 2010 all the way to the present day (Round 2 of 2016).
Turning next to the Defensive view we see Sydney's above-average abilities in this area stretching back across virtually the entire period, and Melbourne's below-average abilities for the most part doing the same.
Fremantle's transition from a below-average to an above-average defensive team, starting in 2012 with the arrival of Ross Lyon, is particularly stark. In the five preceding seasons, Lyon had coached at St Kilda with apparently similar effects on their defensive abilities - abilities which, according to the chart, faded almost as soon as he departed.
The gradual improvement in the defensive abilities of the new teams, Gold Coast and GWS, is also apparent here, as is a similar improvement in the ability of Melbourne, though they all still remain below-average.
It's interesting that we don't see, amongst the exceptional teams of the era listed earlier (St Kilda 2009 aside), the same domination defensively that we do offensively. More on this a bit later.
One way of summarising this era for each team in terms of Offensive and Defensive Ratings is by way of a 2d-density plot, which appears below.
The upward-and-to-the-right orientation of these plots for most teams reveals the generally correlated nature of teams' Offensive and Defensive abilities (ie teams tend to be better or worse than average on both dimensions). Notable exceptions are Sydney and North Melbourne, which are also extraordinary for the relative concentration of their Rating data, suggesting that their abilities have varied far less than those of other teams across the period.
In contrast, the plots for Carlton, GWS and St Kilda cover much larger areas, signifying much higher levels of variability in ability over the period.
Also interesting to note is that Sydney's plot sits almost entirely above a Defensive Rating of 0, and that Geelong's sits almost entirely above a Defensive Rating of 0 and to the right of an Offensive Rating of 0. By comparison, the plots for Gold Coast and GWS sit below and to the left of zero Ratings.
(By the way, if you're interested in drilling down on the 2005 to 2015 period a little more, this blog has more specific Rating data for each team and investigates the extremes of the Ratings.)
ALL-TIME TEAM RATINGS
We can create exactly the same set of charts for the entire expanse of VFL/AFL history. So, let's do that, starting again with the tile-map for the Combined Ratings.
In reviewing this chart, note that MoSSBODS Ratings carry over, with some regression towards zero, from one season to the next, so, in creating them, we need to make some decisions about what to do when teams merge or move.
For this purpose, MoSSBODS assumes that:
- Sydney inherits South Melbourne's Ratings
- The Western Bulldogs inherit Footscray's Ratings
- The Brisbane Lions inherit Fitzroy's Ratings
- North Melbourne and the Kangaroos are one, continuous entity
Other assumptions are possible but in any case don't make much difference to the picture as you see it here.
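The carry-over and inheritance logic can be sketched as follows. Note that the carry-over factor of 0.7 is purely illustrative; the actual regression parameter MoSSBODS uses is not given here, and the function names are my own.

```python
# A minimal sketch of season-to-season Rating carry-over with the
# merge/move assumptions listed above. The carry-over factor of 0.7 is
# purely illustrative; MoSSBODS's actual parameter is not given here.

# Map each team to the predecessor whose Ratings it inherits
PREDECESSOR = {
    "Sydney": "South Melbourne",
    "Western Bulldogs": "Footscray",
    "Brisbane Lions": "Fitzroy",
    "Kangaroos": "North Melbourne",  # treated as one continuous entity
}

def season_start_rating(team, end_of_season_ratings, carryover=0.7):
    """Regress a team's end-of-season Rating towards zero, inheriting a
    predecessor's Rating where one is defined; new teams start at 0."""
    source = PREDECESSOR.get(team, team)
    return carryover * end_of_season_ratings.get(source, 0.0)

# Sydney's first season inherits (a regressed form of) South Melbourne's Rating
print(season_start_rating("Sydney", {"South Melbourne": 4.0}))
```

The same mechanism gives genuinely new teams, such as Gold Coast and GWS, a starting Rating of zero, which is the all-team average.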
For me, the most striking feature of this chart is the cyclical nature of ability, with most teams enjoying extended periods of above-average and then below-average ability. It's hard to be definitive, but it does not appear as though the introduction of the national Draft in 1986 or the salary cap in 1987 had a huge influence on this phenomenon. This might be an interesting topic to explore more quantitatively in a future blog.
Next, here are the Offensive and Defensive tile-maps.
Comparing the two charts it seems that:
- exceptional teams have dominated offensively to a greater extent than exceptional teams have dominated defensively (because the scale for Offensive ability goes higher)
- defensively poor teams have been relatively poorer than have offensively poor teams (because the scale for Defensive ability goes lower)
A quick analysis of raw points scoring data for every season tends to support this conclusion since, on average across history:
- The team registering the most points in the home and away portion of the season (a measure of the "best" team offensively) scored 1.88% more points than the all-team average
- The team conceding the fewest points in the home and away portion of the season (a measure of the "best" team defensively) conceded only 1.78% fewer points than the all-team average
- The team registering the fewest points in the home and away portion of the season (a measure of the "worst" team offensively) scored 1.92% fewer points than the all-team average
- The team conceding the most points in the home and away portion of the season (a measure of the "worst" team defensively) conceded 2.49% more points than the all-team average
These percentages were much larger in absolute terms in the early stages of the competition, but an analysis just of the period 1980 to 2015 when they were more stable yields the same ordering:
- "Best" Offensive: +1.44% relative to all-team average
- "Best" Defensive: -1.30% relative to all-team average
- "Worst" Offensive: -1.31% relative to all-team average
- "Worst" Defensive: +1.67% relative to all-team average
We also see in these two charts, as we did in the chart for the Combined Ratings, extended periods of offensive and defensive superiority and inferiority.
To finish, here's the all-time 2d-density plot:
Again we see the general tendency for teams' contemporaneous Offensive and Defensive Ratings to be positively correlated, with Fremantle and Sydney the standout exceptions. We also see the remarkable relative consistency in the Ratings of Collingwood, which cover a much smaller proportion of the Ratings space than for any other team with a comparably long period of participation in the competition.
For me, there's a real sense of continuity and connection in this chart, thinking about the millions of fans across the history of the sport who've watched as their chosen team has mapped out some portion of their respective plot here, covering new territory, scaling new heights and plumbing new depths - in some cases, to boldly go where no team has gone before (and that's not always been a good thing ...)