How Good Are Hawthorn, How Poor GWS?
Without the benefit of emotional and chronological distance it's often difficult to rate the historical merit of recent sporting performances. MAFL's MARS Ratings, whilst by no means the definitive measure of a team's worth, provide one objective basis on which to assess the teams that ran around in 2013.
Firstly then, here's the entire history of MARS Ratings for every team, from the first game of 1897 to the Grand Final of 2013, created by assuming that the home team in every Final ever played was the team with the higher MARS Rating at the time, and by setting every team's MARS Rating to 1,000 in its initial year.
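For concreteness, here's a minimal Python sketch of those two construction assumptions: the 1,000 initial Rating and the treatment of the higher-Rated Finalist as the home team. The names used are purely illustrative and the MARS update formula itself isn't reproduced here.

```python
# A minimal sketch of the two assumptions noted above. The underlying
# MARS Rating update formula isn't shown, and the function and dictionary
# names are hypothetical.

INITIAL_RATING = 1000.0  # every team's Rating in its first season of existence

def finals_home_team(team_a: str, team_b: str, ratings: dict) -> str:
    """For Finals, treat the higher-Rated side as the home team."""
    return team_a if ratings[team_a] >= ratings[team_b] else team_b

# With hypothetical Ratings, Hawthorn would be deemed the home team here.
print(finals_home_team("Hawthorn", "Fremantle",
                       {"Hawthorn": 1054.4, "Fremantle": 1035.0}))
```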
Taking a broad view of the current teams we see the general decline of the Saints, the Dogs, the Dees, the Lions and the Pies, and the (re)emergence of the Swans, Hawks, Roos, Tigers, Dockers, Blues and Dons - and maybe Port as well.
Focussing on teams' more-recent Ratings history, the period from 2000 to 2013, brings these broad trends into sharper relief.
From this chart we see that, as at the end of 2013, Richmond, Sydney, Hawthorn, the Roos and Fremantle are at or very close to their highest Ratings since the turn of the century, while the Bulldogs, GWS, Melbourne and the Brisbane Lions are at or near their nadir.
Which leads us naturally to ask: historically, how high are those highs, and how low the lows?
The 2013 Hawthorn team's end-of-season Rating of 1,054.4 is the 12th highest ever recorded for any team and the 4th highest recorded for a Hawthorn side, behind the 2012 minting, which finished Rated 1,056.8 after peaking at 1,059.8, and behind the 1988 and 1989 Hawks editions, both of which finished Rated over 1,060.
Essendon's 2000 side holds the record for the highest ever end-of-season Rating having emerged at the back end of that season with the Premiership and the only ever MARS Rating in excess of 1,070.
From recent years, other teams to Rate highly have been the Cats of 2007, 2008 and 2011, and the Pies of 2010 and 2011. Testimony to how difficult it is to win Premierships is the fact that these five exceptional teams won only three Premierships amongst them.
Before moving on from this table, it's worth acknowledging the achievements of the Collingwood 1929 and Essendon 1950 sides, which appear in this list despite playing in eras when the season comprised only 20 games, making it more difficult to rack up Rating Points (RPs) and, therefore, to earn a place here.
From the best then to the worst.
The 2013 GWS team fell about 4 RPs short of finishing with the worst-ever end-of-season MARS Rating - and, come to that, the worst-ever any-time-in-the-season Rating. Their 917.5 Rating was undercut only by the Fitzroy 1996 team, which exited its season, and the competition, with a Rating of 913.6.
This year's Melbourne side didn't finish all that much higher, its Rating of 928.7 ranking it 5th in the all-time dishonours list.
Other recent teams to appear amongst the worst 25 have been last year's GWS team (934.3) and last year's Gold Coast team (943.0).
Looking across the entire list, St Kilda teams have most appearances (6), followed by Fitzroy (4), and the Roos/North Melbourne (3).
What's especially noteworthy about the St Kilda teams' collective record in this list is the fact that four of those six Saints entries were recorded in successive years - 1897, 1898, 1899 and 1900 - despite these teams being dragged about halfway back to a 1,000 Rating at the start of every new season and having only 17 games in which to re-establish their impressively unimpressive Rating.
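To illustrate that season-start adjustment, here's a minimal Python sketch; the exact carryover fraction MAFL applies isn't spelled out beyond "about halfway", so the 0.5 used below is an assumption.

```python
# A minimal sketch of the season-start regression towards 1,000 described
# above. The 0.5 regression fraction is assumed purely for illustration.

def season_start_rating(previous_rating: float, regression: float = 0.5) -> float:
    """Pull a team's end-of-season Rating part-way back towards 1,000."""
    return 1000 + (1 - regression) * (previous_rating - 1000)

# A hypothetical team finishing one season Rated 940 would, on this
# assumption, start the next Rated 970 - still poor, but with only 17
# games in which to drive it back down.
print(season_start_rating(940.0))   # 970.0
```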
One way to reduce the influence, on our assessment of team merit, of season length and of differing Ratings at the start of particular seasons is to calculate and compare the per-game change in each team's Rating across entire seasons. Thumbnailed below is this data for every team and every season ever played.
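In code terms the metric is simply the difference between a team's end-of-season and start-of-season Ratings divided by the number of games it played; a rough sketch, with hypothetical inputs:

```python
# A rough sketch of the per-game Rating change metric described above.
# All inputs here are hypothetical.

def per_game_rating_change(start_rating: float,
                           end_rating: float,
                           games_played: int) -> float:
    """Average Rating Points gained (or shed) per game across a season."""
    return (end_rating - start_rating) / games_played

# A team starting at 1,000 and finishing at 1,030 across a 22-game season
# would have added about 1.36 RPs per game.
print(round(per_game_rating_change(1000.0, 1030.0, 22), 2))
```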
A close review of this table reveals some interesting facts about current and past teams, for example that the Blues recorded net positive gains per game for every season except one (1913) from 1903 to 1922, and then again from 1926 to 1949, once more missing only a single season (1946).
Collingwood had a similar streak from 1922 to 1939 (missing 1924), while Hawthorn managed to string together 32 seasons from their inauguration in 1925 right through to the season of 1956 with only a single year during which they grew their Rating (1943).
Richmond grew its Rating in every season except two during the period 1927 to 1951, and then recorded only two positive seasons in the next 13.
St Kilda started with a half-dozen bad years, then had three good years out of five, then four bad and three good. West Coast, in comparison, had 11 positive years in their first 12, while Sydney (then South Melbourne) had a period between 1904 and 1919, one year of which they did not contest, in which they had only two negative years.
The Bulldogs (then Footscray) had only three positive years in their first 15, then had only five negative years in their next 17.
To be frank, it's hard not to review this table and come away with a view that there's a high level of positive inter-seasonal correlation in team Rating performances. That intuition is reinforced by the statistic that, of the 1,398 instances of a team playing in one season and then playing again in the next, 893 have ended with the direction of Rating change in one season being the same as in the previous season. The p-value of that statistic, under the null hypothesis that the direction of a team's average per-game Ratings change is a coin toss from one season to the next, is so small that it makes tiny seem huge.
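For anyone who'd like to verify that, a one-sided binomial test on the figures quoted above, 893 same-direction outcomes from 1,398 season-to-season pairs, can be run as follows (this treats each pair as independent, which is itself a simplification):

```python
# A quick check of the coin-toss null hypothesis using the figures from
# the text: 893 same-direction outcomes in 1,398 season pairs.
from scipy.stats import binomtest

result = binomtest(k=893, n=1398, p=0.5, alternative="greater")
print(result.pvalue)   # vanishingly small - it does indeed make tiny seem huge
```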
We can summarise that mass of history for each team by grouping season per-game changes as follows.
Here we can see that Carlton, Collingwood, Essendon and Geelong have had more "good" seasons (ie seasons where they've added more than 0.30 RPs per game) than the norm, while Brisbane, Fitzroy, Fremantle, Gold Coast, GWS, Hawthorn, the Roos/North Melbourne, Port Adelaide, St Kilda, University and the Western Bulldogs/Footscray have had more "bad" seasons (ie seasons where they've shed 0.20 or more RPs per game) than the norm.
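A hedged sketch of how a single season might be bucketed using the cut-offs just described, with "good" meaning more than 0.30 RPs added per game and "bad" meaning 0.20 or more RPs shed per game; the label for the middle band is merely illustrative:

```python
# Bucket a season by its per-game Rating change, using the cut-offs
# described above. The "middling" label for the residual band is assumed.

def classify_season(rps_per_game: float) -> str:
    if rps_per_game > 0.30:
        return "good"
    if rps_per_game <= -0.20:
        return "bad"
    return "middling"

print([classify_season(x) for x in (0.45, 0.10, -0.35)])
# ['good', 'middling', 'bad']
```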
Looking instead at only the period of history from 2000 to 2013, we see that Sydney has the most extraordinary record, having grown its Rating in every season except one (2009).
Collingwood has the next-best record, having grown its Rating in 10 of 14 seasons. Geelong, Hawthorn and St Kilda are next best, with nine positive seasons each.
Richmond and the Roos have the worst records amongst those teams to have contested all 14 seasons, both having recorded negative per game averages in 10 of 14 seasons. Not far behind are Melbourne and Essendon with nine, and the Dogs, Port Adelaide and Fremantle with eight.
Again, a summary of these 14 seasons is useful.
We see here that only Geelong and Hawthorn have a significant number of exceptionally good seasons, and that Carlton, Gold Coast, GWS, Melbourne and West Coast have a significant number of exceptionally poor seasons.
With the notion of single-season per game changes in MARS Rating in mind, we can Rank all teams through VFL/AFL history yet again. Here, firstly, is the Ranking based on single-season increases.
On this metric, the 2007 Cats emerge as the most improved team of all time, their 2.23 RPs per game increase very marginally superior to that of the 1999 Lions.
No team from 2013 appears on this list, though both of last year's Grand Finalists, the Hawks and the Swans, do appear, in 20th and 26th positions respectively.
Other teams from recent seasons to make the Top 30 are the 2010 Pies (4th), the 2008 Hawks (8th), the 2008 (23rd) and the 2011 (22nd) Cats.
The Collingwood 1929 team, which played only 20 games in that season, makes this list too, as does the Sydney 1942 team, which played only 17.
Whilst eight of the top 11 teams played 24 games or more in their seasons, the fact that three of the top 10 played 20 games or fewer suggests that the practice of calculating per-game Ratings changes has redressed, at least to some extent, the handicap otherwise faced by teams playing in shorter seasons.
Geelong and Collingwood teams have most entries in this list (5), whilst Hawthorn teams have next most (4), then Essendon, Sydney/South Melbourne and Carlton (3 each). Ten teams have never appeared on this list, among them the eight current sides Adelaide, Fremantle, Gold Coast, GWS, the Kangaroos, Port Adelaide, St Kilda and the Western Bulldogs/Footscray.
Finally, here's the same table for single-season per-game Rating decreases.
St Kilda, once again, figures prominently amongst the teams least fortunate, headed by the very first St Kilda team of 1897 that shed a record 3.24 RPs per game.
Amongst more-recent teams, the 2012 Giants loom smallest, having dropped almost 3 RPs per game in their season, while the 2011 Suns, the 2013 Giants and Dees, the 2008 Eagles, and the 2012 Dogs also appear in the Top 30.
St Kilda teams dominate this list, occupying 7 of the 30 places, followed by Melbourne and Fitzroy with 4 each. Geelong, Hawthorn, GWS, the Roos and University are the only other teams to make more than a single appearance, whilst Brisbane, Essendon, Collingwood, Richmond, Adelaide, Port Adelaide and Fremantle teams are very much notable by their absences.
Only three teams started their season of infamy with a MARS Rating above 1,005: Geelong of 1915 (who started at 1,022.4), West Coast of 2008 (who started at 1,006.5), and Carlton of 2002 (who started at 1,017.3). Three more teams, the 1897 Saints and the 2012 Giants and Suns, perhaps more understandably, recorded their history-making seasons in their first year of existence.