MoS' Team Ratings Systems: A First Look at 2016
This year MoS will calculate Team Ratings using three different Systems, all of them ELO-based (a short code sketch of the Elo machinery they share appears after this list):
- MARS, which was created back in 2008, and which has proven itself a competent Rater ever since. It is the simplest of the Systems, using a single multiplier throughout the season and a single Home Ground Advantage (HGA) parameter for all venues and teams. You can find a discussion of it in this blog post, where you'll also find a reference to the original Newsletter available in the Archives.
- ChiPS, introduced in this 2014 post and updated a couple of times since. The latest details are available in this post from 2016. ChiPS was defined such that the difference between two teams' Ratings is a direct estimate of the expected margin between them should they meet at a neutral venue. ChiPS is more complicated than MARS, using different HGAs depending on the venue and home team, different multipliers to convert game margins into Rating changes at different points in the season, a separate Interstate Advantage parameter, and a parameter to account for the differential form of the competing teams.
- MoSSBODS, introduced in this post from July 2015, and then updated in this post from later that same year. MoSSBODS' major point of difference is that it rates every team separately on Offensive and Defensive abilities. Like ChiPS, it uses different multipliers for different portions of the season, but it handles game venue in a much more sophisticated (complex?) manner, estimating each team's ability at every venue, recognising that some venues require interstate travel. MoSSBODS does not, however, recognise teams' recent form.
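For concreteness, here's that shared Elo machinery in miniature: a minimal Python sketch of a single-game Rating update of the MARS-like variety, with a lone multiplier and a single HGA. Every parameter value, scale and function name below is an illustrative assumption, not MoS' actual settings.

```python
# Illustrative Elo-style Rating update of the kind all three Systems share.
# All constants here are placeholders, not actual MoS parameter values.

K = 5.0        # multiplier converting surprise into Rating change
               # (MARS uses a single multiplier all season)
HGA = 8.0      # Home Ground Advantage in Rating points
               # (MARS uses one value for all venues and teams)
SCALE = 130.0  # governs how Rating gaps map to expected results

def expected_result(home_rating, away_rating):
    """Expected home result on a 0-1 scale, allowing for HGA."""
    gap = home_rating + HGA - away_rating
    return 1.0 / (1.0 + 10 ** (-gap / SCALE))

def update_ratings(home_rating, away_rating, actual_home_result):
    """Return post-game Ratings; actual_home_result is on the same 0-1 scale."""
    change = K * (actual_home_result - expected_result(home_rating, away_rating))
    return home_rating + change, away_rating - change

# Example: a 1,020-Rated home team comfortably beats a 1,000-Rated visitor
new_home, new_away = update_ratings(1020.0, 1000.0, actual_home_result=0.75)
```

ChiPS and MoSSBODS elaborate on this skeleton by, respectively, varying the multiplier and HGA with the season and venue, and splitting the single Rating into Offensive and Defensive components.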
Each of the Rating Systems carries over some proportion of a team's end-of-season Rating into the next season, a behaviour that reflects the historical reality that, to varying extents, teams' lists, coaching staff and strategies persist. This season, of course, we have reason to suspect the discontinuity in ability might be larger than normal, especially for a few teams, but all three Systems will nonetheless be adopting their standard season-to-season carryover calculations this year. It will, as I've commented elsewhere, be fascinating to see how detrimental this behaviour is.
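In rough terms, the carryover amounts to dragging each team's end-of-season Rating part-way back towards the all-team mean. A minimal sketch, with an entirely hypothetical carryover proportion and a MARS-style scale (each System uses its own figures):

```python
MEAN_RATING = 1000.0  # all-team average Rating (MARS-style scale)
CARRYOVER = 0.5       # hypothetical proportion; each System has its own

def new_season_rating(end_of_season_rating):
    """Regress a team's Rating part-way back to the mean between seasons."""
    return MEAN_RATING + CARRYOVER * (end_of_season_rating - MEAN_RATING)
```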
That assumption means that the 18 teams will start 2016 with the Ratings shown in the table at right, ordered on the basis of MoSSBODS' Combined Ratings.
West Coast, perhaps surprisingly, will start the season Rated #1 on both Offence and Defence by MoSSBODS, ahead of Adelaide on Offence, and Hawthorn on Defence. MARS and ChiPS have, instead, Hawthorn in 1st and West Coast in 2nd.
Comparing MoSSBODS' Combined Ratings with the Ratings of MARS and ChiPS, we see the largest discrepancies for:
- Adelaide (Ranked 8th by ChiPS and MARS, 5th by MoSSBODS)
- Richmond (Ranked 4th by MARS, 3rd by ChiPS, but 8th by MoSSBODS)
- Fremantle (Ranked 5th by MARS, 6th by ChiPS, but 9th by MoSSBODS)
Adelaide are ranked 5th by MoSSBODS despite being ranked 11th on Defence, while Fremantle are ranked 9th overall despite being ranked 3rd on Defence. Adelaide and Fremantle are two of only four teams - the others being Richmond and Melbourne - whose Offensive and Defensive rankings on MoSSBODS differ by five places or more.
Looking across all 18 teams, ChiPS' and MARS' Ratings show the highest levels of correlation (+0.98), although ChiPS' Ratings are also highly correlated with MoSSBODS' Combined Ratings (+0.96), as are MARS' and MoSSBODS' Combined Ratings (+0.94). MARS', ChiPS' and MoSSBODS' Combined Ratings are more highly correlated with MoSSBODS' Defensive Ratings (+0.91 to +0.94) than they are with MoSSBODS' Offensive Ratings (+0.63 to +0.83), hinting that defensive ability was perhaps more important in 2015 than was offensive ability.
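Correlations like these are straightforward to reproduce. Assuming they're plain Pearson correlations taken across the 18 teams' Rating vectors, here's a numpy sketch with toy five-team vectors standing in for the real figures:

```python
import numpy as np

# Toy Rating vectors for illustration only; the quoted figures use all
# 18 teams' actual MARS, ChiPS and MoSSBODS Ratings.
mars     = np.array([1025.0, 1018.0, 1002.0, 996.0, 981.0])
chips    = np.array([  22.0,   15.0,    3.0,  -2.0, -14.0])
mossbods = np.array([   4.1,    3.2,    0.4,  -0.6,  -2.8])

# Pairwise Pearson correlations between the Systems' Ratings
print(np.corrcoef(mars, chips)[0, 1])
print(np.corrcoef(chips, mossbods)[0, 1])
print(np.corrcoef(mars, mossbods)[0, 1])
```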
On the far right of the table are the most recent TAB Flag prices. Two prices in particular stand out, Adelaide at $34 and Geelong at $8, suggesting that the TAB Bookmaker has made a quite different assessment of these two teams' abilities in 2016 from that implied by their Ratings.
As one way of assessing the relevance of this analysis for the 2016 Season, I created the same table for Season 2015, summarising each System's Ratings prior to Round 1 of that year. Here I've sorted the table on the basis of each team's official finishing position.
At the foot of the table are the simple rank correlations between each Rating System's pre-season team ranking and the official finishing positions, larger correlations reflecting a closer alignment. On this measure then, MoSSBODS Combined Ratings are best with a correlation of +0.72, though the correlations for all Systems are high - and, with the exception of MoSSBODS' Defensive Ratings, higher than would have been achieved by naively assuming that the 2015 official finishing positions would be the same as those in 2014 (see the second-last column).
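Assuming the simple rank correlation here is Spearman's rho, it can be reproduced with scipy; the rankings below are invented for illustration:

```python
from scipy.stats import spearmanr

# Hypothetical pre-season ranks and official finishing positions for
# five teams; the quoted figures use all 18.
preseason_rank = [1, 2, 3, 4, 5]
final_position = [2, 1, 5, 3, 4]

rho, _ = spearmanr(preseason_rank, final_position)
print(rho)  # +1 would mean perfect alignment with the final ladder
```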
If we deem an error of 5 ladder places or more to reflect a "poor" assessment:
- MARS was poor in its assessment of eight teams: Carlton (error of 7 ladder spots), West Coast (6), the Western Bulldogs (6), Port Adelaide (6), Geelong (5), GWS (5), Melbourne (5), and Essendon (5).
- ChiPS was poor in its assessment of only five teams: Carlton (7), West Coast (6), the Western Bulldogs (6), Port Adelaide (6), and Essendon (6).
- MoSSBODS Combined was poor in its assessment of just four teams: Essendon (9), Port Adelaide (7), Carlton (6), and the Kangaroos (5).
The average error in the Systems' team rankings was 3.4 places for MARS, 3.6 places for ChiPS, and just 3.0 places for MoSSBODS. Relative to their 2014 finishes, teams moved an average of 3.8 ladder positions.
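Those averages are simple mean absolute errors between predicted rank and ladder finish, which take only a couple of lines to compute; the team ranks below are invented for illustration:

```python
# Hypothetical predicted ranks and actual ladder finishes for five teams;
# the quoted averages use all 18 teams.
predicted = [1, 4, 2, 8, 6]
actual    = [3, 4, 7, 2, 5]

mae = sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)
print(mae)  # average number of ladder places by which the System missed
```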
All of which suggests that teams' initial MARS, ChiPS and MoSSBODS Ratings might give us a slightly better view of how teams will fare in 2016 than we'd glean from looking only at the final 2015 ladder finishes.
As always, we'll see.