An Analysis of Strength of Schedule for the Men's 2025 AFL Season

The men’s AFL fixture for 2025 was recently released and, as is tradition here, we’ll analyse it to see which teams we think have an easier or harder schedule.

MEASURING DIFFICULTY

Before I get into the way schedule difficulty is measured here on MoS, I want to make a few observations about different approaches you’ll see used elsewhere in the media.

One common approach is to use the final ranking of the teams in the previous season as a measure of their strength. While that certainly has the benefit of simplicity, I’d argue that it can often do a poor job of reflecting the relative strengths of the teams. As a good example from last season, the Western Bulldogs finished 2024 very strongly and were surely better than the 6th or 7th best team, which is what their home-and-away and final ladder positions would have you believe.

The other obvious potential flaw in this approach is that it implicitly weights, for example, the 2nd-ranked team as being 50% weaker than the 1st, but the 3rd-ranked team as only 33% weaker than the 2nd.

Some use competition points instead of competition ranking, which likely better handles the weighting of teams towards the top of the table, but this still suffers from the fact that relative end-of-season competition points don’t always reflect relative end-of-season ability.

The final approach I want to comment on is measuring the difficulty of a team’s fixture based solely on the teams it faces twice. The obvious flaw here is that two teams with otherwise identical fixtures, one of which plays its sole game against Geelong at Kardinia Park while the other plays them at the SCG, would be assessed as having equally difficult schedules.

THE MOS APPROACH

In our analysis of schedule difficulty, we’ll look for answers to three questions:

  1. Which teams fared best and which worst in terms of the overall difficulty of the teams they face in their schedule? We’ll base our assessment of this on the new MoSHBODS model’s relative team strengths and venue effects (the Strength of Schedule analysis)

  2. Which teams fared best and which worst in terms of the matchups they missed out on, given that only 23 of a possible 34 all-plays-all games are played? Again we’ll use the new MoSHBODS model’s opinions about relative team strengths and venue effects in what we’ll call a Strength of Missing Schedule analysis

  3. How much more or less likely would each team be to play Finals were the missing parts of the fixture actually played?

For the analyses in 1 and 2, we’ll use the same methodology as we used last year, details of which appear in this post from 2015 and this one from the year before. We’ll use the latest MoSHBODS Team Rating System to provide the required estimates of relative team ability and venue effects. We’ll use the Venue Performance Values as at the start of the 2025 season, and team ratings as they will be for the first game of the 2025 season (which is 65% of what they were at the end of the 2024 season).

THE FIXTURE

The 2025 AFL Fixture has all 18 teams playing 23 of a possible 34 games, each missing either 6 of the home and 5 of the away clashes, or 5 of the home and 6 of the away clashes, that an all-plays-all full schedule would entail. There is, again, a bye week for every team, these having been accommodated in 2025 by playing only 4 games in Round 0 (sic), 8 games in Rounds 2, 4, and 13, and 7 games in Rounds 3, 12, and 14 through 16.

So, teams will have played the same number of games as one another at the end of each of Rounds 4 through 11, and then again from the end of Round 16 onwards.

THE RULE OF THIRDS

In determining the 99 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 207 home-and-away games, using the ladder positions of 2024 after the Finals series as the measure of that ability.

This year, of the contests that would pit a team from last year’s Top 6 against a team from the Bottom 6, only 44 (or about 61%) of the 72 possible pairings are included in the schedule. That’s down by 2 on last year’s figure of 46.

By contrast, 23 (or about 77%) of the 30 possible pairings between the Top 6 teams are included, while 48 (or about 67%) of the 72 possible pairings between the Top 6 and the Middle 6 teams are included.

There are also 22 of a possible 30 pairings (73%) pitting teams from the Middle 6 against one another (down 1 on last year), 24 of a possible 30 pairings (80%) pitting teams from the Bottom 6 against one another (up 2 on last year), and 46 of a possible 72 pairings (64%) pitting a team from the Middle 6 against a team from the Bottom 6.

In total, 69 of the 207 contests (or about 33%) involve teams from the same one-third based on final official ladder positions last season. That’s the same number as in the 2024 season, but more heavily slanted towards Bottom 6 v Bottom 6 contests.
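
For anyone wanting to check these tallies, here is a minimal sketch of the counting in Python, assuming a hypothetical `fixture` list of (home, away) pairs for the 207 scheduled games and a `third` lookup mapping each team to its 2024 ladder third (both names are illustrative, not anything the AFL or MoS publishes):

```python
def pairing_counts(fixture, third):
    """Tally scheduled games by the ladder thirds of the two teams."""
    counts = {}
    for home, away in fixture:
        # Sort the pair of thirds so that, e.g., Top-v-Bottom and
        # Bottom-v-Top games are tallied together
        key = tuple(sorted((third[home], third[away])))
        counts[key] = counts.get(key, 0) + 1
    return counts

# For the 2025 fixture described above, counts[("Bottom", "Top")] should
# come to 44, counts[("Top", "Top")] to 23, and the three same-third
# tallies to 69 in total.
```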

MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE

The metric that we’ll use to estimate Strength of Schedule is the combined strength of the teams met across the 24 rounds of the season, adjusted for venue effects (so meeting Team A at home will tend to make a lower contribution to the overall combined strength of the teams met than will meeting Team A away, where they’ll likely enjoy a home ground advantage). Recall that the MoSHBODS Rating System calculates a Venue Performance Value (VPV) for every team at every venue; these will tend to be positive for venues that are a team’s home grounds, and negative for venues that are its away grounds, especially interstate ones.
The effective strength of an opponent, Team A, faced by Team B at Venue V is given by: Team A’s Combined MoSHBODS Rating + Team A’s VPV at Venue V - Team B’s VPV at Venue V.

For this measure, a higher combined strength is interpreted as a more difficult schedule.
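
To make this concrete, here is a minimal sketch in Python of the calculation just described; `rating` and `vpv` are hypothetical lookups standing in for the MoSHBODS outputs, not MoS’s actual code:

```python
def opponent_strength(opponent, team, venue, rating, vpv):
    """Effective strength of `opponent` when faced by `team` at `venue`:
    the opponent's Combined Rating, plus the opponent's VPV at the venue,
    less the team's own VPV there (all in SDs)."""
    return rating[opponent] + vpv[(opponent, venue)] - vpv[(team, venue)]

def strength_of_schedule(team, games, rating, vpv):
    """Aggregate the contributions across a team's 23 fixtured games,
    where `games` is a list of (opponent, venue) pairs for `team`."""
    return sum(opponent_strength(opp, team, venue, rating, vpv)
               for opp, venue in games)

# Per the methodology above, pre-season ratings are 65% of final 2024
# ratings: rating[team] = 0.65 * final_2024_rating[team]
```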

The first thing we need for this metric is a measure of each team’s underlying abilities. For this purpose, as noted earlier, we'll use MoSHBODS’ 2025 pre-Round 0 Team Ratings, which are set by taking 65% of their final 2024 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below. 

This year there are again some teams that are ordered very differently based on their MoSHBODS versus their ladder finish (though fewer than in 2024), with the biggest differences being for:

  • Western Bulldogs: 1st on MoSHBODS and 7th on the Ladder

  • GWS: 10th on MoSHBODS and 5th on the Ladder

  • Sydney: 6th on MoSHBODS and 2nd on the Ladder

In the context of the AFL's competition "thirds", only four teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:

  • Western Bulldogs: Middle 6 based on Ladder / Top 6 based on MoSHBODS

  • GWS: Top 6 based on Ladder / Middle 6 based on MoSHBODS

  • Melbourne: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS

  • Essendon: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS

The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:

  • Top 6: Average +0.35 SDs / Range 0.56 SDs

  • Middle 6: Average +0.16 SDs / Range 0.84 SDs

  • Bottom 6: Average -0.51 SDs / Range 1.07 SDs

(1 SD is currently equivalent to about 22.4 points)

We can see that the Top 6 teams from the final 2024 ladder are, on average, slightly stronger than those from the Middle 6, and that those from the Middle 6 are, on average, substantially stronger than those from the Bottom 6.

Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 0.86 SDs or about 19.3 points.

With relatively large spreads in the ratings across the Middle and Bottom thirds - the equivalent of about 3 goals in the Middle 6, and 4 goals in the Bottom 6 - it's quite important which of the teams from these thirds a team plays.

VENUE PERFORMANCE VALUES

MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.

The Venue Performance Values, calculated as they would be on day 1 of the 2025 season, are summarised in the table below for all of the venues at which a team appears at least once sometime during the 2025 home-and-away season. For details on how these have been calculated, refer to this blog.

(Interestingly, GWS again play at 12 different venues in 2025, and Port Adelaide, Brisbane Lions, Gold Coast, North Melbourne, and Western Bulldogs at 10. In contrast, Carlton play at only 5 different venues, and Collingwood and West Coast at only 7.)

Venue Performance Values are, like Ratings, measured in Standard Deviations (SDs), and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a 0.40 SD better team than their underlying +0.27 SD Rating suggests when playing at Kardinia Park.

As noted earlier, the methodology includes a team’s own VPV as well as those of its opponents in the Strength calculations, because I think this better encapsulates the full venue effect. Prior to 2021, only the opponents’ VPVs were included.

To reiterate the rationale for this by way of a concrete example, imagine moving a Brisbane Lions v Melbourne game from the Gabba to Carrara, assuming that the VPV numbers in the table above apply. Under the old methodology, that would have virtually no effect on Brisbane’s estimated Strength of Schedule, because Melbourne’s VPV at both venues is about the same, at around -0.22 to -0.24 SDs, and we would ignore Brisbane’s VPVs at those venues. But Brisbane is estimated to be almost a 9-point better side at the Gabba than at Carrara - a fact which, arguably, seems worthy of inclusion in the Strength calculation. The fixture with that game at the Gabba is surely an easier one for Brisbane than the fixture with that same game at Carrara.

The main drawback that I can see from this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, North Melbourne), and decreases the estimated schedule strength for teams that have relatively high VPVs at all venues. If a team seems to enjoy no home ground advantage anywhere, is it reasonable to therefore assess them as having a more difficult schedule and, conversely, if a team seems to play relatively equally well at all venues, is it reasonable to therefore assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, again for this year, I’m answering “yes” to both those questions.

One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows, and in much of the analysis, I’ll provide the data to allow you to do exactly that.

Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate (technically, ‘out of region’) travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, all interstate teams are about 3 to 6.5 points worse. (Gold Coast are just over a two-point worse team at the Gabba, but you can’t really attribute that to the travel.)

Generally speaking, the interstate travel component of VPVs is fairly uniform across teams because:

  • only the last 8.5 years of data is included in VPV calculations

  • a team needs to have played 65 games at a venue in that 8.5 year window before the regularisation towards the default of -0.26 SDs for out of region contests is completely discontinued and the calculation becomes based solely on the team’s actual performance relative to expectation
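
As a rough illustration only - the exact functional form of the regularisation isn’t spelled out here, so the linear blend below is an assumption - the shrinkage towards the out-of-region default might look something like this:

```python
def regularised_vpv(raw_vpv, games_at_venue, default=-0.26, full_at=65):
    """Shrink a team's raw out-of-region VPV towards the default of
    -0.26 SDs until it has played `full_at` games at the venue within
    the 8.5-year window. The linear weighting is illustrative; the post
    does not specify the scheme MoSHBODS actually uses."""
    w = min(1.0, games_at_venue / full_at)
    return w * raw_vpv + (1 - w) * default
```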

The breakdown of home and away games for each team in terms of ‘in region’ or ‘out of region’ is shown in the chart below (click on it to access a larger version).

A few things stand out in this chart:

  • GWS play only 9 games in region in 2025: their 8 home games at Sydney Showground and their away clash with Sydney at the SCG

  • Gold Coast play only 10 games in region in 2025: their 9 home games at Carrara and their away clash with Brisbane Lions at the Gabba

  • Only 5 of Carlton’s and Geelong’s games are out of region for their opponents

  • Only 6 of Collingwood’s, Essendon’s, and St Kilda’s games are out of region for their opponents

  • In contrast, 14 of Port Adelaide’s games are out of region for their opponents

STRENGTH OF SCHEDULE

After performing the necessary calculations for all 23 games for every team, taking into account who each team plays as well as where, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.

(See the STRENGTH OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)

Note that the numbers shown in these tables are all aggregates and so will include either 11 or 12 games in the home and the away data, depending on the particular team’s fixture.

In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here, with Melbourne and Carlton coming nearest to being outliers because of their small and negative VPVs at the grounds where they play their home games.

Based solely on each team’s home fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. North Melbourne (+0.19)

  2. Carlton (+0.12)

  3. Collingwood (-0.32)

  4. St Kilda (-0.49)

  5. Fremantle (-2.0)

If we ignore venue effects, such is the underlying strength of the teams they face at home that Fremantle rank 1st, Sydney 2nd, Gold Coast 3rd, Geelong 4th, and Collingwood 5th. On this metric, North Melbourne would slide to 8th, Carlton to 9th, and St Kilda to 7th.

Those with the easiest home schedules (including net venue effects) are:

  1. Port Adelaide (-6.66)

  2. Brisbane Lions (-5.87)

  3. Adelaide (-5.71)

  4. GW Sydney (-5.35)

  5. West Coast (-4.49)

Here, the disproportionate impacts of interstate travel on teams’ home schedules are apparent. Brisbane Lions, for example, face interstate teams in every home game except when they play Gold Coast, Port Adelaide and Adelaide likewise except when they play each other, and West Coast likewise except when they play Fremantle.

Were we to ignore venue effects, only GW Sydney and Port Adelaide would remain in the bottom five in terms of home schedule difficulty, with Hawthorn taking 1st, Melbourne 3rd, and Western Bulldogs 5th.

The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds. That is, indeed, the case for all teams, though least of all for Western Bulldogs.

Based on their away fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. Port Adelaide (+6.10)

  2. GW Sydney (+5.38)

  3. Hawthorn (+4.30)

  4. Essendon (+4.17)

  5. West Coast (+3.99)

Ignoring venue effects would see only West Coast exiting the top five (to 6th) and see North Melbourne come in at 4th.

Those with the easiest away schedules (including net venue effects) are:

  1. Carlton (+0.81)

  2. Geelong (+1.49)

  3. Western Bulldogs (+1.64)

  4. Sydney (+1.77)

  5. Fremantle (+1.81)

Ignoring venue effects would see only Western Bulldogs exiting the top five (to 10th) and see Gold Coast come in at 3rd.

Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including net venue effects):

  • Tough Schedules: North Melbourne, Collingwood, Essendon, and St Kilda

  • Slightly Harder Schedules: Carlton, Melbourne, and Gold Coast

  • Average Schedule: Hawthorn, GW Sydney, and Fremantle

  • Slightly Easier Schedules: West Coast, Port Adelaide, Sydney, Richmond, and Western Bulldogs

  • Easy Schedules: Brisbane Lions, Geelong, and Adelaide

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out: 

  • North Melbourne, Melbourne and Gold Coast have more difficult schedules than might be expected for teams in the Bottom 6

  • Collingwood and Essendon have easier schedules than might be expected for Middle 6 teams

  • Port Adelaide, Sydney, Brisbane Lions, and Geelong have easier schedules than might be expected for Top 6 teams

To investigate whether some of these disparities might be attributable mainly to net venue effects, I have, as mentioned, included a couple of columns on the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the ratings of the teams played, ignoring venue effects).

Looked at through this lens, we see that:

  • Carlton’s and Melbourne’s fixtures appear much easier

  • Brisbane Lions’ and Port Adelaide’s fixtures appear much harder

Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 7.1 SDs, or about 159 points across the season, which is just under 7 points per game and up about 1 point per game on last year’s figure.

A 7-point advantage turns a game with an otherwise 50% victory probability into one with about a 58% probability, which converts to about 1.9 extra expected wins across a 23-game season.
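
As a sketch of that conversion, assuming game margins are roughly normally distributed with a standard deviation of around 36 points (an assumed figure; the post doesn’t state the exact conversion used):

```python
from math import erf, sqrt

def win_probability(points_advantage, sigma=36.0):
    """P(win) given a points advantage, under an assumed normal model
    for game margins with standard deviation `sigma`."""
    return 0.5 * (1 + erf(points_advantage / (sigma * sqrt(2))))

p = win_probability(7)        # roughly 0.58
extra_wins = (p - 0.5) * 23   # roughly 1.8 to 1.9 wins across a season
```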

DETAILED GAME-BY-GAME NET VPVs

The table below (click on it to access a larger version) provides a full breakdown of the Strength of Schedule calculation for every fixtured game. The top row of figures records the Combined Rating of the relevant team, and the numbers in the body of the table the Net VPVs for each contest.

So, for example, when Brisbane Lions play Sydney at home at the Gabba, the Opponent component of the SoS calculation for the Lions is the +0.24 Rating of the Swans, and the Venue component is -0.64, which is Sydney’s VPV at the Gabba of -0.25 less the Lions’ VPV of 0.39. So, the contribution of this game to the Strength of Schedule for the Lions is +0.24-0.64, which is -0.4.

If we perform this same calculation for each of the teams the Lions play at home, and then add the results, we get -5.87, which you’ll see is the same number for the Lions that’s in the earlier table.

When, instead, Brisbane Lions play Sydney away at the SCG, the Opponent component of the SoS calculation for the Lions is still the +0.24 Rating of the Swans, but the Venue component is now +0.4, which is Sydney’s VPV at the SCG of +0.14 less the Lions’ VPV of -0.26. So, the contribution of this game to the Strength of Schedule for the Lions is +0.24+0.4, which is +0.64.

If we now perform this same calculation for each of the teams the Lions play away, and then add those results, we get +3.9, which you’ll also see is the same number for the Lions that’s in the earlier table.
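
Here is that same arithmetic in code, using only the figures quoted above:

```python
rating = {"Sydney": 0.24}
vpv = {("Sydney", "Gabba"): -0.25, ("Brisbane Lions", "Gabba"): 0.39,
       ("Sydney", "SCG"): 0.14, ("Brisbane Lions", "SCG"): -0.26}

def contribution(opponent, team, venue):
    # Opponent's rating, plus opponent's VPV at the venue, less own VPV
    return rating[opponent] + vpv[(opponent, venue)] - vpv[(team, venue)]

home = contribution("Sydney", "Brisbane Lions", "Gabba")  # -0.40
away = contribution("Sydney", "Brisbane Lions", "SCG")    # +0.64
```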

STRENGTH OF MISSING SCHEDULE

We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 23 of a possible 34 games. 

We use the same metric for this as we did for the Strength of Schedule assessment, here the combined strength of the teams not met across the 24 rounds of the season, also adjusted for venue effects. We assume that all missing games will be played at the home ground most commonly used for the home team in the season proper. So, for example, all of the missing Geelong home games are played at Kardinia. Carlton, who play 6 home games at the MCG and 5 at Docklands in the season proper, are assumed to play all missing home games at the MCG (where their VPV is slightly higher).
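
A minimal sketch of that venue-assignment rule, with `home_games` a hypothetical list of the venues at which a team’s fixtured home games are played:

```python
from collections import Counter

def most_common_home_venue(home_games):
    """Return the modal venue among a team's fixtured home games, the
    ground at which its missing home games are assumed to be played."""
    return Counter(home_games).most_common(1)[0][0]

# e.g. most_common_home_venue(["MCG"] * 6 + ["Docklands"] * 5) -> "MCG"
```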

For this measure, a lower combined strength is interpreted as a less disadvantageous schedule.

The table below summarises the missing games in the 2025 Fixture, denoting with H's those games missed that would have been home games for a team, and as A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2024 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

Four of the six teams in the Bottom 3rd fail to play five of the Top 6 teams twice during the season, the exceptions being Gold Coast and Adelaide, who miss only four of the Top 6. Those two make up for this deficit by missing five of the Middle 6 teams, while their third-mates miss only four.

Also, four of the six teams in the Top 3rd fail to play only two of the Top 6 teams twice during the season, the exceptions being GW Sydney and Hawthorn, who miss three of the Top 6 (they also miss playing only four of the Bottom 3rd teams twice, whilst the remainder of the upper echelon miss playing five of the Bottom 3rd twice).

Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.

The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.

(Note that the teams are ordered here based on their final 2024 ranking.)

On this measure, Fremantle’s schedule was furthest away (in a detrimental sense) from what it would have enjoyed in an all-plays-all home-and-away fixture, Brisbane Lions’ was second-furthest, Port Adelaide’s third-furthest, and GW Sydney’s fourth-furthest.

Conversely, North Melbourne’s schedule was furthest away in a beneficial sense, followed by Richmond’s, Gold Coast’s, and West Coast’s.

As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.

There are some exceptions, however, in that Sydney’s and Geelong’s missing schedules have less impact than might be expected for a team from the Top 6, and Fremantle’s missing schedule has more impact than might be expected for a team from the Middle 6.

Because team ratings are constrained to sum to zero, adding back the effect on a team of not playing itself twice gives a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the Strength of Actual Schedule table shown earlier. In this sense, we can see that there is some relationship between a team’s Strength of Schedule and its Strength of Missing Schedule.
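
To spell out the identity, write R_j for team j’s Combined Rating, Played_i for the aggregate rating of the opponents team i actually meets across its 23 games, and Missed_i for the aggregate rating of the opponents in its missed pairings. Since an all-plays-all schedule would have team i meet every other team twice, and since the ratings sum to zero:

$$\text{Played}_i + \text{Missed}_i = 2\sum_{j \neq i} R_j = -2R_i \quad\Longrightarrow\quad \text{Missed}_i + 2R_i = -\,\text{Played}_i$$

The adjustment on the left is the “adding back the effect of not playing itself twice” step, and the right-hand side is the negative of the Aggregate Opponent Ability Only column.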

We can also account for Venue Effects in our calculations relating to Missing Games, which we do in the table below.

It produces a broadly similar ranking of the teams, with the major differences being:

  • Geelong: 2nd here but 10th above

  • Gold Coast: 9th here but 16th above

  • Carlton: 16th here but 9th above

  • Fremantle: 6th here but 1st above

  • Western Bulldogs: 11th here but 7th above

IMPACT OF THE MISSING SCHEDULE

Although the numbers in the previous section provide a way to rank and broadly quantify the effects of the truncated draw on each team, it’s hard to know what that means in practical terms.

To that end, for this analysis we will estimate the difference in the simulated probability of making the Finals between that obtained using the actual 24-round fixture and that obtained using a 35-round fixture in which all teams meet all other teams home and away. What we’re measuring here is the difference between an idealised all-plays-all season and a 24-round season modelled on the actual fixture.

(We assume here too that all missing games are played at the home team’s most common home ground in the actual season.)

For this measure, a larger difference in Finals probability (35- vs 24-round) is interpreted as a more disadvantageous schedule.
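
A minimal sketch of this kind of simulation, assuming normal margins with an assumed 36-point standard deviation and ranking the ladder by wins alone (both simplifications, for illustration only; this is not the MoS implementation):

```python
import random

def finals_probabilities(fixture, rating, vpv, n_sims=10_000,
                         sigma=36.0, finals_spots=8):
    """Estimate each team's probability of a top-`finals_spots` finish.
    `fixture` is a list of (home, away, venue) triples; ratings and VPVs
    are in SDs, converted to points at about 22.4 points per SD."""
    finals = {team: 0 for team in rating}
    for _ in range(n_sims):
        wins = {team: 0 for team in rating}
        for home, away, venue in fixture:
            expected = ((rating[home] + vpv[(home, venue)]) -
                        (rating[away] + vpv[(away, venue)])) * 22.4
            margin = random.gauss(expected, sigma)
            wins[home if margin > 0 else away] += 1
        ladder = sorted(wins, key=wins.get, reverse=True)
        for team in ladder[:finals_spots]:
            finals[team] += 1
    return {team: count / n_sims for team, count in finals.items()}

# Run once on the actual 207-game fixture and once on the 306-game
# all-plays-all fixture; the difference in each team's probability is
# the quantity reported in the table.
```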

The table at right provides information about how each team fared in the 10,000 all-plays-all simulations versus the 10,000 simulations based on the actual fixture. With samples of this size, we should treat any difference under about 1% point as possibly being explained by random variation.

With that in mind we see that Adelaide and Carlton fare best, with their chances of playing Finals about 5% points higher under the actual schedule versus an all-play-all schedule. Melbourne and Richmond also benefit non-trivially.

Conversely, Brisbane Lions and Port Adelaide are most disadvantaged, with their chances of playing Finals about 4% points lower under the actual schedule versus an all-play-all schedule. Collingwood and Hawthorn also suffer non-trivially.

SUMMARY

The table below summarises the team rankings on five of the considered metrics.

CONCLUSION

For most teams, the answer to the question ‘Relatively speaking, how difficult is your schedule?’ is at least somewhat dependent on the measure you think best defines ‘difficult’.

Forced to make at least some definitive calls, I’d go with:

  • Hard to Argue that Schedule isn’t Tough: Collingwood, Hawthorn, Port Adelaide (if you ignore VPVs), Brisbane Lions (if you ignore VPVs)

  • Likely Slightly Harder than Average Schedules: Essendon, Sydney

  • Likely Roughly Average Schedules: St Kilda, Gold Coast, GW Sydney, Fremantle

  • Likely Slightly Easier than Average Schedules: West Coast

  • Hard to Argue that Schedule is Tough: Richmond, Western Bulldogs, Adelaide, Carlton (if you ignore VPVs), Melbourne (if you ignore VPVs)

  • Hard to Be Definitive about the Schedule: North Melbourne, Geelong