2013 : Round 27 (Finals Week 4) - Results
A punter's opinion about a contest doesn't much matter unless he or she acts on it with a wager.
The same is true of MAFL Funds, so the erroneous opinions of the Head-to-Head and Line Funds about the most likely outcome of the 2013 Grand Final cost Investors nothing, save for the opportunity cost of a few missed winning wagers (which we humans heavily discount and readily ignore - and who am I to fight humanity?).
What Investors will notice is the 0.4c increment in the value of the Recommended Portfolio, which came courtesy of the Margin Fund's willingness to back the opinions of its constituent algorithms. This week, for the second week in a row, it was Bookie_9's assessment of the most likely game margin that proved correct, ratcheting the Margin Fund's return up by just over 4c to finish the season 12.4c ahead. Despite Bookie_9's strong finish to the season, it was Combo_NN2's season-long performance that made this Fund profitable: Combo_NN2 contributed 32.4c to the increased value of the Fund, while Bookie_9 stripped 20c.
MAFL's two other Funds also finished the season in profit, the Head-to-Head Fund up by 10.1c and the Line Fund up by 6.2c, making this the first season in MAFL history in which every Fund has finished in profit, the sole exception being 2006, when only a single Fund operated.
So, after 207 games in which we made 361 wagers and risked every dollar in the Portfolio a little more than twice over, Investors finished season 2013 up by 8c. That profit breaks a three-season losing streak for MAFL stretching back to 2010, and is the highest return for the Recommended Portfolio since 2007.
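For those who like the arithmetic made explicit, the distinction between return on the Portfolio and return on turnover matters here. A minimal sketch, in which the 2.1x turnover multiple is only an assumption standing in for "a little more than twice over":

```python
# Illustrative arithmetic only: the +8c season profit is from the text,
# but the 2.1x turnover multiple is an assumption standing in for
# "a little more than twice over".
bankroll = 1.00            # the Recommended Portfolio, normalised to $1
profit   = 0.08            # +8c on the season
turnover = 2.1 * bankroll  # total staked across the 361 wagers (assumed)

print(f"Return on bankroll: {profit / bankroll:.1%}")  # 8.0%
print(f"Return on turnover: {profit / turnover:.1%}")  # about 3.8%
```

In other words, an 8% return on the Portfolio translates to something under 4c of profit per dollar actually staked.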
WHICH TEAMS TO BLAME AND WHICH TO THANK
Looking firstly at our wagering from the perspective of the teams we backed in the Head-to-Head, Line or SuperMargin markets, we find that eight teams made a net positive contribution to the Portfolio while ten made a net negative contribution.
Best amongst the octet making a positive difference were Sydney, who added 10.3c to Investor wealth, entirely via the Line market, where seven of the eight wagers placed on them were successful. Next best were the Roos, who added 8c, almost exclusively through Line wagers, then Port Adelaide, who contributed 7.3c, mostly from collects on Head-to-Head wagers. Adelaide (5.9c) and Geelong (4.2c) were the other major contributors.
Losses were heaviest from wagers on Richmond, a 1 from 4 record in Line wagers contributing most of the 7.1c they stripped from the Recommended Portfolio. Then came three other teams, each roughly as destructive of Investor wealth as the next: Collingwood and Fremantle, each vaporising 4.3c of value, and St Kilda, eliminating 4.2c.
Teams, though, are also due credit for Investor gains, and bear responsibility for losses, when they're the team against which a MAFL Fund is wagering. So here's an alternative team-by-team view of the season's wagering, this time from the point of view of the team wagered against. (In truth, in the Line and Margin markets you're not so much betting on the performance of one team over the other as wagering on a result that's the shared responsibility of both.)
Viewed from this perspective, nine teams were responsible for the Recommended Portfolio's eventual profitability. Sydney, again, were the team making Investors happiest, contributing 9.5c mostly from a combination of successful Head-to-Head and Line wagers. GWS made the next-largest contribution, 6.7c, almost exclusively from Line market wagers, then Carlton, who added 5.8c about equally from Line and SuperMargin wagers. (Although the return from Carlton's SuperMargin wagers was numerically very high, Investors cared less about successes in this form of wagering this year because the Margin Fund carried a weighting in the Recommended Portfolio of only 10%.)
Losses could most be laid at the feet of Port Adelaide (6.4c, mostly due to their 0 and 4 record for Line wagers), Adelaide (5.7c, also mostly due to losing Line bets), and Melbourne (3.9c, due to a combination of Line and SuperMargin bets).
Combining these two views of the season's wagering, equally weighted, Sydney, the Kangaroos and GWS are the teams that made the greatest contributions to the Recommended Portfolio's wellbeing this season, while Fremantle, Richmond and Melbourne most imperilled it.
TIPSTERS AND PREDICTORS
Combo_NN2's decision to predict a home team victory this week saw it emerge as the outright leader and final winner in the Head-to-Head Tipster portion of the MAFL Leaderboard. It finished the season on 153.5 correct tips from 207 games (74%), one tip clear of the three WinPred-based Tipsters and a remarkable six tips clear of the usually ladder-leading BKB.
Combo_NN2 finished much further down the list of Margin Predictors, first amongst which were the two RSMP-based Predictors (about which I'll be writing a blog over on the Statistical Analyses pages in a day or two's time). Their final mean absolute prediction errors (MAPEs) of 26.4 (RSMP_Weighted) and 26.8 (RSMP_Simple) points per game are comfortably the lowest we've seen on MAFL since I started creating and following Margin Predictors.
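For readers new to the metric, MAPE is just the average absolute difference between predicted and actual game margins. A minimal sketch with invented margins (the 26.4 above, of course, comes from the full season of real games):

```python
# MAPE as used here: the average across games of
# |predicted margin - actual margin|, in points.
# The margins below are invented purely for illustration.
predicted = [12, -8, 33, 5, -20]    # predicted home-team margins
actual    = [20, -15, 40, -6, -22]  # actual home-team margins

mape = sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)
print(f"MAPE: {mape:.1f} points per game")  # 7.0 for these toy numbers
```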
RSMP_Weighted achieved its top position by finishing as the Predictor most often selecting a victory margin within 6 points of the true margin, a feat it managed in 17% of all games, and as the Predictor least often selecting a victory margin more than 7 goals from the true margin. So good, in fact, was it at avoiding the catastrophically bad prediction that only once all season was it the Predictor furthest from the final result.
All four of the top Margin Predictors provided forecasts that would have yielded a profit on line betting had they been wagered on at $1.90 prices all season, though Bookie_LPSO's profit was so small as to be effectively zero.
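The break-even hurdle at those prices is easily derived: at $1.90, a level-stake bettor must win more than 1/1.90, or about 52.6%, of wagers to profit. A small sketch of that arithmetic:

```python
# Level-stake line betting at a fixed price: a $1 wager returns $1.90
# when it wins and nothing when it loses, so profit requires a strike
# rate above 1/1.90, or roughly 52.6%.
PRICE = 1.90

def level_stake_roi(wins: int, bets: int) -> float:
    """ROI on turnover for level-stake wagering at PRICE."""
    return (wins * PRICE - bets) / bets

print(f"Break-even strike rate: {1 / PRICE:.1%}")                    # 52.6%
print(f"ROI at 55 wins from 100: {level_stake_roi(55, 100):+.1%}")   # +4.5%
```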
Amongst the Head-to-Head Probability Predictors, Bookie_LPSO maintained its number 1 ranking, a position it held for most of the season. Its final log probability score (LPS) average of 0.259 bits per game was also the best result recorded on MAFL since I started making predictions of this type and tracking their LPSs.
The Line Fund algorithm finished the year with an especially errant probability assessment, the -0.58 LPS it recorded for this week's prediction dragging its season-long average down to -0.0313 bits per game. That final figure is a little worse than the equivalent for season 2012, which was -0.027 bits per game.
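A minimal sketch of the scoring, assuming an LPS of 1 + log2(p) for a probability p assigned to the eventual winner (a definition consistent with the figures quoted here):

```python
import math

# LPS for a single game, under the assumed definition 1 + log2(p),
# where p is the probability assigned to the eventual winner. A 50%
# assessment scores 0 bits; certainty scores 1 bit; rating the eventual
# winner below 50% scores negative.
def lps(p_winner: float) -> float:
    return 1 + math.log2(p_winner)

print(f"p = 0.50  -> {lps(0.50):+.3f} bits")
print(f"p = 0.90  -> {lps(0.90):+.3f} bits")
# Working backwards, a -0.58 score implies the winner was rated only
# about a 1-in-3 chance:
print(f"p = 0.334 -> {lps(0.334):+.3f} bits")
```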
The Line Fund algorithm's relatively lacklustre LPS performance was mirrored in the 48 and 44 record of the Line Fund itself when it wagered this year. The Fund nonetheless finished with a 6% profit, mainly because it made more of its correct predictions at the time of the season when it was permitted to bet most heavily. In short, the Line Fund's success this year was more about timing than about season-long accuracy.
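A 48 and 44 record is about a 52.2% strike rate, just under the 52.6% break-even at $1.90 prices, so timing really is the whole story. Here's a toy sketch, with all prices and stake sizes assumed, of how two identical records can produce very different profits:

```python
# Toy illustration: two identical 48-44 records at assumed $1.90 prices,
# differing only in whether the wins cluster in the heavily-staked bets.
PRICE = 1.90

def net_profit(results, stakes):
    """Net profit from line bets; results are True/False per bet."""
    return sum(s * (PRICE - 1) if won else -s
               for won, s in zip(results, stakes))

# 92 bets: 46 "early" at $1 stakes, 46 "late" at $3 stakes (assumed).
stakes = [1] * 46 + [3] * 46

even_spread    = [True] * 24 + [False] * 22 + [True] * 24 + [False] * 22
late_clustered = [True] * 20 + [False] * 26 + [True] * 28 + [False] * 18

print(f"Evenly spread 48-44:  {net_profit(even_spread, stakes):+.2f}")   # -1.60
print(f"Late-weighted 48-44:  {net_profit(late_clustered, stakes):+.2f}")  # +13.60
```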
SUPERMARGIN PERFORMANCES
Five Margin Predictors selected the correct bucket in the final game of the year: Bookie_Actual (based on the handicap in the line market), the two RSMP Predictors, Bookie_3, and Bookie_9.
That failed to break the deadlock between Win_7 and ProPred_3 as the Predictors with the most correct bucket predictions in the SuperMargin market this year. They each finished the season having been right in 28 games, though Win_7's predictions produced a greater return to wagering.
Win_7 and ProPred_3 were joined by Win_3 and Combo_NN2 as the only Predictors to record positive ROIs to wagering for the entire season.
Had an Investor, like MAFL, instead followed a Margin Predictor's advice only when it foresaw a home team victory, then Combo_NN2 would have been the Predictor to follow, its predictions generating an ROI to level-stake wagering of almost 25%. Returns to Win_7 (19%) and Win_3 (15%) were also attractive.
ProPred_3 was the only Predictor generating significant returns from its SuperMargin predictions when they heralded a victory by the away team. Such predictions yielded an ROI of almost 22%.