Unfortunately I didn’t get a chance to review the round last week (or post predictions). I’m putting this here as a brief placeholder so that the ratings are available for anyone interested.
Following a cracking weekend of finals football, I haven’t had the time I’d like to get this week’s results and predictions up. I also discovered that my finals simulation script, which I’d hurriedly written back around round 13, couldn’t actually handle real finals data being thrown at it. Nonetheless, the simulations of the remainder of the finals series are in, and I’ll combine them with my predictions later in this post!
With the home and away season over, and the controversial bye weekend behind us, it’s a nice chance to review the season performance for the inaugural plusSixOne season. Our ELO model managed to get through the last round relatively unscathed, with a total of 6 out of 9 tips. We missed on Adelaide’s terribly timed loss to West Coast, the Bombers running over the top of Carlton, and Freo getting up against a tired Bulldogs. With a few blowout games on the weekend, our mean absolute prediction error (MAPE) was a season second-worst of 37.9, suggesting that paying less attention to end-of-season games may be an improvement I could make to the ELO model in future iterations.
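For anyone wanting to check these numbers at home, the round error metric is just the average absolute gap between predicted and actual margins. A minimal sketch in Python (the margins below are made up for illustration, not real results):

```python
# Round error metric: mean of |predicted margin - actual margin|
# across the round's games. Margins here are illustrative only.
def round_mae(predicted, actual):
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

predicted = [12, -5, 30, 8]    # model's predicted margins (home team perspective)
actual = [40, 10, 55, -20]     # actual margins

print(round_mae(predicted, actual))  # 24.0
```

A couple of blowouts (like the 8 vs -20 game above) can drag the round average up quickly, which is why big-margin weekends hurt this metric.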
As we enter the final round of the year, our ELO model is coming off a pretty good round. Last week we tipped 7 out of 9 with a mean absolute prediction error (MAPE) of 20.4, which was our best MAPE outside of the bye rounds for the year. Our two misses were Richmond losing to St Kilda and Melbourne going down to Carlton.
I must admit to not catching a lot of footy over the weekend due to having Olympics fever, which is probably for the best considering my ELO model had a tough round this week! With a few upsets, and 5 games decided by a goal or less, we struggled to pick winners. The model managed to tip 5 out of 9 with a mean absolute prediction error (MAPE) of 27.7.
For the second weekend in a row, the ELO model has done well with its tips, albeit without being very accurate on margins. Round 20 saw the ELO model tip a total of 7 out of 9 with a mean absolute prediction error (MAPE) of 34.7.
The two main misses for tipping were considered upset wins. The first was the Friday night clash between Richmond and Collingwood: our model gave the Pies a slight edge at 52%, but Richmond were able to overcome the Deledio effect, winning against expectation without their star. The other big miss was Melbourne’s upset win over Hawthorn, which will have some big effects on our expectations for the remainder of the season; I’ll release our simulations tomorrow and discuss these effects. A few other games were quite a lot closer (see GWS v Gold Coast) or much bigger blowouts (see St Kilda v Carlton) than we expected, driving up our MAPE.
These results lift our season total to 124 correct tips out of 171 games (72.5%), with a MAPE of 29.4.
Our results plots show that our tipping percentage has been tracking along nicely over the last period, although our margin error has been slipping above our goal.
At the end of the season I’ll do a bit more work on this plot as a potential source of improvement for our model. For now, I’ll keep posting it for reference.
I’ve now got around to doing a bit more analysis of my results each week, and so have decided to split the AFL ELO model results and updated simulations into separate posts. The simulation post will be out shortly! Round 19 saw our ELO model do very well, as we expected from our preview, with all of our tipped teams favoured by a significant margin. The only incorrect tip was the upset win by Collingwood over West Coast, to which our model attached a 41% probability.
The round total of 8 out of 9 came with a mean absolute prediction error (MAPE) of 30.2. This gives our AFL ELO model a season total of 117 out of 162 (72.2%), with a MAPE of 29.2, just under our goal of 30!
The following two plots outline how these two metrics have tracked over the season.
Below I’ve plotted our expected margin against our actual margin. The line on the graph shows what we would expect if our model were perfect. I’ll likely use this plot in the off-season to improve the model’s performance, but will continue to report it.
Last weekend saw our ELO model have its worst week of the season, tipping at less than 50% to record 4 correct tips out of 9. It was also the 2nd worst round of the year for the margin component of the predictions, with a mean absolute error (MAE) of 36.9 points.
In 4 matches last week, our model had difficulty separating teams after the home ground advantage was taken into account, giving the tipped winners less than a 55% chance of getting over the line. Unfortunately for our model, all of those teams ended up losing (North Melbourne at 50.1%, Freo at 54%, Adelaide at 52% and Port Adelaide at 53%), with two other upsets also hurting (Essendon at 58% losing to the Lions and the Bulldogs at 70% losing to the Saints). Luckily the Swans scraped over the line against the Blues, while West Coast managed to hold on against the Demons despite losing the inside 50s handsomely.
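Probabilities like those come from converting a ratings gap, plus home ground advantage, into a win chance. A minimal sketch using the standard logistic ELO formula; note the 35-point HGA here is an illustrative assumption, not the model’s actual parameter:

```python
def win_probability(home_rating, away_rating, hga=35.0):
    """Standard logistic ELO win chance for the home side.

    hga is home ground advantage expressed in rating points; the
    value of 35 is an assumed illustration, not this model's setting.
    """
    diff = home_rating - away_rating + hga
    return 1.0 / (1.0 + 10.0 ** (-diff / 400.0))

# Two evenly rated teams: HGA alone nudges the home side past 50%.
print(round(win_probability(1500, 1500), 2))  # 0.55
```

This is why so many games between the tightly bunched top 8 sides land in that 50-55% band: the ratings gaps are smaller than the home ground advantage itself.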
Onto our rating system: a poor round of tipping gives my model more ammo for changing team ratings to try and make sense of it all. Adelaide loses its 8-week reign as our top team, handing the title back to Hawthorn after a disappointing loss to Geelong, who also benefit, gaining 13 points and jumping back into our top 4 in ratings. The biggest mover in our ratings was North Melbourne, who jumped 17 points and 2 positions to be tied for the 7th best team in the league.
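Ratings changes like these follow the usual zero-sum ELO update: the winner takes points from the loser, scaled by how surprising the result was. A minimal sketch, where the K factor of 50 and the simple win/loss result mapping are illustrative assumptions rather than the model’s actual settings:

```python
def update_ratings(rating_a, rating_b, result_a, expected_a, k=50.0):
    """Zero-sum ELO update.

    result_a is 1.0 for a team-A win, 0.5 for a draw, 0.0 for a loss;
    expected_a is team A's pre-game win probability. k (here 50) is
    an assumed illustrative value, not this model's actual K factor.
    """
    change = k * (result_a - expected_a)
    return rating_a + change, rating_b - change

# An upset: a team rated a 41% chance wins, so it gains a large chunk
# of points; the total rating points in the system stay constant.
new_a, new_b = update_ratings(1480.0, 1520.0, 1.0, 0.41)
print(round(new_a, 1), round(new_b, 1))  # 1509.5 1490.5
```

The surprise scaling is what makes a bad tipping round so noisy for the ratings: every upset moves points in proportion to how wrong the pre-game probability was.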
Our simulated season sees Hawthorn as the first team to clinch a finals berth for the year. They also strengthened their grip on securing home field advantage for the whole finals series and avoiding any travel, which could be important given some of the teams just below them. In fact, Hawthorn now finish on top in a neat 2/3 of our simulations.
The biggest loser from the weekend’s matches in terms of finals positioning appears to be Adelaide, losing 12%, 21% and 18% from their top 4, top 2 and minor premiership chances, respectively. The main beneficiary of those chances has been Hawthorn, with only Geelong’s top 4 chances (up 15%) changing significantly outside of Hawthorn’s.
North bumped up their chances of holding onto a top 8 spot by winning while Port lost, although the Saints have jumped up to be the most likely team to sneak into the 8 if North falter (currently North are at 73%, with the Saints at 22%). Port has dropped to an 11% shot to make the finals, with no other team higher than 1%.
As the season starts to heat up, we are getting some interesting games in the context of the season, as shown last week by the super important Sydney v Hawthorn game. In combination with a really tight top 8, whenever two top 8 teams are matched up, my ELO model has difficulty separating them. Nonetheless, we continued on with an OK 6 out of 9 games tipped correctly, with a respectable MAE of 22. Our 3 misses were all games that our model predicted to be close, with Sydney (54% chance), North Melbourne (56% chance) and Melbourne (54% chance) all losing. This gives us a season total of 105 tips from 144 games (73%) and an MAE of 28.6.
One note of interest: I’ve had to adjust the scale for this plot because of how badly Brisbane are performing. I might write a bit of a historical piece comparing them to other poorly performing teams.
Given the tightness of the MAE this week, we didn’t see many big jumps in our rating system. The two most notable movers are Port Adelaide and North, who swapped places this week (7th and 9th, respectively). North have now almost dropped to the level of an average team (1500) and face a big matchup against a similarly rated Collingwood this weekend. I haven’t released my full match importance ratings, but the Friday night clash between North and Collingwood is big for North’s chances of holding onto their top 8 position.
I briefly noted last week that my match importance rating was very high for the Sydney v Hawthorn game, which can be seen in the Swans’ top 1 and top 2 chances taking about a 15 point hit. Interestingly, their top 4 chances didn’t change too much (dropping from 65% to 58%), probably because North and West Coast’s chances of going on a run and getting into the top 4 are getting smaller.
The biggest drop from the weekend was easily North Melbourne, whose chances for top 8 dropped from 76% to 62%. The main beneficiaries of that were Port Adelaide, jumping from 20% to 31%, and St Kilda, who are now into double figures. We’ve also seen Freo fall out of the race for the top 8, at least in my simulations – across 10,000 runs they don’t make it once.
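Numbers like these come out of a Monte Carlo season simulation: play the remaining fixture many times, sampling each result from the model’s win probabilities, and count the finishes. A heavily simplified sketch; the team names, win counts, fixture and probabilities below are made up, and the real model would also update ratings between simulated rounds and use percentage rather than random tie-breaks:

```python
import random
from collections import Counter

def simulate_season(wins, fixture, n_sims=10_000, seed=1):
    """Monte Carlo over a remaining fixture.

    wins: current win count per team; fixture: (home, away, home_win_prob)
    tuples. Returns each team's share of simulations in which it finishes
    with the most wins (ties broken at random here, ignoring percentage).
    """
    rng = random.Random(seed)
    tops = Counter()
    for _ in range(n_sims):
        sim = dict(wins)
        for home, away, p_home in fixture:
            winner = home if rng.random() < p_home else away
            sim[winner] += 1
        best = max(sim.values())
        leaders = [t for t, w in sim.items() if w == best]
        tops[rng.choice(leaders)] += 1
    return {t: tops[t] / n_sims for t in wins}

# Toy example: three teams, two games left (all values illustrative).
wins = {"Hawthorn": 15, "Adelaide": 14, "Sydney": 14}
fixture = [("Hawthorn", "Adelaide", 0.55), ("Sydney", "Hawthorn", 0.50)]
print(simulate_season(wins, fixture))
```

Chances of top 8, top 4 and so on fall out the same way: instead of counting first-place finishes, count how often a team’s final ladder position lands inside the relevant cutoff.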
For our first week out of the bye rounds and back to a full 9 games, my ELO model had a pretty good week, tipping 7 out of 9 with an MAE of 24. The incorrect tips were GWS losing to Collingwood (a 29% chance) and Geelong losing to Sydney (a 23% chance). The season total is up to 99 tips from 135 games (73%) and an MAE of 29.0.
Those two upset losses saw the biggest changes in ELO ratings for the round, with Geelong losing 21 rating points (to Sydney) and GWS losing 26 rating points (to Collingwood). That saw both Sydney and West Coast leap above those two teams into 3rd and 4th, respectively. Collingwood have also jumped up to be an almost ‘average’ team.
I mentioned last week that the Geelong v Sydney game had our biggest match importance rating since I started reporting it. That can be seen in the new simulated ladder, where Geelong has dropped in top 4, top 2 and top 1 contention by between 8 and 16 percentage points. The model now gives them only a 5% chance of finishing on top. In contrast, Sydney has jumped up by 20 percentage points in the top 4 calculations, with some smaller increases in top 2 and top 1 chances.
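One simple way to score a game’s importance is exactly what drives these post-game jumps: run the simulations conditional on each possible result and measure the swing in a team’s chances between the two branches. This is an illustrative definition, not necessarily the blog’s exact formula:

```python
def match_importance(p_outcome_if_win, p_outcome_if_loss):
    """Swing in a simulated chance (e.g. finishing top 4) between
    the win branch and the loss branch of a single game. A bigger
    swing means the game matters more for that team's season.
    Illustrative definition, not necessarily the model's exact one.
    """
    return abs(p_outcome_if_win - p_outcome_if_loss)

# e.g. a team whose simulated top 4 chance is 70% with a win
# but only 45% with a loss: a 25 percentage point swing.
print(round(match_importance(0.70, 0.45), 2))  # 0.25
```

Games between two closely rated top 8 sides tend to score highest, since both branches are live and the ladder consequences differ sharply.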
The most important game of the weekend was certainly the Sydney v Bulldogs game, as would be expected of a matchup between two top 4 sides. The last-minute goal to Jason Johannisen saw the Bulldogs leapfrog the Swans on our simulated season ladder. The Swans’ top 4 aspirations took a hit, dropping from 54% down to 44%, while the Bulldogs’ top 4 chances jumped from 34% to 46%. Given the tightness of the actual AFL ladder this season, there should be a few more of these big impact games coming up!
Hawthorn has firmed considerably in the minor premiership race, finishing on top in just over 1/3 of simulations, with Adelaide remaining the next best chance. Both teams also firmed slightly in top 4 and top 2 chances after the weekend.
Port’s loss saw its chances of the top 8 fall again – they are now 3 wins outside of the 8 – down to 20%. By far and away the most likely to drop out is now North Melbourne, missing the top 8 in 24% of simulations.