Do the math; Top 10 is off the mark

How's this for irony: The Waterloo Warriors likely wouldn't be a ranked team if the CIS did the kind of advanced number-crunching UW grads do so well to come up with the Top 10.

(The same could be said for the No. 8 Queen's Golden Gaels, but that's just too near and dear. Sorry.)

The 3-0 Warriors are No. 10 in this week's CIS poll by virtue of beating winless Toronto, York and Windsor (whose only win was over York, and who played without quarterback-punter Dan Lumley against the Warriors last Saturday). That points up an argument about how it would benefit the greater good -- helping people understand the game, and providing a picture of who the strong programs are -- if Canadian university football used some equivalent of RPI instead of the more subjective, "Well, Waterloo's 3-0, so they can't not be ranked."

Who says they can't be? This isn't meant as a slam on the Warriors, who are enjoying success for the first time in a while and have a great shot at the OUA playoffs. Ranking them is just totally subjective, though -- RPI cuts through the subjectivity and the "I think" factor. It doesn't just look at a team's record, but at who it played and who its opponents played: it blends a team's own winning percentage (25%), its opponents' winning percentage (50%) and its opponents' opponents' winning percentage (25%). That wouldn't help Waterloo, whose three opponents are 1-5 vs. the rest of the world.
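For the curious, that 25/50/25 blend can be sketched in a few lines of Python. The team names in the example are made up, and this skips the NCAA's refinement of excluding a team's own games when computing its opponents' records -- it's just meant to show the mechanics:

```python
# Sketch of the standard RPI blend: 25% a team's own winning
# percentage, 50% its opponents' winning percentage, 25% its
# opponents' opponents' winning percentage.
from collections import defaultdict

def rpi(team, results):
    """results: list of (winner, loser) tuples for the whole league."""
    wins = defaultdict(int)
    games = defaultdict(int)
    opponents = defaultdict(list)
    for w, l in results:
        wins[w] += 1
        games[w] += 1
        games[l] += 1
        opponents[w].append(l)
        opponents[l].append(w)

    def wp(t):
        # A team's own winning percentage.
        return wins[t] / games[t] if games[t] else 0.0

    def avg_wp(teams):
        # Average winning percentage over a list of teams.
        return sum(wp(t) for t in teams) / len(teams) if teams else 0.0

    owp = avg_wp(opponents[team])
    oowp = avg_wp([o2 for o in opponents[team] for o2 in opponents[o]])
    return 0.25 * wp(team) + 0.50 * owp + 0.25 * oowp
```

A 3-0 team that only beat doormats gets dragged down by the 50% and 25% strength-of-schedule terms, which is exactly the point.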

Here's an illustration of RPI's worth: During the past CIS basketball season, the Bob Adams CIS Sports Page's RPI rankings showed the rumours about the Carleton Ravens' loss of swagger were greatly exaggerated. The Ravens endured some January blahs -- they actually lost twice in the same week -- and slipped to No. 3 or No. 4 in the men's basketball poll (which is voted on by the coaches, not the media like football's). However, the weekly rankings at Adams' site still showed the Ravens as the No. 1 team in the country, never lower than No. 2 behind Concordia (which gained from the more limited non-conference schedule Quebec schools play in hoops).

Lo and behold, Dave Smart's team verified that at the nationals two months later. Similarly, you might recall Simon Fraser won the CIS women's basketball title last winter after failing to even reach its conference final. Well, according to Adams' site, the Clan were the second-best team in the country based on RPI, even with five losses, so it wasn't such a shock that they pulled it off.

So, since this is coming from someone who hasn't taken math since getting a 65% in OAC calculus 11 years ago (come to think of it, the two classmates who saved me in that class went on to graduate from Waterloo and Queen's), could someone out there come up with a system similar to RPI for football? It would have to take into account home-field advantage, margin of victory (up to a point), and a team's cumulative record across the current season and the previous two (to offset the small sample size and the fluctuations owing to injuries and luck).

The gut feeling is such a measuring stick would show that 3-0 or not, Waterloo and maybe Queen's aren't among the top 10 teams in the land. Granted, some people want to believe, but it's always better to know.
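A rough sketch of what such a measuring stick could look like, with every weight and cap invented purely for illustration (none of these numbers come from any real rating system):

```python
# Hypothetical football rating combining the three ingredients the
# post asks for: capped margin of victory, a home-field adjustment,
# and a multi-season blend. All constants are assumed, not sourced.

MOV_CAP = 21        # assumed cap on margin of victory
HOME_EDGE = 3.0     # assumed value of home-field advantage, in points
SEASON_WEIGHTS = [0.6, 0.25, 0.15]  # current season, then the two prior

def game_score(points_for, points_against, is_home):
    """Capped scoring margin, adjusted for where the game was played."""
    margin = max(-MOV_CAP, min(MOV_CAP, points_for - points_against))
    # A home result is discounted; a road result gets a bonus.
    return margin - HOME_EDGE if is_home else margin + HOME_EDGE

def rating(seasons):
    """seasons: newest first, each a list of (pf, pa, is_home) games."""
    total = 0.0
    for weight, games in zip(SEASON_WEIGHTS, seasons):
        if games:
            total += weight * sum(game_score(*g) for g in games) / len(games)
    return total
```

Blending in the two prior seasons smooths out the three-game sample sizes that make early-season CIS polls such guesswork.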



  1. The problem in football is the lack of out-of-conference games. RPI is useful (although not very) because teams play outside their league. Leaving aside the two games played in August, football has three closed loops (CW, OUA, QUFL-AUS), like the CHL. So you run into the same problems with ranking in football as you do in Major Junior hockey. In the end, objectivity isn't possible, so you're left with opinions as to the relative strength and depth of each league.

    We see comparisons across the top of leagues each year at Bowl & Vanier time, but not the middle and bottom. Where would Queen's place in Canada West? Is Toronto worse than SFU, Mount Allison and McGill? It's impossible to tell. (My guess? Probably, but that's just my guess.)

    Objectivity is possible in other sports because there is cross-league data to compare.

  2. I'm on the Top 10 committee, and do my darndest to make sure my list is decent. It's a difficult thing, especially early in the season, given that many teams' rosters shift quite a bit.

    And while I think it's true that the OUA gets too many spots on the list, it's difficult to compare the midlevel teams from there and CanWest when they meet up so infrequently. We heard over and over how Laurier wouldn't have a chance in 2005 against Saskatchewan in the Vanier, but it didn't quite turn out that way.

  3. Well, it could surely have a good application within each conference... I suppose that only brings into it my oft-expressed opinion that some kind of realignment with a "super-conference" in Eastern Canada would make for a better and more widely played game.

  4. The CIS is rife with horrid rankings. In men's ice hockey, the rankings are a complete farce: without fail there are 3 teams from Canada West, 3 teams from the AUS and 4 from the OUA always appearing. The selection committee is made up of media members, and regardless of record and strength of competition there is, and has been for the last 5 years, "equal representation" in the Top 10.

    It's my opinion that the CIS Top 10 for hockey should be printed on toilet paper so that it at least has some value.

  5. I had a similar complaint about the women's soccer rankings this week.

    But my question is this... Is there any sport where the Top 10 matters? Or is it just a media thing so teams can promote games (like #4 Sask at #2 Manitoba this weekend)? I can't think of a sport in the CIS that uses rankings for anything (other than seedings at CIs). You can be the #1 team in the country and still not make it out of your conference (like the football Bisons last year).

  6. You're right, it is just a media marketing tool. But it's a powerful one.

    I know The Globe and Mail otherwise wouldn't have CIS content on the Tuesday the football rankings come out, so that's a positive. Whether Waterloo receives a few 10th-place votes isn't all that important.