Various self-proclaimed “experts” like to publish predicted BCS numbers, supposedly having the ability to determine accurately how poll voters and computers will react to a weekend’s game results. A couple of years ago, the credibility of a web site administrator took a big hit – this person actually assigned negative computer points to teams! Before the first official BCS rankings of 2010 were released, one self-described “guru” would not have been surprised had Auburn been ranked #2 after beating Arkansas. It is unknown how much surprise resulted when Auburn debuted at #4.
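For context, recall how the computer component is scored. Under the BCS formula as commonly described, each of the six computers contributes 25 points for a #1 ranking down to 1 point for #25 and 0 otherwise; the high and low scores are discarded, and the remaining four are summed against a 100-point maximum. Negative points are impossible by construction. A minimal Python sketch of that scoring (the function names are mine):

```python
# Sketch of the BCS computer component as commonly described:
# rank 1 earns 25 points, rank 25 earns 1 point, anything outside
# the top 25 earns 0 -- so points can never be negative.

def rank_to_points(rank):
    """Convert one computer's ranking to BCS points (25 down to 1)."""
    return max(0, 26 - rank) if rank is not None else 0

def computer_component(ranks):
    """Drop the best and worst of the six scores, then average the
    remaining four against the 100-point maximum."""
    points = sorted(rank_to_points(r) for r in ranks)
    return sum(points[1:-1]) / 100.0  # discard low and high

# Example: a team ranked 3, 4, 4, 5, 7, 30 by the six computers.
print(computer_component([3, 4, 4, 5, 7, 30]))  # 0.84
```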
Poll voters generally stick to their guns. That is, they make up their minds about where a team should be ranked and keep it there, moving it down only if it loses, or up only if teams ahead of it lose. You can see clear evidence of this on the first-place votes page from 2009: Florida received 53 first-place votes from the coaches in the preseason and still had 53 thirteen polls later. Week-to-week fluctuation tends to be greater among teams near the bottom of the BCS top 25.
The computer page of BCS Central lists properties of the six BCS systems. There was no guesswork involved – the behavior of these systems over the course of a season was carefully observed to uncover these characteristics. All six publish conference rankings, and all but Wolfe list teams’ strength of schedule. Only one algorithm – Colley’s – has been made public. Colley also provides a mechanism for examining how specified game results – in isolation – would affect the system’s ratings.
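Since Colley’s method is public, anyone can reproduce it. The sketch below builds the Colley matrix from a list of game results and solves the linear system C r = b; the team names and results are hypothetical. Re-running the solver with a single result flipped shows, in isolation, how that game moves every rating – the same kind of what-if mechanism Colley offers.

```python
import numpy as np

# Minimal sketch of Colley's published method: for each team i,
# C[i][i] = 2 + games played, C[i][j] = -(games between i and j),
# b[i] = 1 + (wins - losses) / 2, then solve C r = b.
# The game results below are hypothetical.

def colley_ratings(teams, games):
    idx = {t: i for i, t in enumerate(teams)}
    n = len(teams)
    C = 2.0 * np.eye(n)
    b = np.ones(n)
    for winner, loser in games:
        w, l = idx[winner], idx[loser]
        C[w, w] += 1.0
        C[l, l] += 1.0
        C[w, l] -= 1.0
        C[l, w] -= 1.0
        b[w] += 0.5
        b[l] -= 0.5
    return dict(zip(teams, np.linalg.solve(C, b)))

games = [("A", "B"), ("A", "C"), ("B", "C")]  # (winner, loser)
print(colley_ratings(["A", "B", "C"], games))
# {'A': 0.7, 'B': 0.5, 'C': 0.3} -- Colley ratings always average 0.5
```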
An internet forum of one BCS contending team features an individual who puts a lot of time and effort into highly detailed posts about what the BCS computers’ rankings will look like. The only real value of these messages is entertainment, since the output of six independent algorithms can in no way be forecast by one person’s hypothetical model. “Intuition” supposedly makes up for ignorance of how these systems work, but in 2010, one set of top-25 projections of Sagarin’s rankings matched the actual numbers in only two cases.
2011 UPDATE: The projections are even sillier when based upon faulty data. Recently, the individual mentioned above believed he was using Massey’s BCS rankings, but neglected to exclude non-FBS teams, so he thought his favorite team was ranked several places below where it really was. Another self-proclaimed BCS know-it-all made a similar mistake, taking Sagarin’s apparent SoS ranking of Stanford at #92 at face value. (This may have been copied from ABC, which broadcast the incorrect ranking earlier.) Once again, a person neglected to exclude non-FBS teams, and also forgot that USC’s rankings do not count in the 2011 season.
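The underlying mistake is easy to make and easy to avoid: a team’s BCS position comes from the ordering after non-FBS and ineligible teams are dropped, not from the raw list. A minimal sketch, with entirely hypothetical data:

```python
# Hedged sketch of the correct filtering; the teams and flags below
# are hypothetical. A BCS position is read from the list AFTER
# dropping non-FBS teams and teams ineligible for the standings
# (e.g., USC in 2011).

rankings = [                      # (team, is_fbs, eligible), in rating order
    ("Team A", True, True),
    ("Small College X", False, True),  # non-FBS: must be excluded
    ("Team B", True, False),           # ineligible: must be excluded
    ("Team C", True, True),
]

def bcs_position(team, rankings):
    filtered = [t for t, fbs, ok in rankings if fbs and ok]
    return filtered.index(team) + 1

print(bcs_position("Team C", rankings))  # 2 -- not 4, as the raw list implies
```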
2012 UPDATE: These sites have to publish something and pretend that non-factual items are relevant to college football fans, so the prognosticating continues. It seems, however, that the administrators of some other sites discovered that their SoS numbers could be proven wrong, and, being unwilling or unable to put in the work necessary to produce accurate strength of schedule rankings, have surrendered the task of calculating these important numbers to BCS Central.
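For readers wondering what that work involves: one traditional convention – the weighting used in the pre-2004 BCS formula – is two-thirds opponents’ winning percentage plus one-third opponents’ opponents’ winning percentage. Whether BCS Central uses exactly this method is not stated here; the sketch below, with toy data, merely illustrates the convention:

```python
# Sketch of one traditional SoS convention (the 2/3-1/3 weighting
# from the pre-2004 BCS formula). Toy data; for simplicity, games
# against the team itself are not removed from opponents' opponents.

def win_pct(record):
    wins, losses = record
    total = wins + losses
    return wins / total if total else 0.0

def sos(team, schedules, records):
    """2/3 opponents' win pct + 1/3 opponents' opponents' win pct."""
    opps = schedules[team]
    opp_pct = sum(win_pct(records[o]) for o in opps) / len(opps)
    opp_opps = [o2 for o in opps for o2 in schedules[o]]
    opp_opp_pct = sum(win_pct(records[o]) for o in opp_opps) / len(opp_opps)
    return (2 * opp_pct + opp_opp_pct) / 3

schedules = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
records = {"A": (2, 0), "B": (1, 1), "C": (0, 2)}
print(round(sos("A", schedules, records), 3))  # 0.375
```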
The bottom line is that the following analogy holds:
Projected BCS numbers are to official BCS ratings as astrology is to astronomy.