
Halloween Horror Story – Beware the Projectionists!

[an annual feature of the BCS Central blog]


Various self-proclaimed “experts” like to publish predicted BCS numbers, claiming the ability to determine accurately how poll voters and computers will react to a weekend’s game results. A couple of years ago, the credibility of one web site administrator took a big hit – this person actually assigned negative computer points to teams! Before the first official BCS rankings of 2010 were released, one self-described “guru” said he would not have been surprised if Auburn was ranked #2 after beating Arkansas. It is unknown how much surprise resulted when Auburn debuted at #4.

Poll voters generally stick to their guns. That is, they make up their minds where a team should be ranked and keep it there, only moving it down if it loses, or moving it up if teams ahead of it lose. You can see clear evidence of this on the first-place votes page from 2009: Florida had 53 first-place votes from the coaches in the preseason, and still had 53 thirteen polls later. There tends to be more fluctuation from week to week among those near the bottom of the 25 BCS-ranked teams.

The computer page of BCS Central lists properties of the six BCS systems. There was no guesswork involved – the behavior of these systems over the course of a season was carefully observed to uncover these characteristics. All six publish conference rankings, and with the exception of Wolfe, all list teams’ strength of schedule. Only one algorithm – Colley’s – has been made public. Colley also provides a mechanism for looking at how specified game results – in isolation – would affect the system’s ratings.
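Because Colley’s algorithm is public, anyone can reproduce it: the Colley Matrix method solves a linear system in which each team’s diagonal entry is 2 plus its games played, off-diagonal entries subtract the number of head-to-head meetings, and the right-hand side is 1 plus half the win–loss differential. Here is a minimal sketch in Python – the team names and toy schedule are invented for illustration, not real BCS data:

```python
# A minimal sketch of the Colley Matrix method (the one BCS computer
# algorithm that was made public).  The toy schedule below is made up.

def colley_ratings(teams, games):
    """games: list of (winner, loser) pairs; returns {team: rating}."""
    idx = {t: i for i, t in enumerate(teams)}
    n = len(teams)
    # Colley matrix: 2 + games played on the diagonal,
    # minus (games between i and j) off the diagonal.
    C = [[2.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    b = [1.0] * n                      # b_i = 1 + (wins_i - losses_i) / 2
    for w, l in games:
        wi, li = idx[w], idx[l]
        C[wi][wi] += 1.0; C[li][li] += 1.0
        C[wi][li] -= 1.0; C[li][wi] -= 1.0
        b[wi] += 0.5; b[li] -= 0.5
    # Solve C r = b by Gaussian elimination (C is well-conditioned).
    for i in range(n):
        for j in range(i + 1, n):
            f = C[j][i] / C[i][i]
            for k in range(n):
                C[j][k] -= f * C[i][k]
            b[j] -= f * b[i]
    r = [0.0] * n
    for i in range(n - 1, -1, -1):
        r[i] = (b[i] - sum(C[i][k] * r[k] for k in range(i + 1, n))) / C[i][i]
    return dict(zip(teams, r))

ratings = colley_ratings(["A", "B", "C"], [("A", "B"), ("A", "C"), ("B", "C")])
# With this toy round-robin the ratings come out 0.7, 0.5, 0.3.
```

Note that every team starts at 0.5 and the ratings always average to 0.5, which is why a single overlooked FCS game can ripple through the whole table.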

An internet forum of one BCS contending team features an individual who puts a lot of time and effort into highly detailed posts about what the BCS computers’ rankings will look like. The only real value of these messages is as entertainment, since the output of six independent algorithms can in no way be forecast by the guesswork of one person. “Intuition” is said to make up for ignorance of the workings of these systems, but in 2010, one set of top-25 projections of Sagarin’s rankings matched the actual numbers in only 2 cases.

2011 UPDATE: The projections are even sillier when based upon faulty data. Recently, the individual mentioned above believed he was using Massey’s BCS rankings, but neglected to exclude non-FBS teams, so he thought his favorite team was ranked several places below where it really was. Another self-proclaimed BCS know-it-all made a similar mistake, taking Sagarin’s apparent SoS ranking of Stanford at #92 at face value. (This may have been copied from ABC, which broadcast the incorrect ranking earlier.) Once again, a person neglected to exclude non-FBS teams, and also forgot that USC’s rankings did not count in the 2011 season.

2012 UPDATE: These sites have to publish something and pretend that non-factual items are relevant to college football fans, so the prognosticating continues. It seems, however, that the administrators of some other sites discovered that their SoS numbers could be proven to be wrong, and since they are unwilling or unable to put in the work necessary to produce accurate strength of schedule rankings, they have surrendered the task of calculating these important numbers to BCS Central.

2013 UPDATE: This one should have been in 1984 (or perhaps part of Alice’s Adventures in Wonderland). In this case, no one was able to predict the past! Jeff Sagarin changed his BCS rankings algorithm from “ELO CHESS” to something called “PURE ELO”, with the release coinciding with the first official BCS rankings. There was no reason given for the change, and no explanation of how the new method of ranking teams differed from the old one. (Another issue is whether this change was actually allowed under his agreement with the BCS.) FCS member Bethune-Cookman would have been #4 in Sagarin’s BCS rankings! However, Sagarin published revised rankings before the second official BCS release, and Bethune-Cookman was ranked at #67.
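Sagarin has never published either algorithm, so the actual difference between “ELO CHESS” and “PURE ELO” can only be guessed at. The “Elo” in both names refers to the classic chess rating scheme, whose basic update is simple; this sketch uses the standard chess defaults (the K-factor and sample ratings are illustrative, not Sagarin’s):

```python
# Generic Elo update -- the chess scheme that the "ELO" in Sagarin's
# system names refers to.  K-factor and scale are chess defaults, since
# Sagarin's actual parameters have never been published.

def elo_update(r_winner, r_loser, k=32):
    """Return new (winner, loser) ratings after one game."""
    expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
    delta = k * (1.0 - expected_win)   # winner gains what the loser drops
    return r_winner + delta, r_loser - delta

# An upset moves ratings more than an expected result does:
new_hi, new_lo = elo_update(1400, 1600)   # the 1400-rated underdog wins
```

The key property is that only who won matters, not the margin of victory, which is why the BCS favored Elo-style systems over points-based ones like Sagarin’s own Predictor.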

The bottom line is that this analogy holds:
BCS numbers are to official BCS ratings as astrology is to astronomy.

why you should always look for the BCS rankings here

The first official BCS rankings (of the last season of the BCS era) will be released on Sunday the 20th. Of the six BCS computer administrators, only Jeff Sagarin is allowed to publish his ratings before that official release. Ken Massey has already stated that his rankings won’t be out until the next morning. So, the weekly rankings should be available on BCS Central sometime on Mondays.

Take a look at the final BCS rankings of the 2012 season to see why the rankings here are simply the best.

  1. Unlike others, the actual computer rankings are listed for teams below #25 in any system.
  2. Win-loss records of the ranked teams are included.
  3. Starting with the second release, the rankings of the previous week are published.
  4. Uniquely, the strength of schedule of each team (detailed on a separate page) is included.
  5. The table is sortable on any column.

ranking the BCS computers

This ordering of the six BCS computer systems (and their administrators) is admittedly primarily subjective. The rankings are based on how the systems have published data this season. They’re not intended to highlight which may be more “accurate”, “fair”, or predictive of game results. Be sure to look at the computers page and the linked blog posts to get a better understanding of how each system resembles or differs from the others.


1) Anderson & Hester place all their BCS ratings, strength of schedule ratings, and conference rankings on a single web page. There is only one ranking system, and no non-FBS teams are included. This makes the numbers used for the BCS computer score easy to get, and the data tends to be published within an hour or two after the official BCS release on Sunday.

2) Billingsley places strength of schedule and conference rankings on a separate page. Only FBS teams are ranked. New data comes out soon after the official BCS rankings are released.

3) Sagarin is the only system allowed to publish computer ratings before the BCS rankings come out, and the new numbers tend to come out early each Sunday.  Sagarin’s conference rankings are on a separate page.  Ratings from both a points-based system (Predictor) and the BCS algorithm (ELO CHESS) are displayed in the same table.  FCS teams are ranked right along with FBS teams.   There is only one set of numbers for strength of schedule and a single ranking of conferences, and it’s unclear how much influence point differences have on the determination of these ratings.

4) Colley is unique in that he ranks groups of FCS teams, rather than individual ones, with FBS teams.  This makes determination of strength of schedule rankings for FBS teams very cumbersome.  Conference rankings are on a separate page.   New rankings are sometimes published fairly soon after the BCS release on Sunday, but have not come out until Monday in other weeks.

5)  Massey publishes more data and has by far the most sophisticated web site of any of the BCS computer systems.  This is a positive from the perspective of a college football fan in general, but is actually a negative in this review.   One must be extremely careful, as Massey not only has a point-based rating system and ranks lower division teams,  but it’s also quite easy to think you are looking at the BCS numbers when you are not.   Conference rankings and strength of schedule numbers can be found on the same page as the BCS rankings.  Massey rankings are released very late Sunday or early Monday.

6) Wolfe also ranks more non-FBS teams than just those in FCS, but this does not present much of a problem in getting the correct BCS rankings.  Conference rankings are on the same page as the BCS rankings, but Wolfe does not put out any strength of schedule ratings.

BCS components as predictors of bowl outcomes

In the recent bowl season, there was little variation among the polls and BCS computer systems in how often they ranked the winning bowl team higher. This is probably because there were few games in which two closely-ranked teams faced each other. Other than in three BCS bowls (BCS NCG, Rose, and Sugar), the difference between the two teams’ rankings was usually substantial.

A cell in the table below has an asterisk (*) to indicate that the component named in the column header ranked the winner of the bowl named in that row higher. In addition to the polls and the six BCS computer systems, Massey’s Margin of Victory and Sagarin’s Predictor, which take points into account, have been included.

bowl Harris Coaches A&H B C M S W MOV Predictor
BCS NCG * * * * * * *
Sugar * * * * * *
Fiesta * * * * * * * * * *
Rose * * * * * * * * * *
Orange * * * * * * * * * *
Las Vegas * * * * * * * * * *
Champs Sports *
Alamo * * * * * * * * * *
Liberty * * * * * *
Chick-fil-A *
Capital One * * * * *
Gator * * * * * * * * * *
Cotton * * * * * * * * * *
predicted outcome 10 9 10 11 10 9 10 9 8 10

discarded BCS computer rankings for 2010 and the past six seasons

In the first surprise this season, Billingsley did not have the most computer rankings thrown out for being the highest or lowest for a particular team. Instead, Sagarin had more tossed out for both reasons than any other system. In another upset, Massey, rather than Wolfe, produced the fewest rankings that were not used to determine a team’s BCS computer score.
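For context on why rankings get discarded at all: the BCS converted each of the six computer rankings into points (25 for #1 down to 1 for #25, 0 below that), dropped the single highest and single lowest values, then summed the remaining four and divided by 100. A quick sketch of that arithmetic – the example rankings here are made up:

```python
# How a team's BCS computer score was assembled, and why two of the six
# rankings get "discarded" every week.  Example rankings are invented.

def bcs_computer_score(rankings):
    """rankings: the team's rank in each of the six computer systems."""
    points = sorted(max(0, 26 - r) for r in rankings)  # 25 pts for #1 ... 1 pt for #25
    return sum(points[1:-1]) / 100.0                   # drop lowest and highest, sum, /100
```

For example, rankings of #3, #5, #4, #12, #4, and #6 become 23, 21, 22, 14, 22, and 20 points; the 23 and the 14 are thrown out, leaving 85/100 = 0.85 as the computer component.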

The computers page of BCS Central has also been updated – the table on the right now shows discarded computer rankings from 2005 through 2010.


UPDATED: the second (final?) official BCS rankings are here.

First off, this post is not intended to belittle Wes Colley. (Later reports indicate that Peter Wolfe may share responsibility.) The error would not have been discovered if Colley, like the other five BCS computer systems, had kept his algorithm secret. He deserves credit for this openness, and the incident should (but won’t) spur the others to publicize exactly how their BCS ratings are calculated. I could write once again about how this unfortunate occurrence hurts the credibility of the BCS, but it would have to have some for it to be diminished.

A single FCS game result that Colley did not originally consider produced two pairs of bad rankings. In Colley’s rankings, Boise State is now #9, LSU is #10, Nebraska is #17, and Alabama is #18. There was absolutely no impact on Alabama’s BCS standing. Nebraska’s rating increases to 0.3967, but its BCS ranking of #18 remains the same.

The significant impacts were on Boise State and LSU – they switched places in the corrected BCS rankings: Boise State is now #10 and LSU is now #11.

A screen capture of the uncorrected Colley ratings is here. The uncorrected “official” PDF is here – the “corrected” one is here (but it has even more problems: the “Rank”, “Points”, and “%” columns under “Harris Interactive” are mixed up; the “%” column under “USA Today” is actually a repeat of the points in the Harris Poll; and the headers of the other two columns do not match the data).

more – unique to this site – info about the BCS computers

Two additions have been made to the “BCS computers” page: There are now links to the blog posts that discuss each of the six systems. Additionally, there is a new table that combines the data from 2005-2009 regarding discarded computer rankings. For those who have read the blog posts relating to the BCS computer systems before, it should be no surprise that Billingsley’s rankings were tossed out far more often than any other system. Also worth noting is that when the numbers of Anderson & Hester were discarded, it was far more likely to happen because the rankings were higher, not lower, than those of the other systems. Since this new table contains information from past seasons, the “BCS computers” page is now also linked from the “BCS history” page.