Archive for the ‘other FBS sites’ Category.
The 2013 NCAA FBS record book is available here as a PDF file. From looking at the section seen below – and recalling this blog post – CFP HQ immediately saw that there were omissions. A prominent case is that of Ohio State, which was last shut out on November 20, 1993, by Michigan. Through the end of the 2012 season, its streak stood at 241. Other teams with current streaks longer than Central Michigan’s include USC, Wisconsin, Georgia, East Carolina, Texas Tech, and Kansas State. This is a great example of how the FBS database is a uniquely useful resource in the world of college football. An email message sent to the appropriate person at the NCAA has not been answered.
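A streak like this is straightforward to recompute from a game-results database. A minimal sketch in Python, with made-up scores (the function and data are illustrations, not the actual database query):

```python
# Sketch of a "games since last shut out" streak computation.
# The scores below are made up for illustration.

def games_since_shutout(points_scored):
    """points_scored: a team's points in each game, oldest first.
    Returns the current streak of consecutive games scoring > 0."""
    streak = 0
    for pts in points_scored:
        # A shutout resets the streak; any points extend it.
        streak = 0 if pts == 0 else streak + 1
    return streak

# e.g. shut out three games ago, then scored in the last two
streak = games_since_shutout([24, 17, 0, 31, 10])  # -> 2
```

Running this over each team's full game history (oldest game first) would reproduce the streak figures the record book should have listed.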
2014 UPDATE: There never was a response from the NCAA, but the image below is from the new FBS record book. (It seems that listing only the longest streak was easier than fixing the mistakes pointed out above.) Michigan’s streak was just broken when it was shut out by Notre Dame, 31-0.
| team | season 1 | W1 | L1 | W2 | L2 | W2 – W1 |
| --- | --- | --- | --- | --- | --- | --- |
NOTE: The day after this post was originally published, an item titled “Turnarounds land Tigers in SEC title game” appeared on the ESPN web site. It included a table, which can be seen in the image below. ESPN lists 8 1/2 for Hawaii, suggesting a tie, which was not possible. Apparently, ESPN credited Hawaii with half a win for beating Eastern Illinois, an FCS/I-AA team, in 1999. The number 8 for Miami (OH) is simply wrong – as you can see in the above table: 10 – 1 = 9. ESPN also failed to include South Carolina, Central Florida, and Houston, which all won 8 more FBS games in a season than in the previous season.
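The turnaround figure in the table is simply W2 – W1, and the Hawaii case shows why only wins over FBS opponents should count. A hedged sketch of that calculation, with hypothetical teams and results:

```python
# Sketch of the turnaround calculation: FBS wins in season 2 minus
# FBS wins in season 1. Wins over FCS/I-AA opponents (like Hawaii's
# 1999 win over Eastern Illinois) are excluded entirely, not counted
# as half wins. Teams and results below are made up for illustration.

def fbs_turnaround(season1_results, season2_results, fbs_teams):
    """Each results list holds (opponent, won) pairs; only wins over
    FBS opponents count toward the turnaround."""
    def fbs_wins(results):
        return sum(1 for opp, won in results if won and opp in fbs_teams)
    return fbs_wins(season2_results) - fbs_wins(season1_results)

fbs = {"X", "Y", "Z"}
s1 = [("X", True), ("Y", False), ("Z", False)]
s2 = [("X", True), ("Y", True), ("FCS U", True)]
delta = fbs_turnaround(s1, s2, fbs)  # 2 - 1 = 1; the FCS win is ignored
```

Applied to Miami (OH), this yields 10 − 1 = 9, not ESPN's 8, and the all-or-nothing treatment of FCS games rules out any half-win totals.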
Various self-proclaimed “experts” like to publish predicted BCS numbers, supposedly having the ability to accurately determine how poll voters and computers will react to a weekend’s game results. A couple of years ago, the credibility of a web site administrator took a big hit – this person actually assigned negative computer points to teams! Before the first official BCS rankings of 2010 were released, one self-described “guru” would not have been surprised if Auburn was ranked #2 after beating Arkansas. It is unknown how much surprise resulted when Auburn debuted at #4.
Poll voters generally stick to their guns. That is, they make up their minds where a team should be ranked and keep it there, only moving it down if it loses, or up if teams ahead of it lose. You can see clear evidence of this on the first-place votes page from 2009. Florida had 53 from the coaches in the preseason, and still had 53 thirteen polls later. There tends to be more fluctuation from week to week among those near the bottom of the 25 BCS-ranked teams.
The computer page of BCS Central lists properties of the six BCS systems. There was no guesswork involved – the behavior of these systems over the course of a season was carefully observed to uncover these characteristics. All six publish conference rankings, and with the exception of Wolfe, all list teams’ strength of schedule. Only one algorithm – Colley’s – has been made public. Colley also provides a mechanism for looking at how specified game results – in isolation – would affect the system’s ratings.
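Because Colley's algorithm is public, its ratings can actually be reproduced: solve Cr = b, where each diagonal entry of C is 2 plus the team's games played, each off-diagonal entry is minus the number of head-to-head games between the two teams, and b_i = 1 + (wins − losses)/2. A minimal sketch with a made-up three-team schedule (the teams and results are illustrations):

```python
# Minimal sketch of the Colley Matrix method, the one public BCS
# algorithm. Teams and game results are made up for illustration.

def colley_ratings(teams, games):
    """games: list of (winner, loser) pairs. Returns {team: rating}."""
    n = len(teams)
    idx = {t: i for i, t in enumerate(teams)}
    # C[i][i] = 2 + games played; C[i][j] = -(games between i and j)
    C = [[2.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    b = [1.0] * n                       # b_i = 1 + (wins - losses) / 2
    for w, l in games:
        wi, li = idx[w], idx[l]
        C[wi][wi] += 1
        C[li][li] += 1
        C[wi][li] -= 1
        C[li][wi] -= 1
        b[wi] += 0.5
        b[li] -= 0.5
    # Solve C r = b by Gaussian elimination with partial pivoting
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(C[r][col]))
        C[col], C[pivot] = C[pivot], C[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = C[r][col] / C[col][col]
            for c in range(col, n):
                C[r][c] -= f * C[col][c]
            b[r] -= f * b[col]
    r = [0.0] * n
    for i in range(n - 1, -1, -1):
        r[i] = (b[i] - sum(C[i][j] * r[j] for j in range(i + 1, n))) / C[i][i]
    return {t: r[idx[t]] for t in teams}

# A beat B and C; B beat C. Ratings come out 0.7, 0.5, 0.3.
ratings = colley_ratings(["A", "B", "C"], [("A", "B"), ("A", "C"), ("B", "C")])
```

A nice property visible even in this toy example: the ratings always average 0.5, and a single hypothetical result can be toggled to see its isolated effect, which is exactly the what-if mechanism Colley provides.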
An internet forum of one BCS contending team features an individual who puts a lot of time and effort into highly detailed posts about what the BCS computers’ rankings will look like. The only real value of these messages is as entertainment, since the output of six independent algorithms can in no way be forecast by the hypothetical product of one. “Intuition” is said to help make up for the ignorance of the workings of these systems, but in 2010, one set of top-25 projections of Sagarin’s rankings matched the actual numbers in only 2 cases.
2011 UPDATE: The projections are even sillier when based upon faulty data. Recently, the individual mentioned above believed he was using Massey’s BCS rankings, but neglected to exclude non-FBS teams, so he thought his favorite team was ranked several places below where it really was. Another self-proclaimed BCS know-it-all made a similar mistake, taking Sagarin’s apparent SoS ranking of Stanford at #92 at face value. (This may have been copied from ABC, which broadcast the incorrect ranking earlier.) Once again, a person neglected to exclude non-FBS teams, and also forgot that USC’s rankings did not count in the 2011 season.
2012 UPDATE: These sites have to publish something and pretend that non-factual items are relevant to college football fans, so the prognosticating continues. It seems, however, that the administrators of some other sites discovered that their SoS numbers could be proven wrong, and since they are unwilling or unable to put in the work necessary to produce accurate strength of schedule rankings, they have surrendered the task of calculating these important numbers to BCS Central.
2013 UPDATE: This one should have been in 1984 (or perhaps part of Alice’s Adventures in Wonderland). In this case, no one was able to predict the past! Jeff Sagarin changed his BCS rankings algorithm from “ELO CHESS” to something called “PURE ELO”, with the change coinciding with the first official BCS rankings. No reason was given for the change, and no explanation was offered of how the new method of ranking teams differed from the old one. (Another issue is whether this change was actually allowed under his agreement with the BCS.) FCS member Bethune-Cookman would have been #4 in Sagarin’s BCS rankings! However, Sagarin published revised rankings before the second official BCS release, and Bethune-Cookman was ranked at #67.
The bottom line is that this pair of analogies holds:
Projected BCS numbers are to official BCS ratings as astrology is to astronomy.
The self-described guru at another site does understand that rankings need to be adjusted to account for teams under sanctions. This one uses Massey’s BCS rankings – strange as they may seem this early in the season. However, in an apparent attempt to compensate for the missing rankings of Anderson & Hester and Wolfe, another number is calculated from the rankings of computers that have no connection to the BCS.
This results in several rankings that differ from what BCS Central listed. The biggest discrepancy in the top-25 occurs with Iowa State, which BCS Central ranked at #19.
See below for this other site’s ranking. (The blank “Md” cell seems to indicate a ranking below 25.) Iowa State is placed at #22 due to the effect of including non-BCS systems, resulting in a computer score of 0.36 rather than 0.64.
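For reference, the official BCS computer component works like this: each of the six systems awards 25 points for #1 down to 1 point for #25 (0 if unranked), the high and low values are dropped, and the remaining four are summed and divided by 100. A sketch with made-up rankings (the example team's six ranks are illustrations):

```python
# Sketch of the official BCS computer component. The high and low
# point values are dropped; the remaining four are summed and divided
# by 100 (the maximum possible sum of four first-place values).

def bcs_computer_score(rankings):
    """rankings: a team's six computer rankings (None if unranked).
    Rank maps to points: #1 -> 25, #2 -> 24, ..., #25 -> 1, else 0."""
    points = sorted(26 - r if r is not None and r <= 25 else 0
                    for r in rankings)
    return sum(points[1:-1]) / 100.0

# e.g. a team ranked 19, 17, 20, 22, 18, and unranked by the six computers
score = bcs_computer_score([19, 17, 20, 22, 18, None])  # -> 0.25
```

Mixing in extra non-BCS computers, as the site above does, changes both which values get dropped and the resulting sum, which is why its 0.36 cannot match the official 0.64.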
This is especially true when dealing with a team or teams under major NCAA or conference sanctions. As noted on the analysis page, three teams are definitely excluded from BCS rankings this season – one other may be, depending on the timing and result of a ruling on an appeal.
Ohio State should not be included in the final determinations of any rankings, including strength of schedule. The operator of one site doesn’t have enough “BCS know-how” to recognize the issue and deal with it. Ohio State is listed as #8 by Colley and is at #14 in Sagarin’s ELO Chess rankings. The proper way to deal with this is to move every eligible team ranked below the Buckeyes up by one.
As can be seen below, this particular site administrator failed to do that. BCS Central correctly raised LSU to #16 in Colley; Oregon’s ranking should likewise be adjusted to #8 in Colley and to #15 in Sagarin. [A separate issue is that the rankings attributed to Massey are not those used by the BCS]
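The adjustment itself is mechanical: drop the ineligible team and shift every eligible team below it up one spot. A sketch, using an abbreviated, hypothetical ranking order (only the exclusion of Ohio State reflects the actual situation):

```python
# Sketch of the sanctions adjustment described above: remove the
# ineligible team from a computer's ranking and move every eligible
# team ranked below it up one spot. The ranking order is made up.

def exclude_ineligible(ranking, ineligible):
    """ranking: teams in order #1, #2, ...; returns the ranking with
    ineligible teams removed and everyone below them shifted up."""
    return [team for team in ranking if team not in ineligible]

adjusted = exclude_ineligible(
    ["Alabama", "Ohio State", "LSU", "Oregon"], {"Ohio State"})
# -> ["Alabama", "LSU", "Oregon"]: everyone below the Buckeyes moves up one
```

Applying this to each computer's list before combining them is all it takes; failing to do so leaves every team below the excluded one ranked a spot too low.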