Clyde Berryman's College Football Rankings

QPRS (QUALITY POINT RATING SYSTEM) - QPRS is a mathematical formula rating system for evaluating the performance of American football teams, taking into account a team's Schedule Strength, Won-Loss record, and its Points Fielded / Points Allowed.

I am currently only able to do a final end-of-season college football ranking. My "day job" often involves long or unpredictable hours, travel, etc. Perhaps in retirement, I'll have the time to produce weekly standings of sorts.

My QPRS system is not overly complicated, but it is labor-intensive since I'm a shameful computer "dinosaur" and I do all my calculations by hand with pencil, paper, and a calculator. I rate roughly the top 65 Div 1A teams (all those with a winning or even record, plus any with particularly noteworthy "tough" schedules) to then determine my Top 40.

Clyde's "Quality Point Rating System"

QPRS - A Mathematical Formula-Based Rating System Created for
the Purpose of Ranking NCAA American College Football Teams


The Quality Point Rating System {QPRS} evolved as a result of this writer's frustrations with the week-to-week opinion poll system of ranking America's top college football teams. The media polls, whether Associated Press {AP}, United Press International {UPI}, CNN/USA Today, or other, reflect the subjectivity of their voting audiences. I always believed that a more methodical approach to rating teams, such as the methods used by parent organizations to rate chess players or tennis players, would be more desirable.

At the same time, I decided to keep the rating system simple enough that it can be performed by any individual who owns a calculator, pen, and paper, and who has access to complete football score results. Time is another factor. I find that it usually takes me a full workday {eight hours} to comfortably rate the top 20 teams of any football season using QPRS.

QPRS underwent considerable testing and revisions before I settled on its current form. Some of its rating conclusions on certain football teams will be sure to challenge long-held popular conceptions. Top teams in the general public's mind do not always fare so well while lesser known teams from particularly competitive seasons emerge at the high end of the scale. {Note: In comparing teams from different eras, I do not for one moment pretend that a team from the early years could beat its modern-day counterpart. Football tactics in college have greatly evolved while players have become more powerful, faster, and more professional than their forebears. Rather, I compare how one team did within its season against how well a later team performed against its opponents in its season.}

Unlike the polls, which often take on a narrow "who-beat-whom" focus and which seem to rely on increasingly short-term memory as the season progresses, I wanted to create a formula which would accurately measure a team's strength based on its overall season-long performance. I decided early on that to accurately gauge a team's greatness in comparison to its competitors, it would be necessary to examine how it fared in three key categories:

  1. Its overall Won-Loss-Tie record,
  2. Its Schedule Strength, and
  3. Its average margin of victory or defeat (i.e., Average Points Fielded {APF} for Offense, and Average Points Allowed {APA} for Defense)

In order to demonstrate how the Won-Loss-Tie record, Schedule Strength, and average margin of victory/defeat interact in QPRS, the reader's attention is drawn to the examples which appear below. Each involves three hypothetical teams labelled, for the sake of simplicity, Team A, Team B, and Team C. In each example, Team A is the strongest team, Team B is in the middle, and Team C is the weakest team.

EXAMPLE ONE: Differing Won-Loss-Tie Records: All factors for each of the teams remain constant except for different Won-Loss-Tie records. Each team played a 12-game season.
Team     Won-Loss-Tie   Schedule Strength   Avg Pts Fielded   Avg Pts Allowed   Overall Rating
Team A   11-1-0         40.00               20.00             10.00             347.40
Team B   10-2-0         40.00               20.00             10.00             314.00
Team C    9-3-0         40.00               20.00             10.00             280.60
EXAMPLE TWO: Differing Schedule Strengths: All factors for each of the teams remain constant except for different Schedule Strengths.
Team     Won-Loss-Tie   Schedule Strength   Avg Pts Fielded   Avg Pts Allowed   Overall Rating
Team A   10-2-0         50.00               20.00             10.00             347.40
Team B   10-2-0         40.00               20.00             10.00             314.00
Team C   10-2-0         30.00               20.00             10.00             280.60
EXAMPLE THREE: Differing Average Points Fielded on Offense: All factors for each of the teams remain constant except for different Average Points Fielded.
Team     Won-Loss-Tie   Schedule Strength   Avg Pts Fielded   Avg Pts Allowed   Overall Rating
Team A   10-2-0         40.00               30.00             10.00             340.00
Team B   10-2-0         40.00               20.00             10.00             314.00
Team C   10-2-0         40.00               10.00             10.00             270.60
EXAMPLE FOUR: Differing Average Points Allowed on Defense: All factors for each of the teams remain constant except for different Average Points Allowed.
Team     Won-Loss-Tie   Schedule Strength   Avg Pts Fielded   Avg Pts Allowed   Overall Rating
Team A   10-2-0         40.00               20.00              5.00             344.00
Team B   10-2-0         40.00               20.00             10.00             314.00
Team C   10-2-0         40.00               20.00             15.00             291.15
By transposing the above results to a graph, the reader will note that a better Won-Loss-Tie record or a more difficult schedule causes a team's overall rating to climb higher at a smooth, progressive rate. A higher Average Points Fielded, however, gains ground quickly but then tends to flatten out at the top of the curve. Conversely, a low Average Points Allowed tends to rise at the top of the curve for those truly tough defensive teams which only allowed a few points per game. In effect, this is a mild way of putting a cap on the credit a team receives for rolling up scores against already-beaten opponents while also recognizing those unyielding defenses which allow opponents few opportunities to score. {NOTE: Some mathematical formula rating systems tend to handle runaway scores rather brutally and indiscriminately by "collapsing" scores when an arbitrary point margin is achieved.}

Graph of Examples
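The curvature described above can be verified directly from the example tables. The following short Python sketch is simply my transcription of the four tables with the rating gained at each step computed; it is not the QPRS formula itself, which is not published here:

```python
# Each list holds (varied factor, Overall Rating) from the example tables.
wins = [(9, 280.60), (10, 314.00), (11, 347.40)]             # Example One
schedule = [(30.0, 280.60), (40.0, 314.00), (50.0, 347.40)]  # Example Two
apf = [(10.0, 270.60), (20.0, 314.00), (30.0, 340.00)]       # Example Three
apa = [(15.0, 291.15), (10.0, 314.00), (5.0, 344.00)]        # Example Four

def steps(points):
    """Rating gained per step as the varied factor improves."""
    return [round(b[1] - a[1], 2) for a, b in zip(points, points[1:])]

print("extra win:       ", steps(wins))      # [33.4, 33.4] - steady climb
print("schedule +10:    ", steps(schedule))  # [33.4, 33.4] - steady climb
print("offense +10 APF: ", steps(apf))       # [43.4, 26.0] - flattens out
print("defense -5 APA:  ", steps(apa))       # [22.85, 30.0] - rises at the top
```

The constant 33.4-point steps for record and schedule, versus the shrinking offensive gains and growing defensive gains, are exactly the curve shapes the graph illustrates.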

In QPRS, the Won-Loss-Tie record of a school and its Schedule Strength are inextricably linked. With regard to Schedule Strength, I look at Div 1A as being roughly divided between "normal schedule" {most major conferences, leading independents} and "weak schedule" {Big Sky, Big West, Mid-American, and to a large extent, Western Athletic}. Up until very recently {1990's}, a number of East coast teams {many now in the Atlantic Coast or Big East conferences} would from year-to-year fluctuate between the "normal" and "weak" schedule categories based on their opponents each year. Some independents cross the line from "normal" to "weak" from year-to-year, such as So. Miss., Memphis St., Army, Navy, Ea. Carolina, Louisiana Tech, etc. Some of these, such as Louisville, Cincinnati, Tulsa, and others, appear to be making a determined effort to stay in the "normal" category of late. I treat Div 1AA opponents, when they occur in Div 1A play, as a third and weaker category. Choosing the category in which a team belongs can be somewhat subjective in cases where a close call is involved. This was particularly true ten or more years ago when the greater number of eastern independents created unevenness and some tough calls. As recently as 1982, for example, Penn St. played a number of close-call "weak schedule" opponents. However, since these "weak schedule" opponents were by-and-large winners, the net result is that Penn St. still comes out credited with a fairly tough overall Schedule Strength {53.74} for 1982.

For the sake of reader interest, the previous examples using Teams A through C relate to fairly successful teams whose performances would usually put them in the upper half of most end-of-season polls or formula-based rankings. While the Won-Loss-Tie record is self-explanatory, the following is a rough guide of how to interpret QPRS ratings in the areas of Schedule Strength, Average Points Fielded, Average Points Allowed, and the respective Offense, Defense, and Overall Power Ratings:

Schedule Strength:

60s - Very Tough Schedule
50s - Tough Schedule
40s - Average Schedule
30s - Weak Schedule
20s - Very Weak Schedule

Average Points Fielded:                    Average Points Allowed:

40+ - Very High-Scoring Offense            - 5 - Very Tough Defense
30  - High-Scoring Offense                 -10 - Tough Defense
20  - Average Scoring Offense              -15 - Average Defense
10  - Below Average Scoring Offense        -20 - Below Average Defense
less than 10 - Poor Scoring Offense        -25 - Poor Defense


Offense:      Defense:      Overall:

201 - 250     181 - 225     400 - 500  =  Outstanding
151 - 200     136 - 180     300 - 399  =  Good
101 - 150      91 - 135     200 - 299  =  Average
 51 - 100      46 -  90     100 - 199  =  Below Average
  1 -  50       1 -  45       0 -  99  =  Poor
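The Overall bands above can be applied mechanically. Here is a small helper that does so; the band boundaries come straight from the guide, but the function itself is my own illustration rather than part of QPRS:

```python
def describe_overall(rating):
    """Map an Overall QPRS rating to the descriptive label from the guide."""
    if rating >= 400:
        return "Outstanding"   # 400 - 500
    if rating >= 300:
        return "Good"          # 300 - 399
    if rating >= 200:
        return "Average"       # 200 - 299
    if rating >= 100:
        return "Below Average" # 100 - 199
    return "Poor"              #   0 -  99

print(describe_overall(347.40))  # Team A from Example One -> "Good"
```

So the hypothetical 11-1-0 Team A from Example One, at 347.40, rates as a "Good" team overall rather than an "Outstanding" one.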
In sum, I hope the above explanations are useful toward understanding the QPRS NCAA College Football rating system, and that the reader will find QPRS rating results of past and present college football teams of interest.

(c) June 1994 by Clyde P. Berryman

Historical Rankings

Commentary on 2007 Season

2007 - the 'upset' season - witnessed fierce competition for the national college football title right down to the wire, making it one of the most exciting seasons in recent memory from a spectator's standpoint. From a purely ratings perspective, the fact that there were no stand-out teams with tough schedules able to dominate means that 2007 National Champion Louisiana State (12-2) has one of the lowest national champion ratings since the Washington Huskies back in 1984. In fact, 2006 Louisiana St. (11-2), which only ranked fourth in the QPRS standings last year (behind Florida, Ohio State, and Southern Cal), had an eerily close overall rating (380.123) and schedule strength (53.49) to this year's champion team. Just for historical context, I'm also attaching my QPRS American College National Champions list, which goes back to 1940 and shows that, increasingly, a team usually needed an overall rating in the 400's to become the champion. None of this year's teams broke into the QPRS Top 100 American College Football teams since 1940 list either.

The amazing upsets of this past year served to highlight just how off-target many of the pre-season polls or rankings could be. You obviously can't stop human nature, and everyone enjoys speculating and having pre-season favorites. However, I have always been against polls or rating systems which actually make use of a pre-season ranking as a starting point by assigning teams a subjective pre-season ranking order. I think we all remember a few seasons ago how Auburn missed a chance at playing in the national title game only because they started off too low in the pre-season polls. An arbitrary starting pecking order based on pre-season 'gut feelings' only serves to pollute or skew the year-long validity of a poll or ranking which incorporates such subjective information. Serious rankings should be based only on results which take place on the football field once the season is underway.

One thing which QPRS also does is 'self-correct' with regard to the difficulty of a team's opponents as the year progresses. Just as an example, when Oregon St. beat California this year, it was facing a 5-0 opponent and it got a nice bounce in the polls for defeating a highly-ranked, unbeaten team. By the time 'lowly' Stanford beat them near season's end, California had lost five times and the game went by almost unnoticed. Does Stanford deserve less credit for beating California in 2007 than Oregon State? It was the same California team - pretty much the same players, same coaches. What happened is that Oregon State was simply the first team chronologically to learn how to exploit California's vulnerabilities. These flaws were there when Tennessee, Arizona, or Oregon played them beforehand, but they simply didn't get the job done. In some polls or rankings, Oregon State would get credit for beating a highly-ranked 5-0 California (100%) while Stanford would only get credit for beating a forgotten 6-6 (50%) California which by then had dropped well out of the Top 25. So in QPRS, at year's end, both Oregon State and Stanford (and all the other teams before, after, and in-between) get the same credit for having beaten a California team with a 7-6 season record. Obviously, margins of victory will differ and may affect the ratings, but they all played against 2007 California.
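The 'self-correction' idea can be sketched in a few lines: every team that played California in 2007 is credited against California's final record (7-6), not whatever record California happened to carry on game day. The helper below is my own illustration of that principle, not an actual QPRS internal calculation:

```python
# Final 2007 W-L record for the opponent, per the discussion above.
final_records = {"California": (7, 6)}

def opponent_credit(opponent):
    """Fraction of games the opponent ultimately won over the full season.
    The same value applies to every team that faced them, regardless of
    whether the game came when they were 5-0 or 6-6."""
    wins, losses = final_records[opponent]
    return round(wins / (wins + losses), 3)

# Oregon St. (beat a then-5-0 Cal) and Stanford (beat a then-6-6 Cal)
# receive identical schedule credit: 7 / 13 of a full win's worth.
print(opponent_credit("California"))  # 0.538
```

A poll crediting Oregon State against a 5-0 record (1.000) and Stanford against a 6-6 record (0.500) would, by contrast, reward the timing of the game rather than the quality of the opponent.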


Clyde Berryman / hardbraking at hotmail dot com