If GPA is the best predictor of college success, why do colleges cling to ACT and SAT?

Georgia students saw a rise in their performance on the old SAT. (AJC File)

In preparation for applying to colleges this fall, my 17-year-old twins just took the June SAT and ACT.  So, I was interested in today’s release of a National Association for College Admission Counseling survey of 400 member campuses. The survey asked how often colleges validate the usefulness of these standardized entrance exams.

The survey found only 51 percent of the colleges conduct predictive validity studies to discover whether the tests tell them anything helpful. Yet, nearly 8 out of 10 colleges require either the ACT or SAT.

Among that 51 percent, 59 percent conduct validity studies annually; 24 percent every other year. Colleges have their own study protocols for gauging whether SAT or ACT scores mean anything.
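For readers curious what a "predictive validity study" actually computes, here is a minimal sketch in Python. It regresses first-year college GPA on high school GPA alone, then on high school GPA plus SAT, and compares the variance explained. All numbers are made up for illustration; this is not NACAC's data or any college's actual protocol.

```python
# Sketch of an incremental predictive validity study, using synthetic data.
import random

random.seed(1)

# Synthetic applicants: HS GPA, SAT score, and a first-year college GPA
# that depends on both (plus noise), so SAT carries some extra signal.
students = []
for _ in range(500):
    hs_gpa = min(4.0, max(0.0, random.gauss(3.0, 0.5)))
    sat = min(1600, max(400, random.gauss(1050, 180)))
    fy_gpa = 0.55 * hs_gpa + 0.0009 * sat + random.gauss(0, 0.35)
    students.append((hs_gpa, sat, fy_gpa))

def r_squared(xs_cols, ys):
    """R^2 of an ordinary least-squares fit, via the normal equations."""
    n, k = len(ys), len(xs_cols)
    X = [[1.0] + [col[i] for col in xs_cols] for i in range(n)]  # intercept column
    m = k + 1
    # Build X'X and X'y, then solve by Gauss-Jordan elimination with pivoting.
    XtX = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(m)] for i in range(m)]
    Xty = [sum(X[r][i] * ys[r] for r in range(n)) for i in range(m)]
    A = [row[:] + [Xty[i]] for i, row in enumerate(XtX)]
    for i in range(m):
        p = max(range(i, m), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(m):
            if r != i:
                f = A[r][i] / A[i][i]
                A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    beta = [A[i][m] / A[i][i] for i in range(m)]
    preds = [sum(b * x for b, x in zip(beta, row)) for row in X]
    mean_y = sum(ys) / n
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

hs = [s[0] for s in students]
sat = [s[1] for s in students]
fy = [s[2] for s in students]

r2_gpa_only = r_squared([hs], fy)
r2_gpa_sat = r_squared([hs, sat], fy)
print(f"R^2, HS GPA alone: {r2_gpa_only:.3f}")
print(f"R^2, HS GPA + SAT: {r2_gpa_sat:.3f}")
print(f"Incremental validity of SAT: {r2_gpa_sat - r2_gpa_only:.3f}")
```

The gap between the two R-squared values is the "incremental validity" the test adds over grades alone; a real campus study would run this on its own admitted students rather than simulated ones.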

“Some admissions offices continue to require the ACT and SAT out of habit. Others believe the tests convey ‘prestige.’ As NACAC shows, many of these institutions lack current evidence that the scores accurately forecast academic outcomes,” said FairTest Public Education Director Bob Schaeffer in a statement.

So, what do colleges believe is the most reliable determinant of how well students will fare on their campuses? “Overall, it is clear that high school grades are by far the most significant predictor of college academic achievement,” states the report.

Some other key findings in the survey:

  • Slightly more than half of responding institutions rated test scores as “considerably important” admission criteria. Public institutions and institutions with larger enrollments were more likely to give that rating.
  • The variable that generally reflects the strongest correlation with college academic achievement is the high school GPA, and almost all of the colleges interviewed used some form of it as a variable in their validity studies. In general, admission offices or researchers recalculate the averages of incoming students so they are all in the same format, usually on a 0-4.0 scale.
  • Rank in class offers the apparent advantage of compensating for differences in grading curves among secondary schools. Class rank remains in fairly common use as an admission criterion, although it has become steadily less important over the past two decades. Only two of the 11 institutions interviewed in depth for the NACAC report made use of class rank in admission decisions, and in one case only as a component of the overall academic rating.
  • While admission offices maintain a common protocol for evaluating any application, a variety of applicant cohorts receive additional attention and may in some cases be admitted below the academic threshold applied to the pool as a whole. These groups include alumni children, students who live farther from campus, international students, students representing ethnic or racial diversity, first-generation college students, students of one gender or the other, recruited athletes, students applying early decision and more.
  • Many institutions either use a rating for high school extracurricular involvement in their admission process, or subsume it within a broader “personal qualities” rating. Several institutions investigated the relationship of extracurricular activities to college academic achievement. One college found no correlation overall, although some of these students did less well than predicted. Another college found a small positive correlation between the extracurricular rating and college grades.

You can read here about a study that found no difference in college academic performance between students who submitted SAT/ACT scores and those who did not. (Students applying to public colleges — including Georgia schools — have to submit ACT or SAT scores.)

In 2009, I interviewed the authors of “Crossing the Finish Line: Completing College at America’s Public Universities.” In sifting through data on 200,000 students at 68 colleges, the researchers found that students with exemplary grades from weak high schools still graduate from college at a high rate. Their conclusion: Those impressive report cards, regardless of the high school that issued them, are the most powerful predictor of college completion rates.

 

Reader Comments
Lunaville

I think Chanda Roberts White has given us the most accurate answer. The testing industry is just that - an industry.


As for these mythical schools that hand out A's and unicorns, I am related to young people who get a zero if an assignment is one day late. Furthermore, if an assignment is complete but accidentally left in the student's locker, rather than allow the student to go to the locker and retrieve that work, teachers frequently give a zero. (Apparently, this policy is meant to teach responsibility, but what company would forbid an employee to go back to his desk for a critical report?) In fact, our neighborhood school had at least one suicide this year, and I doubt it was due to all the free A's that were handed out.


So, name these schools that hand out A's to everyone. Where are they? Is there ample housing available in the area? Do you know anyone who would like to sell their home to very, very interested buyers?



Lee_CPA2

"Why do colleges CLING to ACT and SAT?"

Cling??  Gee, I don't know.  Perhaps because they have been proven to be an effective predictor of first-year college success?  Perhaps they are an effective way to normalize among students from a wide array of high schools that may not have the same grading criteria or rigor?

Let's not stop there.  Why do law students have to take the LSAT?  Why do pharmacy students have to take the PCAT?  Why do medical students have to take the MCAT?  Why do dentists have to take the DAT?  Why do business students have to take the GMAT/GRE to get into business school?

Geez.  Let's just put everyone's name in a hat and draw names.

Chanda RobertsWhite

Money money money money. Some people got to have it. Some people really need it.

Carlos_Castillo

I'd suggest that the relative difficulty of the college has much to do with the predictive validity of the standardized tests. At a challenging school, someone who scores over 700 (96th+ percentile or so) in the verbal is highly likely to do far better than someone who gets an average score of 500.  


Also, I'm not sure that the standardized tests were ever designed to predict that someone with 600 and a straight "A" average in high school will do worse during freshman year than someone with a verbal 680 and a B- average graduating from the same high school.


Where the tests might predict trouble is with someone who has 480 scores at a not-too-great high school but has a straight "A" average.  Accepting that kid as a freshman into a savagely competitive school most likely does him or her no favors.


Does this mean that he or she can't do well in college?  No.  Kids can make up substantial gaps.  That may be more likely to happen when the student with 480s goes to a college good enough to stretch but not break him or her over two or four years.  If he or she stays diligent, the better path might be to transfer into the highly competitive school in the junior year, or stay put and aim for the big-name grad school.


The University of Texas at Austin takes the top 10% from every high school in Texas.  If it requires standardized tests for admission, then, as a reasonably difficult school, it would be a good place to compare the relative predictive ability of standardized tests and GPA.  The suburban schools surrounding big cities likely do best on the standardized tests, while inner-city and rural schools are not funded as well and generally do worse on them.


Admissions officers taking high performers from highly rated high schools aren't taking much of a risk.



MaureenDowney moderator

@Carlos_Castillo The 1997 Texas Automatic Admissions Law, which guarantees high school graduates in the top 10 percent of their class admission to the state public university of their choice, does not set any minimum SAT or ACT score. What matters is being in the top 10 percent of your class based on GPA.

The law was designed to broaden opportunities for students from rural Texas schools at the premier public institutions, which had been oversubscribed with middle-class suburban kids.

At both the University of Texas and Texas A&M, those admitted under the top-10-percent guarantee produced higher grade-point averages, higher retention rates and higher graduation rates than those not admitted under the 10 percent plan.

Lee_CPA2

@MaureenDowney @Carlos_Castillo

"At both the University of Texas and Texas A&M, those admitted under the top-10-percent guarantee produced higher grade-point averages, higher retention rates and higher graduation rates than those not admitted under the 10 percent plan."

Newsflash.  This just in, UT and TAM just discovered statistics.

MaureenDowney moderator

Here is a response to today's blog from the College Board's director of media relations, Maria Eugenia Alcón-Heraux:


The College Board, like NACAC, views predictive validity research as fundamentally important to the admission profession and process. We offer a variety of free services to support institutions in conducting their own validity research, including our online validity study service, ACES™.  We also conduct extensive national validity research that aggregates results from 160+ institutional validity studies so that we can better understand the relationship between test scores and high school grades with college outcomes. We look forward to continuing to partner with NACAC to ensure that institutions using SAT scores for admission can conduct a predictive validity study that fits their needs and supports a fair admission process.

And here’s further background:

As part of the redesign of the SAT, the College Board conducted a preliminary predictive validity study to examine the predictive validity of redesigned SAT scores with college outcomes. According to this study:

- The redesigned SAT remains as predictive of college success as the old SAT.

- Redesigned SAT scores improve the ability to predict college performance above high school GPA alone — and more so than has been shown in previous studies.

- In other words, while the SAT and high school GPA are both measures of a student’s previous academic performance that are strongly related to first year GPA in college, they also measure somewhat different aspects of academic performance and therefore complement each other in their use in college admission and the overall prediction of first year GPA.

- There is a strong, positive relationship between redesigned SAT scores and grades in matching college course domains, suggesting that the redesigned SAT is sensitive to instruction in English language arts, math, science, and history/social studies.


OriginalProf

I strongly disagree with the initial assumption that a student's GPA is the best predictor of college success, given the reality of grade inflation here in Georgia.

Don't Tread

@OriginalProf Yeah, I wouldn't put too much faith in many public school straight "A" graduates of the past 20 years or so based solely on grades.  There are too many shenanigans going on in the public schools designed to give people bonuses they didn't earn.


The ones like APS (that we now know about) are just the tip of the proverbial iceberg.

redweather

@OriginalProf While I think the HOPE scholarship has contributed to the problem of grade inflation, the student admission records I review typically show a correlation between HS GPA and SAT/ACT scores.

Wascatlady

SAT/ACT scores ADD to the predictive power of HSGPA.  That is, HSGPA is a good predictor; SAT/ACT plus HSGPA is an even better predictor.  It also helps weed out students who have a 4.0 but an 800 SAT (out of 1600).  You have to ask--how can someone be a straight A student and have such a poor SAT?  And yes, there is test anxiety, but if you are going to college you darned better find a way to deal with your test anxiety.


I had never really thought about colleges individually validating the predictive power of SAT/ACT and grades.  However, doing so, if done correctly, might assist in setting the goals for the admitted class.  One complicating factor is the push to admit groups that are underrepresented on a given campus.  For example, on Tech's campus, perhaps a 3.0 in high school might be a good predictor of success for women or some minority groups, while for white men and Asian students a 3.5 HSGPA might be a better standard.  Would different admission standards survive a lawsuit?


I had an internship in the admissions office of a large university in a state south of Georgia in the late 1980s.  I routinely sent out denial letters to white students with high GPAs and SATs of 1100, yet other candidate groups were given automatic admission with much lower scores.  I don't know if that university had done any validation of their cutoffs.  I think nowadays there is more emphasis on GRADUATING students, rather than just admitting them.


I wonder if, in states like Georgia, the SAT might be more important, given the incredible grade inflation we appear to have seen since 1993.

GB101

“Overall, it is clear that high school grades are by far the most significant predictor of college academic achievement,” states the report.


This quote from the NACAC report is a main argument in the editorial.  The sentence below is also in the report, immediately following the sentence about high school grades.


"In addition, for all the schools interviewed for this project who examined the subject, standardized testing made a significant contribution to the ability to predict college academic performance."



Maybe this answers the question of why colleges cling to the SAT and ACT.  



MaureenDowney moderator

@GB101 But the point of the survey is that 49 percent of colleges don't do anything to validate the predictive reliability of the entrance exams.

I would recommend looking at the "Crossing the Finish Line" book. It was written by William Bowen, president emeritus of Princeton University and the Andrew W. Mellon Foundation; Michael McPherson, former president of Macalester College; and Matthew Chingos, now a senior fellow at the Urban Institute.

Among their findings:

- High school grades are a far better incremental predictor of graduation rates than are standard SAT/ACT test scores.

- The strong predictive power of high school GPA holds even when we know little or nothing about the quality of the high school attended.

- In explaining the role of grades, they write: “We believe that the consistently strong performance of high school GPA as a predictor of graduation rates derives in large part from its value as a measure of motivation, perseverance, work habits, and coping skills, as well as cognitive achievements.”

GB101

@MaureenDowney I am not sure I see a problem if only half of the schools do studies to validate the predictive value of the SAT and ACT.  Would the predictive value vary to a significant degree from school to school?  Seems to me that if the tests are predictive for one college they are predictive for another.  


I am skeptical of the assertions made in your last three paragraphs.  Would Bowen advocate that Princeton accept an applicant from Therrell with an A average as readily as an applicant from Westminster with the same average?  Or would he want the university to do some checking first?  And would standardized tests be a part of that process? 


This is an interesting issue.  Thank you for writing about it.  I may do some reading about Bowen et al.

GB101

How credible is this "Fair Test" organization?  


As for grades in high school being the best predictor: how does one account for the difference in high schools' quality?  Does an A average at one of the schools identified here


http://www.ajc.com/news/news/local-education/these-are-the-georgia-schools-ranked-among-states-/nmtYs/


equal an A average at one listed here


http://www.usnews.com/education/best-high-schools/georgia ?


Furthermore, whose GPAs is NACAC considering in its study that shows grades are better predictors than SAT scores?  Is it the high school GPAs of students who were admitted?  And if so, didn't these students also take the standardized tests which in part qualified them for admission?  


In other words, does a high school student's GPA by itself predict success in college?  Or does it have predictive value only for those who are admitted to college based on grades, test scores and other factors?

redweather

In my experience evaluating students' academic records, there is a clear correlation between SAT scores and report cards. There are certainly outliers in both directions. I've seen students with a 2.3 high school GPA and SAT scores as high as 700; I've also seen a few students with 3.8 GPAs and SAT scores barely nudging 500.

As one of these studies shows, however, a strong high school academic record usually indicates that a student knows how to be a student, and that counts for a lot in college.

Kate Maloney

And why is the U.S. Presidential Scholars award still tied to the SAT as the first filter - except in the new area of CTAE Presidential Scholars?