The prestige of undergraduate colleges largely depends not upon what their students learn in college but upon what their students were like before they got to college (e.g., their SAT/ACT admissions test scores).
One nonprofit group sponsors the Collegiate Learning Assessment Plus test, which is given to freshmen and seniors to see whether their critical thinking skills improve during their time in college.
The WSJ filed freedom of information requests with a number of public colleges and got data back on 68 of them. From the Wall Street Journal:
Freshmen and seniors at about 200 colleges across the U.S. take a little-known test every year to measure how much better they get at learning to think. The results are discouraging.
At more than half of schools, at least a third of seniors were unable to make a cohesive argument, assess the quality of evidence in a document or interpret data in a table, The Wall Street Journal found after reviewing the latest results from dozens of public colleges and universities that gave the exam between 2013 and 2016.
At some of the most prestigious flagship universities, test results indicate the average graduate shows little or no improvement in critical thinking over four years.
The University of Texas at Austin had the best freshman test scores of the 68 public colleges in the WSJ database. But senior Longhorns scored worse than freshmen.
Some of the biggest gains occur at smaller colleges where students are less accomplished at arrival but soak up a rigorous, interdisciplinary curriculum.
The value-added measure seems to be biased in favor of schools with weak freshman classes, which have the most room to improve.
Here’s the WSJ’s data on the 68 colleges.
Perhaps the most impressive performance among the 68 was Cal Poly San Luis Obispo, which starts with a strong freshman class that nonetheless improves significantly over the four years, scoring the highest on the senior CLA+ test.
That seems plausible. Cal Poly SLO has a reputation as a tough, serious school for tough, serious students.
For prospective students and their parents looking to pick a college, it is almost impossible to figure out which schools help students learn critical thinking, because full results of the standardized test, called the Collegiate Learning Assessment Plus, or CLA+, are seldom disclosed to the public. This is true, too, of similar tests. …
Tests such as the CLA+ can be used to fulfill a mandate by accreditors for schools to show that they are trying to assess and improve the education they provide.
The CLA+ measures critical thinking, analytical reasoning, problem solving and writing because it demands students manipulate information and data in real-world circumstances that require different abilities. It has been lauded by a federal commission that studied higher education.
The test has detractors. It is hard to completely untangle cause and effect in something as complicated as improving critical-reasoning skills and as broad as a college education. And students don’t always try their hardest when they take the exam, since there is little at stake for them.
Colleges where students perform poorly say it is unfair to draw sweeping conclusions from a single test. They argue that students from different colleges shouldn’t be compared because freshmen have widely varying abilities. Some prestigious schools say they don’t show much improvement between the first and fourth years because their students are so accomplished when they arrive that they have little room to improve.
Or maybe UT-Austin’s students spent all four years partying on Sixth Street?
Colleges where students perform well on the test say it is an accurate gauge of their academic programs.
The CLA+ requires students to use spreadsheets, newspaper articles, research papers and other documents to answer questions, make a point or critique an argument. …
The biggest point gain came at Plymouth State University, a college in New Hampshire with about 3,600 undergraduate students. Plymouth State seniors in 2014 had an average CLA+ score of 1,185 points, which was 178 points higher than the average freshman score at Plymouth of 1,007. The school’s overall score, or “value-added score” — which includes factors such as graduation rates — put Plymouth near the top, in the 95th percentile of schools that took the test in 2014.
Plymouth State in New Hampshire presumably lets in a lot of white slacker kids and then does a good job with getting them not to slack off as much.
There is also probably a selection effect: colleges with undistinguished freshman classes like Plymouth have higher dropout rates, so their seniors are a subset of their better freshmen, whereas UT Austin probably has a low dropout rate.
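A quick simulation makes the selection effect concrete. This is a toy sketch with made-up numbers, not real CLA+ data: both hypothetical schools give every student the same true score gain, but the school that loses its weakest students before senior year appears to add far more value.

```python
import random

random.seed(0)

def simulate(dropout_rate):
    # Hypothetical freshman scores on a CLA+-like 400-1600 scale (assumed distribution).
    freshmen = [random.gauss(1000, 100) for _ in range(10000)]
    gain = 50  # identical true per-student learning gain at both schools
    # The weakest students drop out before senior year and never retake the test.
    survivors = sorted(freshmen)[int(len(freshmen) * dropout_rate):]
    senior_avg = sum(s + gain for s in survivors) / len(survivors)
    freshman_avg = sum(freshmen) / len(freshmen)
    return senior_avg - freshman_avg  # the "measured" gain

print(round(simulate(0.0)))  # low-dropout school: measured gain equals the true gain
print(round(simulate(0.3)))  # 30% dropout: measured gain is inflated well above 50
```

The inflated number reflects nothing about teaching; it is purely an artifact of comparing all freshmen against a filtered subset of seniors.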
The CLA+ is not sweeping the world of higher education. Most colleges don’t give the test, and many of the ones that do keep the results secret.
Another potential way to do value-added analyses would be to link test scores from high school, such as the SAT/ACT, to test scores among graduates, such as the GRE, LSAT, or MCAT. A third party sworn to secrecy could match scores from the different testing organizations by Social Security number. This is roughly how Raj Chetty is allowed to use IRS 1040 tax-return data: the IRS matches parents’ income in the 1990s to the 2010s income of children whose SSNs were listed as dependents on their parents’ returns in the 1990s. So Chetty can’t look up, say, Donald Trump’s income in 1999, but he probably has that number somewhere in the data he’s working with.
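The matching scheme described above can be sketched in a few lines. Everything here is hypothetical: the salt, the records, and the scores are invented for illustration, and a real linkage would need far stronger privacy controls than a salted hash. The point is just that a trusted matcher can join two organizations' records on a pseudonymous key and release only aggregates.

```python
import hashlib

# Assumption: a secret salt shared only with the neutral matching party,
# so neither testing organization can reverse the pseudonyms on its own.
SALT = b"secret-known-only-to-the-matching-party"

def pseudonym(ssn: str) -> str:
    """One-way identifier derived from an SSN; the raw SSN is never stored."""
    return hashlib.sha256(SALT + ssn.encode()).hexdigest()

# Toy records from two testing organizations (fabricated for illustration).
sat_scores = {pseudonym("123-45-6789"): 1380, pseudonym("987-65-4321"): 1210}
lsat_scores = {pseudonym("123-45-6789"): 168}

# The matcher joins on the pseudonym; only matched score pairs (or, in
# practice, aggregate statistics) would ever leave its hands.
matched = [(sat_scores[k], lsat_scores[k]) for k in sat_scores if k in lsat_scores]
print(matched)  # [(1380, 168)]
```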
Another value-added approach would be to count publicly available information on high achievers and their undergraduate colleges and compare it to the SAT/ACT scores these colleges report for their freshman classes to USNWR.
For example, Reed College, an ornery, independent-minded liberal arts college in Portland that is an unusual combination of hippie culture and tough academics, claims that it is very good at producing future college professors (although it is one of the few colleges that doesn’t cooperate with USNWR, so maybe this isn’t a good example). You can find the names and titles of most college professors in the United States online, along with where they got their bachelor’s degrees. So, does Reed produce a lot of professors relative to its freshman class size and SAT/ACT scores?
Writing code to scrape this data off professors’ CVs would be a lot of work, but it could be done. I wouldn’t be surprised if it were fairly easy to get the undergraduate colleges of everybody who has passed the bar exam or become a certified doctor; that information is probably publicly available.
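Once the scraping is done, the tally itself is trivial. Here is a minimal sketch of the final step, assuming the CVs have already been reduced to (professor, undergraduate college) pairs; all names and class-size figures below are made up for illustration.

```python
from collections import Counter

# Assumed output of a (hypothetical) CV-scraping pipeline.
cv_records = [
    ("A. Smith", "Reed College"),
    ("B. Jones", "Reed College"),
    ("C. Lee", "UT Austin"),
    ("D. Patel", "UT Austin"),
    ("E. Chen", "UT Austin"),
]
# Assumed freshman class sizes (illustrative, not actual figures).
freshman_class_size = {"Reed College": 350, "UT Austin": 8000}

profs_by_college = Counter(college for _, college in cv_records)
# The interesting number is the per-capita rate, not the raw count:
# a small college can out-produce a flagship per 1,000 freshmen.
for college, n in profs_by_college.items():
    rate = 1000 * n / freshman_class_size[college]
    print(f"{college}: {rate:.1f} professors per 1,000 freshmen")
```

A fuller version would also condition on freshman SAT/ACT scores, so a college gets credit only for out-producing what its admissions selectivity already predicts.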
One problem with this kind of backward-looking analysis is that it wouldn’t be very up to date. We could tell whether, say, Reed has graduated a lot of future professors or doctors or lawyers over the last 50 years, but the data might get sparse for smaller colleges in recent years.