Charles Murray has once again pointed his finger at the obvious, and asked social scientists, “Have you ever noticed that?” “Have you ever noticed that welfare gives bad incentives?” (Losing Ground.) “Have you ever noticed that some people are smarter than others?” (The Bell Curve.) Now he’s pointing his finger at higher education, and asking us: “Have you ever noticed that colleges don’t teach a lot of job skills?”
In past instances, social scientists’ first response to Murray was to wrinkle their noses in disgust: “How dare he?” But within a few years, Murray’s common-sense questions changed the way we think. Can his new book on higher education repeat his past success?
It should, but as Murray now frames his argument, it won’t. To once again profoundly change the social sciences, he’s going to need both important neglected facts and a clear story (or “model”) that explains them.
Murray is already doing well on the “important neglected facts” front, boldly pointing out that:
1. Only a tiny minority of students want or are capable of getting a liberal education.
2. For all other students, “[F]our years is ridiculous. Assuming a semester system with four courses per semester, four years of class work means thirty-two semester-long courses. The occupations that require thirty-two courses are exceedingly rare.”
3. Although students acquire few job skills in college, employers pay them extra anyway. “Yes, the wage premium for college is associated with these majors as well, but please don’t tell me it’s because employers think college augmented our human capital.”
I have been in school continuously for over three decades, and all of Murray’s observations match my experience. As a college professor, my job is to teach the material I learned when I was a student — and even I have to admit that there is only a weak connection between what I studied and what I need to know to do my job. For students who leave academia (i.e., almost all of them), the connection between what they studied and what they need to know to do their job is virtually non-existent.
So far, Murray and I are on the same page. But when he tries to explain how useless studies translate into big bucks, his story gets fuzzy. On the one hand, he tells us that “The BA really does confer a wage premium on its average recipient, but there is no good reason that it should.” On the other hand, he insists that “Employers are not stupid.” How can both be true?
Even stranger, Murray often talks as if the entire labor market were centrally planned by university committees. Perhaps I am being too literal. But it is one thing for Murray’s imaginary education task force to say, “Let’s reify the BA.” It is quite another for a task force — even an imaginary one — to say, “We will attach an economic reward to it that often has nothing to do with what has been learned.” What’s wrong with this picture? Universities don’t “attach economic rewards” to their degrees. That decision is up to millions of competing, consenting employers. Unless higher education convinces employers that workers with BAs are more productive than workers without, the BA won’t have any “economic rewards.”
Of course, employers aren’t infallible. But they have a strong incentive to see through academic hype. When firms overpay the overeducated — or needlessly “stigmatize” applicants without a BA — the market charges them for their mistake.
Another paradox in Murray’s presentation: On the one hand, he paints the BA as a force of nature. It’s powerful enough to demote high-school grads to “second-class citizens.” On the other hand, though, Murray feels that “conditions are right for change” – even though most of his “conditions” have been around for decades. Bottom line: If “a handful of key decisions could produce a tipping effect,” why haven’t they already done so?
If Murray can’t clarify his model, no one is going to take his facts seriously. Fortunately, I can help. Here’s what Murray should have said: “To a large extent, the BA is what economists call signaling. Individual students who go to college usually get a good deal; so do individual employers who pay a premium to educated workers. The problem is that this individually rational behavior is socially wasteful, because education is primarily about showing off, not acquiring job skills.”
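The "individually rational but socially wasteful" logic can be made concrete with a toy numeric model. The figures below are my own illustration, not from Murray or the review: assume college adds nothing to productivity, and that finishing a degree is cheaper (in effort and time) for abler workers, so only they find it worth acquiring.

```python
# Toy signaling model (illustrative numbers only; assumes college
# adds ZERO productivity and merely sorts workers by ability).

PRODUCTIVITY = {"able": 100_000, "less_able": 60_000}  # true annual output
DEGREE_COST = {"able": 20_000, "less_able": 50_000}    # annualized cost of finishing a BA

# Competitive employers pay each credential group its average productivity.
# If only able workers get the BA, wages separate by credential:
wage_with_ba = PRODUCTIVITY["able"]         # 100,000
wage_without_ba = PRODUCTIVITY["less_able"]  # 60,000

# Individually rational: an able worker nets more by signaling...
able_net_with_ba = wage_with_ba - DEGREE_COST["able"]  # 80,000 > 60,000
assert able_net_with_ba > wage_without_ba

# ...while a less-able worker rationally skips college:
less_net_with_ba = wage_with_ba - DEGREE_COST["less_able"]  # 50,000 < 60,000
assert less_net_with_ba < wage_without_ba

# Socially wasteful: total output is identical either way, but the
# equilibrium burns the able workers' degree costs on pure sorting.
social_waste_per_able_worker = DEGREE_COST["able"]
print(social_waste_per_able_worker)  # 20000
```

Every party behaves sensibly: the student earns more, the employer gets its money's worth. Yet nothing is produced by the degree itself, which is exactly why "employers are not stupid" and "there is no good reason" for the premium can both be true.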
At one point, of course, Murray alludes to the signaling model:
For people like us [liberal arts majors], presenting a BA to employers amounts to presenting them with a coarse indicator of our intelligence and perseverance. If we have gone to an elite college, it is mostly an indicator of what terrific students we were in high school…
Unfortunately, he doesn’t take his own pointed observation seriously. Consider: If the BA is a signal of intelligence and perseverance, why haven’t certification tests caught on? The obvious explanation is that the first people in line to take the test would have high intelligence but low perseverance. After all, if they are smart enough to do well on the test, why are they so eager to avoid college? Are they lazy or something? Consider further: If elite college is mostly a signal of terrific high school performance, why don’t employers poach the nation’s top high school grads before they set foot at Harvard? Probably because they would wind up hiring the top slackers of the Ivy League: the best students wouldn’t be looking for an easy way out.
An unfortunate implication of the signaling model is that cutting the BA down to size will be a lot harder than Murray thinks. As far as employers are concerned, the BA works. When they pay college grads more, they get their money’s worth. You can try jawboning Microsoft into switching to certification tests. But can we really believe that Murray has seen a profit opportunity that Bill Gates hasn’t?
I’m not embracing fatalism — our education system could be a lot better. If we want to get our wasteful education system back on track, though, we’ve got to make the BA less appealing. The most obvious route is to cut government spending for education. It’s just plain crazy for government to subsidize anyone who wants to signal that he’s smart and hard-working compared to other people. After all, no matter how big the subsidies are, only half of us can look better than average.
Once students (and their parents) started paying a larger share of their tuition, Murray’s dream world might stand a chance. Suppose, for example, that people really had to fork over $30,000 per year to attend college. In this environment, there would be a strong demand for certification tests, apprenticeships, and so on, because many high-quality workers wouldn’t go to college. As I often tell my students: In a world without education subsidies, they probably couldn’t afford college. Happily, though, they also wouldn’t need it.
Notes:
1. Unless they hire their own students!
2. Admittedly, the “vital and growing world of online education” is more recent. But earlier – and equally “revolutionary” – technological changes have had little impact on higher education. Consider the VCR. It seems like a great substitute for faculty: Why pay for mediocre professors when you can record the best lecturers in the world and learn in the privacy of your own home? In practice, though, it’s hard to see that the VCR has put more than a handful of professors out of work.
3. This is a standard explanation for why people with GEDs earn less than you would expect given their IQs.
Bryan Caplan is associate professor of economics at George Mason University and author of The Myth of the Rational Voter.