Another entry in the general-skills vs. job-skills conversation, occasioned by a recent decision clarifying unpaid-internship rules. IMHO the point could be made more strongly: the shift of risk from corporate employers to individuals is a gigantic distortion in higher education and in society generally. This is from the Chronicle of Higher Education.
June 22, 2015
Business Can Pay to Train Its Own Work Force
In the spring of my senior year, I interviewed for a contract-negotiation job at a law firm.
My college major was in peace, war, and defense, which may have sounded intriguing to professional litigators. But I had no legal training. My chief assets were literacy, an eagerness to please, and a pressing need to pay rent.
The interview got right to the point. “How would you organize a thousand retransmission-consent contracts?” asked the stone-faced lawyer, looking across a conference table.
Having never heard of a retransmission-consent contract, I offered the only sensible response.
“Alphabetically?” I asked back.
This was not the right answer.
But they hired me anyway and trained me to do the job. This cost them in the short run, while I puzzled my way through FCC regulations and Nielsen ratings, but it paid off nicely over time. My contract knowledge earned the firm solid revenue.
This is how employment is supposed to work. Companies hire broadly educated workers, invest in appropriate training, and reap the profits of a specialized work force.
Increasingly, however, employers have discovered a way to offload the nettlesome cost of worker training. The trick is to relabel it as education, then complain that your prospective employees aren’t getting the right kind.
“Business leaders have doubts that higher-education institutions in the U.S. are graduating students who meet their particular businesses’ needs,” reads the first sentence of a Gallup news release issued last year. Barely a third of executives surveyed for the Lumina Foundation agreed that “higher-education institutions in this country are graduating students with the skills and competencies that my business needs.”
Bemoaning the unpreparedness of undergraduates isn’t new. Today, however, those complaints are getting a more sympathetic hearing from the policy makers who govern public higher education.
“We’ve got to adapt our education to what the marketplace needs,” Governor Pat McCrory of North Carolina said this year at a conference on innovation. “People are ready to get to work. Let’s teach them these skills as quick as possible.”
The governor spoke shortly after a panel session on “New Delivery Models for Higher Education.” Moderated by the head of the state’s chamber of commerce, the session highlighted a particularly innovative approach to education in the form of a tech start-up called Iron Yard.
Iron Yard is a for-profit code school — it teaches people how to program computers, build applications, and design websites. A 12-week course costs $12,000, promising quick proficiency in one of the tech industry’s in-demand skills.
I don’t object to this, except the part where politicians and business leaders call it a new model for higher education. It is actually a new model for worker training, one in which the workers bear the costs and risks for their own job-specific skill acquisition, while employers eagerly revise the curriculum to meet their immediate needs.
Critics of contemporary higher education lament the decline of a broad, humanistic education but often misidentify the cause. To the extent that such a curriculum is on the wane, the culprit is not ’60s-vintage faculty radicalism or political correctness run rampant, but the anxiety-driven preference for career-focused classes and majors.
Most faculty members would love to have more students delving into the classical canon — or any canon, really. But they’re up against policy makers and nervous parents who think average starting salaries are the best metric for weighing academic majors. Private-sector imperatives also threaten to dominate extracurricular time. I now work at a large public university, where I serve as a staff mentor to a cohort of freshmen. Inevitably I spend the first few weeks of the fall semester tamping down anxiety about summer internships. Students who haven’t yet cracked a textbook or met a professor worry about finding summer programs to improve their résumés.
My university recently began offering grants to low-income students who otherwise can’t afford to take internships. It’s a great program, and I’m glad we have it. But it means that academe and its donors are now responsible for subsidizing profitable companies that want future employees to have work experience but don’t want to pay students for a summer’s work. There are many ways society could choose to address the inequity of unpaid internships. Having colleges collect and distribute tax-deductible grants to the private sector’s trainees is perhaps not the most straightforward.
This blurring of the distinction between education and job-skill training isn’t simply a fight over academic priorities. It’s a fight about who pays the cost of doing business: the companies that profit, or some combination of workers and taxpayers. The more we’re willing to countenance a redefinition of job training as education, the more we ask society to shoulder what were once business expenses.
The same tension between public investment and private returns is playing out in the realm of research.
As state funding for research universities has ebbed, pressure has increased for academic institutions to more efficiently monetize their discoveries. Policy makers talk of shortening the pipeline from laboratory to marketplace, putting ever-greater emphasis on the kind of applied research that yields quick returns.
This is all perfectly fine — no one begrudges the discovery of a breakthrough drug or a valuable new material. But with finite resources on campus, more emphasis on marketable products will inevitably mean less focus on the foundational, long-range science that may not yield tangible results for decades. This has already happened in the private sector, where a relentless focus on short-term returns has crowded out spending on fundamental research. Sending universities down the same path risks eroding one of our most important bastions of basic science.
I sat through an economic-development workshop recently — “Research to Revenue” — in which a successful start-up CEO spoke with admirable bluntness about the need to keep university researchers involved in product development but off the company payroll.
“The salaries of these people are often significant,” noted the executive. “As a company, you really don’t want to take that on unless you absolutely have to.”
Of course not. Much better to let taxpayers, through colleges and federal grant dollars, pick up the tab while private-sector “partners” guide faculty efforts toward privately profitable ends. This is what a more entrepreneurial campus means, after all — a campus more attuned to profit.
“The thought now and then assails us that material efficiency and the passion to ‘get on’ in the world of things is already making it so that the liberal-arts college cannot exist,” the University of North Carolina’s president, Edward Kidder Graham, wrote in 1916. “But this is a passing phase,” he continued, advising colleges to keep their focus on creating and teaching “the true wealth of life.”
If Graham’s confident vision feels like a hopeless anachronism today, then we begin to measure the distance of our retreat. Faced with recessionary state finances and lawmakers who regard the public good as oxymoronic, university leaders have reached for the language of investment and return. The consequences of that narrow view are mounting.
Celebrating the intrinsic value of public higher education is not a nostalgic indulgence but a joyful duty. We spoke that language once; we should try it again.