Talent Management Column

Ill-Prepared Workers?

In the past, most entry-level workers learned their skills on the job, but such training programs went away long ago. These days, most employers expect schools to prepare students for the workforce -- and then they are disappointed with the abilities of their new hires. There is probably no easy resolution to the disconnect.

Monday, August 17, 2009

The Conference Board just issued a report describing the way employers assess the abilities of newly hired entry-level workers who recently graduated from high school and two- and four-year colleges.

The findings are more nuanced than the title, The Ill-Prepared U.S. Workforce: Exploring the Challenges of Employer-Provided Workforce Readiness Training, suggests, but still, about half the employers think their new hires are inadequately prepared for work. Note that these are not job candidates, but people who have already been hired.

At the same time, a story appeared in the New York Times about the competition for unpaid internships being so great that people (by that, I mean parents) pay significant fees to get access to them. Why are they paying for the privilege of working for free, above and beyond the tuition already spent for education that is increasingly focused on getting a job?

Because the internships provide the work-based skills that future employers want and that can't easily be learned elsewhere.

There seems to be something wrong with the picture painted by these two accounts, with employers complaining that applicants aren't ready to work, while potential applicants are so desperate to improve their skills that they are paying for opportunities to practice working.

Some of the disconnect may be because the two descriptions concentrate on different segments of the labor force: The Conference Board respondents are concerned mainly -- but not exclusively -- with the lower end of the education distribution, while the New York Times story focuses on the upper end.

But there is something much bigger behind these two accounts, and it has to do with who is responsible for work-based skills. 

In the old days -- and by that, I mean before the 1980s -- employers hired applicants based on their potential to do the job. Entry-level positions were very basic, and there was no expectation that new hires were ready for the detailed training associated with more sophisticated jobs.

New hires gradually got used to the workplace. Training programs were extensive, and apprenticeships and related models were as much about teaching young people how to behave at work as they were about teaching job-specific skills.

And the new hires tended to stay with the employer for life.


Investments in new hires have shrunk dramatically, perhaps because employers no longer expect new hires to be with them for long. The expectations now are that new hires should be able to "hit the ground running" in the sense of contributing very soon, if not right away.

A little less than half the employers in this 2009 Conference Board survey provided any training for workforce readiness. If that sounds like a lot, note that the "training" seems pretty minimal: allowing employees to read materials on the intranet was the most common technique.

New-hire orientations and onboarding programs seem to qualify as workforce readiness training as well, so the actual investments in most of these programs aren't big.

The report indicated that employers that provided no such opportunities thought that it was not their responsibility to do so. So whose responsibility is it? The finger usually points to schools. But the complaints from employers were not about academic skills. Most of the gaps seen in new hires by employers cited in the 2009 Conference Board report center on behaviors such as "creativity," "ethics" and "professionalism." 


Maybe the problem is simply unrealistic expectations. Expecting to find candidates with professionalism and creativity in jobs that pay less than $10 per hour, for example, is asking a lot. As Patsy Cline once said, "People in hell want ice water: That don't mean they get it."

But there is a fundamental disconnect here. How are young people supposed to learn the work-based skills and behaviors associated with success in the workplace?

What employers want in new hires does not conflict with what most people would think are important attributes to be a good person and citizen, but schools have a hard enough time meeting their basic mission of teaching students traditional academic material.

Individuals can't learn these skills by themselves, and few can pay for entry-level job experience to get them. The best place to learn work-based skills hands down is at work. Employers have to get back to providing entry-level work skills, but we can't expect them to foot the bill for investments -- employees -- that they can't keep.

Fifteen years ago, the exact same issue was hotly debated in policy circles. That debate produced the School-to-Work Opportunities Act of 1994, which was designed to bring employers and schools closer together. Work experiences such as co-op programs and internships were coordinated with classroom lessons, so the workplace illustrated academic content and the classroom content informed what went on in the workplace.

A crucial part of these programs, though, was having employees mentor students and teach them the attitudes required to succeed on the job. The Act seemed to spur a lot of interesting programs, and a tight labor market helped as well, as companies wanted to hire the best students they saw.

That legislation and the funds to support it went away after 1999. Maybe it's time to bring back something like it.

Peter Cappelli is the George W. Taylor Professor of Management and director of the Center for Human Resources at The Wharton School.


Copyright © 2017 LRP Publications