The latest Bureau of Labor Statistics (BLS) occupational projections have been updated (as of November 2009) to reflect the Great Recession. The news is terrific for us: computing is forecast only to grow, and at an amazing rate.
“Computer and mathematical” occupations are projected to grow by the largest percentage between now and 2018 — by 22.2%. In other words, “Computer and mathematical” occupations are the fastest growing occupational cluster within the fastest growing major occupational group.
DARPA is so concerned about the lack of IT workers (and the lack of diversity among the ones it has) that it has launched a new research project to develop more, and more diverse, IT workers.
DARPA has launched a “far-out research” project to increase the number of students going into “CS-STEM” (Computer Science and Science, Technology, Engineering, and Mathematics) fields. Wired just covered this effort to address the “Geek shortage.”
What makes the Wired piece so interesting is the enormous, harsh pushback in the comments section, like the example below:
I’m 43, with a degree in software engineering, and enjoy what I do for a living. But I wouldn’t encourage my 12-year-old son to major in CS or similar, because interesting new-project development jobs are the first to disappear in a down economy, non-cutting-edge skills are easily offshored, and new hires are cheaper than retraining outdated workers.
Why get a 4-year degree for a career with a 15-year shelf life?
Are these complaints from a small but vocal group, or do they represent a large constituency? Why is there this disconnect between claims of great need and claims of no jobs? Are older IT workers no longer what industry wants? Is BLS only counting newly created jobs and not steady-state jobs? Is the IT job market constantly churning? Is industry hiring new people rather than training existing ones? It’s a real problem to argue for the need for more IT in the face of many (vocal) unemployed IT workers.
From what I've seen (in my job search for most of 2009), most employers are very picky about their new hires. They require a long list of varied skills (eight or nine talents, usually including C++, object-oriented programming, SQL, database design, and others) and are not interested in "near miss" candidates. They (apparently) want the perfect candidate -- and are willing to wait and wait and wait for him.
I don't know if this phenomenon is driven by fear of making a mistake or by the dynamics of hiring through HR and a recruiting company. Without specialized knowledge, HR and recruiters are limited to matching acronyms from job reqs to acronyms on resumes. They may view a near miss as unacceptable, limiting the candidates presented to the hiring company.
I am now almost 50 years old and almost every day I thank my father for letting me come to the US to study Computer Science. Computing is a staple diet in my life and Marketing allows me to meet people to show the value of ICT for their business and lives. I also want to work in CS/Software Engineering for another 20 years.
I am quietly teaching my two kids, aged 6 and 7 years, computing. They now use Google on their own to find free Ben 10/Barbie computer games. My son has a super memory and my daughter is sharp as a razor. I cannot predict what they will do when they reach their twenties, but I will ensure that they get a daily dose of computing. I am sure that in the future ICT will be as common as a telephone in our lives.
There is a distinction between computer science and IT/IS and computer services. I find the majority of workers are in the IT/IS domain with far fewer working in pure computer science roles.
In computer science research and development, one reason that DARPA, NSA, NIST, NASA, and other federal science agencies lack diverse and qualified computer science candidates is that the private sector and academia offer better compensation.
For the rest of the IS/IT employees working in finance, healthcare, retail, media, transportation, government, and other industry sectors, it's hard to sustain a career. Mergers, reorganizations, layoffs, age discrimination, and other business transitions offer only an unstable career path. Those of us who succeed have adaptable skills in both science and business. My experience in the field is one where newfangled programming/development skills have more value, service management can be automated to the point of requiring less experienced people, outsourcing is cost-effective and can be modified or terminated, and constant technology improvement leads to skill obsolescence. In the end, the perception of senior management is that the churn of technical resources is both desirable and essential to profit.
HR can be faulted for poor hiring practices as well. Non-technical recruiters use lists of acronyms that promote the exclusion of qualified candidates. From the '50s to the '70s, IBM was able to hire capable employees through aptitude tests and interviews. Today we have ERP systems, not people, excluding applicants because certification requirements are unmet. The crisis is not a lack of skilled labor but inflexible barriers to entry.
If given the chance, I would choose a career where experience and skills are valued, where I'm compensated for my knowledge, and where I do not fear an early end to my employment. A slow and steady computer science career in academia would have been a better choice, as I would have avoided private sector IT. At least I would be happy, tenured, employed.
Thanks for the comments! I wonder, Richard, how the Bureau of Labor Statistics is counting when they talk about "Computer and Mathematical Occupations." Do you think that they're talking about jobs in "computer science" or in "IS/IT" (to use the terms you mentioned)? I don't know, but my guess is that they're lumping all of those together.
I teach at a small college where the Computer Science major will be reduced to a minor starting next year. Enrollments had dropped, and it was hard to recruit good professors. My teaching job will likely shrink or disappear. This is very sad locally, but the good news is that I have no doubt there is a huge amount of work to do in the field worldwide. The good work that survives will be done by people with real skills, and the demand for that good work will reveal a shortage that the marketplace will eventually recognize and reward, which some day will bring students back. It's boom and bust and boom, with a cycle time measured in decades. To survive in this field you have to like the subject matter, be patient, and never stop learning.
This is a complicated issue and my comments here will only touch on a few of the finer points.
1. Add me to the disillusioned and sometimes embittered subpopulation in question. I do not accept the statistics showing a shortage of trained U.S. IT workers. (My experiences extend from Federal Civil Service to industry -- not to academia).
2. Perhaps we are seeing a phenomenon similar to under-reporting of unemployed in the larger population.
3. Perhaps it is the unstated desire of some managers to steer toward a less costly workforce, or an equally unstated (and probably unfounded) preference for more easily managed younger workers.
4. It could also be due to wide swings in private sector employment prospects (e.g., consider losses associated with banking consolidation, Microsoft's periodic "right-sizing," IBM's U.S. workforce reductions, and HP's recent announcement that it would reduce its U.S. head count by 5,000).
5. Conditions for employment lack transparency. For this reason, it's not possible to establish causal relationships between age and other factors.
6. Movements of IT professionals between unsatisfying part- or full-time consulting work and regular employment are underreported.
7. Automation enabled by off-the-shelf software means that fewer workers are needed to perform OBVIOUS computer-related IT work. Much of the low-hanging fruit can be handled by specialists in other fields who take on IT as a secondary duty. What enterprises will miss from this approach is the more challenging opportunities for computerization (especially knowledge-based product development); they won't glimpse what a seasoned professional could envision.
8. The eroded U.S. industrial base is not helping. Between the decimated manufacturing base and the preference of capital markets for mega-multinationals, there is less room for entrepreneurial innovation within traditional employment settings.
9. The checkered history of computing tends to reinforce the idea that the private sector has a myopic view of the profession's previous work. For one example, see the reference to Burroughs architectures in "The Resurgence of Parallelism," featured in the current issue of CACM. For another, consider how DOS won out over Unix-derived PC operating systems in the '70s and '80s. The tendency to reinvent is overwhelming when the perspective on software life cycles is short. The presence of fewer enterprise monoliths fosters this conundrum as well; fewer players mean fewer places to innovate, experiment, or sell management on alternative approaches.
10. Startups are sometimes a bright spot in all this, since experience matters to investors even when it's ignored by employers. But to cash in, startups must sell out to larger firms. This destroys the incubating talent pool and opportunities for teams.
11. SocNets may change all this, but most IT workers tend to operate in isolated groups. The absence of knowledge sharing across these groups, even within a vertical industry, discourages the type of connection-making that would allow individual workers to transcend the superficial limits imposed by dates on their resumes. Stated differently, although IT workers are keenly aware of the project nature of most white-collar work, excellent performance within a project is rarely visible outside a workgroup or company. Promotion outside the enterprise is rarely merit-based.
12. Contrary to the opinions of most, many seasoned IT workers are MORE flexible, more agile, and more capable of working across professional silos than their inexperienced counterparts. But for that to matter, projects that embrace multidisciplinary innovation must be nurtured, and I would argue there is less of that fuel available to burn in many enterprises.
I don't think that there are too many IT workers. But I would certainly not encourage any teenager or college student to pursue computer science as a profession. I've been in the software business for over 20 years (BS and MS). It was a great way to make a living for a long time. Now, many companies just want to get labor for pennies on the dollar. It seems that companies feel it is cheaper to get brand new talent than to retrain/upgrade the skills of their current workforce. Anyone who has been in software for a long time has learned numerous languages, operating systems, software packages, etc.
I think it is a crime that companies are looking overseas for talent when there are plenty of intelligent, hard working people in this country. How do they ever expect the unemployment rate to go lower if their mantra is outsourcing?
In reference to Kathy's comment about outsourcing: earlier this year, the company I work for announced the results of an outside evaluation of our IT operations. As many can guess, one of the results was that the company has not been using outsourced resources to the same level as the competition, and the evaluators recommended that the company increase its outsourcing. Of course, the company doing the evaluation was also quick to offer its own outsourcing services. Luckily, our company is not so quick to jump on the outsourcing bandwagon, but they have announced that they are taking a deeper look into it.
My point here is that if companies are being told that they need to outsource in order to keep up with their competitors, not all of them will take the time to fully evaluate the truth in that statement, especially if that advice is coming from a trusted source.
Interesting, Edward! Who was doing the outside evaluation? What kind of service provides that? A consultant hired to perform that task? An efficiency organization?
Ditto Mr. Fitzpatrick and Mr. Underwood - all excellent points.
The jobs just aren't there in manufacturing, nor do they seem to me to be there anywhere else.
In a perfect world, applicants wouldn't puff their resumes and employers wouldn't advertise for positions that they have no intention of hiring for.
Both are common, and both are fraud in my book. Employers should not have to decipher fact from fiction on applicant resumes. And applicants should not have to waste their time applying for positions that don't exist.
So please, don't believe the statistics regarding alleged job openings or alleged shortage of "qualified applicants". In many cases the data are being cherry-picked.