Over and over I'm seeing press about the increasing demand for software developers, data analysts and IT professionals generally. The Bureau of Labor Statistics rates Network Systems and Data Analysts as the second fastest growing U.S. profession. The Bureau also expects the number of software developers to grow from just over half a million to almost 700,000. If the government says it, it must be true, right?
In my experience, demand to fill openings has clearly increased in the past few months. Certainly, the dam hasn't burst yet, and I'm sure many would question the government's optimism. But when you consider the initiatives being targeted and funded by the government, especially the health care IT initiative, the future does look rosier for IT workers.
Per a recent Computerworld article, programming and experienced IT project management will be the two most in-demand skills in the next year, with tech support and networking also ranking highly. Our recent experience validates that projection, though more heavily on the programmer side of the equation than the project manager side. That may be unique to southeast Michigan's circumstances.
But who is going to fill these jobs? Will there be enough trained and qualified workers in the workforce? Will they have to come off the golf courses and out of retirement?
Why haven't more young people come into the profession? Recently, a community college administrator lamented to me that he was seeing people well-qualified for a high-paying IT career instead enthusiastically embracing an education and life in the food industry (famous for long hours, hard work and relatively low pay). The food industry certainly is glamorized on cable TV, perhaps leading one to believe that it is all joy to reach the level of success of the top chefs and cable Food Channel hosts. Maybe that's what we need to attract bigger numbers into IT: more glamour.
It is true that college enrollments in computer-related courses are on the increase. Check out this article from earlier this year. A dean from a prominent university is quoted in the article as saying the college successfully placed 87% of its information technology graduates the previous year. The article also quotes another university source stating that the average starting salary for its computer science graduates is a healthy $72,000.
Certainly, starting salary varies by industry. However, with demand for computer science grads from the financial/insurance sector as well as defense, the promise of high compensation is a good reason to consider this profession again. High compensation is a great way to make an IT career more glamorous.
Some companies have the idea that qualified IT professionals are in abundance and that they can offer a lower salary to a candidate due to "the recession." This strategy might work for short-term hiring, but it cannot succeed for much longer. Per industry reports, the better IT talent is finding work at relatively high compensation levels, and a lower-paying employer risks losing its best technical employees to competitors.
A company can sometimes save on its salary budget by hiring an experienced IT worker who isn't necessarily The Ideal Candidate, but who has the experience, ability and willingness to learn and use new skills. Such a candidate may sacrifice salary for a career enhancement opportunity. We have seen this hiring strategy work for our customers that use it.
What about offshoring threatening IT jobs in the US? I've read plenty of reports that college enrollment dropped when offshoring came into vogue in the 1990s as an apparent cost-saving measure for US firms. Many qualified college students apparently steered clear of a profession that was being promoted as disappearing overseas. During the past few years, US firms have learned that offshoring requires a definite management and facilities commitment that far exceeds the financial and management capacities of many firms. That fact, combined with some well-publicized offshoring failures, makes it clear that the US market will support an increase in new computer science grads, even if it isn't at pre-dot-com-bust levels.
Do you need a computer science or engineering degree to make it in the profession, or can "equivalent work experience" still suffice? How valuable are certifications? I'd answer both questions by saying that the most valuable experience in the industry, above and beyond one's ability to understand and effectively apply programming languages, is the ability to integrate software components. With open source software so prevalent now in almost every development environment, the ability to make diverse technologies play nice can be the make-or-break factor in the success of a career.
Agile programming and testing skills are appearing in more job descriptions, with the perception that agile methodologies will result in faster deliverables and lower development costs. All well and good. However, the challenge for employers is finding candidates who actually have experience in an agile environment. The rate of real agile adoption is still low enough that actual job experience is rare indeed.
A vital point for computer professionals seeking employment to understand is that companies are almost universally seeking technical employees who can 1) understand the company's business and 2) communicate technical concepts to nontechnical people, both inside the company and among its customers. The days are numbered for the isolated "head down" developer. In the spirit of more "bang for the buck," IT candidates are more and more evaluated on their communication skills and business knowledge. And since we live in a global economy, it also helps candidates to have a global view.