Mainframe Skills Shortage Could Hinder Big Data

With all the attention paid to unemployment rates in recent years, it’s easy to overlook that some IT jobs go unfilled on a regular basis because of a lack of skills. That’s right, even at the height of the Great Recession, some IT positions remained vacant.

One area in particular where it is getting harder and harder to find the requisite skills is mainframe computing. Mainframers have warned for years that a disproportionate number in their ranks are nearing retirement age and that the industry hasn't minted enough new mainframe experts to fill the vacuum.

A recent Computerworld study found that 22 percent of COBOL programmers are 55 and older, while 50 percent are between the ages of 45 and 55. If most of them retire by the age of 65, as is common practice, they will take a lot of institutional knowledge with them unless they get a chance to transfer it to younger colleagues. That knowledge includes the strategy behind the original implementations, the understanding and methodologies developed over decades, and the technical management skills that have kept these IT environments reliable and available as they process the data that delivers the intelligence the business needs.

A mass exodus from the mainframe ranks could affect a number of computing trends, not the least of which is Big Data.

The potential impact on Big Data has received no noticeable attention – or at least no one seems to be talking about it publicly. While IT professionals recognize that the mainframe skills shortage could have deleterious effects on enterprise computing, no one seems to be making the link to Big Data.

A Vanson Bourne survey of 520 CIOs in late 2011 found that 71 percent of respondents had concerns about the business implications of the skills shortage. Specifically, the survey found, 58 percent of CIOs were concerned about increased application risk and reduced productivity, while 53 percent worried about project overruns.

It isn’t much of a stretch to conclude that reduced productivity and project overruns will – at least in some cases – be related to Big Data, since the mainframe is bound to play a starring role in Big Data projects.

Despite premature declarations of its impending demise, the mainframe has remained a critical strategic asset to the enterprise. A 2012 study commissioned by CA and conducted by Decipher Research concluded that 81 percent of IT decision makers around the globe view the mainframe as highly strategic.

Fifty-one percent of U.S. respondents and 46 percent in the rest of the globe said they planned to increase spending on mainframe software in the next 12 to 18 months. Meanwhile, 30 percent of U.S. respondents and 36 percent of global respondents predicted increases in hardware spending over the same period.

With enterprises placing Big Data at or near the top of their IT priorities in study after study, little guesswork is needed to figure out what is driving this investment. As enterprises look to extract value from the mindboggling volumes of data they collect, many will leverage the mainframe for its processing power, scalability, availability and security.

Big Data is important because it may hold the key to corporate success in the future.

If companies can figure out how to unlock the trends, revelations and courses of action hidden in the largely amorphous mountains of data that grow taller each day, they stand to realize significant benefits. Big Data insights will guide them toward building better products and services, improving customer service and – ultimately – boosting the bottom line.

But Big Data poses volume and organization challenges. Companies can no longer rely entirely on relational databases for all their data, now that so much of it comes from inherently chaotic sources such as social media sites and mobile devices. And this, of course, is where the mainframe comes in. It provides more computing power and reliability than the typical server farms that run distributed environments while requiring less power, cooling and floor space.

Of course, for Big Data to provide the desired benefits, it will require the right tools – specifically, the analytics software and hardware to handle the work.

But it will also require the skills to run the tools. And that is why grooming mainframe talent is a priority for a growing number of enterprises, as well as vendors such as IBM and CA.

IBM has been forging partnerships with high schools and universities to attract young talent to the mainframe. Its Master the Mainframe contest has run successfully for almost a decade, inviting students from around the world to demonstrate their mainframe skills.

Other vendors, such as Compuware, are taking action as well. Compuware has partnered with Wayne State University in Detroit to introduce students to mainframe software development. During a 10-week course that runs in the fall and spring, students get access to a mainframe to build and deploy solutions.

Local partnerships of this nature may prove essential to developing mainframe skills among the young, and more vendors ought to consider the approach. Enterprises have a role to play as well, by partnering with vendors and colleges to promote mainframe learning.

Big Data, and all the business value that can be derived from it, may depend on these combined efforts.
