The Big Deal About Big Data – Part 3 of 3

By Pedro Pereira

The amount of data the digital world generates is expected to grow at a 44-percent rate over the coming decade. In his final piece of a 3-part series for SHARE President’s Corner, veteran tech writer Pedro Pereira explores the Big Deal about Big Data…

SHARE’s Peer Guidance

Along with IT companies, SHARE is taking an active role in helping enterprises with big data by sponsoring conferences and publishing educational materials on the topic. Dr. Rao’s Smarter Computing presentation, for instance, was delivered at the organization’s August 2011 conference in Orlando.

SHARE is an independent association with membership from companies large and small in industries such as finance, insurance, manufacturing, retail and utilities, as well as universities and colleges, government organizations and consultants. SHARE’s mission is to provide enterprise IT professionals with continuous education and training, and facilitate peer networking.

Rosen calls SHARE a valuable resource for IT professionals trying to tackle the big data challenge. Journals and case studies offer plenty of information on the subject, but those sources have a limitation: they typically cover only the successes, Rosen says.

To find out about failures so they can avoid them, IT professionals have to rely on each other, and that is an important role SHARE fulfills as a peer group. “Telling me what works is great, but telling me what doesn’t is even greater to make sure I don’t go down that path,” says Rosen.

Mainframe Solution

One of the paths to taming big data runs through the mainframe – yes, the computing colossus that more than a few industry pundits wrote off as a dinosaur in the 1990s. The mainframe’s capacity to process massive workloads at high speed makes it a “real solution” for big data, says Rosen, and gives it a central role in helping enterprises keep track of all their data.

“What we are seeing with our customers with mainframes,” says Bhambhri, “is they want to look at the data from the instant it enters the enterprise.” To help those customers, IBM has developed the zEnterprise BladeCenter Extension (zBX), which extends System z mainframe management capabilities across the vendor’s server platform and connects with the large-scale DB2 database for business analytics.

“Our customers that have invested in the mainframe and have workloads on mainframe don’t have to change anything,” Bhambhri says. “What we are providing is something that helps extend their data platform. We are providing capabilities that allow them to analyze large volumes of data.”

Enterprises can run workloads on the mainframe while extracting information for analysis on the BigInsights platform. Organizations don’t necessarily know what they will find out from the data, says Bhambhri, but once they run it through the analytics programs, they are sure to find actionable information useful to the business.

Cloud Doubts

Of course, a lot of companies have no mainframes, but still process large amounts of data. For them, cloud-based computing resources present a way to gain big data insights while keeping costs down. Bhambhri says a lot of companies are “kicking the tires” in the cloud.

McKinsey says cloud computing knocks down technology barriers and reduces costs, and it lets companies collaborate with partners and customers on business functions such as R&D, marketing and customer support. In a big data context, you could envision different parties working together to gain and share mutually beneficial insights.

In a recent press release (June 20, 2011) IDC predicted: “Cloud computing will continue to reshape the IT landscape over the next five years as spending on public IT cloud services expands at a compound annual growth rate (CAGR) of 27.6%, from $21.5 billion in 2010 to $72.9 billion in 2015. But the impact of cloud services will extend well beyond IT.”
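IDC’s figures are internally consistent, and it takes only a few lines of arithmetic to see how the published endpoints and growth rate fit together. The sketch below (a quick sanity check, not anything from IDC) derives the compound annual growth rate implied by the 2010 and 2015 spending figures:

```python
# Sanity check of IDC's projection: $21.5B in 2010 compounding at
# roughly 27.6% per year should land near $72.9B in 2015.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

start, end, years = 21.5, 72.9, 5  # $B, 2010 -> 2015

implied = cagr(start, end, years)
print(f"Implied CAGR: {implied:.1%}")  # ~27.7%, matching IDC's published 27.6%

# Compounding forward from 2010 at the published rate:
projected = start * (1 + 0.276) ** years
print(f"Projected 2015 spend: ${projected:.1f}B")  # ~$72.7B; the small gap is rounding
```

The slight mismatch ($72.7B vs. $72.9B) simply reflects IDC rounding the CAGR to one decimal place.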

Real-time big data analysis will help drive this growth.

Rosen says both the government and private enterprise are seriously looking at how cloud computing can help with big data. Security and reliability, however, remain a concern. “And sometimes the cloud is really complicated because you have to be concerned in some cases where the data is being stored,” he says. “I don’t want my data to be stored in China, for example.”

What to Do

Realizing big data’s vast potential will require organizations that generate and process large volumes of data to understand its value and act accordingly. On the technology side, innovation is needed to address storage and security challenges, and to continue the fine-tuning of analytics to make sense of the data. Policy makers also play a critical role in devising strategies that facilitate analysis of big data while protecting the privacy of personal data.

Education and the sharing of experiences are fundamental to everyone’s success. To that end, attendees at SHARE’s next semi-annual event, scheduled for March 2012, are sure to receive plenty of actionable information to help them meet the big data challenge.
