Batch applications process large amounts of data without user interaction to produce outputs such as customer billing statements, so companies need to ensure that this data is not only secure but also protected from operational risk. At SHARE Pittsburgh, Christopher Walker, senior offering manager at IBM, and Rebecca Levesque, president and CEO of 21st Century Software, Inc., focused on how companies can understand and manage the risks facing the data their batch workloads use in their session, Transforming Batch Resiliency through Greater Control Using Analytics and Automation. Walker says, “Using near real-time gathering and continuous curation of SMF data, you can protect your business from these risks and threats to batch data with a process that is auditable, actionable, and repeatable.”
A study from McKinsey & Company and IBM estimates that only 20% of enterprise workloads have moved to the cloud. Even so, many cloud-based applications ultimately depend on the mainframe for business-critical data and processing, such as core banking applications and health care systems, Walker says. He warns that because the mainframe is now an integral part of the hybrid multicloud world, clients have exposed the key business assets stored on their systems of record to more risks and threats than before. This is especially true when an application connects, for example, to a bank’s core system on the mainframe. Walker says that because mainframe customers lack the information to understand data dependencies, this can affect resiliency, their ability to implement encryption, and other important business objectives. “Batch applications are often dependent on processes written and designed by people who are no longer available,” he explains. “People don’t want to address the issue because they don’t want to own the change. The business doesn’t understand the real gap and no one is going to tell them.”
Automated, real-time gathering of SMF data in conjunction with other standard operating tools, like the job scheduler and tape management catalog, can provide companies with the insight, reporting, auditing, and automated processes they need for the resiliency and recovery of data, says Walker. He adds that companies need to be able to recover batch in a timeframe that is acceptable to the business. For example, clients expect round-the-clock access to their bank accounts, airline reservations, and more; any major outage results in front-page news and serious brand damage. Businesses also need to understand that hardware alone will not solve the problem, because batch processing doesn’t have logs or journals the way databases do.
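A minimal sketch of the kind of check such tooling automates, using hypothetical, already-parsed activity records (real SMF data is binary and requires dedicated collection and parsing, which is assumed here): correlate each dataset's last batch update with its last backup from the tape management catalog, and flag any dataset that was updated after it was last backed up, since it could not be fully recovered.

```python
from datetime import datetime

# Hypothetical, already-parsed records; names and timestamps are illustrative.
updates = {  # dataset -> time of last batch update (e.g., from SMF close records)
    "PROD.BILLING.MASTER": datetime(2023, 6, 1, 2, 15),
    "PROD.CUSTOMER.FILE":  datetime(2023, 6, 1, 3, 40),
}
backups = {  # dataset -> time of last backup, from the tape management catalog
    "PROD.BILLING.MASTER": datetime(2023, 6, 1, 4, 0),
    "PROD.CUSTOMER.FILE":  datetime(2023, 5, 31, 23, 0),
}

def recovery_exposures(updates, backups):
    """Return datasets updated after their last backup (or never backed up),
    i.e., datasets whose current contents could not be recovered."""
    at_risk = []
    for dsn, updated in updates.items():
        backed_up = backups.get(dsn)
        if backed_up is None or backed_up < updated:
            at_risk.append(dsn)
    return at_risk

print(recovery_exposures(updates, backups))
```

In practice this correlation would run continuously against curated SMF data rather than static snapshots, but the audit question it answers is the same: which batch datasets are exposed right now?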
There are products on the market that provide an integrated view of batch data to achieve business-IT alignment while increasing efficiency and reducing risk, explains Walker. These additions can help increase resiliency and bring intelligence and automation to a formerly manual process. "It is important that an end-to-end solution for batch resiliency improve adherence to service level agreements while reducing risk and cost," he adds. For example, minimizing the number of backups taken for any particular file improves the overall efficiency of the application and system. Improved management of backups can reduce overall CPU consumption, lowering system cost for customers on Tailored Fit Pricing as well as those on the four-hour rolling average model, says Walker. "Risk is reduced by eliminating non-viable backups that can compromise the recovery process," he says.
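The backup-minimization point can be illustrated with a small sketch (the timestamps and classification rules are illustrative assumptions, not the product's actual logic): a backup taken when nothing has changed since the previous backup consumes CPU without improving recoverability, while a backup superseded by a later update cannot, by itself, restore current data.

```python
def classify_backups(backup_times, update_times):
    """Label each backup timestamp:
       'redundant' - no update occurred since the previous backup,
       'stale'     - superseded by a later update (can't restore current data),
       'current'   - covers the latest update."""
    backup_times = sorted(backup_times)
    update_times = sorted(update_times)
    last_update = update_times[-1] if update_times else None
    labels = {}
    prev_backup = None
    for b in backup_times:
        if prev_backup is None:
            # First backup on record: assume it captured new data.
            updated_since_prev = True
        else:
            updated_since_prev = any(prev_backup <= u < b for u in update_times)
        if not updated_since_prev:
            labels[b] = "redundant"
        elif last_update is not None and last_update > b:
            labels[b] = "stale"
        else:
            labels[b] = "current"
        prev_backup = b
    return labels

# Updates at hours 1 and 5; backups at hours 2, 3, and 6.
print(classify_backups([2, 3, 6], [1, 5]))
```

Here the backup at hour 3 is redundant (nothing changed since hour 2) and the one at hour 2 is stale once the hour-5 update lands, so only the hour-6 backup earns its CPU cost, which is the efficiency argument Walker makes.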
The batch resiliency solution should not require companies to change their existing processes, says Walker. It should simply sit on top of a company's processes to audit and augment them; alternatively, it can be used to create new processes driven by the product, which is the standard customers want.
Interested in learning more on the topic? SHARE members can log in and access handouts from this SHARE Pittsburgh session online.