The Impact of PDSE on Mainframe Performance

Security, reliability, performance – these are a few reasons most large businesses in industries like finance, insurance, healthcare, and government turn to the mainframe to power mission-critical applications. Performance is particularly important: when a critical application isn’t functioning as well as it should, it can directly impact the customer experience, which creates commercial risk.

That’s why mainframe professionals should pay attention to an often-overlooked factor in mainframe performance: Partitioned Data Set Extended (PDSE) data sets. The PDSE is one of several key data set types in mainframe storage.

In a recent interview, IBM Software Engineer Thomas Reed told SHARE that PDSE performance is critical to the smooth running of any mainframe environment. PDSE data sets are used to store program objects, a type of construct that contains executable code. Without these program objects, and by extension the PDSEs that contain them, mainframe applications can’t run properly.

The PDSE is important in several aspects of the mainframe environment, including several critical z/OS components and applications written using modern versions of the Enterprise COBOL compiler. Given that $3 trillion in daily commerce and 80 percent of in-person financial transactions are powered by COBOL, it’s easy to see the impact PDSE performance could have on real-world businesses.

Still, it’s easy for storage administrators to overlook PDSE, Reed said. Given its critical role in running the system, however, it can be a significant factor in application performance.

“When businesses don’t have their PDSE environment configured correctly, or if they’re not familiar with the way it interacts with their environment, it can very easily cause problems,” Reed explained.

IT migrations are a common inflection point where it makes sense to take a harder look at PDSE configurations, he said. For example, the introduction of Enterprise COBOL V5 meant that storage administrators had to account for PDSE’s data set sharing capabilities in their environment.

“There are considerations to take into account that could cause serious problems if the client hasn’t looked at it or incorporated it into their process,” he said.

As an example of one PDSE performance consequence, Reed described the concept of “blocked workload.” This occurs when a client’s CPU utilization is at or near 100 percent, leaving insufficient CPU for PDSE serialization to run smoothly. That can impact the system as a whole, as high-priority tasks may be delayed waiting for CPU-starved, low-priority tasks to release PDSE resources. By understanding this type of risk, storage administrators can ensure that they have sufficient headroom to run PDSE and avoid performance issues.
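The blocked-workload scenario Reed describes is a form of priority inversion: a low-priority task holds a shared serialization resource while higher-priority work waits on it. The sketch below illustrates the general pattern in Python; the lock standing in for a PDSE serialization resource, the task names, and the timings are illustrative assumptions, not z/OS internals.

```python
import threading
import time

# A single lock stands in for a PDSE serialization resource.
pdse_latch = threading.Lock()
events = []  # records the order in which tasks make progress

def low_priority_task():
    # Low-priority work grabs the latch, then is starved of CPU
    # (simulated here with a sleep) before it can release it.
    with pdse_latch:
        time.sleep(0.2)  # stand-in for CPU starvation near 100% utilization
        events.append("low released latch")

def high_priority_task():
    # High-priority work cannot proceed until the latch is free,
    # so its completion time is dictated by the starved task.
    with pdse_latch:
        events.append("high acquired latch")

low = threading.Thread(target=low_priority_task)
high = threading.Thread(target=high_priority_task)

low.start()
time.sleep(0.05)  # ensure the low-priority task holds the latch first
high.start()

low.join()
high.join()
# The high-priority task finishes last, despite its priority.
```

The point of the sketch is simply that the waiting time of the high-priority task is governed by how quickly the starved task can run, which is why keeping CPU headroom matters.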

“PDSE is integral to the smooth functioning of a company’s business-critical software, and ignoring it or assuming it’s always going to work could interrupt your business-critical workload,” Reed said. “Making sure you understand the implications of PDSE performance will allow you to continue running smoothly. Or if there is an issue, you can react in the correct way and recover more quickly.”

Reed covered PDSE performance basics in a presentation at a recent SHARE event, where his objective was to give attendees “an idea of what interacts with PDSE and what impacts or enhances PDSE performance within their environments.” To learn more, watch the full presentation on the SHARE YouTube channel.
