Database Deficiencies Not Only a Hardware Solution
September 9, 2014 Dan Burger
Database deficiencies can often be masked by server horsepower. But at some point, even powerful servers are unable to keep response times in the sub-second category. When companies max out on hardware, software and databases start getting attention. And in the IBM midrange, the term database modernization creeps into conversations. David Andruchuk likes to be part of those conversations. His company, CSDA, provides a database modernization service aimed at both a performance bump and greater business flexibility.
Database modernization tends to focus on converting DDS-based files to DDL-based files. “It’s easy to modify DDS code to DDL code,” Andruchuk says. “But there’s a lot more to it than that.”
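To make the distinction concrete, here is a minimal sketch of what that conversion produces. The file and field names are hypothetical, not from CSDA or IBM; a DDS-described customer master file with a packed-decimal customer number and two character fields might be re-expressed in SQL DDL as:

```sql
-- Hypothetical DDS physical file CUSTMAST, with fields CUSTNO (packed 7,0),
-- CUSTNM (char 30), and CUSTAD (char 40), re-expressed as a DDL table:
CREATE TABLE CUSTMAST (
    CUSTNO  DECIMAL(7, 0)  NOT NULL,
    CUSTNM  CHAR(30)       NOT NULL DEFAULT '',
    CUSTAD  CHAR(40)       NOT NULL DEFAULT '',
    PRIMARY KEY (CUSTNO)
);
```

The field-by-field type mapping is the easy part Andruchuk is talking about; deciding on constraints, defaults, and keys is where the real design work lives.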
Yes, there is. Imagine a database analyst sitting down to manually modify 3,000 tables. That’s a lot of monkey work. Andruchuk offers an example to illustrate: “Take a simple but common example–the need to modify names and addresses. Those names and addresses appear in 25 tables. Those 25 tables exist in 4,000 modules. That means 4,000 modules need to be modified. Then figure in all the hours of labor to do that.”
Not every database modernization project requires the labor-intensive equivalent of building the Panama Canal. But some modernizations get bogged down with costly, time-consuming tasks that are an inefficient use of resources–what we like to refer to as monkey work.
There will be multiple tables that will easily convert–logical files that become indexes, for example. But there will be things that are unique to what a database does that don’t easily convert, Andruchuk says, and that’s where projects lose momentum. Plus, there’s a learning curve for people making their first ventures to the SQL side of database design.
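As a rough illustration of the easy cases (object names here are hypothetical): a simple keyed logical file maps naturally to a DDL index, and a logical file that selects or subsets records is closer to a view:

```sql
-- A keyed logical file over a customer table becomes an index:
CREATE INDEX CUSTL1 ON CUSTMAST (CUSTNM);

-- A select/omit logical file becomes a view:
CREATE VIEW ACTCUST AS
    SELECT CUSTNO, CUSTNM
    FROM CUSTMAST
    WHERE CUSTST = 'A';   -- active customers only
```

Logical files with multiple record formats or unusual access paths have no such tidy equivalent, and those are the cases Andruchuk says stall projects.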
Eliminating the monkey work is the service CSDA sells. The company has a tool that automates the conversion process, which saves one heck of a lot of time and–if you were paying an analyst to do the monkey work–money. It also has the database skills that IBM i staffs rarely possess.
But the decision to modernize a DB2 for i (or DB2/400) database is bigger than the automation of manual processes. The conversion of DDS physical files and logical files to DDL tables, views, and indexes is just part of what IBM refers to as data-centric development, which makes SQL the development framework of choice and gives traditional RPG developers something to chew on while pondering the future.
As Andruchuk sees it, adapting database access and software development practices to new technologies is a necessity that’s just as important as changing business strategies to changing market requirements. He believes database modernization, and data-centric programming, should precede application modernization.
The compelling reason to take on modernization escapes the grasp of many IBM i shop managers.
The road to success begins with determining a data model for your existing database. That becomes a roadmap and provides a basic, identifiable understanding of the database. You have to figure out where you are before you can figure out where you want to go. The first steps usually aren’t difficult, Andruchuk believes, when done on a small-scale, table-by-table basis, then running existing RPG programs against the converted tables in a test environment. Doing this without recompiling the RPG apps is what most people find intriguing.
And if all goes well, Andruchuk says, the database can be plugged in, it’s going to run, and your RPG programs won’t know the difference.
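One common pattern for that kind of drop-in replacement looks roughly like the sketch below. The names are hypothetical, and whether RPG programs truly run unchanged depends on the record format level identifiers matching the original file–the detail conversion tools are built to preserve:

```sql
-- Move the data from the old DDS-described file into a new SQL table,
-- keeping the original name so existing programs still find it:
RENAME TABLE MYLIB.CUSTMAST TO CUSTMAST_PF;

CREATE TABLE MYLIB.CUSTMAST (
    CUSTNO  DECIMAL(7, 0)  NOT NULL PRIMARY KEY,
    CUSTNM  CHAR(30)       NOT NULL DEFAULT '',
    CUSTAD  CHAR(40)       NOT NULL DEFAULT ''
);

INSERT INTO MYLIB.CUSTMAST
    SELECT * FROM MYLIB.CUSTMAST_PF;
```

Done this way, record-level access and SQL access both land on the same modernized table.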
It’s possible you have no data model that makes sense of your database. I’ve heard this from Mike Cain, one of IBM’s top DB2 for i brains, and a guy who has seen some database modeling nightmares. A bad data model adds to the difficulty of modernization, and the worry of revealing a horrific database prevents some IBM i users from even exploring database modernization.
Andruchuk has seen this play out as well. “It is possible that the data model is so screwed up that the core of the database can’t be modernized and needs to be changed,” he says. “Then you try to do that without it disrupting the company.”
Good luck with your disruption avoidance plan. Skills and experience might come in handy.
For those with no fear of finding a poorly defined database, and perhaps an upper-level management dictate to address scalability and multi-platform issues, the transition is mostly about learning SQL skills, taking advantage of the SQE query engine, and getting a performance increase without adding more server horsepower. But the reality is that most companies that depend on DB2 for i (DB2/400) do not have well-documented, well-defined databases. Poorly defined data is responsible for slow-moving applications and a stubborn resistance to interoperability with other platforms.
Consequently, it can stop database modernization projects dead in their tracks as upper management tries to make a decision whether to modernize or migrate or possibly buy an enterprise software package to wiggle free from the bad database curse. Those packages frequently come with either the Oracle or SAP name on the box and they are not inexpensive.
“Companies expecting a lot of growth or that are in a merger and an integration of multiple disparate systems are most likely to be wrestling with these technology and cost issues,” Andruchuk says. “And companies running non-supported versions of the operating system, with apps created in RPG II and RPG III, that are looking at migrating or upgrading to the latest Power, are looking at about the same money (as the Oracle or SAP software options) to make either move. But any migration brings with it a retooling of the IT staff, which needs to be accounted for in the decision-making process.”
When hardware isn’t capable of overcoming software deficiencies, or is too expensive, Andruchuk says it is time to fix the database deficiencies, with the attainable goal of improved performance, scalability, and flexibility.
For a company that may be dealing with ever-increasing batch processing cycles, or modernizing application front ends using Java, PHP, or Ruby developers, the decision process will include discussions about data access. One of the key discussion points will be SQL databases, which modern application development teams already know and which their tooling is tailored to.
CSDA claims its service is built for low cost and a quick turnaround–an assertion universally attached to every IT product and service I can think of. Comparing automated DDS-to-DDL conversion against manual conversion of the same files will validate this claim, but it’s actually a trickier comparison than you might think.
CSDA charges are determined on a per-table conversion basis, with break points on the price per table after a project gets into the conversion of thousands of tables.
Keep in mind the physical table is the base of the modernization, and the related logical files are–for the most part–not counted as separate objects. However, logical files can vary considerably depending on how a developer has coded joins, multiple record formats, and other techniques. Each case is unique.
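A join logical file is one of those variable cases. As a hypothetical sketch, a logical file joining order headers to customer names could be recast as an SQL view:

```sql
-- Hypothetical join logical file recast as a view over two tables:
CREATE VIEW ORDCUST AS
    SELECT O.ORDNO, O.ORDDT, C.CUSTNM
    FROM ORDHDR O
    JOIN CUSTMAST C ON O.CUSTNO = C.CUSTNO;
```

How cleanly that mapping works depends on the join type and key coding in the original DDS, which is why each conversion gets sized up individually.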
Although Andruchuk readily admits that everyone he knows is still maintaining their databases with DDS, and that far more companies are in the “thinking about database modernization” stage than in the “doing database modernization” stage, he’s a true believer that database modernizations will soon be taking place, and that CSDA will be in the right place with the right tool to help get these jobs done.