BI On IBM i: A Fish Out Of Water
August 17, 2015 Dan Burger
A fish discovers it needs water when it is no longer in it. A business discovers it needs data when it can’t get it. Or, in many cases, it can’t get the data it wants, when it’s wanted, and how it’s wanted. In the IBM midrange, the term “operational reporting” has been around forever. Because we have a habit of changing the names of old things to make them sound new, the term “business intelligence” is more favored. What matters is getting information in a timely manner and in a friendly format.
The underlying infrastructure doesn’t matter. Operational reporting is fine if it delivers. That’s a big if. A great many shops are not making the grade. The tools and processes are old and the expectations are new. It’s a problem especially for small businesses, where budgets are tight and human resources with the time to take on new tasks are scarce.
Last week in The Four Hundred, an article titled The IBM i Market Is Not Economics 101 noted that Power Systems sales are on the rise. That’s a good indicator of a healthy Power Systems community, but for small IBM i shops an upgrade plan will dominate IT activity, putting other projects–like operational reporting/business intelligence–on the back burner.
Cost is going to be a factor. Enhanced reporting capabilities will likely involve purchasing a new tool and another server, or at least configuring an LPAR. Operational reporting could be supported by a data store (data warehouse or data mart), but it’s not a requirement. There could be increased software fees that go along with the hardware/OS upgrades. Testing interfaces for applications running on new systems consumes time. Whether it’s a dedicated project or a “fit it in when you can” project will also be a factor.
“When IBM is successful selling new servers, it is usually a slower year for us,” says Bill Langston, director of marketing at New Generation Software. “And then we come back when that is over. Next year or the year after will be better for us. Investing in IBM i and Power is good. The new environments that companies are moving to have database engine improvements and OS improvements that help performance. We ride on the coattails of that.”
New Generation Software develops query, reporting, OLAP, and business analytics tools.
Doug Mack, a DB2 for i and business intelligence consultant in IBM’s Power Systems Lab Services organization, spends a great deal of his time working with companies to upgrade from Query/400 to Web Query. He’s also a proponent of data-centric programming, a common trait with the IBM Lab Services teams connected with DB2 for i.
A data-centric approach has the database do more of the work: keeping track of the data, making inferences, and handling repeatable processes such as encapsulating logic so the operator doesn’t have to, all of which increases efficiency. As companies go forward with browser-based application development and cross-platform data access, the increasingly critical role of the database has folks like Mack and other IBM DB2 for i brainiacs like Mike Cain, Scott Forstie, and Kent Milligan extolling the benefits of data-centric thinking.
“You can buy any BI tool you want to, but that doesn’t help you solve a lot of the data issues you have to deal with to make the data meaningful,” Mack says. “In the RPG world, many customers have business rules built into the RPG code and not into the database, meaning that you have to get someone to translate those rules to understand the data in the BI tool. The ability to build metadata over RPG code, user defined functions, and SQL stored procedures and views helps with this.”
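As a rough illustration of the idea Mack describes–moving a business rule out of application code and into the database–here is a minimal sketch using SQLite in Python as a stand-in for DB2 for i. The table and the rule are hypothetical; the point is that a view encodes the rule once, so every BI tool querying the database sees the same interpretation of the data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical order table. In an RPG shop, the rule for which orders
# are billable might live only in program logic, invisible to a BI tool.
cur.execute("CREATE TABLE orders (id INTEGER, status TEXT, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "A", 100.0), (2, "X", 50.0), (3, "A", 75.0)])

# Encode the business rule ("status 'A' means an open, billable order")
# once, in a view the database serves to every reporting tool.
cur.execute("""
    CREATE VIEW billable_orders AS
    SELECT id, amount FROM orders WHERE status = 'A'
""")

total = cur.execute("SELECT SUM(amount) FROM billable_orders").fetchone()[0]
print(total)  # 175.0
```

A BI tool pointed at `billable_orders` needs no translator for the rule, which is the gap Mack says metadata, user defined functions, views, and stored procedures close.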
Data-centric programming is gaining some traction primarily in enterprise-level organizations where larger IT staffs possess what Mack refers to as “project discipline,” which is the wherewithal to create specific projects and select project leaders and team members. In other words, they have the human resources bandwidth to do it.
Langston views data-centric development as a niche interest at best.
“It’s a community that is focused on that,” he says. “They are the ones with a lot of custom applications (home grown or no longer supported vendor solutions) and a lot of in-house developers. They typically have the staff to continue working on the applications.”
For this article, the focus is on small shops where operational reporting efficiencies, not big data analytics, are the realistic business objectives. For many of these shops, operational reporting is mired in the past, but can be quickly up and running–and more importantly functioning more efficiently–to deliver the most wanted data when it’s wanted and how it’s wanted.
And it can do this without a change of architecture, although Mack would politely suggest a long-range plan that transitions to a modern database architecture.
Because IBM markets two business intelligence tools, Web Query and Cognos, Mack finds IBM i shops asking about both options.
“It comes back to functional requirements,” Mack says. “Does the tool do what the customer is looking for? Which tools fit into the current infrastructure environment, leverage existing skills, and fit the budget? All of these are factors.”
Another factor that is readily apparent is how badly a company wants an IBM i solution. This is very important for many IBM i shops that want to leverage the existing infrastructure, the IBM i OS, and the DB2 for i processes. The preference for a built-for-business IBM i system remains strong in many organizations, and in those shops there is little concern about hard-to-find IBM i skills.
Cognos does not run on IBM i, so it will require a separate server or at least an LPAR. A company might have system admins with cross-platform management and LPAR provisioning experience, but that’s not likely in the smaller shops, where getting the most value for as little money as possible is a guiding light. A company that chooses Cognos, Mack says, probably does so because it already has experience with Cognos, has experience with Linux, or has specific requirements such as predictive analytics.
Some IBM midrange shops work with ISVs that have incorporated Cognos into their software, which is another reason they may choose Cognos for other purposes. Similarly, some shops partner with consultants who have Cognos expertise, which could influence the decision.
Also on the table are discussions about creating separate repositories for business intelligence purposes, whether that means operational reporting or more complex analytics.
“Companies may start out with simple reports that are relatively easy to do. And users are happy because they are getting data they couldn’t get before. But as executive management asks for answers to more difficult questions, the BI tool use gets expanded,” says Alan Jordan, director of data warehouse technologies at HelpSystems. “Companies get stuck because they don’t have the operational data store (data warehouse) that makes it easier to build more complex reports.”
That could lead to “I can’t get there from here” predicaments. Some shops create ad hoc versions of operational data stores. (Don’t get hung up on the terminology, whether it is data store, data mart, or data warehouse.) The most common method is to create summary tables, but as Jordan points out, summary tables are not architected databases. He warns, “This could be a maintenance mess if it’s continued over many years because they are manually built and often undocumented.
“In the early going, it’s often determined that ‘we don’t need all that.’ But the realization comes later that it should have been built correctly from the beginning.”
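To make the distinction concrete, here is a minimal sketch (Python with SQLite; the tables are hypothetical) of the kind of hand-built summary table Jordan describes: a one-off aggregate snapshot that must be manually refreshed whenever the detail data changes, which is where the long-term maintenance burden creeps in:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("East", 100.0), ("East", 50.0), ("West", 80.0)])

# An ad hoc summary table: a frozen snapshot, not an architected,
# maintained data store.
cur.execute("""
    CREATE TABLE sales_summary AS
    SELECT region, SUM(amount) AS total FROM sales GROUP BY region
""")

# New detail rows arrive, but the summary stays stale until someone
# remembers to rebuild it -- the manual maintenance Jordan warns about.
cur.execute("INSERT INTO sales VALUES ('East', 25.0)")

summary = cur.execute(
    "SELECT total FROM sales_summary WHERE region = 'East'").fetchone()[0]
detail = cur.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'East'").fetchone()[0]
print(summary, detail)  # 150.0 175.0 -- summary no longer matches detail
```

An architected operational data store would instead refresh such aggregates through a documented, repeatable load process, so reports never silently drift from the source data.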
In other words, the BI tool, by itself, is often thought of as the answer. That could be true when the needs are simple. But as the work gets more complex, whether it is operational reporting or complex analytics, it needs a best-practices approach.
Certainly BI tools such as IBM’s Web Query, New Generation Software’s NGS-IQ, and HelpSystems’ SEQUEL can be and are being used in conjunction with data stores. The reasons for doing so include security–preventing BI users from messing with the production database–and lessening the demands on the production server.
You will often find data being consolidated on an isolated database used for reporting. Sometimes that’s a separate LPAR, other times it is on backup servers used for disaster recovery or high availability.
There’s also the house cleaning aspect to getting the most out of business intelligence tools.
Langston points out the importance of figuring out what database tables are being touched by what jobs, what things are joined and how they are calculated. The point is to find which queries can be discarded, which ones can be kept, and which ones will be a challenge to migrate for use with a BI tool with the capabilities that are needed in a modern business environment.
“It’s important to study reports and reporting processes that you’ve built. You have to understand what you have. RPG code is documented thoroughly, but reporting doesn’t get the same attention.”
Most BI tools have the capability to go through the discovery process.
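For shops doing this cleanup by hand, the inventory step Langston describes can be approximated with a simple script. This is a naive sketch–the query names and SQL text are hypothetical, and a real effort would pull definitions from Query/400 objects or BI tool metadata rather than a dictionary–but it shows the basic idea of mapping which saved queries touch which tables:

```python
import re
from collections import defaultdict

# Hypothetical saved-query definitions; in practice these would be
# extracted from Query/400 or a BI tool's metadata repository.
saved_queries = {
    "DAILY_SALES": "SELECT region, SUM(amount) FROM sales GROUP BY region",
    "OPEN_ORDERS": "SELECT id FROM orders WHERE status = 'A'",
    "ORDER_DETAIL": "SELECT o.id, s.region FROM orders o "
                    "JOIN sales s ON o.id = s.order_id",
}

# Naive inventory: record which queries reference which tables by
# scanning for identifiers after FROM and JOIN keywords.
tables = defaultdict(list)
for name, sql in saved_queries.items():
    for table in re.findall(r"\b(?:FROM|JOIN)\s+(\w+)", sql, re.IGNORECASE):
        tables[table].append(name)

for table, users in sorted(tables.items()):
    print(table, users)
```

Even a rough map like this helps answer the triage questions in the paragraph above: queries touching no live tables can be discarded, heavily shared tables flag the joins worth documenting, and the rest become candidates for migration to the new tool.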
“Moving into the modern world is more than teaching people to do the same things only with a new tool,” Jordan says. “It’s no longer having everyone write their own stuff in their own informational silos, with many people doing the same or similar reporting and duplicating efforts.”