Analytical Expectations And Misconceptions Of IBM i
June 20, 2016 Dan Burger
Often we recognize we had expectations at about the same time we recognize those expectations have not been met. The IBM midrange computer, a combination of the IBM i operating system running on Power Systems hardware, exceeds the expectations of a high percentage of the organizations that rely on it. Reliability, scalability, and securability are its hallmarks. However, some would describe its analytical capabilities as a weakness. Transaction-based business computers, like this one, are limited in what they can do.
Are they analytically challenged? That depends on your expectations.
The type of analytics most IBM i shops are interested in consists of functions such as grouping, aggregating, rolling up, and summing data from different perspectives. It's often referred to as slicing and dicing data. Another common term is operational analytics: analytics derived from structured business data. Not all organizations running on i care about it, and many that do are using antiquated tools and processes. (IBM's DB2 Web Query and New Generation Software's NGS-IQ are two examples of modern tools that come to mind.)
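The slicing and dicing described above is, at bottom, grouping and aggregating the same data from different perspectives. A minimal sketch of the idea, using Python's stdlib `sqlite3` as a stand-in for DB2 for i (the table and data are invented for illustration):

```python
import sqlite3

# Hypothetical sales data; SQLite stands in for DB2 for i here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Widget", 100.0), ("East", "Gadget", 250.0),
     ("West", "Widget", 175.0), ("West", "Gadget", 300.0)],
)

# "Slice" the same data from two perspectives: totals by region, then by product.
by_region = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
by_product = conn.execute(
    "SELECT product, SUM(amount) FROM sales GROUP BY product ORDER BY product"
).fetchall()

print(by_region)   # [('East', 350.0), ('West', 475.0)]
print(by_product)  # [('Gadget', 550.0), ('Widget', 275.0)]
```

DB2 for i goes further than this, with ROLLUP and GROUPING SETS producing several such groupings (plus subtotals) in a single statement, but the grouping-and-aggregating pattern is the same.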
Among business executives, the idea that more analytics is always better is a marketing tsunami. Clearly there is value in analytics. But the value of some new technologies is not widely known and the question of whether any given technology is applicable–now or in the future–still needs to be answered.
In an interview with IT Jungle two months ago, IBM i chief architect Steve Will said feedback from IBM i shops indicates a desire to do analytics and processing on the same partition. Will says that response was, and is, a factor in developing OLAP support.
“Our customers would prefer to have one integrated place to do this,” Will said. “We are not trying to compete with the analytics that run on another platform. We try to make sure that analytics engines that want to get at our data now get it live. The OLAP functions are not about taking the data somewhere else.”
The OLAP functionality that was added with the release of IBM i 7.3 is indicative of what Will says customers want. IBM’s DB2 Web Query and Cognos business intelligence tools are both putting it to use. However, customers need to be running 7.3 to get the advantage. That’s a small, but active, number of shops at this stage, certainly in the low single digits in terms of percentage of the IBM i community. It will be at least two years before 7.3 is running in 10 percent of the shops and probably twice that long before it will be found in half the IBM i installed base.
“The reality is DB2 Web Query users, if they need this kind of grouping or rollup, are doing it (OLAP) today,” according to Doug Mack, a DB2 for i analytics consultant on the IBM Systems Lab Services team. “This is an example of the integration points with DB2 capabilities that can be leveraged by DB2 Web Query.”
The difference, Mack explains, is where the data manipulation work, the aggregating and reshaping of data, gets done: traditionally, by the DB2 Web Query reporting server. The OLAP functions delivered in IBM i 7.3 allow that work to be pushed down into DB2, where it can be run more efficiently. Pushing the work to DB2 also fits with the data-centric computing strategy that IBM promotes.
As customers start to adopt 7.3, Mack says, the Lab Services team will promote the use of the DB2 OLAP functions because of the increased efficiency (performance); however, it won't be a necessity. In addition to using the OLAP functions in Web Query, Mack notes, IBM i shops are building their own apps that use SQL views containing the OLAP functions.
Web Query, by the way, is an SQL-based tool, but it can be used with non-SQL data structures.
There is no requirement to change the constructs that define the database, but, Mack points out, database modernization is a strategic choice being made by enterprise IBM i shops, particularly those where a lot more code is being written and data-centric development makes more sense from a cost-effectiveness perspective.
Some OLAP functions have been supported by DB2 for many years. Since Web Query's inception, one of the report formats it has supported (along with Excel, HTML, PDF, and others) was called OLAP. That format is now descriptively referred to as auto drill down. It provides the slicing and dicing and the grouping and aggregations at the report level that many IBM i users create, Mack says.
“There are types of reports where you want to look at the data across different groups,” Mack explains, drawing on Web Query, the tool he knows best. “An example is a set of employees spread across a set of departments. The report ranks employees by highest salary to lowest, both within the department and within the whole company. Someone who might be ranked number one in the department may be ranked number 20 overall in the company. A single report shows this. Customers are doing this just through techniques in how they build the report in DB2 Web Query. In 7.3, we could add another technique that calls the DB2 function that just returns the data with this ranking already done. In other words, using the capability in 7.3 to build a report like this makes use of the DB2 functions. And there will probably be the benefit of better performance because we’re pushing the work to DB2.”
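The "ranking already done" that Mack describes maps onto SQL window functions such as RANK. A sketch of the report he outlines, with SQLite (via Python's stdlib `sqlite3`, SQLite 3.25 or later) standing in for the DB2 OLAP support in i 7.3, and an invented employee table:

```python
import sqlite3

# Illustrative only: table, names, and salaries are invented. SQLite's window
# functions here play the role of the DB2 OLAP functions in IBM i 7.3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ann", "Sales", 90000), ("Bob", "Sales", 70000),
     ("Cal", "IT", 95000), ("Dee", "IT", 60000)],
)

# One query returns each employee's rank within the department AND company-wide.
# The database does the ranking; the reporting layer just renders the rows.
rows = conn.execute("""
    SELECT name,
           dept,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank,
           RANK() OVER (ORDER BY salary DESC)                   AS company_rank
    FROM employees
    ORDER BY company_rank
""").fetchall()
for row in rows:
    print(row)
# ('Cal', 'IT', 1, 1)
# ('Ann', 'Sales', 1, 2)
# ('Bob', 'Sales', 2, 3)
# ('Dee', 'IT', 2, 4)
```

Ann is number one in Sales but only number two company-wide, which is exactly the dual-ranking view Mack describes a single report delivering.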
In a similar fashion, the database temporal capabilities built into i 7.3 push the work of maintaining a historical perspective into DB2 rather than building it into the applications. It's the data-centric programming methodology again: the concept that rides shotgun (the navigator position) on the IBM i modernization road trip. The simple explanation of data-centric development is that it lets the database do more of the work of tracking and securing the data, rather than having each and every application coded with that responsibility.
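To make the idea concrete: with temporal tables, the application issues an ordinary UPDATE and the database preserves the prior version of the row. The sketch below is not DB2 temporal syntax; it is a conceptual illustration using a SQLite trigger (invented schema) to show the database, rather than application code, keeping history:

```python
import sqlite3

# Conceptual illustration only. DB2 for i 7.3 system-period temporal tables do
# this automatically; here a SQLite trigger on an invented price table shows
# the idea of the database preserving the old version of an updated row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE price (item TEXT PRIMARY KEY, amount REAL);
    CREATE TABLE price_history (item TEXT, amount REAL, changed_at TEXT);
    CREATE TRIGGER price_audit BEFORE UPDATE ON price
    BEGIN
        INSERT INTO price_history VALUES (OLD.item, OLD.amount, datetime('now'));
    END;
""")
conn.execute("INSERT INTO price VALUES ('widget', 9.99)")
conn.execute("UPDATE price SET amount = 12.49 WHERE item = 'widget'")

# The application only issued an UPDATE; the old value was kept by the database.
history = conn.execute("SELECT item, amount FROM price_history").fetchall()
print(history)  # [('widget', 9.99)]
```

In the temporal-table version, even the trigger disappears: the history bookkeeping is declared once on the table, and every application touching it gets history for free.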
And, as was mentioned about OLAP, the equivalent of the temporal capabilities is available to those who are not on 7.3. Mack explains that in the world of data warehousing, developers maintain history using ETL tools. The concept is called slowly changing dimensions. IBM created a tool for this called DataMigrator.
“It works slightly differently than temporal, but the idea of storing and accessing historical data is the same,” Mack says. “What temporal is doing isn’t necessarily a new concept for DB2 Web Query, but it may provide another mechanism to push the work into DB2. I don’t think people are going to switch to temporal in their data warehouse because DataMigrator handles history today. It will depend on adoption rate of 7.3. When [shops move to 7.3], I think they’ll probably be more likely to use temporal tables in DB2 in operational/production systems, because they are currently band-aiding the storing of historical data through RPG code.”
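The slowly-changing-dimensions technique Mack contrasts with temporal tables works roughly like this: when an attribute changes, the ETL process expires the current dimension row and inserts a new current one, stamped with effective dates. A minimal Type 2 sketch, with an invented customer dimension and SQLite standing in for the warehouse database:

```python
import sqlite3

# Type 2 slowly changing dimension, ETL-side history keeping. The schema,
# customer, and dates are invented for illustration; tools like DataMigrator
# automate this pattern rather than exposing it as hand-written SQL.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer_dim (
    cust_id TEXT, city TEXT, valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
conn.execute(
    "INSERT INTO customer_dim VALUES ('C1', 'Boston', '2015-01-01', '9999-12-31', 1)"
)

def apply_change(conn, cust_id, new_city, change_date):
    """Expire the current row, then insert a new current row (Type 2 SCD)."""
    conn.execute(
        "UPDATE customer_dim SET valid_to = ?, is_current = 0 "
        "WHERE cust_id = ? AND is_current = 1",
        (change_date, cust_id),
    )
    conn.execute(
        "INSERT INTO customer_dim VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, new_city, change_date),
    )

apply_change(conn, "C1", "Chicago", "2016-06-20")
rows = conn.execute(
    "SELECT city, valid_from, valid_to, is_current FROM customer_dim "
    "ORDER BY valid_from"
).fetchall()
print(rows)
# [('Boston', '2015-01-01', '2016-06-20', 0),
#  ('Chicago', '2016-06-20', '9999-12-31', 1)]
```

The end result resembles what a temporal table stores, which is Mack's point: the idea is the same, only here the ETL layer does the bookkeeping instead of DB2.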