Guru: The Transition From Modern RPG IV To Modern SQL
October 30, 2023 Bob Cozzi
Seven or eight years ago, I was working with a client who needed more contemporary reporting than that provided by the legacy Query/400 product. Initially I looked at Db2 Web Query and realized the documentation was sparse and the tutorials for most things, even "getting started," were virtually non-existent. It was also about four times more expensive than I felt it was worth to a P05 or P10 client. I mentioned the documentation issue to the folks at IBM and they agreed. Some of it was eventually resolved, but not enough, in my opinion, to get the masses to replace Query/400 with that product.
As a long-time IBM i Developer, I made the decision to build something that was good enough for my client’s needs, and maybe get a new product out of it as well. To do that, I created a few program templates that produced the reports they needed by simply specifying the SQL SELECT statement. The generated SQLRPGLE code produced the report quickly and only took 15 to 30 minutes to create from those templates. This gave me the initial idea to create what became the core SQL iQuery engine.
SQL iQuery Report Writer Emerges
After several months, I had a pretty great SQL engine that leveraged internal SQL APIs to syntax-check and dynamically run SQL on IBM i. That initial version ran dynamic SQL statements, including the SELECT statement, and output the results to Display, Print, PDF, OUTFILE, and CSV formats. At the time, these were more than enough options to justify the effort. It also gave my clients more output options than the legacy Query/400 product. Their next several report requests only required the composition of the SQL SELECT statement; the output was automagically created by the SQL iQuery engine. Therefore, once the SELECT statement was composed, you had a report, PDF, or CSV file of that request in only a few minutes. In fact, you could develop using the OUTPUT(*) option, fine-tuning the results to your taste, and then route the output to CSV by simply changing to OUTPUT(*CSV). It is that easy.
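To make that workflow concrete, here is a hypothetical sketch; the RUNiQRY parameter keywords and the sample query against IBM's QCUSTCDT demo file are my own illustration, not taken from the product's documentation:

```
/* Develop interactively, viewing results on the display */
RUNiQRY SQL('SELECT cusnum, lstnam, baldue FROM qiws/qcustcdt') OUTPUT(*)

/* Same statement, now routed to a CSV file once the results look right */
RUNiQRY SQL('SELECT cusnum, lstnam, baldue FROM qiws/qcustcdt') OUTPUT(*CSV)
```

Only the OUTPUT parameter changes; the SELECT statement itself stays the same.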
After creating a commercial package/licensed program product (LPP) for SQL iQuery, I began selling it to IBM i clients in the Americas. The early adopters asked about “web output” and a better way to format Excel data (apparently CSV = Excel Data in IBM i world).
I started working on a better Excel-compatible output option and on a web feature in my spare time.
For Excel, which is so complex, I initially opted for the fast and simple "Sylk" output option. Sylk is the original Microsoft Multiplan file format; Multiplan is the predecessor to MS Excel, and the Sylk format is still supported in Office 365.
Sylk is okay but rather limited. So I did a lot of research on Excel file formats and found there was an evolution in supported formats. The transitional era for Excel was in the mid-2000s. It went from XLS, which is a proprietary binary file structure, to XLSX, which is a ZIP file with embedded directories that contain the data in XML format. Don't believe me? Go to a PC, rename any Excel .XLSX file to a .ZIP file, and unzip it. You'll see what I mean.
Going straight to XLSX format would be too challenging, and the binary XLS is just a nightmare. During the transitional phase, MS moved to something called SpreadSheetML. This format is basically pure XML. It is supported by Open Office, Libre Office, and Office 365. Office 365 was just coming into the mainstream around that time, but I thought, why not support all three major spreadsheet programs? So Excel 365, along with the Open Office and Libre Office products, are all covered by SpreadSheetML. Note that Open Office and Libre Office support the .ODS file suffix. If you use OUTPUT(*ODS), it creates the same file as OUTPUT(*XLS) or OUTPUT(*XLSX) but changes the suffix from XLS to ODS so that those packages open it directly. FYI: Since 2016, when opening SpreadSheetML in Excel 365, an end-user response is required, sort of a contemporary "are you sure" dialog. The warning is benign, but some users are wary enough that you should probably tell them about it ahead of time so they don't click "No" instinctively.
To summarize, today, when you select OUTPUT(*EXCEL) it generates a SpreadSheetML output file with a .XLS suffix.
A decade earlier, I had written CGILIB for IBM i. CGILIB was the result of IBM announcing it was cancelling the original CGIDEV2 code (the second time they did that). I offered to take it over and continue to provide it for free, but they wanted too much money from me to do that. So I wrote a better version, called CGILIB, which didn't have any of the limitations found in CGIDEV2, and I bundled it with my COZTOOLS software package.
I leveraged that experience and knowledge and built what was effectively CGILIB 2.0; this time it was written entirely in C and C++ and had no practical limit on file uploads or HTML page size. Using this new CGILIB engine, users were able to compose an SQL statement and specify OUTPUT(*HTML) and the results were written to the IFS as an HTML web page containing a scrollable table. Later we added direct output to the web browser from within SQL iQuery Script, which is how our SQLiQuery.com and SQL Tools websites are created today.
Obviously, the web and Excel output took a while to design and build. As a distraction while I was creating those two features, I decided that JSON output was also a major up-and-coming player. Remember, this was over seven years ago, and while JSON was big in the web world, it wasn't even on the horizon in the IBM i world. But I felt it could potentially displace a lot of future XML use, and it turned out I was right. I ended up creating the first Db2-to-JSON rendering engine for IBM i. It is still in use today when you specify OUTPUT(*JSON).
SQL iQuery Script is Born
Most of my clients wanted to transform some of their legacy Query/400 queries into SQL statements and process them using SQL iQuery. So I created a module that would read the output generated by the RTVQMQRY CL command directly and run the generated source member code. But then I also needed to support the HTML/web interface better, so I created a scripting language that is now known as SQL iQuery Script. Originally it was simply a source file or IFS text file that contained one or more SQL statements; after all, it was originally intended to run materialized Query/400 code. Basically, you passed the name of the "script file" to the SQL iQuery command (RUNiQRY), and instead of running an ad hoc SQL statement, it opened and read the script file and ran each SQL statement contained in it. The need to identify HTML, Excel, JSON, or whatever for output led me to create Directives. These Directives control the script process much the way CL command parameters control a command.
I had already implemented something like this for the materialized Query/400 scripts to read their headings, but I needed a more robust solution. For example, I needed a way to allow users to direct the output of the query, so I created an #OUTPUT statement. Now it was as easy as the following to control the output option:
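For example, a minimal script might look like this; the SELECT is my own sample against IBM's QCUSTCDT demo file, and only the #OUTPUT directive itself is taken from the article:

```
#OUTPUT(*CSV)
SELECT cusnum, lstnam, baldue
  FROM qiws/qcustcdt
 ORDER BY baldue DESC;
```

Change *CSV to *PDF, *HTML, or *JSON and the same SELECT statement is rendered in that format instead.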
After implementing this, I had an epiphany: why not create a set of Directives that control all kinds of things, such as statement logging, debugging, Excel column formatting, email options, and so on? Today we have dozens of #Directives in SQL iQuery Script, ranging from the simple #OUTPUT to web/HTML controls to email and Excel options.
After the first customer installed this new version, their first question was: How do I pass parameters to the SQL iQuery Script from CL?
I thought about it for a while and then remembered that back in the 1990s I wrote a tool called "SQL Lite" for the AS/400. It sold a ton of licenses and basically allowed substitution values via a parameter similar to the SETVAR parameter of the STRQMQRY command. Once I added it to the RUNiQRY command, voilà! We had "Session Variables."
Session Variables can be passed to the script via the SETVAR parameter from either CL programs or the Command Entry screen. They can even be specified behind menu options, Job Scheduler entries, or SBMJOB commands. You can also define Session Variables using the #DEFINE directive in a script itself.
Session Variables are specified similar to CL variables, and that’s on purpose.
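A hedged sketch of what that can look like in practice; the SETVAR value-pair form is assumed to mirror STRQMQRY's, and the ampersand reference style is assumed from the CL similarity just noted:

```
/* CL side: pass a value into the script on the RUNiQRY command */
RUNiQRY SRCFILE(MYLIB/QIQRYSRC) SRCMBR(CUSTRPT) SETVAR((CUSTNO '839283'))

/* Inside the script, the Session Variable is referenced like a CL variable */
SELECT * FROM qiws/qcustcdt WHERE cusnum = &CUSTNO;
```

The SRCFILE/SRCMBR keywords and the member name are illustrative only.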
From there, this opened up a whole can of worms: we added conditional logic (IF/THEN/ELSE), FOR and WHILE looping, assignment statements, built-in functions, and total SQL integration. I now write entire applications using SQL iQuery Script. It's that good.
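Here is a hedged sketch of the kind of script described below; the IF EXISTS form is named in the article, but the exact spelling of the condition, the joblog-message directive, and the balance threshold are my assumptions:

```
/* If any customer in IBM's QCUSTCDT demo file has a high balance due,
   write a message to the joblog. */
IF EXISTS(SELECT 1 FROM qiws/qcustcdt WHERE baldue > 900) THEN;
   #MSG High balance-due customers found in QCUSTCDT.
ENDIF;
```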
This script checks whether any customers in the demo file IBM ships with IBM i have a high balance due and, if so, writes a message to the joblog. Since this is SQL iQuery Script and not RPG IV, we can embed a SELECT statement in an IF condition, in the form of an "IF EXISTS" statement, and easily determine whether the file contains rows that match our needs.
I have also moved to 100 percent SQL iQuery Script for Web for all my web development. I’ve created customer websites along with an entire web-based retail POS system with register receipts using it. The benefit is they no longer need to use the 5250 emulator for their cash registers and can now use a PC, Mac, iPad or other tablet, so long as it has a browser.
Another client syncs data between distant servers. They set up an SQL iQuery Script with its integrated FTP feature (yes, we built FTP into it), and they use SQL to select a set of IFS files and then transfer them to a remote system overnight.
While using SQL iQuery for the last 7+ years, I built a number of SQL functions. Most of them did specific things that I used to do in CL but wanted to do in SQL. For example, I wrapped the QUSROBJD (Retrieve Object Description) API into the RTVOBJD SQL table function. I started doing this back on V5R4, but in V7R1 I moved these table functions from RPG IV to C++. Then in V7R2, IBM enhanced the UDTF preprocessor, and I went nuts creating what is now over 300 SQL functions for IBM i. The first dozen or so were always an SQL version of an existing CL RTVxxxx command. But later I created a few dozen List functions (e.g., List Objects, List Members, List File Descriptions, List Jobs, List Subsystem Descriptions, etc.). Around that time, IBM came out with what is now called "IBM i Services," which is also a collection of SQL table functions. Theirs seemed to focus on providing SQL interfaces to SQL objects on the system, while mine were focused on exposing operating system APIs, access to various object descriptions, and MI instructions via SQL functions.
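For instance, a call to one of these table functions might look like the following; the schema qualifier, parameter order, and object names here are illustrative guesses, not the documented SQL Tools signature:

```
-- Retrieve an object description via SQL instead of the RTVOBJD CL command
SELECT *
  FROM TABLE( sqltools.rtvobjd('MYLIB', 'MYPGM', '*PGM') ) od;
```

The appeal is that the result is an ordinary SQL result set you can join, filter, and output like any other query.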
At some point IBM shifted their focus, and it felt almost as if they were following in my tracks. Sometimes they provide things I had already written and shipped, but they also provide some important and unique capabilities that are relatively low-use or too complex for third parties to invest in developing.
Probably the most notable differences between what I'm doing with SQL Tools and what IBM is delivering with IBM i Services are that (A) my stuff works on V7R2 and later, (B) upper/lower case is ignored in 99 percent of my stuff and in 0 percent of IBM's, and (C) I have an edge in terms of consistent parameter naming. Also, SQL Tools has all the RTVxxxx APIs implemented, including rarely needed functions such as RTVCMDD (Retrieve Command Definition as XML). Besides the aforementioned RTV functions, we also have a few dozen that they don't offer: READSPL (Read Spooled File), READSRC (Read Source Member), IFS_HASH, a loan amortization schedule, WordWrap, and a ton of others. All written in high-speed C and C++ for IBM i.
Don’t get me wrong, I love and use the IBM i Services, there’s no reason not to. But I can’t live without SQL Tools on every client’s system I work on. SQL Tools works the same regardless of IBM i release levels or partitions. So you can be on IBM i V7R5 and query something over on a V7R2 partition using SQL Tools and it works as expected. That alone is worth the price of admission.
At this point, SQL iQuery had tons of important features such as *JSON, *EXCEL, *CSV, *PDF, email, variables, conditional logic, embedded SQL, web output and much more. We even provide a way to route the output to the IBM ACS CLdownload jar file (similar to what the new V7R5 GENERATE_SPREADSHEET will do) to provide native binary XLSX output where needed. With a stable SQL iQuery codebase, I was focused on turning out more and more SQL Functions via the SQL Tools product.
Because I was struggling to get access to IBM i customer lists for sales/marketing purposes, I decided to approach IBM and ask them to partner with me in offering SQL iQuery to IBM i customers as a modern replacement for the now-withdrawn Query/400 product. They seemed to have a serious discussion about the offer, but ultimately decided it might compete with their Db2 Web Query efforts and declined.
At the time, there were only two third-party products that provided similar capability: Db2 Web Query, which as of today is no longer an IBM offering, and another third-party product, which IBM does not OEM and which can cost several tens of thousands of US dollars up front and thousands more each year in SWMA. I was selling SQL iQuery for about $3,000 at the time, plus $500 per year in SWMA. But although people told me, "That's less than our annual SWMA costs for XYZ," I had very little luck getting through to this market.
I tried going to regional and national user group events and conferences, and while we did get a lot of leads, we only closed a few dozen orders. Another vendor at these events suggested that people perceive value in price, and that I should raise my price to $10,000 or more so customers would take it seriously. Instead of doing that, I decided to cut the upfront price entirely and go with the rare but once-popular SWMA-only option. I mean, really, why charge an upfront fee and then charge for support after 90 days? If a product is good enough, customers would flock to the SWMA-only option, or so I thought.
After about two years of paying thousands to attend each conference as a vendor, nothing was working. I decided to ask the other vendors about their situations to see if it was "just me" or a market-wide phenomenon. Over the next two years, I expressed my challenges to the other vendors and solicited their advice. With only one exception, all the long-time vendors had the same response:
“We don’t get (m)any new customers from these events. We attend the show to remind the community we are still here; we want our existing customers to continue to pay their support fees.”
This isn’t the best news if you’re trying to promote a relatively new product and have another product in the pipeline (SQL Tools was about to premiere). So I went back into Consulting for a handful of clients to produce revenue; all while improving, refining and enhancing SQL iQuery and building a load of SQL functions for IBM i in the (then) upcoming SQL Tools product.
Earlier this year we went through a transition where we tried various marketing approaches. We dropped the upfront license fee (i.e., went to a SWMA-only model), we then combined the products into one package (SQL World) but eventually we went back to two distinct products with an upfront fee and annual SWMA.
Today we have decided to return to the SWMA-only model with no upfront, one-time charge (OTC). The SWMA-only model is a little different from a subscription model.
SWMA-Only Model: No upfront fee. Pay the first year's SWMA within 90 days and you get a license key for that version/release/modification level of the product. The license key never expires, but it does not work on future releases. When you pay the next year's SWMA, you get a new license key that can be installed at any time (even in between installing updated releases) and used for the then-current release level.
Subscription-based pricing: No upfront fee. Pay your subscription fee and you get a key for the term of the subscription. The key expires shortly after the expiration grace period. That is, the software no longer functions after about 1 year unless you pay your renewal fee. If you pay your subscription renewal fee, you get a new license key for the next term of the subscription. This will hopefully be automated for those connected to the internet.
If you need to create reports or CSV, PDF, or Excel files and automatically email them to your users, consider SQL iQuery for IBM i. It is a great product, probably the best I've ever created. You can download a no-charge trial today at our website, http://www.SQLiQuery.com. Note that we do not pay for an SSL certificate, so even though your browser may say the site is "not secure," that by no means indicates that the website is unsafe. So check it out today!
Now, about that withdrawal of Db2 Web Query for i, which IBM announced a few weeks ago. IBM considered my SQL iQuery a competitor to the Db2 Web Query offering. I found Db2 Web Query to be just a BI tool that also read Query/400 definitions and transformed them into Db2 Web Query queries (viewed on the web).
SQL iQuery isn't the same kind of product. It is a replacement for Query/400, but rather than using legacy prompts and whatnot, it allows developers and users to build SQL scripts that produce what they need. It can output to HTML or JSON and generate a Google Chart from the results, as well as send output to CSV, PDF, Excel, print, etc. So you can do a lot of what you could with Db2 Web Query, but they are different products with different mindsets. iQuery is mostly for IBM i developers and users who want to run it from the command line, job scheduler, CL, etc. It also has a very good web component that is used by a number of customers as their web presence. But that's not widespread.
Bob Cozzi helps IBM i clients solve complex issues using SQL, RPG IV, or C/C++ as well as the SQL iQuery licensed program (the best way to modernize your Query/400 with SQL). Visit his website at: http://www.SQLiQuery.com. Cozzi is also the author of The Modern RPG Language, developer of SQL iQuery and SQL Tools, and a speaker on SQL and RPG IV topics you can actually use.