More on RPG Encapsulation

    January 12, 2005 Joel Cochran

    The code for this article is available for download.

    Reader response to my July article, “Encapsulating File Access in a Service Program,” filled my mailbox for some time. Most of the feedback has been extremely positive, with readers wanting information on some of the more advanced techniques I referred to but didn’t have column space to explain.

    Many of the questions had to do with embedded SQL and how to incorporate it into the service program to offer more flexibility. In this article, I address a number of these questions and offer some more techniques to employ in this powerful model. Unlike the basic model I outlined previously, which works easily with either SQL or native I/O, most of these techniques are specific to embedded SQL. Where needed, I will explain the SQL aspects of these techniques.

    Using a Template

    I’m a big fan of program templates, such as the subfile templates I’ve shared in this newsletter before. Well-thought-out templates allow me to quickly create programs of similar types. Since the encapsulation model for service programs takes a lot of typing, and a proper implementation requires one service program per file, it is a prime target for a template. For a template to be successful in this regard, we need to add as many standard elements as possible; the idea is to provide an easily modified starting point.

    In this case, I’ve amended our global variables section to include some fields for our SQL needs:

    // A second data structure for SQL statements
    d SQL_DS        e ds                  extname(MYNAMES) prefix(sql_)
    
    // The file name.  This was in the previous version but unused.  
    // Can be qualified with the library name or made larger to handle 
    // SQL table names.
    d fileName        s             20a   varying inz( 'MYNAMES' )
    
    // The primary key field name
    d keyField        s             10a   varying inz( 'ID' )
    
    // Indicates whether the procedure that declares the SQL cursors
    // and prepares the statements has been executed.
    d isSQLprepared   s               n   inz
    
    // Strings for Prepared Statements
    d sqlSelectStr    s                   like( sql )
    d sqlUpdateStr    s                   like( sql )
    d sqlInsertStr    s                   like( sql )
    d sqlDeleteStr    s                   like( sql )
    

    Implementing these fields would have little or no effect on a native I/O implementation, but if you implement this model with SQL, you can use these fields in your dynamic SQL statements. This avoids duplicating certain efforts and increases flexibility.

    Optimizing the SQL

    One of the potential problems with the basic version of encapsulation is that it uses inline embedded SQL statements. The SQL engine has to generate, analyze, assign an access path to, execute, and destroy each statement every time it executes. Nothing is reused, which adds a lot of unnecessary overhead. To alleviate that performance burden, we are going to use prepared statements. A prepared statement performs all of those tasks once, and the program then holds on to the statement for reuse.

    To accomplish this, I’ve added a mynames_prepare() subprocedure. Since this procedure needs to run only once in the service program’s lifecycle, I added the isSQLprepared variable above. Now the mynames_getRecord(), mynames_update(), mynames_insert(), and mynames_delete() procedures check this variable first and run mynames_prepare() if it is false:

    /free
      if not isSQLprepared ;
        mynames_prepare();
      endif ;
    /end-free
    

    Examine the mynames_prepare() subprocedure in the provided source code, and you’ll see a couple of interesting things. First, if you’ve never used prepared statements before, it provides an easy template to copy. Second, speaking of templates, you’ll notice that all the statement strings use the fileName and keyField variables where applicable, instead of hard-coding those values into the strings themselves. This goes back to templating: changing the initial value of a variable in a D-spec is much easier to maintain, and it makes this approach easy to copy for other applications. You’ll also notice that the procedure returns an indicator showing whether any of the statements encountered a problem, while permitting the rest of the statements to continue. Since this service program and the file it serves should always remain in sync, this indicator is primarily useful for debugging.
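
    If it helps to see the pattern outside RPG, here is a minimal sketch of the same idea in Python with SQLite (the table layout and all names here are invented for illustration, not taken from the downloadable source): the statement strings are composed once from the fileName and keyField template variables, the ? placeholders stand in for the host variables, and a flag ensures the prepare step runs only once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table MYNAMES (ID integer primary key, USERNAME text)")

# Template variables: change these two values to retarget another file.
file_name = "MYNAMES"
key_field = "ID"

prepared = {}            # holds the statement strings, built exactly once
is_sql_prepared = False  # plays the role of the isSQLprepared indicator

def mynames_prepare():
    # Compose every statement from the template variables instead of
    # hard-coding the file and key names into the strings themselves.
    global is_sql_prepared
    prepared["select"] = f"select * from {file_name} where {key_field} = ?"
    prepared["delete"] = f"delete from {file_name} where {key_field} = ?"
    is_sql_prepared = True

def mynames_getRecord(rec_id):
    if not is_sql_prepared:   # same guard as in the RPG procedures
        mynames_prepare()
    return conn.execute(prepared["select"], (rec_id,)).fetchone()

conn.execute("insert into MYNAMES values (1, 'JOEL')")
print(mynames_getRecord(1))   # (1, 'JOEL')
```

    Retargeting this sketch to another file means changing only the two template variables at the top, which is exactly the maintenance win the D-spec initial values buy you in the RPG version.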

    Implementing the Prepared Statements

    With the prepared statements in the bag, we need to implement them in the database access procedures. The update, insert, and delete statements are executed with their respective prepared statements. The question marks (?) in the prepared statement strings act as variable placeholders, so when we execute a statement we supply a list of host variables with the USING clause. Below is the altered mynames_update() procedure:

    p mynames_update  b                   export
    
    d mynames_update  pi              n
    
    d isUpdated       s               n   inz(*off)
    
     /free
       if not isSQLprepared ;
         mynames_prepare();
       endif ;
       n_UPDUSER = userID ;
       n_UPDSTAMP = %timestamp();
     /end-free
    c/exec sql
    c+ execute updateFile
    c+   using :n_USERNAME,
    c+         :n_USEREMAIL,
    c+         :n_PHONENUM,
    c+         :n_UPDUSER,
    c+         :n_UPDSTAMP,
    c+         :n_ID
    c/end-exec
     /free
       if SQLSTT = '00000' ;
         isUpdated = *on ;
       endif ;
    
       return isUpdated ;
     /end-free
    
    p mynames_update  e
    

    Follow this same model for the insert and delete procedures. You may also want to read up on SQL states in the “SQL Programming Concepts” manual; a full discussion of them is beyond the scope of this article.

    Dealing with Multiple Record Sets

    The encapsulation model from the last article is intended to emulate an object-oriented approach, where the object holds and serves data from a single database record. Frequently, though, what we really want is to work with a subset of records: more than one, but not the whole file. In other languages, this is referred to as a record set. The flexibility of SQL allows us to process these subsets dynamically! This can also be achieved with OPNQRYF, but otherwise cannot be done with native I/O without a ton of logical files and a huge branch statement.

    To process a record set, we are going to create three new procedures. First, using the techniques from the mynames_prepare() procedure, we will create the mynames_getRecordSet() procedure. This procedure receives two parameters: one for the WHERE clause and an optional one for ORDER BY:

    p mynames_getRecordSet...
    p                 b                   export
    
    d mynames_getRecordSet...
    d                 pi              n
    d  whereClause                1024a   const varying
    d  orderByClause              1024a   const varying options(*nopass)
    
    d where           s           1024a   varying inz( ' where ' )
    d orderBy         s           1024a   varying inz( ' order by ' )
    d isOpen          s               n   inz(*off)
    
     /free
       where = where + whereClause ;
       if %parms() = 1 ;
         orderBy = orderBy + keyField ;
       else ;
         orderBy = orderBy + orderByClause ;
       endif ;
    
       sqlRecSetStr = 'select * from ' + fileName + where + orderBy ;
       mynames_closeRecordSet();
     /end-free
    c/exec sql
    c+ declare setCursor cursor
    c+     for setStatement
    c/end-exec
    c/exec sql
    c+ prepare setStatement
    c+    from :sqlRecSetStr
    c/end-exec
    c/exec sql
    c+ open setCursor
    c/end-exec
    /free
       if sqlstt = '00000' ;
         isOpen = *on ;
       endif ;
    
       return isOpen ;
     /end-free
    
    p mynames_getRecordSet...
    p                 e
    

    Before executing any of the SQL statements, we call another new procedure, mynames_closeRecordSet(). This ensures that the cursor is closed before we try to open it based on our new prepared statement:

    p mynames_closeRecordSet...
    p                 b                   export
    
    d mynames_closeRecordSet...
    d                 pi
    
    c/exec sql
    c+ close setCursor
    c/end-exec
     /free
       return ;
     /end-free
    
    p mynames_closeRecordSet...
    p                 e
    

    Now that we have a method for opening a record set, we need a procedure for reading the next record in the set:

    p mynames_fetchRecordSet...
    p                 b                   export
    
    d mynames_fetchRecordSet...
    d                 pi              n
    
    d isFound         s               n   inz( *off )
    
    c/exec sql
    c+ fetch next
    c+  from setCursor
    c+  into :mynames
    c/end-exec
     /free
       if sqlstt = '00000' ;
         isFound = *on ;
       endif ;
       return isFound ;
     /end-free
    
    p mynames_fetchRecordSet...
    p                 e
    

    The procedure returns *off when there are no more records to process. Here is a brief example of using and processing a record set:

        // Retrieves the record set of all records with
        // PHONENUM equal to 8005551212 and orders the
        // results by USERNAME
        // returns true if SQL statement is OK and set is opened
        if mynames_getRecordSet( 'PHONENUM = 8005551212' : 'USERNAME' );
          // Do While loop for records remaining
          dow mynames_fetchRecordSet();
            message = 'User: ' + %trimr( mynames_getUserName() ) +
                     ' EMail: ' + %trimr( mynames_getUserEmail() );
            dsply message ;
          enddo ;
        endif ;
    

    This record set approach could also be enhanced by making the cursor “scrollable,” allowing you to jump around within the record set. Of course, you would need to implement additional procedures such as mynames_fetchFirstRecord(), mynames_fetchPreviousRecord(), and mynames_fetchLastRecord().
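    As a rough illustration of what those extra fetch procedures would do, the following Python/SQLite sketch (all names invented) emulates a scrollable cursor by materializing the record set and moving an index through it; DB2 does this natively when the cursor is declared SCROLL, but the navigation semantics are the same.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table MYNAMES (ID integer primary key, USERNAME text)")
conn.executemany("insert into MYNAMES values (?, ?)",
                 [(1, "ANN"), (2, "BOB"), (3, "CAL")])

# Materialize the record set once; "scrolling" is then just index math.
record_set = conn.execute("select * from MYNAMES order by ID").fetchall()
pos = -1  # positioned before the first record

def fetch_next():
    global pos
    pos += 1
    return record_set[pos] if pos < len(record_set) else None

def fetch_prior():
    global pos
    pos -= 1
    return record_set[pos] if pos >= 0 else None

def fetch_last():
    global pos
    pos = len(record_set) - 1
    return record_set[pos]

print(fetch_next())   # (1, 'ANN')
print(fetch_last())   # (3, 'CAL')
print(fetch_prior())  # (2, 'BOB')
```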

    Additional Procedures

    Hopefully by now you’ve come to see that with this approach you simply add procedures for whatever you need. To handle alternative or multiple keys, add procedures that access the database in the necessary manner. Just make sure that the procedures populate the main data structure, and the rest of the procedures will continue to function as desired. You can also easily add procedures that validate data entry, like validating if a particular name or ID is on file, or if an e-mail address is properly formatted. Need to find the average value of a particular field? Add a procedure. How about one that deletes records based on date? Add a procedure. Want to know the standard deviation of the sum of three fields? You guessed it: add a procedure! The more procedures you add, the more functional the service program.
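    As one concrete example of “just add a procedure,” here is a hedged Python/SQLite sketch (table, column, and procedure names are all invented) of what an average-of-a-field procedure boils down to: a single aggregate query wrapped behind a procedure, so the caller never touches SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table MYNAMES (ID integer primary key, AMOUNT real)")
conn.executemany("insert into MYNAMES values (?, ?)",
                 [(1, 10.0), (2, 20.0), (3, 30.0)])

def mynames_getAverage(field):
    # One aggregate SELECT does all the work; callers just get a number.
    row = conn.execute(f"select avg({field}) from MYNAMES").fetchone()
    return row[0]

print(mynames_getAverage("AMOUNT"))   # 20.0
```

    A delete-by-date or standard-deviation procedure follows the same shape: hide one statement behind one exported procedure.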

    A Final SQL Catch-All Procedure

    As a final backstop for unanticipated file maintenance, you can add a simple catch-all procedure, mynames_executeDynamicSQL(). The parameters for this procedure are the statement type, the values string, and the where string:

    *---------------------------------------------------------------------
     * Execute dynamic SQL statement
     *---------------------------------------------------------------------
    p mynames_executeDynamicSQL...
    p                 b                   export
    
    
    d mynames_executeDynamicSQL...
    d                 pi             5a
    d  type                          6a   const
    d  values                     1024a   const varying options(*omit)
    d  where                      1024a   const varying options(*nopass)
    
    d isOK            s               n   inz( *off )
    
     /free
       if type = 'UPDATE' and %parms() = 3 ;
         executeStr = 'update ' + fileName +
                      ' set ' + values +
                      ' where ' + where ;
         isOK = *on ;
    
       elseif type = 'UPDATE' and %parms() = 2 ;
         executeStr = 'update ' + fileName +
                      ' set ' + values ;
         isOK = *on ;
       elseif type = 'DELETE' and %parms() = 1 ;
         executeStr = 'delete from ' + fileName ;
         isOK = *on ;
    
       elseif type = 'DELETE' and %parms() = 3 ;
         executeStr = 'delete from ' + fileName +
                      ' where ' + where ;
         isOK = *on ;
    
       elseif type = 'INSERT' and %parms() >= 2 and
              %addr( values ) <> *null ;
         executeStr = 'insert into ' + fileName +
                      ' ' + values ;
         isOK = *on ;
    
       else ;
         sqlStt = '99999' ;
    
       endif ;
    
       if isOK ;
     /end-free
    c/exec sql
    c+   execute immediate :executeStr
    c/end-exec
     /free
       endif ;
    
       return sqlstt ;
     /end-free
    p mynames_executeDynamicSQL...
    p                 e
    

    Right away, though, there is a problem with this approach: it opens a serious control gap. This procedure could easily be exploited to run a statement that bypasses controls specifically included in the other procedures. In other words, it seriously contradicts the entire purpose of encapsulation, so think hard before adding this procedure to your service program.

    You might wonder why I even bother to mention it: first of all, if I know the service program is just going to be used internally, or more specifically just by me, then I don’t worry about another programmer maliciously using it. Second, it allows me to solve a one-off problem quickly without needing to add a new procedure that I’ll never use again. It basically allows me to be lazy. As a vendor, I would never use this in a situation where someone else could abuse the procedure: abusing my code is a pleasure I reserve for myself.
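
    The control gap is easy to demonstrate. In this hedged Python/SQLite sketch (schema and names invented), a caller hands the catch-all a crafted where string and wipes the whole file, bypassing whatever record-level checks the dedicated procedures would have enforced:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table MYNAMES (ID integer primary key, USERNAME text)")
conn.executemany("insert into MYNAMES values (?, ?)",
                 [(1, "ANN"), (2, "BOB")])

def execute_dynamic_sql(where):
    # The catch-all splices caller-supplied text straight into the
    # statement, so whatever the caller writes is exactly what runs.
    conn.execute(f"delete from MYNAMES where {where}")

# A "reasonable" call deletes one record...
execute_dynamic_sql("ID = 1")
# ...but a crafted predicate deletes every remaining row, no questions asked.
execute_dynamic_sql("1 = 1")

count = conn.execute("select count(*) from MYNAMES").fetchone()[0]
print(count)   # 0
```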


    Shameless Self-Promotion

    I’ve labeled my encapsulation approach “RPGBeans.” Check out my new site, www.RPGBeans.com. The highlight of the site is a tool I’ve been working on that automatically reads a database file and generates the encapsulating RPG source code. The tool will have some other handy features as well, such as a prototype generator, so drop in and check it out. Until next time, happy coding!

    Joel Cochran is the director of research and development for a small software firm in Staunton, Virginia, and is the author and publisher of www.RPGNext.com. Click here to contact Joel by e-mail.

Volume 5, Number 2 -- January 12, 2005