Auditing Object Changes In A Production Environment
September 29, 2015 Ron Adams
Like many business environments today, my shop consists of just one person wearing many hats to support the IBM i system. This makes for a tight, controlled environment, with that one person managing the system very efficiently. However, most auditors aren't too pleased with this setup, as it opens the door for potential violations of established compliance guidelines. Even though we are a small shop with a minimal budget, we were able to easily develop a solution to satisfy the auditors. Here's what we did.
Most environments like ours are run by individuals with high integrity and dedication to their jobs who would never dream of malicious activity. Nonetheless, auditors still want some reassurance and proof. Our business falls under the Sarbanes-Oxley (SOX) compliance guidelines, which dictate that we should have at least two distinct individuals: one to write the code and another to approve it and move it into the production environment. A structure like this is the ideal setup in the eyes of most auditors, but it's difficult to justify for smaller IT departments.
The problem becomes even more difficult if you try to involve another business person with little or no understanding of IBM i commands and all the complexities of promoting code. Some auditors allow exceptions for the developer to move the code, but they want to see a well-defined sign-off and review process. They also want to see a log of when the code moved so they can check it against the approval process. What to do? How can an environment with limited technical resources provide the evidence to prove that no unauthorized changes to the core business applications are being made in production outside the approval process?
We wrestled with this problem for a couple of years. We looked at solutions from several providers, all of which offered great features and robust reporting. In the end, we settled on creating our own set of utilities. Our solution doesn’t come near the level of sophistication or functionality of some of the commercial products, but it does offer a basic capture of the data necessary to satisfy our auditors.
Digging into the functionality IBM provides in the audit journal (QAUDJRN), we discovered everything we needed. However, the information it contained was overwhelming, especially for the untrained auditor or controller who would be reviewing the data and who most likely knows very little about the technical details of IBM i. We started by looking at the captured data to determine what we needed. For our production environment, this was primarily program (*PGM) and file (*FILE) object types in specific libraries.
We already had audit journaling running and capturing some data, but we needed to tweak the system values to make sure we were catching everything. We set our system values as follows:
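The exact values we used aren't reproduced here, but as an illustrative sketch (not necessarily your correct settings, so verify against your own audit policy), a configuration that captures the entry types extracted later in this article might look like this:

```
/* Illustrative sketch only -- confirm against your own audit policy.    */
/* Enable action auditing and object auditing.                           */
CHGSYSVAL SYSVAL(QAUDCTL) VALUE('*AUDLVL *OBJAUD')

/* Audit the events behind the CO, DO, OM, OR, and OW entry types:      */
/* object creates, deletes, moves/renames, restores, and security       */
/* (ownership) changes.                                                  */
CHGSYSVAL SYSVAL(QAUDLVL) VALUE('*CREATE *DELETE *OBJMGT *SAVRST *SECURITY')

/* ZC (object change) entries are written only for objects with         */
/* object-level auditing active; PRODLIB is a placeholder library name.  */
CHGOBJAUD OBJ(PRODLIB/*ALL) OBJTYPE(*ALL) OBJAUD(*CHANGE)
```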
Once we had set these values and were confident that we were capturing the right data, we proceeded to automate the data extraction process. The data is captured in the audit journal and can be easily extracted using the Copy Audit Journal Entry (CPYAUDJRNE) command for the entry types and specific date range needed at month end.
We created a monthly job, scheduled in our job scheduler (Robot SCHEDULE), to extract the data for the specific entry types. The entry types we extract, and the default output tables CPYAUDJRNE creates for them, are:

CO (Create object) - QAUDITCO
DO (Delete object) - QAUDITDO
OM (Object move/rename) - QAUDITOM
OR (Object restore) - QAUDITOR
OW (Object ownership change) - QAUDITOW
ZC (Object change) - QAUDITZC
The command looks like this:
CPYAUDJRNE ENTTYP(CO DO OM OR OW ZC) OUTFILE(TEMPLIB/QAUDIT) +
           JRNRCV(*CURCHAIN) +
           FROMTIME(&&MONTHSTR 000000)  /* Beginning of month */ +
           TOTIME(&&MONTHEND 235959)    /* End of month */
Robot SCHEDULE, from HelpSystems, allows us to set up dynamic date variables (&&MONTHSTR, &&MONTHEND) for the month start and month end dates. If you're using another scheduler or process, you may need to build a CL wrapper to specify your date range.
After the CPYAUDJRNE runs, we copy the output files from TEMPLIB to a permanent library named AUDDTA, appending to existing files of the same names. We do this to keep the full, unfiltered files for a period of time (up to a year) in case we need to show evidence of the original data.
CPYF FROMFILE(TEMPLIB/QAUDITCO) TOFILE(AUDDTA/QAUDITCO) MBROPT(*ADD)
Next, we created extract tables that are pared-down versions of the IBM-created ones. The tables that CPYAUDJRNE creates contain quite a few data elements that were useless for our reporting, so we trimmed the list to just the columns we needed to capture for audit reporting. These pared-down tables also remain for as long as needed, unlike the originals copied from TEMPLIB, which may be purged from time to time.
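One way to build such a pared-down table is to derive it from the IBM-supplied layout. This is a sketch, assuming the QAUDITCOX name and the column list used by the CO merge script shown later; the same pattern would be repeated for each entry type:

```sql
-- Create the pared-down CO extract table from the full CPYAUDJRNE output,
-- keeping only the columns the merge selects; WITH NO DATA copies the
-- column definitions but no rows.
CREATE TABLE AUDDTA.QAUDITCOX AS
  (SELECT COSEQN, COCODE, COENTT, COTSTP, COJOB, COUSER, CONBR,
          COPGM, COPGMLIB, COUSPF, COSYNM, COSYSSEQ, CORCV, CORCVLIB,
          CORPORT, CORADR, COETYP, COONAM, COOLIB, COOTYP, COATTR
   FROM AUDDTA.QAUDITCO)
  WITH NO DATA;
```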
The next step was to populate the pared-down extraction tables with the specific audit information we needed to see for our production environment. This is done using an SQL script for each data set. The scripts are saved in source members and the Run SQL Statement (RUNSQLSTM) command is used to execute them.
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITCOM)
One of the challenges we faced when trying to build something on our own was defining the list of production libraries and building the queries. We started off with just a handful of libraries, but the list then expanded to 18. At first we handled this in the scripts with an IN clause containing a hardcoded list of library names, but we later created a table (AUDLIBP) to hold all of the library names. We also created a table of object types (AUDOBJP), even though for now we're only querying '*PGM' and '*FILE' object types. The tables are simple: AUDLIBP has just one field for library name, and AUDOBJP just one field for object type.
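As a sketch of those two control tables, assuming the LBLIB and OBOTYP column names referenced by the merge script (the library names here are placeholders, not our actual production libraries):

```sql
-- Single-column control tables driving the extract queries.
CREATE TABLE AUDDTA.AUDLIBP (LBLIB CHAR(10) NOT NULL);   -- production libraries
CREATE TABLE AUDDTA.AUDOBJP (OBOTYP CHAR(8) NOT NULL);   -- object types to audit

-- Placeholder library names; add one row per production library.
INSERT INTO AUDDTA.AUDLIBP VALUES ('PRODLIB1'), ('PRODLIB2');
INSERT INTO AUDDTA.AUDOBJP VALUES ('*PGM'), ('*FILE');
```

Keeping the lists in tables means the SQL scripts never change when a library is added; only a row is inserted.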
Listed below is the source for entry type “CO”. The source code for the other entry types is very similar and captures just the fields we need:
MERGE INTO AUDDTA.QAUDITCOX B
USING (SELECT COSEQN, COCODE, COENTT, COTSTP, COJOB, COUSER, CONBR,
              COPGM, COPGMLIB, COUSPF, COSYNM, COSYSSEQ, CORCV, CORCVLIB,
              CORPORT, CORADR, COETYP, COONAM, COOLIB, COOTYP, COATTR
       FROM AUDDTA.QAUDITCO
       WHERE COOTYP IN (SELECT OBOTYP FROM AUDDTA.AUDOBJP)
         AND COOLIB IN (SELECT LBLIB FROM AUDDTA.AUDLIBP)) AS N
ON B.COSYSSEQ = N.COSYSSEQ AND B.COTSTP = N.COTSTP
WHEN NOT MATCHED THEN
  INSERT VALUES(N.COSEQN, N.COCODE, N.COENTT, N.COTSTP, N.COJOB, N.COUSER,
                N.CONBR, N.COPGM, N.COPGMLIB, N.COUSPF, N.COSYNM, N.COSYSSEQ,
                N.CORCV, N.CORCVLIB, N.CORPORT, N.CORADR, N.COETYP, N.COONAM,
                N.COOLIB, N.COOTYP, N.COATTR);
When all of the SQL statements have completed, we use our reporting tools to build reports from the extract files created by the merge, which can be saved and sent to our controller for review. We take reporting a step further by joining the object names in the extracts to our in-house source management utilities to identify the change number assigned to each promoted code change.
Listed below is the script from the scheduled monthly job.
CPYAUDJRNE ENTTYP(CO DO OM OR OW ZC) OUTFILE(NGSQRYTMP/QAUDIT) +
           JRNRCV(*CURCHAIN) FROMTIME(@@MONTHSTR 000000) +
           TOTIME(@@MONTHEND 235959)
CPYF FROMFILE(NGSQRYTMP/QAUDITCO) TOFILE(AUDDTA/QAUDITCO) MBROPT(*ADD)
CPYF FROMFILE(NGSQRYTMP/QAUDITDO) TOFILE(AUDDTA/QAUDITDO) MBROPT(*ADD)
CPYF FROMFILE(NGSQRYTMP/QAUDITOM) TOFILE(AUDDTA/QAUDITOM) MBROPT(*ADD)
CPYF FROMFILE(NGSQRYTMP/QAUDITOR) TOFILE(AUDDTA/QAUDITOR) MBROPT(*ADD)
CPYF FROMFILE(NGSQRYTMP/QAUDITOW) TOFILE(AUDDTA/QAUDITOW) MBROPT(*ADD)
CPYF FROMFILE(NGSQRYTMP/QAUDITZC) TOFILE(AUDDTA/QAUDITZC) MBROPT(*ADD)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITCOM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITDOM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITOMM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITORM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITOWM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
RUNSQLSTM SRCFILE(AUDDTA/QSQLSRC) SRCMBR(QAUDITZCM) COMMIT(*NONE) SECLVLTXT(*YES) OUTPUT(*PRINT)
Working in a small shop can be challenging at times, especially when dealing with compliance requirements. But with just a little bit of work, a solution can be built using the audit journal that satisfies many of those requirements.