Hunting Down Storage Hogs
February 24, 2010

Hey, Joe:
My System i 550 disk space keeps filling up, and now it’s approaching 87 percent utilization. I’m worried it will be a major problem once storage tops 90 percent used. I’ve checked everything I can think of. Do you have any ideas for what could be eating up my disk space?
This is an issue most shops face at one time or another. Here are some common places to look for storage hogs on your system.
1. Excessive spooled files–Some shops are spooled file pack rats, hanging onto large numbers of spooled files, sometimes for years. To see whether you have a spooled file problem, use the Work with Output Queue (WRKOUTQ) command to check for output queues that contain too many entries.
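The original article showed the exact command; a standard form that lists every output queue on the system, along with the number of files in each, would be:

```
WRKOUTQ OUTQ(*ALL)
```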
This command displays all your output queues and the number of spooled files each output queue contains, making it easy to determine if your users are keeping too many reports on the system.
2. Junk libraries–Do you have copies of old libraries that you saved before a software upgrade or a major database change? Many shops forget to delete these saved libraries, and the leftover copies wind up cluttering the system. Run the Display Library (DSPLIB) command to determine whether you have any junk libraries on your system.
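The exact command from the original isn't reproduced here, but a common form that lists all user libraries (keeping IBM-supplied libraries out of the way) is:

```
DSPLIB LIB(*ALLUSR)
```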
Look for any libraries that have the literals ‘OLD’, ‘SAV’, or ‘XX’ as the last characters in their names (these are classic tells for library copies). Sometimes a junk library’s name will also contain the date the library was created (e.g., RBT021810). Make sure you back up junk libraries to media before deleting them, in case you later discover they weren’t junk after all.
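A simple save-then-delete sequence covers that advice. Here, RBT021810 is the example library name from above, and TAP01 is a placeholder for whatever tape device name your system uses:

```
SAVLIB LIB(RBT021810) DEV(TAP01)  /* Save the suspect library to tape */
DLTLIB LIB(RBT021810)             /* Then delete it from disk         */
```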
3. Junk files and large files or file members that need reorganization–Several years ago, I developed a procedure for creating a file that contains metadata about all the other files on my system, including the number of deleted records in each file and member. (I described that procedure in an earlier article.) Once created, sort the metafile data in descending order by number of deleted records. This gives you a road map for determining which files can return the most disk space through reorganization.
Also be on the lookout for junk files and old versions of large files. Junk files include old save files that haven’t been removed, as well as test versions of large production files that an IT staffer tucked away in a private library before a database change. You can also use the metafile data to search for these files.
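I can't reproduce the original procedure here, but one standard way to build that kind of metafile is the Display File Description (DSPFD) command with a member-level outfile; the outfile (model file QAFDMBR) includes a deleted-record count field (MBNDTR) for every member. The names QGPL/FILEMETA and MYLIB/MYFILE below are placeholders:

```
/* Dump member-level metadata for all physical files in user libraries */
DSPFD FILE(*ALLUSR/*ALL) TYPE(*MBR) OUTPUT(*OUTFILE) +
      FILEATR(*PF) OUTFILE(QGPL/FILEMETA)

/* Once a candidate is identified, reclaim its deleted-record space */
RGZPFM FILE(MYLIB/MYFILE)
```

Query the outfile sorted by MBNDTR descending to see which reorganizations will pay off the most.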
4. Journal receivers–We recently solved a storage problem when a review turned up 27 journal receivers that weren’t being deleted from the system at the proper time. These receivers consumed almost 9 percent of system storage, and removing them reduced storage utilization from 86 percent to 77 percent, enough to remove the imminent threat of the machine crashing when storage filled up. To view all the journal receivers on your system, run the Work with Journal Receivers (WRKJRNRCV) command.
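In a form that lists every receiver in every library, the command would typically be:

```
WRKJRNRCV JRNRCV(*ALL/*ALL)
```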
Look for a high number of receivers that share the same core name but have different receiver numbers (such as AUDRCV2602 through AUDRCV2623). If you see this pattern, check out the library the receivers belong to with the Display Library (DSPLIB) command.
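For example, if the receivers live in a (hypothetical) library named JRNLIB:

```
DSPLIB LIB(JRNLIB)   /* Lists the receivers with their sizes */
```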
If your suspect journal receivers are consuming a lot of space, you can delete old receivers and change your journal management technique to detach and delete receivers at regular intervals.
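As a sketch of that technique (JRNLIB/AUDJRN and the receiver name are placeholders; only delete receivers you have saved and no longer need for recovery):

```
/* Detach the current receiver and attach a new, system-generated one */
CHGJRN JRN(JRNLIB/AUDJRN) JRNRCV(*GEN)

/* Delete a detached receiver that is no longer needed */
DLTJRNRCV JRNRCV(JRNLIB/AUDRCV2602)

/* Or let the operating system create and delete receivers for you */
CHGJRN JRN(JRNLIB/AUDJRN) MNGRCV(*SYSTEM) DLTRCV(*YES)
```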
5. AS/400 Integrated File System (AS/400 IFS)–In some instances, third-party applications can fill up the AS/400 IFS with ASCII files and other objects. I’ve seen this happen with fax or email packages where old records aren’t purged in a timely manner and several months of documents pile up in an IFS directory. A few years ago, I evaluated four free tools that measure how much space the AS/400 IFS is consuming, so you can determine whether any directories are taking up too much room. One of these tools may help you check for excessive storage usage in the AS/400 IFS.
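If none of those tools is at hand, the operating system’s own disk analysis commands give at least a rough picture of where space is going (they are strongest on libraries and folders, so a dedicated IFS tool will still give better directory-level detail):

```
RTVDSKINF                 /* Collect disk space information        */
PRTDSKINF RPTTYPE(*SYS)   /* Print a system-wide disk usage report */
```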