Testing At iSeries Shops Not Up to Snuff, Original Finds
January 18, 2005 Alex Woodie
The importance of software testing was made painfully obvious to Bill Gates a couple of weeks ago. During his keynote at the Consumer Electronics Show, the Microsoft chairman’s Media Center PC crashed during a demo, and other Microsoft executives had similar problems. While most midrange programmers would bristle at any comparison of Microsoft’s wares to their own, the iSeries world still has a thing or two to learn about good testing procedures, according to a report commissioned by testing tool vendor Original Software Group.
In October 2004, Original Software commissioned an independent study to determine how iSeries shops test their software and how much they spend on testing. About 130 companies in the United States, Canada, Europe, and Australia participated in the study, which was conducted online. Most of the respondents were team-leader and management-level personnel involved in development and quality control.
One of the major goals of the study was to determine how much time and money companies spend on software testing, which the study calculated by multiplying the number of workers devoted to testing by those workers’ compensation. Based on the amount of time that development, quality assurance, and user-acceptance-testing personnel spend on testing (about 18 hours per week on average), and the average cost of those personnel ($64 per hour), the study put the total annual expenditure on testing personnel at almost $1.2 million per company.
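Assuming a 52-week working year (an assumption on our part, not a figure from the survey), the study's averages can be sanity-checked with a quick calculation; the roughly 20-person team size it implies is an inference, not a number Original reported:

```python
# Back-of-the-envelope check of the survey's testing-cost figure.
# The hours and hourly rate come from the survey; the 52-week year
# and per-company interpretation of $1.2 million are assumptions.
HOURS_PER_WEEK = 18   # average testing hours per person (survey)
RATE_PER_HOUR = 64    # average loaded cost per hour (survey)
WEEKS_PER_YEAR = 52   # assumed working weeks per year

annual_cost_per_person = HOURS_PER_WEEK * RATE_PER_HOUR * WEEKS_PER_YEAR
print(annual_cost_per_person)       # 59904, i.e. roughly $60,000 per tester

# Headcount implied by a ~$1.2 million annual testing spend
implied_team_size = 1_200_000 / annual_cost_per_person
print(round(implied_team_size, 1))  # 20.0
```

At those averages, the $1.2 million figure works out to a testing staff of about 20 people per responding company.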
The survey found that testing is conducted manually at most shops, with less than 20 percent using any kind of automated testing tool. (Original Software’s business is making automated testing tools, so the fact that four out of five iSeries shops have no tools bodes well for Original’s business strategy.) There were several reasons given for not using testing tools, including not enough time (35 percent), cost too high (33 percent), hassles of script and data maintenance (26 percent), and complexity (18 percent), according to Original.
Not surprisingly, most respondents to Original’s study (87 percent) feel there is at least “some room for improvement” in their testing methodologies, according to the study results. Only 14 percent felt their testing methodologies were “highly efficient,” which suggests that even having testing tools in place is no guarantee of a highly efficient testing process.
Original says the survey shows that software testing is not being taken seriously enough. “The results of our survey reveal that poor software testing seems to be the default in IT departments,” says Gus Kenyon, Original’s marketing director. “This is in spite of numerous disasters and corporate exposures having been reported in the press, problems caused by poor software quality.” The survey also indicates that IT departments have an inadequate understanding of automated software testing, he says.
The problem is not a lack of spending on testing, as even Microsoft, by far the wealthiest company in the history of high technology, continues to struggle with software quality. “It should be encouraging that companies are devoting so much effort and money to software testing,” says George Wilson, Original’s vice president of operations. “But the continuing levels of exposure, the lack of test automation, and the available levels of improvement all seem to indicate that the time and money is being spent inefficiently.”
In 2002, the federal government estimated that software bugs cost the country $59.5 billion per year. That number is likely increasing as companies continue to push developers and testers to accomplish more with fewer resources, according to Original. Regulatory compliance is also playing a role in directing where companies expend their testing resources; 46 percent of the respondents to Original’s survey said they are working to meet a regulatory mandate.
Original’s answer to the problem is to eliminate manual testing and replace it with tools that automate testing, such as its TestBench suite for the iSeries and other platforms. “Historically we have shown that, with our TestBench suite, we can save a company a minimum of 30 percent of the time and resources they are spending on testing,” Wilson says. “That amounts to over $300,000 per year for the average company.”