Quality assurance is more than manual testing. Today’s systems are more complex than traditional approaches can handle. Automated test techniques have evolved that allow systems to be tested exhaustively and automatically for errors. Compared with traditional manual testing, automation can save over 80% of the cost and time while producing much higher uptime in the initial release of a new or replacement system.
Test verification techniques have evolved to the point where it is now possible to demonstrate that every identified test scenario has been exercised, sharply reducing the chance that functional bugs survive into the final version. For replacements of, or upgrades to, existing systems, automated testing efficiently executes regression tests that demonstrate the new version of the system is no worse than the original (it does not generate different output for the same input). We also have proprietary techniques that score the predictive risk of a project at any point in time, so that the number of defects and the amount of downtime a user is likely to experience can be estimated, with up to 97% certainty, before a system is released.
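The regression idea above can be sketched in a few lines of Python. This is a minimal illustration, not our actual tooling; the function names, sample inputs, and interest calculation are hypothetical stand-ins for a real legacy system and its replacement.

```python
# Minimal regression-test sketch: the new version must produce the same
# output as the original for every recorded input ("no worse than the
# original"). All names and data here are hypothetical.

def legacy_interest(balance: float, rate: float) -> float:
    """Stand-in for the original system's calculation."""
    return round(balance * rate, 2)

def new_interest(balance: float, rate: float) -> float:
    """Stand-in for the replacement system's calculation."""
    return round(balance * rate, 2)

# Inputs captured from production traffic (illustrative sample).
recorded_inputs = [(1000.00, 0.05), (2500.50, 0.031), (0.0, 0.05)]

def run_regression(inputs):
    """Return the inputs whose outputs diverge between the two versions."""
    return [args for args in inputs if legacy_interest(*args) != new_interest(*args)]

failures = run_regression(recorded_inputs)
assert failures == [], f"regression: differing outputs for {failures}"
```

In practice the recorded inputs would come from captured production traffic or a curated test corpus, so the comparison covers the behaviour users actually exercise.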
This risk management process supports the identification, measurement, and mitigation of project risks. Beyond lower cost, faster time to market, and higher customer satisfaction, a more sophisticated approach to Quality Assurance also ensures adherence to security and compliance requirements.
Four types of risk must be mitigated in order to successfully release an enterprise system today:
• Functional Risk: Functional risk relates to whether the system meets the users’ requirements and delivers the expected results.
• Non-Functional Risk: Non-functional risks are associated with “behind the scenes” issues such as system performance and response times under load from many simultaneous users. Testing for these risks measures system stability and durability over long periods of time (endurance testing) and detects memory leaks that cause gradual deterioration and eventual crashes. It also covers graceful handling of, and recovery from, component failure, and the handling of sudden spikes in data, connections, or number of simultaneous users.
• Deployment Risk: Deployment risk measures the likelihood of problems being introduced when delivering and installing software. SDLC activities that affect deployment risk include source control, branching, version control, automated build, packaging, and installation, along with rollback planning. These risks are mitigated by a strong discipline of Configuration Management (CM) or Change Management.
• Compliance Risk: Compliance risk measures the business impact of violating laws, rules, regulations, and industry best practices. Software is subject to continually increasing operational and security compliance demands resulting from industry regulations and controls such as Sarbanes-Oxley (SOX), SAS 70, Segregation of Duties (SoD), Migration Integrity (MI), and Access Administration (AA).
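The non-functional risk above can be made concrete with a small load-test sketch. This is an in-process illustration only, under the assumption that `handle_request` stands in for a real system endpoint; a real test would drive the deployed system over the network with a dedicated load tool.

```python
# Minimal load-test sketch for non-functional risk: fire many simultaneous
# "users" at an endpoint and check the 95th-percentile response time stays
# under an agreed threshold. The endpoint and threshold are hypothetical.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload: int) -> int:
    """Stand-in for a system endpoint under test."""
    time.sleep(0.001)  # simulate a small amount of work
    return payload * 2

def measure_latency(payload: int) -> float:
    """Time a single request in seconds."""
    start = time.perf_counter()
    handle_request(payload)
    return time.perf_counter() - start

# Simulate 50 simultaneous users.
with ThreadPoolExecutor(max_workers=50) as pool:
    latencies = sorted(pool.map(measure_latency, range(50)))

p95 = latencies[int(len(latencies) * 0.95) - 1]
assert p95 < 0.5, f"p95 latency {p95:.3f}s exceeds threshold"
```

Endurance testing follows the same pattern run for hours or days, watching for latency drift or memory growth that signals a leak.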
Example: Financial Services Company
A large Financial Services company had a mission-critical system with an opportunity cost of $60 million per hour of downtime. Their downtime was one hour per day (roughly 96% uptime) before engaging with us. Over an 18-month period, our SDLC and Quality Assurance techniques were applied, resulting in an increased uptime of 99.98% that was sustained for the following five years. Unexpected downtime was reduced to one hour every three years, rather than one hour every day, mitigating $8.4 billion a year in risk, or $42 billion over the lifetime of the project. An added benefit of this new process is that it generated, captured, and stored all the required compliance evidence, with digital signatures in the correct chronological order, facilitating passage of bi-annual audits with zero exceptions, issues, or comments.