

Factors to Consider When Implementing Automated Software Testing
By Larry Yang - January-March 2017
Testing is a major component of the software development lifecycle (SDLC), constituting a prominent cost driver for both government and industry organizations. Therefore, many businesses are automating their software testing to save money and improve quality. When considering whether automation is a viable option, organizations must take several factors into account. The purpose of this article is to illuminate these factors.

Software development and integration is a continuous process throughout the acquisition lifecycle. Automated software testing can improve testing capabilities, replacing some of the resource-intensive manual efforts, and it can be executed alone or in conjunction with manual testing. Regardless of the complexity, technical maturity, or requirements of the software program, automated software testing may be a viable option.

Advantages and Disadvantages of Automated Software Testing

Automated software testing is characterized by the following advantages and disadvantages.

Advantages

-- Saves time and money. Automating data staging reduces setup time, and automated regression testing after every software modification, patch, or release replaces much of the manual labor those runs would otherwise require. Overall test execution times drop at minimal incremental cost.

-- Improves software quality by minimizing human error, enhancing the repeatability and consistency of test results, and enabling more thorough testing. Test coverage expands because automated software testing can execute thousands of complex test cases in each run, and lengthy tests can run unattended, for example overnight (see the test-script sketch below).

-- Enables testing that is not practical manually. Load and performance testing can simulate thousands of users, complex scenarios, and concurrent processes. Endurance testing can use the large, highly varied datasets an automation tool is able to generate, allowing more exhaustive testing; testers can run a software system for a long duration and measure its reaction parameters. Longevity testing evaluates a system's ability to handle a constant, moderate workload over an extended period. Race-condition testing, in which the behavior of electronics, software, or another system depends on the sequence or timing of events outside its control, also becomes feasible.

Automated software testing allows identification of potentially catastrophic issues early — before they cause system outages.
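
As a minimal sketch of what one of these automated regression checks might look like, the example below uses JUnit and Selenium WebDriver to script a login verification that would otherwise be repeated by hand after every release. The tools, URL, element IDs, and credentials are illustrative assumptions, not the specific tooling used by the pilot described later in this article.

    import org.junit.After;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    import static org.junit.Assert.assertTrue;

    /** Automates a login check that would otherwise be repeated manually after every release. */
    public class LoginRegressionTest {

        private final WebDriver driver = new ChromeDriver();

        @Test
        public void validUserReachesDashboard() {
            driver.get("https://test-env.example.mil/login");           // hypothetical test URL
            driver.findElement(By.id("username")).sendKeys("testUser"); // staged test account
            driver.findElement(By.id("password")).sendKeys("testPass");
            driver.findElement(By.id("loginButton")).click();

            // The same assertion runs identically on every build, overnight if scheduled.
            assertTrue(driver.getTitle().contains("Dashboard"));
        }

        @After
        public void tearDown() {
            driver.quit(); // release the browser so unattended runs do not leak sessions
        }
    }

Because the script encodes the expected outcome, it can be rerun against every patch or release with no additional tester effort beyond reviewing the results.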

Disadvantages

-- A challenging process that requires specialized resources: an automation architect and automation engineers (often costly and difficult to find); developers with scripting and programming knowledge, e.g., Java or Visual Basic; and subject matter experts (SMEs) with a firm grasp of the application being automated.

-- Additional costs for setup (e.g., purchase of the automation tool, training of staff, and maintenance of test scripts), equipment, lab, and annual software maintenance fees.

-- Errors in automated scripts, though rare, can invalidate the data collected and force retesting.

Scope of Automated Software Testing

Once the advantages and disadvantages have been weighed, the scope of the initiative must be defined. The software application(s) should be analyzed to determine which portions can be automated.

Good candidates are test cases that:
• Process a large amount of data.
• Are time-consuming or repetitious (see the data-driven sketch after this list).
• Are complex or difficult to execute.
• Are prone to human error during manual testing.
• Perform common, frequently executed processes.
• Assess functionality of core and critical business functions.
• Ensure continued software functionality through regression testing.
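
For the data-heavy and repetitious cases above, a single data-driven script can often replace many hand-keyed checks. The sketch below is a hypothetical example using JUnit's Parameterized runner against an illustrative overtime-pay calculation; the class, method, and figures are assumptions made for the sketch, not part of any particular program.

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Parameterized;
    import org.junit.runners.Parameterized.Parameters;

    import java.util.Arrays;
    import java.util.Collection;

    import static org.junit.Assert.assertEquals;

    /** Data-driven test: one script covers many input rows that would be tedious to check manually. */
    @RunWith(Parameterized.class)
    public class OvertimePayTest {

        @Parameters
        public static Collection<Object[]> rows() {
            return Arrays.asList(new Object[][] {
                // hoursWorked, hourlyRate, expectedPay (values are illustrative only)
                { 40.0, 20.0,  800.0 },
                { 45.0, 20.0,  950.0 },   // 5 overtime hours at 1.5x
                { 50.0, 30.0, 1650.0 },   // 10 overtime hours at 1.5x
            });
        }

        private final double hours, rate, expected;

        public OvertimePayTest(double hours, double rate, double expected) {
            this.hours = hours;
            this.rate = rate;
            this.expected = expected;
        }

        // Hypothetical system-under-test logic, inlined here to keep the sketch self-contained.
        private static double pay(double hours, double rate) {
            double overtime = Math.max(0, hours - 40);
            return (hours - overtime) * rate + overtime * rate * 1.5;
        }

        @Test
        public void paysOvertimeCorrectly() {
            assertEquals(expected, pay(hours, rate), 0.001);
        }
    }

Adding another test case becomes a matter of adding one data row, which is why large or frequently repeated datasets are strong automation candidates.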

Checklist for Automated Software Testing

The analysis above will indicate whether your application has a large enough scope to justify automated software testing. The next steps are to obtain leadership approval during the project planning phase to ensure support, and to perform a detailed analysis that predicts the value of implementation using a cost-benefit analysis (CBA) or business case analysis (BCA) framework.

Be realistic about the level of effort (LOE) required: investigate the project's existing LOE so that issues such as concurrent test-environment usage do not result in resource collisions. Determine the return on investment by calculating the total cost of automating your program, the savings and benefits per year, and the overall gain (benefit divided by investment), then calculate the break-even point at which your program will recoup the associated costs.
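
As a simple illustration of that arithmetic, the sketch below computes annual net benefit, overall gain, and the break-even point; the dollar figures are placeholders rather than data from any actual program.

    /** Illustrative ROI and break-even arithmetic for an automation initiative (all figures hypothetical). */
    public class AutomationRoi {

        public static void main(String[] args) {
            double investment = 250_000;      // tool licenses, training, script development, lab setup
            double annualSavings = 120_000;   // manual test labor avoided per year
            double annualUpkeep = 20_000;     // script maintenance and license renewals per year

            double netBenefitPerYear = annualSavings - annualUpkeep;
            double roi = netBenefitPerYear / investment;              // overall gain = benefit / investment
            double breakEvenYears = investment / netBenefitPerYear;   // point at which costs are recouped

            System.out.printf("Net benefit per year: $%,.0f%n", netBenefitPerYear);
            System.out.printf("ROI: %.0f%% per year%n", roi * 100);
            System.out.printf("Break-even: %.1f years%n", breakEvenYears);
        }
    }

With these placeholder figures the net benefit is $100,000 per year, the overall gain is 40 percent per year, and the program recoups its investment in 2.5 years.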

Decide which automated tool(s) to use for your program. Research all possible tool options, list the pros and cons of each tool and decide which one will most benefit your program.

Considerations include:
• Compatibility with system being automated.
• Browsers and operating systems supported.
• Applications supported, e.g., mainframe, back-end, and web.
• Programming languages supported.
• Level of coding experience required among automation engineers.
• Amount of coding required to generate automated test scripts.
• Reusability of test components.
• Load testing capability.
• Reporting and analysis capabilities.
• Ability to record and playback.
• User-friendliness of interface.
• Costs, including procurement of the software/tool; maintenance for tool support, such as critical updates, patches, and bug fixes; support costs, such as licensing fees for team members (plus a few extra licenses as backup); and training costs for the team, if applicable.

Define the requirements for the development and the testing environment. An ideal development setting will simulate a production-like environment. Decide which datasets are needed for automated testing and stage data prior to test execution. Ensure that all development and testing accounts are set up, with appropriate permissions. Finally, establish a process for software and data baselining.
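
One common way to stage data, sketched below, is to reset a known baseline in the test database before each automated run so that results are repeatable. The JDBC connection string, credentials, and table are hypothetical and would be replaced by your program's own test environment details.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    /** Resets a test table to a known baseline before an automated run (connection details are placeholders). */
    public class TestDataStager {

        public static void main(String[] args) throws SQLException {
            String url = "jdbc:postgresql://test-db.example.mil:5432/appdb"; // hypothetical test environment
            try (Connection conn = DriverManager.getConnection(url, "test_user", "test_password")) {
                conn.setAutoCommit(false);

                // Clear any data left over from a previous run so every execution starts from the same baseline.
                try (PreparedStatement wipe = conn.prepareStatement("DELETE FROM accounts")) {
                    wipe.executeUpdate();
                }

                // Load the staged dataset the automated scripts expect.
                try (PreparedStatement insert =
                         conn.prepareStatement("INSERT INTO accounts (id, name, balance) VALUES (?, ?, ?)")) {
                    insert.setInt(1, 1);
                    insert.setString(2, "Test Account A");
                    insert.setDouble(3, 1000.00);
                    insert.executeUpdate();
                }

                conn.commit(); // baseline is now in place for the test suite
            }
        }
    }

Running a staging step like this before every suite keeps test results comparable across builds and supports the baselining process described above.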

Consider whether the project requires a lab, conference room, or both, and whether it should be located in-house or at an external facility. Ensure that the space is adequate to support the team, any observers, meetings, and equipment, and most importantly, consider the software classification and network connectivity requirements. It is best to compare the costs of various options to ascertain the best fit.

Automated Software Testing Team

Develop the knowledge, skills, and abilities (KSAs) required, including programming skills such as Java, C, C++, and Visual Basic. Compose your team: testers, software engineers, database administrators (DBAs), SMEs, and others that may be needed.

Weigh the costs, availability and long-term benefits of various options for project manning. Determine whether existing program resources are available to assist in testing. You may have to create position descriptions and complete necessary hiring actions.

Overall Costs to Implement

Roll up and quantify the funding needed for implementation:
• Automated software tool, including maintenance, support, licensing, and training.
• Labor.
• Location and facility for the development and testing site.
• Contract fees associated with tool procurement, contractor support, and the facility.
• Sustainment and maintenance for automated test-script updates, the repository, and documentation.
• A prototype to ensure proof of concept.

To ensure a successful implementation, the documentation written by the team should account for all project management documents, including:
• Project Structure, Monitoring & Control – business alignment, project initiation and acceptance.
• Project Management Plan (PMP) – scope, schedule, cost and performance.
• Project Charter – mission, objectives, deliverables and member expectations.
• Integrated Master Schedule (IMS) – resource-loaded timeline with milestones.
• Communications Plan – details of stakeholder engagement.
• Data/Reporting Plan – consistent reporting procedure.
• Work Breakdown Structure (WBS) and WBS Dictionary – hierarchy of work elements.
• Organizational Breakdown Structure (OBS) – tiered depiction of work teams/functions.
• Financial Structure of Project – billing elements and network activities.
• Procurement Strategy – specific procurement actions for services or supplies.
• Project Spend Plan – program-level budget development and execution.
• Project Resource Plan – rough order of magnitude (ROM) estimates of manpower, facilities, IT and other operational requirements.
• Project Closeout Plan – facilities, materials, contract, financial and workforce.

Technical execution documents:
• Systems Engineering Plan (SEP) – guide for all technical aspects of the program.
• High-Level System Requirements – top-level requirements, e.g., stakeholder.
• Decomposed System Requirements – breakdown of high-level requirements.
• Requirements Traceability Matrix (RTM) – linking of requirements throughout validation process.
• System Design – engineering design overview for Automated Software Testing.
• Technical Review Action Plan (TRAP) – guide for facilitating Systems Engineering Technical Review (SETR) event, e.g., entry/exit criteria and approval process.
• Automation and Development Guide – coding standards, script-development process, configuration instructions.
• Automated Test Script Suite – developed by test team from manual procedures.
• Detailed Test Plan (DTP) – overall Test and Evaluation structure and objectives.
• Test Report – results and analysis of testing.
• Best Practices and Lessons Learned.

Other considerations must be taken into account when deciding whether to pursue automated software testing, such as industry research: investigate best practices, tips, tool suggestions, and available support, and develop a strategy for how automated software testing will maximize efficiency.

This article outlines the factors to evaluate and the process to follow in implementing automated software testing for maximum success.

Larry Yang has over 12 years of information technology expertise and experience in automated software testing. He has both formal education (MBA, BS in Computer Science and AS in Computer Programming) and technical credentials (SSCP, Security+, Oracle DBA Certified Associate, ITIL v3 Intermediate, and ASTQB Certified Tester Foundation Level). The checklist in this article was developed by Mr. Yang to guide the Test Automation Project.

Testing Pilot Success

Space and Naval Warfare Systems Center (SPAWARSYSCEN) Atlantic was selected by the Director, Innovation, Test & Evaluation, and Technology Requirements (OPNAV N84) to receive funding to pilot and expand the understanding of how automated software testing tools can better support Department of Defense (DoD) software development efforts.

The test automation project was formally chartered and executed via SPAWARSYSCEN Atlantic Standard Business and Tailored Systems Engineering Technical Review (SETR) Processes.

Larry Yang served as the project and technical lead of the pilot which was completed in October 2016. Larry and his outstanding team members, including Phuong Luu, Guillermo Mujica, Nicolas Naugle and John Garrett, created an automated testing solution for software-intensive IT systems that will benefit other software-intensive programs in the Department of the Navy.

A Major Automated Information System (MAIS) Acquisition Category I (ACAT-I) program was selected for the pilot due to its complexity, requirements, technical maturity, and interface count. The project successfully followed the Naval System Engineering Framework and best practices, performed a formal test event, and collected metrics to measure overall efficiency, which proved that testing automation saves time and money while improving software quality.

The team also generated technical documents, engineering processes, and a framework for automated software testing that will serve as a guide for other programs seeking to implement an automated testing solution. This winning team provided all framework and OPNAV deliverables ahead of schedule and was recognized by OPNAV as one of the best projects receiving special funding.

Due to the team’s superb efforts, the pilot was reported to Congress by the Deputy Assistant Secretary of the Navy – Research, Development, Test & Evaluation via the House Armed Services Committee as an example of cost-effective automation efforts.

Example of automated software testing process flow by Larry Yang.

Phuong Luu, automation test engineer, Guillermo Mujica, automation test engineer, Larry Yang, project lead & technical lead and John Garrett, automation test engineer. Not pictured: Nicolas Naugle, automation test engineer. Photo by Joe Bullinger/SPAWAR Systems Center Atlantic
