NEWS & ARTICLES

VetsEZ’s Release Readiness Tool Offers Consistent and Accurate Results to Self-Assessing Development and Sustainment Teams Working in VA’s Evolving DevOps Environment

Using Systems Analysis internally to build a toolbox that organizes and streamlines our work.

An Understanding of VA Software Release Audits

The Release Readiness Office (RRO) in the Department of Veterans Affairs (VA) was previously responsible for auditing all software releases going to Production environments. The RRO was composed of Release Agents who provided quality assurance guidance, reviewed assessments, and reported findings to teams working on VA software. The RRO audited each project against a specific set of Acceptance Criteria to produce a Release Readiness Report (RRR) that was given to the Portfolio Director, the Product Owner, and the Receiving Organization (known informally as the Triad). The RRR provided evaluation results based on VA-approved criteria covering compliance topics such as Design, Engineering and Architecture (DEA), Section 508, Security, Privacy, and Release Management. Once the audit was complete, the Release Agent presented the project team with a score and a recommendation on whether the newly developed code was ready to be implemented. Only code that received a score of 100% received the “Ready to Release” recommendation in the RRR. With these findings, the Triad then made the final release decision.
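The all-or-nothing scoring rule described above can be sketched in a few lines. This is a minimal illustration only: the function name, criterion IDs, and the label for a non-passing recommendation are hypothetical, not the RRO’s actual schema.

```python
# Minimal sketch of the scoring rule described above: only a 100% pass rate
# across the applicable Acceptance Criteria yields "Ready to Release".
# Function name, criterion IDs, and the "Not Ready" label are hypothetical.

def score_release(results: dict[str, bool]) -> tuple[int, str]:
    """Map per-criterion pass/fail results to an RRR score and recommendation."""
    pct = round(100 * sum(results.values()) / len(results))
    recommendation = "Ready to Release" if pct == 100 else "Not Ready to Release"
    return pct, recommendation

print(score_release({"AC-001": True, "AC-002": True}))   # (100, 'Ready to Release')
print(score_release({"AC-001": True, "AC-002": False}))  # (50, 'Not Ready to Release')
```

A single failed criterion is enough to withhold the recommendation, which is what made consistent, unambiguous Pass/Fail definitions for each criterion so important.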

The Release Agent auditing process was a complex ship to steer, with well over one hundred (100) Acceptance Criteria, not all of which applied to every release type. Additionally, changes to the Audit process were published quarterly.

 

The Evolution of Unclear and Inconsistent Release Readiness

Historically, Release Readiness auditing processes were focused only on software development (i.e. system enhancements) projects due to staffing limitations. In 2018, the RRO increased its Release Agent staffing by 50%. These additional resources enabled VA to assign Release Agents to audit sustainment (i.e. software maintenance) projects, which allowed for a more comprehensive audit of code being promoted to production environments. However, this introduced another challenge for Release Agents, because the process for auditing sustainment projects was being defined while projects were being audited. The determination of which of the 100+ Acceptance Criteria applied to which project was based largely on which Portfolio (Health, Benefits and Memorials, Enterprise Services, and Corporate) was overseeing the project. In April 2019, the DEA and 508 Compliance groups changed the Acceptance Criteria selection logic to be based upon whether the project was funded with software development or sustainment budgets. Due to this change, software development projects followed one set of release readiness validations and sustainment releases followed another. The frequent updates and variations in auditing requirements made it difficult for all parties involved to define the correct Release Readiness requirements (i.e. Acceptance Criteria) from one project to the next.

Project teams were often unaware of the full list of Acceptance Criteria required for their project at the onset of preliminary planning, which caused confusion and required time and effort to clarify. Understanding of, and consensus on, release needs was further complicated by the large amount of inaccurate and out-of-date testing tool process documentation stored in various repositories throughout VA. Issues with Acceptance Criteria and requirements testing were compounded by the limitations of project teams’ testing tools. Solving these issues and identifying the correct Acceptance Criteria for a particular project became highly complex and time consuming for both the Release Agents and the project teams. A further issue that haunted project teams related to the Test Scripts (i.e. Master Test Plan) for the Acceptance Criteria. At one time, there were Test Scripts that outlined how development teams should validate the Acceptance Criteria. The Test Scripts, however, did not cover sustainment projects, were not consistently available to the project teams, and eventually became largely obsolete due to a lack of proper maintenance and upkeep.

The methodologies Release Agents used to assign the RRR evaluation score and recommendation varied from agent to agent. Furthermore, Release Agents occasionally changed during a project’s lifespan, which meant that a single project team could be subjected to one Release Agent’s preferred scoring methods and then, when the Release Agent changed, to another’s.

 

VetsEZ Resolves Release Readiness Challenges

Equipped with subject matter expertise in the issues faced by both the RRO and project teams, VetsEZ created a dynamically driven tool to define Acceptance Criteria needs based on the release type (development versus sustainment). This solution involved the standardization of source documentation that was fed into an Excel-based VBA program (called the “Wizard”) to produce a tailored, standardized set of Acceptance Criteria requirements and job aids based upon the specific nature of the release. In addition, VetsEZ’s subject matter expert revamped the Acceptance Criteria’s legacy Test Scripts, giving software development and sustainment teams clear and specific instructions for their project release readiness requirements. Once updated, each Test Script clearly articulated its Pass/Fail criteria in an unambiguous manner, so that the RRR evaluation score and recommendation would be consistent from Release Agent to Release Agent. Because VA was also moving away from a requirement to use the Rational Tool Suite (RTS), two versions of the Master Test Plan were produced: one for projects that leverage RTS for testing purposes (complete with screenshots and job aids) and another that was tool agnostic. To accommodate future variations in testing tools used by project teams, the Master Test Plan was also designed so that the RTS-specific version could serve as a prototype for additional versions for tools such as GitHub or Jira. Regardless of what tool or technology a team used to execute the Acceptance Criteria testing, Release Readiness testing was clearly and uniformly understood across all teams.
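The Wizard’s tailoring logic can be pictured as filtering a criteria catalog by release type. The sketch below is illustrative only: the real Wizard is an Excel/VBA program, and the criterion IDs, topics, and applicability tags here are invented examples, not VA’s actual Acceptance Criteria.

```python
# Illustrative sketch only: the actual Wizard is Excel/VBA, and these criterion
# IDs, topics, and applicability tags are invented, not VA's real criteria.
from dataclasses import dataclass


@dataclass(frozen=True)
class Criterion:
    cid: str
    topic: str                      # e.g. "DEA", "Section 508", "Security"
    release_types: frozenset[str]   # release types the criterion applies to


CATALOG = [
    Criterion("AC-001", "Section 508", frozenset({"development", "sustainment"})),
    Criterion("AC-002", "DEA", frozenset({"development"})),
    Criterion("AC-003", "Security", frozenset({"development", "sustainment"})),
    Criterion("AC-004", "Release Management", frozenset({"sustainment"})),
]


def tailor(release_type: str) -> list[Criterion]:
    """Filter the full catalog down to the criteria applicable to this release."""
    return [c for c in CATALOG if release_type in c.release_types]


# A sustainment release sees only its applicable subset of the catalog:
print([c.cid for c in tailor("sustainment")])   # ['AC-001', 'AC-003', 'AC-004']
```

Encoding the applicability rules as data rather than tribal knowledge is what lets the selection stay consistent no matter who runs it.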

The Master Test Plan was designed for use by both the Release Agents and the project teams as the final arbiter for all Release Readiness compliance decisions. Designed with modularity in mind, it could be easily updated as Acceptance Criteria evolved. It was also designed to feed cleanly into the Wizard, allowing the Wizard to produce a project-specific (i.e. tailored) listing of Acceptance Criteria so that teams tested only what was required to receive the “Ready to Release” recommendation.

 

Outcomes and Benefits

The VetsEZ subject matter experts created the Wizard to resolve many of the challenges faced by project teams, Release Agents, and other interested parties. The tool provided a clear understanding of the auditing requirements applicable to a specific release.

A tailorable system of documentation, with tools responsive to the dynamic nature of life at VA, is much needed as VA continues to define how DevOps will work in a tool-agnostic environment. The Wizard can produce a comprehensive, release-appropriate set of Acceptance Criteria, coupled with a set of guidelines and tools for the project teams to self-assess compliance.

The following provides a list of overall benefits of the Wizard:

  • The Wizard provides a clear definition of validations to perform to ensure code is production-ready
  • The automation of Acceptance Criteria selection by the Wizard (instead of by individuals)
    • Provides a dramatic reduction in the amount of time required to determine compliance requirements and results for project team self-assessments
    • Provides a consistent approach in an environment where project teams are self-assessing instead of being overseen by the Release Readiness Office
  • The Wizard is already equipped to be utilized with a variety of testing tools and scalable to keep up with the VA’s evolving DevOps requirements

“I love working at VetsEZ because they encourage their employees to think outside the box, empowering them to brainstorm and solve problems well beyond their job descriptions. I developed the Wizard because I watched a team manually process enormous amounts of data and I saw a way to put several decades of IT experience to good use. Why do something manually when you can use the power of computing? It was also an enormous amount of fun!”

Nancy Burak
SME & Creator of the Wizard, VetsEZ