Improving the quality of the products produced by the SRDP Program requires an integrative approach, and while NRAO has a long history of producing high quality products we don't often talk about it.  This page is the beginning of a process to monitor and improve the quality of software products.  A more detailed description of quality management is available in the TTA Software Quality Management Plan.

As a starting point we adopt the following objective function for Software Quality:

    Q = FC / FS - DP - TD

Where Quality (Q) is defined as the fraction of completed features (FC) to the scheduled features (FS), minus the Defects found in production (DP) and any Technical Debt incurred (TD).  I know that this equation has unit problems, but it is intended as conceptual.
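As a concrete illustration of the conceptual equation above, the computation can be sketched as follows (a sketch only; the function name and the example numbers are illustrative and not part of any NRAO tooling):

```python
def software_quality(features_completed, features_scheduled,
                     production_defects, technical_debt):
    """Conceptual quality score: feature-completion ratio penalized by
    production defects and technical debt.  Units are intentionally
    inconsistent, mirroring the conceptual equation in the text."""
    if features_scheduled == 0:
        raise ValueError("no features scheduled")
    return (features_completed / features_scheduled
            - production_defects - technical_debt)

# Hypothetical example: 18 of 20 features completed, with normalized
# penalties of 0.05 for production defects and 0.1 for technical debt.
print(software_quality(18, 20, 0.05, 0.1))
```

The defect and debt terms would need to be normalized onto the same scale as the completion ratio for the score to be meaningful, which is exactly the "unit problem" noted above.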

The feature completion rate can be deduced from the subsystem planning and Jira records, and a process for recording technical debt has been developed by Mark Whitehead here.  This page describes the process for determining the defect rate in production (and related quantities) and how that information is intended to be used.

Defects

Detection Environment

Features and defects are tracked through the Jira system, which provides a natural mechanism for harvesting data for our quality metrics.  An additional field will be available on all tickets in the SSA Project: Detection Environment.   When a new ticket is opened, its Detection Environment should be set to the appropriate value (see below); if a feature ticket is set back to "implementing" from one of the testing states, the field should be updated to the new value (in this way we record the last phase at which a defect associated with a feature was found).
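Harvesting this field can be sketched as below. The JQL query targets the SSA project named above; the custom-field id is a placeholder (every Jira instance assigns its own), and the issue dicts stand in for what the REST search endpoint (GET /rest/api/2/search) returns:

```python
# Tally Detection Environment values over Jira issues.  The custom-field
# id below is a placeholder; substitute the id from your Jira instance.
from collections import Counter

DETECTION_FIELD = "customfield_10100"  # placeholder field id

def build_jql(project="SSA"):
    """JQL selecting project tickets whose Detection Environment is set."""
    return f'project = {project} AND "Detection Environment" is not EMPTY'

def tally_detection_environments(issues):
    """Count Detection Environment values over issues shaped like the
    Jira search API response (each issue has a "fields" mapping)."""
    counts = Counter()
    for issue in issues:
        value = issue.get("fields", {}).get(DETECTION_FIELD)
        if isinstance(value, dict):   # select-list fields arrive as objects
            value = value.get("value")
        if value:
            counts[value] += 1
    return counts

# Example with mocked search results:
issues = [
    {"fields": {DETECTION_FIELD: {"value": "Verification"}}},
    {"fields": {DETECTION_FIELD: {"value": "Operations"}}},
    {"fields": {DETECTION_FIELD: {"value": "Verification"}}},
    {"fields": {DETECTION_FIELD: None}},
]
print(tally_detection_environments(issues))
```

Because the field records only the last phase at which a defect was found, these tallies give the detection-phase distribution directly, with no further bookkeeping.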


Info: Defect Cause

A quick reminder that one of the important pieces of information associated with each defect (maintained outside of Jira) is the cause of the defect, or defect origin. Defects can be attributed to requirements errors, architectural errors, implementation errors, configuration errors, or many other causes. The objective is to identify which causes are most affecting our total system quality and work together to address those causes.

The Detection Environment field has the following valid entries:

  • Implementation: Defect was identified during the implementation phase.

    Note

    Defects happen all the time during development and are fixed as part of the routine implementation process; the intention is not to record those.  Rather, this value identifies when a ticket is opened for whatever reason during this phase (perhaps for reporting within the team, or as a reminder to return to the issue prior to release).  This prevents an empty field from meaning both "it never got filled out" and "it was part of the implementation process".

  • Verification: A defect was identified during the team's testing and verification process.
  • Validation: Defect was identified during feature validation, prior to the Test Readiness Review.
  • System Validation: Defect was found between the Test Readiness Review and the Operations Readiness Review.
  • Operations: Defect was found in the deployed system.
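Given tallies over these environments, the production defect rate mentioned earlier falls out directly. A minimal sketch (the metric definition here is an assumption; the page does not fix a formula):

```python
# Share of recorded defects that escaped to the deployed system,
# computed from a tally of Detection Environment values.
ENVIRONMENTS = ["Implementation", "Verification", "Validation",
                "System Validation", "Operations"]

def production_defect_fraction(counts):
    """Fraction of recorded defects first found in Operations."""
    total = sum(counts.get(env, 0) for env in ENVIRONMENTS)
    if total == 0:
        return 0.0
    return counts.get("Operations", 0) / total

# Hypothetical tally: most defects caught in Verification/Validation.
print(production_defect_fraction(
    {"Verification": 6, "Validation": 2, "Operations": 2}))  # prints 0.2
```

The complement (defects caught before Operations) is a common phase-containment measure and could feed directly into the DP term of the objective function.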