The Quality Criteria (QC) Definition Task is responsible for generating the Quality Criteria Documents that drive the Quality Criteria Verification.
Quality Criteria definition is a continuous process driven by the requirements of the user and operations communities. However, the verification of new software releases is done against fixed-date releases of the QC documents, due every 6 months. Between these major releases, two draft versions are made available for lightweight review.
The QC releases are done in coordination with the EGI Roadmap and Technology releases.
There are three possible states for the Quality Criteria documents:
Please check the dissemination information in order to ensure that you are using the correct version of the QC documents.
The QC documents collect criteria for capabilities of a specific UMD Major Release. Only one FINAL version of the QC documents is active for any given UMD Release. If more than one major UMD release is active (i.e. new products are verified and made available in different UMD repositories), the verification uses the FINAL version of the QC documents for each UMD release. Check the dissemination information to determine which QC version should be used for the verification.
Changes in the criteria documents are triggered by one of the following sources of changes:
Any change to criteria is tracked in the Related Information field, which includes links to the direct source of change for the criterion (e.g. an RT ticket with a user requirement), and in the Revision Log field, which records the historical information about changes to the criterion.
Moreover, the Quality Criteria Release Notes include an overview of the most relevant changes produced in the whole set of criteria documents.
Number of UMD Roadmap Capabilities defined through validation criteria: This metric tracks the coverage of UMD Capabilities with Quality Criteria. The goal is that, for each UMD Capability, a number of Quality Criteria are recorded that together define the quality of software contributed against that Capability. It is recorded as a percentage of covered capabilities.
Number of software incidents found in production that result in changes to quality criteria: This metric tracks the quality of the defined criteria. Any incidents (bugs) found in the EGI Infrastructure should result in a change to the Quality Criteria. The metric also considers security incidents found by the SVG.
Number of "change triggers" that result in an update of the Quality Criteria (Internal): Similar to the previous metric (and including it), this internal metric tracks the quality of the defined criteria by counting the number of change triggers (as defined previously in the change management section) that produce changes in the Quality Criteria.
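As a purely illustrative sketch of the first metric, coverage can be computed as the percentage of capabilities that have at least one quality criterion recorded against them. The counts below are made up, not real UMD figures:

```shell
# Illustrative only: sample counts, not actual UMD Roadmap data.
total_capabilities=30
covered_capabilities=24   # capabilities with at least one quality criterion

# Coverage is reported as a percentage of covered capabilities
coverage=$(( 100 * covered_capabilities / total_capabilities ))
echo "UMD capability coverage: ${coverage}%"
```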
The main objective of TSA2.3 is to verify the quality of the software provided by the Technology Provider (TP) before it enters the Staged Rollout (SR) phase and goes to production. By doing so we prevent software that might work, but that does not follow the quality criteria defined in TSA2.2, from entering the SR or even reaching production. Some of the reasons for doing the verification before the software enters the staged rollout are:
- Check that the bugs reported in the previous release of the software have been corrected by the TP (in collaboration with the DMSU).
- Software can work well in the SR but might not provide all the required functionality.
- Software might not be secure or well documented, or might lack the necessary installation rules or licenses.
When a new product is available, the TP has to follow the NSRW. Once the software is correctly uploaded to the repository, the release enters the verification phase. The TP must provide all the necessary information to the verifier (QCV) so that the QCV can assess that the TP has tested the quality of the software in advance. Depending on the type of release, different actions will be taken by the QCV. The verification process is described in detail in the EGI Verifier Guideline.
Links to the RT workflow and Verification Templates are available on this page. First of all, verifiers must check the QC service mapping to know which tests must be verified for each product. The service mapping is available at QC Verification service mapping, and the Verification/Executive Summary templates are available at QC Verification Templates.
Verifiers must fill in all required fields for each product and write a summary of the verification process in the Executive Summary; this summary should include:
Before each UMD release the verification team checks the new RC. To perform this task, the SA2.3 VMs include the RC_testing script. This script is available after each SA2 VM instantiation (in the /root/configuration_templates/tools directory) and is able to detect any package inconsistency after the UMD RC repository is configured. The RC testing process is as follows:
The current FINAL version of the Quality Criteria is available at http://egi-qc.github.io/. Testing procedures for generic criteria are available at EGI_QC_Testing; specific criteria are available at EGI_QC_Specific.
Verification of software products is managed through EGI's RT. Every product consists of one or more tickets that include information about the product and links to the repositories where the packages are available.
If you are performing an external verification, you do not need to manage the RT tickets. Instead you should perform the verification and provide the final report.
In order to start a verification, the verifier must:
To better organize the verification activity, a list of core components that MUST be verified is maintained. The list is available here
Each product to be verified must be installed in a controlled environment where tests are easily reproducible. The EGI Verification Testbed may be used for this purpose (check the link for requesting new VMs for the products).
Products must be installed (or upgraded) using the packages from the repository linked in the RT ticket. The URL for this repository can be found in the Custom Fields of the RT ticket, in the RepositoryURL attribute. The easiest way of adding the repository is using the files available in the repofiles directory.
Verification is performed using the UMD repositories, the OS-provided repositories, and EPEL (for Red Hat-based distributions). No other external repositories may be used.
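As a hedged sketch of the repository setup on a yum-based system (the repository name, baseurl, and target directory below are placeholders, not values from any real RT ticket), a verifier might drop a repo file alongside the OS and EPEL ones before installing the product:

```shell
# Placeholder values: in a real verification REPO_DIR would be
# /etc/yum.repos.d and the baseurl would come from the RepositoryURL
# custom field of the RT ticket.
REPO_DIR=./repofiles-demo
mkdir -p "$REPO_DIR"

# Write the repository definition for the release under verification
cat > "$REPO_DIR/umd-verification.repo" <<'EOF'
[umd-verification]
name=UMD verification repository (URL from the RT ticket)
baseurl=https://example.org/umd/candidate/
enabled=1
gpgcheck=1
EOF

# With the repo in place, the product would be installed or upgraded, e.g.:
#   yum clean all && yum install <product>
echo "wrote $REPO_DIR/umd-verification.repo"
```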
Once the product is installed (or upgraded), the verifier must perform a basic configuration that allows testing of the product's features. This configuration should use the services already existing in the testbed where needed. Check EGI Verification Testbed for a list of available services. Support for the dteam and ops.vo.ibergrid.eu VOs is expected for all services.
Sample configurations for some of the products are available at configuration-templates github repo.
Each verification must be documented with a final report that includes an executive summary of the result and a summary of the tests performed. The tests to be performed are described in the current QC document; see also below and the http://github.com/egi-qc/qc-tests GitHub repository for more information. The verification report template is available at EGI DocDB #1993 in different formats.
The effort dedicated to testing the criteria should be adjusted depending on the type of release:
Tests for the generic QC are described in http://egi-qc.github.io/. Each criterion's What to check and What to report fields describe what needs to be tested and what to include in the verification report, respectively. More detailed information about the tests is also available at EGI_QC6_Testing.
Specific QC depends on the product to be verified. There are two criteria in this section:
If the verifier finds any problems or issues with the product (any of the criteria is not met, or there are problems installing/configuring/operating the product), they are either clarified within the ticket by the verification team using the verifiers list (sw-rel-qc(at)mailman.egi.eu) or, if the problem needs the interaction of the Technology Provider, a GGUS ticket should be opened.
If a GGUS ticket is needed:
The product is considered ACCEPTED if all the criteria marked as critical pass. Otherwise it is REJECTED.
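The acceptance rule above can be sketched as follows. The criterion names, criticality levels, and outcomes are invented for illustration; the real criteria come from the QC documents:

```shell
# A product is ACCEPTED only if every criterion marked critical passes;
# a non-critical failure alone does not reject it. Sample data only.
results="QC_DOC_1:critical:pass
QC_FUNC_1:critical:pass
QC_SEC_1:non-critical:fail"

verdict=ACCEPTED
while IFS=: read -r name level outcome; do
    # Any failing critical criterion flips the verdict to REJECTED
    if [ "$level" = critical ] && [ "$outcome" != pass ]; then
        verdict=REJECTED
    fi
done <<EOF
$results
EOF

echo "$verdict"
```

Here only a non-critical criterion fails, so the verdict stays ACCEPTED; a single failing critical criterion would flip it to REJECTED.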
Once filled, the verification report must be uploaded as a new doc in DocDB with the following information:
Finally, the result of the verification must be set by changing the RolloutProgress field of the RT ticket(s).
When the process is finished (the product is accepted or rejected), the verifier must also fill in the "Time Worked" RT field to account for the actual hours/minutes spent on the verification process.