This page collects everything that concerns Quality Assurance.
The Quality Criteria (QC) Definition Task is responsible for generating the Quality Criteria Documents that drive the Quality Criteria Verification.
Quality Criteria definition is a continuous process driven by the requirements of users and operations communities; however, the verification of new software releases is done against fixed-date releases of the QC documents, due every 6 months. Between these major releases, two draft versions are made available for lightweight review.
The QC releases are done in coordination with the EGI Roadmap and Technology releases.
There are 3 different possible states for the Quality Criteria documents:
Please check the dissemination information in order to ensure that you are using the correct version of the QC documents.
The QC documents collect criteria for capabilities of a specific UMD Major Release. Only one FINAL version of QC documents is active for any given UMD Release. If more than one major UMD release is active, i.e. new products are verified and made available in different UMD repositories, the verification will use the FINAL version of the QC documents for each UMD release. Check the dissemination information to determine which QC version should be used for the verification.
Changes in the criteria documents are triggered by one of the following sources of changes:
Any change to a criterion is tracked in the Related Information field, which includes links to the direct source of change for the criterion (e.g. an RT ticket with a user requirement), and the Revision Log field, which records the history of changes to the criterion.
Moreover, the Quality Criteria Release Notes include an overview of the most relevant changes produced in the whole set of criteria documents.
Number of UMD Roadmap Capabilities defined through validation criteria: This metric tracks the coverage of UMD Capabilities with Quality Criteria. The goal is that for each UMD Capability a set of Quality Criteria is recorded that together defines the quality of software contributed against that UMD Capability. It is recorded as a percentage of covered capabilities.

Number of software incidents found in production that result in changes to quality criteria: This metric tracks the quality of the defined criteria. Any incidents (bugs) found in the EGI Infrastructure should result in a change to the Quality Criteria. The metric also considers security incidents found by the SVG.
Number of "change triggers" that result in an update of the Quality Criteria (Internal): Similar to the previous one (and including it), this internal metric tracks the quality of the defined criteria by counting how many change triggers (as defined previously in the change management section) produce changes in the Quality Criteria.
The main objective of TSA2.3 is to verify the quality of the software provided by the TP before it enters the SR phase and goes to production. By doing so we prevent software that might appear to work, but does not follow the quality criteria defined in TSA2.2, from entering the SR or even reaching production. Some of the reasons for doing the verification before the software enters the staged rollout are:
- Check that the bugs reported in the previous release of the software have been corrected by the TP (in collaboration with the DMSU).
- Software may work well in the SR but might not provide all the required functionality.
- Software might not be secure, well documented, or shipped with the necessary installation rules or licenses.
When a new product is available, the TP has to follow the NSRW. Once the software is correctly uploaded to the repository, the release enters the verification phase. The TP has to provide all the necessary information to the verifier (QCV) so that the QCV can assess that the TP has tested the quality of the software in advance. Depending on the type of release, different actions will be taken by the QCV; the verification process is described in detail in the EGI Verifier Guideline.
RT workflow and Verification Templates links are available on this page. First of all, verifiers must check the QC service mapping to know which tests must be verified for each product. This service mapping is available at QC Verification service mapping, and Verification/Executive Summary templates are available at QC Verification Templates.
Verifiers must fill in all required fields for each product and write a summary of the verification process in the Executive Summary; this summary should include:
Before each UMD release the verification team checks the new RC. To perform this task the SA2.3 VMs include the RC_testing script. This script is available after each SA2 VM instantiation (in the /root/configuration_templates/tools directory) and is able to detect any package inconsistency after the UMD RC repo has been configured. The RC testing process is as follows:
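The actual RC_testing script is not reproduced here; as a rough sketch of the kind of check it performs (the commands in the comments, and the repoclosure-style report format, are assumptions), the consistency test boils down to enabling the RC repository and flagging packages with unresolved dependencies:

```shell
#!/bin/sh
# Illustrative sketch only -- the real RC_testing script shipped under
# /root/configuration_templates/tools may work differently.
#
# On the VM one would first enable the UMD RC repository and produce a
# dependency report (both steps assumed), e.g.:
#   yum clean all
#   repoclosure > /tmp/rc_report.txt

# repoclosure-style reports list each broken package on a "package: ..." line,
# followed by its unresolved dependencies.
count_broken() {
    grep -c '^package:' "$1"
}

rc_repo_ok() {
    # Success (exit 0) only when the report lists no broken packages.
    [ "$(count_broken "$1")" -eq 0 ]
}
```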
The current FINAL version of the Quality Criteria is available at http://egi-qc.github.io/. Testing procedures for generic criteria are available at EGI_QC_Testing; specific criteria are available at EGI_QC_Specific.
Verification of software products is managed through EGI's RT; every product will consist of one or more tickets that include the information about the product and links to the repositories where the packages are available.
External Verification:
If you are performing an external verification, you do not need to manage the RT tickets. Instead you should perform the verification and provide the final report.
In order to start a verification, the verifier must:
In order to better organize the verification activity, a list of core components that MUST be verified has been compiled. The list is available here.
Each product to be verified must be installed in a controlled environment where tests are easily reproducible. The EGI Verification Testbed may be used for this purpose (check the link for requesting new VMs for the products).
Products must be installed (or upgraded) using the packages from the repository linked in the RT ticket. The URL for this repository can be found in the Custom Fields of the RT ticket, within the RepositoryURL attribute. The easiest way of adding the repository is using the files available in the repofiles directory.
Verification is performed using UMD repositories + OS-provided repositories + EPEL (for RH-based distributions). No other external repositories may be used.
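As an illustration of the repository setup step on an RH-based VM (the repo file name, baseurl, and package name below are placeholders, not real UMD values; on a real VM the ready-made files from the repofiles directory should be preferred):

```shell
# Hypothetical example -- baseurl must be the value of the RepositoryURL
# custom field from the RT ticket.
cat > /etc/yum.repos.d/umd-verification.repo <<'EOF'
[umd-verification]
name=UMD product under verification
baseurl=http://example.org/path/from/RepositoryURL/
enabled=1
gpgcheck=0
EOF

yum clean all
yum install some-product   # placeholder package name; use "yum update" when verifying an upgrade
```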
Once the product is installed (or upgraded), the verifier must perform a basic configuration that allows testing of the product features. This configuration should use services already existing in the testbed if they are needed; check the EGI Verification Testbed for a list of available services. Support for the dteam and ops.vo.ibergrid.eu VOs is expected for all services.
Sample configurations for some of the products are available in the configuration-templates GitHub repo.
Each verification must be documented with a final report that includes an executive summary of the result and a summary of the tests performed. The tests to be performed are described in the current QC document; see also below and the http://github.com/egi-qc/qc-tests GitHub repository for more information. The verification report template is available at EGI DocDB #1993 in different formats.
The effort dedicated to testing the criteria should be adjusted depending on the type of release:
Tests for the generic QC are described at http://egi-qc.github.io/. Each criterion's What to check and What to report fields describe what needs to be tested and what to include in the verification report, respectively. More detailed information about the tests is also available at EGI_QC6_Testing.
Specific QC depends on the product to be verified. There are two criteria in this section:
If the Verifier finds any problems or issues with the product (any of the criteria is not met, or there are problems installing/configuring/operating the product), either they are clarified within the ticket by the verification team using the verifiers list (sw-rel-qc(at)mailman.egi.eu) or, if the problem needs the interaction of the Technology Provider, a GGUS ticket should be opened.
If a GGUS ticket is needed:
The product is considered ACCEPTED if all the criteria marked as critical pass; otherwise it will be REJECTED.
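The acceptance rule above is mechanical, so it can be expressed in a few lines; as a toy illustration (the "criterion,severity,result" CSV format and the file names are inventions for this sketch, not an EGI artifact):

```shell
# Decide the verdict from a "criterion,severity,result" listing: one failed
# critical criterion is enough to reject the product.
verdict() {
    if grep -q ',critical,fail$' "$1"; then
        echo REJECTED
    else
        echo ACCEPTED
    fi
}
```

For example, `verdict results.csv` prints REJECTED as soon as any line such as `QC_FUNC_1,critical,fail` appears in the listing.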
Once filled in, the verification report must be uploaded as a new document in DocDB with the following information:
Finally, the result of the verification must be set by changing the RolloutProgress field of the RT ticket(s).
When the process is finished (the product is accepted or rejected) the verifier must also fill in the "Time Worked" RT field to record the actual hours/minutes spent completing the verification process.
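Both RT fields can in principle be updated non-interactively with RT's command-line client; the custom-field syntax, field values, and ticket number below are assumptions to be checked against the local RT setup:

```shell
# Hypothetical rt CLI invocations (ticket id and field values are placeholders).
rt edit ticket/12345 set 'CF.{RolloutProgress}=ACCEPTED'   # result of the verification
rt edit ticket/12345 set timeworked=180                    # "Time Worked", in minutes
```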