Software Verification Plan
status: draft
safety: ASIL_B
Software verification#
This document implements Verification Plan (wp__verification__plan).
Purpose#
This document describes the plan for software integration and verification of the project. It gives an overview by linking to other relevant sources and provides further information about the verification and testing activities.
The main purpose of this document is to present the objectives, approaches, and strategies that are defined to contribute to the overall quality of the software project. The target groups of this document are mainly Contributor (rl__contributor) and Committer (rl__committer).
Objectives and scope#
Objectives#
The overall objectives of the project's software integration and verification activities are the following:
Correctness - Verify that the software platform and its modules perform their intended functions correctly and produce the expected results.
Completeness - Ensure that all specified requirements have been implemented.
Reliability - Confirm that the software operates reliably under specified conditions and can handle errors properly.
Performance - Assess the software’s performance to ensure it meets the required performance criteria, such as response time and throughput.
Maintainability - Check that the software is maintainable, with clear documentation and code.
Compliance - Ensure that the software complies with defined processes and guidelines of the project.
Traceability - Verify that all requirements are traceable through the development process and that changes are properly managed and documented according to the processes defined in this project.
Verification scope and constraints#
The software verification aspects described in this document focus on the verification of the software platform and the modules developed within this project. The integration of the platform on a target device and the respective verification and validation should be considered by the distributor of the platform. On-target integration tests that run on reference hardware in the context of this project can be taken as a starting point.
Risks and mitigation#
Potential risks derived from the verification activities and their respective mitigation measures are handled by the Risk management.
Schedules#
The integration of software elements is driven by contribution requests and their respective Lifecycle concept model. The contribution of a feature itself implies that it gets fully verified.
Approach#
General approach#
The objectives defined above are to be achieved by the following approaches.
The project's infrastructure is highly driven by a Continuous Integration (CI) methodology. Thus, not only the integration but also the verification approaches of the project follow this methodology. In addition, the software build and verification processes are fully automated: software elements, infrastructure, tests, and documentation are defined as code. The following sections provide more details about software integration and software verification.
Software integration#
The Continuous Integration approach mentioned above means that software integration is performed with every fully automated software build, at any time.
The following types of integrations are applicable:
New software elements are integrated according to the Contribution Request Guideline (gd_guidl__contr_request_guideline).
Fixes of defects are integrated based on their prioritization as described by the Problem Resolution.
Changes are integrated based on the Create/Discuss Change request (wf__cr_dc_changerequest) and follow the Pull Request Guideline (gd_guidl__pull_request_guideline) like any other artifact.
Levels of integration and verification#
The following levels of integration (2, 3) and verification (1, 2, 3) are defined (see the sketch after this list):
Software unit verification to verify software detailed design
Software component verification to verify the integration of units into a component, and also the integration of smaller component(s) into a component, based on
component architecture and
component requirements
Software feature verification to verify the integration of components into a feature based on
feature architecture and
feature requirements
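The following sketch illustrates how the scope differs between the unit and the component level. It is a minimal, hypothetical GoogleTest example; the names `clamp` and `Limiter` are illustrative only and are not part of the project's code base.

```cpp
#include <gtest/gtest.h>

namespace {

// Hypothetical unit: clamps a value into a given range.
int clamp(int value, int low, int high) {
  if (value < low) return low;
  if (value > high) return high;
  return value;
}

// Hypothetical component that composes the unit behind a public interface.
class Limiter {
 public:
  Limiter(int low, int high) : low_(low), high_(high) {}
  int Apply(int value) const { return clamp(value, low_, high_); }

 private:
  int low_;
  int high_;
};

}  // namespace

// Level 1 - software unit verification: exercises the unit directly
// against its detailed design.
TEST(ClampUnitTest, ReturnsLowerBoundForValuesBelowRange) {
  EXPECT_EQ(clamp(-5, 0, 10), 0);
}

// Level 2 - software component verification: exercises the integrated
// component through its public interface, based on component requirements.
TEST(LimiterComponentTest, AppliesConfiguredRange) {
  const Limiter limiter(0, 10);
  EXPECT_EQ(limiter.Apply(42), 10);
}
```

Feature verification (level 3) follows the same pattern one level up: components are integrated into a feature and verified against the feature architecture and feature requirements.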
Verification Methods#
Verification is based on different methods, and the derivation of test cases can also be based on specific methods. This section gives an overview of the different methods that are applicable in the project. Usually, a given method is not applied on every verification level; therefore, the following tables contain a column that defines the applicable levels. Another column defines whether a method is to be applied when the linked references are QM relevant or ASIL B relevant.
Automated test cases should contain further information about which methods have been applied. The corresponding guidance is given in the Verification Guideline (gd_guidl__verification_guide). The identifier of the respective method is to be used as metadata (TestType and DerivationTechnique).
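As a non-authoritative sketch of how such metadata could be attached in a GoogleTest-based test, the example below uses GoogleTest's `RecordProperty`, which writes key/value pairs into the generated test report. The keys and the requirement identifier shown here are illustrative assumptions; the authoritative conventions are defined in the linked guidelines.

```cpp
#include <gtest/gtest.h>

namespace {
// Illustrative unit under test.
bool IsInRange(int value, int low, int high) {
  return value >= low && value <= high;
}
}  // namespace

// Hypothetical requirements-based test annotated with method identifiers
// from the tables below; the properties end up in the test report where
// tooling can evaluate them.
TEST(SignalRangeTest, RejectsValueAboveUpperBound) {
  RecordProperty("TestType", "requirements-based");
  RecordProperty("DerivationTechnique", "boundary-values");
  RecordProperty("Verifies", "comp_req__signal__range");  // illustrative identifier

  EXPECT_FALSE(IsInRange(/*value=*/101, /*low=*/0, /*high=*/100));
}
```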
| Methods | Identifier | Applicable on level | Applicable for QM / ASIL B |
|---|---|---|---|
| Control Flow Analysis | control-flow-analysis | 1, 2, - | QM & ASIL B |
| Data Flow Analysis | data-flow-analysis | 1, 2, - | QM & ASIL B |
| Fault Injection | fault-injection | 1, -, - | ASIL B |
| Inspection | inspection | 1, -, - | ASIL B |
| Interface Test | interface-test | -, 2, 3 | QM & ASIL B |
| Requirements-based Test | requirements-based | -, 2, 3 | QM & ASIL B |
| Resource Usage Evaluation (only on reference environment) | resource-usage | -, -, 3 | QM & ASIL B |
| Static Code Analysis | static-code-analysis | 1, 2, 3 | QM & ASIL B |
| Structural Statement Coverage (Code coverage) | structural-statement-coverage | 1, -, - | ASIL B |
| Structural Branch Coverage (Code coverage) | structural-branch-coverage | 1, -, - | ASIL B |
| Walkthrough | walkthrough | 1, 2, 3 | QM |
For QM software, some of the methods may be executed with less rigor compared to safety-critical elements; this applies to data-flow-analysis as well as control-flow-analysis.
Static code analysis is part of the Implementation (wp__sw_implementation).
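To make one of the methods above more tangible, the following hedged sketch shows fault injection at the unit level (level 1): a test double forces an error in a dependency so that the unit's defined reaction can be checked. All names are hypothetical and serve only as an illustration.

```cpp
#include <gtest/gtest.h>

namespace {

// Hypothetical dependency interface whose failure is injected.
class Storage {
 public:
  virtual ~Storage() = default;
  virtual bool Write(int value) = 0;
};

// Test double that injects a write failure.
class FailingStorage : public Storage {
 public:
  bool Write(int /*value*/) override { return false; }
};

// Hypothetical unit under test: must enter a degraded mode on failure.
class Recorder {
 public:
  explicit Recorder(Storage& storage) : storage_(storage) {}
  bool Record(int value) {
    if (!storage_.Write(value)) {
      degraded_ = true;  // defined reaction: degrade instead of failing silently
      return false;
    }
    return true;
  }
  bool degraded() const { return degraded_; }

 private:
  Storage& storage_;
  bool degraded_ = false;
};

}  // namespace

// Fault injection: the stub forces the error path and the test checks the
// unit's defined reaction to it.
TEST(RecorderFaultInjectionTest, EntersDegradedModeWhenStorageFails) {
  RecordProperty("TestType", "fault-injection");
  FailingStorage storage;
  Recorder recorder(storage);
  EXPECT_FALSE(recorder.Record(42));
  EXPECT_TRUE(recorder.degraded());
}
```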
Test Derivation Methods#
| Methods | Identifier | Applicable on level | Applicable for QM / ASIL B |
|---|---|---|---|
| Analysis of Boundary Values | boundary-values | 1, 2, 3 | QM, ASIL B |
| Analysis of Equivalence Classes | equivalence-classes | 2, 3 | QM, ASIL B |
| Analysis of Requirements | requirements-analysis | 2, 3 | QM, ASIL B |
| Error Guessing based on Knowledge or Experience | error-guessing | 2, 3 | QM, ASIL B |
| Random Testing | monkey-testing | 3 | QM, ASIL B |
| Explorative Testing | explorative-testing | 2, 3 | QM, ASIL B |
For non-safety-critical (QM) software parts, the rigor of the testing approaches can generally be reduced, but they cannot be omitted completely. It may be possible to reduce the number of boundary values tested based on a risk assessment and to focus on the most impactful boundaries. Similarly, for the equivalence classes the focus can be put on the classes most likely to reveal defects, such as invalid classes, empty/null/zero values, and system limits. Equivalence class analysis should be supplemented by boundary value analysis.
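The hedged sketch below illustrates how boundary-values and equivalence-classes derivation could look for a hypothetical range check; the function, its limits, and the recorded properties are assumptions made only for this example.

```cpp
#include <gtest/gtest.h>

namespace {
// Illustrative unit under test: valid speed range is assumed to be 0..250.
bool IsValidSpeed(int kmh) { return kmh >= 0 && kmh <= 250; }
}  // namespace

// Boundary value analysis: values at and directly beside each boundary.
TEST(SpeedValidationTest, BoundaryValues) {
  RecordProperty("DerivationTechnique", "boundary-values");
  EXPECT_FALSE(IsValidSpeed(-1));
  EXPECT_TRUE(IsValidSpeed(0));
  EXPECT_TRUE(IsValidSpeed(250));
  EXPECT_FALSE(IsValidSpeed(251));
}

// Equivalence classes: one representative per class. For QM parts the focus
// may be reduced to the classes most likely to reveal defects.
TEST(SpeedValidationTest, EquivalenceClasses) {
  RecordProperty("DerivationTechnique", "equivalence-classes");
  EXPECT_TRUE(IsValidSpeed(120));   // valid class
  EXPECT_FALSE(IsValidSpeed(-40));  // invalid class: below range
  EXPECT_FALSE(IsValidSpeed(900));  // invalid class: above range
}
```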
Quality criteria#
The quality criteria of the software verification activities are defined in the following table. The defined goals are to be reached with every contribution.
| # | Criterion | Goal for QM | Goal for Safety |
|---|---|---|---|
| 1 | Structural Statement Coverage | 85% | 100% |
| 2 | Structural Branch Coverage | 85% | 100% |
| 3 | Verification coverage of software detailed design (test coverage) | 100% | 100% |
| 4 | Verification coverage of software architecture design (test coverage) | 100% | 100% |
| 5 | Verification coverage of software requirements specifications (test coverage) | 100% | 100% |
| 6 | Relative amount of executed tests | 100% | 100% |
| 7 | Relative amount of failed tests | 0% | 0% |
Further quality goals are defined in section Quality management.
Coverage of detailed design#
Besides Component Integration test (wp__verification__comp_int_test) and Unit test (wp__verification__sw_unit_test), the following aspects define the coverage of the detailed design:
Statement/Branch/Path coverage as defined by their specific thresholds (see the sketch after this list)
Static analysis and Linting
Code Inspection (wp__sw_code_inspect) for safety-critical implementation
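To make the difference between the structural coverage metrics concrete, the following sketch (hypothetical code) shows a unit with a single decision point: one test already yields full statement coverage, while full branch coverage requires the decision to be evaluated to both outcomes.

```cpp
#include <gtest/gtest.h>

namespace {
// Hypothetical unit under test with one decision point.
int Saturate(int value) {
  int result = value;
  if (value > 100) {
    result = 100;
  }
  return result;
}
}  // namespace

// This test alone executes every statement (100% statement coverage),
// because the if-body is entered.
TEST(SaturateCoverageTest, LimitsLargeValues) {
  EXPECT_EQ(Saturate(150), 100);
}

// Full branch coverage additionally requires the decision to evaluate to
// false at least once, which this test adds.
TEST(SaturateCoverageTest, PassesSmallValuesThrough) {
  EXPECT_EQ(Saturate(42), 42);
}
```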
Coverage of architectural design#
Besides Component Integration test (wp__verification__comp_int_test) and Feature Integration test (wp__verification__feat_int_test), the following aspects define the coverage of the architectural design:
Component Safety Analyses (wp__sw_component_safety_analyses) for safety-critical parts
Feature Safety Analyses (wp__feature_safety_analyses) for safety-critical parts
Each architectural element has at least one test case linked with attribute “fully verified” or multiple test cases with attribute “partially verified”.
Coverage of software requirements specifications#
For a release, all valid requirements need to have complete test coverage through linked test cases.
Test development#
The verification steps as well as the development of test cases are performed along with the implementation of the code. Full automation of tests should be achieved, and the derived test cases should contain metadata that gives further information as defined in Test Linking to Requirements. The list of relevant work products is shown above (as part of the development of the product).
The different environments that can be used for the test development are defined below.
Pre-existing test cases#
The recommendations of the Verification Guideline (gd_guidl__verification_guide) for pre-existing test cases are followed. Any pre-existing test case (e.g., from OSS components) is reviewed and adapted to follow the Test Specification Guideline (gd_guidl__verification_specification) and Linking Requirements to Tests (gd_req__link_tests).
Test execution and result analysis#
The execution of the tests is fully automated by the build pipelines. The analysis of the test results needs to be performed by the contributor.
Test selection and regression testing#
All existing test cases should be executed within the continuous integration pipelines to verify initially developed components or software changes. A specific selection of subsets is not planned. Since all existing automated tests are executed continuously, regressions are identified by this approach.
Work products and traceability#
The traceability between verification-relevant work products is one of the defined objectives. An overall overview of the different work products and their relationships is given in the project context - see Workproducts.
The work products related to verification can be found in Verification Work Products.
The link between a test specification and the respective requirement or design specification is given by the identifier of the reference annotated to the verification specification.
Environments and resources#
Roles#
In general, the different roles of this project are defined within the Process documentation: Roles. The following roles are crucial to comply with the aspects defined in this document:
The Contributor (rl__contributor) needs to make sure that the objectives of the software integration and verification are fulfilled when contributing to the project.
The Committer (rl__committer) needs to verify that the contributor has fulfilled the expected objectives.
In this way, the roles are applied as defined in Roles.
Tools#
The tools mentioned here do not represent the full list of tools used for the whole project; only tools that have an important impact on test execution and reporting are given. A full list of tools (and their versions) is maintained by Tool Management. The aim of the list given here is to provide a better picture of the software test strategy and the corresponding processes.
Bazel
The main build environment of the project is based on Bazel. It is used to build software components, documentation, and automated tests.
GoogleTest
The software components of the project written in C++ are tested with the help of GoogleTest.
Integration Testing Framework (ITF)
The integration of software components can be verified with the help of the ITF. It allows the definition and execution of tests based on pytest.
Rust
The platform developed in this project supports Rust as a programming language. Its built-in test framework is used to test respective software components.
Verification setups and variants#
Different test frameworks are used to verify software components and their integration into the platform (see the Tools section above). Based on this, the following test setups can be derived:
GoogleTest
Rust
ITF
All defined setups are used to run automated tests within continuous integration pipelines.
Test execution environment and reference hardware#
The platform consists solely of features that are considered "middleware", i.e. the layer above the hardware abstraction layer. The platform itself does not need to run on specific hardware. It integrates with a POSIX operating system, which is the first level of abstraction to the physical hardware.
The simulation environment will be based on the x86 and arm64 architectures, to be close to the later target hardware.
The integration of the platform on a target device and the respective verification and validation should be considered by the distributor of the platform. On-target integration tests that run on reference hardware in the context of this project can be taken as a starting point.
The reference hardware is not yet decided.
Reference hardware interaction with infrastructure#
Once the reference hardware is decided, this section will provide information about the location of the reference hardware, how it interacts with the CI system, and how access rights are handled. This includes physical maintenance as well as virtual access.