Component Requirements Statistics

Overview

[Pie chart: _images/need_pie_2b909.svg]

In Detail

[Pie charts: _images/need_pie_cce4e.svg, _images/need_pie_8587e.svg, _images/need_pie_01abf.svg]

Failed Tests

Hint: This table should be empty. Before a PR can be merged, all tests must pass.

FAILED TESTS

| testcase   | Result | Fully Verifies | Partially Verifies | Test Type      | Derivation Technique | link |
|------------|--------|----------------|--------------------|----------------|----------------------|------|
| test_smoke | failed |                |                    | interface-test | explorative-testing  |      |

Skipped / Disabled Tests

No needs passed the filters (no tests are currently skipped or disabled).

All Passed Tests

SUCCESSFUL TESTS

| testcase | Result | Fully Verifies | Partially Verifies | Test Type | Derivation Technique | link |
|----------|--------|----------------|--------------------|-----------|----------------------|------|
| HealthMonitorTest__TestName | passed | | | interface-test | explorative-testing | |
| IdentifierHashTest__IdentifierHash_default_created | passed | | | interface-test | explorative-testing | |
| IdentifierHashTest__IdentifierHash_invalid_hash_no_string_representation | passed | | | interface-test | explorative-testing | |
| IdentifierHashTest__IdentifierHash_no_dangling_pointer_after_source_string_dies | passed | | | interface-test | explorative-testing | |
| IdentifierHashTest__IdentifierHash_with_string_created | passed | | | interface-test | explorative-testing | |
| IdentifierHashTest__IdentifierHash_with_string_view_created | passed | | | interface-test | explorative-testing | |
| ProcessStateClient_UT__ProcessStateClient_ConstructReceiver_Succeeds | passed | | | interface-test | explorative-testing | |
| ProcessStateClient_UT__ProcessStateClient_QueueMaxNumberOfProcesses_Succeeds | passed | | | interface-test | explorative-testing | |
| ProcessStateClient_UT__ProcessStateClient_QueueOneProcess_Succeeds | passed | | | interface-test | explorative-testing | |
| ProcessStateClient_UT__ProcessStateClient_QueueOneProcessTooMany_Fails | passed | | | interface-test | explorative-testing | |
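Every row carries Test Type and Derivation Technique metadata. As a hedged sketch only (the project's real `add_test_properties` helper, visible in the smoke-test log, may work differently), such a decorator can simply attach the report columns as attributes on the test function, where a report generator could later collect them:

```python
# Hypothetical sketch of a metadata decorator in the spirit of the
# add_test_properties call seen in the smoke-test log. It records the
# report columns (verified requirements, test type, derivation technique)
# on the test function itself; the actual project helper may differ.
def add_test_properties(*, fully_verifies=(), partially_verifies=(),
                        test_type, derivation_technique):
    def decorator(func):
        func.test_properties = {
            "fully_verifies": list(fully_verifies),
            "partially_verifies": list(partially_verifies),
            "test_type": test_type,
            "derivation_technique": derivation_technique,
        }
        return func
    return decorator

@add_test_properties(test_type="interface-test",
                     derivation_technique="explorative-testing")
def test_smoke():
    pass

print(test_smoke.test_properties["test_type"])  # interface-test
```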

Details About Testcases

[Pie charts: _images/need_pie_d5f3b.svg, _images/need_pie_34de7.svg]

Test Log Files

tests-report/tests/integration/smoke/smoke/test.log

exec ${PAGER:-/usr/bin/less} "$0" || exit 1
Executing tests from //tests/integration/smoke:smoke
-----------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.12.12, pytest-9.0.1, pluggy-1.6.0
rootdir: /home/runner/.bazel/sandbox/processwrapper-sandbox/513/execroot/_main/bazel-out/k8-fastbuild/bin/tests/integration/smoke/smoke.runfiles/score_tooling+/python_basics/score_pytest
configfile: pytest.ini
collected 1 item

../score_tooling+/python_basics/score_pytest::test_smoke FAILED          [100%]

=================================== FAILURES ===================================
__________________________________ test_smoke __________________________________

    @add_test_properties(
        partially_verifies=[],
        test_type="interface-test",
        derivation_technique="explorative-testing",
    )
    def test_smoke():
        """Smoke test for the launch manager daemon."""
        code, stdout, stderr = get_common_interface().run_until_file_deployed(
            "src/launch_manager_daemon/launch_manager"
        )
    
        print(format_logs(code, stdout, stderr))
    
>       check_for_failures(Path("tests/integration/smoke"), 2)

tests/integration/smoke/smoke.py:35: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

path = PosixPath('tests/integration/smoke'), expected_count = 2

    def check_for_failures(path: Path, expected_count: int):
        """Check expected_count xml files for failures, raising an exception if
        a failure is found or a different number of xml files are found.
        """
        failing_files = []
        checked_files = []
        for file in path.iterdir():
            if file.suffix == ".xml":
                gtest_xml = open(file).read()
                query = 'failures="'
                failure_number = gtest_xml[gtest_xml.find(query) + len(query)]
                if failure_number != "0":
                    failing_files.append(file.name)
                checked_files.append(file.name)
                shutil.copy(file, get_bazel_out_dir())
        if len(failing_files) > 0:
            raise RuntimeError(
                f"Failures found in the following files:\n {'\n'.join(failing_files)}"
            )
        if len(checked_files) != expected_count:
>           raise RuntimeError(
                f"Expected to find {expected_count} xml files, instead found {len(checked_files)}:\n{'\n'.join(checked_files)}"
            )
E           RuntimeError: Expected to find 2 xml files, instead found 0:

tests/integration/testing_utils.py:181: RuntimeError
----------------------------- Captured stdout call -----------------------------
stdout:


stderr:
/usr/bin/fakeroot: 175: /usr/bin/fakechroot: not found


Exit status = 127
- generated xml file: /home/runner/.bazel/sandbox/processwrapper-sandbox/513/execroot/_main/bazel-out/k8-fastbuild/testlogs/tests/integration/smoke/smoke/test.xml -
=========================== short test summary info ============================
FAILED ../score_tooling+/python_basics/score_pytest::test_smoke - RuntimeErro...
============================== 1 failed in 0.13s ===============================
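The traceback above shows `check_for_failures` locating the `failures="..."` attribute by substring search, which reads only the first character of the count (so `failures="12"` would be read as `"1"`). As a hedged sketch, not the project's code, the same check can be done with a real XML parser, assuming gtest-style reports that carry a `failures` attribute on the root element:

```python
# Sketch: detect failing gtest XML reports with an XML parser instead of
# a substring search. The directory layout and file names used in the
# demo below are illustrative, not the project's actual ones.
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

def failing_reports(path: Path) -> list[str]:
    """Return the names of XML report files whose failure count is nonzero."""
    failing = []
    for file in sorted(path.glob("*.xml")):
        root = ET.parse(file).getroot()
        # gtest writes failures="N" on the <testsuites> root element.
        if int(root.get("failures", "0")) > 0:
            failing.append(file.name)
    return failing

# Tiny demo with fabricated report files.
demo = Path(tempfile.mkdtemp())
(demo / "failing.xml").write_text('<testsuites tests="5" failures="12"/>')
(demo / "passing.xml").write_text('<testsuites tests="5" failures="0"/>')
print(failing_reports(demo))  # ['failing.xml']
```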

tests-report/tests/ut/identifier_hash_UT/identifier_hash_UT/test.log

Executing tests from //tests/ut/identifier_hash_UT:identifier_hash_UT
-----------------------------------------------------------------------------
Running main() from gmock_main.cc
[==========] Running 5 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 5 tests from IdentifierHashTest
[ RUN      ] IdentifierHashTest.IdentifierHash_with_string_view_created
[       OK ] IdentifierHashTest.IdentifierHash_with_string_view_created (0 ms)
[ RUN      ] IdentifierHashTest.IdentifierHash_with_string_created
[       OK ] IdentifierHashTest.IdentifierHash_with_string_created (0 ms)
[ RUN      ] IdentifierHashTest.IdentifierHash_default_created
[       OK ] IdentifierHashTest.IdentifierHash_default_created (0 ms)
[ RUN      ] IdentifierHashTest.IdentifierHash_invalid_hash_no_string_representation
[       OK ] IdentifierHashTest.IdentifierHash_invalid_hash_no_string_representation (0 ms)
[ RUN      ] IdentifierHashTest.IdentifierHash_no_dangling_pointer_after_source_string_dies
[       OK ] IdentifierHashTest.IdentifierHash_no_dangling_pointer_after_source_string_dies (0 ms)
[----------] 5 tests from IdentifierHashTest (0 ms total)

[----------] Global test environment tear-down
[==========] 5 tests from 1 test suite ran. (0 ms total)
[  PASSED  ] 5 tests.

tests-report/src/health_monitoring_lib/tests/test.log

Executing tests from //src/health_monitoring_lib:tests
-----------------------------------------------------------------------------

running 82 tests
test deadline::common::tests::new_and_fields ... ok
test deadline::common::tests::acquire_and_release_deadline ... ok
test deadline::deadline_monitor::tests::get_deadline_unknown_tag ... ok
test deadline::deadline_monitor::tests::deadline_outside_time_range_is_error_when_dropped_after_evaluate ... ok
test deadline::common::tests::concurrent_acquire ... ok
test deadline::deadline_monitor::tests::start_stop_deadline_outside_ranges_is_error_when_dropped_before_evaluate ... ok
test deadline::deadline_monitor::tests::start_stop_deadline_outside_ranges_is_evaluated_as_error ... ok
test deadline::deadline_state::tests::as_u64_and_new ... ok
test deadline::deadline_state::tests::deadline_state_default_and_snapshot ... ok
test deadline::deadline_state::tests::deadline_state_update_none_returns_err ... ok
test deadline::deadline_state::tests::deadline_state_update_success ... ok
test deadline::deadline_state::tests::default_state ... ok
test deadline::deadline_state::tests::set_and_get_timestamp_ms ... ok
test deadline::deadline_monitor::tests::deadline_failed_on_first_run_and_then_restarted_is_evaluated_as_error ... ok
test deadline::deadline_state::tests::set_running ... ok
test deadline::deadline_state::tests::set_underrun ... ok
test deadline::ffi::tests::deadline_destroy_null_deadline ... ok
test deadline::ffi::tests::deadline_monitor_builder_add_deadline_invalid_range ... ok
test deadline::ffi::tests::deadline_monitor_builder_add_deadline_null_deadline_tag ... ok
test deadline::ffi::tests::deadline_monitor_builder_add_deadline_null_builder ... ok
test deadline::ffi::tests::deadline_monitor_builder_add_deadline_succeeds ... ok
test deadline::ffi::tests::deadline_monitor_builder_create_null_builder ... ok
test deadline::ffi::tests::deadline_monitor_builder_create_succeeds ... ok
test deadline::ffi::tests::deadline_monitor_destroy_null_monitor ... ok
test deadline::ffi::tests::deadline_monitor_get_deadline_null_deadline_handle ... ok
test deadline::ffi::tests::deadline_monitor_builder_destroy_null_builder ... ok
test deadline::ffi::tests::deadline_monitor_get_deadline_null_monitor ... ok
test deadline::ffi::tests::deadline_monitor_get_deadline_succeeds ... ok
test deadline::ffi::tests::deadline_monitor_get_deadline_unknown_deadline ... ok
test deadline::ffi::tests::deadline_start_already_started ... ok
test deadline::ffi::tests::deadline_start_null_deadline ... ok
test deadline::ffi::tests::deadline_start_succeeds ... ok
test deadline::ffi::tests::deadline_monitor_get_deadline_null_deadline_tag ... ok
test deadline::ffi::tests::deadline_stop_succeeds ... ok
test ffi::tests::health_monitor_builder_add_deadline_monitor_null_deadline_monitor_builder ... ok
test ffi::tests::health_monitor_builder_add_deadline_monitor_null_hmon_builder ... ok
test ffi::tests::health_monitor_builder_add_deadline_monitor_null_monitor_tag ... ok
test ffi::tests::health_monitor_builder_add_deadline_monitor_succeeds ... ok
test ffi::tests::health_monitor_builder_build_invalid_cycle_intervals ... ok
test ffi::tests::health_monitor_builder_build_null_builder_handle ... ok
test ffi::tests::health_monitor_builder_build_null_monitor_handle ... ok
test ffi::tests::health_monitor_builder_build_succeeds ... ok
test ffi::tests::health_monitor_builder_create_null_handle ... ok
test ffi::tests::health_monitor_builder_create_succeeds ... ok
test ffi::tests::health_monitor_builder_destroy_null_handle ... ok
test ffi::tests::health_monitor_destroy_null_hmon ... ok
test ffi::tests::health_monitor_get_deadline_monitor_already_taken ... ok
test ffi::tests::health_monitor_get_deadline_monitor_null_deadline_monitor ... ok
test ffi::tests::health_monitor_get_deadline_monitor_null_hmon ... ok
test ffi::tests::health_monitor_get_deadline_monitor_null_monitor_tag ... ok
test ffi::tests::health_monitor_get_deadline_monitor_succeeds ... ok
test deadline::ffi::tests::deadline_stop_null_deadline ... ok
test ffi::tests::health_monitor_start_monitor_not_taken ... ok
test ffi::tests::health_monitor_start_null_hmon ... ok
test ffi::tests::health_monitor_start_no_monitors ... ok
test tag::tests::deadline_tag_debug ... ok
test tag::tests::deadline_tag_from_str ... ok
test tag::tests::deadline_tag_from_string ... ok
test tag::tests::deadline_tag_score_debug ... ok
test tag::tests::monitor_tag_debug ... ok
test tag::tests::monitor_tag_from_str ... ok
test tag::tests::monitor_tag_from_string ... ok
test tag::tests::monitor_tag_score_debug ... ok
test tag::tests::tag_debug ... ok
test tag::tests::tag_hash ... ok
test tag::tests::tag_partial_eq_is_eq ... ok
test tag::tests::tag_partial_eq_is_ne ... ok
test tag::tests::tag_score_debug ... ok
test tag::tests::test_from_str ... ok
test tag::tests::test_from_string ... ok
test tests::hm_get_deadline_monitor_works ... ok
test tests::hm_with_monitors_shall_not_start_with_not_taken_monitors - should panic ... ok
test ffi::tests::health_monitor_start_succeeds ... ok
test tests::hm_with_no_monitors_shall_panic_on_start - should panic ... ok
test tests::hm_with_taken_monitors_starts ... ok
test tests::hm_with_wrong_cycle_fails_to_build - should panic ... ok
test worker::tests::monitoring_logic_report_alive_on_each_call_when_no_error ... ok
test worker::tests::monitoring_logic_report_error_when_deadline_failed ... ok
test deadline::deadline_monitor::tests::monitor_with_multiple_running_deadlines ... ok
test worker::tests::unique_thread_runner_monitoring_works ... ok
test worker::tests::monitoring_logic_report_alive_respect_cycle ... ok
test deadline::deadline_monitor::tests::start_stop_deadline_within_range_works ... ok

test result: ok. 82 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.00s

tests-report/src/health_monitoring_lib/cpp_tests/test.log

Executing tests from //src/health_monitoring_lib:cpp_tests
-----------------------------------------------------------------------------
Running main() from gmock_main.cc
[==========] Running 1 test from 1 test suite.
[----------] Global test environment set-up.
[----------] 1 test from HealthMonitorTest
[ RUN      ] HealthMonitorTest.TestName
[2026/02/24 10:11:52.4455061][src/health_monitoring_lib/rust/deadline/deadline_monitor.rs:243][7318][HMON][ERROR] Deadline DeadlineTag(deadline_1) stopped too early by 100 ms
[2026/02/24 10:11:52.4456174][src/health_monitoring_lib/rust/worker.rs:98][7318][HMON][INFO] Monitoring thread started.
[2026/02/24 10:11:52.4456295][src/health_monitoring_lib/rust/worker.rs:115][7318][HMON][INFO] Monitoring thread exiting.
[       OK ] HealthMonitorTest.TestName (0 ms)
[----------] 1 test from HealthMonitorTest (0 ms total)

[----------] Global test environment tear-down
[==========] 1 test from 1 test suite ran. (0 ms total)
[  PASSED  ] 1 test.

tests-report/src/launch_manager_daemon/process_state_client_lib/processstateclient_UT/test.log

Executing tests from //src/launch_manager_daemon/process_state_client_lib:processstateclient_UT
-----------------------------------------------------------------------------
Running main() from gmock_main.cc
[==========] Running 4 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 4 tests from ProcessStateClient_UT
[ RUN      ] ProcessStateClient_UT.ProcessStateClient_ConstructReceiver_Succeeds
[       OK ] ProcessStateClient_UT.ProcessStateClient_ConstructReceiver_Succeeds (0 ms)
[ RUN      ] ProcessStateClient_UT.ProcessStateClient_QueueOneProcess_Succeeds
[       OK ] ProcessStateClient_UT.ProcessStateClient_QueueOneProcess_Succeeds (0 ms)
[ RUN      ] ProcessStateClient_UT.ProcessStateClient_QueueMaxNumberOfProcesses_Succeeds
[       OK ] ProcessStateClient_UT.ProcessStateClient_QueueMaxNumberOfProcesses_Succeeds (22 ms)
[ RUN      ] ProcessStateClient_UT.ProcessStateClient_QueueOneProcessTooMany_Fails
  !!! ->   2026/2/24 10:10:30 LCLM LCLM ERROR:   [ Failed to queue posix process ]
  !!! ->   2026/2/24 10:10:30 LCLM LCLM ERROR:   [ ProcessStateReceiver::getNextChangedPosixProcess: Overflow occurred, will be reported as kCommunicationError ]
[       OK ] ProcessStateClient_UT.ProcessStateClient_QueueOneProcessTooMany_Fails (3 ms)
[----------] 4 tests from ProcessStateClient_UT (26 ms total)

[----------] Global test environment tear-down
[==========] 4 tests from 1 test suite ran. (26 ms total)
[  PASSED  ] 4 tests.