Test #2054
test.yml
on: schedule
unit-test
1h 36m
docker-build
32s
Annotations
2 errors
unit-test:
tests/unit_tests/test_matchmaker_algorithm_stable_marriage.py#L130
test_matching_graph_symmetric[build_fast]
hypothesis.errors.Flaky: Hypothesis test_matching_graph_symmetric(request=<FixtureRequest for <Function test_matching_graph_symmetric[build_fast]>>, caplog_context=make_caplog_context, build_func=build_fast, searches=[Search(['p0'], -1.5),
Search(['p0'], -0.003),
Search(['p0'], -0.003),
Search(['p0'], -1384.2469706299032),
Search(['p0'], -0.003, has_newbie),
Search(['p0'], -1384.2469706299032)]) produces unreliable results: Falsified on the first call but did not on a subsequent one
Falsifying example: test_matching_graph_symmetric(
request=<FixtureRequest for <Function test_matching_graph_symmetric[build_fast]>>,
caplog_context=make_caplog_context,
build_func=build_fast,
searches=[Search(['p0'], -1.5),
Search(['p0'], -0.003),
Search(['p0'], -0.003),
Search(['p0'], -1384.2469706299032),
Search(['p0'], -0.003, has_newbie),
Search(['p0'], -1384.2469706299032)],
)
Unreliable test timings! On an initial run, this test took 441.04ms, which exceeded the deadline of 300.00ms, but on a subsequent run it took 2.33 ms, which did not. If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None.
You can reproduce this example by temporarily adding @reproduce_failure('6.92.2', b'AXicY2BjZGKAAVZ2ZoYDULYBYwUjsyEzMxOrBiPbAQZGFVbG4xcgUhysDAcm/4z4slslQZTZmZGlmJF5KjMDC3YVfkxXGdkOM7LdYmKxZmAVYGdkOOkAUcTIwOD3TT7kxtadAgwNPTcZz+wTC3NkmM/IdJ+R4TIj0xcGFhtVZsbjd+ekohlZBdTLFsbEeodE8xgAzHc3aA==') as a decorator on your test case
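The fix Hypothesis suggests above can be sketched as follows. This is a minimal illustration only, not the repository's actual test: the test name and the integer strategy are placeholders, and the point is simply where `deadline=None` goes.

```python
from hypothesis import given, settings, strategies as st

# deadline=None disables Hypothesis's per-example time limit, so a slow
# first execution (e.g. cold imports or warm-up) no longer triggers a
# Flaky "unreliable test timings" error when reruns are fast.
@settings(deadline=None)
@given(st.integers())
def test_roundtrip(x):
    assert x == x

test_roundtrip()  # Hypothesis runs the decorated test over generated examples
```

If disabling the deadline entirely is too permissive, `settings` also accepts a larger explicit deadline (e.g. `deadline=timedelta(seconds=1)`), which keeps the timing check but with more headroom.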
unit-test
Process completed with exit code 1.