
Mark more tests as flaky #3961

Merged 1 commit into JanusGraph:master from mark-flaky-tests on Sep 14, 2023

Conversation

@FlorianHockmann (Member) commented Sep 7, 2023

This adds the necessary annotation to repeat flaky tests, if necessary, for the tests described in the issues listed below.

I went through the recent builds on `master` and checked all failed builds. I think this should handle all flaky tests that occurred in these builds, apart from the TinkerPop test where I just don't know how to deal with it (#3841), but we can of course address that in another PR.
However, we also sometimes get failed jobs because of intermittent issues where a backend cannot be properly set up or torn down for some reason, or downloading an artifact simply fails. I don't know what we can do to improve this, but I would consider it out of scope for this PR, which is only meant to deal with jobs failing due to specific tests being flaky.

Hopefully, this will already lead to significantly fewer cases where we have to manually restart failed jobs (which I had to do a lot lately for all these Dependabot PRs).

@porunov (Member) left a comment

The changes look good to me. However, I'm not sure why we need minSuccess = 2, which means the test will be executed at least 2 times. I'm OK with it, but it seems to me that minSuccess = 1 would be good as well.
Apart from that, it would be great if you could add a comment to each affected test with a link to the GitHub issue it is meant to close. That way it will later be easier to trace why a specific test is marked as flaky.

@FlorianHockmann (Member, Author)

> However, I'm not sure why we need minSuccess = 2, which means the test will be executed at least 2 times. I'm OK with it, but it seems to me that minSuccess = 1 would be good as well.

I simply took that value from our docs on flaky tests.
I guess the reasoning was that a flaky test should pass in at least 50% of the cases, but I see your point: if the test is really flaky, that might not hold, and it doesn't help us if we then let the build fail again. It would only slow down our development in general.
I'll change this config and also update the docs accordingly.

> Apart from that, it would be great if you could add a comment to each affected test with a link to the GitHub issue it is meant to close. That way it will later be easier to trace why a specific test is marked as flaky.

Good idea! I'll add that. Just note that our current policy is to leave flaky-test issues open until the test is improved so that it is no longer flaky.
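The change discussed here would presumably look something like the following sketch, assuming the `@RepeatedIfExceptionsTest` annotation from the rerunner-jupiter library (which provides `repeats` and `minSuccess` parameters). The class name, test body, and repeat count are illustrative; #3959 is one of the issues listed in this PR:

```java
// Hypothetical sketch of marking a flaky test with rerunner-jupiter.
// The test class, method, and repeat count are illustrative.
import io.github.artsok.RepeatedIfExceptionsTest;

public class ExampleFlakyTest {

    // Flaky test, see https://github.com/JanusGraph/janusgraph/issues/3959
    // repeats = 3: rerun up to 3 times if the test throws; leaving
    // minSuccess at its default of 1 means a single passing run suffices,
    // matching the change agreed on in this conversation.
    @RepeatedIfExceptionsTest(repeats = 3)
    public void exampleFlakyOperation() {
        // actual test logic
    }
}
```

With an explicit `minSuccess = 2`, the test would instead have to pass at least twice, which is the behavior being dropped here.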

This adds the necessary annotation to repeat flaky tests if necessary
for tests described in these issues:
* JanusGraph#1457
* JanusGraph#1459
* JanusGraph#1462
* JanusGraph#1464
* JanusGraph#1465
* JanusGraph#1497
* JanusGraph#1498
* JanusGraph#2272
* JanusGraph#3142
* JanusGraph#3356
* JanusGraph#3392
* JanusGraph#3393
* JanusGraph#3651
* JanusGraph#3931
* JanusGraph#3959
* JanusGraph#3960

I went through the recent builds on `master` and checked all failed
builds. I think this should handle all flaky tests that occurred in
these builds, apart from the TinkerPop test where I just don't know how
to deal with it (JanusGraph#3841), but we can of course address that in
another PR.
However, we also sometimes get failed jobs because of intermittent
issues where a backend cannot be properly set up or torn down for some
reason, or downloading an artifact simply fails. I don't know what we
can do to improve this, but I would consider it out of scope for this
PR, which is only meant to deal with jobs failing due to specific tests
being flaky.

Hopefully, this will already lead to significantly fewer cases where we
have to manually restart failed jobs (which I had to do a lot lately for
all these Dependabot PRs).

I also added a link to the relevant issue to all flaky tests.

Signed-off-by: Florian Hockmann <[email protected]>
@FlorianHockmann (Member, Author)

Done, I also added links to the issues for all flaky tests that already had the annotation.

@porunov (Member) left a comment

Perfect, thank you @FlorianHockmann !

@FlorianHockmann FlorianHockmann merged commit 2ba933c into JanusGraph:master Sep 14, 2023
170 checks passed
@FlorianHockmann FlorianHockmann deleted the mark-flaky-tests branch September 14, 2023 06:50
@janusgraph-automations

💔 All backports failed

Branch v0.6: backport failed because of merge conflicts

Manual backport

To create the backport manually run:

`backport --pr 3961`

Questions?

Please refer to the Backport tool documentation and see the GitHub Actions logs for details.

FlorianHockmann added a commit that referenced this pull request Sep 14, 2023
Follow-up to #3961 where I forgot to remove this argument from these two
tests because they use `ParameterizedRepeatedIfExceptionsTest`
instead of `RepeatedIfExceptionsTest`.

Signed-off-by: Florian Hockmann <[email protected]>
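For context, a hedged sketch of why the two annotations differ, assuming the rerunner-jupiter library's API (test names, repeat counts, and parameter values here are illustrative): `RepeatedIfExceptionsTest` covers plain JUnit 5 tests, while parameterized tests need `ParameterizedRepeatedIfExceptionsTest` combined with a JUnit 5 parameter source:

```java
// Hypothetical sketch contrasting the two rerunner-jupiter annotations;
// names and values are illustrative, not taken from the actual PR.
import io.github.artsok.ParameterizedRepeatedIfExceptionsTest;
import io.github.artsok.RepeatedIfExceptionsTest;
import org.junit.jupiter.params.provider.ValueSource;

public class ExampleRetryTests {

    // Plain flaky test: rerun up to 3 times on failure.
    @RepeatedIfExceptionsTest(repeats = 3)
    public void plainFlakyTest() {
        // test logic
    }

    // Parameterized flaky test: combines a parameter source with the
    // same retry-on-exception behavior, via the dedicated annotation.
    @ParameterizedRepeatedIfExceptionsTest(repeats = 3)
    @ValueSource(ints = {1, 2, 3})
    public void parameterizedFlakyTest(int value) {
        // test logic using value
    }
}
```

Mixing arguments intended for one annotation into the other is the kind of leftover this follow-up commit cleans up.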
@FlorianHockmann (Member, Author)

💚 All backports created successfully

Branch v0.6: backport created successfully

Note: Successful backport PRs will be merged automatically after passing CI.

Questions?

Please refer to the Backport tool documentation.

4 participants