
Propagate failures in pandas integration tests and Skip failing tests #17521

Merged
merged 28 commits into from
Dec 13, 2024

Conversation


@Matt711 Matt711 commented Dec 4, 2024

Description

This PR ensures that the integration tests fail if any one of the test modules fails. It also skips or xfails any tests that are not currently passing. Finally, it fixes one incorrect use of rng.random.
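For context, the rng.random fix mentioned above concerns NumPy's Generator API. A hedged sketch of the usage involved (the exact call site fixed in this PR is not reproduced here; the seed and shape are made up for illustration):

```python
# Illustrative only: the specific rng.random call this PR corrects is not
# shown here; the seed and shape below are invented for the example.
import numpy as np

rng = np.random.default_rng(seed=42)

# The legacy RandomState API took separate dimension arguments (rand(3, 4));
# the Generator API's random() instead takes a single shape tuple.
values = rng.random((3, 4))
print(values.shape)  # (3, 4)
```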

Some of the changes were originally made in #17489
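The failure-propagation part of this PR follows a common CI pattern: run every test module, record any failure, and exit non-zero at the end. A generic shell sketch (run_module and the placeholder commands are invented here, not taken from the PR's actual runner script):

```shell
# Generic sketch of propagating per-module failures to the overall CI job;
# run_module and the placeholder commands are not from this PR.
EXITCODE=0
run_module() {
    "$@" || EXITCODE=1   # record the failure but keep running later modules
}

run_module true    # stands in for a passing test module
run_module false   # stands in for a failing test module

echo "final exit code: $EXITCODE"
```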

Checklist

  • I am familiar with the Contributing Guidelines.
  • New or existing tests cover these changes.
  • The documentation is up to date with these changes.

@Matt711 Matt711 requested review from a team as code owners December 4, 2024 23:22
@github-actions github-actions bot added Python Affects Python cuDF API. cudf.pandas Issues specific to cudf.pandas labels Dec 4, 2024
@Matt711 Matt711 added non-breaking Non-breaking change bug Something isn't working labels Dec 4, 2024

Matt711 commented Dec 5, 2024

For the reviewer: I'll follow up on this PR to fix the failing tests. xref #17490


@vyasr vyasr left a comment


Is the plan to merge this after we add xfails? Sorry I think I've lost track of the various discussions we've had, I know I've asked this a couple of times already.


Matt711 commented Dec 10, 2024

Is the plan to merge this after we add xfails? Sorry I think I've lost track of the various discussions we've had, I know I've asked this a couple of times already.

Yes, that is still the plan. A few of the tests that are failing with "AttributeError: 'ndarray' object has no attribute '_fsproxy_wrapped'" are giving me some odd problems: they fail when run normally but XPASS when I add @pytest.mark.xfail. I'm going to look at these failures more closely tomorrow. My apologies, I meant to update you earlier today.
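The XPASS behavior described here is standard pytest semantics: a non-strict xfail that unexpectedly passes is reported as XPASS rather than as a failure, while strict=True turns an unexpected pass into a hard failure. A hedged sketch (the test names are made up, not from the cudf integration suite):

```python
# Illustrative tests only; the real failing tests live in the cudf.pandas
# third-party integration suite.
import pytest

@pytest.mark.xfail(reason="'ndarray' object has no attribute '_fsproxy_wrapped'")
def test_known_failure():
    # If this unexpectedly passes, pytest reports XPASS and the suite still
    # succeeds -- which can mask tests that fail only intermittently.
    raise AttributeError("'ndarray' object has no attribute '_fsproxy_wrapped'")

@pytest.mark.xfail(strict=True, reason="an unexpected pass should fail the suite")
def test_strict_known_failure():
    # With strict=True, an XPASS is reported as a failure instead.
    raise AttributeError("'ndarray' object has no attribute '_fsproxy_wrapped'")
```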


@Matt711 Matt711 left a comment


Tests are finally passing, but there's a lot more work to do here in #17490. CC. @vyasr

@@ -1,129 +0,0 @@
# Copyright (c) 2024, NVIDIA CORPORATION.

I'm getting strange errors in the catboost test suite. Pytest fails to collect the tests with this error.

ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

From a cursory investigation, it may be the installed version of NumPy that's causing problems. I tried skipping the tests, but the suite still fails. I'm deleting the tests for now and will make it a priority to add them back in #17490.


This is probably due to ABI incompatibility between the version of NumPy that catboost was compiled against and the version of NumPy installed in the test CI job.
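One likely reason a plain skip marker doesn't help here: the "numpy.dtype size changed" error is raised as a ValueError while the module is being imported, so collection fails before any test marker can apply. A hedged sketch of a guard that could sidestep the collection error (not the approach this PR takes, which deletes the tests instead):

```python
# Hypothetical helper; this PR instead removes the catboost tests entirely.
def import_or_none(name):
    """Import a module, returning None on ImportError *or* ValueError.

    ABI-mismatch errors such as "numpy.dtype size changed, may indicate
    binary incompatibility" surface as ValueError at import time, so a guard
    that only catches ImportError misses them.
    """
    try:
        return __import__(name)
    except (ImportError, ValueError):
        return None

# A conftest.py could append the test file to collect_ignore when this
# returns None, so pytest never tries to collect the failing module.
```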

@@ -40,6 +40,7 @@ jobs:
- pandas-tests
- pandas-tests-diff
- telemetry-setup
- third-party-integration-tests-cudf-pandas

I'll delete these changes to pr.yaml once this PR is approved. I only added them to run the CI job in this PR.

@Matt711 Matt711 requested a review from vyasr December 12, 2024 15:18
@Matt711 Matt711 changed the title Propagate failures in pandas integration tests Propagate failures in pandas integration tests and Skip failing tests Dec 12, 2024

vyasr commented Dec 12, 2024

Why delete catboost? It looks like the failure is a runtime issue and not like a seg fault, so can't you just mark the test as a skip?


vyasr commented Dec 12, 2024

Discussed offline: this test still makes the suite fail even when it is skipped, so we'll drop it for now and add it back in a follow-up.

.github/workflows/pr.yaml (outdated review threads, resolved)

Matt711 commented Dec 12, 2024

/merge

1 similar comment

Matt711 commented Dec 13, 2024

/merge

@rapids-bot rapids-bot bot merged commit 5baaf6d into rapidsai:branch-25.02 Dec 13, 2024
104 of 105 checks passed
Labels
bug Something isn't working cudf.pandas Issues specific to cudf.pandas non-breaking Non-breaking change Python Affects Python cuDF API.
Projects
Status: Done

2 participants