Commit

Merge branch 'master' into installplan_approval
asanzgom authored Oct 28, 2024
2 parents 7640988 + 129ca46 commit 955a3a3
Showing 8 changed files with 457 additions and 316 deletions.
16 changes: 16 additions & 0 deletions ods_ci/libs/DataSciencePipelinesKfp.py
@@ -304,6 +304,22 @@ def check_run_status(self, run_id, timeout=160):
            count += 1
        return run_status  # pyright: ignore [reportPossiblyUnboundVariable]
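Only the tail of `check_run_status` appears in this hunk; a poller of this shape typically loops until the run reaches a terminal state or a poll-count timeout elapses. A minimal standalone sketch under that assumption (the function, the state set, and the callback protocol here are illustrative, not the actual ods-ci implementation):

```python
import time

# Illustrative subset of terminal KFP run states.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED", "ERROR"}


def poll_run_status(get_status, timeout: int = 160, interval: float = 1.0) -> str:
    """Poll get_status() until it returns a terminal state or `timeout` polls pass.

    get_status is any zero-argument callable returning the current status string;
    the last observed status is returned either way, so callers can report
    a run that was still PENDING or RUNNING when the timeout expired.
    """
    count = 0
    run_status = get_status()
    while run_status not in TERMINAL_STATES and count < timeout:
        time.sleep(interval)
        run_status = get_status()
        count += 1
    return run_status


# Simulate a run that succeeds on the third poll.
statuses = iter(["PENDING", "RUNNING", "SUCCEEDED"])
print(poll_run_status(lambda: next(statuses), interval=0))  # SUCCEEDED
```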

    @keyword
    def get_last_run_by_pipeline_name(self, pipeline_name: str | None = None, namespace: str | None = None):
        """Gets the run_id of the most recent run created for pipeline_name.

        :param pipeline_name: name of the pipeline whose runs are listed
        :param namespace: namespace of the pipeline server to query
        :return: run_id of the latest run, or None if the pipeline has no runs
        """
        pipeline_id = self.client.get_pipeline_id(pipeline_name)
        pipeline_version_id = self.get_last_pipeline_version(pipeline_id)
        all_runs = self.get_all_runs(namespace=namespace, pipeline_version_id=pipeline_version_id)
        if len(all_runs) > 0:
            return all_runs[-1].run_id
        return None

    @keyword
    def delete_pipeline(self, pipeline_id):
        """Deletes a pipeline"""
@@ -159,7 +159,6 @@ Wait Until Pipeline Server Is Deployed
... Verify DSPv1 Pipeline Server Deployments namespace=${namespace}
END


Wait Until Pipeline Server Is Deleted
[Documentation] Waits until all pipeline server pods are deleted
[Arguments] ${namespace}
@@ -183,7 +182,6 @@ Create Secret With Pipelines Object Storage Information
Run And Verify Command oc create secret generic dashboard-dspa-secret -n ${namespace} --from-literal=AWS_ACCESS_KEY_ID=${object_storage_access_key} --from-literal=AWS_SECRET_ACCESS_KEY=${object_storage_secret_key} # robocop: off=line-too-long
Run And Verify Command oc label secret dashboard-dspa-secret -n ${namespace} opendatahub.io/dashboard=true


Import Pipeline And Create Run
[Documentation]
[Arguments] ${namespace} ${username} ${password}
@@ -211,6 +209,15 @@ Import Pipeline And Create Run

RETURN ${pipeline_id} ${pipeline_version_id} ${pipeline_run_id} ${experiment_id}

Get Last Run By Pipeline Name
    [Documentation]    Returns ${pipeline_run_id} of the last run for the last version of ${pipeline_name}
    [Arguments]    ${namespace}    ${username}    ${password}    ${pipeline_name}

    DataSciencePipelinesKfp.Setup Client    user=${username}    pwd=${password}    project=${namespace}
    ${pipeline_run_id}=    DataSciencePipelinesKfp.Get Last Run By Pipeline Name    pipeline_name=${pipeline_name}

    RETURN    ${pipeline_run_id}

Verify Run Status
[Documentation] Verifies pipeline run status matches ${pipeline_run_expected_status}
[Arguments] ${namespace} ${username} ${password}
@@ -5,7 +5,7 @@ Resource ../../../Resources/Page/ODH/ODHDashboard/ODHDataScienceProject/


*** Variables ***
${PROJECT}= dsp-upgrade-testing
${PROJECT}= upg-dsp
${PIPELINE_LONGRUNNING_FILEPATH}= tests/Resources/Files/pipeline-samples/v2/cache-disabled/pip_index_url/take_nap_compiled.yaml # robocop: disable:line-too-long


@@ -24,9 +24,13 @@ Verify Resources After Upgrade
DataSciencePipelinesBackend.Wait Until Pipeline Server Is Deployed namespace=${PROJECT}

${take_nap_run_id}= DataSciencePipelinesBackend.Get Last Run By Pipeline Name
... namespace=${PROJECT} username=${TEST_USER.USERNAME} password=${TEST_USER.PASSWORD}
... pipeline_name=take-nap

Verify Run Status
... namespace=${PROJECT} username=${TEST_USER.USERNAME} password=${TEST_USER.PASSWORD}
... pipeline_run_id=${DSP_LONGRUNNING_PIPELINE_RUN_ID} pipeline_run_expected_status=RUNNING
... pipeline_run_id=${take_nap_run_id} pipeline_run_expected_status=RUNNING

Projects.Delete Project Via CLI By Display Name ${PROJECT}

@@ -60,5 +64,3 @@ Start Long Running Pipeline
... pipeline_package_path=${PIPELINE_LONGRUNNING_FILEPATH}
... pipeline_run_name=take-nap-run
... pipeline_run_params=${pipeline_run_params}

Set Global Variable ${DSP_LONGRUNNING_PIPELINE_RUN_ID} ${pipeline_run_id}
406 changes: 406 additions & 0 deletions ods_ci/tests/Resources/Page/ModelRegistry/ModelRegistry.resource

Large diffs are not rendered by default.

@@ -21,6 +21,7 @@ Resource ../../Resources/CLI/ModelServing/modelmesh.resource
Resource ../../Resources/CLI/DataSciencePipelines/DataSciencePipelinesUpgradeTesting.resource
Resource ../../Resources/Page/DistributedWorkloads/DistributedWorkloads.resource
Resource ../../Resources/Page/DistributedWorkloads/WorkloadMetricsUI.resource
Resource ../../Resources/Page/ModelRegistry/ModelRegistry.resource
Suite Setup Dashboard Suite Setup
Suite Teardown RHOSi Teardown
Test Tags PreUpgrade
@@ -179,6 +180,11 @@ Data Science Pipelines Pre Upgrade Configuration
[Tags] Upgrade DataSciencePipelines-Backend
DataSciencePipelinesUpgradeTesting.Setup Environment For Upgrade Testing

Model Registry Pre Upgrade Set Up
[Documentation] Creates a Model Registry instance and registers a model/version
[Tags] Upgrade ModelRegistryUpgrade
Model Registry Pre Upgrade Scenario


*** Keywords ***
Dashboard Suite Setup
@@ -23,6 +23,7 @@ Resource ../../Resources/Page/DistributedWorkloads/DistributedWorkload
Resource ../../Resources/Page/DistributedWorkloads/WorkloadMetricsUI.resource
Resource ../../Resources/CLI/MustGather/MustGather.resource
Resource ../../Resources/CLI/DataSciencePipelines/DataSciencePipelinesUpgradeTesting.resource
Resource ../../Resources/Page/ModelRegistry/ModelRegistry.resource
Suite Setup Upgrade Suite Setup
Test Tags PostUpgrade

@@ -215,7 +216,6 @@ Verify that the must-gather image provides RHODS logs and info
END
[Teardown] Cleanup must-gather Logs


Verify That DSC And DSCI Release.Name Attribute matches ${expected_release_name}
[Documentation] Tests the release.name attribute from the DSC and DSCI matches the desired value.
... ODH: Open Data Hub
@@ -244,6 +244,12 @@ Data Science Pipelines Post Upgrade Verifications
[Tags] Upgrade DataSciencePipelines-Backend
DataSciencePipelinesUpgradeTesting.Verify Resources After Upgrade

Model Registry Post Upgrade Verification
[Documentation] Verifies that registered model/version in pre-upgrade is present after the upgrade
[Tags] Upgrade ModelRegistryUpgrade
Model Registry Post Upgrade Scenario
[Teardown] Post Upgrade Scenario Teardown


*** Keywords ***
Dashboard Suite Setup
@@ -16,7 +16,7 @@ Suite Teardown RHOSi Teardown


*** Variables ***
${URL_TEST_PIPELINE_RUN_YAML}= https://raw.githubusercontent.com/opendatahub-io/data-science-pipelines-operator/main/tests/resources/test-pipeline-run.yaml # robocop: disable:line-too-long
${URL_TEST_PIPELINE_RUN_YAML}= https://raw.githubusercontent.com/red-hat-data-services/ods-ci/refs/heads/master/ods_ci/tests/Resources/Files/pipeline-samples/v2/cache-disabled/pip_index_url/hello_world_pip_index_url_compiled.yaml # robocop: disable:line-too-long


*** Test Cases ***
@@ -25,12 +25,15 @@ Verify Admin Users Can Create And Run a Data Science Pipeline Using The Api
... the pipeline resources.
[Tags] Sanity ODS-2083
End To End Pipeline Workflow Via Api ${OCP_ADMIN_USER.USERNAME} ${OCP_ADMIN_USER.PASSWORD} pipelinesapi1
[Teardown] Projects.Delete Project Via CLI By Display Name pipelinesapi1


Verify Regular Users Can Create And Run a Data Science Pipeline Using The Api
[Documentation] Creates, runs pipelines with regular user. Double check the pipeline result and clean
... the pipeline resources.
[Tags] Tier1 ODS-2677
End To End Pipeline Workflow Via Api ${TEST_USER.USERNAME} ${TEST_USER.PASSWORD} pipelinesapi2
[Teardown] Projects.Delete Project Via CLI By Display Name pipelinesapi2

Verify Ods Users Can Do Http Request That Must Be Redirected to Https
[Documentation] Verify Ods Users Can Do Http Request That Must Be Redirected to Https
@@ -82,12 +85,11 @@ End To End Pipeline Workflow Via Api
${status} = Login And Wait Dsp Route ${username} ${password} ${project}
Should Be True ${status} == 200 Could not login to the Data Science Pipelines Rest API OR DSP routing is not working # robocop: disable:line-too-long
Setup Client ${username} ${password} ${project}
${pipeline_param} = Create Dictionary recipient=integration_test
${pipeline_param} = Create Dictionary
${run_id} = Import Run Pipeline From Url pipeline_url=${URL_TEST_PIPELINE_RUN_YAML} pipeline_params=${pipeline_param} # robocop: disable:line-too-long
${run_status} = Check Run Status ${run_id}
Should Be Equal As Strings ${run_status} SUCCEEDED Pipeline run doesn't have a status that means success. Check the logs # robocop: disable:line-too-long
DataSciencePipelinesKfp.Delete Run ${run_id}
[Teardown] Projects.Delete Project Via CLI By Display Name ${project}
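The workflow above ends by asserting the final status string before cleanup. Reduced to plain Python, the check is (the `SUCCEEDED` state name comes from the keyword above; the helper itself is hypothetical):

```python
def assert_run_succeeded(run_status: str) -> None:
    """Raise AssertionError with a diagnostic message unless the run succeeded."""
    assert run_status == "SUCCEEDED", (
        f"Pipeline run finished with status {run_status!r}; check the logs"
    )


assert_run_succeeded("SUCCEEDED")  # passes silently
try:
    assert_run_succeeded("FAILED")
except AssertionError as err:
    print(err)  # Pipeline run finished with status 'FAILED'; check the logs
```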

Double Check If DSPA Was Created
[Documentation] Double check if DSPA was created