Merge branch 'master' into feature/Validate_ServiceMeshControlPlaneAlreadyCreated
asanzgom authored Sep 4, 2024
2 parents 6c04140 + 59278d9 commit 7b4abdf
Showing 36 changed files with 346 additions and 225 deletions.
@@ -0,0 +1,115 @@
*** Settings ***
Documentation       Collection of keywords to interact with Data Science Pipelines via CLI
Library             OperatingSystem
Resource            ../../../Resources/OCP.resource


*** Variables ***
${DSPA_PATH}=       tests/Resources/Files/pipeline-samples/v2/dspa


*** Keywords ***
Create Pipeline Server
    [Documentation]    Creates a pipeline server providing object storage and database information
    ...    Note: currently, only some of the parameters are used. In the future this keyword will be
    ...    enhanced to use them all
    [Arguments]    ${namespace}
    ...    ${object_storage_access_key}    ${object_storage_secret_key}
    # ...    ${object_storage_endpoint}    ${object_storage_region}
    # ...    ${object_storage_bucket_name}
    # ...    ${database_host}=${EMPTY}    ${database_port}=3306
    # ...    ${database_username}=${EMPTY}    ${database_password}=${EMPTY}
    # ...    ${database_db_name}=${EMPTY}
    ...    ${dsp_version}=v2

    Create Secret With Pipelines Object Storage Information    namespace=${namespace}
    ...    object_storage_access_key=${object_storage_access_key}
    ...    object_storage_secret_key=${object_storage_secret_key}

    # Process DSPA Template to create pipeline server
    ${TEMPLATE_PARAMETERS}=    Set Variable    -p DSP_VERSION=${dsp_version}
    Run    oc process -f ${DSPA_PATH}/dspa-template.yaml ${TEMPLATE_PARAMETERS} | oc apply -n ${namespace} -f -

# robocop: disable:line-too-long
Create PipelineServer Using Custom DSPA
    [Documentation]    Installs a DataSciencePipelinesApplication CR and verifies that it is working
    [Arguments]    ${namespace}    ${dspa_file}=data-science-pipelines-sample.yaml    ${assert_install}=True

    Run    oc apply -f "${DSPA_PATH}/${dspa_file}" -n ${namespace}
    IF    ${assert_install}==True
        ${generation_value}=    Run    oc get datasciencepipelinesapplications -n ${namespace} -o json | jq '.items[0].metadata.generation'
        Should Be True    ${generation_value} == 2    DataSciencePipelinesApplication created
    END

Verify Pipeline Server Deployments    # robocop: disable
    [Documentation]    Verifies the correct deployment of DS Pipelines in the given namespace
    [Arguments]    ${namespace}

    @{all_pods}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=component=data-science-pipelines
    Run Keyword And Continue On Failure    Length Should Be    ${all_pods}    7

    @{pipeline_api_server}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-dspa
    ${containerNames}=    Create List    oauth-proxy    ds-pipeline-api-server
    Verify Deployment    ${pipeline_api_server}    1    2    ${containerNames}

    @{pipeline_metadata_envoy}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-metadata-envoy-dspa
    ${containerNames}=    Create List    container    oauth-proxy
    Verify Deployment    ${pipeline_metadata_envoy}    1    2    ${containerNames}

    @{pipeline_metadata_grpc}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-metadata-grpc-dspa
    ${containerNames}=    Create List    container
    Verify Deployment    ${pipeline_metadata_grpc}    1    1    ${containerNames}

    @{pipeline_persistenceagent}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-persistenceagent-dspa
    ${containerNames}=    Create List    ds-pipeline-persistenceagent
    Verify Deployment    ${pipeline_persistenceagent}    1    1    ${containerNames}

    @{pipeline_scheduledworkflow}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-scheduledworkflow-dspa
    ${containerNames}=    Create List    ds-pipeline-scheduledworkflow
    Verify Deployment    ${pipeline_scheduledworkflow}    1    1    ${containerNames}

    @{pipeline_workflow_controller}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=ds-pipeline-workflow-controller-dspa
    ${containerNames}=    Create List    ds-pipeline-workflow-controller
    Verify Deployment    ${pipeline_workflow_controller}    1    1    ${containerNames}

    @{mariadb}=    Oc Get    kind=Pod    namespace=${namespace}
    ...    label_selector=app=mariadb-dspa
    ${containerNames}=    Create List    mariadb
    Verify Deployment    ${mariadb}    1    1    ${containerNames}

Wait Until Pipeline Server Is Deployed
    [Documentation]    Waits until all the expected pods of the pipeline server
    ...    are running
    [Arguments]    ${namespace}
    Wait Until Keyword Succeeds    10 times    10s
    ...    Verify Pipeline Server Deployments    namespace=${namespace}

Wait Until Pipeline Server Is Deleted
    [Documentation]    Waits until all pipeline server pods are deleted
    [Arguments]    ${namespace}
    # robocop: off=expression-can-be-simplified
    FOR    ${_}    IN RANGE    0    30
        ${pod_count}=    Run    oc get pods -n ${namespace} -l component=data-science-pipelines | wc -l
        IF    ${pod_count}==0    BREAK
        Sleep    1s
    END

# robocop: disable:line-too-long
Create Pipelines ConfigMap With Custom Pip Index Url And Trusted Host
    [Documentation]    Creates a ConfigMap (ds-pipeline-custom-env-vars) in the project,
    ...    storing the values for pip_index_url and pip_trusted_host
    [Arguments]    ${namespace}
    Run    oc create configmap ds-pipeline-custom-env-vars -n ${namespace} --from-literal=pip_index_url=${PIP_INDEX_URL} --from-literal=pip_trusted_host=${PIP_TRUSTED_HOST}

Create Secret With Pipelines Object Storage Information
    [Documentation]    Creates the secret with the object storage credentials that the pipeline server needs
    [Arguments]    ${namespace}    ${object_storage_access_key}    ${object_storage_secret_key}
    Run    oc create secret generic dashboard-dspa-secret -n ${namespace} --from-literal=AWS_ACCESS_KEY_ID=${object_storage_access_key} --from-literal=AWS_SECRET_ACCESS_KEY=${object_storage_secret_key}
    Run    oc label secret dashboard-dspa-secret -n ${namespace} opendatahub.io/dashboard=true
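
Taken together, the new backend keywords support an end-to-end CLI flow. A minimal usage sketch, assuming a hypothetical suite, namespace, and credential variables (none of these names come from the commit):

*** Settings ***
Resource    ../../Resources/CLI/DataSciencePipelines/DataSciencePipelinesBackend.resource

*** Test Cases ***
Provision And Tear Down A Pipeline Server    # hypothetical test case
    Create Pipeline Server    namespace=ds-project-example
    ...    object_storage_access_key=${S3_ACCESS_KEY}
    ...    object_storage_secret_key=${S3_SECRET_KEY}
    Wait Until Pipeline Server Is Deployed    namespace=ds-project-example
    # The template names the DSPA "dspa", so deletion targets that resource
    Run    oc delete datasciencepipelinesapplication dspa -n ds-project-example
    Wait Until Pipeline Server Is Deleted    namespace=ds-project-example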
6 changes: 5 additions & 1 deletion ods_ci/tests/Resources/CLI/ModelServing/llm.resource
@@ -361,7 +361,11 @@ Query Model Multiple Times
         ...    inference_type=${inference_type}    model_name=${model_name}    body_params=${body_params}
         ...    query_text=${EXP_RESPONSES}[queries][${query_idx}][query_text]
         IF    "${token}" != "${None}"
-            Set To Dictionary    ${header}    Authorization=Bearer ${token}
+            IF    "${protocol}" == "grpc"
+                ${header}=    Set Variable    "Authorization: Bearer ${token}" -H ${header}
+            ELSE
+                Set To Dictionary    ${header}    Authorization=Bearer ${token}
+            END
         END
         ${runtime_details}=    Set Variable    ${RUNTIME_FORMATS}[${runtime}][endpoints][${inference_type}][${protocol}]
         ${endpoint}=    Set Variable    ${runtime_details}[endpoint]
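The new gRPC branch exists because the two protocols consume ${header} differently: over HTTP it is a header dictionary passed with the request, while over gRPC the keyword apparently builds a flat string of -H flags for a grpcurl-style command line. A hedged sketch of the two shapes (the extra metadata header is an illustrative assumption):

# HTTP: ${header} is a dictionary; the token becomes one more entry
${header}=    Create Dictionary    Content-Type=application/json
Set To Dictionary    ${header}    Authorization=Bearer ${token}

# gRPC: ${header} is a command-line fragment; prepending the token yields
# something like:  "Authorization: Bearer <token>" -H "mm-model-id: my-model"
${header}=    Set Variable    "Authorization: Bearer ${token}" -H ${header}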
24 changes: 12 additions & 12 deletions ods_ci/tests/Resources/CLI/MustGather/MustGather.resource
@@ -12,21 +12,21 @@ Get must-gather Logs
     ${output}=    Run Process    tests/Resources/CLI/MustGather/get-must-gather-logs.sh    shell=yes
     Should Be Equal As Integers    ${output.rc}    0
     Should Not Contain    ${output.stdout}    FAIL
-    ${must-gather-dir}=    Run    ls -d must-gather.local.*
-    ${namespaces-log-dir}=    Run    ls -d ${must-gather-dir}/quay-io-modh-must-gather-sha256-*/namespaces
-    Set Suite Variable    ${must-gather-dir}
-    Set Suite Variable    ${namespaces-log-dir}
-    Directory Should Exist    ${must-gather-dir}
-    Directory Should Not Be Empty    ${must-gather-dir}
+    ${must_gather_dir}=    Run    ls -d must-gather.local.*
+    ${namespaces_log_dir}=    Run    ls -d ${must_gather_dir}/quay-io-modh-must-gather-sha256-*/namespaces
+    Set Suite Variable    ${must_gather_dir}
+    Set Suite Variable    ${namespaces_log_dir}
+    Directory Should Exist    ${must_gather_dir}
+    Directory Should Not Be Empty    ${must_gather_dir}

 Verify Logs For ${namespace}
     [Documentation]    Verifies the must-gather logs related to a namespace
-    Directory Should Exist    ${namespaces-log-dir}/${namespace}
-    Directory Should Not Be Empty    ${namespaces-log-dir}/${namespace}
-    Directory Should Not Be Empty    ${namespaces-log-dir}/${namespace}/pods
-    ${log-files}=    Run    find ${namespaces-log-dir}/${namespace}/pods -type f -name "*.log"
-    Should Not Be Equal    ${log-files}    ${EMPTY}
+    Directory Should Exist    ${namespaces_log_dir}/${namespace}
+    Directory Should Not Be Empty    ${namespaces_log_dir}/${namespace}
+    Directory Should Not Be Empty    ${namespaces_log_dir}/${namespace}/pods
+    ${log_files}=    Run    find ${namespaces_log_dir}/${namespace}/pods -type f -name "*.log"
+    Should Not Be Equal    ${log_files}    ${EMPTY}

 Cleanup must-gather Logs
     [Documentation]    Deletes the folder with the must-gather logs
-    Remove Directory    ${must-gather-dir}    recursive=True
+    Run Keyword If    "${must_gather_dir}" != "${EMPTY}"    Remove Directory    ${must_gather_dir}    recursive=True
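Besides the rename to underscore-style names, the cleanup is now guarded: if Get must-gather Logs failed before setting the suite variable, the old unconditional Remove Directory would error out. A minimal sketch of that pattern, assuming the variable may be unset:

# Resolve the suite variable to ${EMPTY} when it was never set
${must_gather_dir}=    Get Variable Value    \${must_gather_dir}    ${EMPTY}
# Only attempt removal when a directory path was actually captured
Run Keyword If    "${must_gather_dir}" != "${EMPTY}"
...    Remove Directory    ${must_gather_dir}    recursive=True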
@@ -1,7 +1,7 @@
 #!/bin/sh
 # Redirecting stdout/stderr of must-gather to a file, as it fills up the
 # process buffer and prevents the script from running further.
-oc adm must-gather --image=quay.io/modh/must-gather@sha256:1bd8735d715b624c1eaf484454b0d6d400a334d8cbba47f99883626f36e96657 &> must-gather-results.txt
+oc adm must-gather --image=quay.io/modh/must-gather@sha256:9d5988f45c3b00ec7fbbe7a8a86cc149a2768c9c47e207694fdb6e87ef44adf3 &> must-gather-results.txt

 if [ $? -eq 0 ]
 then
@@ -0,0 +1,39 @@
kind: Template
apiVersion: template.openshift.io/v1
metadata:
  name: dspa-template
objects:
  - apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
    kind: DataSciencePipelinesApplication
    metadata:
      name: dspa
    spec:
      dspVersion: ${DSP_VERSION}
      objectStorage:
        disableHealthCheck: false
        enableExternalRoute: false
        externalStorage:
          basePath: ''
          bucket: ${OBJECT_STORAGE_BUCKET}
          host: ${OBJECT_STORAGE_HOST}
          port: ''
          region: ${OBJECT_STORAGE_REGION}
          s3CredentialsSecret:
            accessKey: AWS_ACCESS_KEY_ID
            secretKey: AWS_SECRET_ACCESS_KEY
            secretName: dashboard-dspa-secret
          scheme: https
      podToPodTLS: true
parameters:
  - description: Kubeflow Pipelines Version
    value: "v2"
    name: DSP_VERSION
  - description: Object Storage Bucket Name
    value: "ods-ci-ds-pipelines"
    name: OBJECT_STORAGE_BUCKET
  - description: Object Storage Host
    value: "s3.amazonaws.com"
    name: OBJECT_STORAGE_HOST
  - description: Object Storage Region
    value: "us-east-1"
    name: OBJECT_STORAGE_REGION
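
Create Pipeline Server only overrides DSP_VERSION when processing this template, so the bucket, host, and region defaults above apply. Overriding more parameters would presumably look like the sketch below (the bucket and region values are invented):

# Hypothetical oc process call overriding several template parameters
${cmd}=    Catenate
...    oc process -f ${DSPA_PATH}/dspa-template.yaml
...    -p DSP_VERSION=v2
...    -p OBJECT_STORAGE_BUCKET=my-test-bucket
...    -p OBJECT_STORAGE_REGION=eu-west-1
...    | oc apply -n ${namespace} -f -
Run    ${cmd}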
@@ -10,7 +10,7 @@ ${CODEFLARE-SDK-API_URL}    %{CODEFLARE-SDK-API_URL=https://api.git
 ${CODEFLARE-SDK_DIR}                            codeflare-sdk
 ${CODEFLARE-SDK_REPO_URL}                       %{CODEFLARE-SDK_REPO_URL=https://github.com/project-codeflare/codeflare-sdk.git}
 ${DISTRIBUTED_WORKLOADS_RELEASE_ASSETS}         https://github.com/opendatahub-io/distributed-workloads/releases/latest/download
-${FMS_HF_TUNING_IMAGE}                          quay.io/modh/fms-hf-tuning@sha256:2985c259c66e227417ed69365bb23ab92ed5022650672771e56070326b21d5f4
+${FMS_HF_TUNING_IMAGE}                          quay.io/modh/fms-hf-tuning@sha256:8edea6f0f9c4c631cdca1e1c10abf0d4b994738fde78c40d48eda216fdd382f5
 ${KFTO_CORE_BINARY_NAME}                        kfto
 ${KFTO_UPGRADE_BINARY_NAME}                     kfto-upgrade
@@ -61,7 +61,7 @@ Select Refresh Interval
     [Arguments]    ${refresh_interval}
     Wait Until Element Is Visible    ${REFRESH_INTERVAL_XP}    timeout=20
     Click Element    ${REFRESH_INTERNAL_MENU_XP}
-    Click Element    xpath=//button[text()="${refresh_interval}"]
+    Click Element    xpath=//button[@role="option" and contains(., "${refresh_interval}")]

 Get Current CPU Usage
     [Documentation]    Returns value of current cpu usage
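The locator change above is about nested markup: //button[text()="..."] only matches when the label is the button's immediate text node, while PatternFly-style menus typically wrap option labels in inner elements. contains(., "...") tests the element's whole string value, and @role="option" narrows the match to menu options. An illustrative comparison (the "15 seconds" label is an assumed example):

# Brittle: fails against <button role="option"><span>15 seconds</span></button>
Click Element    xpath=//button[text()="15 seconds"]
# Robust: matches on the button's full string value, nested spans included
Click Element    xpath=//button[@role="option" and contains(., "15 seconds")]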
@@ -5,12 +5,12 @@ Resource    ../../../ODS.robot
 Resource    ../../../Common.robot
 Resource    ../../../Page/ODH/ODHDashboard/ODHDashboard.robot
 Library     DateTime
+Resource    ../../../CLI/DataSciencePipelines/DataSciencePipelinesBackend.resource
 Library     ../../../../../libs/DataSciencePipelinesAPI.py
 Resource    ODHDataScienceProject/Pipelines.resource


 *** Variables ***
-${DATA_SCIENCE_PIPELINES_APPLICATION_PATH}=    tests/Resources/Files
 ${PIPELINES_IMPORT_BTN_FORM_XP}=    xpath://*[@data-testid="import-button"]
 ${PIPELINE_NAME_INPUT_XP}=    xpath://*[@data-testid="pipeline-name"]
 ${PIPELINE_DESC_INPUT_XP}=    xpath://*[@data-testid="pipeline-description"]
@@ -23,23 +23,6 @@ ${PIPELINE_EXPERIMENT_TABLE_XP}=    xpath://*[@data-testid="experim


 *** Keywords ***
-# robocop: disable:line-too-long
-Install DataSciencePipelinesApplication CR
-    [Documentation]    Install and verifies that DataSciencePipelinesApplication CRD is installed and working
-    [Arguments]    ${project}    ${dsp_file}=data-science-pipelines-sample.yaml    ${assert_install}=True
-    Log    ${project}
-    Oc Apply    kind=DataSciencePipelinesApplication    src=${DATA_SCIENCE_PIPELINES_APPLICATION_PATH}/${dsp_file}    namespace=${project}
-    IF    ${assert_install}==True
-        ${generation_value}    Run    oc get datasciencepipelinesapplications -n ${project} -o json | jq '.items[0].metadata.generation'
-        Should Be True    ${generation_value} == 2    DataSciencePipelinesApplication created
-    END
-
-Create Pipelines ConfigMap With Custom Pip Index Url And Trusted Host
-    [Documentation]    Creates a Configmap (ds-pipeline-custom-env-vars) in the project,
-    ...    storing the values for pip_index_url and pip_trusted_host
-    [Arguments]    ${project_title}
-    Run    oc create configmap ds-pipeline-custom-env-vars --from-literal=pip_index_url=${PIP_INDEX_URL} --from-literal=pip_trusted_host=${PIP_TRUSTED_HOST} -n ${project_title}
-
 Fill In Pipeline Import Form
     [Documentation]    Compiles the form to create a pipeline.
     ...    It works when you start server creation from either
@@ -151,7 +134,7 @@ Delete Pipeline Server
     Click Element    xpath://button/span/span[text()='Delete pipeline server']
     Handle Deletion Confirmation Modal    ${data_science_project_name}    pipeline server    pipeline server
     Wait Until Page Contains    text=Configure pipeline server    timeout=120s
-    Pipelines.Wait Until Pipeline Server Is Deleted    ${data_science_project_name}
+    DataSciencePipelinesBackend.Wait Until Pipeline Server Is Deleted    ${data_science_project_name}

 Verify There Is No "Error Displaying Pipelines" After Creating Pipeline Server
     [Documentation]    Verifies there is no "Error displaying Pipelines" message after creating a pipeline server
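Prefixing the call with DataSciencePipelinesBackend. pins the keyword to the new CLI resource now that the UI-side copy is gone; in Robot Framework, the ResourceName.Keyword form disambiguates (or simply documents) which import supplies a keyword. A small illustrative sketch:

# Fully qualified calls make the source of a shared keyword name explicit
DataSciencePipelinesBackend.Wait Until Pipeline Server Is Deleted    ${project}
# Unqualified calls also work, but fail with an ambiguity error if two
# imported resources ever define the same keyword name
Wait Until Pipeline Server Is Deleted    ${project}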
@@ -71,7 +71,8 @@ Import Pipeline
         Click Button    ${PIPELINES_IMPORT_BTN_FORM_XP}
     END
     Wait Until Generic Modal Disappears
-    Wait Until Project Is Open    project_title=${project_title}
+    Maybe Wait For Dashboard Loading Spinner Page    timeout=45s
+    Wait For Dashboard Page Title    ${name}    timeout=30s

 Create Pipeline Run
     [Documentation]    Create a pipeline run from DS Project details page.
@@ -277,69 +278,6 @@ Wait Until Page Contains Run Topology Page
     Run Keyword And Continue On Failure    Wait Until Page Contains Element
     ...    ${RUN_TOPOLOGY_XP}

-Verify Pipeline Server Deployments    # robocop: disable
-    [Documentation]    Verifies the correct deployment of DS Pipelines in the rhods namespace
-    [Arguments]    ${project_title}
-
-    ${namespace}=    Get Openshift Namespace From Data Science Project
-    ...    project_title=${project_title}
-
-    @{all_pods}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=component=data-science-pipelines
-    Run Keyword And Continue On Failure    Length Should Be    ${all_pods}    7
-
-    @{pipeline_api_server}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-dspa
-    ${containerNames}=    Create List    oauth-proxy    ds-pipeline-api-server
-    Verify Deployment    ${pipeline_api_server}    1    2    ${containerNames}
-
-    @{pipeline_metadata_envoy}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-metadata-envoy-dspa
-    ${containerNames}=    Create List    container    oauth-proxy
-    Verify Deployment    ${pipeline_metadata_envoy}    1    2    ${containerNames}
-
-    @{pipeline_metadata_grpc}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-metadata-grpc-dspa
-    ${containerNames}=    Create List    container
-    Verify Deployment    ${pipeline_metadata_grpc}    1    1    ${containerNames}
-
-    @{pipeline_persistenceagent}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-persistenceagent-dspa
-    ${containerNames}=    Create List    ds-pipeline-persistenceagent
-    Verify Deployment    ${pipeline_persistenceagent}    1    1    ${containerNames}
-
-    @{pipeline_scheduledworkflow}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-scheduledworkflow-dspa
-    ${containerNames}=    Create List    ds-pipeline-scheduledworkflow
-    Verify Deployment    ${pipeline_scheduledworkflow}    1    1    ${containerNames}
-
-    @{pipeline_workflow_controller}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=ds-pipeline-workflow-controller-dspa
-    ${containerNames}=    Create List    ds-pipeline-workflow-controller
-    Verify Deployment    ${pipeline_workflow_controller}    1    1    ${containerNames}
-
-    @{mariadb}=    Oc Get    kind=Pod    namespace=${namespace}
-    ...    label_selector=app=mariadb-dspa
-    ${containerNames}=    Create List    mariadb
-    Verify Deployment    ${mariadb}    1    1    ${containerNames}
-
-Wait Until Pipeline Server Is Deployed
-    [Documentation]    Waits until all the expected pods of the pipeline server
-    ...    are running
-    [Arguments]    ${project_title}
-    Wait Until Keyword Succeeds    10 times    10s
-    ...    Verify Pipeline Server Deployments    project_title=${project_title}
-
-Wait Until Pipeline Server Is Deleted
-    [Documentation]    Waits until all pipeline server pods are deleted
-    [Arguments]    ${project_title}
-    # robocop: off=expression-can-be-simplified
-    FOR    ${_}    IN RANGE    0    30
-        ${pod_count}=    Run    oc get pods -n ${project_title} -l component=data-science-pipelines | wc -l
-        IF    ${pod_count}==0    BREAK
-        Sleep    1s
-    END
-
 # TODO: we need to replace this keyword for a similar one checking in Data Science Pipelines > Runs
 # Verify Successful Pipeline Run Via Project UI
 #    [Documentation]    Validates that a given pipeline run in a given pipeline is in successful end state