Merge branch 'main' of github.com:rapidsai/deployment into migrate-azure-mnmg
skirui-source committed Sep 28, 2023
commit e0f69c9 (2 parents: 8af59fd + 7722303)
Showing 26 changed files with 336 additions and 225 deletions.
.github/workflows/build-and-deploy.yml (6 changes: 3 additions & 3 deletions)

@@ -21,7 +21,7 @@ jobs:
name: Build (and deploy)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3.1.0
- uses: actions/checkout@v4
with:
fetch-depth: 0

@@ -38,7 +38,7 @@ jobs:
DEPLOYMENT_DOCS_BUILD_STABLE: ${{ startsWith(github.event.ref, 'refs/tags/') && 'true' || 'false' }}
run: make dirhtml SPHINXOPTS="-W --keep-going -n"

- uses: aws-actions/configure-aws-credentials@v1-node16
- uses: aws-actions/configure-aws-credentials@v4
if: ${{ github.repository == 'rapidsai/deployment' && github.event_name == 'push' }}
with:
role-to-assume: ${{ vars.AWS_ROLE_ARN }}
@@ -49,4 +49,4 @@ jobs:
if: ${{ github.repository == 'rapidsai/deployment' && github.event_name == 'push' }}
env:
DESTINATION_DIR: ${{ startsWith(github.event.ref, 'refs/tags/') && 'stable' || 'nightly' }}
run: aws s3 sync --no-progress --delete build/dirhtml "s3://rapidsai-docs/deployment/${DESTINATION_DIR}/html"
run: aws s3 sync --no-progress --delete build/dirhtml "s3://rapidsai-docs/deployment/html/${DESTINATION_DIR}"
source/cloud/aws/ec2-multi.md (2 changes: 1 addition & 1 deletion)

@@ -39,7 +39,7 @@ cluster = EC2Cluster(
worker_class="dask_cuda.CUDAWorker",
worker_options={"rmm-managed-memory": True},
security_groups=["<SECURITY GROUP ID>"],
docker_args="--shm-size=256m -e DISABLE_JUPYTER=true",
docker_args="--shm-size=256m",
n_workers=3,
security=False,
availability_zone="us-east-1a",
source/cloud/azure/azure-vm-multi.md (2 changes: 1 addition & 1 deletion)

@@ -39,7 +39,7 @@ cluster = AzureVMCluster(
n_workers=2,
worker_class="dask_cuda.CUDAWorker",
docker_image="{{rapids_container}}",
docker_args="-e DISABLE_JUPYTER=true -p 8787:8787 -p 8786:8786",
docker_args="-p 8787:8787 -p 8786:8786",
)
```

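Not part of this commit, but for context on the two `dask_cloudprovider` snippets above: once an `EC2Cluster` or `AzureVMCluster` like these is up, you would typically attach a Dask client to it. A minimal sketch using the standard `dask.distributed` API (the `cluster` object and the worker count are illustrative, taken from the examples above):

```python
from dask.distributed import Client

# Attach a client to the cluster object constructed above
# (the same pattern works for EC2Cluster and AzureVMCluster).
client = Client(cluster)

# Optionally block until the dask-cuda workers have registered
# with the scheduler; the count here is illustrative.
client.wait_for_workers(n_workers=2)

print(client.dashboard_link)  # Dask dashboard, served from port 8787
```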
source/cloud/gcp/vertex-ai.md (19 changes: 13 additions & 6 deletions)

@@ -25,14 +25,21 @@ $ docker push gcr.io/<project>/<folder>/{{ rapids_container.replace('rapidsai/',
## Create a New Notebook

1. From the Google Cloud UI, navigate to [**Vertex AI**](https://console.cloud.google.com/vertex-ai) -> **Dashboard** and select **+ CREATE NOTEBOOK INSTANCE**.
2. Under the **Environment** section, specify **Custom container**, and in the section below, select the `gcr.io` path to your pushed RAPIDS Docker image.
3. Under **Machine Configuration** select an NVIDIA GPU.
4. Check the **Install NVIDIA GPU Driver** option.
5. After customizing any other aspects of the machine you wish, click **CREATE**.
2. In the **Details** section, under the **Workbench type** heading, select **Managed Notebook** from the drop-down menu.
3. Under the **Environment** section, select **Provide custom docker images**, and in the input field below, select the `gcr.io` path to your pushed RAPIDS Docker image.
4. Under the **Machine type** section select an NVIDIA GPU.
5. Check the **Install NVIDIA GPU Driver** option.
6. After customizing any other aspects of the machine you wish, click **CREATE**.

## TEST RAPIDS
## Test RAPIDS

Once the managed notebook is fully configured, you can click **OPEN JUPYTERLAB** to navigate to another tab running JupyterLab to use the latest version of RAPIDS with Vertex AI.
Once the managed notebook is fully configured, you can click **OPEN JUPYTERLAB** to navigate to another tab running JupyterLab.

```{warning}
You should see a popup letting you know it is loading the RAPIDS kernel; this can take a long time, so please be patient.
```

Once the kernel is loaded you can launch a notebook with the `rapids` kernel to use the latest version of RAPIDS with Vertex AI.

For example, we could import and use RAPIDS libraries like `cudf`.

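A minimal illustration of that last sentence (not from the diff; it assumes only the standard `cudf` API):

```python
import cudf

# Build a small GPU DataFrame and run a grouped aggregation on it.
df = cudf.DataFrame({"key": ["a", "b", "a", "b"], "value": [1, 2, 3, 4]})
print(df.groupby("key")["value"].sum())
```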
source/conf.py (16 changes: 10 additions & 6 deletions)

@@ -20,18 +20,22 @@
copyright = f"{datetime.date.today().year}, NVIDIA"
author = "NVIDIA"

# Single modifiable version for all of the docs - easier for future updates
stable_version = "23.08"
nightly_version = "23.10"

versions = {
"stable": {
"rapids_version": "23.06",
"rapids_container": "nvcr.io/nvidia/rapidsai/rapidsai-core:23.06-cuda11.8-runtime-ubuntu22.04-py3.10",
"rapids_version": stable_version,
"rapids_container": f"nvcr.io/nvidia/rapidsai/base:{stable_version}-cuda11.8-py3.10",
"rapids_conda_channels": "-c rapidsai -c conda-forge -c nvidia",
"rapids_conda_packages": "rapids=23.06 python=3.10 cudatoolkit=11.8",
"rapids_conda_packages": f"rapids={stable_version} python=3.10 cudatoolkit=11.8",
},
"nightly": {
"rapids_version": "23.08-nightly",
"rapids_container": "rapidsai/rapidsai-core-nightly:23.08-cuda11.8-runtime-ubuntu22.04-py3.10",
"rapids_version": f"{nightly_version}-nightly",
"rapids_container": f"rapidsai/base:{nightly_version + 'a'}-cuda11.8-py3.10",
"rapids_conda_channels": "-c rapidsai-nightly -c conda-forge -c nvidia",
"rapids_conda_packages": "rapids=23.08 python=3.10 cudatoolkit=11.8",
"rapids_conda_packages": f"rapids={nightly_version} python=3.10 cudatoolkit=11.8",
},
}
rapids_version = (
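Not from the diff, but to show how the templated values above resolve once the new version variables are substituted, here is a plain-Python sketch mirroring the f-strings in `conf.py` (the trailing `a` comes from the `nightly_version + 'a'` expression and marks the nightly tag):

```python
stable_version = "23.08"
nightly_version = "23.10"

# Stable container reference used across the docs.
print(f"nvcr.io/nvidia/rapidsai/base:{stable_version}-cuda11.8-py3.10")
# -> nvcr.io/nvidia/rapidsai/base:23.08-cuda11.8-py3.10

# Nightly images append an 'a' suffix to the version.
print(f"rapidsai/base:{nightly_version + 'a'}-cuda11.8-py3.10")
# -> rapidsai/base:23.10a-cuda11.8-py3.10
```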
@@ -14,7 +14,7 @@ spec:
spec:
initContainers:
- name: prepull-rapids
image: us-central1-docker.pkg.dev/nv-ai-infra/rapidsai/rapidsai-core:23.02-cuda11.8-runtime-ubuntu22.04-py3.10
image: us-central1-docker.pkg.dev/nv-ai-infra/rapidsai/rapidsai/base:23.08-cuda12.0-py3.10
command: ["sh", "-c", "'true'"]
containers:
- name: pause