diff --git a/.DS_Store b/.DS_Store
index 7abfdec..54494c7 100644
Binary files a/.DS_Store and b/.DS_Store differ
diff --git a/modules/.DS_Store b/modules/.DS_Store
index c305f39..a16fed6 100644
Binary files a/modules/.DS_Store and b/modules/.DS_Store differ
diff --git a/modules/ROOT/.DS_Store b/modules/ROOT/.DS_Store
new file mode 100644
index 0000000..9210ee7
Binary files /dev/null and b/modules/ROOT/.DS_Store differ
diff --git a/modules/ROOT/images/intro_v4.mp4 b/modules/ROOT/images/intro_v4.mp4
new file mode 100644
index 0000000..d8ff088
Binary files /dev/null and b/modules/ROOT/images/intro_v4.mp4 differ
diff --git a/modules/ROOT/pages/index.adoc b/modules/ROOT/pages/index.adoc
index 43657da..b114985 100644
--- a/modules/ROOT/pages/index.adoc
+++ b/modules/ROOT/pages/index.adoc
@@ -1,7 +1,10 @@
 = Serving LLM Models on OpenShift AI
 :navtitle: Home
 
-Welcome to this Quick course on _Deploying an LLM using OpenShift AI_.
+
+video::intro_v4.mp4[width=800,start=60,opts=autoplay]
+
+Welcome to this quick course on _Serving an LLM using OpenShift AI_.
 
 This program was designed to guide you through the process of installing an OpenShift AI Platform using an OpenShift Container Platform Web Console UI. We get hands-on experience in each component needed to enable a RHOAI Platform using an Openshift Container Platform Cluster.
 
@@ -19,11 +22,15 @@ IMPORTANT: The hands-on labs in this course were created and tested with RHOAI v
 
 The PTL team acknowledges the valuable contributions of the following Red Hat associates:
 
-*Christopher Nuland
+* Christopher Nuland
+
+* Vijay Chebolu
+
+* Noel O'Conner
 
-*Vijay Chebolu & Team
+* Hunter Gerlach
 
-*Karlos Knox
+* Karlos Knox
 
 == Classroom Environment
 
@@ -56,16 +63,16 @@ For Red Hat partners who do not have access to RHDP, provision an environment us
 
 The overall objectives of this course include:
 
- * Familiarize utilizing Red Hat OpenShift AI to Serve & Interact with an LLM.
+ * Utilize Red Hat OpenShift AI to serve & interact with an LLM
 
- * Install Red Hat OpenShift AI Operators & Dependencies
+ * Install Red Hat OpenShift AI operators & dependencies
 
- * Add a custom Model Serving Runtime
+ * Add a custom model serving runtime
 
  * Create a data science project, workbench & data connections
 
- * Load an LLM model into the Ollama Runtime Framework
+ * Load an LLM model into the Ollama runtime framework
 
- * Import (from Git repositories), interact with LLM model via a Jupyter Notebook
+ * Import (from Git repositories) and interact with the LLM model via a Jupyter notebook
 
  * Experiment with the Mistral LLM
\ No newline at end of file
diff --git a/modules/chapter2/images/createDSC.png b/modules/chapter2/images/createDSC.png
new file mode 100644
index 0000000..4d1dfee
Binary files /dev/null and b/modules/chapter2/images/createDSC.png differ
diff --git a/modules/chapter2/images/createsecret.png b/modules/chapter2/images/createsecret.png
new file mode 100644
index 0000000..8501da0
Binary files /dev/null and b/modules/chapter2/images/createsecret.png differ
diff --git a/modules/chapter2/images/dcsyamlfile.png b/modules/chapter2/images/dcsyamlfile.png
new file mode 100644
index 0000000..743337d
Binary files /dev/null and b/modules/chapter2/images/dcsyamlfile.png differ
diff --git a/modules/chapter2/images/openshiftai_operator.png b/modules/chapter2/images/openshiftai_operator.png
new file mode 100644
index 0000000..cbc85de
Binary files /dev/null and b/modules/chapter2/images/openshiftai_operator.png differ
diff --git a/modules/chapter2/images/openshiftingress_project.png b/modules/chapter2/images/openshiftingress_project.png
new file mode 100644
index 0000000..0206934
Binary files /dev/null and b/modules/chapter2/images/openshiftingress_project.png differ
diff --git a/modules/chapter2/images/serverless_operator.png b/modules/chapter2/images/serverless_operator.png
new file mode 100644
index 0000000..ea34e50
Binary files /dev/null and b/modules/chapter2/images/serverless_operator.png differ
diff --git a/modules/chapter2/pages/section1.adoc b/modules/chapter2/pages/section1.adoc
index 3b7a5ab..aaeac20 100644
--- a/modules/chapter2/pages/section1.adoc
+++ b/modules/chapter2/pages/section1.adoc
@@ -32,7 +32,9 @@ This exercise uses the Red Hat Demo Platform; specifically the OpenShift Contain
 
 Installing these Operators prior to the installation of the OpenShift AI Operator in my experience has made a difference in OpenShift AI acknowledging the availability of these components and adjusting the initial configuration to shift management of these components to OpenShift AI.
 
-. Navigate to **Operators** -> **OperatorHub** and search for *OpenShift AI*.
+* Navigate to **Operators** -> **OperatorHub** and search for *OpenShift AI*.
+
+image::openshiftai_operator.png[]
 
 . Click on the `Red{nbsp}Hat OpenShift AI` operator. In the pop up window that opens, ensure you select the latest version in the *stable* channel and click on **Install** to open the operator's installation view.
 +
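The course installs the Red Hat OpenShift AI operator through the OperatorHub UI, as the section1.adoc change above shows. For readers who prefer the CLI, an OLM Subscription along the lines of the sketch below is roughly equivalent. The namespace `redhat-ods-operator`, package `rhods-operator`, and catalog `redhat-operators` are assumptions based on the operator's usual catalog entry rather than values given in the course; verify them against the OperatorHub listing before applying.

```yaml
# Hedged sketch: a CLI-side equivalent of the OperatorHub installation above.
# The namespace, package, and catalog names are assumptions; confirm them
# against the Red Hat OpenShift AI entry in your cluster's OperatorHub.
apiVersion: v1
kind: Namespace
metadata:
  name: redhat-ods-operator            # assumed operator namespace
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: rhods-operator
  namespace: redhat-ods-operator       # cluster-scoped operator, so no targetNamespaces list
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: rhods-operator
  namespace: redhat-ods-operator
spec:
  name: rhods-operator                 # assumed package name for Red Hat OpenShift AI
  channel: stable                      # the channel the course selects in the UI
  source: redhat-operators
  sourceNamespace: openshift-marketplace
```

Applying the manifest with `oc apply -f` and waiting for the operator pods in that namespace to become ready should leave the cluster in the same state as the console steps.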
diff --git a/modules/chapter2/pages/section2.adoc b/modules/chapter2/pages/section2.adoc
index 3868f3f..2adb647 100644
--- a/modules/chapter2/pages/section2.adoc
+++ b/modules/chapter2/pages/section2.adoc
@@ -24,6 +24,9 @@ The content of the Secret (data) should contain two items, *tls.cert* and *tls.k
 . In the Navigation pane on the left, click on the *Workloads* section, then *Secrets* under Workloads.
 . From the Project dropdown, toggle the *show default projects* radial button to on.
 . Select the *openshift-ingress* project from the list.
+
+image::openshiftingress_project.png[]
+
 . Locate the file named *ingress-certs-(XX-XX-2024)*, type should be *Opaque*
 . Click on the filename to open the secret, Select the *YAML Tab*
 . Copy all the text from the window, and ensure that you scroll down. (CTL-A should work).
@@ -48,8 +51,10 @@ tls.key: >-
     LS0tLS1CRUd...
 type: kubernetes.io/tls
 ```
+image::createsecret.png[]
+
-* Copy the Name in red portion of the text (optional, but helpful)
+* Copy just the secret's name from line 4 of the YAML (optional, but helpful)
 
 * Click *create* to apply this YAML into the istio-system proejct (namespace).
 
 *We have copied the Secret used by OCP & made it available be used by OAI.*
@@ -81,12 +86,16 @@ serving:
     managementState: Managed
     name: knative-serving
 ```
+image::dcsyamlfile.png[]
 
 Once you have made those changes to the YAML file, *Click Create* to Deploy the Data Science Cluster.
+
+image::createDSC.png[]
+
 Single Model Serve Platform will now be deployed to expose ingress connections with the same certificate as OpenShift Routes.
 
 Endpoints will be accessible using TLS without having to ignore error messages or create special configurations.
 
-== Epilogue
+== OpenShift AI Installation Summary
 
 Congratulations, you have successfully completed the installation of OpenShift AI on an OpenShift Container Cluster.
 
 OpenShift AI is now running on a new Dashboard!
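For reference, the YAML pasted into the *istio-system* project in the section2.adoc change above is a standard `kubernetes.io/tls` Secret, whose data keys are `tls.crt` and `tls.key`. A minimal sketch of its shape follows; the name `ingress-certs-copy` and the truncated base64 strings are placeholders rather than values from the course, and in practice the name and data come from the secret copied out of the *openshift-ingress* project.

```yaml
# Minimal sketch of the TLS Secret recreated in the istio-system namespace.
# "ingress-certs-copy" is a placeholder name; keep whatever name the copied YAML
# already carries. The data values are the base64 strings copied from the
# openshift-ingress secret (truncated here).
kind: Secret
apiVersion: v1
metadata:
  name: ingress-certs-copy        # placeholder; use the name from the copied secret
  namespace: istio-system         # target project for the single-model serving platform
data:
  tls.crt: LS0tLS1CRUd...         # certificate copied from openshift-ingress (truncated)
  tls.key: LS0tLS1CRUd...         # private key copied from openshift-ingress (truncated)
type: kubernetes.io/tls
```

Saving this to a file and running `oc apply -f <file>` is equivalent to clicking *Create* in the console's YAML editor.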