deploy: 3beee07
kknoxrht committed Jun 26, 2024
1 parent 0c55c23 commit 007361c
Showing 6 changed files with 34 additions and 71 deletions.
41 changes: 10 additions & 31 deletions model-serving/1.1/chapter2/index.html
@@ -169,7 +169,7 @@ <h2 id="_supported_configurations"><a class="anchor" href="#_supported_configura
</li>
<li>
<p>Self-managed software that you can install on-premise or on the public cloud in a self-managed environment, such as <strong>OpenShift Container Platform</strong>.
For information about OpenShift AI as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see <a href="https://access.redhat.com/documentation/en-us/red_hat_openshift_ai_self-managed/2.8">Product Documentation for Red Hat OpenShift AI Self-Managed 2.8</a>.</p>
For information about OpenShift AI as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see <a href="https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.10">Product Documentation for Red Hat OpenShift AI Self-Managed 2.10</a>.</p>
</li>
</ul>
</div>
@@ -191,49 +191,28 @@ <h2 id="_applicable_operators"><a class="anchor" href="#_applicable_operators"><
<div class="paragraph">
<p>In addition to the <strong>Red&#160;Hat OpenShift AI</strong> Operator, you may need to install other operators depending on which features and components of <strong>Red&#160;Hat OpenShift AI</strong> you want to use. A declarative installation sketch follows the list below.</p>
</div>
<div class="admonitionblock note">
<table>
<tr>
<td class="icon">
<i class="fa icon-note" title="Note"></i>
</td>
<td class="content">
<div class="paragraph">
<p>To support the KServe component, which is used by the single-model serving platform to serve large models, install the Operators for Red Hat OpenShift Serverless and Red Hat OpenShift Service Mesh.</p>
</div>
</td>
</tr>
</table>
</div>
<div class="dlist">
<dl>
<dt class="hdlist1"><a href="https://docs.openshift.com/container-platform/latest/hardware_enablement/psap-node-feature-discovery-operator.html">OpenShift Serveless Operator</a></dt>
<dt class="hdlist1"><a href="https://www.redhat.com/en/technologies/cloud-computing/openshift/serverless">Red&#160;Hat OpenShift Serverless Operator</a></dt>
<dd>
<p>The <strong>OpenShift Serveless Operator</strong> is a prerequisite for the <strong>Single Model Serving Platform</strong>.</p>
<p>The <strong>Red Hat OpenShift Serverless Operator</strong> provides a collection of APIs that enable containers, microservices, and functions to run "serverless". The <strong>Red&#160;Hat OpenShift Serverless Operator</strong> is required if you want to install the single-model serving platform component.</p>
</dd>
<dt class="hdlist1"><a href="https://docs.openshift.com/container-platform/latest/hardware_enablement/psap-node-feature-discovery-operator.html">OpenShift Service Mesh Operator</a></dt>
<dt class="hdlist1"><a href="https://catalog.redhat.com/software/container-stacks/detail/5ec53e8c110f56bd24f2ddc4">Red&#160;Hat OpenShift Service Mesh Operator</a></dt>
<dd>
<p>The <strong>OpenShift Service Mesh Operator</strong> is a prerequisite for the <strong>Single Model Serving Platform</strong>.</p>
<p>The <strong>Red Hat OpenShift Service Mesh Operator</strong> provides an easy way to create a network of deployed services that provides discovery, load balancing, service-to-service authentication, failure recovery, metrics, and monitoring. The <strong>Red&#160;Hat OpenShift Service Mesh Operator</strong> is required if you want to install the single-model serving platform component.</p>
</dd>
<dt class="hdlist1"><a href="https://www.redhat.com/en/technologies/cloud-computing/openshift/pipelines">Red&#160;Hat OpenShift Pipelines Operator</a></dt>
<dt class="hdlist1"><a href="https://developers.redhat.com/articles/2021/06/18/authorino-making-open-source-cloud-native-api-security-simple-and-flexible">Red&#160;Hat Authorino (technical preview) Operator</a></dt>
<dd>
<p>The <strong>Red&#160;Hat OpenShift Pipelines Operator</strong> is a prerequisite for the <strong>Single Model Serving Platform</strong>.</p>
<p><strong>Red Hat Authorino</strong> is an open source, Kubernetes-native external authorization service to protect APIs. The <strong>Red&#160;Hat Authorino Operator</strong> is required to support enforcing authentication policies in Red Hat OpenShift AI.</p>
</dd>
</dl>
</div>
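<div class="paragraph">
<p>For reference, these dependency Operators can also be installed declaratively with Operator Lifecycle Manager rather than through the web console. The following is a minimal sketch for the <strong>Red&#160;Hat OpenShift Serverless Operator</strong> only; the <code>openshift-serverless</code> namespace, the <code>stable</code> channel, and the <code>redhat-operators</code> catalog source are assumptions taken from the Serverless documentation. The Service Mesh and Authorino Operators follow the same Subscription pattern.</p>
</div>
<div class="listingblock">
<div class="content">
<pre># Sketch only: namespace, OperatorGroup, and Subscription for the Serverless Operator.
# The namespace, channel, and catalog source are assumptions; adjust them for your cluster.
apiVersion: v1
kind: Namespace
metadata:
  name: openshift-serverless
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: serverless-operators
  namespace: openshift-serverless
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: serverless-operator
  namespace: openshift-serverless
spec:
  channel: stable
  name: serverless-operator
  source: redhat-operators
  sourceNamespace: openshift-marketplace</pre>
</div>
</div>
<div class="paragraph">
<p>Applying this manifest with <code>oc apply -f &lt;file&gt;</code> has the same effect as installing the Operator from OperatorHub in the console.</p>
</div>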
<div class="admonitionblock note">
<table>
<tr>
<td class="icon">
<i class="fa icon-note" title="Note"></i>
</td>
<td class="content">
<div class="exampleblock">
<div class="content">
<div class="paragraph">
<p>The following Operators are required to support the use of Nvidia GPUs (accelerators) with OpenShift AI:</p>
</div>
</td>
</tr>
</table>
</div>
</div>
<div class="dlist">
<dl>
14 changes: 2 additions & 12 deletions model-serving/1.1/chapter2/section1.html
@@ -161,13 +161,6 @@ <h1 class="page">Installing Red&#160;Hat OpenShift AI Using the Web Console</h1>
<div class="paragraph">
<p><strong>Red&#160;Hat OpenShift AI</strong> is available as an operator via the OpenShift Operator Hub. You will install the <strong>Red&#160;Hat OpenShift AI operator</strong> and dependencies using the OpenShift web console in this section.</p>
</div>
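<div class="paragraph">
<p>If you prefer the command line to the web console, the same installation can be expressed as an OLM Subscription. This is a sketch that assumes the <code>rhods-operator</code> package name, the <code>redhat-ods-operator</code> namespace, and the <strong>stable</strong> channel used by the self-managed product; a matching Namespace and OperatorGroup are also required, as in the Serverless sketch shown earlier.</p>
</div>
<div class="listingblock">
<div class="content">
<pre># Sketch only: Subscription for the Red Hat OpenShift AI operator.
# Package name, namespace, and channel are assumptions; verify them in OperatorHub first.
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: rhods-operator
  namespace: redhat-ods-operator
spec:
  channel: stable
  name: rhods-operator
  source: redhat-operators
  sourceNamespace: openshift-marketplace</pre>
</div>
</div>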
<div class="videoblock">
<div class="content">
<video src="_images/openshiftai_operator.mp4" width="640" controls>
Your browser does not support the video tag.
</video>
</div>
</div>
</div>
</div>
<div class="sect1">
@@ -207,16 +200,13 @@ <h2 id="_lab_installation_of_redhat_openshift_ai"><a class="anchor" href="#_lab_
<div class="ulist">
<ul>
<li>
<p>Web Terminal</p>
</li>
<li>
<p>Red Hat OpenShift Serverless</p>
</li>
<li>
<p>Red Hat OpenShift Service Mesh</p>
</li>
<li>
<p>Red Hat OpenShift Pipelines</p>
<p>Red Hat Authorino (technical preview)</p>
</li>
<li>
<p>GPU Support</p>
@@ -260,7 +250,7 @@ <h2 id="_lab_installation_of_redhat_openshift_ai"><a class="anchor" href="#_lab_
<div class="olist arabic">
<ol class="arabic">
<li>
<p>Click on the <code>Red&#160;Hat OpenShift AI</code> operator. In the pop up window that opens, ensure you select the latest version in the <strong>fast</strong> channel. Any version greater than 2.91 and click on <strong>Install</strong> to open the operator&#8217;s installation view.</p>
<p>Click on the <code>Red&#160;Hat OpenShift AI</code> operator. In the pop-up window that opens, ensure that you select the latest version in the <strong>stable</strong> channel (any version greater than 2.10), and click <strong>Install</strong> to open the operator&#8217;s installation view.</p>
</li>
<li>
<p>In the <code>Install Operator</code> page, leave all of the options as default and click on the <strong>Install</strong> button to start the installation.</p>
9 changes: 6 additions & 3 deletions model-serving/1.1/chapter2/section3.html
@@ -237,13 +237,16 @@ <h2 id="_openshift_ai_install_summary"><a class="anchor" href="#_openshift_ai_in
<div class="ulist">
<ul>
<li>
<p>Serverless, ServiceMesh, &amp; Pipelines Operators</p>
<p>Red Hat OpenShift Serverless</p>
</li>
<li>
<p>OpenShift AI Operator</p>
<p>Red Hat OpenShift Service Mesh</p>
</li>
<li>
<p>Red Hat Authorino (technical preview)</p>
</li>
<li>
<p>Web Terminal Operator</p>
<p>OpenShift AI Operator</p>
</li>
</ul>
</div>
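<div class="paragraph">
<p>With the Operators above installed, the single-model serving platform itself is enabled through the <code>DataScienceCluster</code> resource owned by the OpenShift AI Operator. A minimal sketch follows; the resource name and the set of components shown are assumptions, and any component can be toggled with <code>Managed</code> or <code>Removed</code>.</p>
</div>
<div class="listingblock">
<div class="content">
<pre># Sketch only: DataScienceCluster enabling KServe for the single-model serving platform.
# The resource name and the components listed here are illustrative assumptions.
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc
spec:
  components:
    dashboard:
      managementState: Managed
    workbenches:
      managementState: Managed
    kserve:
      managementState: Managed</pre>
</div>
</div>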
28 changes: 13 additions & 15 deletions model-serving/1.1/chapter3/section1.html
@@ -171,23 +171,21 @@ <h1 class="page">Creating OpenShift AI Resources - 1</h1>
<h2 id="_model_serving_runtimes"><a class="anchor" href="#_model_serving_runtimes"></a>Model Serving Runtimes</h2>
<div class="sectionbody">
<div class="paragraph">
<p>A model-serving runtime provides integration with a specified model server and the model frameworks that it supports. By default, Red Hat OpenShift AI includes the following Model RunTimes:</p>
<p>A model-serving runtime provides integration with a specified model server and the model frameworks that it supports. By default, Red Hat OpenShift AI includes the following model-serving runtimes:</p>
</div>
<div class="literalblock">
<div class="content">
<pre>Multi-model
* OpenVINO Model Server
Single-model
* OpenVINO Model Server
* Caikit TGIS for KServe
* TGIS Standalone for KServe
* vLLM for KServe</pre>
</div>
<div class="ulist">
<ul>
<li>
<p>OpenVINO Model Server runtime.</p>
</li>
<li>
<p>Caikit TGIS for KServe</p>
</li>
<li>
<p>TGIS Standalone for KServe</p>
</li>
</ul>
</div>
<div class="paragraph">
<p>However, if these runtime do not meet your needs (if they don&#8217;t support a particular model framework, for example), you might want to add your own custom runtimes.</p>
<p>However, if these runtimes do not meet your needs (if they don&#8217;t support a particular model framework, for example), you might want to add your own custom runtimes.</p>
</div>
<div class="paragraph">
<p>As an administrator, you can use the OpenShift AI interface to add and enable custom model-serving runtimes. You can then choose from your enabled runtimes when you create a new model server.</p>
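<div class="paragraph">
<p>A custom runtime is defined as a KServe <code>ServingRuntime</code> resource; the Ollama fragment in the next hunk sits under the <code>spec</code> of such a resource. A minimal skeleton might look like the following, where the runtime name, supported model format, and port are illustrative assumptions and only the container image comes from this commit.</p>
</div>
<div class="listingblock">
<div class="content">
<pre># Sketch only: skeleton of a custom ServingRuntime.
# The name, supported model format, and port are illustrative assumptions.
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: ollama-runtime
spec:
  supportedModelFormats:
    - name: ollama
      autoSelect: true
  containers:
    - name: kserve-container
      image: quay.io/rh-aiservices-bu/ollama-ubi9:0.1.45
      ports:
        - containerPort: 11434
          protocol: TCP</pre>
</div>
</div>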
@@ -247,7 +245,7 @@ <h2 id="_add_the_ollama_custom_runtime"><a class="anchor" href="#_add_the_ollama
builtInAdapter:
modelLoadingTimeoutMillis: 90000
containers:
- image: quay.io/rh-aiservices-bu/ollama-ubi9:0.1.30
- image: quay.io/rh-aiservices-bu/ollama-ubi9:0.1.45
env:
- name: OLLAMA_MODELS
value: /.ollama/models
13 changes: 3 additions & 10 deletions model-serving/1.1/index.html
@@ -170,7 +170,7 @@ <h1 class="page">Serving an LLM using OpenShift AI</h1>
<p>This program was designed to guide you through the process of installing an OpenShift AI Platform using the OpenShift Container Platform Web Console UI. We gain hands-on experience with each component needed to enable a RHOAI Platform on an OpenShift Container Platform Cluster.</p>
</div>
<div class="paragraph">
<p>Once we have an operational OpenShift AI Platform, we will login and begin the configuration of: Model Runtimes, Data Science Projects, Data connections, &amp; finally use a jupyter notebook to infer the answers to easy questions.</p>
<p>Once we have an operational OpenShift AI Platform, we will log in and configure model runtimes, data science projects, and data connections, and finally use a Jupyter notebook to infer the answers to easy questions.</p>
</div>
<div class="paragraph">
<p>There will be some challenges along the way, all designed to teach us about a component, or give us the knowledge needed to utilize OpenShift AI and host a Large Language Model.</p>
@@ -237,13 +237,6 @@ <h2 id="_classroom_environment"><a class="anchor" href="#_classroom_environment"
</tr>
</table>
</div>
<div class="videoblock">
<div class="content">
<video src="_images/openshiftai_demo.mp4" width="640" controls>
Your browser does not support the video tag.
</video>
</div>
</div>
<div class="paragraph">
<p>When ordering this catalog item in RHDP:</p>
</div>
@@ -267,7 +260,7 @@ <h2 id="_classroom_environment"><a class="anchor" href="#_classroom_environment"
</ol>
</div>
<div class="paragraph">
<p>For Red Hat partners who do not have access to RHDP, provision an environment using the Red Hat Hybrid Cloud Console. Unfortunately, the labs will NOT work on the trial sandbox environment. You need to provision an OpenShift AI cluster on-premises, or in the supported cloud environments by following the product documentation at <a href="https://access.redhat.com/documentation/en-us/red_hat_openshift_ai_self-managed/2.9/html/installing_and_uninstalling_openshift_ai_self-managed/index">Product Documentation for Red Hat OpenShift AI 2024</a>.</p>
<p>For Red Hat partners who do not have access to RHDP, provision an environment using the Red Hat Hybrid Cloud Console. Unfortunately, the labs will NOT work on the trial sandbox environment. You need to provision an OpenShift AI cluster on-premises, or in the supported cloud environments by following the product documentation at <a href="https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.10/html/installing_and_uninstalling_openshift_ai_self-managed/index">Product Documentation for installing Red Hat OpenShift AI 2.10</a>.</p>
</div>
</div>
</div>
@@ -313,7 +306,7 @@ <h2 id="_objectives"><a class="anchor" href="#_objectives"></a>Objectives</h2>
<p>Import notebooks from Git repositories and interact with the LLM via Jupyter notebooks</p>
</li>
<li>
<p>Experiment with the Mistral LLM</p>
<p>Experiment with the Mistral and Llama 3 large language models</p>
</li>
</ul>
</div>