Chapter 1. Installing OpenShift Lightspeed
The installation process for Red Hat OpenShift Lightspeed consists of two main tasks: installing the Lightspeed Operator and configuring the large language model (LLM) provider.
1.1. Large Language Model (LLM) configuration overview
You can configure Red Hat Enterprise Linux AI or Red Hat OpenShift AI as the large language model (LLM) provider for the OpenShift Lightspeed Service. Either provider can host a server or inference service that processes inference queries. Configure the LLM provider before you install the OpenShift Lightspeed Operator.
Alternatively, you can connect the OpenShift Lightspeed Service to one of the publicly available LLM providers, such as IBM watsonx, OpenAI, or Microsoft Azure OpenAI.
1.1.1. Red Hat Enterprise Linux AI with OpenShift Lightspeed
You can use Red Hat Enterprise Linux AI to host an LLM.
For more information, see Generating a custom LLM using RHEL AI.
1.1.2. Red Hat OpenShift AI with OpenShift Lightspeed
You can use Red Hat OpenShift AI to host an LLM.
For more information, see Single-model serving platform.
1.1.3. IBM watsonx with OpenShift Lightspeed
To configure IBM watsonx as the LLM provider, you need an IBM Cloud project with access to IBM watsonx. You also need your IBM watsonx API key.
For more information, see the official IBM watsonx product documentation.
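In practice, the API key is usually supplied to OpenShift Lightspeed through a Kubernetes Secret that the Lightspeed configuration references. A minimal sketch follows; the Secret name, namespace, and key name (`apitoken`) are illustrative assumptions, so use whatever names your Lightspeed configuration expects:

```yaml
# Illustrative Secret holding an IBM watsonx API key for OpenShift Lightspeed.
# The name, namespace, and key name are assumptions, not fixed requirements.
apiVersion: v1
kind: Secret
metadata:
  name: watsonx-api-keys
  namespace: openshift-lightspeed
type: Opaque
stringData:
  apitoken: <your-ibm-watsonx-api-key>
```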
1.1.4. OpenAI with OpenShift Lightspeed
To configure OpenAI as the LLM provider with OpenShift Lightspeed, you need either the OpenAI API key or the OpenAI project name during the configuration process.
OpenAI supports projects and service accounts. You can use a service account in a dedicated project to track OpenShift Lightspeed usage precisely.
For more information, see the official OpenAI product documentation.
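After you have the API key in a Secret, the provider is typically wired up in the Lightspeed configuration custom resource. The following is a sketch only: the Secret name, provider name, and model name are assumptions, and you should verify the field names against the OLSConfig CRD shipped with your Operator version:

```yaml
# Illustrative OLSConfig provider entry for OpenAI.
# Assumes a Secret named openai-api-keys already exists in the
# openshift-lightspeed namespace; names and model are placeholders.
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
      - name: openai
        type: openai
        credentialsSecretRef:
          name: openai-api-keys
        models:
          - name: gpt-4o
```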
1.1.5. Microsoft Azure OpenAI with OpenShift Lightspeed
To configure Microsoft Azure OpenAI as the LLM provider, you need a Microsoft Azure OpenAI Service instance. You must have at least one model deployment in Microsoft Azure OpenAI Studio for that instance.
For more information, see the official Microsoft Azure OpenAI product documentation.
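For Azure, the provider entry additionally points at your Service instance endpoint and the model deployment you created in Microsoft Azure OpenAI Studio. The sketch below is an assumption-laden illustration; the `type` value, field names, and placeholders must be checked against the OLSConfig CRD installed in your cluster:

```yaml
# Illustrative OLSConfig provider entry for Microsoft Azure OpenAI.
# All names, field values, and placeholders here are assumptions;
# verify them against the OLSConfig CRD in your cluster.
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
      - name: azure-openai
        type: azure_openai
        url: https://<your-resource>.openai.azure.com/   # Azure OpenAI endpoint
        credentialsSecretRef:
          name: azure-api-keys                           # Secret holding the API key
        deploymentName: <your-model-deployment>          # created in Azure OpenAI Studio
```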
1.2. Installing the OpenShift Lightspeed Operator
Prerequisites
- You have deployed OpenShift Container Platform 4.15 or later. The cluster must be connected to the Internet and have telemetry enabled.
- You are logged in to the OpenShift Container Platform web console as a user with the cluster-admin role.
- You have access to the OpenShift CLI (oc).
- You have successfully configured your Large Language Model (LLM) provider so that OpenShift Lightspeed can communicate with it.
Procedure
- In the OpenShift Container Platform web console, navigate to the Operators → OperatorHub page.
- Search for Lightspeed.
- Locate the Lightspeed Operator, and click to select it.
- When the prompt that discusses the community operator appears, click Continue.
- Click Install.
- Use the default installation settings presented, and click Install to continue.
- Click Operators → Installed Operators to verify that the Lightspeed Operator is installed. Succeeded should appear in the Status column.
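If you prefer the OpenShift CLI (oc) over the web console, the same installation can be sketched with standard Operator Lifecycle Manager manifests. The channel name below is an assumption; confirm it against the entry that OperatorHub shows for your cluster version:

```yaml
# Illustrative OLM manifests for installing the Lightspeed Operator with oc.
# Apply with: oc apply -f <this-file>. The channel value is an assumption.
apiVersion: v1
kind: Namespace
metadata:
  name: openshift-lightspeed
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: openshift-lightspeed
  namespace: openshift-lightspeed
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: lightspeed-operator
  namespace: openshift-lightspeed
spec:
  channel: alpha                      # assumption: confirm the channel in OperatorHub
  name: lightspeed-operator
  source: redhat-operators
  sourceNamespace: openshift-marketplace
  installPlanApproval: Automatic
```

After applying the manifests, running `oc get csv -n openshift-lightspeed` should eventually report Succeeded, matching the Status column check in the web console.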