OpenAI-compatible#
This page describes how to set up the OpenAI LLM connection.
To connect Lens Prism to OpenAI:
- Log in to the profile of your AI provider.
- Obtain the following parameters:

| Parameter | Recommended value | Comments |
|---|---|---|
| API Key | api-key | Find in the AI provider profile, or contact your organization profile administrator |
| Model IDs | model-id | Find in the AI provider profile, or contact your organization profile administrator |
| Context Window Size | - | According to the LLM settings |
| Max Output Tokens | - | According to the LLM settings |
| Custom Instructions | - | Optional. Set up personal preferences, such as language, specific details, and so on. These instructions are appended to every prompt as guidance to personalize and improve Lens Prism responses |
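The parameters above map onto a standard OpenAI-compatible chat-completions request. A minimal sketch of how they are used on the wire (the values `api-key`, `model-id`, and the token limit are illustrative placeholders, not real credentials):

```python
import json

# Illustrative placeholders: substitute the values from your AI provider profile.
api_key = "api-key"
model_id = "model-id"
max_output_tokens = 1024

# The API Key is sent as a bearer token in the Authorization header.
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

# The Model ID and Max Output Tokens appear in the request body.
body = {
    "model": model_id,
    "max_tokens": max_output_tokens,
    "messages": [{"role": "user", "content": "Hello"}],
}

print(json.dumps(body))
```

Lens Prism sends requests of this shape on your behalf; the sketch only shows where each configured parameter ends up.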
Configuration example#
Connecting to Lite LLM#
Lens supports Lite LLM, which you can use as a gateway to various LLM APIs.
Tip
If you run your Lite LLM instance in a Kubernetes cluster, make sure that you have created a port-forward session to the corresponding pod or service. For details, see the Lite LLM deployment documentation.
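For example, assuming the Lite LLM proxy runs as a Service named `litellm` in a namespace also named `litellm` (both names are illustrative and depend on your deployment), a port-forward session could look like:

```shell
# Forward local port 4000 to the Lite LLM Service's port 4000
kubectl port-forward -n litellm svc/litellm 4000:4000
```

Keep this session running while Lens Prism is connected to the gateway.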
- In Preferences > Lens Prism AI > AI Provider, select OpenAI-compatible.
- Set Base URL to `http://localhost:4000/`. This is the default value; you can set a custom one in the Lite LLM configuration file.
- Set the Model IDs to `<model-name>`.
Tip
Paste the value specified in the `model_name` field of the Lite LLM `config.yaml`. See the example below:

```yaml
model_list:
  - model_name: bedrock-claude-3-7-sonnet
    ...
```
- Specify the API key in the API Key input field. If you have not enabled the `master_key` or `virtual_keys` parameters in your Lite LLM configuration, you can type any value; it does not affect the outcome. However, the API Key field cannot stay empty.
- Set all other parameters as specified in the LLM documentation.
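Putting the steps together, a Lite LLM `config.yaml` along these lines would expose a model that Lens can reach at the default base URL. This is a sketch, not a complete configuration: the upstream model string under `litellm_params` and the `master_key` value are illustrative, and the exact fields depend on your provider, so check the Lite LLM documentation for your setup.

```yaml
model_list:
  - model_name: bedrock-claude-3-7-sonnet   # paste this value into Model IDs in Lens
    litellm_params:
      model: bedrock/anthropic.claude-3-7-sonnet   # illustrative upstream model string
general_settings:
  master_key: sk-1234   # optional; if set, use this value as the API Key in Lens
```

If `master_key` is omitted, any non-empty value works in the API Key field, as described above.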