# OpenAI-compatible

This page contains instructions for connecting Lens Prism to an OpenAI-compatible LLM.

To connect Lens Prism to OpenAI:

  1. Log in to your AI provider profile.
  2. Obtain the following parameters:

    | Parameter | Recommended value | Comments |
    | --- | --- | --- |
    | API Key | api-key | Find it in the AI provider profile, or contact your organization profile administrator. |
    | Model IDs | model-id | Find it in the AI provider profile, or contact your organization profile administrator. |
    | Context Window Size | - | Set according to the LLM settings. |
    | Max Output Tokens | - | Set according to the LLM settings. |
    | Custom Instructions | - | Optional. Set up personal preferences, such as language, specific details, and so on. These instructions are appended to every prompt as guidance to personalize and improve Lens Prism responses. |

## Configuration example

The following sketch shows hypothetical values for an OpenAI-compatible connection; substitute the API key, model ID, and token limits from your own AI provider profile.
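
```yaml
# Hypothetical values for illustration only.
API Key: sk-...              # from your AI provider profile
Model IDs: gpt-4o            # a model ID available to your account
Context Window Size: 128000  # set according to the model's documentation
Max Output Tokens: 4096      # set according to the model's documentation
```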

## Connecting to Lite LLM

Lens supports Lite LLM, which you can use as a gateway to various LLM APIs.

Tip

If you run your Lite LLM instance from a Kubernetes cluster, make sure that you have created a port-forward session to the corresponding pod or service.

For details, see the Lite LLM deployment documentation.
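
For example, a port-forward session might be created as follows; the service name and namespace are placeholders for the ones in your deployment:

```sh
# Forward local port 4000 to the Lite LLM service in the cluster.
kubectl port-forward svc/litellm 4000:4000 -n litellm
```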

  1. In Preferences > Lens Prism AI > AI Provider, select OpenAI-compatible.
  2. Set Base URL to http://localhost:4000/. This is the default value; you can set a custom one in the Lite LLM configuration file.
  3. Set Model IDs to <model-name>.

    Tip

    Paste the value specified in the model_name field of the Lite LLM config.yaml. See the example below:

    ```yaml
    model_list:
      - model_name: bedrock-claude-3-7-sonnet
    ...
    ```
    
  4. Specify the API key in the API Key input field.

    If you have not enabled the master_key or virtual_keys parameters in your Lite LLM configuration, you can type any value; it does not affect the outcome. However, the API Key field cannot stay empty.
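
    If you do enable gateway authentication, a minimal sketch of the relevant Lite LLM config.yaml section might look like this (the key value is a placeholder):

    ```yaml
    general_settings:
      master_key: sk-1234  # placeholder; clients must then send this key
    ```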

  5. Set all other parameters as specified in the LLM documentation.
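
To verify that the gateway responds before using it from Lens Prism, you can send a test request. This is a sketch that assumes the default port and the example model name shown above:

```sh
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
        "model": "bedrock-claude-3-7-sonnet",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```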
