🚨 Note: We have fully transitioned to using the OpenAI SDK as our main interaction point. Please see here for more information.

Using lytix to manage your OptiModel server is as simple as creating a lytix API key.

Prerequisite: First, create a lytix account here

Create a Lytix API Key

Start by creating and noting down a lytix API key. See instructions here.

(Recommended) Set your API key via the environment variable:

export LX_API_KEY=<your-api-key-here>

Alternatively, set your API key in code:

from optimodel import LytixCreds

LytixCreds.setAPIKey('<your-api-key-here>')
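For example, a minimal sketch of loading the key yourself from a custom environment variable (the variable name MY_LYTIX_KEY is hypothetical) and setting it in code:

import os

from optimodel import LytixCreds

# Hypothetical: load the key from a custom environment variable instead of LX_API_KEY
LytixCreds.setAPIKey(os.environ["MY_LYTIX_KEY"])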

(Optional) Add a new Provider

Note: You can optionally skip this step and send credentials with every SDK call instead. See instructions here

Before you can start making LLM calls, you'll first need to set up a new provider here

Note: Access to models is limited to the providers you have set up. For example, if you have only set up OpenAI, you will not be able to call llama3 models.

Install the SDK

pip3 install optimodel-py

Call The SDK

Now you are ready to make your first call. The SDK reads your API key from the LX_API_KEY environment variable.

from optimodel import queryModel, listModels, ModelMessage, ModelTypes

prompt = "Hello How are you?"

response = await queryModel(
    model=ModelTypes.llama_3_8b_instruct,
    messages=[
        ModelMessage(
            role="system",
            content="You are a helpful assistant. Always respond in JSON syntax",
        ),
        ModelMessage(role="user", content=prompt),
    ],
)
print("Got response:", response)

Just remember to pass your LX_API_KEY as an environment variable when starting your program:

> LX_API_KEY=<your-api-key-here> python3 your_script.py

Using Local Credentials

If you want to use local credentials, you can do so by passing in an array of credentials to the queryModel function.

response = await queryModel(
    # ...the same parameters as above
    credentials=[creds],
)

Where each item in credentials can be any of the following (a complete example follows this list):

AWS Credentials

from optimodel import AWSBedrockCredentials

creds = AWSBedrockCredentials(
  awsAccessKeyId="$AWS_ACCESS_KEY_ID",
  awsSecretKey="$AWS_SECRET_KEY",
  awsRegion="$AWS_REGION",
)

Open AI

from optimodel import OpenAICredentials

creds = OpenAICredentials(
  openAiKey="$OPENAI_API_KEY",
)

Anthropic

from optimodel import AnthropicCredentials

creds = AnthropicCredentials(
  anthropicApiKey="$ANTHROPIC_API_KEY",
)

TogetherAI

from optimodel import TogetherAICredentials

creds = TogetherAICredentials(
  togetherApiKey="$TOGETHER_API_KEY",
)

Groq

from optimodel import GroqCredentials

creds = GroqCredentials(
  groqApiKey="$GROQ_API_KEY",
)

MistralAI

from optimodel import MistralAICredentials

creds = MistralAICredentials(
  mistralApiKey="$MISTRAL_API_KEY",
)

Extra Parameters

The following extra parameters are available to pass to the queryModel function:

speedPriority: This can be used to control how OptiModel should prioritize the request. If set to high, OptiModel will prioritize speed and will not optimize for cost.

speedPriority = "low" | "high"

validator: This is a function that will be used to validate the response. If the validator returns False, the request will be retried. Note: You must pass fallbackModels if you use a validator.

validator = Callable[[str], bool]

fallbackModels: This is a list of other models to fall back to if the first model fails the validator (see the sketch below).

fallbackModels = List[ModelTypes]
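As a concrete illustration, here is a minimal sketch that retries with a fallback model whenever the response is not valid JSON. The fallback model name below is illustrative; substitute any ModelTypes member your providers can serve.

import json

from optimodel import queryModel, ModelMessage, ModelTypes

def is_valid_json(response: str) -> bool:
    # Return True only when the response parses as JSON
    try:
        json.loads(response)
        return True
    except json.JSONDecodeError:
        return False

response = await queryModel(
    model=ModelTypes.llama_3_8b_instruct,
    messages=[
        ModelMessage(role="system", content="Always respond in JSON syntax"),
        ModelMessage(role="user", content="List three colors"),
    ],
    validator=is_valid_json,
    # fallbackModels is required whenever a validator is passed;
    # the model below is illustrative only
    fallbackModels=[ModelTypes.llama_3_70b_instruct],
)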

maxGenLen: This is the maximum length of the response; if the model response is longer than this, it will be truncated. This is also checked against the config, so, for example, if you pass a value of 1 million and no provider can generate a response of that length, the request will fail.

maxGenLen = int

jsonMode: This will enable JSON mode for the request. This is useful if you want to pass in a JSON object as a prompt.

jsonMode = bool

provider: You can optionally force a specific provider to be used. This is useful if you have multiple providers set up and want to force a specific one to be used.

provider = "openai" | "groq" | "bedrock" | "anthropic" | "together"

userId: You can optionally pass in a userId to track user requests across the Lytix platform. This is often a unique user identifier.

userId = str

sessionId: You can optionally pass in a sessionId to track sessions or workflows. Events with the same sessionId will be grouped together.

sessionId = str

workflowName: You can optionally pass in a workflowName to track sessions or workflows. Events with the same workflowName will be grouped together.

workflowName = str
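
For completeness, here is a sketch combining several of these parameters in one call. The identifier values and provider choice are placeholders; swap in your own, and use any model your configured providers can serve.

from optimodel import queryModel, ModelMessage, ModelTypes

response = await queryModel(
    model=ModelTypes.llama_3_8b_instruct,
    messages=[ModelMessage(role="user", content="Hello How are you?")],
    speedPriority="low",              # favor cost over raw speed
    maxGenLen=1024,                   # cap the response length
    jsonMode=True,                    # enable JSON mode for this request
    provider="bedrock",               # force a specific provider
    userId="user-123",                # placeholder user identifier
    sessionId="session-abc",          # placeholder session identifier
    workflowName="example-workflow",  # placeholder workflow name
)
print("Got response:", response)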