
Run the playground against an OpenAI-compliant model provider/proxy

The LangSmith playground allows you to use any model that exposes an OpenAI-compliant API. To use your model, set the Proxy Provider for OpenAI in the playground.

Deploy an OpenAI compliant model

Many providers offer OpenAI-compliant models or proxy services; common examples include LiteLLM Proxy and Ollama.

You can use these providers to deploy your model and get an API endpoint that is compliant with the OpenAI API.
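Once your endpoint is up, it can help to sanity-check it with the standard OpenAI Python client before pointing the playground at it. The sketch below assumes a local endpoint URL, a placeholder API key, and a placeholder model name; substitute whatever your provider or proxy gives you.

```python
from openai import OpenAI

# Point the standard OpenAI client at your OpenAI-compliant endpoint.
# The base URL, API key, and model name are placeholders -- replace them
# with the values from your provider or proxy.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # your provider/proxy endpoint
    api_key="not-needed-for-local-proxies",
)

response = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```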

Take a look at the full specification for more information.
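As a rough illustration of the request and response shape the chat completions portion of the specification defines, the snippet below sends a raw HTTP request. The endpoint URL, bearer token, and model name are assumptions for illustration only.

```python
import requests

# Minimal chat completions request in the shape the OpenAI API spec defines.
# The endpoint URL, bearer token, and model name are placeholder assumptions.
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    headers={"Authorization": "Bearer not-needed-for-local-proxies"},
    json={
        "model": "my-model",
        "messages": [{"role": "user", "content": "Hello!"}],
        "temperature": 0,
    },
    timeout=30,
)
resp.raise_for_status()

# A compliant server returns a chat.completion object with a "choices" list.
print(resp.json()["choices"][0]["message"]["content"])
```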

Use the model in the LangSmith Playground

Once you have deployed a model server, you can use it in the LangSmith Playground. Enter the playground and select the Proxy Provider inside the OpenAI modal.

(Screenshot: selecting the Proxy Provider in the OpenAI model settings)

If everything is set up correctly, you should see the model's response in the playground. You can also use this functionality to invoke downstream pipelines.

See how to store your model configuration for later use here.

