LLM Customization
Stagehand supports a wide variety of LLMs. You can use any LLM that supports structured outputs, either with our existing clients or by writing a custom LLM provider.
It is difficult to get consistent structured outputs from Google Gemini and from small models running on Ollama. Although these models are “supported” via an OpenAI-compatible API, we do not yet recommend them for Stagehand.
Supported LLMs
Out of the box, Stagehand supports LLMs from Anthropic, OpenAI, Groq, and Cerebras.
You can pass one of these LLMs to the llm property in the Stagehand constructor.
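For example, here is a minimal sketch in TypeScript. The modelName and modelClientOptions property names are assumptions based on recent Stagehand releases; check your version’s constructor types for the exact option names.

```typescript
import { Stagehand } from "@browserbasehq/stagehand";

// Pass a supported model directly to the constructor.
// Property names below are assumptions; verify against your Stagehand version.
const stagehand = new Stagehand({
  env: "LOCAL",
  modelName: "claude-3-5-sonnet-latest",
  modelClientOptions: {
    apiKey: process.env.ANTHROPIC_API_KEY,
  },
});

await stagehand.init();
```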
Custom LLMs
If you’d like to use a custom LLM, you can do so by providing a custom llmProvider function to the Stagehand constructor.
The only requirement for an LLM to be used with Stagehand is that it supports structured outputs.
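As a rough sketch, a custom client extends Stagehand’s LLMClient class and implements createChatCompletion. The types below are taken from the external client examples in the Stagehand repo and may vary by version, so treat this skeleton as illustrative:

```typescript
import {
  AvailableModel,
  CreateChatCompletionOptions,
  LLMClient,
} from "@browserbasehq/stagehand";

// Illustrative skeleton only: the LLMClient and CreateChatCompletionOptions
// shapes are assumptions based on the Stagehand external client examples.
class MyLLMClient extends LLMClient {
  public type = "custom" as const;

  constructor(modelName: AvailableModel) {
    super(modelName);
  }

  async createChatCompletion<T>({
    options,
  }: CreateChatCompletionOptions): Promise<T> {
    // Call your provider here. When options.response_model is set, return
    // JSON that validates against its Zod schema; this is the structured
    // output requirement described above.
    throw new Error("Not implemented");
  }
}
```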
OpenAI-compatible APIs
Most LLMs are OpenAI-compatible, and thus can be used with Stagehand as long as they support structured outputs. This includes Google Gemini, Ollama, and most Llama-based models served by providers such as Groq and Cerebras.
To get started, you can use the OpenAI external client as a template to create a client for your model.
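For instance, the sketch below points the OpenAI SDK at an OpenAI-compatible endpoint (Groq here) and wraps it in a client built from that template. CustomOpenAIClient is the example class from the Stagehand repo’s external clients, not a built-in export:

```typescript
import OpenAI from "openai";
import { Stagehand } from "@browserbasehq/stagehand";
// Example class copied from the Stagehand repo's external client examples.
import { CustomOpenAIClient } from "./external_clients/customOpenAI";

// Any OpenAI-compatible endpoint works; Groq's is shown here.
const stagehand = new Stagehand({
  env: "LOCAL",
  llmClient: new CustomOpenAIClient({
    modelName: "llama-3.3-70b-versatile",
    client: new OpenAI({
      baseURL: "https://api.groq.com/openai/v1",
      apiKey: process.env.GROQ_API_KEY,
    }),
  }),
});

await stagehand.init();
```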
Vercel AI SDK
The Vercel AI SDK is a popular library for interacting with LLMs. You can use any of the providers supported by the Vercel AI SDK to create a client for your model, as long as they support structured outputs.
The Vercel AI SDK offers providers for OpenAI, Anthropic, and Google, along with Amazon Bedrock and Azure OpenAI.
To get started, you’ll need to install the ai package and the provider you want to use. For example, to use Amazon Bedrock, you’ll need to install the @ai-sdk/amazon-bedrock package.
Then, you can use the Vercel AI SDK external client as a template to create a client for your model.
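Here is a sketch of that setup, assuming the AISdkClient class from the Stagehand repo’s external client examples (the install command is shown as a comment):

```typescript
// npm install ai @ai-sdk/amazon-bedrock
import { bedrock } from "@ai-sdk/amazon-bedrock";
import { Stagehand } from "@browserbasehq/stagehand";
// Example class copied from the Stagehand repo's external client examples.
import { AISdkClient } from "./external_clients/aisdk";

const stagehand = new Stagehand({
  env: "LOCAL",
  llmClient: new AISdkClient({
    model: bedrock("anthropic.claude-3-5-sonnet-20240620-v1:0"),
  }),
});

await stagehand.init();
```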
LangChain
LangChain is a popular library for building LLM applications. You can use any of the providers supported by LangChain to create a client for your model, as long as they support structured outputs.
To get started, you’ll need to install the langchain package and the provider you want to use. For example, to use OpenAI, you’ll need to install the @langchain/openai package. You’ll also need to install the zod-to-json-schema package to convert your Zod schemas to JSON schemas.
You’ll also want to add the LangChain external client to your Stagehand project, then use it as a template to create a Stagehand LLM client for your model.
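Putting it together, here is a sketch that assumes the LangchainClient class from the Stagehand repo’s external client examples:

```typescript
// npm install langchain @langchain/openai zod-to-json-schema
import { ChatOpenAI } from "@langchain/openai";
import { Stagehand } from "@browserbasehq/stagehand";
// Example class copied from the Stagehand repo's external client examples.
import { LangchainClient } from "./external_clients/langchain";

const stagehand = new Stagehand({
  env: "LOCAL",
  llmClient: new LangchainClient(new ChatOpenAI({ model: "gpt-4o" })),
});

await stagehand.init();
```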