Supported LLMs

Out of the box, Stagehand supports LLMs from Anthropic, OpenAI, Groq, and Cerebras.

You can pass one of these models to the modelName property in the Stagehand constructor:

import { Stagehand } from "@browserbasehq/stagehand";

const stagehand = new Stagehand({
  modelName: "gpt-4o",
  modelClientOptions: {
    apiKey: process.env.OPENAI_API_KEY,
  },
});
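
After constructing the instance, initialize it and drive the browser as usual. A minimal sketch of what that looks like (the instruction passed to act is illustrative):

// Launch the browser session before issuing any commands.
await stagehand.init();

// Subsequent act/extract/observe calls use the configured model.
await stagehand.page.act("click the sign in button");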

Custom LLMs

If you’d like to use a custom LLM, you can do so by providing a custom llmClient to the Stagehand constructor.

The only requirement for an LLM to be used with Stagehand is that it supports structured outputs.
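
“Structured outputs” here means the model can be constrained to return JSON matching a schema. For reference, this is the kind of request a compatible client needs to handle, shown with OpenAI’s chat completions API (the schema is illustrative):

import OpenAI from "openai";

const openai = new OpenAI();

// Constrain the response to JSON matching a schema.
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Extract the title of the page." }],
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "extraction",
      schema: {
        type: "object",
        properties: { title: { type: "string" } },
        required: ["title"],
        additionalProperties: false,
      },
    },
  },
});

// The message content parses as JSON conforming to the schema above.
const data = JSON.parse(response.choices[0].message.content ?? "{}");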

OpenAI-compatible APIs

Most LLMs are OpenAI-compatible, and can therefore be used with Stagehand as long as they support structured outputs. This includes Google Gemini, models served locally with Ollama, and Llama models hosted by providers like Groq and Cerebras.

Note that Google Gemini and small models on Ollama are really difficult to get consistent structured outputs from. Though these models are “supported” via an OpenAI-compatible API, we do not recommend them yet for Stagehand.

To get started, you can use the OpenAI external client as a template to create a client for your model.

import { Stagehand } from "@browserbasehq/stagehand";
import { CustomOpenAIClient } from "./external_clients/customOpenAI";
import OpenAI from "openai";

const stagehand = new Stagehand({
  // StagehandConfig is your existing project configuration.
  ...StagehandConfig,
  llmClient: new CustomOpenAIClient({
    modelName: "llama3.3",
    // Point the OpenAI client at a locally running Ollama server.
    client: new OpenAI({
      apiKey: "ollama",
      baseURL: "http://localhost:11434/v1",
    }),
  }),
});

await stagehand.init();
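
Once initialized, every Stagehand call routes through the custom client. For example, a schema-driven extraction, which is exactly where structured-output support matters (the instruction and schema are illustrative):

import { z } from "zod";

// The schema is enforced via the model's structured outputs.
const { title } = await stagehand.page.extract({
  instruction: "extract the title of the page",
  schema: z.object({ title: z.string() }),
});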

Vercel AI SDK

The Vercel AI SDK is a popular library for interacting with LLMs. You can use any of the providers supported by the Vercel AI SDK to create a client for your model, as long as they support structured outputs.

The Vercel AI SDK includes providers for OpenAI, Anthropic, and Google, as well as Amazon Bedrock and Azure OpenAI.
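
A quick way to verify that a given model supports structured outputs is to call the AI SDK’s generateObject directly, outside of Stagehand (sketched here with the Bedrock provider; the schema and prompt are illustrative):

import { bedrock } from "@ai-sdk/amazon-bedrock";
import { generateObject } from "ai";
import { z } from "zod";

// If this call succeeds, the model can produce schema-constrained JSON.
const { object } = await generateObject({
  model: bedrock("anthropic.claude-3-7-sonnet-20250219-v1:0"),
  schema: z.object({ title: z.string() }),
  prompt: "Extract the title of the page.",
});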

To get started, you’ll need to install the ai package and the provider you want to use. For example, to use Amazon Bedrock, you’ll need to install the @ai-sdk/amazon-bedrock package.

npm install ai @ai-sdk/amazon-bedrock

Then, use the Vercel AI SDK external client as a template to create a client for your model.

// Install/import the provider you want to use.
// For example, to use OpenAI, import `openai` from @ai-sdk/openai
import { bedrock } from "@ai-sdk/amazon-bedrock";
import { Stagehand } from "@browserbasehq/stagehand";
import { AISdkClient } from "./aisdkClient.ts";

const stagehand = new Stagehand({
  llmClient: new AISdkClient({
    model: bedrock("anthropic.claude-3-7-sonnet-20250219-v1:0"),
  }),
});

await stagehand.init();

LangChain

LangChain is a popular library for building LLM applications. You can use any of the providers supported by LangChain to create a client for your model, as long as they support structured outputs.

To get started, you’ll need to install the langchain package and the provider you want to use. For example, to use OpenAI, you’ll need to install the @langchain/openai package. You’ll also need to install the zod-to-json-schema package to convert your Zod schema to a JSON schema.

You’ll also want to add the LangChain external client to your Stagehand project.

npm install langchain @langchain/openai zod-to-json-schema

Next, you can use the LangChain external client as a template to create a Stagehand LLM client for your model.

import { ChatOpenAI } from "@langchain/openai";
import { Stagehand } from "@browserbasehq/stagehand";
import { LangchainClient } from "./external_clients/langchain";

const stagehand = new Stagehand({
  llmClient: new LangchainClient(
    new ChatOpenAI({ model: "gpt-4o" }),
  ),
});

await stagehand.init();
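
The zod-to-json-schema conversion mentioned above is what bridges Stagehand’s Zod schemas to LangChain. A minimal sketch of how that conversion can feed LangChain’s withStructuredOutput (the schema is illustrative):

import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

const schema = z.object({ title: z.string() });

// Convert the Zod schema into a JSON schema LangChain can consume.
const jsonSchema = zodToJsonSchema(schema);

// Bind the schema so the model returns structured output.
const model = new ChatOpenAI({ model: "gpt-4o" }).withStructuredOutput(jsonSchema);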