Stagehand uses Large Language Models (LLMs) to understand web pages, plan actions, and interact with complex interfaces. The choice of LLM significantly impacts your automation’s accuracy, speed, and cost.

Model Evaluation

Find more details about how to choose the right model on our Model Evaluation page.

Why LLM Choice Matters

  • Accuracy: Better models provide more reliable element detection and action planning
  • Speed: Faster models reduce automation latency
  • Cost: Different providers offer varying pricing structures
  • Reliability: Structured output support ensures consistent automation behavior
Small models on Ollama struggle with consistent structured outputs. While technically supported, we don’t recommend them for production Stagehand workflows.

Environment Variables Setup

Set up your API keys before configuring Stagehand:
# Choose one or more providers
OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
GOOGLE_API_KEY=your_google_key_here
GROQ_API_KEY=your_groq_key_here
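If you keep these keys in a .env file, a minimal sketch for loading one and passing it to Stagehand (assuming the dotenv package and Stagehand's modelName and modelClientOptions constructor options):

import "dotenv/config"; // loads the variables above into process.env
import { Stagehand } from "@browserbasehq/stagehand";

const stagehand = new Stagehand({
  modelName: "openai/gpt-4.1",
  modelClientOptions: {
    apiKey: process.env.OPENAI_API_KEY, // read from the .env file above
  },
});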

Supported Providers

Stagehand supports major LLM providers with structured output capabilities:

Production-Ready Providers

| Provider | Best Models | Strengths | Use Case |
| --- | --- | --- | --- |
| OpenAI | gpt-4.1, gpt-4.1-mini | High accuracy, reliable | Production, complex sites |
| Anthropic | claude-3-7-sonnet-latest | Excellent reasoning | Complex automation tasks |
| Google | gemini-2.5-flash, gemini-2.5-pro | Fast, cost-effective | High-volume automation |

Additional Providers
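Stagehand also works with other providers that support structured outputs, such as Groq (set GROQ_API_KEY above) and local models served through Ollama; see the note above about small Ollama models before relying on them in production.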

Basic Configuration

Model Name Format

Stagehand uses the format provider/model-name for model specification. Examples:
  • OpenAI: openai/gpt-4.1
  • Anthropic: anthropic/claude-3-7-sonnet-latest
  • Google: google/gemini-2.5-flash (Recommended)

Quick Start Examples
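A minimal end-to-end sketch using the recommended Google model, assuming GOOGLE_API_KEY is set in your environment and the standard Stagehand constructor options (env, modelName, modelClientOptions):

import { Stagehand } from "@browserbasehq/stagehand";

const stagehand = new Stagehand({
  env: "LOCAL", // or "BROWSERBASE" to run on Browserbase
  modelName: "google/gemini-2.5-flash", // provider/model-name format from above
  modelClientOptions: {
    apiKey: process.env.GOOGLE_API_KEY,
  },
});

await stagehand.init();
const page = stagehand.page;
await page.goto("https://example.com");
await page.act("click the More information link"); // natural-language action planned by the model
await stagehand.close();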

Custom LLM Integration

Custom LLMs are currently only supported in TypeScript.
Integrate any LLM with Stagehand using custom clients. The only requirement is structured output support for consistent automation behavior.

Vercel AI SDK

The Vercel AI SDK is a popular library for interacting with LLMs. You can build a Stagehand client on top of any Vercel AI SDK provider that supports structured outputs, including OpenAI, Anthropic, and Google, as well as Amazon Bedrock and Azure OpenAI. First, install the ai package together with the provider package you want to use; for example, Amazon Bedrock requires @ai-sdk/amazon-bedrock:
npm install ai @ai-sdk/amazon-bedrock
Then use the Vercel AI SDK external client as a template to create a client for your model:
// Install/import the provider you want to use.
// For example, to use OpenAI, import `openai` from @ai-sdk/openai
import { bedrock } from "@ai-sdk/amazon-bedrock";
import { Stagehand } from "@browserbasehq/stagehand";
import { AISdkClient } from "./external_clients/aisdk";

const stagehand = new Stagehand({
  llmClient: new AISdkClient({
    model: bedrock("anthropic.claude-3-7-sonnet-20250219-v1:0"),
  }),
});
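The same pattern works for any other structured-output-capable AI SDK provider; for example (assuming @ai-sdk/openai is installed), swapping Bedrock for OpenAI is a one-line change:

import { openai } from "@ai-sdk/openai";
import { Stagehand } from "@browserbasehq/stagehand";
import { AISdkClient } from "./external_clients/aisdk";

const stagehand = new Stagehand({
  llmClient: new AISdkClient({
    model: openai("gpt-4.1"), // any AI SDK model with structured output support works here
  }),
});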

Troubleshooting

Common Issues

Next Steps