LLMTest

Quickstart

Get up and running in under a minute. LLMTest works two ways: as an MCP server in your IDE, and as an API proxy in your production app.

1. Get your API key

Sign up at llmtest.io/signup, add $5 in credits, and copy your API key from the dashboard. It starts with llmt_.
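Since keys always start with llmt_, a quick startup check catches a pasted OpenAI or Anthropic key before your first request fails. This is a minimal sketch — the helper name and the LLMTEST_API_KEY environment variable are our conventions, not something the dashboard mandates:

```python
import os

def looks_like_llmtest_key(key: str) -> bool:
    """Return True if `key` has the documented LLMTest prefix."""
    return key.startswith("llmt_")

# Reading the key from an environment variable keeps it out of source
# control; the variable name here is just a suggested convention.
api_key = os.environ.get("LLMTEST_API_KEY", "")
```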

2. Choose your integration

Just want fallbacks? Point your app at the proxy — automatic failover and JSON recovery work out of the box with zero config. See Fallbacks for details. Prefer benchmarking and model selection inside your editor? Add the MCP server to your IDE instead.
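With no SDK at all, pointing at the proxy is a single HTTP request in the OpenAI chat-completions format. A minimal stdlib sketch — the /chat/completions path and payload shape follow the OpenAI convention, and the model name is a placeholder, not a tested value:

```python
import json
import urllib.request

API_KEY = "llmt_your_key_here"  # from the dashboard

payload = {
    "model": "gpt-4o-mini",  # placeholder; use any model your key can access
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Build an OpenAI-format request aimed at the LLMTest proxy.
req = urllib.request.Request(
    "https://llmtest.io/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending works like any OpenAI-compatible endpoint:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Failover and JSON recovery happen on the proxy side, so nothing in this request changes when a provider goes down.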

3. Or use both

Most teams use the proxy in production for reliability and cost tracking, and the MCP server during development for benchmarking and model selection. They share the same API key and data.

Already calling an LLM API? Just swap your base URL to https://llmtest.io/v1 — the proxy is compatible with any OpenAI-format SDK. Starting fresh? Even easier — set the base URL and pick a model.