Connect LLMTest to your IDE as an MCP server. Works with any tool that supports the Model Context Protocol.
You can also launch the server directly, passing your API key through the environment:

```
LLMTEST_API_KEY=your-llmt_-key-here npx -y llmtest-mcp
```

In clients with a built-in MCP panel, type `/mcp` and press Enter to check the connection; the server appears as `llmtest`.
Add this to your MCP config file:
```json
{
  "mcpServers": {
    "llmtest": {
      "command": "npx",
      "args": ["-y", "llmtest-mcp"],
      "env": {
        "LLMTEST_API_KEY": "your-llmt_-key-here"
      }
    }
  }
}
```

Config file locations:
- Cursor: `~/.cursor/mcp.json`
- Windsurf: `~/.windsurf/mcp.json`

Use the same config above. The MCP server is published as `llmtest-mcp` on npm.
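If you already have other MCP servers configured, merge the `llmtest` entry into the existing `mcpServers` object instead of overwriting the file. A minimal sketch in Python (the config path and the `add_llmtest` helper are illustrative, not part of LLMTest itself):

```python
import json
import os

def add_llmtest(config: dict) -> dict:
    """Insert the llmtest server entry, preserving any other servers."""
    config.setdefault("mcpServers", {})["llmtest"] = {
        "command": "npx",
        "args": ["-y", "llmtest-mcp"],
        "env": {"LLMTEST_API_KEY": "your-llmt_-key-here"},
    }
    return config

# Pick the path for your IDE, e.g. ~/.cursor/mcp.json or ~/.windsurf/mcp.json
path = os.path.expanduser("~/.cursor/mcp.json")

# Load the existing config if present, otherwise start fresh.
if os.path.exists(path):
    with open(path) as f:
        existing = json.load(f)
else:
    existing = {}

merged = add_llmtest(existing)
# Uncomment to write the merged config back:
# with open(path, "w") as f:
#     json.dump(merged, f, indent=2)
```

Existing server entries are left untouched; only the `llmtest` key is added or replaced.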
Once connected, ask your AI assistant to show your LLMTest account status. You should see your flows, call count, and credit balance.
See MCP Tools Reference for the full list of available tools.