
Arcee AI: Coder Large

by arcee-ai

Input price

$0.50 /1M tok

Output price

$0.80 /1M tok

Context length

33K tokens

JSON mode

No
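The listed rates translate directly into a per-request cost estimate. A minimal sketch (the helper name and example token counts are illustrative, not part of the LLMTest API):

```javascript
// Listed rates for arcee-ai/coder-large: $0.50 per 1M input tokens,
// $0.80 per 1M output tokens.
const INPUT_PRICE_PER_M = 0.5;
const OUTPUT_PRICE_PER_M = 0.8;

// Hypothetical helper: estimate the USD cost of a single request.
function estimateCostUSD(inputTokens, outputTokens) {
  return (
    (inputTokens / 1_000_000) * INPUT_PRICE_PER_M +
    (outputTokens / 1_000_000) * OUTPUT_PRICE_PER_M
  );
}

// e.g. a 2,000-token prompt producing a 500-token completion:
console.log(estimateCostUSD(2000, 500).toFixed(6)); // → "0.001400"
```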

Model details

Model ID
arcee-ai/coder-large
Provider
arcee-ai
Modality
text->text
Supports response_format
No
Added to LLMTest
Jan 21, 1970

API usage

Use this model through the LLMTest proxy: point your OpenAI client at the LLMTest base URL and set the model ID:

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "your-llmt_-key-here",
  baseURL: "https://llmtest.io/v1",
});

const response = await client.chat.completions.create({
  model: "arcee-ai/coder-large",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello, how are you?" }
  ],
  temperature: 0.7,          // 0-2, higher = more creative
  max_tokens: 1024,          // max tokens to generate
  // stream: true,           // enable streaming (SSE)
  // top_p: 0.9,             // nucleus sampling
});

Supported parameters

Parameter        Type            Description
model            string          Must be arcee-ai/coder-large
messages         array           Array of message objects with role and content
temperature      number          Sampling temperature (0-2). Default varies by model.
max_tokens       integer         Max tokens to generate
top_p            number          Nucleus sampling (0-1)
stream           boolean         Stream response via SSE
stop             string | array  Stop sequences
response_format  object          Not supported by this model. LLMTest will automatically strip this parameter if sent.
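Since stream is supported, the same request can be switched to server-sent events so tokens print as they arrive. A minimal streaming sketch, assuming the same LLMTest proxy URL and key as in the example above:

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "your-llmt_-key-here",
  baseURL: "https://llmtest.io/v1",
});

// With stream: true the SDK returns an async iterable of SSE chunks
// instead of a single completion object.
const stream = await client.chat.completions.create({
  model: "arcee-ai/coder-large",
  messages: [{ role: "user", content: "Write a haiku about code review." }],
  stream: true,
});

for await (const chunk of stream) {
  // Each chunk carries an incremental delta; append its content as it arrives.
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

Note that because this model does not support response_format, structured output has to be requested via the prompt itself (and validated client-side, e.g. with JSON.parse) rather than with JSON mode.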

LLMTest features for this model