---
title: Models
description: Configuring an LLM provider and model.
---

OpenCode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, and it also supports running local models.
---

## Providers

Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start OpenCode.

Learn more about [providers](/docs/providers).
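For example, you can add credentials interactively with:

```bash frame="none"
opencode auth login
```

This typically prompts you to pick a provider and enter an API key.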
---

## Select a model

Once you've configured your provider you can select the model you want by typing in:

```bash frame="none"
/models
```
---

## Recommended models

There are a lot of models out there, with new models coming out every week. However, only a few of them are good at both generating code and tool calling.

:::tip
Consider using one of the models we recommend.
:::

Here are several models that work well with OpenCode, in no particular order. This is not an exhaustive list:

- GPT 5.1
- GPT 5.1 Codex
- Claude Sonnet 4.5
- Claude Haiku 4.5
- Kimi K2
- GLM 4.6
- Qwen3 Coder
- Gemini 3 Pro
---

## Set a default

To set a default model, set the `model` key in your OpenCode config.

```json title="opencode.json" {3}
{
  "$schema": "https://opencode.ai/config.json",
  "model": "lmstudio/google/gemma-3n-e4b"
}
```

Here the full ID is `provider_id/model_id`.

If you've configured a [custom provider](/docs/providers#custom), the `provider_id` is the key from the `provider` section of your config, and the `model_id` is a key from `provider.models`.
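As a sketch of how the two IDs map, assume a hypothetical custom provider keyed `myprovider` with a model keyed `my-model` (both names are made up for illustration; the other provider setup fields are omitted here):

```jsonc title="opencode.jsonc"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    // the key here becomes the provider_id
    "myprovider": {
      "models": {
        // the key here becomes the model_id
        "my-model": {},
      },
    },
  },
  // full ID: provider_id/model_id
  "model": "myprovider/my-model",
}
```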
---

## Configure models

You can globally configure a model's options through the config.

```jsonc title="opencode.jsonc" {7-12,19-24}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "models": {
        "gpt-5": {
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"],
          },
        },
      },
    },
    "anthropic": {
      "models": {
        "claude-sonnet-4-5-20250929": {
          "options": {
            "thinking": {
              "type": "enabled",
              "budgetTokens": 16000,
            },
          },
        },
      },
    },
  },
}
```
Here we're configuring global options for two built-in models: `gpt-5` when accessed via the `openai` provider, and `claude-sonnet-4-5-20250929` when accessed via the `anthropic` provider.

The built-in provider and model names can be found on [Models.dev](https://models.dev).

You can also configure these options for any agents that you are using. The agent config overrides the global options here. [Learn more](/docs/agents/#additional).

You can also define custom models that extend built-in ones, each with their own options, by referencing the built-in model's `id`:
```jsonc title="opencode.jsonc" {6-21}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "opencode": {
      "models": {
        "gpt-5-high": {
          "id": "gpt-5",
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
          },
        },
        "gpt-5-low": {
          "id": "gpt-5",
          "options": {
            "reasoningEffort": "low",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
          },
        },
      },
    },
  },
}
```
---

## Loading models

When OpenCode starts up, it checks for models in the following priority order:

1. The `--model` or `-m` command line flag. The format is the same as in the config file: `provider_id/model_id`.

2. The `model` key in the OpenCode config.

   ```json title="opencode.json"
   {
     "$schema": "https://opencode.ai/config.json",
     "model": "anthropic/claude-sonnet-4-20250514"
   }
   ```

   The format here is `provider_id/model_id` as well.

3. The last used model.

4. The first model, based on an internal priority.
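For example, the flag from step 1 takes the same `provider_id/model_id` format as the config file:

```bash frame="none"
opencode -m anthropic/claude-sonnet-4-20250514
```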