
---
title: Models
description: Configuring an LLM provider and model.
---

OpenCode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, including running local models.
---

## Providers

Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start OpenCode.

Learn more about [providers](/docs/providers).

---

## Select a model

Once you've configured your provider, you can select the model you want by typing in:

```bash frame="none"
/models
```
---

## Recommended models

There are a lot of models out there, with new models coming out every week. However, only a few of them are good at both generating code and tool calling.

:::tip
Consider using one of the models we recommend.
:::

Here are several models that work well with OpenCode, in no particular order. This is not an exhaustive list:

- GPT 5.1
- GPT 5.1 Codex
- Claude Sonnet 4.5
- Claude Haiku 4.5
- Kimi K2
- GLM 4.6
- Qwen3 Coder
- Gemini 3 Pro
---

## Set a default

To set one of these as the default model, you can set the `model` key in your OpenCode config.

```json title="opencode.json" {3}
{
  "$schema": "https://opencode.ai/config.json",
  "model": "lmstudio/google/gemma-3n-e4b"
}
```

Here the full ID is `provider_id/model_id`.

If you've configured a [custom provider](/docs/providers#custom), the `provider_id` is the key from the `provider` part of your config, and the `model_id` is the key from `provider.models`.
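For example, a custom provider config with a hypothetical `provider_id` of `myprovider` and a `model_id` of `my-model` would give you the following full ID. The names here are placeholders, not a real provider:

```jsonc title="opencode.jsonc"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    // "myprovider" is the provider_id (hypothetical)
    "myprovider": {
      "models": {
        // "my-model" is the model_id (hypothetical)
        "my-model": {},
      },
    },
  },
  // full ID: provider_id/model_id
  "model": "myprovider/my-model",
}
```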
---

## Configure models

You can globally configure a model's options through the config.

```jsonc title="opencode.jsonc" {7-12,19-24}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "models": {
        "gpt-5": {
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
            "include": ["reasoning.encrypted_content"],
          },
        },
      },
    },
    "anthropic": {
      "models": {
        "claude-sonnet-4-5-20250929": {
          "options": {
            "thinking": {
              "type": "enabled",
              "budgetTokens": 16000,
            },
          },
        },
      },
    },
  },
}
```

Here we're configuring global settings for two built-in models: `gpt-5` when accessed via the `openai` provider, and `claude-sonnet-4-5-20250929` when accessed via the `anthropic` provider.

The built-in provider and model names can be found on [Models.dev](https://models.dev).

You can also configure these options for any agents that you are using. The agent config overrides any global options here. [Learn more](/docs/agents/#additional).
You can also define custom models that extend built-in ones by referring to the built-in model's ID. Each custom model can set its own options:

```jsonc title="opencode.jsonc" {6-20}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "opencode": {
      "models": {
        "gpt-5-high": {
          "id": "gpt-5",
          "options": {
            "reasoningEffort": "high",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
          },
        },
        "gpt-5-low": {
          "id": "gpt-5",
          "options": {
            "reasoningEffort": "low",
            "textVerbosity": "low",
            "reasoningSummary": "auto",
          },
        },
      },
    },
  },
}
```
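A custom model defined this way can be selected like any other, using its own key as the `model_id`. For example, to make one of the variants above the default:

```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "model": "opencode/gpt-5-high"
}
```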
---

## Loading models

When OpenCode starts up, it checks for models in the following priority order:

1. The `--model` or `-m` command line flag. The format is the same as in the config file: `provider_id/model_id`.

2. The model set in the OpenCode config.

   ```json title="opencode.json"
   {
     "$schema": "https://opencode.ai/config.json",
     "model": "anthropic/claude-sonnet-4-20250514"
   }
   ```

   The format here is also `provider_id/model_id`.

3. The last used model.

4. The first model, using an internal priority.
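Since the flag has the highest priority, it overrides everything else for that run. A quick sketch, using one of the model IDs shown above:

```shell
# Overrides the config file and the last used model for this session
opencode -m anthropic/claude-sonnet-4-20250514
```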