---
title: Providers
description: Using any LLM provider in opencode.
---

import config from "../../../../config.mjs"

export const console = config.console

opencode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, and it also supports running local models.

To add a provider you need to:

1. Add the API keys for the provider using `opencode auth login`.
2. Configure the provider in your opencode config.

---

### Credentials

When you add a provider's API keys with `opencode auth login`, they are stored
in `~/.local/share/opencode/auth.json`.
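
The file is plain JSON, so you can inspect it directly if you want to see which providers already have credentials stored. The exact shape of the entries varies by provider and isn't documented here:

```bash
# Show the credentials opencode has saved locally
cat ~/.local/share/opencode/auth.json
```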

---

### Config

You can customize the providers through the `provider` section in your opencode
config.

---

#### Base URL

You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.

```json title="opencode.json" {6}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://api.anthropic.com/v1"
      }
    }
  }
}
```

---

## opencode zen

opencode zen is a list of models provided by the opencode team that have been
tested and verified to work well with opencode. [Learn more](/docs/zen).

:::tip
If you are new, we recommend starting with opencode zen.
:::

1. Sign in to **<a href={console}>opencode zen</a>** and get your API key.
2. Run `opencode auth login`, select opencode zen, and add your API key.
3. Run `/models` in the TUI to see the list of models we recommend.

It works like any other provider in opencode and is completely optional.

---

## Directory

Let's look at some of the providers in detail. If you'd like to add a provider to the
list, feel free to open a PR.

:::note
Don't see a provider here? Submit a PR.
:::

---

### Amazon Bedrock

To use Amazon Bedrock with opencode:

1. Head over to the **Model catalog** in the Amazon Bedrock console and request
access to the models you want.

:::tip
You need to have access to the model you want in Amazon Bedrock.
:::

2. You'll need to set one of the following environment variables:

- `AWS_ACCESS_KEY_ID`: You can get this by creating an IAM user and generating
an access key for it.
- `AWS_PROFILE`: First log in through AWS IAM Identity Center (or AWS SSO) using
`aws sso login`. Then get the name of the profile you want to use.
- `AWS_BEARER_TOKEN_BEDROCK`: You can generate a long-term API key from the
Amazon Bedrock console.

Once you have one of the above, set it while running opencode.

```bash
AWS_ACCESS_KEY_ID=XXX opencode
```

Or add it to a `.env` file in the project root.

```bash title=".env"
AWS_ACCESS_KEY_ID=XXX
```

Or add it to your bash profile.

```bash title="~/.bash_profile"
export AWS_ACCESS_KEY_ID=XXX
```

3. Run the `/models` command to select the model you want.

---

### Anthropic

We recommend signing up for [Claude Pro](https://www.anthropic.com/news/claude-pro) or [Max](https://www.anthropic.com/max); it's the most cost-effective way to use opencode.

Once you've signed up, run `opencode auth login` and select Anthropic.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Anthropic (recommended)
│ ○ OpenAI
│ ○ Google
│ ...
```

Here you can select the **Claude Pro/Max** option and it'll open your browser
and ask you to authenticate.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Anthropic
◆ Login method
│ ● Claude Pro/Max
│ ○ Create API Key
│ ○ Manually enter API Key
```

Now all the Anthropic models should be available when you use the `/models` command.

##### Using API keys

You can also select **Create API Key** if you don't have a Pro/Max subscription. It'll also open your browser, ask you to log in to Anthropic, and give you a code you can paste in your terminal.

Or if you already have an API key, you can select **Manually enter API Key** and paste it in your terminal.
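
If you'd rather keep the key in an environment variable instead of the auth store, you can also point the provider at it from your config. This is a sketch that assumes the built-in `anthropic` provider accepts the same `options.apiKey` field shown in the custom provider example later on this page:

```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}
```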

---

### Azure OpenAI

1. Head over to the [Azure portal](https://portal.azure.com/) and create an **Azure OpenAI** resource. You'll need:

- **Resource name**: This becomes part of your API endpoint (`https://RESOURCE_NAME.openai.azure.com/`)
- **API key**: Either `KEY 1` or `KEY 2` from your resource

2. Go to [Azure AI Foundry](https://ai.azure.com/) and deploy a model.

:::note
The deployment name must match the model name for opencode to work properly.
:::

3. Run `opencode auth login` and select **Azure**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Azure
│ ...
```

4. Enter your API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Azure
◇ Enter your API key
│ _
```

5. Set your resource name as an environment variable:

```bash
AZURE_RESOURCE_NAME=XXX opencode
```

Or add it to a `.env` file in the project root:

```bash title=".env"
AZURE_RESOURCE_NAME=XXX
```

Or add it to your bash profile:

```bash title="~/.bash_profile"
export AZURE_RESOURCE_NAME=XXX
```

6. Run the `/models` command to select your deployed model.

---

### Cerebras

1. Head over to the [Cerebras console](https://inference.cerebras.ai/), create an account, and generate an API key.

2. Run `opencode auth login` and select **Cerebras**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Cerebras
│ ...
```

3. Enter your Cerebras API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Cerebras
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Qwen 3 Coder 480B_.

---

### DeepSeek

1. Head over to the [DeepSeek console](https://platform.deepseek.com/), create an account, and click **Create new API key**.

2. Run `opencode auth login` and select **DeepSeek**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● DeepSeek
│ ...
```

3. Enter your DeepSeek API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ DeepSeek
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a DeepSeek model like _DeepSeek Reasoner_.

---

### Fireworks AI

1. Head over to the [Fireworks AI console](https://app.fireworks.ai/), create an account, and click **Create API Key**.

2. Run `opencode auth login` and select **Fireworks AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Fireworks AI
│ ...
```

3. Enter your Fireworks AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Fireworks AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Kimi K2 Instruct_.

---

### GitHub Copilot

To use your GitHub Copilot subscription with opencode:

:::note
Some models might need a [Pro+ subscription](https://github.com/features/copilot/plans) to use.
:::

1. Run `opencode auth login` and select GitHub Copilot.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ GitHub Copilot
◇ ──────────────────────────────────────────────╮
│ │
│ Please visit: https://github.com/login/device │
│ Enter code: 8F43-6FCF │
│ │
├─────────────────────────────────────────────────╯
◓ Waiting for authorization...
```

2. Navigate to [github.com/login/device](https://github.com/login/device) and enter the code.

3. Now run the `/models` command to select the model you want.

---

### Groq

1. Head over to the [Groq console](https://console.groq.com/), click **Create API Key**, and copy the key.

2. Run `opencode auth login` and select Groq.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Groq
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Groq
◇ Enter your API key
│ _
```

4. Run the `/models` command to select the one you want.

---

### LM Studio

You can configure opencode to use local models through LM Studio.

```json title="opencode.json" "lmstudio" {5, 6, 8, 10-14}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        }
      }
    }
  }
}
```

In this example:

- `lmstudio` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
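
If the model doesn't show up, it's worth confirming that the LM Studio local server is running and serving its OpenAI-compatible API at the `baseURL` above. Assuming the default port from the config, a quick check might look like:

```bash
# List the models the local server currently exposes
curl http://127.0.0.1:1234/v1/models
```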

---

### Moonshot AI

To use Kimi K2 from Moonshot AI:

1. Head over to the [Moonshot AI console](https://platform.moonshot.ai/console), create an account, and click **Create API key**.

2. Run `opencode auth login` and select **Moonshot AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Moonshot AI
```

3. Enter your Moonshot API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Moonshot AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select _Kimi K2_.

---

### Ollama

You can configure opencode to use local models through Ollama.

```json title="opencode.json" "ollama" {5, 6, 8, 10-14}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "Llama 2"
        }
      }
    }
  }
}
```

In this example:

- `ollama` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
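
Note that any model ID you list under `models` needs to exist locally in Ollama before you can use it. For the example above, you'd pull it first:

```bash
# Download the model referenced in the config above
ollama pull llama2
```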

---

### OpenAI

1. Head over to the [OpenAI Platform console](https://platform.openai.com/api-keys), click **Create new secret key**, and copy the key.

2. Run `opencode auth login` and select OpenAI.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenAI
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenAI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select the one you want.

---

### opencode zen

opencode zen is a list of tested and verified models provided by the opencode team. [Learn more](/docs/zen).

1. Sign in to **<a href={console}>opencode zen</a>** and click **Create API Key**.

2. Run `opencode auth login` and select **opencode zen**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● opencode zen
│ ...
```

3. Enter your opencode API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ opencode zen
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Qwen 3 Coder 480B_.

---

### OpenRouter

1. Head over to the [OpenRouter dashboard](https://openrouter.ai/settings/keys), click **Create API Key**, and copy the key.

2. Run `opencode auth login` and select OpenRouter.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenRouter
│ ○ Anthropic
│ ○ Google
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenRouter
◇ Enter your API key
│ _
```

4. Many OpenRouter models are preloaded by default. Run the `/models` command to select the one you want.

You can also add additional models through your opencode config.

```json title="opencode.json" {6}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "models": {
        "somecoolnewmodel": {}
      }
    }
  }
}
```

5. You can also customize the models through your opencode config. Here's an example of specifying which upstream provider OpenRouter should route to:

```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "models": {
        "moonshotai/kimi-k2": {
          "options": {
            "provider": {
              "order": ["baseten"],
              "allow_fallbacks": false
            }
          }
        }
      }
    }
  }
}
```

---

### Together AI

1. Head over to the [Together AI console](https://api.together.ai), create an account, and click **Add Key**.

2. Run `opencode auth login` and select **Together AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Together AI
│ ...
```

3. Enter your Together AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Together AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Kimi K2 Instruct_.

---

### xAI

For a limited time, you can use xAI's Grok Code for free with opencode.

:::tip
Grok Code is available for free for a limited time on opencode.
:::

1. Make sure you are on the latest version of opencode.

2. Run the `/models` command and select **Grok Code Free**.

As part of the trial period, the xAI team will be using the request logs to
monitor and improve Grok Code.

---

### Z.AI

1. Head over to the [Z.AI API console](https://z.ai/manage-apikey/apikey-list), create an account, and click **Create a new API key**.

2. Run `opencode auth login` and select **Z.AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Z.AI
│ ...
```

3. Enter your Z.AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Z.AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _GLM-4.5_.

---

## Custom provider

To add any **OpenAI-compatible** provider that's not listed in `opencode auth login`:

:::tip
You can use any OpenAI-compatible provider with opencode. Most modern AI providers offer OpenAI-compatible APIs.
:::

1. Run `opencode auth login` and scroll down to **Other**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Other
```

2. Enter a unique ID for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Enter provider id
│ myprovider
```

:::note
Choose a memorable ID; you'll use this in your config file.
:::

3. Enter your API key for the provider.

```bash
$ opencode auth login
┌ Add credential
▲ This only stores a credential for myprovider - you will need configure it in opencode.json, check the docs for examples.
◇ Enter your API key
│ sk-...
```

4. Create or update your `opencode.json` file in your project directory:

```json title="opencode.json" "myprovider" {5-15}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider Display Name",
      "options": {
        "baseURL": "https://api.myprovider.com/v1"
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name"
        }
      }
    }
  }
}
```

Here are the configuration options:

- **npm**: AI SDK package to use, `@ai-sdk/openai-compatible` for OpenAI-compatible providers.
- **name**: Display name in the UI.
- **models**: Available models.
- **options.baseURL**: API endpoint URL.
- **options.apiKey**: Optionally set the API key, if not using auth.
- **options.headers**: Optionally set custom headers.

More on the advanced options in the example below.

5. Run the `/models` command and your custom provider and models will appear in the selection list.

---

##### Example

Here's an example setting the `apiKey` and `headers` options.

```json title="opencode.json" {9,11}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider Display Name",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:ANTHROPIC_API_KEY}",
        "headers": {
          "Authorization": "Bearer custom-token"
        }
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name"
        }
      }
    }
  }
}
```

We are setting the `apiKey` using the `env` variable syntax, [learn more](/docs/config#env-vars).
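
For the `{env:ANTHROPIC_API_KEY}` placeholder to resolve, the environment variable needs to be set in the shell that launches opencode. For example, with a placeholder key value:

```bash
# Set the variable for this invocation only; the key value is illustrative
ANTHROPIC_API_KEY=sk-xxx opencode
```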

---

## Troubleshooting

If you are having trouble with configuring a provider, check the following:

1. **Check the auth setup**: Run `opencode auth list` to see if credentials for the provider have been added; there's an example after this list. This doesn't apply to providers like Amazon Bedrock that rely on environment variables for their auth.

2. For custom providers, check the opencode config and:

- Make sure the provider ID used in `opencode auth login` matches the ID in your opencode config.
- Make sure the right npm package is used for the provider. For example, use `@ai-sdk/cerebras` for Cerebras. And for all other OpenAI-compatible providers, use `@ai-sdk/openai-compatible`.
- Make sure the correct API endpoint is set in the `options.baseURL` field.
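
For example, to quickly confirm which providers have stored credentials (the exact output format may vary between versions):

```bash
opencode auth list
```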