---
title: Providers
description: Using any LLM provider in OpenCode.
---

import config from "../../../config.mjs"

export const console = config.console

OpenCode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.dev) to support **75+ LLM providers**, and it also supports running local models.

To add a provider you need to:

1. Add the API keys for the provider using `opencode auth login`.
2. Configure the provider in your OpenCode config.
---

### Credentials

When you add a provider's API keys with `opencode auth login`, they are stored in `~/.local/share/opencode/auth.json`.

---

### Config

You can customize the providers through the `provider` section in your OpenCode config.
---

#### Base URL

You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.

```json title="opencode.json" {6}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://api.anthropic.com/v1"
      }
    }
  }
}
```
---

## OpenCode Zen

OpenCode Zen is a list of models provided by the OpenCode team that have been tested and verified to work well with OpenCode. [Learn more](/docs/zen).

:::tip
If you are new, we recommend starting with OpenCode Zen.
:::

1. Run `opencode auth login`, select opencode, and head to [opencode.ai/auth](https://opencode.ai/auth).

2. Sign in, add your billing details, and copy your API key.

3. Paste your API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ opencode
● Create an api key at https://opencode.ai/auth
◆ Enter your API key
│ _
```

4. Run `/models` in the TUI to see the list of models we recommend.

It works like any other provider in OpenCode, and it's completely optional to use.
---

## Directory

Let's look at some of the providers in detail. If you'd like to add a provider to the list, feel free to open a PR.

:::note
Don't see a provider here? Submit a PR.
:::
---

### Amazon Bedrock

To use Amazon Bedrock with OpenCode:

1. Head over to the **Model catalog** in the Amazon Bedrock console and request access to the models you want.

:::tip
You need to have access to the model you want in Amazon Bedrock.
:::

2. Set one of the following environment variables:

- `AWS_ACCESS_KEY_ID`: You can get this by creating an IAM user and generating an access key for it.
- `AWS_PROFILE`: First log in through AWS IAM Identity Center (or AWS SSO) using `aws sso login`. Then get the name of the profile you want to use.
- `AWS_BEARER_TOKEN_BEDROCK`: You can generate a long-term API key from the Amazon Bedrock console.

Once you have one of the above, set it while running opencode.

```bash
AWS_ACCESS_KEY_ID=XXX opencode
```

Or add it to your bash profile.

```bash title="~/.bash_profile"
export AWS_ACCESS_KEY_ID=XXX
```

3. Run the `/models` command to select the model you want.
---

### Anthropic

We recommend signing up for [Claude Pro](https://www.anthropic.com/news/claude-pro) or [Max](https://www.anthropic.com/max); it's the most cost-effective way to use opencode.

Once you've signed up, run `opencode auth login` and select Anthropic.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Anthropic
│ ...
```

Here you can select the **Claude Pro/Max** option and it'll open your browser and ask you to authenticate.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Anthropic
◆ Login method
│ ● Claude Pro/Max
│ ○ Create API Key
│ ○ Manually enter API Key
```

Now all the Anthropic models should be available when you use the `/models` command.

##### Using API keys

You can also select **Create API Key** if you don't have a Pro/Max subscription. It'll also open your browser, ask you to log in to Anthropic, and give you a code you can paste in your terminal.

Or if you already have an API key, you can select **Manually enter API Key** and paste it in your terminal.

---
### Azure OpenAI

1. Head over to the [Azure portal](https://portal.azure.com/) and create an **Azure OpenAI** resource. You'll need:

- **Resource name**: This becomes part of your API endpoint (`https://RESOURCE_NAME.openai.azure.com/`)
- **API key**: Either `KEY 1` or `KEY 2` from your resource

2. Go to [Azure AI Foundry](https://ai.azure.com/) and deploy a model.

:::note
The deployment name must match the model name for opencode to work properly.
:::

3. Run `opencode auth login` and select **Azure**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Azure
│ ...
```

4. Enter your API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Azure
◇ Enter your API key
│ _
```

5. Set your resource name as an environment variable:

```bash
AZURE_RESOURCE_NAME=XXX opencode
```

Or add it to your bash profile:

```bash title="~/.bash_profile"
export AZURE_RESOURCE_NAME=XXX
```

6. Run the `/models` command to select your deployed model.

---
### Cerebras

1. Head over to the [Cerebras console](https://inference.cerebras.ai/), create an account, and generate an API key.

2. Run `opencode auth login` and select **Cerebras**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Cerebras
│ ...
```

3. Enter your Cerebras API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Cerebras
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Qwen 3 Coder 480B_.

---
### DeepSeek

1. Head over to the [DeepSeek console](https://platform.deepseek.com/), create an account, and click **Create new API key**.

2. Run `opencode auth login` and select **DeepSeek**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● DeepSeek
│ ...
```

3. Enter your DeepSeek API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ DeepSeek
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a DeepSeek model like _DeepSeek Reasoner_.

---
### Fireworks AI

1. Head over to the [Fireworks AI console](https://app.fireworks.ai/), create an account, and click **Create API Key**.

2. Run `opencode auth login` and select **Fireworks AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Fireworks AI
│ ...
```

3. Enter your Fireworks AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Fireworks AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Kimi K2 Instruct_.

---
### GitHub Copilot

To use your GitHub Copilot subscription with opencode:

:::note
Some models might need a [Pro+ subscription](https://github.com/features/copilot/plans) to use. Some models need to be manually enabled in your [GitHub Copilot settings](https://docs.github.com/en/copilot/how-tos/use-ai-models/configure-access-to-ai-models#setup-for-individual-use).
:::

1. Run `opencode auth login` and select GitHub Copilot.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ GitHub Copilot
◇ ─────────────────────────────────────────────╮
│                                              │
│ Please visit: https://github.com/login/device│
│ Enter code: 8F43-6FCF                        │
│                                              │
├──────────────────────────────────────────────╯
◓ Waiting for authorization...
```

2. Navigate to [github.com/login/device](https://github.com/login/device) and enter the code.

3. Now run the `/models` command to select the model you want.

---
### Groq

1. Head over to the [Groq console](https://console.groq.com/), click **Create API Key**, and copy the key.

2. Run `opencode auth login` and select Groq.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Groq
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Groq
◇ Enter your API key
│ _
```

4. Run the `/models` command to select the one you want.

---
### Google Vertex AI

To use Google Vertex AI with OpenCode:

1. Head over to the **Model Garden** in the Google Cloud Console and check the models available in your region.

:::tip
You need to have a Google Cloud project with the Vertex AI API enabled.
:::

2. Set the required environment variables:

- `GOOGLE_VERTEX_PROJECT`: Your Google Cloud project ID
- `GOOGLE_VERTEX_REGION` (optional): The region for Vertex AI (defaults to `us-east5`)
- Authentication (choose one):
  - `GOOGLE_APPLICATION_CREDENTIALS`: Path to your service account JSON key file
  - Authenticate using the gcloud CLI: `gcloud auth application-default login`

Set them while running opencode.

```bash
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json GOOGLE_VERTEX_PROJECT=your-project-id opencode
```

Or add them to your bash profile.

```bash title="~/.bash_profile"
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
export GOOGLE_VERTEX_PROJECT=your-project-id
export GOOGLE_VERTEX_REGION=us-central1
```

3. Run the `/models` command to select the model you want.

---
### LM Studio

You can configure opencode to use local models through LM Studio.

```json title="opencode.json" "lmstudio" {5, 6, 8, 10-14}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        }
      }
    }
  }
}
```

In this example:

- `lmstudio` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.

---
### Moonshot AI

To use Kimi K2 from Moonshot AI:

1. Head over to the [Moonshot AI console](https://platform.moonshot.ai/console), create an account, and click **Create API key**.

2. Run `opencode auth login` and select **Moonshot AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Moonshot AI
```

3. Enter your Moonshot API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Moonshot AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select _Kimi K2_.

---
### Ollama

You can configure opencode to use local models through Ollama.

```json title="opencode.json" "ollama" {5, 6, 8, 10-14}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "Llama 2"
        }
      }
    }
  }
}
```

In this example:

- `ollama` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.

:::tip
If tool calls aren't working, try increasing `num_ctx` in Ollama. Start around 16k - 32k.
:::
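One way to raise `num_ctx` is to bake it into a model variant with an Ollama Modelfile. This is a sketch: the base model and the `llama2-32k` name are illustrative, and the final `ollama create` step is shown commented out since it requires Ollama to be installed.

```shell
# Write a Modelfile that sets a larger context window for the model
cat > Modelfile <<'EOF'
FROM llama2
PARAMETER num_ctx 32768
EOF

# Register it as a new model, then use "llama2-32k" as the
# model ID in your opencode config:
# ollama create llama2-32k -f Modelfile
```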
---
### OpenAI

1. Head over to the [OpenAI Platform console](https://platform.openai.com/api-keys), click **Create new secret key**, and copy the key.

2. Run `opencode auth login` and select OpenAI.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenAI
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenAI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select the one you want.

---
### OpenCode Zen

OpenCode Zen is a list of tested and verified models provided by the OpenCode team. [Learn more](/docs/zen).

1. Sign in to **<a href={console}>OpenCode Zen</a>** and click **Create API Key**.

2. Run `opencode auth login` and select **OpenCode Zen**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenCode Zen
│ ...
```

3. Enter your OpenCode API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenCode Zen
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Qwen 3 Coder 480B_.

---
### OpenRouter

1. Head over to the [OpenRouter dashboard](https://openrouter.ai/settings/keys), click **Create API Key**, and copy the key.

2. Run `opencode auth login` and select OpenRouter.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● OpenRouter
│ ○ Anthropic
│ ○ Google
│ ...
```

3. Enter the API key for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ OpenRouter
◇ Enter your API key
│ _
```

4. Many OpenRouter models are preloaded by default. Run the `/models` command to select the one you want.

You can also add additional models through your opencode config.

```json title="opencode.json" {6}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "models": {
        "somecoolnewmodel": {}
      }
    }
  }
}
```

5. You can also customize models through your opencode config. Here's an example of specifying which underlying provider serves a model.

```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "models": {
        "moonshotai/kimi-k2": {
          "options": {
            "provider": {
              "order": ["baseten"],
              "allow_fallbacks": false
            }
          }
        }
      }
    }
  }
}
```

---
### Together AI

1. Head over to the [Together AI console](https://api.together.ai), create an account, and click **Add Key**.

2. Run `opencode auth login` and select **Together AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Together AI
│ ...
```

3. Enter your Together AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Together AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _Kimi K2 Instruct_.

---
### xAI

:::tip
For a limited time, you can use xAI's Grok Code for free with opencode.
:::

1. Make sure you are on the latest version of opencode.

2. Run the `/models` command and select **Grok Code Free**.

As a part of the trial period, the xAI team will be using the request logs to monitor and improve Grok Code.

---
### Z.AI

1. Head over to the [Z.AI API console](https://z.ai/manage-apikey/apikey-list), create an account, and click **Create a new API key**.

2. Run `opencode auth login` and select **Z.AI**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Z.AI
│ ...
```

If you are subscribed to the **GLM Coding Plan**, select **Z.AI Coding Plan**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ● Z.AI Coding Plan
│ ...
```

3. Enter your Z.AI API key.

```bash
$ opencode auth login
┌ Add credential
◇ Select provider
│ Z.AI
◇ Enter your API key
│ _
```

4. Run the `/models` command to select a model like _GLM-4.5_.

---
## Custom provider

To add any **OpenAI-compatible** provider that's not listed in `opencode auth login`:

:::tip
You can use any OpenAI-compatible provider with opencode. Most modern AI providers offer OpenAI-compatible APIs.
:::

1. Run `opencode auth login` and scroll down to **Other**.

```bash
$ opencode auth login
┌ Add credential
◆ Select provider
│ ...
│ ● Other
```

2. Enter a unique ID for the provider.

```bash
$ opencode auth login
┌ Add credential
◇ Enter provider id
│ myprovider
```

:::note
Choose a memorable ID; you'll use this in your config file.
:::

3. Enter your API key for the provider.

```bash
$ opencode auth login
┌ Add credential
▲ This only stores a credential for myprovider - you will need to configure it in opencode.json, check the docs for examples.
◇ Enter your API key
│ sk-...
```

4. Create or update your `opencode.json` file in your project directory:

```json title="opencode.json" ""myprovider"" {5-15}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1"
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name"
        }
      }
    }
  }
}
```

Here are the configuration options:

- **npm**: AI SDK package to use, `@ai-sdk/openai-compatible` for OpenAI-compatible providers.
- **name**: Display name in the UI.
- **models**: Available models.
- **options.baseURL**: API endpoint URL.
- **options.apiKey**: Optionally set the API key, if not using auth.
- **options.headers**: Optionally set custom headers.

More on the advanced options in the example below.

5. Run the `/models` command and your custom provider and models will appear in the selection list.

---
##### Example

Here's an example setting the `apiKey` and `headers` options.

```json title="opencode.json" {9,11}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:ANTHROPIC_API_KEY}",
        "headers": {
          "Authorization": "Bearer custom-token"
        }
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name"
        }
      }
    }
  }
}
```

We are setting the `apiKey` using the `env` variable syntax, [learn more](/docs/config#env-vars).

---
## Troubleshooting

If you are having trouble configuring a provider, check the following:

1. **Check the auth setup**: Run `opencode auth list` to see if the credentials for the provider are added to your config. This doesn't apply to providers like Amazon Bedrock that rely on environment variables for their auth.

2. **For custom providers, check the opencode config**:

- Make sure the provider ID used in `opencode auth login` matches the ID in your opencode config.
- Make sure the right npm package is used for the provider. For example, use `@ai-sdk/cerebras` for Cerebras. For all other OpenAI-compatible providers, use `@ai-sdk/openai-compatible`.
- Check that the correct API endpoint is used in the `options.baseURL` field.
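As a sketch of the checks above, a Cerebras entry using its dedicated package might look like this. The model ID shown is illustrative, and the package's default endpoint is assumed, so no `baseURL` is set.

```json title="opencode.json"
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "cerebras": {
      "npm": "@ai-sdk/cerebras",
      "models": {
        "qwen-3-coder-480b": {}
      }
    }
  }
}
```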