Jay V 7 months ago
Parent
Commit
d2c862e32d
2 changed files with 66 additions and 74 deletions
  1. +14 -73
      packages/web/src/content/docs/docs/config.mdx
  2. +52 -1
      packages/web/src/content/docs/docs/models.mdx

+ 14 - 73
packages/web/src/content/docs/docs/config.mdx

@@ -43,55 +43,38 @@ Your editor should be able to validate and autocomplete based on the schema.

 ---

-### Models
+### Modes
-You can configure the providers and models you want to use in your opencode config through the `provider` and `model` options.
+opencode comes with two built-in modes: _build_, the default with all tools enabled, and _plan_, a restricted mode with file modification tools disabled. You can override these built-in modes or define your own custom modes with the `mode` option.

 ```json title="opencode.json"
 {
   "$schema": "https://opencode.ai/config.json",
-  "provider": {},
-  "model": ""
+  "mode": {
+    "build": { },
+    "plan": { },
+    "my-custom-mode": { }
+  }
 }
 ```

-[Learn more here](/docs/models).
+[Learn more here](/docs/modes).
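+
+Each mode body can optionally set a `model`, a `prompt` file, and per-tool toggles. For example, a sketch of a custom mode; the values are illustrative:
+
+```json title="opencode.json"
+{
+  "$schema": "https://opencode.ai/config.json",
+  "mode": {
+    "my-custom-mode": {
+      "model": "anthropic/claude-sonnet-4-20250514",
+      "prompt": "{file:./prompts/my-custom-mode.txt}",
+      "tools": {
+        "write": false,
+        "edit": false,
+        "bash": false
+      }
+    }
+  }
+}
+```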
-#### Custom Providers
+---
-You can also define custom providers in your configuration. This is useful for connecting to services that are not natively supported but are OpenAI API-compatible, such as local models served through LM Studio or Ollama.
+### Models
-Here's an example of how to configure a local on-device model from LM Studio:
+You can configure the providers and models you want to use in your opencode config through the `provider` and `model` options.

 ```json title="opencode.json"
 {
   "$schema": "https://opencode.ai/config.json",
-  "model": "lmstudio/google/gemma-3n-e4b",
-  "provider": {
-    "lmstudio": {
-      "npm": "@ai-sdk/openai-compatible",
-      "name": "LM Studio (local)",
-      "options": {
-        "baseURL": "http://127.0.0.1:1234/v1"
-      },
-      "models": {
-        "google/gemma-3n-e4b": {
-          "name": "Gemma 3n-e4b (local)"
-        }
-      }
-    }
-  }
+  "provider": {},
+  "model": ""
 }
 ```

-In this example:
-
-- `lmstudio` is the custom provider ID.
-- `npm` specifies the package to use for this provider. `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
-- `name` is the display name for the provider in the UI.
-- `options.baseURL` is the endpoint for the local server.
-- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
-- The `model` key at the root is set to the full ID of the model you want to use, which is `provider_id/model_id`.
+You can also configure [local models](/docs/models#local). [Learn more](/docs/models).

 ---

@@ -170,48 +153,6 @@ You can configure MCP servers you want to use through the `mcp` option.

 ---

-### Modes
-
-You can configure different modes for opencode through the `mode` option. Modes allow you to customize the behavior, tools, and prompts for different use cases.
-
-opencode comes with two built-in modes: `build` (default with all tools enabled) and `plan` (restricted mode with file modification tools disabled). You can override these built-in modes or create your own custom modes.
-
-```json title="opencode.json"
-{
-  "$schema": "https://opencode.ai/config.json",
-  "mode": {
-    "build": {
-      "model": "anthropic/claude-sonnet-4-20250514",
-      "prompt": "{file:./prompts/build.txt}",
-      "tools": {
-        "write": true,
-        "edit": true,
-        "bash": true
-      }
-    },
-    "plan": {
-      "tools": {
-        "write": false,
-        "edit": false,
-        "bash": false
-      }
-    },
-    "review": {
-      "prompt": "{file:./prompts/code-review.txt}",
-      "tools": {
-        "write": false,
-        "edit": false,
-        "bash": false
-      }
-    }
-  }
-}
-```
-
-[Learn more here](/docs/modes).
-
----
-
 ### Disabled providers

 You can disable providers that are loaded automatically through the `disabled_providers` option. This is useful when you want to prevent certain providers from being loaded even if their credentials are available.

+ 52 - 1
packages/web/src/content/docs/docs/models.mdx

@@ -11,10 +11,14 @@ opencode uses the [AI SDK](https://ai-sdk.dev/) and [Models.dev](https://models.

 You can configure providers in your opencode config under the `provider` section.

+---
+
 ### Defaults

 Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start opencode.

+---
+
 ### Custom

 You can add custom providers by specifying the npm package for the provider and the models you want to use.
@@ -37,9 +41,44 @@ You can add custom providers by specifying the npm package for the provider and
 }
 ```

+---
+
 ### Local

-To configure a local model, specify the npm package to use and the `baseURL`.
+You can configure local models, like the ones served through LM Studio or
+Ollama. To do so, you'll need to specify the npm package for the provider and
+the `baseURL` of the local server.
+
+Here's an example of configuring a local model from LM Studio:
+
+```json title="opencode.json" {4-15}
+{
+  "$schema": "https://opencode.ai/config.json",
+  "provider": {
+    "lmstudio": {
+      "npm": "@ai-sdk/openai-compatible",
+      "name": "LM Studio (local)",
+      "options": {
+        "baseURL": "http://127.0.0.1:1234/v1"
+      },
+      "models": {
+        "google/gemma-3n-e4b": {
+          "name": "Gemma 3n-e4b (local)"
+        }
+      }
+    }
+  }
+}
+```
+
+In this example:
+
+- `lmstudio` is the custom provider ID. We'll use this later.
+- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
+- `name` is the display name for the provider in the UI.
+- `options.baseURL` is the endpoint for the local server.
+- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
+
+Similarly, to configure a local model from Ollama:

 ```json title="opencode.json" {5,7}
 {
@@ -58,6 +97,18 @@ To configure a local model, specify the npm package to use and the `baseURL`.
 }
 ```

+To set one of these as the default model, you can set the `model` key at the
+root.
+
+```json title="opencode.json" {3}
+{
+  "$schema": "https://opencode.ai/config.json",
+  "model": "lmstudio/google/gemma-3n-e4b"
+}
+```
+
+Here the full model ID is `provider_id/model_id`, where `provider_id` is the key we set in the `provider` map above and `model_id` is a key from that provider's `models` map.
+
 ---

 ## Select a model