
docs: document base URL

Dax Raad, 7 months ago
parent
commit
73b46c2bf9
1 changed file with 26 additions and 6 deletions
packages/web/src/content/docs/docs/models.mdx (+26 −6)

@@ -27,12 +27,13 @@ You can add custom providers by specifying the npm package for the provider and
 {
   "$schema": "https://opencode.ai/config.json",
   "provider": {
-    "openrouter": {
-      "name": "OpenRouter",
+    "moonshot": {
+      "npm": "@ai-sdk/openai-compatible",
+      "options": {
+        "baseURL": "https://api.moonshot.ai/v1"
+      },
       "models": {
-        "weirdo/some-weird-model": {
-          "name": "Claude 3.5 Sonnet"
-        }
+        "kimi-k2-0711-preview": {}
       }
     }
   }
@@ -41,10 +42,29 @@ You can add custom providers by specifying the npm package for the provider and
 
 ---
 
+### Base URL
+
+You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.
+
+```json title="opencode.json" {6-7}
+{
+  "$schema": "https://opencode.ai/config.json",
+  "provider": {
+    "anthropic": {
+      "options": {
+        "baseURL": "https://api.anthropic.com/v1"
+      }
+    }
+  }
+}
+```
+
+---
+
 ### Local
 
 You can configure local models like ones served through LM Studio or Ollama. To
-do so, you'll need to specify a couple of things. 
+do so, you'll need to specify a couple of things.
 
 Here's an example of configuring a local model from LM Studio:
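The LM Studio example itself falls outside this hunk. As a sketch only, such a config would follow the same shape as the Moonshot entry above, pointing `baseURL` at LM Studio's OpenAI-compatible local server (the `lmstudio` key, the default port `1234`, and the model id here are assumptions, not taken from this diff):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:1234/v1"
      },
      "models": {
        "local-model": {}
      }
    }
  }
}
```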