
Add Provider Routing docs for Kilo Code and OpenRouter

Christiaan Arnoldus 4 months ago
parent
commit
0f8c58b8f6

+ 19 - 0
apps/kilocode-docs/docs/providers/kilocode.md

@@ -43,6 +43,25 @@ Once you've completed the registration process, Kilo Code is automatically confi
 2. **No API Key Management:** Your authentication is handled seamlessly through the registration process
 3. **Model Selection:** Access to frontier models is provided automatically through your Kilo Code account
 
+### Provider Routing
+
+Kilo Code can route requests to many different inference providers. For personal accounts, provider routing behavior can be controlled in the API Provider settings under Provider Routing; a rough sketch of how these options map to request-level provider preferences appears at the end of this section.
+
+#### Provider Sorting
+
+- Default provider sorting: at the time of writing, equivalent to preferring providers with lower latency
+- Prefer providers with lower price
+- Prefer providers with higher throughput (i.e. more tokens per second)
+- Prefer providers with lower latency (i.e. shorter time to first token)
+- A specific provider can also be chosen. This is not recommended, because it will result in errors when the provider is facing downtime or enforcing rate limits.
+
+#### Data Policy
+
+- Allow prompt training (free only): providers that may train on your prompts or completions are only allowed for free models.
+- Allow prompt training: providers that may train on your prompts or completions are allowed.
+- Deny prompt training: providers that may train on your prompts or completions are not allowed.
+- Zero data retention: only providers with a strict zero data retention policy are allowed. This option is not recommended, as it will disable many popular providers, such as Anthropic and OpenAI.
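+
+These options are set in the settings UI; no code is required. As a rough sketch of what they correspond to, the sorting and data policy choices map onto OpenRouter-style provider preferences. The field names below follow OpenRouter's `provider` request object and are an assumption about the routing layer, not Kilo Code's actual implementation:
+
+```python
+# Illustrative mapping only; Kilo Code applies these preferences for you based
+# on the Provider Routing settings described above.
+PROVIDER_SORTING = {
+    "default": "latency",        # at time of writing, same as preferring lower latency
+    "lowest price": "price",
+    "highest throughput": "throughput",
+    "lowest latency": "latency",
+}
+
+DATA_POLICY = {
+    "allow prompt training (free only)": {"data_collection": "allow"},  # free models only
+    "allow prompt training": {"data_collection": "allow"},
+    "deny prompt training": {"data_collection": "deny"},
+    "zero data retention": {"zdr": True},  # assumed field name; disables many providers
+}
+```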
+
 ## Connected Accounts
 
 With the Kilo Code provider, if you sign up with Google you can also connect other sign in accounts - like GitHub - by:

+ 26 - 7
apps/kilocode-docs/docs/providers/openrouter.md

@@ -4,19 +4,19 @@ sidebar_label: OpenRouter
 
 # Using OpenRouter With Kilo Code
 
-OpenRouter is an AI platform that provides access to a wide variety of language models from different providers, all through a single API.  This can simplify setup and allow you to easily experiment with different models.
+OpenRouter is an AI platform that provides access to a wide variety of language models from different providers, all through a single API. This can simplify setup and allow you to easily experiment with different models.
 
 **Website:** [https://openrouter.ai/](https://openrouter.ai/)
 
 ## Getting an API Key
 
-1.  **Sign Up/Sign In:** Go to the [OpenRouter website](https://openrouter.ai/).  Sign in with your Google or GitHub account.
-2.  **Get an API Key:** Go to the [keys page](https://openrouter.ai/keys).  You should see an API key listed.  If not, create a new key.
+1.  **Sign Up/Sign In:** Go to the [OpenRouter website](https://openrouter.ai/). Sign in with your Google or GitHub account.
+2.  **Get an API Key:** Go to the [keys page](https://openrouter.ai/keys). You should see an API key listed. If not, create a new key.
 3.  **Copy the Key:** Copy the API key.
 
 ## Supported Models
 
-OpenRouter supports a large and growing number of models.  Kilo Code automatically fetches the list of available models. Refer to the [OpenRouter Models page](https://openrouter.ai/models) for the complete and up-to-date list.
+OpenRouter supports a large and growing number of models. Kilo Code automatically fetches the list of available models. Refer to the [OpenRouter Models page](https://openrouter.ai/models) for the complete and up-to-date list.
 
 ## Configuration in Kilo Code
 
@@ -30,8 +30,27 @@ OpenRouter supports a large and growing number of models.  Kilo Code automatical
 
 OpenRouter provides an [optional "middle-out" message transform](https://openrouter.ai/docs/features/message-transforms) to help with prompts that exceed the maximum context size of a model. You can enable it by checking the "Compress prompts and message chains to the context size" box.
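+
+Under the hood this maps to OpenRouter's documented `transforms` request parameter. A minimal sketch of setting it directly on an API request (the model slug and message history are placeholders):
+
+```python
+# Sketch only: requesting middle-out compression straight from the OpenRouter
+# API. Kilo Code's checkbox toggles the equivalent behavior for you.
+conversation_history = [{"role": "user", "content": "..."}]  # stand-in for a long chain
+
+payload = {
+    "model": "openai/gpt-4o-mini",     # placeholder model slug
+    "messages": conversation_history,
+    "transforms": ["middle-out"],      # compress messages that exceed the context window
+}
+```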
 
+## Provider Routing
+
+OpenRouter can route requests to many different inference providers. This behavior can be controlled in the API Provider settings under Provider Routing; a request-level sketch of these options appears at the end of this section.
+
+### Provider Sorting
+
+- Default provider sorting: use the setting in your OpenRouter account
+- Prefer providers with lower price
+- Prefer providers with higher throughput (i.e. more tokens per second)
+- Prefer providers with lower latency (i.e. shorter time to first token)
+- A specific provider can also be chosen. This is not recommended, because it will result in errors when the provider is facing downtime or enforcing rate limits.
+
+### Data Policy
+
+- No data policy set: use the settings in your OpenRouter account.
+- Allow prompt training: providers that may train on your prompts or completions are allowed. Free models generally require this option to be enabled.
+- Deny prompt training: providers that may train on your prompts or completions are not allowed.
+- Zero data retention: only providers with a strict zero data retention policy are allowed. This option is not recommended, as it will disable many popular providers, such as Anthropic and OpenAI.
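+
+Kilo Code sets these preferences for you. For reference, the same routing behavior can be expressed directly against the OpenRouter API with the documented `provider` request object; a minimal sketch, with an illustrative model, prompt, and choice of values:
+
+```python
+# Sketch: sorting providers by throughput and denying prompt training on a
+# direct OpenRouter chat completion request.
+import os
+import requests
+
+response = requests.post(
+    "https://openrouter.ai/api/v1/chat/completions",
+    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
+    json={
+        "model": "anthropic/claude-3.5-sonnet",
+        "messages": [{"role": "user", "content": "Hello"}],
+        "provider": {
+            "sort": "throughput",       # or "price" / "latency"
+            "data_collection": "deny",  # deny prompt training
+            # "order": ["anthropic"],   # pin a specific provider (not recommended)
+        },
+    },
+    timeout=60,
+)
+print(response.json()["choices"][0]["message"]["content"])
+```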
+
 ## Tips and Notes
 
-* **Model Selection:** OpenRouter offers a wide range of models. Experiment to find the best one for your needs.
-* **Pricing:**  OpenRouter charges based on the underlying model's pricing.  See the [OpenRouter Models page](https://openrouter.ai/models) for details.
-* **Prompt Caching:** Some providers support prompt caching. See the OpenRouter documentation for supported models.
+- **Model Selection:** OpenRouter offers a wide range of models. Experiment to find the best one for your needs.
+- **Pricing:** OpenRouter charges based on the underlying model's pricing. See the [OpenRouter Models page](https://openrouter.ai/models) for details.
+- **Prompt Caching:** Some providers support prompt caching. See the OpenRouter documentation for supported models.