
Merge pull request #5304 from Kilo-Org/update-minimax-docs-m2-1

docs: Update MiniMax documentation for M2.1 model availability
Remon Oldenbeuving committed 3 weeks ago
Commit 664cbe9b5f
1 changed file with 7 additions and 6 deletions
      apps/kilocode-docs/docs/providers/minimax.md

apps/kilocode-docs/docs/providers/minimax.md (+7 −6)

@@ -4,7 +4,7 @@ sidebar_label: MiniMax
 
 # Using MiniMax With Kilo Code
 
-MiniMax is a global AI foundation model company focused on fast, cost-efficient multimodal models with strong coding, tool-use, and agentic capabilities. Their flagship MiniMax M2 model delivers high-speed inference, long-context reasoning, and advanced development workflow support.
+MiniMax is a global AI foundation model company focused on fast, cost-efficient multimodal models with strong coding, tool-use, and agentic capabilities. Their flagship MiniMax M2.1 model delivers high-speed inference, long-context reasoning, and advanced development workflow support.
 
 **Website:** [https://www.minimax.io/](https://www.minimax.io/)
 
@@ -12,7 +12,7 @@ MiniMax is a global AI foundation model company focused on fast, cost-efficient
 
 1. **Sign Up/Sign In:** Go to the [MiniMax Console](https://platform.minimax.io/). Create an account or sign in.
 2. **Open the API Keys Page:** Navigate to your **Profile > API Keys**.
-3. **Create a Key:** Click to generate a new API key and give it a descriptive name (e.g., “Kilo Code”).
+3. **Create a Key:** Click to generate a new API key and give it a descriptive name (e.g., "Kilo Code").
 4. **Copy the Key:** Copy the key immediately. You may not be able to view it again. Store it securely.
 
 ## Supported Models
@@ -21,7 +21,8 @@ Kilo Code supports the following MiniMax models:
 
 ### MiniMax Models
 
-- `MiniMax-M2` (currently free) – MiniMax’s latest model with strong coding, tool use, long-context reasoning, and high inference speed
+- `MiniMax-M2.1` – MiniMax's latest flagship model with strong coding, tool use, long-context reasoning, and high inference speed
+- `MiniMax-M2` – Previous generation model with strong coding, tool use, and long-context reasoning capabilities
 - `MiniMax-01` – General-purpose MiniMax model
 - `MiniMax-M1` – Earlier MiniMax model with broad multimodal capabilities
 
@@ -34,7 +35,7 @@ Kilo Code supports the following MiniMax models:
 
 ## Tips and Notes
 
-- **Performance:** MiniMax M2 emphasizes fast inference, strong coding ability, and exceptional tool-calling performance.
+- **Performance:** MiniMax M2.1 emphasizes fast inference, strong coding ability, and exceptional tool-calling performance.
 - **Context Window:** MiniMax models support ultra-long context windows suitable for large codebases and agent workflows.
-- **Pricing:** Pricing varies by model, with input costs ranging from $0.20 to $0.40 per million tokens and output costs from $1.10 to $2.20 per million tokens. Refer to the MiniMax documentation for the most current pricing information.
-    - **MiniMax M2** is designed for a strong balance of speed, price, and performance, and is currently **free to use for a limited time**.
+- **Pricing:** Pricing varies by model, with input costs ranging from $0.20 to $0.30 per million tokens and output costs from $1.10 to $2.20 per million tokens. Refer to the MiniMax documentation for the most current pricing information.
+    - **MiniMax M2.1** is currently free to use exclusively through [Kilo for Slack](/slack).
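
As context for the key-setup steps in the diff above, here is a minimal sketch of exercising a freshly created key. The base URL, the OpenAI-compatible `/chat/completions` path, and the request shape are illustrative assumptions and are not confirmed by this page; only the `MiniMax-M2.1` model id comes from the docs above — check the MiniMax API reference for the real endpoint.

```typescript
// Minimal connectivity check for a MiniMax API key (sketch only).
// ASSUMPTIONS: base URL and the OpenAI-compatible /chat/completions path are
// illustrative; confirm the exact endpoint in the MiniMax API documentation.
const MINIMAX_BASE_URL = "https://api.minimax.io/v1"; // assumed base URL
const apiKey = process.env.MINIMAX_API_KEY; // the key created in the console

async function checkKey(): Promise<void> {
  const response = await fetch(`${MINIMAX_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "MiniMax-M2.1", // model id as listed in the docs above
      messages: [{ role: "user", content: "Reply with OK if you can read this." }],
    }),
  });

  if (!response.ok) {
    throw new Error(`MiniMax request failed: ${response.status} ${response.statusText}`);
  }
  const data = await response.json();
  console.log(data.choices?.[0]?.message?.content);
}

checkKey().catch(console.error);
```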
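
The pricing bullet quotes per-million-token rates; the sketch below shows how such rates translate into a rough per-request estimate. The token counts and the specific rates picked from the quoted ranges are illustrative only — actual rates vary by model, so refer to the MiniMax pricing page.

```typescript
// Rough cost estimate from per-million-token rates (illustrative numbers only).
const INPUT_RATE_PER_MTOK = 0.3; // USD per 1M input tokens (upper end of the quoted range)
const OUTPUT_RATE_PER_MTOK = 2.2; // USD per 1M output tokens

function estimateCostUsd(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_RATE_PER_MTOK +
    (outputTokens / 1_000_000) * OUTPUT_RATE_PER_MTOK
  );
}

// Example: a 120k-token prompt over a large codebase with an 8k-token answer
// costs roughly 120000/1e6 * 0.30 + 8000/1e6 * 2.20 ≈ $0.054.
console.log(estimateCostUsd(120_000, 8_000).toFixed(4));
```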