
Changeset version bump (#8879)

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Matt Rubens <[email protected]>
github-actions[bot] 4 months ago
parent
commit
b5fd805cf0
3 changed files with 14 additions and 15 deletions
  1. .changeset/v3.29.3.md (+0 -14)
  2. CHANGELOG.md (+13 -0)
  3. src/package.json (+1 -1)

+ 0 - 14
.changeset/v3.29.3.md

@@ -1,14 +0,0 @@
----
-"roo-cline": patch
----
-
-- Update Gemini models with latest 09-2025 versions including Gemini 2.5 Pro and Flash (#8485 by @cleacos, PR by @roomote)
-- Add reasoning support for Z.ai GLM binary thinking mode (#8465 by @BeWater799, PR by @daniel-lxs)
-- Enable reasoning in Roo provider (thanks @mrubens!)
-- Add settings to configure time and cost display in system prompt (#8450 by @jaxnb, PR by @roomote)
-- Fix: Use max_output_tokens when available in LiteLLM fetcher (#8454 by @fabb, PR by @roomote)
-- Fix: Process queued messages after context condensing completes (#8477 by @JosXa, PR by @roomote)
-- Fix: Use monotonic clock for rate limiting to prevent timing issues (#7770 by @intermarkec, PR by @chrarnoldus)
-- Fix: Resolve checkpoint menu popover overflow (thanks @daniel-lxs!)
-- Fix: LiteLLM test failures after merge (thanks @daniel-lxs!)
-- Improve UX: Focus textbox and add newlines after adding to context (thanks @mrubens!)

+ 13 - 0
CHANGELOG.md

@@ -1,5 +1,18 @@
 # Roo Code Changelog
 
+## [3.29.3] - 2025-10-28
+
+- Update Gemini models with latest 09-2025 versions including Gemini 2.5 Pro and Flash (#8485 by @cleacos, PR by @roomote)
+- Add reasoning support for Z.ai GLM binary thinking mode (#8465 by @BeWater799, PR by @daniel-lxs)
+- Enable reasoning in Roo provider (thanks @mrubens!)
+- Add settings to configure time and cost display in system prompt (#8450 by @jaxnb, PR by @roomote)
+- Fix: Use max_output_tokens when available in LiteLLM fetcher (#8454 by @fabb, PR by @roomote)
+- Fix: Process queued messages after context condensing completes (#8477 by @JosXa, PR by @roomote)
+- Fix: Use monotonic clock for rate limiting to prevent timing issues (#7770 by @intermarkec, PR by @chrarnoldus)
+- Fix: Resolve checkpoint menu popover overflow (thanks @daniel-lxs!)
+- Fix: LiteLLM test failures after merge (thanks @daniel-lxs!)
+- Improve UX: Focus textbox and add newlines after adding to context (thanks @mrubens!)
+
 ## [3.29.2] - 2025-10-27
 
 - Add support for LongCat-Flash-Thinking-FP8 models in Chutes AI provider (#8425 by @leakless21, PR by @roomote)

+ 1 - 1
src/package.json

@@ -3,7 +3,7 @@
 	"displayName": "%extension.displayName%",
 	"description": "%extension.description%",
 	"publisher": "RooVeterinaryInc",
-	"version": "3.29.2",
+	"version": "3.29.3",
 	"icon": "assets/icons/icon.png",
 	"galleryBanner": {
 		"color": "#617A91",
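
This commit follows the standard Changesets release flow: pending `.changeset/*.md` files are consumed, their entries are folded into CHANGELOG.md, and the `version` field in package.json is bumped according to the declared bump type (`patch` in this changeset's frontmatter). A minimal sketch of that flow, assuming the repo uses `@changesets/cli` (the commit message wording here is illustrative):

```shell
# Consume pending changeset files: this deletes .changeset/*.md,
# prepends their entries to CHANGELOG.md under a new version
# heading, and bumps "version" in package.json.
npx changeset version

# Commit the resulting three-file diff, which is what the
# github-actions[bot] authored in this commit.
git add .changeset CHANGELOG.md src/package.json
git commit -m "Changeset version bump"
```

In automated setups this is typically run by a release workflow (e.g. the changesets GitHub Action), which opens or updates a version-bump PR like #8879 rather than committing directly.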