Compare commits

...

21 Commits

Author SHA1 Message Date
SuYao
9414f13f6d
fix: change request body field name (#12430) 2026-01-12 16:38:55 +08:00
flt6
cbeda03acb
use cumsum in anthropic cache (#12419)
* use cumsum in anthropic cache

* fix types and refactor addCache in anthropicCacheMiddleware
2026-01-12 15:49:33 +08:00
George·Dong
cea36d170b
fix(qwen-code): format baseUrl with /v1 for OpenAI-compatible tools (#12418)
The Qwen Code tool was failing with 'Model stream ended without a finish reason'
because the OPENAI_BASE_URL environment variable was not properly formatted.

This fix adds a /v1 suffix to the baseUrl when it is missing for OpenAI-compatible
tools (qwenCode, openaiCodex, iFlowCli); a sketch follows the change list below.

Changes:
- Import formatApiHost from @renderer/utils/api
- Use formatApiHost to format baseUrl before passing to environment variables
- Add unit tests for the URL formatting behavior
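
A minimal sketch of the /v1 normalization, assuming a helper of roughly this shape (the real behavior comes from formatApiHost in @renderer/utils/api; the helper name and exact rules here are assumptions):

```ts
// Hypothetical helper, for illustration only.
function ensureV1Suffix(baseUrl: string): string {
  const trimmed = baseUrl.replace(/\/+$/, '') // drop trailing slashes
  return /\/v\d+$/.test(trimmed) ? trimmed : `${trimmed}/v1`
}

// e.g. the value exported as OPENAI_BASE_URL for qwenCode / openaiCodex / iFlowCli:
// ensureV1Suffix('https://api.example.com') === 'https://api.example.com/v1'
```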
2026-01-12 13:41:27 +08:00
Nicolae Fericitu
d84b84eb2f
i18n: Major improvements to Romanian (ro-RO) localization (#12428)
* fix(i18n): update and refine Romanian translation

I have corrected several typos and refined the terminology in the ro-ro.json file for better linguistic accuracy. This update ensures translation consistency throughout the user interface.

* i18n: Update and fix Romanian localization (ro-RO)

The Romanian localization file has been updated. 

Necessary corrections have been applied to address issues identified during an interface review, ensuring consistent terminology and improved message clarity.

* i18n: Capitalize "Users" label for UI consistency

Updated the "users" key in ro-ro.json to use an uppercase initial. This ensures visual consistency with other menu items in the settings section (User Management).
2026-01-12 10:58:38 +08:00
SuYao
c7c380d706
fix: disable strict JSON schema for OpenRouter to support MCP tools (#12415)
* fix: update dependencies and patch files for strict JSON schema compliance

- Updated `@ai-sdk/openai-compatible` to include version 1.0.30 and adjusted related patch files.
- Removed obsolete patch for `@ai-sdk/openai-compatible@1.0.28`.
- Added new patch for `@openrouter/ai-sdk-provider` to support strict JSON schema options.
- Modified `options.ts` to set `strictJsonSchema` to false for OpenAI models.
- Enhanced OpenAI compatible provider options to include `sendReasoning` and `strictJsonSchema`.
- Updated lockfile to reflect changes in patched dependencies and their hashes.

* fix: filter strictJsonSchema from request body in OpenRouter patch

- Destructure and remove strictJsonSchema from openrouterOptions before spreading into request body
- This prevents sending the internal option to the OpenRouter API
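
As a hedged illustration of that filtering (names simplified; the authoritative change is the @openrouter/ai-sdk-provider patch shown in the file diffs below):

```ts
// Sketch only: strictJsonSchema steers tool-schema strictness locally but must never be
// forwarded to the OpenRouter API, so it is destructured away before the spread.
type OpenRouterOptions = { strictJsonSchema?: boolean; [key: string]: unknown }

function buildRequestBody(baseArgs: Record<string, unknown>, providerOptions?: OpenRouterOptions) {
  const { strictJsonSchema: _strictJsonSchema, ...rest } = providerOptions ?? {}
  return { ...baseArgs, ...rest } // strictJsonSchema is intentionally omitted from the body
}
```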

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-10 21:55:09 +08:00
Copilot
622e3f0db6
feat: Add year to topic timestamp and improve unpin UX (#12408)
* Initial plan

* feat: Add year to topic time display format

Co-authored-by: GeorgeDong32 <98630204+GeorgeDong32@users.noreply.github.com>

* fix: improve topic unpin UX by moving to top of unpinned list

- Unpinned topics now move to the top of the unpinned section
- List automatically scrolls to the unpinned topic position
- Keeps the requirement to unpin before deleting pinned topics

Closes #12398
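
A minimal sketch of the reordering, assuming a simplified topic shape (the real update lives in the topics state):

```ts
// Assumed shape; the real Topic type has more fields.
interface Topic { id: string; pinned: boolean }

// Unpin a topic and place it at the top of the unpinned block, keeping pinned topics first.
function unpinToTop(topics: Topic[], id: string): Topic[] {
  const target = topics.find((t) => t.id === id)
  if (!target?.pinned) return topics
  const rest = topics.filter((t) => t.id !== id)
  return [...rest.filter((t) => t.pinned), { ...target, pinned: false }, ...rest.filter((t) => !t.pinned)]
}
```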

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: GeorgeDong32 <98630204+GeorgeDong32@users.noreply.github.com>
Co-authored-by: George·Dong <GeorgeDong32@qq.com>
2026-01-10 21:37:37 +08:00
fullex
e5a2980da8
fix(logger): allow logging with unknown window source (#12406)
* fix(logger): allow logging with unknown window source

Previously, LoggerService would block all logging when the window source
was not initialized, which could swallow important business errors.
Now it uses 'UNKNOWN' as the window source and continues logging.
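
A minimal sketch of the fallback, with assumed field names (the real change is inside LoggerService):

```ts
// Sketch: entries logged before the window source is initialized are tagged 'UNKNOWN'
// and still forwarded, instead of being dropped.
type LogEntry = { source: string; level: 'info' | 'warn' | 'error'; message: string }

function buildEntry(message: string, level: LogEntry['level'], windowSource?: string): LogEntry {
  return { source: windowSource ?? 'UNKNOWN', level, message }
}
```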

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* chore(logger): update documentation links for LoggerService eslint

Updated the ESLint configuration to point to the correct documentation for the unified LoggerService, ensuring users have access to the latest guides in both English and Chinese.

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-10 15:12:00 +08:00
George·Dong
5b5e190132
feat(models): add Qwen text-embedding models to defaults (#12410)
* feat(models): add Qwen text-embedding models to defaults

Add four Qwen text-embedding models (v4, v3, v2, v1) to the
default model list in the renderer config. Group them under
"Qwen-text-embedding" and set the provider to "dashscope".

This ensures embedding models are available by default for
features that require text embeddings.

* fix(models): normalize qwen embedding group name

Change group value for Dashscope Qwen embedding models from
'Qwen-text-embedding' to 'qwen-text-embedding' in the default models
configuration. This makes the group naming consistent with other Qwen
model groups (lowercase) and avoids potential mismatches or lookup
errors caused by case differences.

* feat(dashscope): add qwen3-rerank model

Add qwen3-rerank model for Alibaba Cloud Bailian rerank API.
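
Illustrative shape of the added defaults; the exact ids and fields below are assumptions based on the commit message, and the real entries live in the renderer's default model config:

```ts
// Sketch: Qwen embedding models grouped under 'qwen-text-embedding' with provider 'dashscope'.
// The qwen3-rerank model mentioned above would be registered alongside these for the
// Alibaba Cloud Bailian rerank API.
const qwenEmbeddingDefaults = [
  { id: 'text-embedding-v4', provider: 'dashscope', group: 'qwen-text-embedding' },
  { id: 'text-embedding-v3', provider: 'dashscope', group: 'qwen-text-embedding' },
  { id: 'text-embedding-v2', provider: 'dashscope', group: 'qwen-text-embedding' },
  { id: 'text-embedding-v1', provider: 'dashscope', group: 'qwen-text-embedding' }
]
```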
2026-01-10 15:11:08 +08:00
kangfenmao
e8e8f028f3 chore(release): v1.7.13
Co-Authored-By: Claude <noreply@anthropic.com>
2026-01-09 21:03:45 +08:00
kangfenmao
8ab082ceb5 feat(i18n): add careers section to AboutSettings and update translations
- Updated package.json to remove redundant i18n:check command.
- Added a "careers" section in the AboutSettings component with a button linking to the careers page.
- Introduced translations for the "careers" section in multiple languages including English, Chinese, German, Spanish, French, Japanese, Portuguese, Romanian, and Russian.
- Updated cache-related translations across various languages to provide localized support.
2026-01-09 20:57:59 +08:00
Phantom
864eda68fb
ci(workflows): fix pnpm installation and improve issue tracker (#12388)
* ci(workflows): add pnpm caching to improve build performance

Cache pnpm dependencies to reduce installation time in CI workflows

* ci(github-actions): change claude-code-action version to v1

* ci(workflows): improve issue tracker formatting

Format issue body as markdown code block for better readability in workflow output

* ci(workflows): unify claude-code-action version to v1 for both jobs

Changed process-pending-issues job from @main to @v1 to maintain consistency across all jobs and avoid potential version conflicts.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(workflows): remove --help from allowed-tools to avoid upstream parsing bug

The `--help` flags in claude_args were being incorrectly parsed as actual
command-line arguments by claude-code-action, causing JSON parsing errors
("Unexpected identifier 'Usage'").

Switched to wildcard pattern `pnpm tsx scripts/feishu-notify.ts*` which:
- Allows all feishu-notify.ts commands including --help
- Avoids triggering the upstream argument parsing bug
- Simplifies the allowed-tools configuration

This addresses the root cause identified by Anthropic engineers.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(workflows): remove --help references from prompts

Remove `--help` references from prompts to avoid potential parsing issues.
The example commands are already comprehensive enough without needing to
mention the help flag.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-09 20:47:39 +08:00
亢奋猫
c5ea42ca3a
fix(security): prevent path traversal vulnerability in DXT plugin system (#12377)
* fix(security): prevent path traversal vulnerability in DXT plugin system

Add input validation to prevent path traversal attacks in DXT plugin handling:

- Add sanitizeName() to filter dangerous characters from manifest.name
- Add validateCommand() to reject commands with path traversal sequences
- Add validateArgs() to validate command arguments
- Remove unsafe fallback logic in cleanupDxtServer()

The vulnerability allowed attackers to write files to arbitrary locations
on Windows by crafting malicious DXT packages with path traversal sequences
(e.g., "..\\..\\Windows\\System32\\") in manifest.name or command fields.

* refactor: use path validation instead of input sanitization

---------

Co-authored-by: defi-failure <159208748+defi-failure@users.noreply.github.com>
2026-01-09 20:47:14 +08:00
Sun
bdf8f103c8
fix(mcp): fix timeout field in MCP config not supporting string type (#12384)
fix(mcp): allow string input for timeout in mcp config
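
A hedged sketch of accepting both forms with zod (the project's actual schema may differ; this coercion is one plausible shape):

```ts
import { z } from 'zod'

// Sketch: accept a number or a numeric string for timeout, normalizing to a number.
const timeoutSchema = z
  .union([z.number(), z.string().regex(/^\d+$/)])
  .transform((value) => (typeof value === 'string' ? Number.parseInt(value, 10) : value))

// timeoutSchema.parse(60)   -> 60
// timeoutSchema.parse('60') -> 60
```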

Co-authored-by: Sun <10309831+x_taiyang@user.noreply.gitee.com>
2026-01-09 17:24:08 +08:00
SuYao
7a7089e315
fix: normalize topics in useAssistant and assistants slice to prevent errors (#12319) 2026-01-09 17:21:20 +08:00
defi-failure
9b8420f9b9
fix: restore patch for claude-agent-sdk (#12391)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 16:26:39 +08:00
SuYao
29d8c4a7ed
fix(aiCore): only apply sendReasoning for openai-compatible SDK providers (#12387)
sendReasoning is a patch specific to the @ai-sdk/openai-compatible package.
Previously it was incorrectly applied to all providers in buildGenericProviderOptions,
including those with dedicated SDK packages (e.g., cerebras, deepseek, openrouter).

Now it only applies when the provider will actually use the openai-compatible SDK (see the sketch after this list):
- No dedicated SDK registered (!hasProviderConfig(providerId))
- OR explicitly openai-compatible (providerId === 'openai-compatible')
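
A minimal sketch of that gating condition; hasProviderConfig is stubbed here purely for illustration:

```ts
// Assumed stand-in for the real registry lookup referenced above.
const providersWithDedicatedSdk = new Set(['cerebras', 'deepseek', 'openrouter'])
const hasProviderConfig = (providerId: string): boolean => providersWithDedicatedSdk.has(providerId)

// sendReasoning only applies when the request will actually go through
// @ai-sdk/openai-compatible: no dedicated SDK, or explicitly openai-compatible.
function shouldApplySendReasoning(providerId: string): boolean {
  return !hasProviderConfig(providerId) || providerId === 'openai-compatible'
}
```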

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 15:00:31 +08:00
Phantom
76cc196667
ci(workflows): add Feishu notification for workflow failures (#12375)
* ci(workflows): add feishu notification for failed sync jobs

Add Feishu webhook notification when sync-to-gitcode workflow fails or is cancelled. The notification includes tag name, status and run URL for quick debugging.

* ci(workflow): add feishu notification for failed or cancelled jobs
2026-01-09 11:35:17 +08:00
SuYao
61aae7376a
fix: add dispose method to prevent abort listener leak (#12269)
* fix: add dispose method to prevent abort listener leak

Add dispose() method to StreamAbortController that explicitly removes
the abort event listener when stream ends normally. Previously, the
listener would only be removed when abort was triggered ({ once: true }),
but if the stream completed normally without abort, the listener would
remain attached until garbage collection.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* chore: format code

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-01-08 17:55:15 +08:00
kangfenmao
74e1d0887d chore: release v1.7.12
Updated version number to 1.7.12 in package.json and electron-builder.yml. Added release notes detailing the introduction of the MCP Hub with Auto mode and new cache control options for the Anthropic provider, along with various bug fixes.
2026-01-08 17:47:05 +08:00
Phantom
2a1722bb52
fix(workflows): add pnpm installing and caching (#12374) 2026-01-08 17:42:59 +08:00
fullex
7ff6955870
fix(SelectionService): add macOS key code support for modifier key detection (#12355) 2026-01-08 17:42:17 +08:00
38 changed files with 1227 additions and 258 deletions

View File

@ -90,3 +90,30 @@ jobs:
- name: 📢 Notify if no changes
if: steps.git_status.outputs.has_changes != 'true'
run: echo "Bot script ran, but no changes were detected. No PR created."
- name: Send failure notification to Feishu
if: always() && (failure() || cancelled())
shell: bash
env:
FEISHU_WEBHOOK_URL: ${{ secrets.FEISHU_WEBHOOK_URL }}
FEISHU_WEBHOOK_SECRET: ${{ secrets.FEISHU_WEBHOOK_SECRET }}
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
JOB_STATUS: ${{ job.status }}
run: |
# Determine status and color
if [ "$JOB_STATUS" = "cancelled" ]; then
STATUS_TEXT="已取消"
COLOR="orange"
else
STATUS_TEXT="失败"
COLOR="red"
fi
# Build description using printf
DESCRIPTION=$(printf "**状态:** %s\n\n**工作流:** [查看详情](%s)" "$STATUS_TEXT" "$RUN_URL")
# Send notification
pnpm tsx scripts/feishu-notify.ts send \
-t "自动国际化${STATUS_TEXT}" \
-d "$DESCRIPTION" \
-c "${COLOR}"

View File

@ -58,18 +58,34 @@ jobs:
with:
node-version: 22
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_OUTPUT
- name: Cache pnpm dependencies
uses: actions/cache@v4
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-
- name: Install dependencies
if: steps.check_time.outputs.should_delay == 'false'
run: pnpm install
- name: Process issue with Claude
if: steps.check_time.outputs.should_delay == 'false'
uses: anthropics/claude-code-action@main
uses: anthropics/claude-code-action@v1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
allowed_non_write_users: "*"
anthropic_api_key: ${{ secrets.CLAUDE_TRANSLATOR_APIKEY }}
claude_args: "--allowed-tools Bash(gh issue:*),Bash(pnpm tsx scripts/feishu-notify.ts issue:*),Bash(pnpm tsx scripts/feishu-notify.ts --help),Bash(pnpm tsx scripts/feishu-notify.ts issue --help)"
claude_args: "--allowed-tools Bash(gh issue:*),Bash(pnpm tsx scripts/feishu-notify.ts*)"
prompt: |
你是一个GitHub Issue自动化处理助手。请完成以下任务
@ -78,9 +94,14 @@ jobs:
- 标题:${{ github.event.issue.title }}
- 作者:${{ github.event.issue.user.login }}
- URL${{ github.event.issue.html_url }}
- 内容:${{ github.event.issue.body }}
- 标签:${{ join(github.event.issue.labels.*.name, ', ') }}
### Issue body
`````md
${{ github.event.issue.body }}
`````
## 任务步骤
1. **分析并总结issue**
@ -90,8 +111,7 @@ jobs:
- 重要的技术细节
2. **发送飞书通知**
使用CLI工具发送飞书通知运行 `pnpm tsx scripts/feishu-notify.ts issue --help` 查看参数说明。
示例:
使用CLI工具发送飞书通知参考以下示例
```bash
pnpm tsx scripts/feishu-notify.ts issue \
-u "${{ github.event.issue.html_url }}" \
@ -130,16 +150,32 @@ jobs:
with:
node-version: 22
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_OUTPUT
- name: Cache pnpm dependencies
uses: actions/cache@v4
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-
- name: Install dependencies
run: pnpm install
- name: Process pending issues with Claude
uses: anthropics/claude-code-action@main
uses: anthropics/claude-code-action@v1
with:
anthropic_api_key: ${{ secrets.CLAUDE_TRANSLATOR_APIKEY }}
allowed_non_write_users: "*"
github_token: ${{ secrets.GITHUB_TOKEN }}
claude_args: "--allowed-tools Bash(gh issue:*),Bash(gh api:*),Bash(pnpm tsx scripts/feishu-notify.ts issue:*),Bash(pnpm tsx scripts/feishu-notify.ts --help),Bash(pnpm tsx scripts/feishu-notify.ts issue --help)"
claude_args: "--allowed-tools Bash(gh issue:*),Bash(gh api:*),Bash(pnpm tsx scripts/feishu-notify.ts*)"
prompt: |
你是一个GitHub Issue自动化处理助手。请完成以下任务
@ -161,8 +197,7 @@ jobs:
- 重要的技术细节
3. **发送飞书通知**
使用CLI工具发送飞书通知运行 `pnpm tsx scripts/feishu-notify.ts issue --help` 查看参数说明。
示例:
使用CLI工具发送飞书通知参考以下示例
```bash
pnpm tsx scripts/feishu-notify.ts issue \
-u "<issue的html_url>" \

View File

@ -300,3 +300,31 @@ jobs:
run: |
rm -f /tmp/release_payload.json /tmp/upload_headers.txt release_body.txt
rm -rf release-assets/
- name: Send failure notification to Feishu
if: always() && (failure() || cancelled())
shell: bash
env:
FEISHU_WEBHOOK_URL: ${{ secrets.FEISHU_WEBHOOK_URL }}
FEISHU_WEBHOOK_SECRET: ${{ secrets.FEISHU_WEBHOOK_SECRET }}
TAG_NAME: ${{ steps.get-tag.outputs.tag }}
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
JOB_STATUS: ${{ job.status }}
run: |
# Determine status and color
if [ "$JOB_STATUS" = "cancelled" ]; then
STATUS_TEXT="已取消"
COLOR="orange"
else
STATUS_TEXT="失败"
COLOR="red"
fi
# Build description using printf
DESCRIPTION=$(printf "**标签:** %s\n\n**状态:** %s\n\n**工作流:** [查看详情](%s)" "$TAG_NAME" "$STATUS_TEXT" "$RUN_URL")
# Send notification
pnpm tsx scripts/feishu-notify.ts send \
-t "GitCode 同步${STATUS_TEXT}" \
-d "$DESCRIPTION" \
-c "${COLOR}"

View File

@ -143,34 +143,30 @@ artifactBuildCompleted: scripts/artifact-build-completed.js
releaseInfo:
releaseNotes: |
<!--LANG:en-->
Cherry Studio 1.7.11 - New Features & Bug Fixes
Cherry Studio 1.7.13 - Security & Bug Fixes
✨ New Features
- [MCP] Add MCP Hub with Auto mode for intelligent multi-server tool orchestration
🔒 Security
- [Plugin] Fix security vulnerability in DXT plugin system on Windows
🐛 Bug Fixes
- [Chat] Fix reasoning process not displaying correctly for some proxy models
- [Chat] Fix duplicate loading spinners on action buttons
- [Editor] Fix paragraph handle and plus button not clickable
- [Drawing] Fix TokenFlux models not showing in drawing panel
- [Translate] Fix translation stalling after initialization
- [Error] Fix app freeze when viewing error details with large images
- [Notes] Fix folder overlay blocking webview preview
- [Chat] Fix thinking time display when stopping generation
- [Agent] Fix Agent not working when Node.js is not installed on system
- [Chat] Fix app crash when opening certain agents
- [Chat] Fix reasoning process not displaying correctly for some providers
- [Chat] Fix memory leak issue during streaming conversations
- [MCP] Fix timeout field not accepting string format in MCP configuration
- [Settings] Add careers section in About page
<!--LANG:zh-CN-->
Cherry Studio 1.7.11 - 新功能与问题修复
Cherry Studio 1.7.13 - 安全与问题修复
✨ 新功能
- [MCP] 新增 MCP Hub 智能模式,可自动管理和调用多个 MCP 服务器工具
🔒 安全修复
- [插件] 修复 Windows 系统 DXT 插件的安全漏洞
🐛 问题修复
- [对话] 修复部分代理模型的推理过程无法正确显示的问题
- [对话] 修复操作按钮重复显示加载状态的问题
- [编辑器] 修复段落手柄和加号按钮无法点击的问题
- [绘图] 修复 TokenFlux 模型在绘图面板不显示的问题
- [翻译] 修复翻译功能初始化后卡住的问题
- [错误] 修复查看包含大图片的错误详情时应用卡死的问题
- [笔记] 修复文件夹遮挡网页预览的问题
- [对话] 修复停止生成时思考时间显示问题
- [Agent] 修复系统未安装 Node.js 时 Agent 功能无法使用的问题
- [对话] 修复打开某些智能体时应用崩溃的问题
- [对话] 修复部分服务商推理过程无法正确显示的问题
- [对话] 修复流式对话时的内存泄漏问题
- [MCP] 修复 MCP 配置的 timeout 字段不支持字符串格式的问题
- [设置] 关于页面新增招聘入口
<!--LANG:END-->

View File

@ -84,7 +84,7 @@ export default defineConfig([
{
selector: 'CallExpression[callee.object.name="console"]',
message:
'❗CherryStudio uses unified LoggerService: 📖 docs/technical/how-to-use-logger-en.md\n❗CherryStudio 使用统一的日志服务:📖 docs/technical/how-to-use-logger-zh.md\n\n'
'❗CherryStudio uses unified LoggerService: 📖 docs/en/guides/logging.md\n❗CherryStudio 使用统一的日志服务:📖 docs/zh/guides/logging.md\n\n'
}
]
}

View File

@ -1,6 +1,6 @@
{
"name": "CherryStudio",
"version": "1.7.11",
"version": "1.7.13",
"private": true,
"description": "A powerful AI assistant for producer.",
"main": "./out/main/index.js",
@ -42,7 +42,7 @@
"i18n:check": "dotenv -e .env -- tsx scripts/check-i18n.ts",
"i18n:sync": "dotenv -e .env -- tsx scripts/sync-i18n.ts",
"i18n:translate": "dotenv -e .env -- tsx scripts/auto-translate-i18n.ts",
"i18n:all": "pnpm i18n:check && pnpm i18n:sync && pnpm i18n:translate",
"i18n:all": "pnpm i18n:sync && pnpm i18n:translate",
"update:languages": "tsx scripts/update-languages.ts",
"update:upgrade-config": "tsx scripts/update-app-upgrade-config.ts",
"test": "vitest run --silent",
@ -433,7 +433,8 @@
"@img/sharp-linux-x64": "0.34.3",
"@img/sharp-win32-x64": "0.34.3",
"@langchain/core": "1.0.2",
"@ai-sdk/openai-compatible@1.0.27": "1.0.28"
"@ai-sdk/openai-compatible@1.0.27": "1.0.28",
"@ai-sdk/openai-compatible@1.0.30": "1.0.28"
},
"patchedDependencies": {
"@napi-rs/system-ocr@1.0.2": "patches/@napi-rs-system-ocr-npm-1.0.2-59e7a78e8b.patch",
@ -453,7 +454,9 @@
"file-stream-rotator@0.6.1": "patches/file-stream-rotator-npm-0.6.1-eab45fb13d.patch",
"libsql@0.4.7": "patches/libsql-npm-0.4.7-444e260fb1.patch",
"pdf-parse@1.1.1": "patches/pdf-parse-npm-1.1.1-04a6109b2a.patch",
"@ai-sdk/openai-compatible@1.0.28": "patches/@ai-sdk-openai-compatible-npm-1.0.28-5705188855.patch"
"@ai-sdk/openai-compatible@1.0.28": "patches/@ai-sdk__openai-compatible@1.0.28.patch",
"@anthropic-ai/claude-agent-sdk@0.1.76": "patches/@anthropic-ai__claude-agent-sdk@0.1.76.patch",
"@openrouter/ai-sdk-provider": "patches/@openrouter__ai-sdk-provider.patch"
},
"onlyBuiltDependencies": [
"@kangfenmao/keyv-storage",

View File

@ -9,9 +9,9 @@ index 48e2f6263c6ee4c75d7e5c28733e64f6ebe92200..00d0729c4a3cbf9a48e8e1e962c7e2b2
+ sendReasoning: z.ZodOptional<z.ZodBoolean>;
}, z.core.$strip>;
type OpenAICompatibleProviderOptions = z.infer<typeof openaiCompatibleProviderOptions>;
diff --git a/dist/index.js b/dist/index.js
index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e5bfe0f9a 100644
index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..88349c614a69a268a2e4f3b157cb5e328ca1d347 100644
--- a/dist/index.js
+++ b/dist/index.js
@@ -41,7 +41,7 @@ function getOpenAIMetadata(message) {
@ -52,17 +52,38 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
tool_calls: toolCalls.length > 0 ? toolCalls : void 0,
...metadata
});
@@ -200,7 +208,8 @@ var openaiCompatibleProviderOptions = import_v4.z.object({
@@ -200,7 +208,9 @@ var openaiCompatibleProviderOptions = import_v4.z.object({
/**
* Controls the verbosity of the generated text. Defaults to `medium`.
*/
- textVerbosity: import_v4.z.string().optional()
+ textVerbosity: import_v4.z.string().optional(),
+ sendReasoning: import_v4.z.boolean().optional()
+ sendReasoning: import_v4.z.boolean().optional(),
+ strictJsonSchema: z.boolean().optional()
});
// src/openai-compatible-error.ts
@@ -378,7 +387,7 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -225,7 +235,8 @@ var defaultOpenAICompatibleErrorStructure = {
var import_provider2 = require("@ai-sdk/provider");
function prepareTools({
tools,
- toolChoice
+ toolChoice,
+ strictJsonSchema
}) {
tools = (tools == null ? void 0 : tools.length) ? tools : void 0;
const toolWarnings = [];
@@ -242,7 +253,8 @@ function prepareTools({
function: {
name: tool.name,
description: tool.description,
- parameters: tool.inputSchema
+ parameters: tool.inputSchema,
+ strict: strictJsonSchema
}
});
}
@@ -378,7 +390,7 @@ var OpenAICompatibleChatLanguageModel = class {
reasoning_effort: compatibleOptions.reasoningEffort,
verbosity: compatibleOptions.textVerbosity,
// messages:
@ -71,7 +92,7 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
// tools:
tools: openaiTools,
tool_choice: openaiToolChoice
@@ -421,6 +430,17 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -421,6 +433,17 @@ var OpenAICompatibleChatLanguageModel = class {
text: reasoning
});
}
@ -89,7 +110,7 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
if (choice.message.tool_calls != null) {
for (const toolCall of choice.message.tool_calls) {
content.push({
@@ -598,6 +618,17 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -598,6 +621,17 @@ var OpenAICompatibleChatLanguageModel = class {
delta: delta.content
});
}
@ -107,7 +128,7 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
if (delta.tool_calls != null) {
for (const toolCallDelta of delta.tool_calls) {
const index = toolCallDelta.index;
@@ -765,6 +796,14 @@ var OpenAICompatibleChatResponseSchema = import_v43.z.object({
@@ -765,6 +799,14 @@ var OpenAICompatibleChatResponseSchema = import_v43.z.object({
arguments: import_v43.z.string()
})
})
@ -122,7 +143,7 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
).nullish()
}),
finish_reason: import_v43.z.string().nullish()
@@ -795,6 +834,14 @@ var createOpenAICompatibleChatChunkSchema = (errorSchema) => import_v43.z.union(
@@ -795,6 +837,14 @@ var createOpenAICompatibleChatChunkSchema = (errorSchema) => import_v43.z.union(
arguments: import_v43.z.string().nullish()
})
})
@ -138,7 +159,7 @@ index da237bb35b7fa8e24b37cd861ee73dfc51cdfc72..b3060fbaf010e30b64df55302807828e
}).nullish(),
finish_reason: import_v43.z.string().nullish()
diff --git a/dist/index.mjs b/dist/index.mjs
index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5700264de 100644
index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..fca65c04000ce4c01fb90e93326ac179c2378055 100644
--- a/dist/index.mjs
+++ b/dist/index.mjs
@@ -23,7 +23,7 @@ function getOpenAIMetadata(message) {
@ -179,17 +200,38 @@ index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5
tool_calls: toolCalls.length > 0 ? toolCalls : void 0,
...metadata
});
@@ -182,7 +190,8 @@ var openaiCompatibleProviderOptions = z.object({
@@ -182,7 +190,9 @@ var openaiCompatibleProviderOptions = z.object({
/**
* Controls the verbosity of the generated text. Defaults to `medium`.
*/
- textVerbosity: z.string().optional()
+ textVerbosity: z.string().optional(),
+ sendReasoning: z.boolean().optional()
+ sendReasoning: z.boolean().optional(),
+ strictJsonSchema: z.boolean().optional()
});
// src/openai-compatible-error.ts
@@ -362,7 +371,7 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -209,7 +219,8 @@ import {
} from "@ai-sdk/provider";
function prepareTools({
tools,
- toolChoice
+ toolChoice,
+ strictJsonSchema
}) {
tools = (tools == null ? void 0 : tools.length) ? tools : void 0;
const toolWarnings = [];
@@ -226,7 +237,8 @@ function prepareTools({
function: {
name: tool.name,
description: tool.description,
- parameters: tool.inputSchema
+ parameters: tool.inputSchema,
+ strict: strictJsonSchema
}
});
}
@@ -362,7 +374,7 @@ var OpenAICompatibleChatLanguageModel = class {
reasoning_effort: compatibleOptions.reasoningEffort,
verbosity: compatibleOptions.textVerbosity,
// messages:
@ -198,7 +240,7 @@ index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5
// tools:
tools: openaiTools,
tool_choice: openaiToolChoice
@@ -405,6 +414,17 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -405,6 +417,17 @@ var OpenAICompatibleChatLanguageModel = class {
text: reasoning
});
}
@ -216,7 +258,7 @@ index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5
if (choice.message.tool_calls != null) {
for (const toolCall of choice.message.tool_calls) {
content.push({
@@ -582,6 +602,17 @@ var OpenAICompatibleChatLanguageModel = class {
@@ -582,6 +605,17 @@ var OpenAICompatibleChatLanguageModel = class {
delta: delta.content
});
}
@ -234,7 +276,7 @@ index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5
if (delta.tool_calls != null) {
for (const toolCallDelta of delta.tool_calls) {
const index = toolCallDelta.index;
@@ -749,6 +780,14 @@ var OpenAICompatibleChatResponseSchema = z3.object({
@@ -749,6 +783,14 @@ var OpenAICompatibleChatResponseSchema = z3.object({
arguments: z3.string()
})
})
@ -249,7 +291,7 @@ index a809a7aa0e148bfd43e01dd7b018568b151c8ad5..565b605eeacd9830b2b0e817e58ad0c5
).nullish()
}),
finish_reason: z3.string().nullish()
@@ -779,6 +818,14 @@ var createOpenAICompatibleChatChunkSchema = (errorSchema) => z3.union([
@@ -779,6 +821,14 @@ var createOpenAICompatibleChatChunkSchema = (errorSchema) => z3.union([
arguments: z3.string().nullish()
})
})

View File

@ -0,0 +1,33 @@
diff --git a/sdk.mjs b/sdk.mjs
index 1e1c3e4e3f81db622fb2789d17f3d421f212306e..5d193cdb6a43c7799fd5eff2d8af80827bfbdf1e 100755
--- a/sdk.mjs
+++ b/sdk.mjs
@@ -11985,7 +11985,7 @@ function createAbortController(maxListeners = DEFAULT_MAX_LISTENERS) {
}
// ../src/transport/ProcessTransport.ts
-import { spawn } from "child_process";
+import { fork } from "child_process";
import { createInterface } from "readline";
// ../src/utils/fsOperations.ts
@@ -12999,14 +12999,14 @@ class ProcessTransport {
return isRunningWithBun() ? "bun" : "node";
}
spawnLocalProcess(spawnOptions) {
- const { command, args, cwd: cwd2, env, signal } = spawnOptions;
+ const { args, cwd: cwd2, env, signal } = spawnOptions;
const stderrMode = env.DEBUG_CLAUDE_AGENT_SDK || this.options.stderr ? "pipe" : "ignore";
- const childProcess = spawn(command, args, {
+ logForSdkDebugging(`Forking Claude Code Node.js process: ${args[0]} ${args.slice(1).join(" ")}`);
+ const childProcess = fork(args[0], args.slice(1), {
cwd: cwd2,
- stdio: ["pipe", "pipe", stderrMode],
+ stdio: stderrMode === "pipe" ? ["pipe", "pipe", "pipe", "ipc"] : ["pipe", "pipe", "ignore", "ipc"],
signal,
- env,
- windowsHide: true
+ env
});
if (env.DEBUG_CLAUDE_AGENT_SDK || this.options.stderr) {
childProcess.stderr.on("data", (data) => {

View File

@ -0,0 +1,140 @@
diff --git a/dist/index.js b/dist/index.js
index f33510a50d11a2cb92a90ea70cc0ac84c89f29b9..db0af7e2cc05c47baeb29c0a3974a155316fbd05 100644
--- a/dist/index.js
+++ b/dist/index.js
@@ -1050,7 +1050,8 @@ var OpenRouterProviderMetadataSchema = import_v43.z.object({
var OpenRouterProviderOptionsSchema = import_v43.z.object({
openrouter: import_v43.z.object({
reasoning_details: import_v43.z.array(ReasoningDetailUnionSchema).optional(),
- annotations: import_v43.z.array(FileAnnotationSchema).optional()
+ annotations: import_v43.z.array(FileAnnotationSchema).optional(),
+ strictJsonSchema: import_v43.z.boolean().optional()
}).optional()
}).optional();
@@ -1658,7 +1659,8 @@ var OpenRouterChatLanguageModel = class {
responseFormat,
topK,
tools,
- toolChoice
+ toolChoice,
+ providerOptions
}) {
var _a15;
const baseArgs = __spreadValues(__spreadValues({
@@ -1712,7 +1714,8 @@ var OpenRouterChatLanguageModel = class {
function: {
name: tool.name,
description: tool.description,
- parameters: tool.inputSchema
+ parameters: tool.inputSchema,
+ strict: providerOptions?.openrouter?.strictJsonSchema
}
}));
return __spreadProps(__spreadValues({}, baseArgs), {
@@ -1725,7 +1728,7 @@ var OpenRouterChatLanguageModel = class {
async doGenerate(options) {
var _a15, _b, _c, _d, _e, _f, _g, _h, _i, _j, _k, _l, _m, _n, _o, _p, _q, _r, _s, _t, _u, _v, _w;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: responseValue, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -1931,7 +1934,7 @@ var OpenRouterChatLanguageModel = class {
async doStream(options) {
var _a15;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -2564,7 +2567,7 @@ var OpenRouterCompletionLanguageModel = class {
async doGenerate(options) {
var _a15, _b, _c, _d, _e, _f, _g, _h, _i, _j, _k, _l, _m, _n, _o;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -2623,7 +2626,7 @@ var OpenRouterCompletionLanguageModel = class {
}
async doStream(options) {
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({
diff --git a/dist/index.mjs b/dist/index.mjs
index 8a688331b88b4af738ee4ca8062b5f24124d3d81..a2aa299a44352addc26f8891d839ea31a2150ee2 100644
--- a/dist/index.mjs
+++ b/dist/index.mjs
@@ -1015,7 +1015,8 @@ var OpenRouterProviderMetadataSchema = z3.object({
var OpenRouterProviderOptionsSchema = z3.object({
openrouter: z3.object({
reasoning_details: z3.array(ReasoningDetailUnionSchema).optional(),
- annotations: z3.array(FileAnnotationSchema).optional()
+ annotations: z3.array(FileAnnotationSchema).optional(),
+ strictJsonSchema: z3.boolean().optional()
}).optional()
}).optional();
@@ -1623,7 +1624,8 @@ var OpenRouterChatLanguageModel = class {
responseFormat,
topK,
tools,
- toolChoice
+ toolChoice,
+ providerOptions
}) {
var _a15;
const baseArgs = __spreadValues(__spreadValues({
@@ -1677,7 +1679,8 @@ var OpenRouterChatLanguageModel = class {
function: {
name: tool.name,
description: tool.description,
- parameters: tool.inputSchema
+ parameters: tool.inputSchema,
+ strict: providerOptions?.openrouter?.strictJsonSchema
}
}));
return __spreadProps(__spreadValues({}, baseArgs), {
@@ -1690,7 +1693,7 @@ var OpenRouterChatLanguageModel = class {
async doGenerate(options) {
var _a15, _b, _c, _d, _e, _f, _g, _h, _i, _j, _k, _l, _m, _n, _o, _p, _q, _r, _s, _t, _u, _v, _w;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: responseValue, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -1896,7 +1899,7 @@ var OpenRouterChatLanguageModel = class {
async doStream(options) {
var _a15;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -2529,7 +2532,7 @@ var OpenRouterCompletionLanguageModel = class {
async doGenerate(options) {
var _a15, _b, _c, _d, _e, _f, _g, _h, _i, _j, _k, _l, _m, _n, _o;
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({
@@ -2588,7 +2591,7 @@ var OpenRouterCompletionLanguageModel = class {
}
async doStream(options) {
const providerOptions = options.providerOptions || {};
- const openrouterOptions = providerOptions.openrouter || {};
+ const { strictJsonSchema: _strictJsonSchema, ...openrouterOptions } = providerOptions.openrouter || {};
const args = __spreadValues(__spreadValues({}, this.getArgs(options)), openrouterOptions);
const { value: response, responseHeaders } = await postJsonToApi({
url: this.config.url({

pnpm-lock.yaml generated
View File

@ -23,17 +23,21 @@ overrides:
'@img/sharp-win32-x64': 0.34.3
'@langchain/core': 1.0.2
'@ai-sdk/openai-compatible@1.0.27': 1.0.28
'@ai-sdk/openai-compatible@1.0.30': 1.0.28
patchedDependencies:
'@ai-sdk/google@2.0.49':
hash: 279e9d43f675e4b979b32b78954dd37acc3026aa36ae2dd7701b5bad2f061522
path: patches/@ai-sdk-google-npm-2.0.49-84720f41bd.patch
'@ai-sdk/openai-compatible@1.0.28':
hash: 66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1
path: patches/@ai-sdk-openai-compatible-npm-1.0.28-5705188855.patch
hash: 5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda
path: patches/@ai-sdk__openai-compatible@1.0.28.patch
'@ai-sdk/openai@2.0.85':
hash: f2077f4759520d1de69b164dfd8adca1a9ace9de667e35cb0e55e812ce2ac13b
path: patches/@ai-sdk-openai-npm-2.0.85-27483d1d6a.patch
'@anthropic-ai/claude-agent-sdk@0.1.76':
hash: e063a8ede82d78f452f7f1290b9d4d9323866159b5624679163caa8edd4928d5
path: patches/@anthropic-ai__claude-agent-sdk@0.1.76.patch
'@anthropic-ai/vertex-sdk@0.11.4':
hash: 12e3275df5632dfe717d4db64df70e9b0128dfac86195da27722effe4749662f
path: patches/@anthropic-ai-vertex-sdk-npm-0.11.4-c19cb41edb.patch
@ -49,6 +53,9 @@ patchedDependencies:
'@napi-rs/system-ocr@1.0.2':
hash: aa1a73e445ee644774745b620589bb99d85bee6c95cc2a91fe9137e580da5bde
path: patches/@napi-rs-system-ocr-npm-1.0.2-59e7a78e8b.patch
'@openrouter/ai-sdk-provider':
hash: 508e8e662b8547de93410cb7c3b1336077f34c6bf79c520ef5273962ea777c52
path: patches/@openrouter__ai-sdk-provider.patch
'@tiptap/extension-drag-handle@3.2.0':
hash: 8432665d4553fb9ba8ff2a126a9181c3ccfee06ae57688aa14f65aa560e52fce
path: patches/@tiptap-extension-drag-handle-npm-3.2.0-5a9ebff7c9.patch
@ -86,7 +93,7 @@ importers:
dependencies:
'@anthropic-ai/claude-agent-sdk':
specifier: 0.1.76
version: 0.1.76(zod@4.3.4)
version: 0.1.76(patch_hash=e063a8ede82d78f452f7f1290b9d4d9323866159b5624679163caa8edd4928d5)(zod@4.3.4)
'@libsql/client':
specifier: 0.14.0
version: 0.14.0
@ -354,7 +361,7 @@ importers:
version: 2.3.0(encoding@0.1.13)
'@openrouter/ai-sdk-provider':
specifier: ^1.2.8
version: 1.5.4(ai@5.0.117(zod@4.3.4))(zod@4.3.4)
version: 1.5.4(patch_hash=508e8e662b8547de93410cb7c3b1336077f34c6bf79c520ef5273962ea777c52)(ai@5.0.117(zod@4.3.4))(zod@4.3.4)
'@opentelemetry/api':
specifier: ^1.9.0
version: 1.9.0
@ -1192,7 +1199,7 @@ importers:
version: 2.0.85(patch_hash=f2077f4759520d1de69b164dfd8adca1a9ace9de667e35cb0e55e812ce2ac13b)(zod@4.3.5)
'@ai-sdk/openai-compatible':
specifier: 1.0.28
version: 1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.5)
version: 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.5)
'@ai-sdk/provider':
specifier: ^2.0.0
version: 2.0.1
@ -1232,7 +1239,7 @@ importers:
version: 2.0.85(patch_hash=f2077f4759520d1de69b164dfd8adca1a9ace9de667e35cb0e55e812ce2ac13b)(zod@4.3.4)
'@ai-sdk/openai-compatible':
specifier: 1.0.28
version: 1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.4)
version: 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider':
specifier: ^2.0.0
version: 2.0.1
@ -1395,12 +1402,6 @@ packages:
peerDependencies:
zod: ^3.25.76 || ^4.1.8
'@ai-sdk/openai-compatible@1.0.30':
resolution: {integrity: sha512-thubwhRtv9uicAxSWwNpinM7hiL/0CkhL/ymPaHuKvI494J7HIzn8KQZQ2ymRz284WTIZnI7VMyyejxW4RMM6w==}
engines: {node: '>=18'}
peerDependencies:
zod: ^3.25.76 || ^4.1.8
'@ai-sdk/openai@2.0.85':
resolution: {integrity: sha512-3pzr7qVhsOXwjPAfmvFNZz3sRWCuyMOc3GgLHe7sWY0t8J4hA5mwQ4LISTKYI3iIr8IXzAQn9MUrC8Hiji9RpA==}
engines: {node: '>=18'}
@ -12317,7 +12318,7 @@ snapshots:
'@ai-sdk/cerebras@1.0.34(zod@4.3.4)':
dependencies:
'@ai-sdk/openai-compatible': 1.0.30(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.20(zod@4.3.4)
zod: 4.3.4
@ -12373,7 +12374,7 @@ snapshots:
'@ai-sdk/huggingface@0.0.10(zod@4.3.4)':
dependencies:
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider': 2.0.0
'@ai-sdk/provider-utils': 3.0.17(zod@4.3.4)
zod: 4.3.4
@ -12384,24 +12385,18 @@ snapshots:
'@ai-sdk/provider-utils': 3.0.20(zod@4.3.4)
zod: 4.3.4
'@ai-sdk/openai-compatible@1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.4)':
'@ai-sdk/openai-compatible@1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)':
dependencies:
'@ai-sdk/provider': 2.0.0
'@ai-sdk/provider-utils': 3.0.18(zod@4.3.4)
zod: 4.3.4
'@ai-sdk/openai-compatible@1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.5)':
'@ai-sdk/openai-compatible@1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.5)':
dependencies:
'@ai-sdk/provider': 2.0.0
'@ai-sdk/provider-utils': 3.0.18(zod@4.3.5)
zod: 4.3.5
'@ai-sdk/openai-compatible@1.0.30(zod@4.3.4)':
dependencies:
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.20(zod@4.3.4)
zod: 4.3.4
'@ai-sdk/openai@2.0.85(patch_hash=f2077f4759520d1de69b164dfd8adca1a9ace9de667e35cb0e55e812ce2ac13b)(zod@4.3.4)':
dependencies:
'@ai-sdk/provider': 2.0.0
@ -12496,14 +12491,14 @@ snapshots:
'@ai-sdk/xai@2.0.36(zod@4.3.4)':
dependencies:
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider': 2.0.0
'@ai-sdk/provider-utils': 3.0.17(zod@4.3.4)
zod: 4.3.4
'@ai-sdk/xai@2.0.43(zod@4.3.4)':
dependencies:
'@ai-sdk/openai-compatible': 1.0.30(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.20(zod@4.3.4)
zod: 4.3.4
@ -12573,7 +12568,7 @@ snapshots:
package-manager-detector: 1.6.0
tinyexec: 1.0.2
'@anthropic-ai/claude-agent-sdk@0.1.76(zod@4.3.4)':
'@anthropic-ai/claude-agent-sdk@0.1.76(patch_hash=e063a8ede82d78f452f7f1290b9d4d9323866159b5624679163caa8edd4928d5)(zod@4.3.4)':
dependencies:
zod: 4.3.4
optionalDependencies:
@ -15156,7 +15151,7 @@ snapshots:
'@open-draft/until@2.1.0': {}
'@openrouter/ai-sdk-provider@1.5.4(ai@5.0.117(zod@4.3.4))(zod@4.3.4)':
'@openrouter/ai-sdk-provider@1.5.4(patch_hash=508e8e662b8547de93410cb7c3b1336077f34c6bf79c520ef5273962ea777c52)(ai@5.0.117(zod@4.3.4))(zod@4.3.4)':
dependencies:
'@openrouter/sdk': 0.1.27
ai: 5.0.117(zod@4.3.4)
@ -15273,7 +15268,7 @@ snapshots:
'@opeoginni/github-copilot-openai-compatible@0.1.22(zod@4.3.4)':
dependencies:
'@ai-sdk/openai': 2.0.85(patch_hash=f2077f4759520d1de69b164dfd8adca1a9ace9de667e35cb0e55e812ce2ac13b)(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=66f6605ef3f852d8f2b638a1d64b138eb8b2ad34ca6f331a0496c1d1379379c1)(zod@4.3.4)
'@ai-sdk/openai-compatible': 1.0.28(patch_hash=5ea49b4f07636a8e4630097e67e2787779ba7e933bd0459f81b1803cb125edda)(zod@4.3.4)
'@ai-sdk/provider': 2.1.0-beta.5
'@ai-sdk/provider-utils': 3.0.20(zod@4.3.4)
transitivePeerDependencies:

View File

@ -1,6 +1,10 @@
import { loggerService } from '@logger'
import { MESSAGE_STREAM_TIMEOUT_MS } from '@main/apiServer/config/timeouts'
import { createStreamAbortController, STREAM_TIMEOUT_REASON } from '@main/apiServer/utils/createStreamAbortController'
import {
createStreamAbortController,
STREAM_TIMEOUT_REASON,
type StreamAbortController
} from '@main/apiServer/utils/createStreamAbortController'
import { agentService, sessionMessageService, sessionService } from '@main/services/agents'
import type { Request, Response } from 'express'
@ -26,7 +30,7 @@ const verifyAgentAndSession = async (agentId: string, sessionId: string) => {
}
export const createMessage = async (req: Request, res: Response): Promise<void> => {
let clearAbortTimeout: (() => void) | undefined
let streamController: StreamAbortController | undefined
try {
const { agentId, sessionId } = req.params
@ -45,14 +49,10 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
res.setHeader('Access-Control-Allow-Origin', '*')
res.setHeader('Access-Control-Allow-Headers', 'Cache-Control')
const {
abortController,
registerAbortHandler,
clearAbortTimeout: helperClearAbortTimeout
} = createStreamAbortController({
streamController = createStreamAbortController({
timeoutMs: MESSAGE_STREAM_TIMEOUT_MS
})
clearAbortTimeout = helperClearAbortTimeout
const { abortController, registerAbortHandler, dispose } = streamController
const { stream, completion } = await sessionMessageService.createSessionMessage(
session,
messageData,
@ -64,8 +64,8 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
let responseEnded = false
let streamFinished = false
const cleanupAbortTimeout = () => {
clearAbortTimeout?.()
const cleanup = () => {
dispose()
}
const finalizeResponse = () => {
@ -78,7 +78,7 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
}
responseEnded = true
cleanupAbortTimeout()
cleanup()
try {
// res.write('data: {"type":"finish"}\n\n')
res.write('data: [DONE]\n\n')
@ -108,7 +108,7 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
* - Mark the response as ended to prevent further writes
*/
registerAbortHandler((abortReason) => {
cleanupAbortTimeout()
cleanup()
if (responseEnded) return
@ -189,7 +189,7 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
logger.error('Error writing stream error to SSE', { error: writeError })
}
responseEnded = true
cleanupAbortTimeout()
cleanup()
res.end()
}
}
@ -221,14 +221,14 @@ export const createMessage = async (req: Request, res: Response): Promise<void>
logger.error('Error writing completion error to SSE stream', { error: writeError })
}
responseEnded = true
cleanupAbortTimeout()
cleanup()
res.end()
})
// Clear timeout when response ends
res.on('close', cleanupAbortTimeout)
res.on('finish', cleanupAbortTimeout)
res.on('close', cleanup)
res.on('finish', cleanup)
} catch (error: any) {
clearAbortTimeout?.()
streamController?.dispose()
logger.error('Error in streaming message handler', {
error,
agentId: req.params.agentId,

View File

@ -4,6 +4,7 @@ export interface StreamAbortController {
abortController: AbortController
registerAbortHandler: (handler: StreamAbortHandler) => void
clearAbortTimeout: () => void
dispose: () => void
}
export const STREAM_TIMEOUT_REASON = 'stream timeout'
@ -40,6 +41,15 @@ export const createStreamAbortController = (options: CreateStreamAbortController
signal.addEventListener('abort', handleAbort, { once: true })
let disposed = false
const dispose = () => {
if (disposed) return
disposed = true
clearAbortTimeout()
signal.removeEventListener('abort', handleAbort)
}
const registerAbortHandler = (handler: StreamAbortHandler) => {
abortHandler = handler
@ -59,6 +69,7 @@ export const createStreamAbortController = (options: CreateStreamAbortController
return {
abortController,
registerAbortHandler,
clearAbortTimeout
clearAbortTimeout,
dispose
}
}

View File

@ -8,6 +8,27 @@ import { v4 as uuidv4 } from 'uuid'
const logger = loggerService.withContext('DxtService')
/**
* Ensure a target path is within the base directory to prevent path traversal attacks.
* This is the correct approach: validate the final resolved path rather than sanitizing input.
*
* @param basePath - The base directory that the target must be within
* @param targetPath - The target path to validate
* @returns The resolved target path if valid
* @throws Error if the target path escapes the base directory
*/
export function ensurePathWithin(basePath: string, targetPath: string): string {
const resolvedBase = path.resolve(basePath)
const resolvedTarget = path.resolve(path.normalize(targetPath))
// Must be direct child of base directory, no subdirectories allowed
if (path.dirname(resolvedTarget) !== resolvedBase) {
throw new Error('Path traversal detected: target path must be direct child of base directory')
}
return resolvedTarget
}
// Type definitions
export interface DxtManifest {
dxt_version: string
@ -68,6 +89,76 @@ export interface DxtUploadResult {
error?: string
}
/**
* Validate and sanitize a command to prevent path traversal attacks.
* Commands should be either:
* 1. Simple command names (e.g., "node", "python", "npx") - looked up in PATH
* 2. Absolute paths (e.g., "/usr/bin/node", "C:\\Program Files\\node\\node.exe")
* 3. Relative paths starting with ./ or .\ (relative to extractDir)
*
* Rejects commands containing path traversal sequences (..)
*
* @param command - The command to validate
* @returns The validated command
* @throws Error if command contains path traversal or is invalid
*/
export function validateCommand(command: string): string {
if (!command || typeof command !== 'string') {
throw new Error('Invalid command: command must be a non-empty string')
}
const trimmed = command.trim()
if (!trimmed) {
throw new Error('Invalid command: command cannot be empty')
}
// Check for path traversal sequences
// This catches: .., ../, ..\, /../, \..\, etc.
if (/(?:^|[/\\])\.\.(?:[/\\]|$)/.test(trimmed) || trimmed === '..') {
throw new Error(`Invalid command: path traversal detected in "${command}"`)
}
// Check for null bytes
if (trimmed.includes('\0')) {
throw new Error('Invalid command: null byte detected')
}
return trimmed
}
/**
* Validate command arguments to prevent injection attacks.
* Rejects arguments containing path traversal sequences.
*
* @param args - The arguments array to validate
* @returns The validated arguments array
* @throws Error if any argument contains path traversal
*/
export function validateArgs(args: string[]): string[] {
if (!Array.isArray(args)) {
throw new Error('Invalid args: must be an array')
}
return args.map((arg, index) => {
if (typeof arg !== 'string') {
throw new Error(`Invalid args: argument at index ${index} must be a string`)
}
// Check for null bytes
if (arg.includes('\0')) {
throw new Error(`Invalid args: null byte detected in argument at index ${index}`)
}
// Check for path traversal in arguments that look like paths
// Only validate if the arg contains path separators (indicating it's meant to be a path)
if ((arg.includes('/') || arg.includes('\\')) && /(?:^|[/\\])\.\.(?:[/\\]|$)/.test(arg)) {
throw new Error(`Invalid args: path traversal detected in argument at index ${index}`)
}
return arg
})
}
export function performVariableSubstitution(
value: string,
extractDir: string,
@ -134,12 +225,16 @@ export function applyPlatformOverrides(mcpConfig: any, extractDir: string, userC
// Apply variable substitution to all string values
if (resolvedConfig.command) {
resolvedConfig.command = performVariableSubstitution(resolvedConfig.command, extractDir, userConfig)
// Validate command after substitution to prevent path traversal attacks
resolvedConfig.command = validateCommand(resolvedConfig.command)
}
if (resolvedConfig.args) {
resolvedConfig.args = resolvedConfig.args.map((arg: string) =>
performVariableSubstitution(arg, extractDir, userConfig)
)
// Validate args after substitution to prevent path traversal attacks
resolvedConfig.args = validateArgs(resolvedConfig.args)
}
if (resolvedConfig.env) {
@ -271,10 +366,8 @@ class DxtService {
}
// Use server name as the final extract directory for automatic version management
// Sanitize the name to prevent creating subdirectories
const sanitizedName = manifest.name.replace(/\//g, '-')
const serverDirName = `server-${sanitizedName}`
const finalExtractDir = path.join(this.mcpDir, serverDirName)
const serverDirName = `server-${manifest.name}`
const finalExtractDir = ensurePathWithin(this.mcpDir, path.join(this.mcpDir, serverDirName))
// Clean up any existing version of this server
if (fs.existsSync(finalExtractDir)) {
@ -354,27 +447,15 @@ class DxtService {
public cleanupDxtServer(serverName: string): boolean {
try {
// Handle server names that might contain slashes (e.g., "anthropic/sequential-thinking")
// by replacing slashes with the same separator used during installation
const sanitizedName = serverName.replace(/\//g, '-')
const serverDirName = `server-${sanitizedName}`
const serverDir = path.join(this.mcpDir, serverDirName)
const serverDirName = `server-${serverName}`
const serverDir = ensurePathWithin(this.mcpDir, path.join(this.mcpDir, serverDirName))
// First try the sanitized path
if (fs.existsSync(serverDir)) {
logger.debug(`Removing DXT server directory: ${serverDir}`)
fs.rmSync(serverDir, { recursive: true, force: true })
return true
}
// Fallback: try with original name in case it was stored differently
const originalServerDir = path.join(this.mcpDir, `server-${serverName}`)
if (fs.existsSync(originalServerDir)) {
logger.debug(`Removing DXT server directory: ${originalServerDir}`)
fs.rmSync(originalServerDir, { recursive: true, force: true })
return true
}
logger.warn(`Server directory not found: ${serverDir}`)
return false
} catch (error) {

View File

@ -1083,18 +1083,33 @@ export class SelectionService {
this.lastCtrlkeyDownTime = -1
}
//check if the key is ctrl key
// Check if the key is ctrl key
// Windows: VK_LCONTROL(162), VK_RCONTROL(163)
// macOS: kVK_Control(59), kVK_RightControl(62)
private isCtrlkey(vkCode: number) {
if (isMac) {
return vkCode === 59 || vkCode === 62
}
return vkCode === 162 || vkCode === 163
}
//check if the key is shift key
// Check if the key is shift key
// Windows: VK_LSHIFT(160), VK_RSHIFT(161)
// macOS: kVK_Shift(56), kVK_RightShift(60)
private isShiftkey(vkCode: number) {
if (isMac) {
return vkCode === 56 || vkCode === 60
}
return vkCode === 160 || vkCode === 161
}
//check if the key is alt key
// Check if the key is alt/option key
// Windows: VK_LMENU(164), VK_RMENU(165)
// macOS: kVK_Option(58), kVK_RightOption(61)
private isAltkey(vkCode: number) {
if (isMac) {
return vkCode === 58 || vkCode === 61
}
return vkCode === 164 || vkCode === 165
}

View File

@ -0,0 +1,202 @@
import path from 'path'
import { describe, expect, it } from 'vitest'
import { ensurePathWithin, validateArgs, validateCommand } from '../DxtService'
describe('ensurePathWithin', () => {
const baseDir = '/home/user/mcp'
describe('valid paths', () => {
it('should accept direct child paths', () => {
expect(ensurePathWithin(baseDir, '/home/user/mcp/server-test')).toBe('/home/user/mcp/server-test')
expect(ensurePathWithin(baseDir, '/home/user/mcp/my-server')).toBe('/home/user/mcp/my-server')
})
it('should accept paths with unicode characters', () => {
expect(ensurePathWithin(baseDir, '/home/user/mcp/服务器')).toBe('/home/user/mcp/服务器')
expect(ensurePathWithin(baseDir, '/home/user/mcp/サーバー')).toBe('/home/user/mcp/サーバー')
})
})
describe('path traversal prevention', () => {
it('should reject paths that escape base directory', () => {
expect(() => ensurePathWithin(baseDir, '/home/user/mcp/../../../etc')).toThrow('Path traversal detected')
expect(() => ensurePathWithin(baseDir, '/etc/passwd')).toThrow('Path traversal detected')
expect(() => ensurePathWithin(baseDir, '/home/user')).toThrow('Path traversal detected')
})
it('should reject subdirectories', () => {
expect(() => ensurePathWithin(baseDir, '/home/user/mcp/sub/dir')).toThrow('Path traversal detected')
expect(() => ensurePathWithin(baseDir, '/home/user/mcp/a/b/c')).toThrow('Path traversal detected')
})
it('should reject Windows-style path traversal', () => {
const winBase = 'C:\\Users\\user\\mcp'
expect(() => ensurePathWithin(winBase, 'C:\\Users\\user\\mcp\\..\\..\\Windows\\System32')).toThrow(
'Path traversal detected'
)
})
it('should reject null byte attacks', () => {
const maliciousPath = path.join(baseDir, 'server\x00/../../../etc/passwd')
expect(() => ensurePathWithin(baseDir, maliciousPath)).toThrow('Path traversal detected')
})
it('should handle encoded traversal attempts', () => {
expect(() => ensurePathWithin(baseDir, '/home/user/mcp/../escape')).toThrow('Path traversal detected')
})
})
describe('edge cases', () => {
it('should reject base directory itself', () => {
expect(() => ensurePathWithin(baseDir, '/home/user/mcp')).toThrow('Path traversal detected')
})
it('should handle relative path construction', () => {
const target = path.join(baseDir, 'server-name')
expect(ensurePathWithin(baseDir, target)).toBe('/home/user/mcp/server-name')
})
})
})
describe('validateCommand', () => {
describe('valid commands', () => {
it('should accept simple command names', () => {
expect(validateCommand('node')).toBe('node')
expect(validateCommand('python')).toBe('python')
expect(validateCommand('npx')).toBe('npx')
expect(validateCommand('uvx')).toBe('uvx')
})
it('should accept absolute paths', () => {
expect(validateCommand('/usr/bin/node')).toBe('/usr/bin/node')
expect(validateCommand('/usr/local/bin/python3')).toBe('/usr/local/bin/python3')
expect(validateCommand('C:\\Program Files\\nodejs\\node.exe')).toBe('C:\\Program Files\\nodejs\\node.exe')
})
it('should accept relative paths starting with ./', () => {
expect(validateCommand('./node_modules/.bin/tsc')).toBe('./node_modules/.bin/tsc')
expect(validateCommand('.\\scripts\\run.bat')).toBe('.\\scripts\\run.bat')
})
it('should trim whitespace', () => {
expect(validateCommand(' node ')).toBe('node')
expect(validateCommand('\tpython\n')).toBe('python')
})
})
describe('path traversal prevention', () => {
it('should reject commands with path traversal (Unix style)', () => {
expect(() => validateCommand('../../../bin/sh')).toThrow('path traversal detected')
expect(() => validateCommand('../../etc/passwd')).toThrow('path traversal detected')
expect(() => validateCommand('/usr/../../../bin/sh')).toThrow('path traversal detected')
})
it('should reject commands with path traversal (Windows style)', () => {
expect(() => validateCommand('..\\..\\..\\Windows\\System32\\cmd.exe')).toThrow('path traversal detected')
expect(() => validateCommand('..\\..\\Windows\\System32\\calc.exe')).toThrow('path traversal detected')
expect(() => validateCommand('C:\\..\\..\\Windows\\System32\\cmd.exe')).toThrow('path traversal detected')
})
it('should reject just ".."', () => {
expect(() => validateCommand('..')).toThrow('path traversal detected')
})
it('should reject mixed style path traversal', () => {
expect(() => validateCommand('../..\\mixed/..\\attack')).toThrow('path traversal detected')
})
})
describe('null byte injection', () => {
it('should reject commands with null bytes', () => {
expect(() => validateCommand('node\x00.exe')).toThrow('null byte detected')
expect(() => validateCommand('python\0')).toThrow('null byte detected')
})
})
describe('edge cases', () => {
it('should reject empty strings', () => {
expect(() => validateCommand('')).toThrow('command must be a non-empty string')
expect(() => validateCommand(' ')).toThrow('command cannot be empty')
})
it('should reject non-string input', () => {
// @ts-expect-error - testing runtime behavior
expect(() => validateCommand(null)).toThrow('command must be a non-empty string')
// @ts-expect-error - testing runtime behavior
expect(() => validateCommand(undefined)).toThrow('command must be a non-empty string')
// @ts-expect-error - testing runtime behavior
expect(() => validateCommand(123)).toThrow('command must be a non-empty string')
})
})
describe('real-world attack scenarios', () => {
it('should prevent Windows system32 command injection', () => {
expect(() => validateCommand('../../../../Windows/System32/cmd.exe')).toThrow('path traversal detected')
expect(() => validateCommand('..\\..\\..\\..\\Windows\\System32\\powershell.exe')).toThrow(
'path traversal detected'
)
})
it('should prevent Unix bin injection', () => {
expect(() => validateCommand('../../../../bin/bash')).toThrow('path traversal detected')
expect(() => validateCommand('../../../usr/bin/curl')).toThrow('path traversal detected')
})
})
})
describe('validateArgs', () => {
describe('valid arguments', () => {
it('should accept normal arguments', () => {
expect(validateArgs(['--version'])).toEqual(['--version'])
expect(validateArgs(['-y', '@anthropic/mcp-server'])).toEqual(['-y', '@anthropic/mcp-server'])
expect(validateArgs(['install', 'package-name'])).toEqual(['install', 'package-name'])
})
it('should accept arguments with safe paths', () => {
expect(validateArgs(['./src/index.ts'])).toEqual(['./src/index.ts'])
expect(validateArgs(['/absolute/path/file.js'])).toEqual(['/absolute/path/file.js'])
})
it('should accept empty array', () => {
expect(validateArgs([])).toEqual([])
})
})
describe('path traversal prevention', () => {
it('should reject arguments with path traversal', () => {
expect(() => validateArgs(['../../../etc/passwd'])).toThrow('path traversal detected')
expect(() => validateArgs(['--config', '../../secrets.json'])).toThrow('path traversal detected')
expect(() => validateArgs(['..\\..\\Windows\\System32\\config'])).toThrow('path traversal detected')
})
it('should only check path-like arguments', () => {
// Arguments without path separators should pass even with dots
expect(validateArgs(['..version'])).toEqual(['..version'])
expect(validateArgs(['test..name'])).toEqual(['test..name'])
})
})
describe('null byte injection', () => {
it('should reject arguments with null bytes', () => {
expect(() => validateArgs(['file\x00.txt'])).toThrow('null byte detected')
expect(() => validateArgs(['--config', 'path\0name'])).toThrow('null byte detected')
})
})
describe('edge cases', () => {
it('should reject non-array input', () => {
// @ts-expect-error - testing runtime behavior
expect(() => validateArgs('not an array')).toThrow('must be an array')
// @ts-expect-error - testing runtime behavior
expect(() => validateArgs(null)).toThrow('must be an array')
})
it('should reject non-string elements', () => {
// @ts-expect-error - testing runtime behavior
expect(() => validateArgs([123])).toThrow('must be a string')
// @ts-expect-error - testing runtime behavior
expect(() => validateArgs(['valid', null])).toThrow('must be a string')
})
})
})
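A minimal sketch of what ensurePathWithin could look like, inferred from these tests alone (the real DxtService implementation may differ, for example in how it handles Windows-style separators on non-Windows hosts):

import path from 'path'

export function ensurePathWithin(baseDir: string, targetPath: string): string {
  const resolvedBase = path.resolve(baseDir)
  const resolvedTarget = path.resolve(targetPath)
  // Only direct children of baseDir pass: the base directory itself, deeper
  // subdirectories, and anything that escapes baseDir all throw.
  if (path.dirname(resolvedTarget) !== resolvedBase) {
    throw new Error('Path traversal detected')
  }
  return resolvedTarget
}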

View File

@ -2,6 +2,7 @@
* Anthropic Prompt Caching Middleware
* @see https://ai-sdk.dev/providers/ai-sdk-providers/anthropic#cache-control
*/
import type { LanguageModelV2Message } from '@ai-sdk/provider'
import { estimateTextTokens } from '@renderer/services/TokenService'
import type { Provider } from '@renderer/types'
import type { LanguageModelMiddleware } from 'ai'
@ -10,11 +11,11 @@ const cacheProviderOptions = {
anthropic: { cacheControl: { type: 'ephemeral' } }
}
function estimateContentTokens(content: unknown): number {
function estimateContentTokens(content: LanguageModelV2Message['content']): number {
if (typeof content === 'string') return estimateTextTokens(content)
if (Array.isArray(content)) {
return content.reduce((acc, part) => {
if (typeof part === 'object' && part !== null && 'text' in part) {
if (part.type === 'text') {
return acc + estimateTextTokens(part.text as string)
}
return acc
@ -23,21 +24,6 @@ function estimateContentTokens(content: unknown): number {
return 0
}
function addCacheToContentParts(content: unknown): unknown {
if (typeof content === 'string') {
return [{ type: 'text', text: content, providerOptions: cacheProviderOptions }]
}
if (Array.isArray(content) && content.length > 0) {
const result = [...content]
const last = result[result.length - 1]
if (typeof last === 'object' && last !== null) {
result[result.length - 1] = { ...last, providerOptions: cacheProviderOptions }
}
return result
}
return content
}
export function anthropicCacheMiddleware(provider: Provider): LanguageModelMiddleware {
return {
middlewareVersion: 'v2',
@ -54,7 +40,7 @@ export function anthropicCacheMiddleware(provider: Provider): LanguageModelMiddl
// Cache system message (providerOptions on message object)
if (cacheSystemMessage) {
for (let i = 0; i < messages.length; i++) {
const msg = messages[i] as any
const msg = messages[i] as LanguageModelV2Message
if (msg.role === 'system' && estimateContentTokens(msg.content) >= tokenThreshold) {
messages[i] = { ...msg, providerOptions: cacheProviderOptions }
break
@ -64,12 +50,32 @@ export function anthropicCacheMiddleware(provider: Provider): LanguageModelMiddl
// Cache last N non-system messages (providerOptions on content parts)
if (cacheLastNMessages > 0) {
const cumsumTokens: number[] = []
let tokenSum = 0
for (let i = 0; i < messages.length; i++) {
const msg = messages[i] as LanguageModelV2Message
tokenSum += estimateContentTokens(msg.content)
cumsumTokens.push(tokenSum)
}
for (let i = messages.length - 1; i >= 0 && cachedCount < cacheLastNMessages; i--) {
const msg = messages[i] as any
if (msg.role !== 'system' && estimateContentTokens(msg.content) >= tokenThreshold) {
messages[i] = { ...msg, content: addCacheToContentParts(msg.content) }
cachedCount++
const msg = messages[i] as LanguageModelV2Message
if (msg.role === 'system' || cumsumTokens[i] < tokenThreshold || msg.content.length === 0) {
continue
}
const newContent = [...msg.content]
const lastIndex = newContent.length - 1
newContent[lastIndex] = {
...newContent[lastIndex],
providerOptions: cacheProviderOptions
}
messages[i] = {
...msg,
content: newContent
} as LanguageModelV2Message
cachedCount++
}
}
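A toy illustration of the cumulative-sum gating above (the numbers are made up): with a token threshold of 1000 and estimated per-message tokens of 400, 500 and 300, the cumulative sums are 400, 900 and 1200, so only the third message clears the threshold and would receive providerOptions on its last content part.

const estimated = [400, 500, 300] // hypothetical per-message token estimates
const tokenThreshold = 1000
let running = 0
const cumsum = estimated.map((t) => (running += t)) // [400, 900, 1200]
const eligible = cumsum.map((sum) => sum >= tokenThreshold) // [false, false, true]
console.log(cumsum, eligible)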

View File

@ -325,9 +325,9 @@ function createDeveloperToSystemFetch(originalFetch?: typeof fetch): typeof fetc
if (options?.body && typeof options.body === 'string') {
try {
const body = JSON.parse(options.body)
if (body.messages && Array.isArray(body.messages)) {
if (body.input && Array.isArray(body.input)) {
let hasChanges = false
body.messages = body.messages.map((msg: { role: string }) => {
body.input = body.input.map((msg: { role: string }) => {
if (msg.role === 'developer') {
hasChanges = true
return { ...msg, role: 'system' }
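A rough sketch of the rewrite this hunk performs, assuming an OpenAI Responses-style request body where the conversation is carried under input rather than the Chat Completions messages field (the model id and message contents below are placeholders):

const body = {
  model: 'example-model', // placeholder
  input: [
    { role: 'developer', content: 'Answer concisely.' },
    { role: 'user', content: 'Hello' }
  ]
}
// Map any 'developer' roles to 'system' before the request is sent
body.input = body.input.map((msg) => (msg.role === 'developer' ? { ...msg, role: 'system' } : msg))
// body.input[0].role is now 'system'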

View File

@ -3,7 +3,7 @@ import { type AnthropicProviderOptions } from '@ai-sdk/anthropic'
import type { GoogleGenerativeAIProviderOptions } from '@ai-sdk/google'
import type { OpenAIResponsesProviderOptions } from '@ai-sdk/openai'
import type { XaiProviderOptions } from '@ai-sdk/xai'
import { baseProviderIdSchema, customProviderIdSchema } from '@cherrystudio/ai-core/provider'
import { baseProviderIdSchema, customProviderIdSchema, hasProviderConfig } from '@cherrystudio/ai-core/provider'
import { loggerService } from '@logger'
import {
getModelSupportedVerbosity,
@ -616,9 +616,14 @@ function buildGenericProviderOptions(
}
if (enableReasoning) {
if (isInterleavedThinkingModel(model)) {
providerOptions = {
...providerOptions,
sendReasoning: true
// sendReasoning is a patch specific to @ai-sdk/openai-compatible
// Only apply when provider will actually use openai-compatible SDK
// (i.e., no dedicated SDK registered OR explicitly openai-compatible)
if (!hasProviderConfig(providerId) || providerId === 'openai-compatible') {
providerOptions = {
...providerOptions,
sendReasoning: true
}
}
}
}
@ -648,6 +653,10 @@ function buildGenericProviderOptions(
}
}
if (isOpenAIModel(model)) {
providerOptions.strictJsonSchema = false
}
return {
[providerId]: providerOptions
}
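For reference, the generic provider options produced by this code for an OpenAI model on an openai-compatible provider would then carry roughly this shape (provider id and flag combination below are illustrative only, based on the hunks above):

const providerOptions = {
  'openai-compatible': {
    sendReasoning: true, // only when the provider falls back to the openai-compatible SDK
    strictJsonSchema: false // set for OpenAI models in the hunk above
  }
}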

View File

@ -777,7 +777,12 @@ export const SYSTEM_MODELS: Record<SystemProviderId | 'defaultModel', Model[]> =
{ id: 'qwen-flash', name: 'qwen-flash', provider: 'dashscope', group: 'qwen-flash', owned_by: 'system' },
{ id: 'qwen-plus', name: 'qwen-plus', provider: 'dashscope', group: 'qwen-plus', owned_by: 'system' },
{ id: 'qwen-max', name: 'qwen-max', provider: 'dashscope', group: 'qwen-max', owned_by: 'system' },
{ id: 'qwen3-max', name: 'qwen3-max', provider: 'dashscope', group: 'qwen-max', owned_by: 'system' }
{ id: 'qwen3-max', name: 'qwen3-max', provider: 'dashscope', group: 'qwen-max', owned_by: 'system' },
{ id: 'text-embedding-v4', name: 'text-embedding-v4', provider: 'dashscope', group: 'qwen-text-embedding' },
{ id: 'text-embedding-v3', name: 'text-embedding-v3', provider: 'dashscope', group: 'qwen-text-embedding' },
{ id: 'text-embedding-v2', name: 'text-embedding-v2', provider: 'dashscope', group: 'qwen-text-embedding' },
{ id: 'text-embedding-v1', name: 'text-embedding-v1', provider: 'dashscope', group: 'qwen-text-embedding' },
{ id: 'qwen3-rerank', name: 'qwen3-rerank', provider: 'dashscope', group: 'qwen-rerank' }
],
stepfun: [
{

View File

@ -83,7 +83,14 @@ export function useAssistant(id: string) {
throw new Error(`Assistant model is not set for assistant with name: ${assistant?.name ?? 'unknown'}`)
}
const assistantWithModel = useMemo(() => ({ ...assistant, model }), [assistant, model])
const normalizedTopics = useMemo(
() => (Array.isArray(assistant?.topics) ? assistant.topics : []),
[assistant?.topics]
)
const assistantWithModel = useMemo(
() => ({ ...assistant, model, topics: normalizedTopics }),
[assistant, model, normalizedTopics]
)
const settingsRef = useRef(assistant?.settings)

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "View",
"title": "Careers"
},
"checkUpdate": {
"available": "Update",
"label": "Check Update"

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "查看",
"title": "加入我们"
},
"checkUpdate": {
"available": "立即更新",
"label": "检查更新"

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "查看",
"title": "加入我們"
},
"checkUpdate": {
"available": "立即更新",
"label": "檢查更新"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "快取最近 N 則訊息",
"cache_last_n_help": "快取最後 N 則對話訊息(排除系統訊息)",
"cache_system": "快取系統訊息",
"cache_system_help": "是否快取系統提示",
"token_threshold": "快取權杖閾值",
"token_threshold_help": "超過此標記數量的訊息將被快取。設為 0 以停用快取。"
},
"array_content": {
"help": "該供應商是否支援 message 的 content 欄位為 array 類型",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Ansicht",
"title": "Karriere"
},
"checkUpdate": {
"available": "Jetzt aktualisieren",
"label": "Auf Updates prüfen"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Letzte N Nachrichten zwischenspeichern",
"cache_last_n_help": "Zwischen die letzten N Gesprächsnachrichten (ohne Systemnachrichten) zwischenspeichern",
"cache_system": "Cache-Systemnachricht",
"cache_system_help": "Ob der System-Prompt zwischengespeichert werden soll",
"token_threshold": "Cache-Token-Schwellenwert",
"token_threshold_help": "Nachrichten, die diese Token-Anzahl überschreiten, werden zwischengespeichert. Auf 0 setzen, um das Caching zu deaktivieren."
},
"array_content": {
"help": "Unterstützt Array-Format für message content",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Προβολή",
"title": "Καριέρα"
},
"checkUpdate": {
"available": "Άμεση ενημέρωση",
"label": "Έλεγχος ενημερώσεων"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Κρύψτε τα τελευταία N μηνύματα",
"cache_last_n_help": "Αποθηκεύστε στην κρυφή μνήμη τα τελευταία N μηνύματα της συνομιλίας (εξαιρουμένων των μηνυμάτων συστήματος)",
"cache_system": "Μήνυμα Συστήματος Κρυφής Μνήμης",
"cache_system_help": "Εάν θα αποθηκευτεί προσωρινά το σύστημα εντολών",
"token_threshold": "Κατώφλι Διακριτικού Κρυφής Μνήμης",
"token_threshold_help": "Μηνύματα που υπερβαίνουν αυτό το όριο token θα αποθηκεύονται στην cache. Ορίστε το σε 0 για να απενεργοποιήσετε την προσωρινή αποθήκευση."
},
"array_content": {
"help": "Εάν ο πάροχος υποστηρίζει το πεδίο περιεχομένου του μηνύματος ως τύπο πίνακα",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Vista",
"title": "Carreras"
},
"checkUpdate": {
"available": "Actualizar ahora",
"label": "Comprobar actualizaciones"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Caché de los últimos N mensajes",
"cache_last_n_help": "Almacenar en caché los últimos N mensajes de la conversación (excluyendo los mensajes del sistema)",
"cache_system": "Mensaje del Sistema de Caché",
"cache_system_help": "Si se debe almacenar en caché el mensaje del sistema",
"token_threshold": "Umbral de Token de Caché",
"token_threshold_help": "Los mensajes que superen este recuento de tokens se almacenarán en caché. Establecer en 0 para desactivar el almacenamiento en caché."
},
"array_content": {
"help": "¿Admite el proveedor que el campo content del mensaje sea de tipo array?",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Vue",
"title": "Carrières"
},
"checkUpdate": {
"available": "Mettre à jour maintenant",
"label": "Vérifier les mises à jour"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Mettre en cache les N derniers messages",
"cache_last_n_help": "Mettre en cache les N derniers messages de conversation (à lexclusion des messages système)",
"cache_system": "Message du système de cache",
"cache_system_help": "S'il faut mettre en cache l'invite système",
"token_threshold": "Seuil de jeton de cache",
"token_threshold_help": "Les messages dépassant ce nombre de jetons seront mis en cache. Mettre à 0 pour désactiver la mise en cache."
},
"array_content": {
"help": "Ce fournisseur prend-il en charge le champ content du message sous forme de tableau ?",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "表示",
"title": "キャリア"
},
"checkUpdate": {
"available": "今すぐ更新",
"label": "更新を確認"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "最後のN件のメッセージをキャッシュ",
"cache_last_n_help": "最後のN件の会話メッセージをキャッシュするシステムメッセージは除く",
"cache_system": "キャッシュシステムメッセージ",
"cache_system_help": "システムプロンプトをキャッシュするかどうか",
"token_threshold": "キャッシュトークン閾値",
"token_threshold_help": "このトークン数を超えるメッセージはキャッシュされます。キャッシュを無効にするには0を設定してください。"
},
"array_content": {
"help": "このプロバイダーは、message の content フィールドが配列型であることをサポートしていますか",

View File

@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Visualizar",
"title": "Carreiras"
},
"checkUpdate": {
"available": "Atualizar agora",
"label": "Verificar atualizações"
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Cache Últimas N Mensagens",
"cache_last_n_help": "Armazenar em cache as últimas N mensagens da conversa (excluindo mensagens do sistema)",
"cache_system": "Mensagem do Sistema de Cache",
"cache_system_help": "Se deve armazenar em cache o prompt do sistema",
"token_threshold": "Limite de Token de Cache",
"token_threshold_help": "Mensagens que excederem essa contagem de tokens serão armazenadas em cache. Defina como 0 para desativar o cache."
},
"array_content": {
"help": "O fornecedor suporta que o campo content da mensagem seja do tipo array?",

View File

@ -378,7 +378,7 @@
"about": "Despre",
"close": "Închide fereastra",
"copy": "Copiază",
"cut": "Taie",
"cut": "Decupează",
"delete": "Șterge",
"documentation": "Documentație",
"edit": "Editare",
@ -1191,7 +1191,7 @@
"agent_one": "Agent",
"agent_other": "Agenți",
"and": "și",
"assistant": "Agent",
"assistant": "Asistent",
"assistant_one": "Asistent",
"assistant_other": "Asistenți",
"avatar": "Avatar",
@ -1208,7 +1208,7 @@
"copy": "Copiază",
"copy_failed": "Copiere eșuată",
"current": "Curent",
"cut": "Taie",
"cut": "Decupează",
"default": "Implicit",
"delete": "Șterge",
"delete_confirm": "Ești sigur că vrei să ștergi?",
@ -1222,10 +1222,10 @@
"duplicate": "Duplică",
"edit": "Editează",
"enabled": "Activat",
"error": "eroare",
"error": "Eroare",
"errors": {
"create_message": "Nu s-a putut crea mesajul",
"validation": "Verificarea a eșuat"
"validation": "Validarea a eșuat"
},
"expand": "Extinde",
"file": {
@ -1611,7 +1611,7 @@
"title": "Setări bază de cunoștințe"
},
"sitemap_added": "Adăugat cu succes",
"sitemap_placeholder": "Introdu URL-ul hărții site-ului",
"sitemap_placeholder": "Introdu URL-ul sitemap-ului",
"sitemaps": "Site-uri web",
"source": "Sursă",
"status": "Stare",
@ -1623,7 +1623,7 @@
"status_pending": "În așteptare",
"status_preprocess_completed": "Preprocesare finalizată",
"status_preprocess_failed": "Preprocesare eșuată",
"status_processing": "Se procesează",
"status_processing": "În procesare",
"subtitle_file": "fișier subtitrare",
"threshold": "Prag de potrivire",
"threshold_placeholder": "Nesetat",
@ -1633,9 +1633,9 @@
"topN": "Număr rezultate returnate",
"topN_placeholder": "Nesetat",
"topN_too_large_or_small": "Numărul de rezultate returnate nu poate fi mai mare de 30 sau mai mic de 1.",
"topN_tooltip": "Numărul de rezultate potrivite returnate; cu cât valoarea este mai mare, cu atât mai multe rezultate, dar și mai mulți tokeni consumați.",
"topN_tooltip": "Numărul de rezultate potrivite returnate; cu cât valoarea este mai mare, cu atât mai multe rezultate, dar și un consum mai mare de tokeni.",
"url_added": "URL adăugat",
"url_placeholder": "Introdu URL, separă URL-urile multiple prin Enter",
"url_placeholder": "Introdu URL-ul; separă mai multe URL-uri prin Enter",
"urls": "URL-uri",
"videos": "video",
"videos_file": "fișier video"
@ -1760,7 +1760,7 @@
"switch_user_confirm": "Schimbi contextul de utilizator la {{user}}?",
"time": "Timp",
"title": "Amintiri",
"total_memories": "total amintiri",
"total_memories": "Total amintiri",
"try_different_filters": "Încearcă să ajustezi criteriile de căutare",
"update_failed": "Nu s-a putut actualiza amintirea",
"update_success": "Amintire actualizată cu succes",
@ -1779,7 +1779,7 @@
"user_memories_reset": "Toate amintirile pentru {{user}} au fost resetate",
"user_switch_failed": "Nu s-a putut schimba utilizatorul",
"user_switched": "Contextul de utilizator a fost schimbat la {{user}}",
"users": "utilizatori"
"users": "Utilizatori"
},
"message": {
"agents": {
@ -2196,7 +2196,7 @@
"navbar": {
"expand": "Extinde dialogul",
"hide_sidebar": "Ascunde bara laterală",
"show_sidebar": "Arată bara laterală",
"show_sidebar": "Afișează bara laterală",
"window": {
"close": "Închide",
"maximize": "Maximizează",
@ -2216,27 +2216,27 @@
},
"characters": "Caractere",
"collapse": "Restrânge",
"content_placeholder": "Te rugăm să introduci conținutul notiței...",
"content_placeholder": "Introdu conținutul notiței...",
"copyContent": "Copiază conținutul",
"crossPlatformRestoreWarning": "Configurația multi-platformă a fost restaurată, dar directorul de notițe este gol. Te rugăm să copiezi fișierele notițelor în: {{path}}",
"delete": "șterge",
"delete": "Șterge",
"delete_confirm": "Ești sigur că vrei să ștergi acest {{type}}?",
"delete_folder_confirm": "Ești sigur că vrei să ștergi dosarul \"{{name}}\" și tot conținutul său?",
"delete_note_confirm": "Ești sigur că vrei să ștergi notița \"{{name}}\"?",
"drop_markdown_hint": "Trage fișiere sau dosare .md aici pentru a importa",
"empty": "Încă nu există notițe disponibile",
"expand": "desfășoară",
"expand": "Extinde",
"export_failed": "Exportul în baza de cunoștințe a eșuat",
"export_knowledge": "Exportă notițele în baza de cunoștințe",
"export_success": "Exportat cu succes în baza de cunoștințe",
"folder": "dosar",
"folder": "Dosar",
"new_folder": "Dosar nou",
"new_note": "Creează o notiță nouă",
"no_content_to_copy": "Niciun conținut de copiat",
"no_file_selected": "Te rugăm să selectezi fișierul de încărcat",
"no_valid_files": "Nu a fost încărcat niciun fișier valid",
"open_folder": "Deschide un dosar extern",
"open_outside": "Deschide din exterior",
"open_outside": "Deschide extern",
"rename": "Redenumește",
"rename_changed": "Din cauza politicilor de securitate, numele fișierului a fost schimbat din {{original}} în {{final}}",
"save": "Salvează în Notițe",
@ -2275,7 +2275,7 @@
"font_size_small": "Mic",
"font_title": "Setări font",
"serif_font": "Font cu serife",
"show_table_of_contents": "Arată cuprinsul",
"show_table_of_contents": "Afișează cuprinsul",
"show_table_of_contents_description": "Afișează o bară laterală cu cuprinsul pentru o navigare ușoară în documente",
"title": "Setări afișare"
},
@ -3115,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Vedere",
"title": "Carieră"
},
"checkUpdate": {
"available": "Actualizare",
"label": "Verifică actualizări"
@ -4258,7 +4262,7 @@
"display_title": "Setări afișare mini-aplicații",
"empty": "Trage mini-aplicațiile din stânga pentru a le ascunde",
"open_link_external": {
"title": "Deschide linkurile de fereastră nouă în browser"
"title": "Deschide în browser linkurile care deschid ferestre noi"
},
"reset_tooltip": "Resetează la implicit",
"sidebar_description": "Arată mini-aplicațiile active în bara laterală",
@ -4358,7 +4362,7 @@
"description": "Model folosit pentru sarcini simple, cum ar fi numirea subiectelor și extragerea cuvintelor cheie",
"label": "Model rapid",
"setting_title": "Configurare model rapid",
"tooltip": "Se recomandă alegerea unui model ușor și nu se recomandă alegerea unui model de gândire."
"tooltip": "Se recomandă alegerea unui model ușor, nu a unui model de raționament complex."
},
"topic_naming": {
"auto": "Numire automată subiect",
@ -4476,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Cache Ultimelor N Mesaje",
"cache_last_n_help": "Stochează ultimele N mesaje din conversație (excluzând mesajele de sistem)",
"cache_system": "Mesaj de sistem Cache",
"cache_system_help": "Dacă să se memoreze în cache promptul de sistem",
"token_threshold": "Prag de Token Cache",
"token_threshold_help": "Mesajele care depășesc acest număr de tokeni vor fi memorate în cache. Setați la 0 pentru a dezactiva memorarea în cache."
},
"array_content": {
"help": "Furnizorul acceptă ca câmpul content al mesajului să fie de tip array?",
@ -4698,7 +4702,7 @@
},
"shortcuts": {
"action": "Acțiune",
"actions": "operațiune",
"actions": "Comandă",
"clear_shortcut": "Șterge comanda rapidă",
"clear_topic": "Șterge mesajele",
"copy_last_message": "Copiază ultimul mesaj",
@ -4729,8 +4733,8 @@
},
"theme": {
"color_primary": "Culoare primară",
"dark": "Întunecat",
"light": "Luminos",
"dark": "Întunecată",
"light": "Luminoasă",
"system": "Sistem",
"title": "Temă",
"window": {
@ -4892,21 +4896,21 @@
"translate": {
"custom": {
"delete": {
"description": "Ești sigur că vrei să ștergi?",
"description": "Ești sigur că vrei să ștergi această limbă?",
"title": "Șterge limba personalizată"
},
"error": {
"add": "Adăugarea a eșuat",
"delete": "Ștergerea a eșuat",
"langCode": {
"builtin": "Limba are suport integrat",
"empty": "Codul limbii este gol",
"builtin": "Limbă deja integrată",
"empty": "Codul limbii lipsește",
"exists": "Limba există deja",
"invalid": "Cod limbă invalid"
},
"update": "Actualizarea a eșuat",
"value": {
"empty": "Numele limbii nu poate fi gol",
"empty": "Numele limbii este obligatoriu",
"too_long": "Numele limbii este prea lung"
}
},
@ -4916,13 +4920,13 @@
"placeholder": "en-us"
},
"success": {
"add": "Adăugat cu succes",
"delete": "Șters cu succes",
"update": "Actualizare reușită"
"add": "Adăugată cu succes",
"delete": "Ștearsă cu succes",
"update": "Actualiza cu succes"
},
"table": {
"action": {
"title": "Operațiune"
"title": "Acțiuni"
}
},
"value": {
@ -4954,7 +4958,7 @@
"mcp-servers": "Servere MCP",
"memories": "Amintiri",
"notes": "Notițe",
"paintings": "Picturi",
"paintings": "Imagini",
"settings": "Setări",
"store": "Bibliotecă asistenți",
"translate": "Traducere"
@ -4998,7 +5002,7 @@
"detect": {
"method": {
"algo": {
"label": "algoritm",
"label": "Algoritm",
"tip": "Folosește biblioteca franc pentru detectarea limbii"
},
"auto": {
@ -5105,7 +5109,7 @@
"tray": {
"quit": "Ieșire",
"show_mini_window": "Asistent rapid",
"show_window": "Arată fereastra"
"show_window": "Afișează fereastra"
},
"update": {
"install": "Instalează",
@ -5121,7 +5125,7 @@
"words": {
"knowledgeGraph": "Grafic de cunoștințe",
"quit": "Ieșire",
"show_window": "Arată fereastra",
"show_window": "Afișează fereastra",
"visualization": "Vizualizare"
}
}

View File

@ -1311,7 +1311,6 @@
"backup": {
"file_format": "Ошибка формата файла резервной копии"
},
"base64DataTruncated": "[to be translated]:Base64 image data truncated, size",
"base64DataTruncated": "Данные изображения в формате Base64 усечены, размер",
"boundary": {
"default": {
@ -1393,9 +1392,7 @@
"text": "текст",
"toolInput": "ввод инструмента",
"toolName": "имя инструмента",
"truncated": "[to be translated]:Data truncated, original size",
"truncated": "Данные усечены, исходный размер",
"truncatedBadge": "[to be translated]:Truncated",
"truncatedBadge": "Усечённый",
"unknown": "Неизвестная ошибка",
"usage": "Дозировка",
@ -3118,6 +3115,10 @@
},
"settings": {
"about": {
"careers": {
"button": "Вид",
"title": "Карьера"
},
"checkUpdate": {
"available": "Обновить",
"label": "Проверить обновления"
@ -4479,12 +4480,12 @@
},
"options": {
"anthropic_cache": {
"cache_last_n": "[to be translated]:Cache Last N Messages",
"cache_last_n_help": "[to be translated]:Cache the last N conversation messages (excluding system messages)",
"cache_system": "[to be translated]:Cache System Message",
"cache_system_help": "[to be translated]:Whether to cache the system prompt",
"token_threshold": "[to be translated]:Cache Token Threshold",
"token_threshold_help": "[to be translated]:Messages exceeding this token count will be cached. Set to 0 to disable caching."
"cache_last_n": "Кэшировать последние N сообщений",
"cache_last_n_help": "Кэшировать последние N сообщений разговора (исключая системные сообщения)",
"cache_system": "Сообщение системы кэша",
"cache_system_help": "Кэшировать ли системный промпт",
"token_threshold": "Порог токена кэша",
"token_threshold_help": "Сообщения, превышающие это количество токенов, будут кэшироваться. Установите значение 0, чтобы отключить кэширование."
},
"array_content": {
"help": "Поддерживает ли данный провайдер тип массива для поля content в сообщении",

View File

@ -0,0 +1,231 @@
import type { Model, Provider } from '@renderer/types'
import { codeTools } from '@shared/config/constant'
import { beforeEach, describe, expect, it, vi } from 'vitest'
// Mock CodeToolsPage which is the default export
vi.mock('../CodeToolsPage', () => ({ default: () => null }))
// Mock dependencies needed by CodeToolsPage
vi.mock('@renderer/hooks/useCodeTools', () => ({
useCodeTools: () => ({
selectedCliTool: codeTools.qwenCode,
selectedModel: null,
selectedTerminal: 'systemDefault',
environmentVariables: '',
directories: [],
currentDirectory: '',
canLaunch: true,
setCliTool: vi.fn(),
setModel: vi.fn(),
setTerminal: vi.fn(),
setEnvVars: vi.fn(),
setCurrentDir: vi.fn(),
removeDir: vi.fn(),
selectFolder: vi.fn()
})
}))
vi.mock('@renderer/hooks/useProvider', () => ({
useProviders: () => ({ providers: [] }),
useAllProviders: () => []
}))
vi.mock('@renderer/services/AssistantService', () => ({
getProviderByModel: vi.fn()
}))
vi.mock('@renderer/services/LoggerService', () => ({
loggerService: {
withContext: () => ({
info: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
warn: vi.fn()
})
}
}))
vi.mock('@renderer/store', () => ({
useAppDispatch: () => vi.fn(),
useAppSelector: () => false
}))
vi.mock('@renderer/aiCore', () => ({
default: class {
getBaseURL() {
return ''
}
getApiKey() {
return ''
}
}
}))
vi.mock('@renderer/utils/api', () => ({
formatApiHost: vi.fn((host) => {
if (!host) return ''
const normalized = host.replace(/\/$/, '').trim()
if (normalized.endsWith('#')) {
return normalized.replace(/#$/, '')
}
if (/\/v\d+(?:alpha|beta)?(?=\/|$)/i.test(normalized)) {
return normalized
}
return `${normalized}/v1`
})
}))
vi.mock('react-i18next', () => ({
useTranslation: () => ({ t: (key: string) => key })
}))
describe('generateToolEnvironment', () => {
const createMockModel = (id: string, provider: string): Model => ({
id,
name: id,
provider,
group: provider
})
const createMockProvider = (id: string, apiHost: string): Provider => ({
id,
type: 'openai',
name: id,
apiKey: 'test-key',
apiHost,
models: [],
isSystem: true
})
beforeEach(() => {
vi.clearAllMocks()
})
it('should format baseUrl with /v1 for qwenCode when missing', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-turbo', 'dashscope')
const provider = createMockProvider('dashscope', 'https://dashscope.aliyuncs.com/compatible-mode')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://dashscope.aliyuncs.com/compatible-mode'
})
expect(env.OPENAI_BASE_URL).toBe('https://dashscope.aliyuncs.com/compatible-mode/v1')
})
it('should not duplicate /v1 when already present for qwenCode', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-turbo', 'dashscope')
const provider = createMockProvider('dashscope', 'https://dashscope.aliyuncs.com/compatible-mode/v1')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://dashscope.aliyuncs.com/compatible-mode/v1'
})
expect(env.OPENAI_BASE_URL).toBe('https://dashscope.aliyuncs.com/compatible-mode/v1')
})
it('should handle empty baseUrl gracefully', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-turbo', 'dashscope')
const provider = createMockProvider('dashscope', '')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: ''
})
expect(env.OPENAI_BASE_URL).toBe('')
})
it('should preserve other API versions when present', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-plus', 'dashscope')
const provider = createMockProvider('dashscope', 'https://dashscope.aliyuncs.com/v2')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://dashscope.aliyuncs.com/v2'
})
expect(env.OPENAI_BASE_URL).toBe('https://dashscope.aliyuncs.com/v2')
})
it('should format baseUrl with /v1 for openaiCodex when missing', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('gpt-4', 'openai')
const provider = createMockProvider('openai', 'https://api.openai.com')
const env = generateToolEnvironment({
tool: codeTools.openaiCodex,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://api.openai.com'
})
expect(env.OPENAI_BASE_URL).toBe('https://api.openai.com/v1')
})
it('should format baseUrl with /v1 for iFlowCli when missing', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('gpt-4', 'iflow')
const provider = createMockProvider('iflow', 'https://api.iflow.cn')
const env = generateToolEnvironment({
tool: codeTools.iFlowCli,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://api.iflow.cn'
})
expect(env.IFLOW_BASE_URL).toBe('https://api.iflow.cn/v1')
})
it('should handle trailing slash correctly', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-turbo', 'dashscope')
const provider = createMockProvider('dashscope', 'https://dashscope.aliyuncs.com/compatible-mode/')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://dashscope.aliyuncs.com/compatible-mode/'
})
expect(env.OPENAI_BASE_URL).toBe('https://dashscope.aliyuncs.com/compatible-mode/v1')
})
it('should handle v2beta version correctly', async () => {
const { generateToolEnvironment } = await import('../index')
const model = createMockModel('qwen-plus', 'dashscope')
const provider = createMockProvider('dashscope', 'https://dashscope.aliyuncs.com/v2beta')
const env = generateToolEnvironment({
tool: codeTools.qwenCode,
model,
modelProvider: provider,
apiKey: 'test-key',
baseUrl: 'https://dashscope.aliyuncs.com/v2beta'
})
expect(env.OPENAI_BASE_URL).toBe('https://dashscope.aliyuncs.com/v2beta')
})
})

View File

@ -1,4 +1,5 @@
import { type EndpointType, type Model, type Provider, SystemProviderIds } from '@renderer/types'
import { formatApiHost } from '@renderer/utils/api'
import { codeTools } from '@shared/config/constant'
export interface LaunchValidationResult {
@ -145,6 +146,7 @@ export const generateToolEnvironment = ({
baseUrl: string
}): Record<string, string> => {
const env: Record<string, string> = {}
const formattedBaseUrl = formatApiHost(baseUrl)
switch (tool) {
case codeTools.claudeCode:
@ -169,19 +171,19 @@ export const generateToolEnvironment = ({
case codeTools.qwenCode:
env.OPENAI_API_KEY = apiKey
env.OPENAI_BASE_URL = baseUrl
env.OPENAI_BASE_URL = formattedBaseUrl
env.OPENAI_MODEL = model.id
break
case codeTools.openaiCodex:
env.OPENAI_API_KEY = apiKey
env.OPENAI_BASE_URL = baseUrl
env.OPENAI_BASE_URL = formattedBaseUrl
env.OPENAI_MODEL = model.id
env.OPENAI_MODEL_PROVIDER = modelProvider.id
break
case codeTools.iFlowCli:
env.IFLOW_API_KEY = apiKey
env.IFLOW_BASE_URL = baseUrl
env.IFLOW_BASE_URL = formattedBaseUrl
env.IFLOW_MODEL_NAME = model.id
break
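For reference, the base-URL normalization these environment variables now go through (the URLs are example inputs; the outcomes match the unit tests earlier in this compare view):

import { formatApiHost } from '@renderer/utils/api'

formatApiHost('https://dashscope.aliyuncs.com/compatible-mode') // -> 'https://dashscope.aliyuncs.com/compatible-mode/v1'
formatApiHost('https://dashscope.aliyuncs.com/compatible-mode/') // -> trailing slash stripped, '/v1' appended
formatApiHost('https://api.openai.com/v1') // -> unchanged, '/v1' already present
formatApiHost('https://dashscope.aliyuncs.com/v2beta') // -> unchanged, existing version suffix kept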

View File

@ -1,4 +1,5 @@
import AssistantAvatar from '@renderer/components/Avatar/AssistantAvatar'
import type { DraggableVirtualListRef } from '@renderer/components/DraggableList'
import { DraggableVirtualList } from '@renderer/components/DraggableList'
import { CopyIcon, DeleteIcon, EditIcon } from '@renderer/components/Icons'
import ObsidianExportPopup from '@renderer/components/Popups/ObsidianExportPopup'
@ -85,6 +86,7 @@ export const Topics: React.FC<Props> = ({ assistant: _assistant, activeTopic, se
const [deletingTopicId, setDeletingTopicId] = useState<string | null>(null)
const deleteTimerRef = useRef<NodeJS.Timeout>(null)
const [editingTopicId, setEditingTopicId] = useState<string | null>(null)
const listRef = useRef<DraggableVirtualListRef>(null)
// Management mode state
const manageState = useTopicManageMode()
@ -168,10 +170,46 @@ export const Topics: React.FC<Props> = ({ assistant: _assistant, activeTopic, se
const onPinTopic = useCallback(
(topic: Topic) => {
let newIndex = 0
if (topic.pinned) {
// Unpin: move the topic to the top of the unpinned section
const pinnedTopics = assistant.topics.filter((t) => t.pinned)
const unpinnedTopics = assistant.topics.filter((t) => !t.pinned)
// Build the new order: other pinned topics + the unpinned topic (moved to the top) + remaining unpinned topics
const reorderedTopics = [
...pinnedTopics.filter((t) => t.id !== topic.id), // other pinned topics
topic, // the topic being unpinned, moved to the top of the unpinned section
...unpinnedTopics // remaining unpinned topics
]
newIndex = pinnedTopics.length - 1 // index of the last remaining pinned topic + 1 = index of the first unpinned topic
updateTopics(reorderedTopics)
} else {
// Pin: move the topic to the top of the pinned section
const pinnedTopics = assistant.topics.filter((t) => t.pinned)
const unpinnedTopics = assistant.topics.filter((t) => !t.pinned)
const reorderedTopics = [
topic, // the newly pinned topic goes to the top
...pinnedTopics, // existing pinned topics
...unpinnedTopics.filter((t) => t.id !== topic.id) // remaining unpinned topics (excluding this topic)
]
newIndex = 0
updateTopics(reorderedTopics)
}
const updatedTopic = { ...topic, pinned: !topic.pinned }
updateTopic(updatedTopic)
// Scroll to the topic position after a short delay (wait for rendering to complete)
setTimeout(() => {
listRef.current?.scrollToIndex(newIndex, { align: 'auto' })
}, 50)
},
[updateTopic]
[assistant.topics, updateTopic, updateTopics]
)
const onDeleteTopic = useCallback(
@ -529,6 +567,7 @@ export const Topics: React.FC<Props> = ({ assistant: _assistant, activeTopic, se
return (
<>
<DraggableVirtualList
ref={listRef}
className="topics-tab"
list={filteredTopics}
onUpdate={updateTopics}
@ -663,7 +702,7 @@ export const Topics: React.FC<Props> = ({ assistant: _assistant, activeTopic, se
</TopicPromptText>
)}
{showTopicTime && (
<TopicTime className="time">{dayjs(topic.createdAt).format('MM/DD HH:mm')}</TopicTime>
<TopicTime className="time">{dayjs(topic.createdAt).format('YYYY/MM/DD HH:mm')}</TopicTime>
)}
</TopicListItem>
</Dropdown>
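A worked example of the unpin reorder in onPinTopic above, using toy data: with pinned topics [A, B] and unpinned topics [C, D], unpinning B yields [A, B, C, D] and newIndex = 1, so B lands at the top of the unpinned section and the list scrolls to index 1.

type TopicLike = { id: string; pinned: boolean }
const topics: TopicLike[] = [
  { id: 'A', pinned: true },
  { id: 'B', pinned: true }, // the topic being unpinned
  { id: 'C', pinned: false },
  { id: 'D', pinned: false }
]
const topic = topics[1]
const pinnedTopics = topics.filter((t) => t.pinned)
const unpinnedTopics = topics.filter((t) => !t.pinned)
const reordered = [...pinnedTopics.filter((t) => t.id !== topic.id), topic, ...unpinnedTopics]
const newIndex = pinnedTopics.length - 1
console.log(reordered.map((t) => t.id), newIndex) // ['A', 'B', 'C', 'D'] 1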

View File

@ -15,7 +15,7 @@ import { runAsyncFunction } from '@renderer/utils'
import { UpgradeChannel } from '@shared/config/constant'
import { Avatar, Button, Progress, Radio, Row, Switch, Tag, Tooltip } from 'antd'
import { debounce } from 'lodash'
import { Bug, Building2, Github, Globe, Mail, Rss } from 'lucide-react'
import { Briefcase, Bug, Building2, Github, Globe, Mail, Rss } from 'lucide-react'
import { BadgeQuestionMark } from 'lucide-react'
import type { FC } from 'react'
import { useEffect, useState } from 'react'
@ -327,6 +327,16 @@ const AboutSettings: FC = () => {
<Button onClick={mailto}>{t('settings.about.contact.button')}</Button>
</SettingRow>
<SettingDivider />
<SettingRow>
<SettingRowTitle>
<Briefcase size={18} />
{t('settings.about.careers.title')}
</SettingRowTitle>
<Button onClick={() => onOpenWebsite('https://www.cherry-ai.com/careers')}>
{t('settings.about.careers.button')}
</Button>
</SettingRow>
<SettingDivider />
<SettingRow>
<SettingRowTitle>
<Bug size={18} />

View File

@ -113,9 +113,10 @@ class LoggerService {
* @param data - Additional data to log
*/
private processLog(level: LogLevel, message: string, data: any[]): void {
let windowSource = this.window
if (!this.window) {
console.error('[LoggerService] window source not initialized, please initialize window source first')
return
windowSource = 'UNKNOWN'
}
const currentLevel = LEVEL_MAP[level]
@ -164,7 +165,7 @@ class LoggerService {
if (currentLevel >= LEVEL_MAP[this.logToMainLevel] || forceLogToMain) {
const source: LogSourceWithContext = {
process: 'renderer',
window: this.window,
window: windowSource,
module: this.module
}

View File

@ -43,6 +43,8 @@ const initialState: AssistantsState = {
unifiedListOrder: []
}
const normalizeTopics = (topics: unknown): Topic[] => (Array.isArray(topics) ? topics : [])
const assistantsSlice = createSlice({
name: 'assistants',
initialState,
@ -127,7 +129,7 @@ const assistantsSlice = createSlice({
assistant.id === action.payload.assistantId
? {
...assistant,
topics: uniqBy([topic, ...assistant.topics], 'id')
topics: uniqBy([topic, ...normalizeTopics(assistant.topics)], 'id')
}
: assistant
)
@ -137,7 +139,7 @@ const assistantsSlice = createSlice({
assistant.id === action.payload.assistantId
? {
...assistant,
topics: assistant.topics.filter(({ id }) => id !== action.payload.topic.id)
topics: normalizeTopics(assistant.topics).filter(({ id }) => id !== action.payload.topic.id)
}
: assistant
)
@ -149,7 +151,7 @@ const assistantsSlice = createSlice({
assistant.id === action.payload.assistantId
? {
...assistant,
topics: assistant.topics.map((topic) => {
topics: normalizeTopics(assistant.topics).map((topic) => {
const _topic = topic.id === newTopic.id ? newTopic : topic
_topic.messages = []
return _topic
@ -173,7 +175,7 @@ const assistantsSlice = createSlice({
removeAllTopics: (state, action: PayloadAction<{ assistantId: string }>) => {
state.assistants = state.assistants.map((assistant) => {
if (assistant.id === action.payload.assistantId) {
assistant.topics.forEach((topic) => TopicManager.removeTopic(topic.id))
normalizeTopics(assistant.topics).forEach((topic) => TopicManager.removeTopic(topic.id))
return {
...assistant,
topics: [getDefaultTopic(assistant.id)]
@ -184,7 +186,7 @@ const assistantsSlice = createSlice({
},
updateTopicUpdatedAt: (state, action: PayloadAction<{ topicId: string }>) => {
outer: for (const assistant of state.assistants) {
for (const topic of assistant.topics) {
for (const topic of normalizeTopics(assistant.topics)) {
if (topic.id === action.payload.topicId) {
topic.updatedAt = new Date().toISOString()
break outer
@ -268,7 +270,7 @@ export const {
} = assistantsSlice.actions
export const selectAllTopics = createSelector([(state: RootState) => state.assistants.assistants], (assistants) =>
assistants.flatMap((assistant: Assistant) => assistant.topics)
assistants.flatMap((assistant: Assistant) => normalizeTopics(assistant.topics))
)
export const selectTopicsMap = createSelector([selectAllTopics], (topics) => {

View File

@ -123,7 +123,15 @@ export const McpServerConfigSchema = z
*
* 60
*/
timeout: z.number().optional().describe('Timeout in seconds for requests to this server'),
timeout: z
.preprocess((val) => {
if (typeof val === 'string' && val.trim() !== '') {
const parsed = Number(val)
return isNaN(parsed) ? val : parsed
}
return val
}, z.number().optional())
.describe('Timeout in seconds for requests to this server'),
/**
* DXT包版本号
* DXT包的版本