🐛 fix(chat): strip forkedFromIdentifier before LLM API request (#13142)


Fork & Chat stores forkedFromIdentifier in agent.params for DB lookup.
Spreading params into the chat payload forwarded it to the Responses API,
causing strict providers (e.g. AiHubMix) to reject the request.

Strip the field in getChatCompletion alongside the other non-API keys
already removed there.

Fixes lobehub/lobehub#13071

Made-with: Cursor
Author: Protocol Zero
Date: 2026-03-20 11:07:29 +08:00
Committed-by: GitHub
Parent: 1b909a74d7
Commit: bc8debe836


@@ -345,6 +345,8 @@ class ChatService {
const chatConfig = agentChatConfigSelectors.currentChatConfig(getAgentStoreState());
delete (res as any).scope;
// Fork flow stores market metadata in agent.params; must not reach OpenAI-compatible / Responses API
delete (res as any).forkedFromIdentifier;
const payload = merge(
{
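The pattern above can be sketched in isolation. This is a minimal, hypothetical illustration of stripping internal-only keys before a payload is built; the helper name, key list (beyond scope and forkedFromIdentifier), and sample params are assumptions, not the actual ChatService code:

```typescript
// Hypothetical sketch: remove internal-only bookkeeping keys from agent
// params so they never reach an OpenAI-compatible / Responses API payload.
// Only `scope` and `forkedFromIdentifier` appear in the real diff; the
// helper and sample fields below are illustrative.
const INTERNAL_KEYS = ['scope', 'forkedFromIdentifier'] as const;

function stripInternalKeys<T extends Record<string, unknown>>(params: T) {
  const res: Record<string, unknown> = { ...params };
  for (const key of INTERNAL_KEYS) delete res[key];
  return res;
}

// Usage: the forked-from marker is dropped before the request is sent.
const payload = stripInternalKeys({
  model: 'gpt-4o',
  temperature: 0.7,
  forkedFromIdentifier: 'market/agent-123',
});
console.log(payload); // { model: 'gpt-4o', temperature: 0.7 }
```

Deleting the keys on a shallow copy (rather than mutating the original params) keeps agent.params intact for later DB lookups while sanitizing only the outgoing request.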