♻️ refactor: optimize lobehub models and default configuration (#11621)

* ♻️ refactor: use COPYRIGHT constant from branding config

* ♻️ refactor: optimize default model configuration

- Switch default provider from OpenAI to Anthropic
- Add DEFAULT_MINI_MODEL and DEFAULT_MINI_PROVIDER for lightweight tasks
- Use mini model for system agent tasks: generationTopic, topic, translation, queryRewrite
- Use mini model for memory extraction agents
- Reorder provider list: Anthropic/Google first, local solutions last
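The bullets above can be sketched as a small configuration module. Only `DEFAULT_MINI_MODEL` and `DEFAULT_MINI_PROVIDER` are named in the commit; every other identifier and the concrete model IDs here are illustrative assumptions, not the repo's actual code.

```typescript
// Hypothetical sketch of the default-model constants described above.
// The model IDs and DEFAULT_PROVIDER name are assumptions.
export const DEFAULT_PROVIDER = 'anthropic'; // was 'openai'

export const DEFAULT_MINI_MODEL = 'claude-3-5-haiku-latest';
export const DEFAULT_MINI_PROVIDER = 'anthropic';

// Lightweight system-agent tasks routed to the mini model,
// as listed in the commit message:
export const systemAgentTasks = [
  'generationTopic',
  'topic',
  'translation',
  'queryRewrite',
] as const;

// Build a { task: { model, provider } } map for the system agents.
export const systemAgentConfig = Object.fromEntries(
  systemAgentTasks.map((task) => [
    task,
    { model: DEFAULT_MINI_MODEL, provider: DEFAULT_MINI_PROVIDER },
  ]),
);
```

Routing short, high-frequency tasks (topic naming, translation, query rewriting) to a cheaper mini model while keeping the full model for chat is a common cost/latency trade-off.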

* ♻️ refactor: split lobehub models into organized folder structure

- Split large lobehub.ts (1316 lines) into lobehub/ folder
- Organize chat models by provider (openai, anthropic, google, etc.)
- Separate image models and utils into dedicated files
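The split above can be sketched as per-provider modules merged by a barrel file, so the old single-array export of `lobehub.ts` keeps working. All file names, identifiers, the `ChatModelCard` shape, and the sample model IDs are assumptions for illustration; in the real refactor each provider array would live in its own file under `lobehub/`.

```typescript
// Minimal sketch: per-provider model lists (in the real layout these would be
// lobehub/anthropic.ts, lobehub/google.ts, lobehub/ollama.ts, ...), merged in
// the new ordering: Anthropic/Google first, local solutions last.
interface ChatModelCard {
  id: string;
  provider: string;
}

const anthropicChatModels: ChatModelCard[] = [
  { id: 'claude-sonnet-4', provider: 'anthropic' }, // illustrative ID
];
const googleChatModels: ChatModelCard[] = [
  { id: 'gemini-2.5-pro', provider: 'google' }, // illustrative ID
];
const ollamaChatModels: ChatModelCard[] = [
  { id: 'llama3', provider: 'ollama' }, // local solution, ordered last
];

// lobehub/index.ts: a barrel preserving the old single-array export.
export const lobehubChatModels: ChatModelCard[] = [
  ...anthropicChatModels,
  ...googleChatModels,
  ...ollamaChatModels,
];
```

A barrel like this lets callers keep importing from `'lobehub'` unchanged while the 1316-line file is broken into maintainable per-provider modules.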

* ♻️ refactor: use SOCIAL_URL constant and fix button alignment in auth-error page

* feat: add MiniMax M2.1 and M2.1 Lightning models

* ♻️ refactor: remove 'enabled' property from image model configurations in lobehub

* ♻️ refactor: add COPYRIGHT_FULL constant and fix Discord icon visibility

* test: update snapshots for default provider changes

* test: fix snapshot provider values for CI environment

* 🐛 fix(e2e): intercept all LLM providers in mock instead of only OpenAI

The default provider was changed from openai to anthropic, but the LLM mock
only intercepted /webapi/chat/openai requests. Now it intercepts all providers.
Author: YuTengjing
Date: 2026-01-20 20:08:54 +08:00
Committed by: GitHub
Parent: 2fffe8b6ee
Commit: 5074fbef2c
24 changed files with 1441 additions and 1366 deletions


@@ -146,12 +146,12 @@ export class LLMMockManager {
       return;
     }

-    // Intercept OpenAI chat API requests
-    await page.route('**/webapi/chat/openai**', async (route) => {
+    // Intercept all LLM chat API requests (openai, anthropic, etc.)
+    await page.route('**/webapi/chat/**', async (route) => {
       await this.handleChatRequest(route);
     });

-    console.log(' ✓ LLM mocks registered (openai)');
+    console.log(' ✓ LLM mocks registered (all providers)');
   }

   /**