feat: Add multi-LLM provider support with DeepSeek integration
Major Features:
- ✨ Multi-LLM provider support (DeepSeek, Ollama, OpenAI, Custom)
- 🤖 Admin panel LLM configuration management UI
- 🔄 Dynamic provider switching without restart
- 🧪 Built-in API connection testing
- 🔒 Secure API key management

Backend Changes:
- Add routes/llmConfig.js: Complete LLM config CRUD API
- Update routes/analyze.js: Use database LLM configuration
- Update server.js: Add LLM config routes
- Add scripts/add-deepseek-config.js: DeepSeek setup script

Frontend Changes:
- Update src/pages/AdminPage.jsx: Add LLM Config tab + modal
- Update src/services/api.js: Add LLM config API methods
- Provider presets for DeepSeek, Ollama, OpenAI
- Test connection feature in config modal

Configuration:
- Update .env.example: Add DeepSeek API configuration
- Update package.json: Add llm:add-deepseek script

Documentation:
- Add docs/LLM_CONFIGURATION_GUIDE.md: Complete guide
- Add DEEPSEEK_INTEGRATION.md: Integration summary
- Quick setup instructions for DeepSeek

API Endpoints:
- GET /api/llm-config: List all configurations
- GET /api/llm-config/active: Get active configuration
- POST /api/llm-config: Create configuration
- PUT /api/llm-config/:id: Update configuration
- PUT /api/llm-config/:id/activate: Activate configuration
- DELETE /api/llm-config/:id: Delete configuration
- POST /api/llm-config/test: Test API connection

Database:
- Uses existing llm_configs table
- Only one config active at a time
- Fallback to Ollama if no database config

Security:
- Admin-only access to LLM configuration
- API keys never returned in GET requests
- Audit logging for all config changes
- Cannot delete active configuration

DeepSeek Model:
- Model: deepseek-chat
- High-quality 5 Why analysis
- Excellent Chinese language support
- Cost-effective pricing

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
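The endpoints listed above are plain JSON CRUD routes. As a rough illustration only (not taken from routes/llmConfig.js), the sketch below shows how an admin script might test a DeepSeek key, create the configuration, and activate it. The base URL, request field names (provider, api_url, api_key, model), and cookie-based admin auth are assumptions; see routes/llmConfig.js and src/services/api.js for the actual shapes.

// Illustrative only - field names, base URL, and cookie auth are assumptions,
// not copied from routes/llmConfig.js. Requires Node 18+ for global fetch.
const BASE = 'http://localhost:3001/api/llm-config';

async function setUpDeepSeek(adminCookie) {
  const headers = { 'Content-Type': 'application/json', Cookie: adminCookie };
  const config = {
    provider: 'deepseek',                  // assumed field name
    api_url: 'https://api.deepseek.com',
    api_key: process.env.DEEPSEEK_API_KEY,
    model: 'deepseek-chat',
  };

  // POST /api/llm-config/test - verify the key and URL before saving
  const test = await fetch(`${BASE}/test`, {
    method: 'POST', headers, body: JSON.stringify(config),
  });
  if (!test.ok) throw new Error(`Connection test failed: ${test.status}`);

  // POST /api/llm-config - create the configuration
  const created = await (await fetch(BASE, {
    method: 'POST', headers, body: JSON.stringify(config),
  })).json();

  // PUT /api/llm-config/:id/activate - make it the single active config
  await fetch(`${BASE}/${created.id}/activate`, { method: 'PUT', headers });
}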
Changed file: .env.example (11 lines changed)
@@ -10,13 +10,18 @@ SERVER_HOST=localhost
 SERVER_PORT=3001
 CLIENT_PORT=5173
 
-# Ollama API Configuration
+# Ollama API Configuration (Fallback if no database config)
 OLLAMA_API_URL=https://ollama_pjapi.theaken.com
 OLLAMA_MODEL=qwen2.5:3b
 
-# LLM API Keys (Optional - for admin configuration)
-GEMINI_API_KEY=
+# DeepSeek API Configuration (Recommended)
+# Get your API key from: https://platform.deepseek.com/
+DEEPSEEK_API_URL=https://api.deepseek.com
+DEEPSEEK_API_KEY=
+DEEPSEEK_MODEL=deepseek-chat
+
+# Other LLM API Keys (Optional - for admin configuration)
+GEMINI_API_KEY=
 OPENAI_API_KEY=
 
 # Session Secret (Generate a random string)
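The Ollama variables above remain the fallback path described in the commit message ("Fallback to Ollama if no database config"). A minimal sketch of how such a resolver could look, assuming a hypothetical injected loadActiveConfig() helper and assumed llm_configs column names (provider, api_url, api_key, model); the real logic lives in routes/analyze.js.

// Sketch of the database-first / env-fallback resolution described above.
// `loadActiveConfig` is a hypothetical async function that returns the
// active row from the llm_configs table, or null if none is active.
async function resolveLlmConfig(loadActiveConfig) {
  const active = await loadActiveConfig();
  if (active) {
    // A database configuration wins when one row is marked active
    return {
      provider: active.provider,
      apiUrl: active.api_url,
      apiKey: active.api_key,
      model: active.model,
    };
  }
  // No active row: fall back to the Ollama settings from .env
  return {
    provider: 'ollama',
    apiUrl: process.env.OLLAMA_API_URL || 'https://ollama_pjapi.theaken.com',
    apiKey: null, // the Ollama endpoint needs no key here
    model: process.env.OLLAMA_MODEL || 'qwen2.5:3b',
  };
}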