diff --git a/FLASK_SETUP.md b/FLASK_SETUP.md
new file mode 100644
index 0000000..a9f909c
--- /dev/null
+++ b/FLASK_SETUP.md
@@ -0,0 +1,392 @@
+# Flask Server Setup Guide
+
+## 🐍 Python Flask Backend Server
+
+This project provides a **Python Flask** version of the backend server, running at `http://127.0.0.1:5002`
+
+## 📋 System Requirements
+
+- **Python**: 3.8 or later
+- **pip**: Python package manager
+- **MySQL**: database (already configured)
+
+## 🚀 Quick Start
+
+### Windows Users
+
+1. **Double-click the startup script**
+   ```bash
+   run_flask.bat
+   ```
+
+   The script automatically:
+   - ✅ Checks the Python installation
+   - ✅ Creates a virtual environment (first run only)
+   - ✅ Installs all dependencies
+   - ✅ Starts the Flask server
+
+2. **Or run the steps manually**
+   ```bash
+   # Create a virtual environment
+   python -m venv venv
+
+   # Activate the virtual environment
+   venv\Scripts\activate
+
+   # Install dependencies
+   pip install -r requirements.txt
+
+   # Start the server
+   python app.py
+   ```
+
+### Linux/Mac Users
+
+1. **Run the startup script**
+   ```bash
+   chmod +x run_flask.sh
+   ./run_flask.sh
+   ```
+
+2. **Or run the steps manually**
+   ```bash
+   # Create a virtual environment
+   python3 -m venv venv
+
+   # Activate the virtual environment
+   source venv/bin/activate
+
+   # Install dependencies
+   pip install -r requirements.txt
+
+   # Start the server
+   python3 app.py
+   ```
+
+## 📦 Installed Packages
+
+```
+Flask==3.0.0          # Web framework
+Flask-Cors==4.0.0     # CORS support
+PyMySQL==1.1.0        # MySQL connector
+requests==2.31.0      # HTTP requests
+python-dotenv==1.0.0  # Environment variable management
+```
+
+See [requirements.txt](requirements.txt) for the full list.
+
+## 🌐 Service Endpoints
+
+### Base Endpoints
+
+```
+GET  http://127.0.0.1:5002/
+     - API information
+
+GET  http://127.0.0.1:5002/health
+     - Health check
+
+GET  http://127.0.0.1:5002/api-proxy-example.html
+     - API usage example page
+```
+
+### Database API
+
+```
+POST http://127.0.0.1:5002/api/db/test
+     - Test the database connection
+
+GET  http://127.0.0.1:5002/api/db/tables
+     - List all tables
+```
+
+### LLM API
+
+```
+POST http://127.0.0.1:5002/api/llm/test/gemini
+     - Test the Gemini API
+
+POST http://127.0.0.1:5002/api/llm/test/deepseek
+     - Test the DeepSeek API
+
+POST http://127.0.0.1:5002/api/llm/test/openai
+     - Test the OpenAI API
+
+POST http://127.0.0.1:5002/api/llm/test/claude
+     - Test the Claude API
+
+POST http://127.0.0.1:5002/api/llm/test/all
+     - Test all LLMs
+
+POST http://127.0.0.1:5002/api/llm/generate
+     - Generate content
+```
+
+## 💡 Usage Examples
+
+### Python Example
+
+```python
+import requests
+
+# Test the database connection
+response = requests.post('http://127.0.0.1:5002/api/db/test')
+print(response.json())
+
+# Test the Claude API
+response = requests.post('http://127.0.0.1:5002/api/llm/test/claude')
+print(response.json())
+
+# Generate content
+response = requests.post(
+    'http://127.0.0.1:5002/api/llm/generate',
+    json={
+        'prompt': '介紹 HR 績效評核系統',
+        'provider': 'claude',
+        'options': {
+            'temperature': 0.7,
+            'maxTokens': 200
+        }
+    }
+)
+print(response.json())
+```
+
+### JavaScript Example
+
+```javascript
+// Test the database connection
+fetch('http://127.0.0.1:5002/api/db/test', {
+  method: 'POST'
+})
+  .then(res => res.json())
+  .then(data => console.log(data));
+
+// Generate content
+fetch('http://127.0.0.1:5002/api/llm/generate', {
+  method: 'POST',
+  headers: { 'Content-Type': 'application/json' },
+  body: JSON.stringify({
+    prompt: '介紹 HR 績效評核系統',
+    provider: 'claude',
+    options: {
+      temperature: 0.7,
+      maxTokens: 200
+    }
+  })
+})
+  .then(res => res.json())
+  .then(data => console.log(data));
+```
+
+### cURL Example
+
+```bash
+# Test the database connection
+curl -X POST http://127.0.0.1:5002/api/db/test
+
+# Test the Claude API
+curl -X POST http://127.0.0.1:5002/api/llm/test/claude
+
+# Generate content
+curl -X POST http://127.0.0.1:5002/api/llm/generate \
+  -H "Content-Type: application/json" \
+  -d '{
+    "prompt": "介紹 HR 績效評核系統",
+    "provider": "claude",
+    "options": {
+      "temperature": 0.7,
+      "maxTokens": 200
+    }
+  }'
+```
+
+## ⚙️ Environment Variables
+
+Make sure the [.env](.env) file contains the following settings:
+
+```env
+# Database settings
+DB_HOST=mysql.theaken.com
+DB_PORT=33306
+DB_NAME=db_A102
+DB_USER=A102
+DB_PASSWORD=Bb123456
+
+# LLM API keys
+GEMINI_API_KEY=your_gemini_api_key
+DEEPSEEK_API_KEY=your_deepseek_api_key
+OPENAI_API_KEY=your_openai_api_key
+CLAUDE_API_KEY=your_claude_api_key
+
+# Application settings
+NODE_ENV=development
+FRONTEND_URL=http://127.0.0.1:5002
+```
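+These values are loaded once at startup by `python-dotenv`. A minimal sketch of how app.py reads them (the fallback port and the missing-key check mirror the code in app.py; the printed warning is illustrative):
+
+```python
+import os
+from dotenv import load_dotenv
+
+# Copy key=value pairs from .env into the process environment
+load_dotenv()
+
+# Database settings consumed by get_db_connection()
+db_host = os.getenv('DB_HOST')
+db_port = int(os.getenv('DB_PORT', 3306))  # PyMySQL expects an int port
+
+# Unset keys come back as None, so the LLM endpoints check before calling out
+if not os.getenv('CLAUDE_API_KEY'):
+    print('CLAUDE_API_KEY is not set - Claude endpoints will report "not configured"')  # illustrative warning
+```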
+## 🔧 Development Mode
+
+In development mode Flask will:
+- ✅ Reload automatically when code changes
+- ✅ Show detailed error messages
+- ✅ Enable the interactive debugger
+
+Debug mode is tied to `NODE_ENV` (set to `development` in `.env`):
+
+```python
+# last lines of app.py
+app.run(
+    host='127.0.0.1',
+    port=5002,
+    debug=os.getenv('NODE_ENV') == 'development'  # debug on in development
+)
+```
+
+## 📊 Server Startup Output
+
+```
+============================================================
+🚀 HR Performance System API Server (Flask/Python)
+============================================================
+📡 Server running on: http://127.0.0.1:5002
+🌍 Environment: development
+📅 Started at: 2025-12-03 23:59:59
+============================================================
+
+📚 Available endpoints:
+   GET  /                       - API information
+   GET  /health                 - Health check
+   POST /api/db/test            - Test database connection
+   GET  /api/db/tables          - List all tables
+   POST /api/llm/test/*         - Test LLM connections
+   POST /api/llm/generate       - Generate content with LLM
+   GET  /api-proxy-example.html - API example page
+
+✨ Server is ready to accept connections!
+
+✅ Database connection: OK
+
+ * Serving Flask app 'app'
+ * Debug mode: on
+WARNING: This is a development server. Do not use it in production.
+Use a production WSGI server instead.
+ * Running on http://127.0.0.1:5002
+Press CTRL+C to quit
+ * Restarting with stat
+```
+
+## 🎯 Comparison with the Node.js Version
+
+| Feature | Flask (Python) | Express (Node.js) |
+|------|----------------|-------------------|
+| **Port** | 5002 | 3000 |
+| **Language** | Python 3.8+ | Node.js 16+ |
+| **Install** | `pip install -r requirements.txt` | `npm install` |
+| **Start** | `python app.py` | `npm start` |
+| **Strengths** | Python ecosystem, easy ML integration | Full-stack JavaScript, non-blocking I/O |
+
+Both versions expose the same API, so choose whichever fits your team's stack!
+
+## 🐛 FAQ
+
+### Q1: run_flask.bat fails with an error
+
+**A:** Confirm that:
+1. Python is installed and on the PATH
+2. `python --version` reports a supported version
+3. You have permission to create the virtual environment
+
+### Q2: pip install fails
+
+**A:** Try:
+```bash
+# Upgrade pip
+python -m pip install --upgrade pip
+
+# Use a local mirror (users in mainland China)
+pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
+```
+
+### Q3: Database connection fails
+
+**A:** Check that:
+1. The MySQL server is running
+2. The connection settings in `.env` are correct
+3. The firewall is not blocking port 33306
+
+### Q4: CORS errors
+
+**A:** Flask-CORS is already configured, but confirm:
+1. The frontend sends requests from the expected origin
+2. The `FRONTEND_URL` setting in `.env`
+
+### Q5: Activating the virtual environment fails
+
+**A:** Windows users may need to:
+```bash
+# Allow script execution (run PowerShell as administrator)
+Set-ExecutionPolicy RemoteSigned
+```
+
+## 📁 Project Structure
+
+```
+hr-performance-system/
+├── app.py                  # Flask main application ⭐
+├── requirements.txt        # Python dependencies ⭐
+├── run_flask.bat           # Windows startup script ⭐
+├── run_flask.sh            # Linux/Mac startup script ⭐
+├── server.js               # Node.js version (optional)
+├── package.json            # Node.js dependencies (optional)
+├── .env                    # Environment variables
+├── .gitignore
+├── venv/                   # Python virtual environment (auto-created)
+├── public/
+│   └── api-proxy-example.html
+├── config/
+├── services/
+├── routes/
+├── utils/
+└── database/
+```
+
+## 🔒 Security Notes
+
+- ⚠️ **Development server**: the built-in Flask server is for development only
+- ⚠️ **Production**: use a WSGI server such as Gunicorn or uWSGI
+- ⚠️ **API keys**: never commit `.env` to version control
+- ⚠️ **HTTPS**: production deployments must use HTTPS
+
+## 🚀 Production Deployment
+
+### Using Gunicorn
+
+```bash
+# Install Gunicorn
+pip install gunicorn
+
+# Start
+gunicorn -w 4 -b 127.0.0.1:5002 app:app
+```
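+Beyond a one-line command, Gunicorn can also read its settings from a config file. A minimal sketch, assuming the default config module name `gunicorn.conf.py` in the project root; the worker count and timeout are illustrative values, not tuned for this app:
+
+```python
+# gunicorn.conf.py - picked up automatically from the working directory
+bind = "127.0.0.1:5002"   # same address the development server uses
+workers = 4               # common rule of thumb: 2 * CPU cores + 1
+timeout = 60              # LLM requests may exceed Gunicorn's 30-second default
+accesslog = "-"           # "-" sends access logs to stdout
+errorlog = "-"            # "-" sends error logs to stderr
+```
+
+With this file in place, running `gunicorn app:app` starts the server with the settings above.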
+### Using Docker
+
+```dockerfile
+FROM python:3.11-slim
+WORKDIR /app
+COPY requirements.txt .
+RUN pip install -r requirements.txt
+COPY . .
+CMD ["gunicorn", "-w", "4", "-b", "0.0.0.0:5002", "app:app"]
+```
+
+## 📚 Related Documents
+
+- [README.md](README.md) - Project overview
+- [CORS_FIX_GUIDE.md](CORS_FIX_GUIDE.md) - Resolving CORS issues
+- [database/README.md](database/README.md) - Database documentation
+- [package.json](package.json) - Node.js version (optional)
+
+---
+
+**Last updated**: 2025-12-03
+**Python version**: 3.8+
+**Flask version**: 3.0.0
diff --git a/app.py b/app.py
new file mode 100644
index 0000000..34e17ff
--- /dev/null
+++ b/app.py
@@ -0,0 +1,498 @@
+"""
+Flask Application
+HR 績效評核系統 - Python Flask 後端伺服器
+運行於 127.0.0.1:5002
+"""
+
+import os
+import json
+from datetime import datetime
+from flask import Flask, request, jsonify, send_from_directory
+from flask_cors import CORS
+from dotenv import load_dotenv
+import pymysql
+import requests
+from functools import wraps
+
+# 載入環境變數
+load_dotenv()
+
+# 建立 Flask 應用
+app = Flask(__name__, static_folder='public', static_url_path='')
+
+# CORS 設定
+CORS(app, resources={
+    r"/api/*": {
+        "origins": os.getenv('FRONTEND_URL', '*'),
+        "methods": ["GET", "POST", "PUT", "DELETE", "PATCH"],
+        "allow_headers": ["Content-Type", "Authorization"]
+    }
+})
+
+# 應用配置
+# Flask 3.x removed the JSON_AS_ASCII / JSON_SORT_KEYS config keys; use app.json instead
+app.json.ensure_ascii = False  # 支援中文
+app.json.sort_keys = False
+app.config['MAX_CONTENT_LENGTH'] = int(os.getenv('MAX_FILE_SIZE', 5242880))  # 5MB
+
+# ============================================
+# 資料庫連線
+# ============================================
+
+def get_db_connection():
+    """建立資料庫連線"""
+    try:
+        connection = pymysql.connect(
+            host=os.getenv('DB_HOST'),
+            port=int(os.getenv('DB_PORT', 3306)),
+            user=os.getenv('DB_USER'),
+            password=os.getenv('DB_PASSWORD'),
+            database=os.getenv('DB_NAME'),
+            charset='utf8mb4',
+            cursorclass=pymysql.cursors.DictCursor
+        )
+        return connection
+    except Exception as e:
+        print(f"資料庫連線錯誤: {e}")
+        return None
+
+def test_db_connection():
+    """測試資料庫連線"""
+    try:
+        conn = get_db_connection()
+        if conn:
+            with conn.cursor() as cursor:
+                cursor.execute("SELECT 1")
+            conn.close()
+            return True
+        return False
+    except Exception:
+        return False
+
+# ============================================
+# LLM 服務整合
+# ============================================
+
+class LLMService:
+    """LLM 服務類別"""
+
+    @staticmethod
+    def get_config(provider):
+        """取得 LLM 配置"""
+        configs = {
+            'gemini': {
+                'api_key': os.getenv('GEMINI_API_KEY'),
+                'api_url': 'https://generativelanguage.googleapis.com/v1beta',
+                'model': os.getenv('GEMINI_MODEL', 'gemini-pro')
+            },
+            'deepseek': {
+                'api_key': os.getenv('DEEPSEEK_API_KEY'),
+                'api_url': os.getenv('DEEPSEEK_API_URL', 'https://api.deepseek.com/v1'),
+                'model': os.getenv('DEEPSEEK_MODEL', 'deepseek-chat')
+            },
+            'openai': {
+                'api_key': os.getenv('OPENAI_API_KEY'),
+                'api_url': os.getenv('OPENAI_API_URL', 'https://api.openai.com/v1'),
+                'model': os.getenv('OPENAI_MODEL', 'gpt-4')
+            },
+            'claude': {
+                'api_key': os.getenv('CLAUDE_API_KEY'),
+                'api_url': os.getenv('CLAUDE_API_URL', 'https://api.anthropic.com/v1'),
+                'model': os.getenv('CLAUDE_MODEL', 'claude-3-5-sonnet-20241022'),
+                'version': '2023-06-01'
+            }
+        }
+        return configs.get(provider)
+
+    @staticmethod
+    def test_gemini():
+        """測試 Gemini API"""
+        try:
+            config = LLMService.get_config('gemini')
+            if not config['api_key']:
+                return {'success': False, 'message': 'Gemini API key not configured', 'provider': 'gemini'}
+
+            url = f"{config['api_url']}/models/{config['model']}:generateContent"
+            response = requests.post(
+                url,
+                params={'key': config['api_key']},
+                json={'contents': [{'parts': [{'text': 'Hello'}]}]},
+                timeout=30
+            )
+
+            if response.status_code == 200:
+                return
{'success': True, 'message': 'Gemini API connection successful', 'provider': 'gemini', 'model': config['model']} + return {'success': False, 'message': f'HTTP {response.status_code}', 'provider': 'gemini'} + except Exception as e: + return {'success': False, 'message': str(e), 'provider': 'gemini'} + + @staticmethod + def test_deepseek(): + """測試 DeepSeek API""" + try: + config = LLMService.get_config('deepseek') + if not config['api_key']: + return {'success': False, 'message': 'DeepSeek API key not configured', 'provider': 'deepseek'} + + url = f"{config['api_url']}/chat/completions" + response = requests.post( + url, + headers={'Authorization': f"Bearer {config['api_key']}"}, + json={'model': config['model'], 'messages': [{'role': 'user', 'content': 'Hello'}], 'max_tokens': 50}, + timeout=30 + ) + + if response.status_code == 200: + return {'success': True, 'message': 'DeepSeek API connection successful', 'provider': 'deepseek', 'model': config['model']} + return {'success': False, 'message': f'HTTP {response.status_code}', 'provider': 'deepseek'} + except Exception as e: + return {'success': False, 'message': str(e), 'provider': 'deepseek'} + + @staticmethod + def test_openai(): + """測試 OpenAI API""" + try: + config = LLMService.get_config('openai') + if not config['api_key']: + return {'success': False, 'message': 'OpenAI API key not configured', 'provider': 'openai'} + + url = f"{config['api_url']}/chat/completions" + response = requests.post( + url, + headers={'Authorization': f"Bearer {config['api_key']}"}, + json={'model': config['model'], 'messages': [{'role': 'user', 'content': 'Hello'}], 'max_tokens': 50}, + timeout=30 + ) + + if response.status_code == 200: + return {'success': True, 'message': 'OpenAI API connection successful', 'provider': 'openai', 'model': config['model']} + return {'success': False, 'message': f'HTTP {response.status_code}', 'provider': 'openai'} + except Exception as e: + return {'success': False, 'message': str(e), 'provider': 'openai'} + + @staticmethod + def test_claude(): + """測試 Claude API""" + try: + config = LLMService.get_config('claude') + if not config['api_key']: + return {'success': False, 'message': 'Claude API key not configured', 'provider': 'claude'} + + url = f"{config['api_url']}/messages" + response = requests.post( + url, + headers={ + 'x-api-key': config['api_key'], + 'anthropic-version': config['version'], + 'content-type': 'application/json' + }, + json={ + 'model': config['model'], + 'max_tokens': 50, + 'messages': [{'role': 'user', 'content': 'Hello'}] + }, + timeout=30 + ) + + if response.status_code == 200: + return {'success': True, 'message': 'Claude API connection successful', 'provider': 'claude', 'model': config['model']} + return {'success': False, 'message': f'HTTP {response.status_code}', 'provider': 'claude'} + except Exception as e: + return {'success': False, 'message': str(e), 'provider': 'claude'} + + @staticmethod + def generate_content(prompt, provider='claude', options=None): + """使用指定的 LLM 生成內容""" + if options is None: + options = {} + + config = LLMService.get_config(provider) + if not config or not config['api_key']: + raise Exception(f'{provider} API not configured') + + try: + if provider == 'claude': + url = f"{config['api_url']}/messages" + response = requests.post( + url, + headers={ + 'x-api-key': config['api_key'], + 'anthropic-version': config['version'], + 'content-type': 'application/json' + }, + json={ + 'model': config['model'], + 'max_tokens': options.get('maxTokens', 2000), + 'temperature': 
options.get('temperature', 0.7), + 'messages': [{'role': 'user', 'content': prompt}] + }, + timeout=30 + ) + + if response.status_code == 200: + data = response.json() + return {'success': True, 'content': data['content'][0]['text'], 'provider': provider} + + elif provider == 'gemini': + url = f"{config['api_url']}/models/{config['model']}:generateContent" + response = requests.post( + url, + params={'key': config['api_key']}, + json={ + 'contents': [{'parts': [{'text': prompt}]}], + 'generationConfig': { + 'temperature': options.get('temperature', 0.7), + 'maxOutputTokens': options.get('maxTokens', 2000) + } + }, + timeout=30 + ) + + if response.status_code == 200: + data = response.json() + return {'success': True, 'content': data['candidates'][0]['content']['parts'][0]['text'], 'provider': provider} + + elif provider in ['deepseek', 'openai']: + url = f"{config['api_url']}/chat/completions" + response = requests.post( + url, + headers={'Authorization': f"Bearer {config['api_key']}"}, + json={ + 'model': config['model'], + 'messages': [{'role': 'user', 'content': prompt}], + 'temperature': options.get('temperature', 0.7), + 'max_tokens': options.get('maxTokens', 2000) + }, + timeout=30 + ) + + if response.status_code == 200: + data = response.json() + return {'success': True, 'content': data['choices'][0]['message']['content'], 'provider': provider} + + raise Exception(f'Failed to generate content: HTTP {response.status_code}') + + except Exception as e: + raise Exception(f'{provider} API error: {str(e)}') + +# ============================================ +# 錯誤處理 +# ============================================ + +def handle_error(error, status_code=500): + """統一錯誤處理""" + return jsonify({ + 'success': False, + 'error': { + 'statusCode': status_code, + 'message': str(error), + 'timestamp': datetime.now().isoformat(), + 'path': request.path + } + }), status_code + +@app.errorhandler(404) +def not_found(error): + """404 錯誤處理""" + return jsonify({ + 'success': False, + 'error': { + 'statusCode': 404, + 'message': f'Cannot {request.method} {request.path}', + 'timestamp': datetime.now().isoformat(), + 'path': request.path + } + }), 404 + +@app.errorhandler(500) +def internal_error(error): + """500 錯誤處理""" + return handle_error(error, 500) + +# ============================================ +# 路由 +# ============================================ + +@app.route('/') +def index(): + """根路由""" + return jsonify({ + 'name': 'HR Performance System API', + 'version': '1.0.0', + 'description': '四卡循環績效管理系統 (Python Flask)', + 'server': 'Flask/Python', + 'endpoints': { + 'health': '/health', + 'database': '/api/db/test', + 'llm': '/api/llm', + 'example': '/api-proxy-example.html' + } + }) + +@app.route('/health') +def health(): + """健康檢查""" + db_status = test_db_connection() + return jsonify({ + 'success': True, + 'message': 'HR Performance System API is running', + 'timestamp': datetime.now().isoformat(), + 'environment': os.getenv('NODE_ENV', 'development'), + 'database': 'connected' if db_status else 'disconnected', + 'server': 'Flask/Python' + }) + +# ============================================ +# 資料庫 API +# ============================================ + +@app.route('/api/db/test', methods=['POST']) +def test_database(): + """測試資料庫連線""" + try: + conn = get_db_connection() + if not conn: + return handle_error('無法連接到資料庫', 500) + + with conn.cursor() as cursor: + cursor.execute("SELECT VERSION() as version") + result = cursor.fetchone() + + cursor.execute("SELECT DATABASE() as database_name") + db_info = 
cursor.fetchone() + + conn.close() + + return jsonify({ + 'success': True, + 'message': '資料庫連線成功', + 'database': db_info['database_name'], + 'version': result['version'] + }) + except Exception as e: + return handle_error(e, 500) + +@app.route('/api/db/tables', methods=['GET']) +def list_tables(): + """列出所有資料表""" + try: + conn = get_db_connection() + if not conn: + return handle_error('無法連接到資料庫', 500) + + with conn.cursor() as cursor: + cursor.execute("SHOW TABLES") + tables = [list(row.values())[0] for row in cursor.fetchall()] + + conn.close() + + return jsonify({ + 'success': True, + 'count': len(tables), + 'tables': tables + }) + except Exception as e: + return handle_error(e, 500) + +# ============================================ +# LLM API +# ============================================ + +@app.route('/api/llm/test/gemini', methods=['POST']) +def test_llm_gemini(): + """測試 Gemini API""" + result = LLMService.test_gemini() + return jsonify(result) + +@app.route('/api/llm/test/deepseek', methods=['POST']) +def test_llm_deepseek(): + """測試 DeepSeek API""" + result = LLMService.test_deepseek() + return jsonify(result) + +@app.route('/api/llm/test/openai', methods=['POST']) +def test_llm_openai(): + """測試 OpenAI API""" + result = LLMService.test_openai() + return jsonify(result) + +@app.route('/api/llm/test/claude', methods=['POST']) +def test_llm_claude(): + """測試 Claude API""" + result = LLMService.test_claude() + return jsonify(result) + +@app.route('/api/llm/test/all', methods=['POST']) +def test_llm_all(): + """測試所有 LLM API""" + results = { + 'gemini': LLMService.test_gemini(), + 'deepseek': LLMService.test_deepseek(), + 'openai': LLMService.test_openai(), + 'claude': LLMService.test_claude() + } + return jsonify(results) + +@app.route('/api/llm/generate', methods=['POST']) +def generate_content(): + """使用 LLM 生成內容""" + try: + data = request.get_json() + + if not data or 'prompt' not in data: + return handle_error('缺少必要參數: prompt', 400) + + prompt = data['prompt'] + provider = data.get('provider', 'claude') + options = data.get('options', {}) + + result = LLMService.generate_content(prompt, provider, options) + return jsonify(result) + + except Exception as e: + return handle_error(e, 500) + +# ============================================ +# 靜態檔案 +# ============================================ + +@app.route('/api-proxy-example.html') +def api_example(): + """API 範例頁面""" + return send_from_directory('public', 'api-proxy-example.html') + +# ============================================ +# 啟動伺服器 +# ============================================ + +if __name__ == '__main__': + print('=' * 60) + print('🚀 HR Performance System API Server (Flask/Python)') + print('=' * 60) + print(f'📡 Server running on: http://127.0.0.1:5002') + print(f'🌍 Environment: {os.getenv("NODE_ENV", "development")}') + print(f'📅 Started at: {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}') + print('=' * 60) + print('\n📚 Available endpoints:') + print(' GET / - API information') + print(' GET /health - Health check') + print(' POST /api/db/test - Test database connection') + print(' GET /api/db/tables - List all tables') + print(' POST /api/llm/test/* - Test LLM connections') + print(' POST /api/llm/generate - Generate content with LLM') + print(' GET /api-proxy-example.html - API example page') + print('\n✨ Server is ready to accept connections!\n') + + # 測試資料庫連線 + if test_db_connection(): + print('✅ Database connection: OK') + else: + print('⚠️ Database connection: FAILED') + + print('') + + # 啟動伺服器 + app.run( + 
host='127.0.0.1', + port=5002, + debug=os.getenv('NODE_ENV') == 'development' + ) diff --git a/requirements.txt b/requirements.txt new file mode 100644 index 0000000..0aca917 --- /dev/null +++ b/requirements.txt @@ -0,0 +1,27 @@ +# Flask 核心 +Flask==3.0.0 +Werkzeug==3.0.1 + +# CORS 支援 +Flask-Cors==4.0.0 + +# 資料庫 +PyMySQL==1.1.0 +cryptography==41.0.7 + +# HTTP 請求 +requests==2.31.0 + +# 環境變數 +python-dotenv==1.0.0 + +# JSON 處理 +jsonschema==4.20.0 + +# 日期時間 +python-dateutil==2.8.2 + +# 開發工具(可選) +# pytest==7.4.3 +# black==23.12.1 +# flake8==6.1.0 diff --git a/run_flask.bat b/run_flask.bat new file mode 100644 index 0000000..d870269 --- /dev/null +++ b/run_flask.bat @@ -0,0 +1,59 @@ +@echo off +REM Flask 伺服器啟動腳本 + +echo ======================================== +echo HR Performance System - Flask Server +echo ======================================== +echo. + +REM 檢查 Python 是否安裝 +python --version >nul 2>&1 +if errorlevel 1 ( + echo [ERROR] Python is not installed or not in PATH + echo Please install Python 3.8 or higher + pause + exit /b 1 +) + +REM 檢查虛擬環境 +if not exist "venv\" ( + echo [INFO] Creating virtual environment... + python -m venv venv + if errorlevel 1 ( + echo [ERROR] Failed to create virtual environment + pause + exit /b 1 + ) + echo [SUCCESS] Virtual environment created + echo. +) + +REM 啟動虛擬環境 +echo [INFO] Activating virtual environment... +call venv\Scripts\activate.bat + +REM 安裝依賴 +echo [INFO] Installing dependencies... +pip install -r requirements.txt +if errorlevel 1 ( + echo [ERROR] Failed to install dependencies + pause + exit /b 1 +) +echo. + +REM 檢查 .env 檔案 +if not exist ".env" ( + echo [WARNING] .env file not found + echo Please create .env file with required configuration + echo. +) + +REM 啟動 Flask 伺服器 +echo [INFO] Starting Flask server... +echo Server will run on http://127.0.0.1:5002 +echo Press Ctrl+C to stop the server +echo. +python app.py + +pause diff --git a/run_flask.sh b/run_flask.sh new file mode 100644 index 0000000..24f2e83 --- /dev/null +++ b/run_flask.sh @@ -0,0 +1,54 @@ +#!/bin/bash + +# Flask 伺服器啟動腳本 + +echo "========================================" +echo "HR Performance System - Flask Server" +echo "========================================" +echo "" + +# 檢查 Python 是否安裝 +if ! command -v python3 &> /dev/null; then + echo "[ERROR] Python 3 is not installed" + echo "Please install Python 3.8 or higher" + exit 1 +fi + +# 檢查虛擬環境 +if [ ! -d "venv" ]; then + echo "[INFO] Creating virtual environment..." + python3 -m venv venv + if [ $? -ne 0 ]; then + echo "[ERROR] Failed to create virtual environment" + exit 1 + fi + echo "[SUCCESS] Virtual environment created" + echo "" +fi + +# 啟動虛擬環境 +echo "[INFO] Activating virtual environment..." +source venv/bin/activate + +# 安裝依賴 +echo "[INFO] Installing dependencies..." +pip install -r requirements.txt +if [ $? -ne 0 ]; then + echo "[ERROR] Failed to install dependencies" + exit 1 +fi +echo "" + +# 檢查 .env 檔案 +if [ ! -f ".env" ]; then + echo "[WARNING] .env file not found" + echo "Please create .env file with required configuration" + echo "" +fi + +# 啟動 Flask 伺服器 +echo "[INFO] Starting Flask server..." +echo "Server will run on http://127.0.0.1:5002" +echo "Press Ctrl+C to stop the server" +echo "" +python3 app.py