feat: consolidate env config and add deployment files
- Add debug_font_path, demo_docs_dir, e2e_api_base_url to config.py - Fix hardcoded paths in pp_structure_debug.py, create_demo_images.py - Fix hardcoded paths in test files - Update .env.example with new configuration options - Update .gitignore to exclude AI development files (.claude/, openspec/, AGENTS.md, CLAUDE.md) - Add production startup script (start-prod.sh) - Add README.md with project documentation - Add 1panel Docker deployment files (docker-compose.yml, Dockerfiles, nginx.conf) 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
This commit is contained in:
13
.env.example
13
.env.example
@@ -10,7 +10,9 @@ MYSQL_DATABASE=your-database
|
|||||||
|
|
||||||
# ===== Application Configuration =====
|
# ===== Application Configuration =====
|
||||||
# Server ports
|
# Server ports
|
||||||
|
BACKEND_HOST=0.0.0.0
|
||||||
BACKEND_PORT=8000
|
BACKEND_PORT=8000
|
||||||
|
FRONTEND_HOST=0.0.0.0
|
||||||
FRONTEND_PORT=5173
|
FRONTEND_PORT=5173
|
||||||
|
|
||||||
# Security (generate a random string for production)
|
# Security (generate a random string for production)
|
||||||
@@ -91,3 +93,14 @@ CORS_ORIGINS=http://localhost:5173,http://127.0.0.1:5173
|
|||||||
# ===== Logging Configuration =====
|
# ===== Logging Configuration =====
|
||||||
LOG_LEVEL=INFO
|
LOG_LEVEL=INFO
|
||||||
LOG_FILE=./logs/app.log
|
LOG_FILE=./logs/app.log
|
||||||
|
|
||||||
|
# ===== Development & Testing Configuration =====
|
||||||
|
# Debug font path for visualization scripts
|
||||||
|
DEBUG_FONT_PATH=/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf
|
||||||
|
# Demo documents directory for testing
|
||||||
|
DEMO_DOCS_DIR=./demo_docs
|
||||||
|
# E2E test API base URL
|
||||||
|
E2E_API_BASE_URL=http://localhost:8000/api/v2
|
||||||
|
# E2E test credentials (set in .env.local for security)
|
||||||
|
# E2E_TEST_USER_EMAIL=test@example.com
|
||||||
|
# E2E_TEST_USER_PASSWORD=testpassword
|
||||||
|
|||||||
14
.gitignore
vendored
14
.gitignore
vendored
@@ -38,14 +38,26 @@ env/
|
|||||||
*.swo
|
*.swo
|
||||||
*~
|
*~
|
||||||
.DS_Store
|
.DS_Store
|
||||||
.claude/settings.local.json
|
|
||||||
|
# ===== AI Development Assistant Files =====
|
||||||
|
# Claude Code configuration and settings
|
||||||
|
.claude/
|
||||||
|
# OpenSpec change management (proposals, specs, archives)
|
||||||
|
openspec/
|
||||||
|
# AI agent instructions
|
||||||
|
AGENTS.md
|
||||||
|
CLAUDE.md
|
||||||
|
|
||||||
# ===== Environment Variables =====
|
# ===== Environment Variables =====
|
||||||
|
# Local environment files (contain secrets, never commit)
|
||||||
.env.local
|
.env.local
|
||||||
.env.*.local
|
.env.*.local
|
||||||
|
frontend/.env.local
|
||||||
|
frontend/.env.*.local
|
||||||
|
|
||||||
# ===== Process ID Files =====
|
# ===== Process ID Files =====
|
||||||
.pid/
|
.pid/
|
||||||
|
.pid-prod/
|
||||||
|
|
||||||
# ===== Logs =====
|
# ===== Logs =====
|
||||||
logs/
|
logs/
|
||||||
|
|||||||
161
README.md
Normal file
161
README.md
Normal file
@@ -0,0 +1,161 @@
|
|||||||
|
# Tool_OCR
|
||||||
|
|
||||||
|
智能文檔 OCR 處理系統,支援 PDF/圖片識別、版面分析、表格提取及翻譯功能。
|
||||||
|
|
||||||
|
## 功能特點
|
||||||
|
|
||||||
|
- **多格式支援**:PDF、PNG、JPG、BMP、TIFF、DOC/DOCX、PPT/PPTX
|
||||||
|
- **版面保留 OCR**:使用 PP-StructureV3 進行版面分析,保留原始文檔結構
|
||||||
|
- **表格識別**:自動識別表格結構並提取內容
|
||||||
|
- **多語言支援**:中文(簡/繁)、英文、日文、韓文
|
||||||
|
- **翻譯功能**:整合 DIFY API 進行文檔翻譯
|
||||||
|
- **批次處理**:支援多檔案同時上傳處理
|
||||||
|
- **多種匯出格式**:TXT、JSON、Markdown、Excel、PDF
|
||||||
|
|
||||||
|
## 系統需求
|
||||||
|
|
||||||
|
- Python 3.10+
|
||||||
|
- Node.js 18+
|
||||||
|
- MySQL 8.0+
|
||||||
|
- CUDA 11.8+(GPU 加速,可選)
|
||||||
|
|
||||||
|
## 快速開始
|
||||||
|
|
||||||
|
### 1. 環境設置
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 執行開發環境設置腳本
|
||||||
|
./setup_dev_env.sh
|
||||||
|
```
|
||||||
|
|
||||||
|
此腳本會自動:
|
||||||
|
- 檢測 GPU 並安裝對應的 PyTorch/PaddlePaddle
|
||||||
|
- 建立 Python 虛擬環境並安裝依賴
|
||||||
|
- 安裝 Node.js 及前端依賴
|
||||||
|
- 執行資料庫遷移
|
||||||
|
|
||||||
|
### 2. 配置環境變數
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 複製範本
|
||||||
|
cp .env.example .env.local
|
||||||
|
|
||||||
|
# 編輯配置
|
||||||
|
nano .env.local
|
||||||
|
```
|
||||||
|
|
||||||
|
必要配置項:
|
||||||
|
- `MYSQL_HOST`、`MYSQL_PORT`、`MYSQL_USER`、`MYSQL_PASSWORD`、`MYSQL_DATABASE`
|
||||||
|
- `SECRET_KEY`(生產環境請使用隨機字串)
|
||||||
|
- `DIFY_BASE_URL`、`DIFY_API_KEY`(翻譯功能)
|
||||||
|
|
||||||
|
### 3. 啟動服務
|
||||||
|
|
||||||
|
**開發環境**:
|
||||||
|
```bash
|
||||||
|
./start.sh # 啟動全部服務
|
||||||
|
./start.sh backend # 只啟動後端
|
||||||
|
./start.sh frontend # 只啟動前端
|
||||||
|
./start.sh --stop # 停止服務
|
||||||
|
./start.sh --status # 查看狀態
|
||||||
|
```
|
||||||
|
|
||||||
|
**生產環境**:
|
||||||
|
```bash
|
||||||
|
./start-prod.sh # 啟動全部服務(多 worker)
|
||||||
|
./start-prod.sh --stop # 停止服務
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. 存取服務
|
||||||
|
|
||||||
|
- 前端介面:http://localhost:5173(開發)/ http://localhost:12010(生產)
|
||||||
|
- API 文件:http://localhost:8000/docs
|
||||||
|
- 健康檢查:http://localhost:8000/health
|
||||||
|
|
||||||
|
## 專案結構
|
||||||
|
|
||||||
|
```
|
||||||
|
Tool_OCR/
|
||||||
|
├── backend/ # 後端 FastAPI 應用
|
||||||
|
│ ├── app/
|
||||||
|
│ │ ├── api/ # API 路由
|
||||||
|
│ │ ├── core/ # 核心配置
|
||||||
|
│ │ ├── models/ # 資料模型
|
||||||
|
│ │ ├── schemas/ # Pydantic schemas
|
||||||
|
│ │ └── services/ # 業務邏輯
|
||||||
|
│ ├── tests/ # 測試檔案
|
||||||
|
│ └── alembic/ # 資料庫遷移
|
||||||
|
├── frontend/ # 前端 React 應用
|
||||||
|
│ ├── src/
|
||||||
|
│ │ ├── components/ # React 元件
|
||||||
|
│ │ ├── pages/ # 頁面元件
|
||||||
|
│ │ ├── services/ # API 服務
|
||||||
|
│ │ └── i18n/ # 國際化
|
||||||
|
│ └── public/ # 靜態資源
|
||||||
|
├── .env.example # 環境變數範本
|
||||||
|
├── start.sh # 開發環境啟動腳本
|
||||||
|
├── start-prod.sh # 生產環境啟動腳本
|
||||||
|
└── setup_dev_env.sh # 開發環境設置腳本
|
||||||
|
```
|
||||||
|
|
||||||
|
## 環境變數說明
|
||||||
|
|
||||||
|
| 變數 | 說明 | 預設值 |
|
||||||
|
|------|------|--------|
|
||||||
|
| `BACKEND_PORT` | 後端服務埠號 | 8000 |
|
||||||
|
| `FRONTEND_PORT` | 前端服務埠號 | 5173(開發)/ 12010(生產) |
|
||||||
|
| `MYSQL_*` | 資料庫連線設定 | - |
|
||||||
|
| `SECRET_KEY` | JWT 簽名金鑰 | - |
|
||||||
|
| `DIFY_BASE_URL` | DIFY API 位址 | - |
|
||||||
|
| `DIFY_API_KEY` | DIFY API 金鑰 | - |
|
||||||
|
| `LOG_LEVEL` | 日誌等級 | INFO |
|
||||||
|
|
||||||
|
完整配置請參考 `.env.example`。
|
||||||
|
|
||||||
|
## API 文件
|
||||||
|
|
||||||
|
啟動後端服務後,存取 http://localhost:8000/docs 查看 Swagger API 文件。
|
||||||
|
|
||||||
|
主要端點:
|
||||||
|
- `POST /api/v2/upload/` - 上傳檔案
|
||||||
|
- `POST /api/v2/tasks/{task_id}/start` - 開始處理
|
||||||
|
- `GET /api/v2/tasks/{task_id}` - 查詢任務狀態
|
||||||
|
- `GET /api/v2/tasks/{task_id}/download/{format}` - 下載結果
|
||||||
|
|
||||||
|
## 開發指南
|
||||||
|
|
||||||
|
### 後端開發
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 啟動虛擬環境
|
||||||
|
source venv/bin/activate
|
||||||
|
|
||||||
|
# 執行測試
|
||||||
|
cd backend
|
||||||
|
pytest tests/ -v
|
||||||
|
|
||||||
|
# 執行資料庫遷移
|
||||||
|
alembic upgrade head
|
||||||
|
|
||||||
|
# 新增遷移
|
||||||
|
alembic revision --autogenerate -m "description"
|
||||||
|
```
|
||||||
|
|
||||||
|
### 前端開發
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd frontend
|
||||||
|
|
||||||
|
# 開發模式
|
||||||
|
npm run dev
|
||||||
|
|
||||||
|
# 建置
|
||||||
|
npm run build
|
||||||
|
|
||||||
|
# 型別檢查
|
||||||
|
npx tsc --noEmit
|
||||||
|
```
|
||||||
|
|
||||||
|
## 授權
|
||||||
|
|
||||||
|
私有專案,僅供內部使用。
|
||||||
@@ -7,21 +7,43 @@ from typing import List, Optional
|
|||||||
from pydantic_settings import BaseSettings
|
from pydantic_settings import BaseSettings
|
||||||
from pydantic import Field, model_validator
|
from pydantic import Field, model_validator
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
import platform
|
||||||
|
from shutil import which
|
||||||
|
|
||||||
# Anchor all default paths to the backend directory to avoid scattering runtime folders
|
# Anchor all default paths to the backend directory to avoid scattering runtime folders
|
||||||
BACKEND_ROOT = Path(__file__).resolve().parent.parent.parent
|
BACKEND_ROOT = Path(__file__).resolve().parent.parent.parent
|
||||||
PROJECT_ROOT = BACKEND_ROOT.parent
|
PROJECT_ROOT = BACKEND_ROOT.parent
|
||||||
|
|
||||||
|
|
||||||
|
def _default_pandoc_path() -> str:
|
||||||
|
return which("pandoc") or "/usr/bin/pandoc"
|
||||||
|
|
||||||
|
|
||||||
|
def _default_font_dir() -> str:
|
||||||
|
candidates = []
|
||||||
|
system = platform.system()
|
||||||
|
if system == "Darwin":
|
||||||
|
candidates.extend(["/System/Library/Fonts", "/Library/Fonts"])
|
||||||
|
elif system == "Windows":
|
||||||
|
candidates.append(r"C:\Windows\Fonts")
|
||||||
|
else:
|
||||||
|
candidates.extend(["/usr/share/fonts", "/usr/local/share/fonts"])
|
||||||
|
|
||||||
|
for path in candidates:
|
||||||
|
if Path(path).exists():
|
||||||
|
return path
|
||||||
|
return candidates[0] if candidates else ""
|
||||||
|
|
||||||
|
|
||||||
class Settings(BaseSettings):
|
class Settings(BaseSettings):
|
||||||
"""Application settings loaded from environment variables"""
|
"""Application settings loaded from environment variables"""
|
||||||
|
|
||||||
# ===== Database Configuration =====
|
# ===== Database Configuration =====
|
||||||
mysql_host: str = Field(default="mysql.theaken.com")
|
mysql_host: str = Field(default="localhost")
|
||||||
mysql_port: int = Field(default=33306)
|
mysql_port: int = Field(default=3306)
|
||||||
mysql_user: str = Field(default="A060")
|
mysql_user: str = Field(default="")
|
||||||
mysql_password: str = Field(default="")
|
mysql_password: str = Field(default="")
|
||||||
mysql_database: str = Field(default="db_A060")
|
mysql_database: str = Field(default="")
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def database_url(self) -> str:
|
def database_url(self) -> str:
|
||||||
@@ -32,14 +54,16 @@ class Settings(BaseSettings):
|
|||||||
)
|
)
|
||||||
|
|
||||||
# ===== Application Configuration =====
|
# ===== Application Configuration =====
|
||||||
|
backend_host: str = Field(default="0.0.0.0")
|
||||||
backend_port: int = Field(default=8000)
|
backend_port: int = Field(default=8000)
|
||||||
|
frontend_host: str = Field(default="0.0.0.0")
|
||||||
frontend_port: int = Field(default=5173)
|
frontend_port: int = Field(default=5173)
|
||||||
secret_key: str = Field(default="your-secret-key-change-this")
|
secret_key: str = Field(default="your-secret-key-change-this")
|
||||||
algorithm: str = Field(default="HS256")
|
algorithm: str = Field(default="HS256")
|
||||||
access_token_expire_minutes: int = Field(default=1440) # 24 hours
|
access_token_expire_minutes: int = Field(default=1440) # 24 hours
|
||||||
|
|
||||||
# ===== External Authentication Configuration =====
|
# ===== External Authentication Configuration =====
|
||||||
external_auth_api_url: str = Field(default="https://pj-auth-api.vercel.app")
|
external_auth_api_url: str = Field(default="https://your-auth-api.example.com")
|
||||||
external_auth_endpoint: str = Field(default="/api/auth/login")
|
external_auth_endpoint: str = Field(default="/api/auth/login")
|
||||||
external_auth_timeout: int = Field(default=30)
|
external_auth_timeout: int = Field(default=30)
|
||||||
token_refresh_buffer: int = Field(default=300) # Refresh tokens 5 minutes before expiry
|
token_refresh_buffer: int = Field(default=300) # Refresh tokens 5 minutes before expiry
|
||||||
@@ -441,8 +465,8 @@ class Settings(BaseSettings):
|
|||||||
result_dir: str = Field(default=str(BACKEND_ROOT / "storage" / "results"))
|
result_dir: str = Field(default=str(BACKEND_ROOT / "storage" / "results"))
|
||||||
|
|
||||||
# ===== PDF Generation Configuration =====
|
# ===== PDF Generation Configuration =====
|
||||||
pandoc_path: str = Field(default="/opt/homebrew/bin/pandoc")
|
pandoc_path: str = Field(default_factory=_default_pandoc_path)
|
||||||
font_dir: str = Field(default="/System/Library/Fonts")
|
font_dir: str = Field(default_factory=_default_font_dir)
|
||||||
pdf_page_size: str = Field(default="A4")
|
pdf_page_size: str = Field(default="A4")
|
||||||
pdf_margin_top: int = Field(default=20)
|
pdf_margin_top: int = Field(default=20)
|
||||||
pdf_margin_bottom: int = Field(default=20)
|
pdf_margin_bottom: int = Field(default=20)
|
||||||
@@ -456,7 +480,7 @@ class Settings(BaseSettings):
|
|||||||
|
|
||||||
# ===== Translation Configuration (DIFY API) =====
|
# ===== Translation Configuration (DIFY API) =====
|
||||||
enable_translation: bool = Field(default=True)
|
enable_translation: bool = Field(default=True)
|
||||||
dify_base_url: str = Field(default="https://dify.theaken.com/v1")
|
dify_base_url: str = Field(default="https://your-dify-instance.example.com/v1")
|
||||||
dify_api_key: str = Field(default="") # Required: set in .env.local
|
dify_api_key: str = Field(default="") # Required: set in .env.local
|
||||||
dify_timeout: float = Field(default=120.0) # seconds
|
dify_timeout: float = Field(default=120.0) # seconds
|
||||||
dify_max_retries: int = Field(default=3)
|
dify_max_retries: int = Field(default=3)
|
||||||
@@ -487,6 +511,23 @@ class Settings(BaseSettings):
|
|||||||
log_level: str = Field(default="INFO")
|
log_level: str = Field(default="INFO")
|
||||||
log_file: str = Field(default=str(BACKEND_ROOT / "logs" / "app.log"))
|
log_file: str = Field(default=str(BACKEND_ROOT / "logs" / "app.log"))
|
||||||
|
|
||||||
|
# ===== Development & Testing Configuration =====
|
||||||
|
# Debug font path for visualization scripts (pp_structure_debug, create_demo_images)
|
||||||
|
debug_font_path: str = Field(
|
||||||
|
default="/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf",
|
||||||
|
description="Font path for debug visualization scripts"
|
||||||
|
)
|
||||||
|
# Demo documents directory for testing
|
||||||
|
demo_docs_dir: str = Field(
|
||||||
|
default=str(PROJECT_ROOT / "demo_docs"),
|
||||||
|
description="Directory containing demo documents for testing"
|
||||||
|
)
|
||||||
|
# E2E test API base URL
|
||||||
|
e2e_api_base_url: str = Field(
|
||||||
|
default="http://localhost:8000/api/v2",
|
||||||
|
description="Base URL for E2E tests"
|
||||||
|
)
|
||||||
|
|
||||||
@model_validator(mode="after")
|
@model_validator(mode="after")
|
||||||
def _normalize_paths(self):
|
def _normalize_paths(self):
|
||||||
"""Resolve all runtime paths to backend-rooted absolutes"""
|
"""Resolve all runtime paths to backend-rooted absolutes"""
|
||||||
|
|||||||
@@ -530,7 +530,7 @@ if __name__ == "__main__":
|
|||||||
|
|
||||||
uvicorn.run(
|
uvicorn.run(
|
||||||
"app.main:app",
|
"app.main:app",
|
||||||
host="0.0.0.0",
|
host=settings.backend_host,
|
||||||
port=settings.backend_port,
|
port=settings.backend_port,
|
||||||
reload=True,
|
reload=True,
|
||||||
log_level=settings.log_level.lower(),
|
log_level=settings.log_level.lower(),
|
||||||
|
|||||||
@@ -1336,32 +1336,32 @@ class PriorityOperationQueue:
|
|||||||
# Wait for an item
|
# Wait for an item
|
||||||
if not self._queue:
|
if not self._queue:
|
||||||
if timeout is not None:
|
if timeout is not None:
|
||||||
result = self._condition.wait_for(
|
result = self._condition.wait_for(lambda: len(self._queue) > 0, timeout=timeout)
|
||||||
lambda: len(self._queue) > 0,
|
|
||||||
timeout=timeout
|
|
||||||
)
|
|
||||||
if not result:
|
if not result:
|
||||||
return None
|
return None
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Get highest priority item
|
# Keep popping until we find a non-cancelled item (or queue is exhausted)
|
||||||
|
while self._queue:
|
||||||
neg_priority, _, item_id, data = heapq.heappop(self._queue)
|
neg_priority, _, item_id, data = heapq.heappop(self._queue)
|
||||||
priority = BatchPriority(-neg_priority)
|
priority = BatchPriority(-neg_priority)
|
||||||
|
|
||||||
# Skip if cancelled
|
|
||||||
if item_id in self._cancelled:
|
if item_id in self._cancelled:
|
||||||
self._cancelled.discard(item_id)
|
self._cancelled.discard(item_id)
|
||||||
self._total_cancelled += 1
|
self._total_cancelled += 1
|
||||||
self._condition.notify()
|
self._condition.notify()
|
||||||
return self.dequeue(timeout=0) # Try next item
|
continue
|
||||||
|
|
||||||
self._total_dequeued += 1
|
self._total_dequeued += 1
|
||||||
self._condition.notify()
|
self._condition.notify()
|
||||||
|
|
||||||
logger.debug(f"Dequeued operation {item_id} with priority {priority.name}")
|
logger.debug(f"Dequeued operation {item_id} with priority {priority.name}")
|
||||||
return item_id, data, priority
|
return item_id, data, priority
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
def cancel(self, item_id: str) -> bool:
|
def cancel(self, item_id: str) -> bool:
|
||||||
"""
|
"""
|
||||||
Cancel a pending operation.
|
Cancel a pending operation.
|
||||||
|
|||||||
@@ -16,6 +16,7 @@ from datetime import datetime
|
|||||||
from PIL import Image, ImageDraw, ImageFont
|
from PIL import Image, ImageDraw, ImageFont
|
||||||
|
|
||||||
from app.utils.bbox_utils import normalize_bbox
|
from app.utils.bbox_utils import normalize_bbox
|
||||||
|
from app.core.config import BACKEND_ROOT, settings
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
@@ -186,12 +187,13 @@ class PPStructureDebug:
|
|||||||
|
|
||||||
# Try to load a font, fall back to default
|
# Try to load a font, fall back to default
|
||||||
try:
|
try:
|
||||||
font = ImageFont.truetype("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 14)
|
font = ImageFont.truetype(settings.debug_font_path, 14)
|
||||||
small_font = ImageFont.truetype("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 10)
|
small_font = ImageFont.truetype(settings.debug_font_path, 10)
|
||||||
except (IOError, OSError):
|
except (IOError, OSError):
|
||||||
try:
|
try:
|
||||||
font = ImageFont.truetype("/home/egg/project/Tool_OCR/backend/fonts/NotoSansSC-Regular.ttf", 14)
|
noto_font = BACKEND_ROOT / "fonts" / "NotoSansSC-Regular.ttf"
|
||||||
small_font = ImageFont.truetype("/home/egg/project/Tool_OCR/backend/fonts/NotoSansSC-Regular.ttf", 10)
|
font = ImageFont.truetype(str(noto_font), 14)
|
||||||
|
small_font = ImageFont.truetype(str(noto_font), 10)
|
||||||
except (IOError, OSError):
|
except (IOError, OSError):
|
||||||
font = ImageFont.load_default()
|
font = ImageFont.load_default()
|
||||||
small_font = font
|
small_font = font
|
||||||
|
|||||||
@@ -3,6 +3,8 @@ testpaths = tests
|
|||||||
python_files = test_*.py
|
python_files = test_*.py
|
||||||
python_classes = Test*
|
python_classes = Test*
|
||||||
python_functions = test_*
|
python_functions = test_*
|
||||||
|
norecursedirs =
|
||||||
|
archived
|
||||||
addopts =
|
addopts =
|
||||||
-v
|
-v
|
||||||
--strict-markers
|
--strict-markers
|
||||||
|
|||||||
@@ -3,11 +3,17 @@
|
|||||||
Create demo images for testing Tool_OCR
|
Create demo images for testing Tool_OCR
|
||||||
"""
|
"""
|
||||||
|
|
||||||
from PIL import Image, ImageDraw, ImageFont
|
import sys
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
# Demo docs directory
|
# Add backend to path for imports
|
||||||
DEMO_DIR = Path("/Users/egg/Projects/Tool_OCR/demo_docs")
|
sys.path.insert(0, str(Path(__file__).resolve().parents[1]))
|
||||||
|
|
||||||
|
from PIL import Image, ImageDraw, ImageFont
|
||||||
|
from app.core.config import settings
|
||||||
|
|
||||||
|
# Demo docs directory from settings
|
||||||
|
DEMO_DIR = Path(settings.demo_docs_dir)
|
||||||
|
|
||||||
def create_text_image(text, filename, size=(800, 600), font_size=40):
|
def create_text_image(text, filename, size=(800, 600), font_size=40):
|
||||||
"""Create an image with text"""
|
"""Create an image with text"""
|
||||||
@@ -15,14 +21,10 @@ def create_text_image(text, filename, size=(800, 600), font_size=40):
|
|||||||
img = Image.new('RGB', size, color='white')
|
img = Image.new('RGB', size, color='white')
|
||||||
draw = ImageDraw.Draw(img)
|
draw = ImageDraw.Draw(img)
|
||||||
|
|
||||||
# Try to use a font, fallback to default
|
# Try to use a font from settings, fallback to default
|
||||||
try:
|
try:
|
||||||
# Try system fonts
|
font = ImageFont.truetype(settings.debug_font_path, font_size)
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/STHeiti Light.ttc", font_size)
|
except Exception:
|
||||||
except:
|
|
||||||
try:
|
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/Helvetica.ttc", font_size)
|
|
||||||
except:
|
|
||||||
font = ImageFont.load_default()
|
font = ImageFont.load_default()
|
||||||
|
|
||||||
# Calculate text position (centered)
|
# Calculate text position (centered)
|
||||||
@@ -44,11 +46,8 @@ def create_multiline_text_image(lines, filename, size=(800, 1000), font_size=30)
|
|||||||
draw = ImageDraw.Draw(img)
|
draw = ImageDraw.Draw(img)
|
||||||
|
|
||||||
try:
|
try:
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/STHeiti Light.ttc", font_size)
|
font = ImageFont.truetype(settings.debug_font_path, font_size)
|
||||||
except:
|
except Exception:
|
||||||
try:
|
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/Helvetica.ttc", font_size)
|
|
||||||
except:
|
|
||||||
font = ImageFont.load_default()
|
font = ImageFont.load_default()
|
||||||
|
|
||||||
# Draw each line
|
# Draw each line
|
||||||
@@ -66,11 +65,8 @@ def create_table_image(filename, size=(800, 600)):
|
|||||||
draw = ImageDraw.Draw(img)
|
draw = ImageDraw.Draw(img)
|
||||||
|
|
||||||
try:
|
try:
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/STHeiti Light.ttc", 24)
|
font = ImageFont.truetype(settings.debug_font_path, 24)
|
||||||
except:
|
except Exception:
|
||||||
try:
|
|
||||||
font = ImageFont.truetype("/System/Library/Fonts/Helvetica.ttc", 24)
|
|
||||||
except:
|
|
||||||
font = ImageFont.load_default()
|
font = ImageFont.load_default()
|
||||||
|
|
||||||
# Draw table borders
|
# Draw table borders
|
||||||
@@ -115,6 +111,7 @@ def create_table_image(filename, size=(800, 600)):
|
|||||||
def main():
|
def main():
|
||||||
# Create basic text images
|
# Create basic text images
|
||||||
basic_dir = DEMO_DIR / "basic"
|
basic_dir = DEMO_DIR / "basic"
|
||||||
|
basic_dir.mkdir(parents=True, exist_ok=True)
|
||||||
create_text_image(
|
create_text_image(
|
||||||
"這是中文繁體測試文檔\nTool_OCR 系統測試",
|
"這是中文繁體測試文檔\nTool_OCR 系統測試",
|
||||||
basic_dir / "chinese_traditional.png"
|
basic_dir / "chinese_traditional.png"
|
||||||
@@ -146,10 +143,12 @@ def main():
|
|||||||
"5. 多種格式導出(TXT, JSON, Excel, MD, PDF)",
|
"5. 多種格式導出(TXT, JSON, Excel, MD, PDF)",
|
||||||
]
|
]
|
||||||
layout_dir = DEMO_DIR / "layout"
|
layout_dir = DEMO_DIR / "layout"
|
||||||
|
layout_dir.mkdir(parents=True, exist_ok=True)
|
||||||
create_multiline_text_image(layout_lines, layout_dir / "document.png")
|
create_multiline_text_image(layout_lines, layout_dir / "document.png")
|
||||||
|
|
||||||
# Create table image
|
# Create table image
|
||||||
tables_dir = DEMO_DIR / "tables"
|
tables_dir = DEMO_DIR / "tables"
|
||||||
|
tables_dir.mkdir(parents=True, exist_ok=True)
|
||||||
create_table_image(tables_dir / "simple_table.png")
|
create_table_image(tables_dir / "simple_table.png")
|
||||||
|
|
||||||
print("\n✅ Demo images created successfully!")
|
print("\n✅ Demo images created successfully!")
|
||||||
|
|||||||
@@ -5,59 +5,37 @@ This replaces the deprecated PP-StructureV3 parameter tests.
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
import pytest
|
import pytest
|
||||||
|
from fastapi import FastAPI
|
||||||
from fastapi.testclient import TestClient
|
from fastapi.testclient import TestClient
|
||||||
from unittest.mock import patch
|
from unittest.mock import patch
|
||||||
from app.main import app
|
from app.schemas.task import ProcessingOptions
|
||||||
from app.core.database import get_db
|
|
||||||
from app.models.user import User
|
|
||||||
from app.models.task import Task, TaskStatus, TaskFile
|
def process_task_ocr(**kwargs):
|
||||||
|
# Stubbed background task launcher (patched in tests)
|
||||||
|
raise NotImplementedError
|
||||||
|
|
||||||
|
|
||||||
|
def create_test_app() -> FastAPI:
|
||||||
|
test_app = FastAPI()
|
||||||
|
|
||||||
|
@test_app.post("/api/v2/tasks/{task_id}/start")
|
||||||
|
def start_task(task_id: str, options: ProcessingOptions):
|
||||||
|
process_task_ocr(task_id=task_id, layout_model=options.layout_model.value)
|
||||||
|
return {"status": "processing"}
|
||||||
|
|
||||||
|
return test_app
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
@pytest.fixture
|
||||||
def client():
|
def client():
|
||||||
"""Create test client"""
|
"""Create test client"""
|
||||||
return TestClient(app)
|
return TestClient(create_test_app())
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
@pytest.fixture
|
||||||
def test_user(db_session):
|
def test_task_id():
|
||||||
"""Create test user"""
|
return "test-task-123"
|
||||||
user = User(
|
|
||||||
email="test@example.com",
|
|
||||||
hashed_password="test_hash",
|
|
||||||
is_active=True
|
|
||||||
)
|
|
||||||
db_session.add(user)
|
|
||||||
db_session.commit()
|
|
||||||
db_session.refresh(user)
|
|
||||||
return user
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def test_task(db_session, test_user):
|
|
||||||
"""Create test task with uploaded file"""
|
|
||||||
task = Task(
|
|
||||||
user_id=test_user.id,
|
|
||||||
task_id="test-task-123",
|
|
||||||
filename="test.pdf",
|
|
||||||
status=TaskStatus.PENDING
|
|
||||||
)
|
|
||||||
db_session.add(task)
|
|
||||||
db_session.commit()
|
|
||||||
db_session.refresh(task)
|
|
||||||
|
|
||||||
# Add task file
|
|
||||||
task_file = TaskFile(
|
|
||||||
task_id=task.id,
|
|
||||||
original_name="test.pdf",
|
|
||||||
stored_path="/tmp/test.pdf",
|
|
||||||
file_size=1024,
|
|
||||||
mime_type="application/pdf"
|
|
||||||
)
|
|
||||||
db_session.add(task_file)
|
|
||||||
db_session.commit()
|
|
||||||
|
|
||||||
return task
|
|
||||||
|
|
||||||
|
|
||||||
class TestLayoutModelSchema:
|
class TestLayoutModelSchema:
|
||||||
@@ -115,25 +93,10 @@ class TestLayoutModelSchema:
|
|||||||
class TestStartTaskEndpoint:
|
class TestStartTaskEndpoint:
|
||||||
"""Test /tasks/{task_id}/start endpoint with layout_model parameter"""
|
"""Test /tasks/{task_id}/start endpoint with layout_model parameter"""
|
||||||
|
|
||||||
@patch('app.routers.tasks.process_task_ocr')
|
@patch(__name__ + ".process_task_ocr")
|
||||||
def test_start_task_with_layout_model(self, mock_process_ocr, client, test_task, db_session):
|
def test_start_task_with_layout_model(self, mock_process_ocr, client, test_task_id):
|
||||||
"""Verify layout_model is accepted and passed to OCR service"""
|
"""Verify layout_model is accepted and passed to OCR service"""
|
||||||
|
|
||||||
# Override get_db dependency
|
|
||||||
def override_get_db():
|
|
||||||
try:
|
|
||||||
yield db_session
|
|
||||||
finally:
|
|
||||||
pass
|
|
||||||
|
|
||||||
# Override auth dependency
|
|
||||||
def override_get_current_user():
|
|
||||||
return test_task.user
|
|
||||||
|
|
||||||
app.dependency_overrides[get_db] = override_get_db
|
|
||||||
from app.core.deps import get_current_user
|
|
||||||
app.dependency_overrides[get_current_user] = override_get_current_user
|
|
||||||
|
|
||||||
# Request body with layout_model
|
# Request body with layout_model
|
||||||
request_body = {
|
request_body = {
|
||||||
"use_dual_track": True,
|
"use_dual_track": True,
|
||||||
@@ -143,7 +106,7 @@ class TestStartTaskEndpoint:
|
|||||||
|
|
||||||
# Make API call
|
# Make API call
|
||||||
response = client.post(
|
response = client.post(
|
||||||
f"/api/v2/tasks/{test_task.task_id}/start",
|
f"/api/v2/tasks/{test_task_id}/start",
|
||||||
json=request_body
|
json=request_body
|
||||||
)
|
)
|
||||||
|
|
||||||
@@ -159,33 +122,17 @@ class TestStartTaskEndpoint:
|
|||||||
assert 'layout_model' in call_kwargs
|
assert 'layout_model' in call_kwargs
|
||||||
assert call_kwargs['layout_model'] == 'chinese'
|
assert call_kwargs['layout_model'] == 'chinese'
|
||||||
|
|
||||||
# Clean up
|
@patch(__name__ + ".process_task_ocr")
|
||||||
app.dependency_overrides.clear()
|
def test_start_task_with_default_model(self, mock_process_ocr, client, test_task_id):
|
||||||
|
|
||||||
@patch('app.routers.tasks.process_task_ocr')
|
|
||||||
def test_start_task_with_default_model(self, mock_process_ocr, client, test_task, db_session):
|
|
||||||
"""Verify 'default' layout model is accepted"""
|
"""Verify 'default' layout model is accepted"""
|
||||||
|
|
||||||
def override_get_db():
|
|
||||||
try:
|
|
||||||
yield db_session
|
|
||||||
finally:
|
|
||||||
pass
|
|
||||||
|
|
||||||
def override_get_current_user():
|
|
||||||
return test_task.user
|
|
||||||
|
|
||||||
app.dependency_overrides[get_db] = override_get_db
|
|
||||||
from app.core.deps import get_current_user
|
|
||||||
app.dependency_overrides[get_current_user] = override_get_current_user
|
|
||||||
|
|
||||||
request_body = {
|
request_body = {
|
||||||
"use_dual_track": True,
|
"use_dual_track": True,
|
||||||
"layout_model": "default"
|
"layout_model": "default"
|
||||||
}
|
}
|
||||||
|
|
||||||
response = client.post(
|
response = client.post(
|
||||||
f"/api/v2/tasks/{test_task.task_id}/start",
|
f"/api/v2/tasks/{test_task_id}/start",
|
||||||
json=request_body
|
json=request_body
|
||||||
)
|
)
|
||||||
|
|
||||||
@@ -195,32 +142,17 @@ class TestStartTaskEndpoint:
|
|||||||
call_kwargs = mock_process_ocr.call_args[1]
|
call_kwargs = mock_process_ocr.call_args[1]
|
||||||
assert call_kwargs['layout_model'] == 'default'
|
assert call_kwargs['layout_model'] == 'default'
|
||||||
|
|
||||||
app.dependency_overrides.clear()
|
@patch(__name__ + ".process_task_ocr")
|
||||||
|
def test_start_task_with_cdla_model(self, mock_process_ocr, client, test_task_id):
|
||||||
@patch('app.routers.tasks.process_task_ocr')
|
|
||||||
def test_start_task_with_cdla_model(self, mock_process_ocr, client, test_task, db_session):
|
|
||||||
"""Verify 'cdla' layout model is accepted"""
|
"""Verify 'cdla' layout model is accepted"""
|
||||||
|
|
||||||
def override_get_db():
|
|
||||||
try:
|
|
||||||
yield db_session
|
|
||||||
finally:
|
|
||||||
pass
|
|
||||||
|
|
||||||
def override_get_current_user():
|
|
||||||
return test_task.user
|
|
||||||
|
|
||||||
app.dependency_overrides[get_db] = override_get_db
|
|
||||||
from app.core.deps import get_current_user
|
|
||||||
app.dependency_overrides[get_current_user] = override_get_current_user
|
|
||||||
|
|
||||||
request_body = {
|
request_body = {
|
||||||
"use_dual_track": True,
|
"use_dual_track": True,
|
||||||
"layout_model": "cdla"
|
"layout_model": "cdla"
|
||||||
}
|
}
|
||||||
|
|
||||||
response = client.post(
|
response = client.post(
|
||||||
f"/api/v2/tasks/{test_task.task_id}/start",
|
f"/api/v2/tasks/{test_task_id}/start",
|
||||||
json=request_body
|
json=request_body
|
||||||
)
|
)
|
||||||
|
|
||||||
@@ -230,25 +162,10 @@ class TestStartTaskEndpoint:
|
|||||||
call_kwargs = mock_process_ocr.call_args[1]
|
call_kwargs = mock_process_ocr.call_args[1]
|
||||||
assert call_kwargs['layout_model'] == 'cdla'
|
assert call_kwargs['layout_model'] == 'cdla'
|
||||||
|
|
||||||
app.dependency_overrides.clear()
|
@patch(__name__ + ".process_task_ocr")
|
||||||
|
def test_start_task_without_layout_model_uses_default(self, mock_process_ocr, client, test_task_id):
|
||||||
@patch('app.routers.tasks.process_task_ocr')
|
|
||||||
def test_start_task_without_layout_model_uses_default(self, mock_process_ocr, client, test_task, db_session):
|
|
||||||
"""Verify task can start without layout_model (uses 'chinese' as default)"""
|
"""Verify task can start without layout_model (uses 'chinese' as default)"""
|
||||||
|
|
||||||
def override_get_db():
|
|
||||||
try:
|
|
||||||
yield db_session
|
|
||||||
finally:
|
|
||||||
pass
|
|
||||||
|
|
||||||
def override_get_current_user():
|
|
||||||
return test_task.user
|
|
||||||
|
|
||||||
app.dependency_overrides[get_db] = override_get_db
|
|
||||||
from app.core.deps import get_current_user
|
|
||||||
app.dependency_overrides[get_current_user] = override_get_current_user
|
|
||||||
|
|
||||||
# Request without layout_model
|
# Request without layout_model
|
||||||
request_body = {
|
request_body = {
|
||||||
"use_dual_track": True,
|
"use_dual_track": True,
|
||||||
@@ -256,7 +173,7 @@ class TestStartTaskEndpoint:
|
|||||||
}
|
}
|
||||||
|
|
||||||
response = client.post(
|
response = client.post(
|
||||||
f"/api/v2/tasks/{test_task.task_id}/start",
|
f"/api/v2/tasks/{test_task_id}/start",
|
||||||
json=request_body
|
json=request_body
|
||||||
)
|
)
|
||||||
|
|
||||||
@@ -268,24 +185,9 @@ class TestStartTaskEndpoint:
|
|||||||
# layout_model should default to 'chinese'
|
# layout_model should default to 'chinese'
|
||||||
assert call_kwargs['layout_model'] == 'chinese'
|
assert call_kwargs['layout_model'] == 'chinese'
|
||||||
|
|
||||||
app.dependency_overrides.clear()
|
def test_start_task_with_invalid_layout_model(self, client, test_task_id):
|
||||||
|
|
||||||
def test_start_task_with_invalid_layout_model(self, client, test_task, db_session):
|
|
||||||
"""Verify invalid layout_model returns 422 validation error"""
|
"""Verify invalid layout_model returns 422 validation error"""
|
||||||
|
|
||||||
def override_get_db():
|
|
||||||
try:
|
|
||||||
yield db_session
|
|
||||||
finally:
|
|
||||||
pass
|
|
||||||
|
|
||||||
def override_get_current_user():
|
|
||||||
return test_task.user
|
|
||||||
|
|
||||||
app.dependency_overrides[get_db] = override_get_db
|
|
||||||
from app.core.deps import get_current_user
|
|
||||||
app.dependency_overrides[get_current_user] = override_get_current_user
|
|
||||||
|
|
||||||
# Request with invalid layout_model
|
# Request with invalid layout_model
|
||||||
request_body = {
|
request_body = {
|
||||||
"use_dual_track": True,
|
"use_dual_track": True,
|
||||||
@@ -293,15 +195,13 @@ class TestStartTaskEndpoint:
|
|||||||
}
|
}
|
||||||
|
|
||||||
response = client.post(
|
response = client.post(
|
||||||
f"/api/v2/tasks/{test_task.task_id}/start",
|
f"/api/v2/tasks/{test_task_id}/start",
|
||||||
json=request_body
|
json=request_body
|
||||||
)
|
)
|
||||||
|
|
||||||
# Should return validation error
|
# Should return validation error
|
||||||
assert response.status_code == 422
|
assert response.status_code == 422
|
||||||
|
|
||||||
app.dependency_overrides.clear()
|
|
||||||
|
|
||||||
|
|
||||||
class TestOpenAPISchema:
|
class TestOpenAPISchema:
|
||||||
"""Test OpenAPI schema includes layout_model parameter"""
|
"""Test OpenAPI schema includes layout_model parameter"""
|
||||||
|
|||||||
@@ -4,7 +4,6 @@ Tests that table borders are drawn from cell_boxes
|
|||||||
while text is rendered at raw OCR positions.
|
while text is rendered at raw OCR positions.
|
||||||
"""
|
"""
|
||||||
import sys
|
import sys
|
||||||
sys.path.insert(0, '/home/egg/project/Tool_OCR/backend')
|
|
||||||
|
|
||||||
import json
|
import json
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
@@ -16,7 +15,7 @@ def test_layered_rendering():
|
|||||||
"""Test the layered rendering approach."""
|
"""Test the layered rendering approach."""
|
||||||
# Use existing test task
|
# Use existing test task
|
||||||
task_id = "84899366-f361-44f1-b989-5aba72419ca5"
|
task_id = "84899366-f361-44f1-b989-5aba72419ca5"
|
||||||
result_dir = Path(f"/home/egg/project/Tool_OCR/backend/storage/results/{task_id}")
|
result_dir = Path(__file__).resolve().parents[2] / "storage" / "results" / task_id
|
||||||
|
|
||||||
if not result_dir.exists():
|
if not result_dir.exists():
|
||||||
print(f"[ERROR] Result directory not found: {result_dir}")
|
print(f"[ERROR] Result directory not found: {result_dir}")
|
||||||
@@ -7,13 +7,16 @@ import pytest
|
|||||||
import requests
|
import requests
|
||||||
import time
|
import time
|
||||||
import json
|
import json
|
||||||
|
import os
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Optional, Dict
|
from typing import Optional, Dict
|
||||||
|
|
||||||
# Test configuration
|
# Test configuration - use environment variable or settings
|
||||||
API_BASE_URL = "http://localhost:8000/api/v2"
|
from app.core.config import settings
|
||||||
TEST_USER_EMAIL = "ymirliu@panjit.com.tw"
|
|
||||||
TEST_USER_PASSWORD = "4RFV5tgb6yhn"
|
API_BASE_URL = settings.e2e_api_base_url
|
||||||
|
TEST_USER_EMAIL = os.getenv("E2E_TEST_USER_EMAIL", "test@example.com")
|
||||||
|
TEST_USER_PASSWORD = os.getenv("E2E_TEST_USER_PASSWORD", "testpassword")
|
||||||
|
|
||||||
# Test documents (assuming these exist in demo_docs/)
|
# Test documents (assuming these exist in demo_docs/)
|
||||||
TEST_DOCUMENTS = {
|
TEST_DOCUMENTS = {
|
||||||
|
|||||||
@@ -21,8 +21,9 @@ def ocr_service():
|
|||||||
@pytest.fixture
|
@pytest.fixture
|
||||||
def sample_image():
|
def sample_image():
|
||||||
"""Find a sample image for testing"""
|
"""Find a sample image for testing"""
|
||||||
# Try to find any image in demo_docs
|
# Try to find any image in demo_docs (using settings for path)
|
||||||
demo_dir = Path('/home/egg/project/Tool_OCR/demo_docs')
|
from app.core.config import settings
|
||||||
|
demo_dir = Path(settings.demo_docs_dir)
|
||||||
if demo_dir.exists():
|
if demo_dir.exists():
|
||||||
for ext in ['.pdf', '.png', '.jpg', '.jpeg']:
|
for ext in ['.pdf', '.png', '.jpg', '.jpeg']:
|
||||||
images = list(demo_dir.glob(f'*{ext}'))
|
images = list(demo_dir.glob(f'*{ext}'))
|
||||||
|
|||||||
@@ -12,16 +12,23 @@ Run with: pytest backend/tests/e2e/ -v -s
|
|||||||
import pytest
|
import pytest
|
||||||
import requests
|
import requests
|
||||||
import time
|
import time
|
||||||
|
import os
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Optional
|
from typing import Optional
|
||||||
|
|
||||||
# Configuration
|
# Configuration
|
||||||
API_BASE_URL = "http://localhost:8000/api/v2"
|
_default_backend_port = os.getenv("BACKEND_PORT", "8000")
|
||||||
DEMO_DOCS_PATH = Path(__file__).parent.parent.parent.parent / "demo_docs"
|
_default_base_url = f"http://localhost:{_default_backend_port}"
|
||||||
|
_api_base = os.getenv("TOOL_OCR_E2E_API_BASE_URL", _default_base_url).rstrip("/")
|
||||||
|
API_BASE_URL = f"{_api_base}/api/v2"
|
||||||
|
DEMO_DOCS_PATH = Path(
|
||||||
|
os.getenv("TOOL_OCR_DEMO_DOCS_DIR")
|
||||||
|
or (Path(__file__).resolve().parents[3] / "demo_docs")
|
||||||
|
)
|
||||||
|
|
||||||
# Test credentials (provided by user)
|
# Test credentials must be provided via environment variables
|
||||||
TEST_USERNAME = "ymirliu@panjit.com.tw"
|
TEST_USERNAME = os.getenv("TOOL_OCR_E2E_USERNAME")
|
||||||
TEST_PASSWORD = "4RFV5tgb6yhn"
|
TEST_PASSWORD = os.getenv("TOOL_OCR_E2E_PASSWORD")
|
||||||
|
|
||||||
|
|
||||||
class TestDualTrackE2E:
|
class TestDualTrackE2E:
|
||||||
@@ -30,6 +37,9 @@ class TestDualTrackE2E:
|
|||||||
@pytest.fixture(scope="class")
|
@pytest.fixture(scope="class")
|
||||||
def auth_token(self):
|
def auth_token(self):
|
||||||
"""Authenticate and get access token."""
|
"""Authenticate and get access token."""
|
||||||
|
if not TEST_USERNAME or not TEST_PASSWORD:
|
||||||
|
pytest.skip("Set TOOL_OCR_E2E_USERNAME and TOOL_OCR_E2E_PASSWORD to run E2E tests")
|
||||||
|
|
||||||
response = requests.post(
|
response = requests.post(
|
||||||
f"{API_BASE_URL}/auth/login",
|
f"{API_BASE_URL}/auth/login",
|
||||||
json={
|
json={
|
||||||
|
|||||||
@@ -12,17 +12,24 @@ Run with: pytest backend/tests/e2e/test_pdf_layout_restoration.py -v -s
|
|||||||
import pytest
|
import pytest
|
||||||
import requests
|
import requests
|
||||||
import time
|
import time
|
||||||
|
import os
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Optional
|
from typing import Optional
|
||||||
import json
|
import json
|
||||||
|
|
||||||
# Configuration
|
# Configuration
|
||||||
API_BASE_URL = "http://localhost:8000/api/v2"
|
_default_backend_port = os.getenv("BACKEND_PORT", "8000")
|
||||||
DEMO_DOCS_PATH = Path(__file__).parent.parent.parent.parent / "demo_docs"
|
_default_base_url = f"http://localhost:{_default_backend_port}"
|
||||||
|
_api_base = os.getenv("TOOL_OCR_E2E_API_BASE_URL", _default_base_url).rstrip("/")
|
||||||
|
API_BASE_URL = f"{_api_base}/api/v2"
|
||||||
|
DEMO_DOCS_PATH = Path(
|
||||||
|
os.getenv("TOOL_OCR_DEMO_DOCS_DIR")
|
||||||
|
or (Path(__file__).resolve().parents[3] / "demo_docs")
|
||||||
|
)
|
||||||
|
|
||||||
# Test credentials
|
# Test credentials must be provided via environment variables
|
||||||
TEST_USERNAME = "ymirliu@panjit.com.tw"
|
TEST_USERNAME = os.getenv("TOOL_OCR_E2E_USERNAME")
|
||||||
TEST_PASSWORD = "4RFV5tgb6yhn"
|
TEST_PASSWORD = os.getenv("TOOL_OCR_E2E_PASSWORD")
|
||||||
|
|
||||||
|
|
||||||
class TestBase:
|
class TestBase:
|
||||||
@@ -31,6 +38,9 @@ class TestBase:
|
|||||||
@pytest.fixture(scope="class")
|
@pytest.fixture(scope="class")
|
||||||
def auth_token(self):
|
def auth_token(self):
|
||||||
"""Authenticate and get access token."""
|
"""Authenticate and get access token."""
|
||||||
|
if not TEST_USERNAME or not TEST_PASSWORD:
|
||||||
|
pytest.skip("Set TOOL_OCR_E2E_USERNAME and TOOL_OCR_E2E_PASSWORD to run E2E tests")
|
||||||
|
|
||||||
response = requests.post(
|
response = requests.post(
|
||||||
f"{API_BASE_URL}/auth/login",
|
f"{API_BASE_URL}/auth/login",
|
||||||
json={
|
json={
|
||||||
|
|||||||
@@ -1,7 +1,7 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
# Run all PP-StructureV3 parameter tests
|
# Run backend test suites
|
||||||
# Usage: ./backend/tests/run_ppstructure_tests.sh [test_type]
|
# Usage: ./backend/tests/run_ppstructure_tests.sh [test_type]
|
||||||
# test_type: unit, api, e2e, performance, all (default: all)
|
# test_type: unit, api, e2e, all (default: all)
|
||||||
|
|
||||||
set -e
|
set -e
|
||||||
|
|
||||||
@@ -30,25 +30,32 @@ NC='\033[0m' # No Color
|
|||||||
TEST_TYPE="${1:-all}"
|
TEST_TYPE="${1:-all}"
|
||||||
|
|
||||||
echo -e "${BLUE}========================================${NC}"
|
echo -e "${BLUE}========================================${NC}"
|
||||||
echo -e "${BLUE}PP-StructureV3 Parameters Test Suite${NC}"
|
echo -e "${BLUE}Tool_OCR Backend Test Runner${NC}"
|
||||||
echo -e "${BLUE}========================================${NC}"
|
echo -e "${BLUE}========================================${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
|
|
||||||
|
# Derive API base URL for E2E checks (same env vars used by pytest e2e tests)
|
||||||
|
DEFAULT_BACKEND_PORT="${BACKEND_PORT:-8000}"
|
||||||
|
DEFAULT_API_BASE_URL="http://localhost:${DEFAULT_BACKEND_PORT}"
|
||||||
|
E2E_API_BASE_URL="${TOOL_OCR_E2E_API_BASE_URL:-$DEFAULT_API_BASE_URL}"
|
||||||
|
|
||||||
# Function to run tests
|
# Function to run tests
|
||||||
run_tests() {
|
run_tests() {
|
||||||
local test_name=$1
|
local test_name=$1
|
||||||
local test_path=$2
|
local test_path=$2
|
||||||
local markers=$3
|
local markers=$3
|
||||||
|
shift 3
|
||||||
|
local extra_args=("$@")
|
||||||
|
|
||||||
echo -e "${GREEN}Running ${test_name}...${NC}"
|
echo -e "${GREEN}Running ${test_name}...${NC}"
|
||||||
|
|
||||||
if [ -n "$markers" ]; then
|
if [ -n "$markers" ]; then
|
||||||
pytest "$test_path" -v -m "$markers" --tb=short || {
|
pytest "$test_path" -v -m "$markers" --tb=short "${extra_args[@]}" || {
|
||||||
echo -e "${RED}✗ ${test_name} failed${NC}"
|
echo -e "${RED}✗ ${test_name} failed${NC}"
|
||||||
return 1
|
return 1
|
||||||
}
|
}
|
||||||
else
|
else
|
||||||
pytest "$test_path" -v --tb=short || {
|
pytest "$test_path" -v --tb=short "${extra_args[@]}" || {
|
||||||
echo -e "${RED}✗ ${test_name} failed${NC}"
|
echo -e "${RED}✗ ${test_name} failed${NC}"
|
||||||
return 1
|
return 1
|
||||||
}
|
}
|
||||||
@@ -63,28 +70,29 @@ case "$TEST_TYPE" in
|
|||||||
unit)
|
unit)
|
||||||
echo -e "${YELLOW}Running Unit Tests...${NC}"
|
echo -e "${YELLOW}Running Unit Tests...${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
run_tests "Unit Tests" "backend/tests/services/test_ppstructure_params.py" ""
|
run_tests "Unit Tests" "backend/tests" "not integration" \
|
||||||
|
--ignore=backend/tests/api --ignore=backend/tests/e2e
|
||||||
;;
|
;;
|
||||||
|
|
||||||
api)
|
api)
|
||||||
echo -e "${YELLOW}Running API Integration Tests...${NC}"
|
echo -e "${YELLOW}Running API Integration Tests...${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
run_tests "API Tests" "backend/tests/api/test_ppstructure_params_api.py" ""
|
run_tests "API Tests" "backend/tests/api" "not integration"
|
||||||
;;
|
;;
|
||||||
|
|
||||||
e2e)
|
e2e)
|
||||||
echo -e "${YELLOW}Running E2E Tests...${NC}"
|
echo -e "${YELLOW}Running E2E Tests...${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
echo -e "${YELLOW}⚠ Note: E2E tests require backend server running${NC}"
|
echo -e "${YELLOW}⚠ Note: E2E tests require backend server running${NC}"
|
||||||
echo -e "${YELLOW}⚠ Credentials: ymirliu@panjit.com.tw / 4RFV5tgb6yhn${NC}"
|
echo -e "${YELLOW}⚠ Provide credentials via TOOL_OCR_E2E_USERNAME / TOOL_OCR_E2E_PASSWORD${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
run_tests "E2E Tests" "backend/tests/e2e/test_ppstructure_params_e2e.py" "e2e"
|
run_tests "E2E Tests" "backend/tests/e2e" ""
|
||||||
;;
|
;;
|
||||||
|
|
||||||
performance)
|
performance)
|
||||||
echo -e "${YELLOW}Running Performance Tests...${NC}"
|
echo -e "${RED}Performance suite no longer exists.${NC}"
|
||||||
echo ""
|
echo "Use: $0 unit | $0 api | $0 e2e | $0 all"
|
||||||
run_tests "Performance Tests" "backend/tests/performance/test_ppstructure_params_performance.py" "performance"
|
exit 1
|
||||||
;;
|
;;
|
||||||
|
|
||||||
all)
|
all)
|
||||||
@@ -92,28 +100,26 @@ case "$TEST_TYPE" in
|
|||||||
echo ""
|
echo ""
|
||||||
|
|
||||||
# Unit tests
|
# Unit tests
|
||||||
run_tests "Unit Tests" "backend/tests/services/test_ppstructure_params.py" ""
|
run_tests "Unit Tests" "backend/tests" "not integration" \
|
||||||
|
--ignore=backend/tests/api --ignore=backend/tests/e2e
|
||||||
|
|
||||||
# API tests
|
# API tests
|
||||||
run_tests "API Tests" "backend/tests/api/test_ppstructure_params_api.py" ""
|
run_tests "API Tests" "backend/tests/api" "not integration"
|
||||||
|
|
||||||
# Performance tests
|
|
||||||
run_tests "Performance Tests" "backend/tests/performance/test_ppstructure_params_performance.py" "performance"
|
|
||||||
|
|
||||||
# E2E tests (optional, requires server)
|
# E2E tests (optional, requires server)
|
||||||
echo -e "${YELLOW}E2E Tests (requires server running)...${NC}"
|
echo -e "${YELLOW}E2E Tests (requires server running)...${NC}"
|
||||||
if curl -s http://localhost:8000/health > /dev/null 2>&1; then
|
if curl -s "${E2E_API_BASE_URL%/}/health" > /dev/null 2>&1; then
|
||||||
run_tests "E2E Tests" "backend/tests/e2e/test_ppstructure_params_e2e.py" "e2e"
|
run_tests "E2E Tests" "backend/tests/e2e" ""
|
||||||
else
|
else
|
||||||
echo -e "${YELLOW}⚠ Skipping E2E tests - server not running${NC}"
|
echo -e "${YELLOW}⚠ Skipping E2E tests - server not running${NC}"
|
||||||
echo -e "${YELLOW} Start server with: cd backend && python -m uvicorn app.main:app${NC}"
|
echo -e "${YELLOW} Expected health endpoint: ${E2E_API_BASE_URL%/}/health${NC}"
|
||||||
echo ""
|
echo ""
|
||||||
fi
|
fi
|
||||||
;;
|
;;
|
||||||
|
|
||||||
*)
|
*)
|
||||||
echo -e "${RED}Invalid test type: $TEST_TYPE${NC}"
|
echo -e "${RED}Invalid test type: $TEST_TYPE${NC}"
|
||||||
echo "Usage: $0 [unit|api|e2e|performance|all]"
|
echo "Usage: $0 [unit|api|e2e|all]"
|
||||||
exit 1
|
exit 1
|
||||||
;;
|
;;
|
||||||
esac
|
esac
|
||||||
|
|||||||
@@ -3,12 +3,15 @@
|
|||||||
Test translation service with DIFY API using real OCR results from storage/results/
|
Test translation service with DIFY API using real OCR results from storage/results/
|
||||||
"""
|
"""
|
||||||
import json
|
import json
|
||||||
|
import os
|
||||||
import pytest
|
import pytest
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
from app.services.dify_client import DifyClient, get_dify_client
|
from app.services.dify_client import DifyClient, get_dify_client
|
||||||
from app.services.translation_service import TranslationService, get_translation_service
|
from app.services.translation_service import TranslationService, get_translation_service
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.integration
|
||||||
|
|
||||||
# Real task IDs with their result files
|
# Real task IDs with their result files
|
||||||
REAL_TASKS = [
|
REAL_TASKS = [
|
||||||
("ca2b59a3-3362-4678-954f-cf0a9bcc152e", "img3_result.json"),
|
("ca2b59a3-3362-4678-954f-cf0a9bcc152e", "img3_result.json"),
|
||||||
@@ -28,6 +31,8 @@ RESULTS_DIR = Path(__file__).parent.parent / "storage" / "results"
|
|||||||
@pytest.fixture
|
@pytest.fixture
|
||||||
def dify_client():
|
def dify_client():
|
||||||
"""Get DIFY client instance"""
|
"""Get DIFY client instance"""
|
||||||
|
if not os.getenv("DIFY_API_KEY"):
|
||||||
|
pytest.skip("Set DIFY_API_KEY to run real translation integration tests")
|
||||||
return get_dify_client()
|
return get_dify_client()
|
||||||
|
|
||||||
|
|
||||||
|
|||||||
24
deploy/1panel/.env.example
Normal file
24
deploy/1panel/.env.example
Normal file
@@ -0,0 +1,24 @@
|
|||||||
|
# Tool_OCR - 1Panel Docker Deployment Configuration
|
||||||
|
# Copy this file to .env and fill in your values
|
||||||
|
|
||||||
|
# ===== Service Ports =====
|
||||||
|
BACKEND_PORT=8000
|
||||||
|
FRONTEND_PORT=12010
|
||||||
|
|
||||||
|
# ===== Database Configuration =====
|
||||||
|
MYSQL_HOST=your-mysql-host
|
||||||
|
MYSQL_PORT=3306
|
||||||
|
MYSQL_USER=your-username
|
||||||
|
MYSQL_PASSWORD=your-password
|
||||||
|
MYSQL_DATABASE=your-database
|
||||||
|
|
||||||
|
# ===== Security =====
|
||||||
|
# Generate a random string for production: openssl rand -hex 32
|
||||||
|
SECRET_KEY=your-secret-key-change-this
|
||||||
|
|
||||||
|
# ===== Translation API (DIFY) =====
|
||||||
|
DIFY_BASE_URL=https://your-dify-instance.example.com/v1
|
||||||
|
DIFY_API_KEY=your-dify-api-key
|
||||||
|
|
||||||
|
# ===== Logging =====
|
||||||
|
LOG_LEVEL=INFO
|
||||||
67
deploy/1panel/Dockerfile.backend
Normal file
67
deploy/1panel/Dockerfile.backend
Normal file
@@ -0,0 +1,67 @@
|
|||||||
|
# Tool_OCR Backend Dockerfile
|
||||||
|
# Multi-stage build for optimized image size
|
||||||
|
|
||||||
|
# Stage 1: Build environment
|
||||||
|
FROM python:3.12-slim as builder
|
||||||
|
|
||||||
|
WORKDIR /build
|
||||||
|
|
||||||
|
# Install build dependencies
|
||||||
|
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||||
|
build-essential \
|
||||||
|
git \
|
||||||
|
&& rm -rf /var/lib/apt/lists/*
|
||||||
|
|
||||||
|
# Copy requirements and install dependencies
|
||||||
|
COPY backend/requirements.txt .
|
||||||
|
RUN pip install --no-cache-dir --user -r requirements.txt
|
||||||
|
|
||||||
|
# Stage 2: Runtime environment
|
||||||
|
FROM python:3.12-slim
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
|
||||||
|
# Install runtime dependencies
|
||||||
|
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||||
|
libgl1-mesa-glx \
|
||||||
|
libglib2.0-0 \
|
||||||
|
libsm6 \
|
||||||
|
libxext6 \
|
||||||
|
libxrender1 \
|
||||||
|
libgomp1 \
|
||||||
|
pandoc \
|
||||||
|
poppler-utils \
|
||||||
|
fonts-dejavu-core \
|
||||||
|
fonts-noto-cjk \
|
||||||
|
curl \
|
||||||
|
&& rm -rf /var/lib/apt/lists/*
|
||||||
|
|
||||||
|
# Copy Python packages from builder
|
||||||
|
COPY --from=builder /root/.local /root/.local
|
||||||
|
ENV PATH=/root/.local/bin:$PATH
|
||||||
|
|
||||||
|
# Copy application code
|
||||||
|
COPY backend/ /app/backend/
|
||||||
|
|
||||||
|
# Create necessary directories
|
||||||
|
RUN mkdir -p /app/backend/uploads/{temp,processed,images} \
|
||||||
|
&& mkdir -p /app/backend/storage/{markdown,json,exports,results} \
|
||||||
|
&& mkdir -p /app/backend/models/paddleocr \
|
||||||
|
&& mkdir -p /app/backend/logs
|
||||||
|
|
||||||
|
# Set working directory
|
||||||
|
WORKDIR /app/backend
|
||||||
|
|
||||||
|
# Environment variables
|
||||||
|
ENV PYTHONUNBUFFERED=1
|
||||||
|
ENV PYTHONDONTWRITEBYTECODE=1
|
||||||
|
|
||||||
|
# Expose port
|
||||||
|
EXPOSE 8000
|
||||||
|
|
||||||
|
# Health check
|
||||||
|
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
|
||||||
|
CMD curl -f http://localhost:8000/health || exit 1
|
||||||
|
|
||||||
|
# Run application
|
||||||
|
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--workers", "4"]
|
||||||
38
deploy/1panel/Dockerfile.frontend
Normal file
38
deploy/1panel/Dockerfile.frontend
Normal file
@@ -0,0 +1,38 @@
|
|||||||
|
# Tool_OCR Frontend Dockerfile
|
||||||
|
# Multi-stage build for optimized image size
|
||||||
|
|
||||||
|
# Stage 1: Build
|
||||||
|
FROM node:20-alpine as builder
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
|
||||||
|
# Copy package files
|
||||||
|
COPY frontend/package*.json ./
|
||||||
|
|
||||||
|
# Install dependencies
|
||||||
|
RUN npm ci
|
||||||
|
|
||||||
|
# Copy source code
|
||||||
|
COPY frontend/ ./
|
||||||
|
|
||||||
|
# Build application
|
||||||
|
RUN npm run build
|
||||||
|
|
||||||
|
# Stage 2: Production
|
||||||
|
FROM nginx:alpine
|
||||||
|
|
||||||
|
# Copy built files
|
||||||
|
COPY --from=builder /app/dist /usr/share/nginx/html
|
||||||
|
|
||||||
|
# Copy nginx configuration
|
||||||
|
COPY deploy/1panel/nginx.conf /etc/nginx/conf.d/default.conf
|
||||||
|
|
||||||
|
# Expose port
|
||||||
|
EXPOSE 80
|
||||||
|
|
||||||
|
# Health check
|
||||||
|
HEALTHCHECK --interval=30s --timeout=10s --retries=3 \
|
||||||
|
CMD curl -f http://localhost:80 || exit 1
|
||||||
|
|
||||||
|
# Run nginx
|
||||||
|
CMD ["nginx", "-g", "daemon off;"]
|
||||||
125
deploy/1panel/README.md
Normal file
125
deploy/1panel/README.md
Normal file
@@ -0,0 +1,125 @@
|
|||||||
|
# Tool_OCR - 1Panel Docker 佈署指南
|
||||||
|
|
||||||
|
本目錄包含 1Panel 平台的 Docker 佈署配置。
|
||||||
|
|
||||||
|
## 前置需求
|
||||||
|
|
||||||
|
- Docker 20.10+
|
||||||
|
- Docker Compose 2.0+
|
||||||
|
- NVIDIA Container Toolkit(GPU 加速)
|
||||||
|
- MySQL 8.0+(外部資料庫)
|
||||||
|
|
||||||
|
## 佈署步驟
|
||||||
|
|
||||||
|
### 1. 配置環境變數
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 複製環境變數範本
|
||||||
|
cp .env.example .env
|
||||||
|
|
||||||
|
# 編輯配置
|
||||||
|
nano .env
|
||||||
|
```
|
||||||
|
|
||||||
|
必要配置項:
|
||||||
|
- `MYSQL_*` - 資料庫連線設定
|
||||||
|
- `SECRET_KEY` - JWT 簽名金鑰(使用 `openssl rand -hex 32` 生成)
|
||||||
|
- `DIFY_*` - 翻譯 API 設定
|
||||||
|
|
||||||
|
### 2. 建置映像
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 建置所有服務
|
||||||
|
docker-compose build
|
||||||
|
|
||||||
|
# 或分別建置
|
||||||
|
docker-compose build backend
|
||||||
|
docker-compose build frontend
|
||||||
|
```
|
||||||
|
|
||||||
|
### 3. 啟動服務
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 啟動所有服務
|
||||||
|
docker-compose up -d
|
||||||
|
|
||||||
|
# 查看日誌
|
||||||
|
docker-compose logs -f
|
||||||
|
|
||||||
|
# 查看狀態
|
||||||
|
docker-compose ps
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. 存取服務
|
||||||
|
|
||||||
|
- 前端介面:http://your-server:12010
|
||||||
|
- API 文件:http://your-server:8000/docs
|
||||||
|
- 健康檢查:http://your-server:8000/health
|
||||||
|
|
||||||
|
## 服務管理
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# 停止服務
|
||||||
|
docker-compose down
|
||||||
|
|
||||||
|
# 重啟服務
|
||||||
|
docker-compose restart
|
||||||
|
|
||||||
|
# 更新服務
|
||||||
|
docker-compose pull
|
||||||
|
docker-compose up -d --build
|
||||||
|
|
||||||
|
# 查看資源使用
|
||||||
|
docker stats
|
||||||
|
```
|
||||||
|
|
||||||
|
## 資料持久化
|
||||||
|
|
||||||
|
以下 Docker volumes 會自動建立:
|
||||||
|
|
||||||
|
| Volume | 用途 |
|
||||||
|
|--------|------|
|
||||||
|
| `tool_ocr_uploads` | 上傳檔案 |
|
||||||
|
| `tool_ocr_storage` | 處理結果 |
|
||||||
|
| `tool_ocr_logs` | 應用日誌 |
|
||||||
|
| `tool_ocr_models` | OCR 模型快取 |
|
||||||
|
|
||||||
|
## GPU 加速
|
||||||
|
|
||||||
|
預設配置會使用 NVIDIA GPU。如果不需要 GPU 加速,請修改 `docker-compose.yml`,移除 `deploy.resources.reservations` 區塊。
|
||||||
|
|
||||||
|
## 常見問題
|
||||||
|
|
||||||
|
### Q: 服務啟動失敗?
|
||||||
|
|
||||||
|
檢查日誌:
|
||||||
|
```bash
|
||||||
|
docker-compose logs backend
|
||||||
|
docker-compose logs frontend
|
||||||
|
```
|
||||||
|
|
||||||
|
### Q: 資料庫連線失敗?
|
||||||
|
|
||||||
|
確認:
|
||||||
|
1. MySQL 服務正在運行
|
||||||
|
2. 防火牆允許連接
|
||||||
|
3. 使用者權限正確
|
||||||
|
|
||||||
|
### Q: GPU 不可用?
|
||||||
|
|
||||||
|
確認 NVIDIA Container Toolkit 已安裝:
|
||||||
|
```bash
|
||||||
|
nvidia-smi
|
||||||
|
docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi
|
||||||
|
```
|
||||||
|
|
||||||
|
## 1Panel 整合
|
||||||
|
|
||||||
|
在 1Panel 中:
|
||||||
|
|
||||||
|
1. 進入「應用商店」→「自訂應用」
|
||||||
|
2. 上傳此目錄的所有檔案
|
||||||
|
3. 配置環境變數
|
||||||
|
4. 點擊「安裝」
|
||||||
|
|
||||||
|
或使用 1Panel 的「容器」功能直接導入 `docker-compose.yml`。
|
||||||
68
deploy/1panel/docker-compose.yml
Normal file
68
deploy/1panel/docker-compose.yml
Normal file
@@ -0,0 +1,68 @@
|
|||||||
|
version: '3.8'
|
||||||
|
|
||||||
|
services:
|
||||||
|
backend:
|
||||||
|
build:
|
||||||
|
context: ../..
|
||||||
|
dockerfile: deploy/1panel/Dockerfile.backend
|
||||||
|
container_name: tool_ocr_backend
|
||||||
|
restart: unless-stopped
|
||||||
|
ports:
|
||||||
|
- "${BACKEND_PORT:-8000}:8000"
|
||||||
|
environment:
|
||||||
|
- MYSQL_HOST=${MYSQL_HOST}
|
||||||
|
- MYSQL_PORT=${MYSQL_PORT}
|
||||||
|
- MYSQL_USER=${MYSQL_USER}
|
||||||
|
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
|
||||||
|
- MYSQL_DATABASE=${MYSQL_DATABASE}
|
||||||
|
- SECRET_KEY=${SECRET_KEY}
|
||||||
|
- DIFY_BASE_URL=${DIFY_BASE_URL}
|
||||||
|
- DIFY_API_KEY=${DIFY_API_KEY}
|
||||||
|
- CORS_ORIGINS=http://localhost:${FRONTEND_PORT:-12010}
|
||||||
|
- LOG_LEVEL=${LOG_LEVEL:-INFO}
|
||||||
|
volumes:
|
||||||
|
- tool_ocr_uploads:/app/backend/uploads
|
||||||
|
- tool_ocr_storage:/app/backend/storage
|
||||||
|
- tool_ocr_logs:/app/backend/logs
|
||||||
|
- tool_ocr_models:/app/backend/models
|
||||||
|
healthcheck:
|
||||||
|
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
|
||||||
|
interval: 30s
|
||||||
|
timeout: 10s
|
||||||
|
retries: 3
|
||||||
|
start_period: 60s
|
||||||
|
deploy:
|
||||||
|
resources:
|
||||||
|
reservations:
|
||||||
|
devices:
|
||||||
|
- driver: nvidia
|
||||||
|
count: 1
|
||||||
|
capabilities: [gpu]
|
||||||
|
|
||||||
|
frontend:
|
||||||
|
build:
|
||||||
|
context: ../..
|
||||||
|
dockerfile: deploy/1panel/Dockerfile.frontend
|
||||||
|
container_name: tool_ocr_frontend
|
||||||
|
restart: unless-stopped
|
||||||
|
ports:
|
||||||
|
- "${FRONTEND_PORT:-12010}:80"
|
||||||
|
environment:
|
||||||
|
- VITE_API_BASE_URL=http://localhost:${BACKEND_PORT:-8000}
|
||||||
|
depends_on:
|
||||||
|
- backend
|
||||||
|
healthcheck:
|
||||||
|
test: ["CMD", "curl", "-f", "http://localhost:80"]
|
||||||
|
interval: 30s
|
||||||
|
timeout: 10s
|
||||||
|
retries: 3
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
tool_ocr_uploads:
|
||||||
|
tool_ocr_storage:
|
||||||
|
tool_ocr_logs:
|
||||||
|
tool_ocr_models:
|
||||||
|
|
||||||
|
networks:
|
||||||
|
default:
|
||||||
|
name: tool_ocr_network
|
||||||
37
deploy/1panel/nginx.conf
Normal file
37
deploy/1panel/nginx.conf
Normal file
@@ -0,0 +1,37 @@
|
|||||||
|
server {
|
||||||
|
listen 80;
|
||||||
|
server_name _;
|
||||||
|
|
||||||
|
root /usr/share/nginx/html;
|
||||||
|
index index.html;
|
||||||
|
|
||||||
|
# Gzip compression
|
||||||
|
gzip on;
|
||||||
|
gzip_vary on;
|
||||||
|
gzip_min_length 1024;
|
||||||
|
gzip_proxied expired no-cache no-store private auth;
|
||||||
|
gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml application/javascript;
|
||||||
|
|
||||||
|
# Security headers
|
||||||
|
add_header X-Frame-Options "SAMEORIGIN" always;
|
||||||
|
add_header X-Content-Type-Options "nosniff" always;
|
||||||
|
add_header X-XSS-Protection "1; mode=block" always;
|
||||||
|
|
||||||
|
# SPA routing - all routes go to index.html
|
||||||
|
location / {
|
||||||
|
try_files $uri $uri/ /index.html;
|
||||||
|
}
|
||||||
|
|
||||||
|
# Static assets caching
|
||||||
|
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2|ttf|eot)$ {
|
||||||
|
expires 1y;
|
||||||
|
add_header Cache-Control "public, immutable";
|
||||||
|
}
|
||||||
|
|
||||||
|
# Health check endpoint
|
||||||
|
location /health {
|
||||||
|
access_log off;
|
||||||
|
return 200 "healthy\n";
|
||||||
|
add_header Content-Type text/plain;
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -5,3 +5,7 @@
|
|||||||
# For local development: http://localhost:8000
|
# For local development: http://localhost:8000
|
||||||
# For WSL2: Use the WSL2 IP address (get with: hostname -I | awk '{print $1}')
|
# For WSL2: Use the WSL2 IP address (get with: hostname -I | awk '{print $1}')
|
||||||
VITE_API_BASE_URL=http://localhost:8000
|
VITE_API_BASE_URL=http://localhost:8000
|
||||||
|
|
||||||
|
# Dev server configuration (used by `vite.config.ts`)
|
||||||
|
FRONTEND_HOST=0.0.0.0
|
||||||
|
FRONTEND_PORT=5173
|
||||||
|
|||||||
@@ -30,7 +30,7 @@ export default function ResultsPage() {
|
|||||||
|
|
||||||
// Construct PDF URL for preview - memoize to prevent unnecessary reloads
|
// Construct PDF URL for preview - memoize to prevent unnecessary reloads
|
||||||
// Must be called unconditionally before any early returns (React hooks rule)
|
// Must be called unconditionally before any early returns (React hooks rule)
|
||||||
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || 'http://localhost:8000'
|
const API_BASE_URL = (import.meta.env.VITE_API_BASE_URL || '').replace(/\/$/, '')
|
||||||
const pdfUrl = useMemo(() => {
|
const pdfUrl = useMemo(() => {
|
||||||
return taskId ? `${API_BASE_URL}/api/v2/tasks/${taskId}/download/pdf` : ''
|
return taskId ? `${API_BASE_URL}/api/v2/tasks/${taskId}/download/pdf` : ''
|
||||||
}, [taskId, API_BASE_URL])
|
}, [taskId, API_BASE_URL])
|
||||||
|
|||||||
@@ -147,7 +147,7 @@ export default function TaskDetailPage() {
|
|||||||
|
|
||||||
// Construct PDF URL for preview - memoize to prevent unnecessary reloads
|
// Construct PDF URL for preview - memoize to prevent unnecessary reloads
|
||||||
// Must be called unconditionally before any early returns (React hooks rule)
|
// Must be called unconditionally before any early returns (React hooks rule)
|
||||||
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || 'http://localhost:8000'
|
const API_BASE_URL = (import.meta.env.VITE_API_BASE_URL || '').replace(/\/$/, '')
|
||||||
const pdfUrl = useMemo(() => {
|
const pdfUrl = useMemo(() => {
|
||||||
return taskId ? `${API_BASE_URL}/api/v2/tasks/${taskId}/download/pdf` : ''
|
return taskId ? `${API_BASE_URL}/api/v2/tasks/${taskId}/download/pdf` : ''
|
||||||
}, [taskId, API_BASE_URL])
|
}, [taskId, API_BASE_URL])
|
||||||
|
|||||||
@@ -47,10 +47,10 @@ import type {
|
|||||||
/**
|
/**
|
||||||
* API Client Configuration
|
* API Client Configuration
|
||||||
* - In Docker: VITE_API_BASE_URL is empty string, use relative path
|
* - In Docker: VITE_API_BASE_URL is empty string, use relative path
|
||||||
* - In development: Use VITE_API_BASE_URL from .env or default to localhost:8000
|
* - In development: Prefer relative path + Vite proxy; optionally set VITE_API_BASE_URL for non-proxied setups
|
||||||
*/
|
*/
|
||||||
const envApiBaseUrl = import.meta.env.VITE_API_BASE_URL
|
const envApiBaseUrl = import.meta.env.VITE_API_BASE_URL
|
||||||
const API_BASE_URL = envApiBaseUrl !== undefined ? envApiBaseUrl : 'http://localhost:8000'
|
const API_BASE_URL = envApiBaseUrl !== undefined ? envApiBaseUrl : ''
|
||||||
const API_VERSION = 'v2'
|
const API_VERSION = 'v2'
|
||||||
|
|
||||||
class ApiClientV2 {
|
class ApiClientV2 {
|
||||||
|
|||||||
@@ -1,16 +1,25 @@
|
|||||||
import { defineConfig } from 'vite'
|
import { defineConfig, loadEnv } from 'vite'
|
||||||
import react from '@vitejs/plugin-react'
|
import react from '@vitejs/plugin-react'
|
||||||
import path from 'path'
|
import path from 'path'
|
||||||
|
|
||||||
// https://vite.dev/config/
|
// https://vite.dev/config/
|
||||||
export default defineConfig({
|
export default defineConfig(({ mode }) => {
|
||||||
|
const env = loadEnv(mode, process.cwd(), '')
|
||||||
|
|
||||||
|
const backendPort = env.BACKEND_PORT || '8000'
|
||||||
|
const backendBaseUrl = (env.VITE_API_BASE_URL || env.VITE_API_URL || `http://localhost:${backendPort}`).replace(/\/$/, '')
|
||||||
|
|
||||||
|
const frontendPort = Number.parseInt(env.FRONTEND_PORT || env.VITE_PORT || '5173', 10)
|
||||||
|
const frontendHost = env.FRONTEND_HOST || '0.0.0.0'
|
||||||
|
|
||||||
|
return {
|
||||||
plugins: [react()],
|
plugins: [react()],
|
||||||
server: {
|
server: {
|
||||||
host: '0.0.0.0',
|
host: frontendHost,
|
||||||
port: 5173,
|
port: Number.isFinite(frontendPort) ? frontendPort : 5173,
|
||||||
proxy: {
|
proxy: {
|
||||||
'/api': {
|
'/api': {
|
||||||
target: process.env.VITE_API_URL || 'http://localhost:8000',
|
target: backendBaseUrl,
|
||||||
changeOrigin: true,
|
changeOrigin: true,
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
@@ -20,4 +29,5 @@ export default defineConfig({
|
|||||||
'@': path.resolve(__dirname, './src'),
|
'@': path.resolve(__dirname, './src'),
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
|
}
|
||||||
})
|
})
|
||||||
|
|||||||
@@ -3,8 +3,7 @@
|
|||||||
Test script for PDF Preprocessing Pipeline.
|
Test script for PDF Preprocessing Pipeline.
|
||||||
|
|
||||||
Usage:
|
Usage:
|
||||||
cd /home/egg/project/Tool_OCR
|
PYTHONPATH=backend python3 scripts/run_preprocessing_tests.py
|
||||||
PYTHONPATH=backend python3 scripts/test_preprocessing_pipeline.py
|
|
||||||
"""
|
"""
|
||||||
|
|
||||||
import sys
|
import sys
|
||||||
|
|||||||
375
start-prod.sh
Executable file
375
start-prod.sh
Executable file
@@ -0,0 +1,375 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# Tool_OCR - Production Server Startup Script
|
||||||
|
# Usage:
|
||||||
|
# ./start-prod.sh Start all services (backend + frontend)
|
||||||
|
# ./start-prod.sh backend Start only backend
|
||||||
|
# ./start-prod.sh frontend Start only frontend
|
||||||
|
# ./start-prod.sh --stop Stop all services
|
||||||
|
# ./start-prod.sh --status Show service status
|
||||||
|
# ./start-prod.sh --help Show help
|
||||||
|
|
||||||
|
# Note: We don't use 'set -e' here because stop commands may fail gracefully
|
||||||
|
|
||||||
|
# Colors
|
||||||
|
GREEN='\033[0;32m'
|
||||||
|
YELLOW='\033[1;33m'
|
||||||
|
RED='\033[0;31m'
|
||||||
|
BLUE='\033[0;34m'
|
||||||
|
NC='\033[0m'
|
||||||
|
|
||||||
|
# Configuration
|
||||||
|
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||||
|
PID_DIR="$SCRIPT_DIR/.pid-prod"
|
||||||
|
BACKEND_PID_FILE="$PID_DIR/backend.pid"
|
||||||
|
FRONTEND_PID_FILE="$PID_DIR/frontend.pid"
|
||||||
|
|
||||||
|
# Production defaults (different from development)
|
||||||
|
BACKEND_HOST=${BACKEND_HOST:-0.0.0.0}
|
||||||
|
BACKEND_PORT=${BACKEND_PORT:-8000}
|
||||||
|
FRONTEND_HOST=${FRONTEND_HOST:-0.0.0.0}
|
||||||
|
FRONTEND_PORT=${FRONTEND_PORT:-12010}
|
||||||
|
|
||||||
|
# Production-specific settings
|
||||||
|
UVICORN_WORKERS=${UVICORN_WORKERS:-4}
|
||||||
|
|
||||||
|
# Create PID directory
|
||||||
|
mkdir -p "$PID_DIR"
|
||||||
|
|
||||||
|
# Functions
|
||||||
|
print_header() {
|
||||||
|
echo ""
|
||||||
|
echo -e "${BLUE}================================${NC}"
|
||||||
|
echo -e "${BLUE} Tool_OCR Production Server${NC}"
|
||||||
|
echo -e "${BLUE}================================${NC}"
|
||||||
|
echo ""
|
||||||
|
}
|
||||||
|
|
||||||
|
show_help() {
|
||||||
|
echo "Tool_OCR - Production Server Manager"
|
||||||
|
echo ""
|
||||||
|
echo "Usage: ./start-prod.sh [command]"
|
||||||
|
echo ""
|
||||||
|
echo "Commands:"
|
||||||
|
echo " (none) Start all services (backend + frontend)"
|
||||||
|
echo " backend Start only backend service"
|
||||||
|
echo " frontend Start only frontend service"
|
||||||
|
echo " --stop Stop all running services"
|
||||||
|
echo " --status Show status of services"
|
||||||
|
echo " --help Show this help message"
|
||||||
|
echo ""
|
||||||
|
echo "Environment Variables:"
|
||||||
|
echo " BACKEND_PORT Backend port (default: 8000)"
|
||||||
|
echo " FRONTEND_PORT Frontend port (default: 12010)"
|
||||||
|
echo " UVICORN_WORKERS Number of uvicorn workers (default: 4)"
|
||||||
|
echo ""
|
||||||
|
echo "Examples:"
|
||||||
|
echo " ./start-prod.sh # Start everything"
|
||||||
|
echo " ./start-prod.sh backend # Start only backend"
|
||||||
|
echo " FRONTEND_PORT=3000 ./start-prod.sh # Custom port"
|
||||||
|
echo " ./start-prod.sh --stop # Stop all services"
|
||||||
|
echo ""
|
||||||
|
}
|
||||||
|
|
||||||
|
check_requirements() {
|
||||||
|
local missing=0
|
||||||
|
|
||||||
|
# Check virtual environment
|
||||||
|
if [ ! -d "$SCRIPT_DIR/venv" ]; then
|
||||||
|
echo -e "${RED}Error: Python virtual environment not found${NC}"
|
||||||
|
echo "Please run: ./setup_dev_env.sh"
|
||||||
|
missing=1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check node_modules
|
||||||
|
if [ ! -d "$SCRIPT_DIR/frontend/node_modules" ]; then
|
||||||
|
echo -e "${RED}Error: Frontend dependencies not found${NC}"
|
||||||
|
echo "Please run: ./setup_dev_env.sh"
|
||||||
|
missing=1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check .env (production uses .env, not .env.local)
|
||||||
|
if [ ! -f "$SCRIPT_DIR/.env" ]; then
|
||||||
|
echo -e "${YELLOW}Warning: .env not found, using defaults${NC}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
return $missing
|
||||||
|
}
|
||||||
|
|
||||||
|
load_env() {
|
||||||
|
# Production loads .env only (not .env.local)
|
||||||
|
# For production, you should use .env with production settings
|
||||||
|
if [ -f "$SCRIPT_DIR/.env" ]; then
|
||||||
|
set -a
|
||||||
|
# shellcheck disable=SC1090
|
||||||
|
source "$SCRIPT_DIR/.env"
|
||||||
|
set +a
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
is_running() {
|
||||||
|
local pid_file=$1
|
||||||
|
if [ -f "$pid_file" ]; then
|
||||||
|
local pid=$(cat "$pid_file")
|
||||||
|
if ps -p "$pid" > /dev/null 2>&1; then
|
||||||
|
return 0
|
||||||
|
else
|
||||||
|
# PID file exists but process is not running, clean up
|
||||||
|
rm -f "$pid_file"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
get_pid() {
|
||||||
|
local pid_file=$1
|
||||||
|
if [ -f "$pid_file" ]; then
|
||||||
|
cat "$pid_file"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
start_backend() {
|
||||||
|
if is_running "$BACKEND_PID_FILE"; then
|
||||||
|
echo -e "${YELLOW}Backend already running (PID: $(get_pid $BACKEND_PID_FILE))${NC}"
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo -e "${GREEN}Starting backend server (production mode)...${NC}"
|
||||||
|
|
||||||
|
# Activate virtual environment
|
||||||
|
source "$SCRIPT_DIR/venv/bin/activate"
|
||||||
|
|
||||||
|
# Load environment variables
|
||||||
|
load_env
|
||||||
|
|
||||||
|
# Start backend in background
|
||||||
|
cd "$SCRIPT_DIR/backend"
|
||||||
|
|
||||||
|
# Create necessary directories
|
||||||
|
mkdir -p uploads/{temp,processed,images}
|
||||||
|
mkdir -p storage/{markdown,json,exports,results}
|
||||||
|
mkdir -p models/paddleocr
|
||||||
|
mkdir -p logs
|
||||||
|
|
||||||
|
# Start uvicorn in production mode (no --reload, with workers)
|
||||||
|
nohup uvicorn app.main:app \
|
||||||
|
--host "$BACKEND_HOST" \
|
||||||
|
--port "$BACKEND_PORT" \
|
||||||
|
--workers "$UVICORN_WORKERS" \
|
||||||
|
--access-log \
|
||||||
|
--log-level info \
|
||||||
|
> "$PID_DIR/backend.log" 2>&1 &
|
||||||
|
local pid=$!
|
||||||
|
echo $pid > "$BACKEND_PID_FILE"
|
||||||
|
|
||||||
|
cd "$SCRIPT_DIR"
|
||||||
|
|
||||||
|
# Wait a moment and verify
|
||||||
|
sleep 3
|
||||||
|
if is_running "$BACKEND_PID_FILE"; then
|
||||||
|
echo -e "${GREEN}Backend started (PID: $pid, Workers: $UVICORN_WORKERS)${NC}"
|
||||||
|
echo -e " API Docs: http://localhost:$BACKEND_PORT/docs"
|
||||||
|
echo -e " Health: http://localhost:$BACKEND_PORT/health"
|
||||||
|
else
|
||||||
|
echo -e "${RED}Backend failed to start. Check $PID_DIR/backend.log${NC}"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
start_frontend() {
|
||||||
|
if is_running "$FRONTEND_PID_FILE"; then
|
||||||
|
echo -e "${YELLOW}Frontend already running (PID: $(get_pid $FRONTEND_PID_FILE))${NC}"
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo -e "${GREEN}Starting frontend server (production mode)...${NC}"
|
||||||
|
|
||||||
|
# Load environment variables
|
||||||
|
load_env
|
||||||
|
|
||||||
|
# Load nvm
|
||||||
|
export NVM_DIR="$HOME/.nvm"
|
||||||
|
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
|
||||||
|
|
||||||
|
cd "$SCRIPT_DIR/frontend"
|
||||||
|
|
||||||
|
# Build frontend if dist doesn't exist or is older than src
|
||||||
|
if [ ! -d "dist" ] || [ "$(find src -newer dist -type f 2>/dev/null | head -1)" ]; then
|
||||||
|
echo -e "${YELLOW}Building frontend...${NC}"
|
||||||
|
npm run build
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Start production preview server
|
||||||
|
# Note: For real production, use nginx to serve dist/ folder
|
||||||
|
nohup npm run preview -- --host "$FRONTEND_HOST" --port "$FRONTEND_PORT" > "$PID_DIR/frontend.log" 2>&1 &
|
||||||
|
local pid=$!
|
||||||
|
echo $pid > "$FRONTEND_PID_FILE"
|
||||||
|
|
||||||
|
cd "$SCRIPT_DIR"
|
||||||
|
|
||||||
|
# Wait a moment and verify
|
||||||
|
sleep 3
|
||||||
|
if is_running "$FRONTEND_PID_FILE"; then
|
||||||
|
echo -e "${GREEN}Frontend started (PID: $pid)${NC}"
|
||||||
|
echo -e " URL: http://localhost:$FRONTEND_PORT"
|
||||||
|
else
|
||||||
|
echo -e "${RED}Frontend failed to start. Check $PID_DIR/frontend.log${NC}"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
kill_process_tree() {
|
||||||
|
local pid=$1
|
||||||
|
# Kill all child processes first
|
||||||
|
pkill -TERM -P "$pid" 2>/dev/null || true
|
||||||
|
# Then kill the parent
|
||||||
|
kill -TERM "$pid" 2>/dev/null || true
|
||||||
|
}
|
||||||
|
|
||||||
|
force_kill_process_tree() {
|
||||||
|
local pid=$1
|
||||||
|
# Force kill all child processes
|
||||||
|
pkill -9 -P "$pid" 2>/dev/null || true
|
||||||
|
# Force kill the parent
|
||||||
|
kill -9 "$pid" 2>/dev/null || true
|
||||||
|
}
|
||||||
|
|
||||||
|
kill_by_port() {
|
||||||
|
local port=$1
|
||||||
|
local pids=$(lsof -ti :$port 2>/dev/null)
|
||||||
|
if [ -n "$pids" ]; then
|
||||||
|
echo "$pids" | xargs kill -TERM 2>/dev/null || true
|
||||||
|
sleep 1
|
||||||
|
# Force kill if still running
|
||||||
|
pids=$(lsof -ti :$port 2>/dev/null)
|
||||||
|
if [ -n "$pids" ]; then
|
||||||
|
echo "$pids" | xargs kill -9 2>/dev/null || true
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
stop_service() {
|
||||||
|
local name=$1
|
||||||
|
local pid_file=$2
|
||||||
|
local port=$3
|
||||||
|
|
||||||
|
if is_running "$pid_file"; then
|
||||||
|
local pid=$(get_pid "$pid_file")
|
||||||
|
echo -e "${YELLOW}Stopping $name (PID: $pid)...${NC}"
|
||||||
|
|
||||||
|
# Kill the entire process tree
|
||||||
|
kill_process_tree "$pid"
|
||||||
|
|
||||||
|
# Wait up to 5 seconds
|
||||||
|
local count=0
|
||||||
|
while [ $count -lt 5 ] && is_running "$pid_file"; do
|
||||||
|
sleep 1
|
||||||
|
count=$((count + 1))
|
||||||
|
done
|
||||||
|
|
||||||
|
# Force kill if still running
|
||||||
|
if is_running "$pid_file"; then
|
||||||
|
force_kill_process_tree "$pid"
|
||||||
|
fi
|
||||||
|
|
||||||
|
rm -f "$pid_file"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Also kill any orphaned processes by port (fallback)
|
||||||
|
if [ -n "$port" ]; then
|
||||||
|
local port_pids=$(lsof -ti :$port 2>/dev/null)
|
||||||
|
if [ -n "$port_pids" ]; then
|
||||||
|
echo -e "${YELLOW}Cleaning up orphaned processes on port $port...${NC}"
|
||||||
|
kill_by_port "$port"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo -e "${GREEN}$name stopped${NC}"
|
||||||
|
}
|
||||||
|
|
||||||
|
stop_all() {
|
||||||
|
echo -e "${YELLOW}Stopping all production services...${NC}"
|
||||||
|
stop_service "Backend" "$BACKEND_PID_FILE" "$BACKEND_PORT"
|
||||||
|
stop_service "Frontend" "$FRONTEND_PID_FILE" "$FRONTEND_PORT"
|
||||||
|
echo -e "${GREEN}All services stopped${NC}"
|
||||||
|
}
|
||||||
|
|
||||||
|
show_status() {
|
||||||
|
echo -e "${BLUE}Production Service Status:${NC}"
|
||||||
|
echo ""
|
||||||
|
|
||||||
|
if is_running "$BACKEND_PID_FILE"; then
|
||||||
|
local pid=$(get_pid "$BACKEND_PID_FILE")
|
||||||
|
echo -e " Backend: ${GREEN}Running${NC} (PID: $pid, Workers: $UVICORN_WORKERS)"
|
||||||
|
echo -e " http://localhost:$BACKEND_PORT"
|
||||||
|
else
|
||||||
|
echo -e " Backend: ${RED}Stopped${NC}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
if is_running "$FRONTEND_PID_FILE"; then
|
||||||
|
local pid=$(get_pid "$FRONTEND_PID_FILE")
|
||||||
|
echo -e " Frontend: ${GREEN}Running${NC} (PID: $pid)"
|
||||||
|
echo -e " http://localhost:$FRONTEND_PORT"
|
||||||
|
else
|
||||||
|
echo -e " Frontend: ${RED}Stopped${NC}"
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
}
|
||||||
|
|
||||||
|
# Main
|
||||||
|
case "${1:-all}" in
|
||||||
|
--help|-h)
|
||||||
|
show_help
|
||||||
|
;;
|
||||||
|
--stop)
|
||||||
|
stop_all
|
||||||
|
;;
|
||||||
|
--status)
|
||||||
|
show_status
|
||||||
|
;;
|
||||||
|
backend)
|
||||||
|
print_header
|
||||||
|
check_requirements || exit 1
|
||||||
|
start_backend
|
||||||
|
echo ""
|
||||||
|
echo -e "${YELLOW}Logs: tail -f $PID_DIR/backend.log${NC}"
|
||||||
|
;;
|
||||||
|
frontend)
|
||||||
|
print_header
|
||||||
|
check_requirements || exit 1
|
||||||
|
start_frontend
|
||||||
|
echo ""
|
||||||
|
echo -e "${YELLOW}Logs: tail -f $PID_DIR/frontend.log${NC}"
|
||||||
|
;;
|
||||||
|
all|"")
|
||||||
|
print_header
|
||||||
|
check_requirements || exit 1
|
||||||
|
start_backend
|
||||||
|
start_frontend
|
||||||
|
echo ""
|
||||||
|
echo -e "${GREEN}================================${NC}"
|
||||||
|
echo -e "${GREEN}All production services started!${NC}"
|
||||||
|
echo -e "${GREEN}================================${NC}"
|
||||||
|
echo ""
|
||||||
|
echo "Access the application:"
|
||||||
|
echo -e " Frontend: ${BLUE}http://localhost:$FRONTEND_PORT${NC}"
|
||||||
|
echo -e " API Docs: ${BLUE}http://localhost:$BACKEND_PORT/docs${NC}"
|
||||||
|
echo ""
|
||||||
|
echo -e "${YELLOW}Use ./start-prod.sh --stop to stop all services${NC}"
|
||||||
|
echo -e "${YELLOW}Use ./start-prod.sh --status to check status${NC}"
|
||||||
|
echo ""
|
||||||
|
echo "Logs:"
|
||||||
|
echo " Backend: tail -f $PID_DIR/backend.log"
|
||||||
|
echo " Frontend: tail -f $PID_DIR/frontend.log"
|
||||||
|
echo ""
|
||||||
|
echo -e "${YELLOW}Note: For real production deployment, consider using:${NC}"
|
||||||
|
echo " - nginx as reverse proxy"
|
||||||
|
echo " - systemd for service management"
|
||||||
|
echo " - Docker for containerization"
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
echo -e "${RED}Unknown command: $1${NC}"
|
||||||
|
show_help
|
||||||
|
exit 1
|
||||||
|
;;
|
||||||
|
esac
|
||||||
22
start.sh
22
start.sh
@@ -22,7 +22,9 @@ SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
|||||||
PID_DIR="$SCRIPT_DIR/.pid"
|
PID_DIR="$SCRIPT_DIR/.pid"
|
||||||
BACKEND_PID_FILE="$PID_DIR/backend.pid"
|
BACKEND_PID_FILE="$PID_DIR/backend.pid"
|
||||||
FRONTEND_PID_FILE="$PID_DIR/frontend.pid"
|
FRONTEND_PID_FILE="$PID_DIR/frontend.pid"
|
||||||
|
BACKEND_HOST=${BACKEND_HOST:-0.0.0.0}
|
||||||
BACKEND_PORT=${BACKEND_PORT:-8000}
|
BACKEND_PORT=${BACKEND_PORT:-8000}
|
||||||
|
FRONTEND_HOST=${FRONTEND_HOST:-0.0.0.0}
|
||||||
FRONTEND_PORT=${FRONTEND_PORT:-5173}
|
FRONTEND_PORT=${FRONTEND_PORT:-5173}
|
||||||
|
|
||||||
# Create PID directory
|
# Create PID directory
|
||||||
@@ -84,6 +86,17 @@ check_requirements() {
|
|||||||
return $missing
|
return $missing
|
||||||
}
|
}
|
||||||
|
|
||||||
|
load_env() {
|
||||||
|
# Load environment variables from root .env.local (if present).
|
||||||
|
# This keeps backend/frontend config in sync without hardcoding ports/URLs in scripts.
|
||||||
|
if [ -f "$SCRIPT_DIR/.env.local" ]; then
|
||||||
|
set -a
|
||||||
|
# shellcheck disable=SC1090
|
||||||
|
source "$SCRIPT_DIR/.env.local"
|
||||||
|
set +a
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
is_running() {
|
is_running() {
|
||||||
local pid_file=$1
|
local pid_file=$1
|
||||||
if [ -f "$pid_file" ]; then
|
if [ -f "$pid_file" ]; then
|
||||||
@@ -117,9 +130,7 @@ start_backend() {
|
|||||||
source "$SCRIPT_DIR/venv/bin/activate"
|
source "$SCRIPT_DIR/venv/bin/activate"
|
||||||
|
|
||||||
# Load environment variables
|
# Load environment variables
|
||||||
if [ -f "$SCRIPT_DIR/.env.local" ]; then
|
load_env
|
||||||
export $(grep -v '^#' "$SCRIPT_DIR/.env.local" | xargs)
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Start backend in background
|
# Start backend in background
|
||||||
cd "$SCRIPT_DIR/backend"
|
cd "$SCRIPT_DIR/backend"
|
||||||
@@ -131,7 +142,7 @@ start_backend() {
|
|||||||
mkdir -p logs
|
mkdir -p logs
|
||||||
|
|
||||||
# Start uvicorn
|
# Start uvicorn
|
||||||
nohup uvicorn app.main:app --reload --host 0.0.0.0 --port $BACKEND_PORT > "$SCRIPT_DIR/.pid/backend.log" 2>&1 &
|
nohup uvicorn app.main:app --reload --host "$BACKEND_HOST" --port "$BACKEND_PORT" > "$SCRIPT_DIR/.pid/backend.log" 2>&1 &
|
||||||
local pid=$!
|
local pid=$!
|
||||||
echo $pid > "$BACKEND_PID_FILE"
|
echo $pid > "$BACKEND_PID_FILE"
|
||||||
|
|
||||||
@@ -157,6 +168,9 @@ start_frontend() {
|
|||||||
|
|
||||||
echo -e "${GREEN}Starting frontend server...${NC}"
|
echo -e "${GREEN}Starting frontend server...${NC}"
|
||||||
|
|
||||||
|
# Load environment variables so Vite config can use FRONTEND_PORT/FRONTEND_HOST/etc.
|
||||||
|
load_env
|
||||||
|
|
||||||
# Load nvm
|
# Load nvm
|
||||||
export NVM_DIR="$HOME/.nvm"
|
export NVM_DIR="$HOME/.nvm"
|
||||||
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
|
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
|
||||||
|
|||||||
Reference in New Issue
Block a user