refactor: restructure as an Application Factory architecture

Refactor the Flask application into a proper Python package:
- Add the src/mes_dashboard/ package, replacing the apps/ directory
- Implement the Application Factory pattern (create_app())
- Remove all sys.path.insert hacks in favor of standard imports
- Add pyproject.toml defining the package metadata
- Add gunicorn.conf.py deployment settings
- Add a NoOpCache abstraction layer as a seam for future extension
- Add unit tests in tests/test_app_factory.py
- Update .gitignore for the new layout
- Add OpenSpec documents tracking the change

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
This commit is contained in:
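The commit message references the Application Factory pattern and a NoOpCache abstraction. A minimal sketch of what that combination typically looks like — the names `create_app` and `NoOpCache` come from the message above, but the body is illustrative, not the actual code in src/mes_dashboard/:

```python
from flask import Flask


class NoOpCache:
    """Cache abstraction that does nothing; a real backend can replace it later."""

    def get(self, key):
        return None

    def set(self, key, value, ttl=None):
        pass


def create_app(config=None):
    """Build and return a configured Flask app instance.

    Constructing the app inside a factory (instead of at module import time)
    is what lets tests create isolated instances and removes the need for
    sys.path hacks.
    """
    app = Flask(__name__)
    app.config.update(config or {})
    app.extensions["cache"] = NoOpCache()
    # Blueprints would be registered here.
    return app
```

Gunicorn can then be pointed at the factory (e.g. `gunicorn "mes_dashboard:create_app()"`), which is presumably what gunicorn.conf.py supports.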
.env.example (new file, 13 lines)
@@ -0,0 +1,13 @@
# Database Configuration
# Copy this file to .env and fill in your actual values

# Oracle Database connection
DB_HOST=10.1.1.58
DB_PORT=1521
DB_SERVICE=DWDB
DB_USER=your_username
DB_PASSWORD=your_password

# Flask Configuration (optional)
FLASK_ENV=development
FLASK_DEBUG=1
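These variables are typically loaded into the process environment before the app starts (for example via python-dotenv). A minimal stdlib-only sketch of the KEY=VALUE parsing involved — the helper name is illustrative:

```python
def parse_env(text):
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = """
# Oracle Database connection
DB_HOST=10.1.1.58
DB_PORT=1521
"""
print(parse_env(sample))  # {'DB_HOST': '10.1.1.58', 'DB_PORT': '1521'}
```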
.gitignore (vendored, 83 lines changed)
@@ -1,31 +1,52 @@
-# Python
-__pycache__/
-*.py[cod]
-*$py.class
-*.so
-.Python
-venv/
-ENV/
-env/
-.venv/
-
-# IDE
-.idea/
-.vscode/
-*.swp
-*.swo
-
-# OS
-.DS_Store
-Thumbs.db
-nul
-
-# Logs
-*.log
-
-# Local config
-.env
-*.local
-
-# Claude
-.claude/
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+
+# Virtual environments
+venv/
+ENV/
+env/
+.venv/
+
+# Package build artifacts
+*.egg-info/
+*.egg
+dist/
+build/
+*.whl
+
+# IDE
+.idea/
+.vscode/
+*.swp
+*.swo
+*.sublime-*
+
+# OS
+.DS_Store
+Thumbs.db
+
+# Logs
+*.log
+logs/
+
+# Local config (credentials)
+.env
+
+# AI/LLM tools
+.claude/
+.codex/
+
+# Test artifacts
+.pytest_cache/
+.coverage
+htmlcov/
+.tox/
+
+# Jupyter
+.ipynb_checkpoints/
+
+# Note: openspec/ is tracked (not ignored)
README.md (219 lines changed)
@@ -1,219 +1,219 @@
# MES Report Query System

An MES data report query and visualization system based on Vite/React + Python FastAPI.

---

## Project Status

- ✅ Database analysis complete
- ✅ System architecture design complete
- ✅ Data query tool complete
- ⏳ Awaiting Power BI report designs for reference
- ⏳ System development in progress

---

## Quick Start

### 1. WIP Report (available now) ⭐

Queries current work-in-process quantity statistics, grouped by operation, work center, or product line.

```bash
# Double-click to run
scripts\啟動Dashboard.bat
```

Then visit: **http://localhost:5000**
The landing page has tabs at the top to switch between "WIP Report / Table Query Tool".

**Features**:
- 📊 Overview statistics (total LOT count, total quantity, total piece count)
- 🔍 Statistics by SPEC and WORKCENTER
- 📈 Statistics by product line (summary + detail)
- ⏱️ Selectable time range (1-30 days)
- 🎨 Clean web UI

Details: [WIP報表說明.md](docs/WIP報表說明.md)

---

### 2. Browse Table Contents (available now)

#### Method A: Automatic initialization (recommended for first use)

```bash
# Step 1: Initialize the environment (run once)
Double-click: scripts\0_初始化環境.bat

# Step 2: Start the server
Double-click: scripts\啟動Dashboard.bat
```

#### Method B: Launch directly with Python

```bash
# If Flask, Pandas, and oracledb are already installed
python apps\快速啟動.py
```

#### Method C: Manual launch

```bash
# 1. Create a virtual environment (first time only)
python -m venv venv

# 2. Install dependencies (first time only)
venv\Scripts\pip.exe install -r requirements.txt

# 3. Start the server
venv\Scripts\python.exe apps\portal.py
```

Then visit: **http://localhost:5000**

**Features**:
- 📊 Tables grouped by nature (snapshot / history / auxiliary)
- 🔍 View the last 1000 rows of each table
- ⏱️ Large tables automatically sorted by their time column
- 📋 Shows the column list and data samples

---

## Documentation Structure

### Core documents

| Document | Purpose | Audience |
|------|------|---------|
| **[System_Architecture_Design.md](docs/System_Architecture_Design.md)** | Complete system architecture design | Architects, developers |
| **[MES_Core_Tables_Analysis_Report.md](docs/MES_Core_Tables_Analysis_Report.md)** | In-depth analysis of the core tables ⭐ | Developers, data analysts |
| **[MES_Database_Reference.md](docs/MES_Database_Reference.md)** | Full database structure reference | Developers |

### How the documents relate

```
docs/System_Architecture_Design.md (system design overview)
    ↓ references
docs/MES_Core_Tables_Analysis_Report.md (detailed table analysis)
    ↓ references
docs/MES_Database_Reference.md (table structure reference)
```

---

## Key Findings

### 1. Table classification

After in-depth analysis, the 16 core tables fall into:

- **Current-state snapshot tables (4)**: WIP, RESOURCE, CONTAINER, JOB
- **History accumulation tables (10)**: RESOURCESTATUS, LOTWIPHISTORY, etc.
- **Auxiliary tables (2)**: PARTREQUESTORDER, PJ_COMBINEDASSYLOTS

### 2. Important corrections

⚠️ **DW_MES_WIP** is named a "work-in-process table", but it actually contains **77 million rows of accumulated history**.

⚠️ **DW_MES_RESOURCESTATUS** records every equipment status change; durations must be computed from two timestamp columns:
```sql
status duration = (LASTSTATUSCHANGEDATE - OLDLASTSTATUSCHANGEDATE) * 24 hours
```
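The Oracle expression above relies on DATE subtraction yielding a difference in days, which `* 24` converts to hours. The same arithmetic in plain Python, with made-up timestamps standing in for the two columns:

```python
from datetime import datetime

# Hypothetical values for one status-change row
old_change = datetime(2026, 1, 14, 8, 0, 0)    # OLDLASTSTATUSCHANGEDATE
last_change = datetime(2026, 1, 14, 14, 30, 0)  # LASTSTATUSCHANGEDATE

# Difference in seconds, converted to hours
duration_hours = (last_change - old_change).total_seconds() / 3600
print(duration_hours)  # 6.5
```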

### 3. Query optimization iron rule

**Every table over 10 million rows must be queried with a time-range filter!**

```sql
-- DW_MES_WIP (77M rows)
WHERE TXNDATE >= TRUNC(SYSDATE) - 7

-- DW_MES_RESOURCESTATUS (65M rows)
WHERE OLDLASTSTATUSCHANGEDATE >= TRUNC(SYSDATE) - 7

-- DW_MES_LOTWIPHISTORY (53M rows)
WHERE TRACKINTIMESTAMP >= TRUNC(SYSDATE) - 7
```

**Recommended time ranges**:
- Dashboard queries: the last **7 days**
- Report queries: at most **30 days**
- Historical trends: at most **90 days**
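One simple way to enforce these limits server-side is to clamp the user-supplied `days` value before it reaches the SQL bind variable. A minimal sketch — the function name and the per-use-case mapping are illustrative, not part of the codebase:

```python
# Recommended maximums from the table above
LIMITS = {"dashboard": 7, "report": 30, "trend": 90}


def clamp_days(days, use_case="report"):
    """Return a day count bounded to [1, limit] for the given use case."""
    limit = LIMITS.get(use_case, 30)
    try:
        days = int(days)
    except (TypeError, ValueError):
        return 7  # fall back to the dashboard default
    return max(1, min(days, limit))


print(clamp_days(90, "report"))  # 30
print(clamp_days("abc"))         # 7
```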

---

## Core Business Scenarios

Based on the table analysis, the system should focus on:

1. ✅ **Work-in-process (WIP) board** - uses DW_MES_WIP
2. ⭐ **Equipment utilization (OEE) reports** - uses DW_MES_RESOURCESTATUS
3. ✅ **Lot production history tracing** - uses DW_MES_LOTWIPHISTORY
4. ✅ **Per-operation cycle time analysis** - uses DW_MES_LOTWIPHISTORY
5. ✅ **Equipment output and efficiency analysis** - uses DW_MES_HM_LOTMOVEOUT
6. ✅ **Held-lot analysis** - uses DW_MES_WIP + DW_MES_HOLDRELEASEHISTORY
7. ✅ **Maintenance job progress tracking** - uses DW_MES_JOB
8. ✅ **Yield analysis** - uses DW_MES_LOTREJECTHISTORY

---

## Technical Architecture

### Frontend stack
- React 18 + TypeScript
- Vite 5.x (build tool)
- Ant Design 5.x (UI component library)
- ECharts 5.x (charting)
- React Query 5.x (data management)

### Backend stack
- Python 3.11+
- FastAPI (web framework)
- oracledb 2.x (Oracle driver)
- Pandas 2.x (data processing)

### Database
- Oracle Database 19c Enterprise Edition
- Host: 10.1.1.58:1521
- Service name: DWDB
- User: MBU1_R (read-only)

---

## Development Plan

### Phase 1: Environment and foundations ⏳
- [ ] Initialize the FastAPI project
- [ ] Initialize the Vite + React project
- [ ] Set up the database connection pool
- [ ] Implement the base API structure

### Phase 2: Dashboard development ⏳
- [ ] Implement the dashboard APIs
- [ ] Build the dashboard frontend pages
- [ ] Implement the chart components

### Phase 3: Report query module ⏳
Pending confirmation of the Power BI screenshots

### Phase 4: Export features ⏳
- [ ] Implement Excel export
- [ ] Implement asynchronous export

### Phase 5: Optimization and testing ⏳
- [ ] Performance optimization
- [ ] Testing

### Phase 6: Deployment ⏳
- [ ] Prepare the deployment environment
- [ ] Deploy

---

## Project Files

```
DashBoard/
├── README.md                  # this file
```
@@ -229,25 +229,25 @@ DashBoard/
```
├── backend/                   # backend (to be developed)
└── frontend/                  # frontend (to be developed)
```

---

## Open Questions

1. ⏳ **Power BI report screenshots** - reference for the frontend UI design
2. ⏳ **Specific report types** - pick the first 3-5 of the 8 business scenarios to build
3. ⏳ **Deployment environment** - is there a dedicated server, and will Docker be used?
4. ⏳ **Concurrent users** - expected number of simultaneous users

---

## Contact

For technical questions or requirement changes, please keep the relevant documents up to date.

---

**Document version**: 1.0
**Last updated**: 2026-01-14
apps/portal.py (deleted, 2205 changed lines; diff truncated, showing the first hunk)
@@ -1,401 +0,0 @@
"""
WIP report query tool.
Queries quantity statistics for current work in process (WIP).
"""

import oracledb
import pandas as pd
from flask import Flask, render_template, request, jsonify
from datetime import datetime, timedelta

# Database connection settings
DB_CONFIG = {
    'user': 'MBU1_R',
    'password': 'Pj2481mbu1',
    'dsn': '10.1.1.58:1521/DWDB'
}

app = Flask(__name__)


def get_db_connection():
    """Open a database connection."""
    try:
        connection = oracledb.connect(**DB_CONFIG)
        return connection
    except Exception as e:
        print(f"Database connection failed: {e}")
        return None


def query_wip_by_spec_workcenter(days=7):
    """
    Query the current WIP quantity per SPECNAME and WORKCENTERNAME.

    Args:
        days: how many recent days to query (default 7)

    Returns:
        DataFrame with the statistics
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        # Aggregate by SPECNAME and WORKCENTERNAME
        sql = """
            SELECT
                SPECNAME,
                WORKCENTERNAME,
                COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2
            FROM DW_MES_WIP
            WHERE TXNDATE >= TRUNC(SYSDATE) - :days
                AND STATUS NOT IN (8, 128)  -- exclude completed/cancelled
                AND SPECNAME IS NOT NULL
                AND WORKCENTERNAME IS NOT NULL
            GROUP BY SPECNAME, WORKCENTERNAME
            ORDER BY SPECNAME, WORKCENTERNAME
        """

        df = pd.read_sql(sql, connection, params={'days': days})
        connection.close()
        return df

    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


def query_wip_by_product_line(days=7):
    """
    Query the WIP quantity distribution across product lines.

    Args:
        days: how many recent days to query (default 7)

    Returns:
        DataFrame with the product line statistics
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        # Aggregate by product line
        sql = """
            SELECT
                PRODUCTLINENAME_LEF,
                SPECNAME,
                WORKCENTERNAME,
                COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2
            FROM DW_MES_WIP
            WHERE TXNDATE >= TRUNC(SYSDATE) - :days
                AND STATUS NOT IN (8, 128)  -- exclude completed/cancelled
                AND PRODUCTLINENAME_LEF IS NOT NULL
            GROUP BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
            ORDER BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
        """

        df = pd.read_sql(sql, connection, params={'days': days})
        connection.close()
        return df

    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


def query_wip_summary(days=7):
    """
    Query the WIP overview statistics.

    Args:
        days: how many recent days to query (default 7)

    Returns:
        dict with the overall statistics
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        # Overview aggregation
        sql = """
            SELECT
                COUNT(DISTINCT CONTAINERNAME) as TOTAL_LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2,
                COUNT(DISTINCT SPECNAME) as SPEC_COUNT,
                COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
                COUNT(DISTINCT PRODUCTLINENAME_LEF) as PRODUCT_LINE_COUNT
            FROM DW_MES_WIP
            WHERE TXNDATE >= TRUNC(SYSDATE) - :days
                AND STATUS NOT IN (8, 128)
        """

        cursor = connection.cursor()
        cursor.execute(sql, {'days': days})
        result = cursor.fetchone()

        cursor.close()
        connection.close()

        if result:
            return {
                'total_lot_count': result[0] or 0,
                'total_qty': result[1] or 0,
                'total_qty2': result[2] or 0,
                'spec_count': result[3] or 0,
                'workcenter_count': result[4] or 0,
                'product_line_count': result[5] or 0
            }
        return None

    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


@app.route('/')
def index():
    """Home page."""
    return render_template('wip_report.html')


@app.route('/api/wip/summary')
def api_wip_summary():
    """API: WIP overview statistics."""
    days = request.args.get('days', 7, type=int)

    summary = query_wip_summary(days)
    if summary:
        return jsonify({'success': True, 'data': summary})
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


@app.route('/api/wip/by_spec_workcenter')
def api_wip_by_spec_workcenter():
    """API: statistics by SPEC and WORKCENTER."""
    days = request.args.get('days', 7, type=int)

    df = query_wip_by_spec_workcenter(days)
    if df is not None:
        # Convert to JSON
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data, 'count': len(data)})
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


@app.route('/api/wip/by_product_line')
def api_wip_by_product_line():
    """API: statistics by product line."""
    days = request.args.get('days', 7, type=int)

    df = query_wip_by_product_line(days)
    if df is not None:
        # Convert to JSON
        data = df.to_dict(orient='records')

        # Compute the product line summary
        if not df.empty:
            product_line_summary = df.groupby('PRODUCTLINENAME_LEF').agg({
                'LOT_COUNT': 'sum',
                'TOTAL_QTY': 'sum',
                'TOTAL_QTY2': 'sum'
            }).reset_index()

            summary = product_line_summary.to_dict(orient='records')
        else:
            summary = []

        return jsonify({
            'success': True,
            'data': data,
            'summary': summary,
            'count': len(data)
        })
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


def query_wip_by_status(days=7):
    """Query the WIP distribution per status."""
    connection = get_db_connection()
    if not connection:
        return None

    try:
        sql = """
            SELECT
                CASE STATUS
                    WHEN 1 THEN 'Queue'
                    WHEN 2 THEN 'Run'
                    WHEN 4 THEN 'Hold'
                    WHEN 8 THEN 'Complete'
                    WHEN 128 THEN 'Scrapped'
                    ELSE 'Other(' || STATUS || ')'
                END as STATUS_NAME,
                STATUS,
                COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2
            FROM DW_MES_WIP
            WHERE TXNDATE >= TRUNC(SYSDATE) - :days
            GROUP BY STATUS
            ORDER BY LOT_COUNT DESC
        """
        df = pd.read_sql(sql, connection, params={'days': days})
        connection.close()
        return df
    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


def query_wip_by_mfgorder(days=7, limit=20):
    """Query the WIP distribution per work order (GA)."""
    connection = get_db_connection()
    if not connection:
        return None

    try:
        sql = """
            SELECT * FROM (
                SELECT
                    MFGORDERNAME,
                    COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
                    SUM(QTY) as TOTAL_QTY,
                    SUM(QTY2) as TOTAL_QTY2
                FROM DW_MES_WIP
                WHERE TXNDATE >= TRUNC(SYSDATE) - :days
                    AND STATUS NOT IN (8, 128)
                    AND MFGORDERNAME IS NOT NULL
                GROUP BY MFGORDERNAME
                ORDER BY LOT_COUNT DESC
            ) WHERE ROWNUM <= :limit
        """
        df = pd.read_sql(sql, connection, params={'days': days, 'limit': limit})
        connection.close()
        return df
    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


def query_wip_heatmap(days=7):
    """Query SPEC × WORKCENTER heatmap data."""
    connection = get_db_connection()
    if not connection:
        return None

    try:
        sql = """
            SELECT
                SPECNAME,
                WORKCENTERNAME,
                SUM(QTY) as TOTAL_QTY
            FROM DW_MES_WIP
            WHERE TXNDATE >= TRUNC(SYSDATE) - :days
                AND STATUS NOT IN (8, 128)
                AND SPECNAME IS NOT NULL
                AND WORKCENTERNAME IS NOT NULL
            GROUP BY SPECNAME, WORKCENTERNAME
            ORDER BY SPECNAME, WORKCENTERNAME
        """
        df = pd.read_sql(sql, connection, params={'days': days})
        connection.close()
        return df
    except Exception as e:
        if connection:
            connection.close()
        print(f"Query failed: {e}")
        return None


@app.route('/api/wip/by_status')
def api_wip_by_status():
    """API: statistics by status."""
    days = request.args.get('days', 7, type=int)

    df = query_wip_by_status(days)
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


@app.route('/api/wip/by_mfgorder')
def api_wip_by_mfgorder():
    """API: statistics by work order (Top 20)."""
    days = request.args.get('days', 7, type=int)
    limit = request.args.get('limit', 20, type=int)

    df = query_wip_by_mfgorder(days, limit)
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


@app.route('/api/wip/heatmap')
def api_wip_heatmap():
    """API: SPEC × WORKCENTER heatmap data."""
    days = request.args.get('days', 7, type=int)

    df = query_wip_heatmap(days)
    if df is not None:
        if df.empty:
            return jsonify({'success': True, 'specs': [], 'workcenters': [], 'data': []})

        specs = sorted(df['SPECNAME'].unique().tolist())
        workcenters = sorted(df['WORKCENTERNAME'].unique().tolist())

        # Convert to heatmap format [workcenter_index, spec_index, value]
        heatmap_data = []
        for _, row in df.iterrows():
            spec_idx = specs.index(row['SPECNAME'])
            wc_idx = workcenters.index(row['WORKCENTERNAME'])
            heatmap_data.append([wc_idx, spec_idx, int(row['TOTAL_QTY'] or 0)])

        return jsonify({
            'success': True,
            'specs': specs,
            'workcenters': workcenters,
            'data': heatmap_data
        })
    else:
        return jsonify({'success': False, 'error': 'Query failed'}), 500


if __name__ == '__main__':
    # Test the database connection
    print("Testing the database connection...")
    conn = get_db_connection()
    if conn:
        print("✓ Database connection succeeded!")
        conn.close()

        # Test a query
        print("\nTesting a query...")
        summary = query_wip_summary()
        if summary:
            print("✓ WIP overview query succeeded!")
            print(f"  - Total LOT count: {summary['total_lot_count']}")
            print(f"  - Total quantity: {summary['total_qty']}")
            print(f"  - Total piece count: {summary['total_qty2']}")

        print("\nStarting the web server...")
        print("Visit: http://localhost:5001")
        print("Press Ctrl+C to stop the server\n")

        app.run(debug=True, host='0.0.0.0', port=5001)
    else:
        print("✗ Database connection failed; check the configuration")
apps/快速啟動.py (deleted, 29 lines)
@@ -1,29 +0,0 @@
"""
Quick-start script - can be run directly with Python.
Usage: python apps\快速啟動.py
"""

import sys
import os

# Check dependencies
try:
    import flask
    import pandas
    import oracledb
    print("✓ All dependencies are installed")
except ImportError as e:
    print(f"[Error] Missing dependency: {e}")
    print("\nInstall the dependencies first with:")
    print("  pip install flask pandas oracledb")
    print("\nOr run: scripts\\0_初始化環境.bat")
    sys.exit(1)

# Start the application
print("\nStarting the MES report portal...")
print("Visit: http://localhost:5000")
print("Press Ctrl+C to stop the server\n")

# Import and run (Python adds this script's directory, apps\, to sys.path,
# so portal.py is importable without path hacks)
from portal import app
app.run(debug=True, host='0.0.0.0', port=5000)
4 more file diffs suppressed because they are too large.
docs/WIP報表說明.md (deleted, 282 lines)
@@ -1,282 +0,0 @@
# WIP Report - User Guide

## Overview

This is a report tool dedicated to querying current work-in-process (WIP) quantity statistics.

### Column mapping

Per the requirements, the system displays the following columns:
- **CONTAINERNAME** = LOT ID (lot number)
- **GA_CONTAINERNAME** = GA LOT ID (GA lot number)
- **QTY** = quantity
- **QTY2** = piece count
- **MFGORDERNAME** = GA (work order number)

### Report contents

#### 1. Overview statistic cards
- Total LOT count: total number of lots currently in process
- Total quantity (QTY)
- Total piece count (QTY2)
- Operation count (SPEC): number of operations involved
- Work center count: number of work centers involved
- Product line count: number of product lines involved

#### 2. Statistics by operation and work center
Shows the current WIP quantity per SPECNAME (operation) and WORKCENTERNAME (work center):
- SPECNAME (operation)
- WORKCENTERNAME (work center)
- LOT count
- Total quantity (QTY)
- Total piece count (QTY2)

#### 3. Statistics by product line
Shows how much each product combination (PRODUCTLINENAME_LEF) accounts for:

**Product line summary**:
- PRODUCTLINENAME_LEF (product line)
- LOT count subtotal
- Total quantity (QTY) subtotal
- Total piece count (QTY2) subtotal

**Product line detail**:
- PRODUCTLINENAME_LEF (product line)
- SPECNAME (operation)
- WORKCENTERNAME (work center)
- LOT count
- Total quantity (QTY)
- Total piece count (QTY2)

---

## How to Launch

### Method 1: Use the launch script (recommended)

```bash
# Double-click to run
scripts\啟動Dashboard.bat
```

### Method 2: Manual launch

```bash
# Use the virtual environment's Python
venv\Scripts\python.exe apps\portal.py
```

Then visit: **http://localhost:5000**
(The landing page has tabs, or open **http://localhost:5000/wip** directly.)

---

## Usage

### 1. Choose a time range

In the dropdown at the top of the page, choose:
- Last 1 day
- Last 3 days
- Last 7 days (default)
- Last 14 days
- Last 30 days

### 2. Click Query

After choosing a time range, click the "🔍 Query" button to reload the data.

### 3. Switch report views

Use the in-page tabs to switch between statistic views:
- **By operation and work center**: WIP distribution per SPEC and WORKCENTER
- **By product line**: WIP distribution per product line (summary and detail)

---

## Query Logic

### Data scope

- Uses `TXNDATE >= TRUNC(SYSDATE) - N` to query the last N days of data
- Automatically excludes completed or cancelled lots (`STATUS NOT IN (8, 128)`)
- Queries only valid (non-NULL) rows

### Example SQL

#### By SPEC and WORKCENTER

```sql
SELECT
    SPECNAME,
    WORKCENTERNAME,
    COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
    SUM(QTY) as TOTAL_QTY,
    SUM(QTY2) as TOTAL_QTY2
FROM DW_MES_WIP
WHERE TXNDATE >= TRUNC(SYSDATE) - 7
    AND STATUS NOT IN (8, 128)
    AND SPECNAME IS NOT NULL
    AND WORKCENTERNAME IS NOT NULL
GROUP BY SPECNAME, WORKCENTERNAME
ORDER BY SPECNAME, WORKCENTERNAME
```

#### By product line

```sql
SELECT
    PRODUCTLINENAME_LEF,
    SPECNAME,
    WORKCENTERNAME,
    COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT,
    SUM(QTY) as TOTAL_QTY,
    SUM(QTY2) as TOTAL_QTY2
FROM DW_MES_WIP
WHERE TXNDATE >= TRUNC(SYSDATE) - 7
    AND STATUS NOT IN (8, 128)
    AND PRODUCTLINENAME_LEF IS NOT NULL
GROUP BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
ORDER BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
```

---

## API Endpoints

For programmatic access, use the following APIs:

### 1. WIP overview statistics
```
GET /api/wip/summary?days=7
```

**Response**:
```json
{
    "success": true,
    "data": {
        "total_lot_count": 1234,
        "total_qty": 567890,
        "total_qty2": 123456,
        "spec_count": 45,
        "workcenter_count": 23,
        "product_line_count": 12
    }
}
```

### 2. By SPEC and WORKCENTER
```
GET /api/wip/by_spec_workcenter?days=7
```

**Response**:
```json
{
    "success": true,
    "data": [
        {
            "SPECNAME": "SMT",
            "WORKCENTERNAME": "SMT-LINE1",
            "LOT_COUNT": 50,
            "TOTAL_QTY": 12500,
            "TOTAL_QTY2": 2500
        }
    ],
    "count": 100
}
```

### 3. By product line
```
GET /api/wip/by_product_line?days=7
```

**Response**:
```json
{
    "success": true,
    "data": [...],
    "summary": [
        {
            "PRODUCTLINENAME_LEF": "產品線A",
            "LOT_COUNT": 150,
            "TOTAL_QTY": 37500,
            "TOTAL_QTY2": 7500
        }
    ],
    "count": 200
}
```

---
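A minimal client-side sketch of calling these endpoints: building the query string and reading the documented response shape. The HTTP fetch itself is omitted (any client works); the payload below is a hand-written sample matching the response format above, not live data.

```python
import json
from urllib.parse import urlencode

# Build the request URL for the overview endpoint (base URL from this guide)
base = "http://localhost:5000/api/wip/summary"
url = f"{base}?{urlencode({'days': 7})}"
print(url)  # http://localhost:5000/api/wip/summary?days=7

# Parse a response shaped like the documented example
payload = json.loads(
    '{"success": true, "data": {"total_lot_count": 1234, "total_qty": 567890}}'
)
if payload["success"]:
    print(payload["data"]["total_lot_count"])  # 1234
```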

## Notes

### 1. Performance

- Queries filter on the `TXNDATE` column for the time range
- Keep the query range within 30 days
- Large queries may take a few seconds

### 2. Data freshness

- Data comes from the DW_MES_WIP table
- The `TXNDATE` column indicates when the data was last updated
- For the freshest data, choose "last 1 day" or "last 3 days"

### 3. Status filtering

The system automatically excludes lots in these statuses:
- `STATUS = 8`: completed
- `STATUS = 128`: cancelled

---

## FAQ

### Q1: Why is the result empty?

**Possible causes**:
1. No data in the selected time range
2. All lots are completed or cancelled
3. Database connection issues

**Fixes**:
1. Try a wider time range (e.g. "last 30 days")
2. Check the database connection status

### Q2: What does "-" mean in a number cell?

The column value is NULL or the query failed.

### Q3: How do I export data?

The current version has no export feature, but you can:
1. Fetch JSON data directly from the API endpoints
2. Copy the table contents from the browser into Excel

### Q4: Can I view multiple reports at the same time?

Yes; both views are served by the same portal on port 5000, under different routes:
- Table query tool: http://localhost:5000/tables
- WIP report: http://localhost:5000/wip

---

## Future Extensions

Features worth adding:
- [ ] Excel export
- [ ] Chart visualizations (pie, bar)
- [ ] More filters (product, work order number, etc.)
- [ ] Held-lot statistics
- [ ] Time-at-station analysis

---

**Version**: 1.0
**Created**: 2026-01-14
docs/使用說明.md (deleted, 210 lines)
@@ -1,210 +0,0 @@
# MES Data Query Tool - User Guide

## Purpose

This is a web tool for quickly inspecting the actual data in each MES database table, helping you:
- ✅ Confirm table structures and column contents
- ✅ View data samples (the last 1000 rows)
- ✅ Verify time columns and data formats
- ✅ Understand how the tables are actually used

---

## Launch Steps

### First use (environment initialization required)

1. **Double-click**: `scripts\0_初始化環境.bat`
   - Creates the Python virtual environment automatically
   - Installs the required dependencies (Flask, Pandas, oracledb)
   - Tests the database connection

2. **Double-click**: `scripts\啟動Dashboard.bat`
   - Starts the web server

3. **Open in a browser**: http://localhost:5000
   - The tabs at the top switch between "WIP Report / Table Query Tool"

### Subsequent use

Just double-click `scripts\啟動Dashboard.bat`

---

## Interface Guide

### Main page

After opening http://localhost:5000 you will see the landing page; switch to the "Table Query Tool" tab
(or open http://localhost:5000/tables directly).

```
┌─────────────────────────────────────────────┐
│ 📊 MES Table Query Tool                      │
│ Click a table name to view its last 1000 rows │
├─────────────────────────────────────────────┤
│ Snapshot tables                              │
│ ┌──────────────┐ ┌──────────────┐            │
│ │ WIP          │ │ RESOURCE     │            │
│ │ 77,470,834 rows │ │ 90,620 rows │          │
│ └──────────────┘ └──────────────┘            │
│                                              │
│ History tables                               │
│ ┌──────────────────┐ ┌──────────────┐        │
│ │ RESOURCESTATUS ⭐ │ │ LOTWIPHISTORY⭐│      │
│ │ 65,139,825 rows  │ │ 53,085,425 rows │     │
│ └──────────────────┘ └──────────────┘        │
│ ...                                          │
└─────────────────────────────────────────────┘
```

### Viewing table data

1. **Click any table card**
2. **The system queries and displays**:
   - Data statistics (rows returned, column count)
   - The full column list (table header)
   - The last 1000 rows (as a table)

3. **Large-table optimization**:
   - Tables tagged "large" are automatically sorted by their time column, descending
   - This ensures you see the most recent data

### Table card contents

Each table card shows:
- **Table name**: full table name plus a description
- **Data volume**: total row count (formatted)
- **Time column**: the table's main time column, if any
- **Purpose**: the table's business use
- **"Large" tag**: shown in red for tables over 10 million rows

---

## Table Classification

### 🟢 Snapshot tables (4)
These tables store the **current state**; rows are updated or overwritten:
- **DW_MES_WIP** - current work-in-process state (⚠️ nominally a snapshot table, but holds 77M rows of history)
- **DW_MES_RESOURCE** - equipment/resource master data
- **DW_MES_CONTAINER** - current container state
- **DW_MES_JOB** - current maintenance job state

### 🟡 History tables (10)
These tables are **append-only** and record the full history:
- **DW_MES_RESOURCESTATUS** ⭐ - equipment status change history (for utilization)
- **DW_MES_LOTWIPHISTORY** ⭐ - full lot movement history (for cycle time)
- **DW_MES_LOTWIPDATAHISTORY** - lot data collection history
- **DW_MES_HM_LOTMOVEOUT** - lot move-out events
- **DW_MES_JOBTXNHISTORY** - maintenance job transaction history
- **DW_MES_LOTREJECTHISTORY** - lot reject history
- **DW_MES_LOTMATERIALSHISTORY** - material consumption history
- **DW_MES_HOLDRELEASEHISTORY** - hold/release history
- **DW_MES_MAINTENANCE** - equipment maintenance history
- **DW_MES_RESOURCESTATUS_SHIFT** - resource status per shift

### 🟣 Auxiliary tables (2)
- **DW_MES_PARTREQUESTORDER** - part request orders
- **DW_MES_PJ_COMBINEDASSYLOTS** - combined assembly lots
|
||||
|
||||
---
|
||||
|
||||
## 使用技巧
|
||||
|
||||
### 1. 確認欄位內容
|
||||
|
||||
點擊表名後,觀察:
|
||||
- **欄位名稱**: 表頭顯示所有欄位
|
||||
- **數據類型**: 從數據內容推斷(數字、日期、文字)
|
||||
- **NULL 值**: 顯示為灰色斜體 "NULL"
|
||||
- **日期格式**: 自動格式化為 `YYYY-MM-DD HH:MM:SS`
|
||||
|
||||
### 2. 驗證時間欄位
|
||||
|
||||
對於大表,系統會自動按時間欄位排序,例如:
|
||||
- **DW_MES_WIP**: 按 `TXNDATE` 倒序
|
||||
- **DW_MES_RESOURCESTATUS**: 按 `OLDLASTSTATUSCHANGEDATE` 倒序
|
||||
- **DW_MES_LOTWIPHISTORY**: 按 `TRACKINTIMESTAMP` 倒序
|
||||
|
||||
這樣可以快速看到最新的數據記錄。
|
||||
|
||||
### 3. 查看數據範圍
|
||||
|
||||
觀察返回的 1000 筆資料:
|
||||
- 最新一筆的時間(第一行)
|
||||
- 最舊一筆的時間(最後一行)
|
||||
- 推算數據更新頻率
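The frequency estimate above is simple arithmetic: divide the sample size by the time span between the first and last rows. A minimal sketch — the `rows_per_hour` function is illustrative, not part of the tool; the timestamp format matches the `YYYY-MM-DD HH:MM:SS` display format mentioned earlier:

```python
from datetime import datetime

def rows_per_hour(newest: str, oldest: str, row_count: int = 1000) -> float:
    """Estimate write frequency from the newest and oldest timestamps
    of a descending-ordered sample of `row_count` rows."""
    fmt = "%Y-%m-%d %H:%M:%S"  # the display format used by the tool
    span = datetime.strptime(newest, fmt) - datetime.strptime(oldest, fmt)
    hours = span.total_seconds() / 3600
    return row_count / hours if hours > 0 else float(row_count)
```

For example, 1000 rows spanning two hours suggests roughly 500 new rows per hour.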

### 4. Confirming table behaviour

By inspecting the data you can confirm:
- **Snapshot tables**: whether rows are updated in place (same ID but different timestamps)
- **History tables**: whether rows are append-only (each row an independent event)

---

## FAQ

### Q1: Startup fails with "Python not found"
**Solution**:
1. Make sure Python 3.11 or later is installed
2. Run `python --version` to check
3. If not installed, download it from https://www.python.org/downloads/

### Q2: Startup fails with "virtual environment not found"
**Solution**:
1. Run `scripts\0_初始化環境.bat` first to initialize the environment
2. Or run manually: `python -m venv venv`

### Q3: Query fails with "database connection failed"
**Causes**:
- The database server is unreachable
- Network problems
- Wrong database username/password

**Solution**:
1. Check the network connection
2. Verify the database settings in `apps\table_data_viewer.py`
3. Contact the DBA to confirm the account status

### Q4: Queries are slow or time out
**Causes**:
- The queried table is very large
- No time-range filter was applied

**Notes**:
- The tool automatically sorts large tables by their time column and applies a ROWNUM limit
- If it is still slow, the table really is very large
- This is exactly why report queries must always include a time-range filter

### Q5: How do I stop the server?
**Methods**:
- Press `Ctrl+C` in the command-prompt window
- Or just close the command-prompt window

---

## Data Privacy Notice

⚠️ **Note**:
- This tool connects to the production database (read-only access)
- Do not share any sensitive data obtained from queries
- Shut the server down when you are finished

---

## Support

If you run into problems, see:
- [README.md](../README.md) - project overview
- [System_Architecture_Design.md](System_Architecture_Design.md) - system architecture
- [MES_Core_Tables_Analysis_Report.md](MES_Core_Tables_Analysis_Report.md) - table analysis report

---

**Version**: 1.0
**Last updated**: 2026-01-14
127	docs/開始使用.txt
@@ -1,127 +0,0 @@
===============================================
MES Data Query Tool - Quick Start Guide
===============================================

Hello! The data query tool is ready to use.

📋 What can you do now?
===============================================

✅ Available immediately: browse the actual data in each table
   - Double-click: scripts\0_初始化環境.bat (first run only)
   - Double-click: scripts\啟動Dashboard.bat
   - Visit: http://localhost:5000 (switch pages with the tabs at the top)

✅ Completed documentation:
   1. README.md - project overview
   2. docs\System_Architecture_Design.md - system architecture design (v1.1)
   3. docs\MES_Core_Tables_Analysis_Report.md - table analysis report
   4. docs\MES_Database_Reference.md - database reference (updated)
   5. docs\使用說明.md - query tool user guide

📊 Query tool features
===============================================

- Tables grouped by behaviour (snapshot / history / auxiliary)
- View the last 1000 rows of each table
- Large tables automatically sorted by their time column
- Clean web UI
- Columns and data displayed instantly

🔍 Key findings
===============================================

1. Table classification:
   - Snapshot tables (4): WIP, RESOURCE, CONTAINER, JOB
   - History tables (10): RESOURCESTATUS, LOTWIPHISTORY, etc.
   - Auxiliary tables (2)

2. Important corrections to earlier assumptions:
   ⚠️ DW_MES_WIP is named a "WIP" table but contains 77M rows of history
   ⚠️ DW_MES_RESOURCESTATUS records status changes; durations must be computed from two time columns

3. Query optimization rule:
   ⚠️ Every table over 10 million rows MUST be queried with a time-range filter!
   - Dashboard queries: last 7 days
   - Report queries: at most 30 days
   - Historical trends: at most 90 days

📂 Project layout
===============================================

DashBoard/
├── README.md                 ← project overview
├── docs/                     ← documentation
│   ├── 開始使用.txt          ← the file you are reading
│   ├── 使用說明.md           ← query tool user guide
│   ├── System_Architecture_Design.md      ← system architecture design
│   ├── MES_Core_Tables_Analysis_Report.md ← table analysis report
│   └── MES_Database_Reference.md          ← database reference
├── scripts/                  ← startup scripts
│   ├── 0_初始化環境.bat      ← run this first
│   ├── 啟動Dashboard.bat     ← unified entry point
│   └── 啟動數據查詢工具.bat  ← legacy entry point
├── apps/                     ← runnable applications
│   ├── portal.py             ← unified portal main program
│   ├── table_data_viewer.py  ← query tool backend
│   ├── wip_report.py         ← WIP report backend
│   ├── 快速啟動.py           ← direct Python launcher
│   └── templates\portal.html ← unified portal frontend

🚀 Suggested next steps
===============================================

Please run:

1. ✅ Initialize the environment (first run)
   Double-click: scripts\0_初始化環境.bat

2. ✅ Start the report portal
   Double-click: scripts\啟動Dashboard.bat

3. ✅ Browse the actual data
   Visit: http://localhost:5000
   Switch pages with the tabs at the top

4. ⏳ Provide Power BI report screenshots
   To confirm the concrete dashboard and report designs

5. ⏳ Pick the business scenarios to develop first
   Choose 3-5 of the 8 scenarios to implement first

⏳ Items to confirm
===============================================

1. Power BI report screenshots - for frontend UI design reference
2. Concrete report types - which reports to build
3. Deployment environment - is there a dedicated server
4. Concurrent users - expected number of simultaneous users

💡 Suggested 8 core business scenarios
===============================================

1. WIP board
2. Equipment utilization (OEE) report ⭐
3. Lot production-history traceability
4. Step cycle-time analysis
5. Equipment output and efficiency analysis
6. Hold lot analysis
7. Maintenance job progress tracking
8. Yield analysis

📞 Getting help
===============================================

See the detailed docs:
- docs\使用說明.md - how to use the query tool
- README.md - full project description
- docs\System_Architecture_Design.md - technical architecture

===============================================
Ready to start?

Double-click: scripts\0_初始化環境.bat
===============================================
15	environment.yml	Normal file
@@ -0,0 +1,15 @@
name: mes-dashboard
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
      - oracledb>=2.0.0
      - flask>=3.0.0
      - pandas>=2.0.0
      - sqlalchemy>=2.0.0
      - openpyxl>=3.0.0
      - python-dotenv>=1.0.0
      - gunicorn>=21.2.0
6	gunicorn.conf.py	Normal file
@@ -0,0 +1,6 @@
import os

bind = os.getenv("GUNICORN_BIND", "0.0.0.0:8080")
workers = int(os.getenv("GUNICORN_WORKERS", "1"))
threads = int(os.getenv("GUNICORN_THREADS", "4"))
worker_class = "gthread"
@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-01-26
@@ -0,0 +1,130 @@
## Context

The MES Dashboard is built with Flask, but currently as a "simple script":
- `app = Flask(...)` is created at module level
- `sys.path.insert(0, ...)` is used to patch import paths
- the development server is started directly via `app.run()`

This architecture has several limitations:
1. Gunicorn multi-worker deployment is impossible
2. Unit testing is difficult (app instances cannot be created in isolation)
3. No support for multi-environment configuration (dev/staging/prod)
4. The import-path hacks break IDE support and are error-prone

## Goals / Non-Goals

**Goals:**
- Refactor to the Application Factory pattern, with `create_app(config)`
- Establish a standard Python package structure with regular imports
- Leave extension points: swappable cache backend, tunable connection pool
- Provide a Gunicorn deployment config, defaulting to a single worker with threads
- Keep all existing functionality unchanged

**Non-Goals:**
- No Redis cache implementation (only the abstract interface)
- No frontend/backend split
- No migration to FastAPI (stay on Flask)
- No multi-host deployment / load balancer

## Decisions

### 1. Directory layout: src layout

**Choice**: `src/mes_dashboard/` structure

**Alternatives**:
- Flat layout (`mes_dashboard/` at the repo root) - simpler, but makes it easy to accidentally import the local, uninstalled module
- Renaming `apps/` - keeps the old name but does not follow Python conventions

**Rationale**: src layout is the Python packaging best practice; it forces use of the installed package and avoids import confusion.

### 2. Configuration: environment-based config classes

**Choice**: config class hierarchy + `.env` file

```python
class Config:
    """Base config"""

class DevelopmentConfig(Config):
    DEBUG = True

class ProductionConfig(Config):
    DEBUG = False
```

**Alternatives**:
- Pydantic Settings - more powerful but adds a dependency
- A plain `.env` - not structured enough

**Rationale**: Flask natively supports config objects, so no extra dependency is needed. The `.env` file holds secrets; the classes capture per-environment differences.
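How the factory might resolve a class from `FLASK_ENV` can be sketched as follows — the `select_config` helper, `CONFIG_MAP`, and `DB_POOL_SIZE` values are illustrative assumptions, not part of the existing code:

```python
import os
from typing import Optional, Type

class Config:
    """Base config shared by all environments."""
    DEBUG = False
    DB_POOL_SIZE = 5

class DevelopmentConfig(Config):
    DEBUG = True

class ProductionConfig(Config):
    DB_POOL_SIZE = 10  # production may use a larger pool

# Map FLASK_ENV values to config classes.
CONFIG_MAP = {
    "development": DevelopmentConfig,
    "production": ProductionConfig,
}

def select_config(env: Optional[str] = None) -> Type[Config]:
    """Resolve a config class from an explicit name or FLASK_ENV,
    falling back to development for unknown or unset values."""
    env = env or os.getenv("FLASK_ENV", "development")
    return CONFIG_MAP.get(env, DevelopmentConfig)
```

`create_app()` can then call `app.config.from_object(select_config(config_name))`, which is the standard Flask mechanism for loading a config object.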

### 3. Cache abstraction: Protocol-based interface

**Choice**: define a `CacheBackend` protocol; implement only `NoOpCache` for now

```python
class CacheBackend(Protocol):
    def get(self, key: str) -> Any: ...
    def set(self, key: str, value: Any, ttl: int) -> None: ...

class NoOpCache:
    """Pass-through, no actual caching"""
```

**Alternatives**:
- The Flask-Caching extension - full-featured but not needed yet
- Removing the cache code entirely - would make it costly to add caching back later

**Rationale**: keep the interface without implementing the behaviour; adding Redis later only requires one new class.
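Filled in, the protocol and its no-op implementation are only a few lines. The `cached_query` helper below is an illustrative assumption showing how a caller stays backend-agnostic:

```python
from typing import Any, Callable, Optional, Protocol

class CacheBackend(Protocol):
    def get(self, key: str) -> Optional[Any]: ...
    def set(self, key: str, value: Any, ttl: int) -> None: ...

class NoOpCache:
    """Pass-through backend: every get() misses and set() does nothing."""
    def get(self, key: str) -> Optional[Any]:
        return None
    def set(self, key: str, value: Any, ttl: int) -> None:
        return None

def cached_query(cache: CacheBackend, key: str,
                 compute: Callable[[], Any], ttl: int = 60) -> Any:
    """Return the cached value if present; otherwise compute and store it.
    With NoOpCache this degenerates to calling compute() every time."""
    value = cache.get(key)
    if value is None:
        value = compute()
        cache.set(key, value, ttl)
    return value
```

A future `RedisCache` only needs the same two methods; callers like `cached_query` never change.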

### 4. Database connections: request-scoped via Flask g

**Choice**: manage request-scoped connections with `flask.g`

```python
def get_db():
    if 'db' not in g:
        g.db = get_engine().connect()
    return g.db

@app.teardown_appcontext
def close_db(e=None):
    db = g.pop('db', None)
    if db: db.close()
```

**Alternatives**:
- The Flask-SQLAlchemy extension - introduces more magic
- A new connection per query - poor performance

**Rationale**: lightweight, explicit, and compatible with the existing SQLAlchemy engine.
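For readers unfamiliar with `flask.g`: it behaves like a per-request slot that is created lazily and torn down when the request ends. A stdlib analogue using `contextvars` — purely illustrative, with a string standing in for the real connection object:

```python
from contextvars import ContextVar
from typing import Optional

# One slot per logical context, like flask.g within a request.
_db_conn: ContextVar[Optional[str]] = ContextVar("db_conn", default=None)

def get_db() -> str:
    """Return the context's connection, creating it on first access."""
    conn = _db_conn.get()
    if conn is None:
        conn = "connection-object"  # stands in for get_engine().connect()
        _db_conn.set(conn)
    return conn

def close_db() -> None:
    """Teardown hook: release the context's connection, if any."""
    _db_conn.set(None)
```

Repeated `get_db()` calls in one context return the same object, which is exactly the property the `flask.g` version relies on.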

### 5. Deployment: Gunicorn gthread worker

**Choice**: `gunicorn --workers 1 --threads 4`

**Alternatives**:
- sync workers + multiple processes - less friendly to the Oracle connection pool
- gevent/eventlet - requires monkey patching, adding complexity

**Rationale**: gthread provides concurrency within a single worker and is compatible with the synchronous DB driver. Workers can be added later if needed.

## Risks / Trade-offs

| Risk | Mitigation |
|------|------------|
| Refactor breaks existing functionality | Proceed in stages, each independently testable |
| Import-path changes touch many files | Provide a migration script for batch updates |
| NoOpCache causes repeated queries | Acceptable for a reporting system; monitor DB load |
| Single worker becomes a bottleneck | Monitor response time; add workers when needed |

## Migration Plan

1. **Create the new structure** - build the package under `src/` without touching the existing `apps/`
2. **Port the modules** - move config → core → services → routes one by one
3. **Verify functionality** - make sure every API and page works
4. **Switch the launcher** - update startup scripts to use gunicorn
5. **Clean up** - remove the `apps/` directory and the old startup scripts

**Rollback**: keep `apps/` until the new architecture is stable, so we can switch back at any time.
@@ -0,0 +1,36 @@
## Why

The Flask app is currently created at module level (`app = Flask(...)`) and relies on `sys.path.insert` hacks for imports. This architecture cannot support multi-worker deployment, is hard to test, and does not follow Flask best practices. The company reporting system must serve multiple users, so the architecture needs room to grow.

## What Changes

- Refactor to the Application Factory pattern (`create_app()`)
- Establish a proper Python package structure and remove all `sys.path.insert` hacks
- Add WSGI deployment configuration (Gunicorn)
- Add a cache abstraction layer (currently a no-op implementation, keeping the interface for future use)
- Unify config management with multi-environment support
- **BREAKING**: the startup method changes from `python portal.py` to gunicorn or `flask run`

## Capabilities

### New Capabilities

- `app-factory`: Application Factory pattern implementation, producing configurable Flask app instances
- `package-structure`: proper Python package structure using the standard import mechanism
- `deployment-config`: WSGI deployment configuration, including gunicorn config and startup scripts

### Modified Capabilities

<!-- No existing specs yet; everything here is newly created -->

## Impact

- **Code**:
  - The `apps/` directory is restructured into `src/mes_dashboard/`
  - Import paths change for every existing module
  - `portal.py` is split into `app.py` (factory) + an entry point
- **Startup**:
  - Development: `flask run` or `python -m mes_dashboard`
  - Production: `gunicorn "mes_dashboard:create_app()"`
- **Dependencies**: add gunicorn and python-dotenv (if not already present)
- **Existing features**: all routes, services, and templates are unchanged; only import paths are adjusted
@@ -0,0 +1,47 @@
## ADDED Requirements

### Requirement: Application factory function

The system SHALL provide a `create_app()` function that creates and configures a Flask application instance.

#### Scenario: Create app with default config
- **WHEN** `create_app()` is called without arguments
- **THEN** a Flask app instance is returned with development configuration

#### Scenario: Create app with specific config
- **WHEN** `create_app("production")` is called
- **THEN** a Flask app instance is returned with production configuration

#### Scenario: Multiple app instances are independent
- **WHEN** `create_app()` is called twice
- **THEN** two independent Flask app instances are returned

### Requirement: Blueprint registration

The system SHALL automatically register all route blueprints when creating an app.

#### Scenario: All existing routes are available
- **WHEN** an app is created via `create_app()`
- **THEN** all existing API endpoints (`/api/wip/*`, `/api/resource/*`, `/api/dashboard/*`, `/api/excel/*`) are accessible
- **AND** all page routes (`/`, `/wip`, `/resource`, `/tables`, etc.) are accessible

### Requirement: Database initialization

The system SHALL initialize the database connection pool when creating an app.

#### Scenario: Database is ready after app creation
- **WHEN** an app is created via `create_app()`
- **THEN** the SQLAlchemy engine is configured with connection pooling
- **AND** `pool_size` and `max_overflow` are set from configuration

### Requirement: Request-scoped database connection

The system SHALL provide request-scoped database connections via Flask's application context.

#### Scenario: Connection obtained during request
- **WHEN** a request handler calls `get_db()`
- **THEN** a database connection is returned from the pool

#### Scenario: Connection released after request
- **WHEN** a request completes
- **THEN** the database connection is returned to the pool automatically
@@ -0,0 +1,80 @@
## ADDED Requirements

### Requirement: Gunicorn configuration file

The system SHALL provide a `gunicorn.conf.py` file with production-ready defaults.

#### Scenario: Default configuration for single worker
- **WHEN** gunicorn loads the config file
- **THEN** `workers` is set to 1
- **AND** `threads` is set to 4
- **AND** `worker_class` is set to `gthread`

#### Scenario: Bind address is configurable
- **WHEN** gunicorn starts
- **THEN** it binds to `0.0.0.0:8080` by default
- **AND** bind address can be overridden via environment variable

### Requirement: WSGI entry point

The system SHALL provide a WSGI-compatible entry point for gunicorn.

#### Scenario: Gunicorn can import the app
- **WHEN** gunicorn is started with `gunicorn "mes_dashboard:create_app()"`
- **THEN** the application starts successfully
- **AND** all routes are accessible

### Requirement: Development startup script

The system SHALL provide a development startup script.

#### Scenario: Run in development mode
- **WHEN** `python -m mes_dashboard` is executed
- **THEN** the Flask development server starts
- **AND** debug mode is enabled
- **AND** auto-reload is enabled

### Requirement: Production startup script

The system SHALL provide scripts for production deployment.

#### Scenario: Start with gunicorn on Linux
- **WHEN** `./scripts/start_server.sh` is executed
- **THEN** gunicorn starts with the config file settings
- **AND** logs are written to stdout

#### Scenario: Start on Windows
- **WHEN** `scripts\start_server.bat` is executed
- **THEN** the server starts using waitress (Windows-compatible WSGI server)
- **OR** gunicorn if running in WSL

### Requirement: Environment-based configuration

The system SHALL load configuration based on environment.

#### Scenario: Development environment
- **WHEN** `FLASK_ENV=development` (or not set)
- **THEN** development configuration is loaded
- **AND** `DEBUG=True`
- **AND** connection pool size is smaller (5)

#### Scenario: Production environment
- **WHEN** `FLASK_ENV=production`
- **THEN** production configuration is loaded
- **AND** `DEBUG=False`
- **AND** connection pool size can be larger

### Requirement: Cache backend abstraction

The system SHALL provide an abstract cache interface with a no-op default implementation.

#### Scenario: NoOpCache is used by default
- **WHEN** cache is accessed without Redis configuration
- **THEN** the `NoOpCache` backend is used
- **AND** `get()` always returns `None`
- **AND** `set()` does nothing

#### Scenario: Cache interface is extensible
- **WHEN** a `RedisCache` implementation is added in the future
- **THEN** it can implement the same `CacheBackend` interface
- **AND** switching requires only a configuration change
@@ -0,0 +1,67 @@
## ADDED Requirements

### Requirement: Standard Python package layout

The system SHALL use src layout for the Python package structure.

#### Scenario: Package is importable after installation
- **WHEN** `pip install -e .` is executed in the project root
- **THEN** `import mes_dashboard` succeeds
- **AND** `from mes_dashboard.app import create_app` succeeds

#### Scenario: No sys.path manipulation required
- **WHEN** any module within the package imports another module
- **THEN** standard relative or absolute imports are used
- **AND** no `sys.path.insert` or similar hacks are present

### Requirement: Package directory structure

The system SHALL organize code in the following structure:

```
src/mes_dashboard/
├── __init__.py
├── app.py              # create_app factory
├── config/
│   ├── __init__.py
│   ├── settings.py     # Config classes
│   ├── database.py     # DB connection settings
│   ├── tables.py       # Table metadata
│   └── workcenter_groups.py
├── core/
│   ├── __init__.py
│   ├── database.py     # Engine, get_db
│   ├── cache.py        # Cache abstraction
│   └── utils.py
├── services/
│   ├── __init__.py
│   └── *.py            # Business logic services
├── routes/
│   ├── __init__.py
│   └── *.py            # Flask blueprints
└── templates/
    └── *.html
```

#### Scenario: Config modules are importable
- **WHEN** importing `from mes_dashboard.config.settings import Config`
- **THEN** the Config class is available

#### Scenario: Services can import from core
- **WHEN** a service module imports `from mes_dashboard.core.database import get_db`
- **THEN** the import succeeds without errors

### Requirement: pyproject.toml configuration

The system SHALL provide a `pyproject.toml` file for package metadata and dependencies.

#### Scenario: Package metadata is defined
- **WHEN** `pyproject.toml` is read
- **THEN** the package name is `mes-dashboard`
- **AND** the Python version requirement is specified (>=3.9)
- **AND** all dependencies are listed

#### Scenario: Editable install works
- **WHEN** `pip install -e .` is executed
- **THEN** the package is installed in editable mode
- **AND** changes to source files are immediately reflected
@@ -0,0 +1,56 @@
## 1. Package structure

- [x] 1.1 Create `pyproject.toml` with package metadata and dependencies
- [x] 1.2 Create the `src/mes_dashboard/` directory structure
- [x] 1.3 Create all `__init__.py` files

## 2. Config module migration

- [x] 2.1 Create `src/mes_dashboard/config/settings.py` - config classes (Base, Dev, Prod)
- [x] 2.2 Move `apps/config/database.py` → `src/mes_dashboard/config/database.py`
- [x] 2.3 Move `apps/config/constants.py` → `src/mes_dashboard/config/constants.py`
- [x] 2.4 Move `apps/config/workcenter_groups.py` → `src/mes_dashboard/config/workcenter_groups.py`
- [x] 2.5 Create `src/mes_dashboard/config/tables.py`, splitting TABLES_CONFIG out of database.py

## 3. Core module migration

- [x] 3.1 Create `src/mes_dashboard/core/database.py` - engine factory + request-scoped get_db()
- [x] 3.2 Create `src/mes_dashboard/core/cache.py` - CacheBackend protocol + NoOpCache
- [x] 3.3 Move `apps/core/utils.py` → `src/mes_dashboard/core/utils.py`

## 4. Services module migration

- [x] 4.1 Move `apps/services/wip_service.py` - update import paths
- [x] 4.2 Move `apps/services/resource_service.py` - update import paths
- [x] 4.3 Move `apps/services/dashboard_service.py` - update import paths
- [x] 4.4 Move `apps/services/excel_query_service.py` - update import paths

## 5. Routes module migration

- [x] 5.1 Move `apps/routes/wip_routes.py` - update import paths, remove cache calls
- [x] 5.2 Move `apps/routes/resource_routes.py` - update import paths
- [x] 5.3 Move `apps/routes/dashboard_routes.py` - update import paths
- [x] 5.4 Move `apps/routes/excel_query_routes.py` - update import paths
- [x] 5.5 Create `src/mes_dashboard/routes/__init__.py` - register_routes() function

## 6. Templates migration

- [x] 6.1 Copy `apps/templates/` → `src/mes_dashboard/templates/`

## 7. Application factory

- [x] 7.1 Create `src/mes_dashboard/app.py` - create_app() factory function
- [x] 7.2 Create `src/mes_dashboard/__main__.py` - development entry point

## 8. Deployment configuration

- [x] 8.1 Create `gunicorn.conf.py` - gunicorn configuration
- [x] 8.2 Create `scripts/start_server.sh` - Linux startup script
- [x] 8.3 Update `scripts/啟動Dashboard.bat` - Windows startup script

## 9. Cleanup and verification

- [x] 9.1 Run `pip install -e .` to verify the package installs
- [x] 9.2 Start the app and verify every page and API
- [x] 9.3 Remove the old `apps/` directory
- [x] 9.4 Update `.gitignore` with egg-info etc.
20	openspec/config.yaml	Normal file
@@ -0,0 +1,20 @@
schema: spec-driven

# Project context (optional)
# This is shown to AI when creating artifacts.
# Add your tech stack, conventions, style guides, domain knowledge, etc.
# Example:
# context: |
#   Tech stack: TypeScript, React, Node.js
#   We use conventional commits
#   Domain: e-commerce platform

# Per-artifact rules (optional)
# Add custom rules for specific artifacts.
# Example:
# rules:
#   proposal:
#     - Keep proposals under 500 words
#     - Always include a "Non-goals" section
#   tasks:
#     - Break tasks into chunks of max 2 hours
47	openspec/specs/app-factory/spec.md	Normal file
@@ -0,0 +1,47 @@
## ADDED Requirements

### Requirement: Application factory function

The system SHALL provide a `create_app()` function that creates and configures a Flask application instance.

#### Scenario: Create app with default config
- **WHEN** `create_app()` is called without arguments
- **THEN** a Flask app instance is returned with development configuration

#### Scenario: Create app with specific config
- **WHEN** `create_app("production")` is called
- **THEN** a Flask app instance is returned with production configuration

#### Scenario: Multiple app instances are independent
- **WHEN** `create_app()` is called twice
- **THEN** two independent Flask app instances are returned

### Requirement: Blueprint registration

The system SHALL automatically register all route blueprints when creating an app.

#### Scenario: All existing routes are available
- **WHEN** an app is created via `create_app()`
- **THEN** all existing API endpoints (`/api/wip/*`, `/api/resource/*`, `/api/dashboard/*`, `/api/excel/*`) are accessible
- **AND** all page routes (`/`, `/wip`, `/resource`, `/tables`, etc.) are accessible

### Requirement: Database initialization

The system SHALL initialize the database connection pool when creating an app.

#### Scenario: Database is ready after app creation
- **WHEN** an app is created via `create_app()`
- **THEN** the SQLAlchemy engine is configured with connection pooling
- **AND** `pool_size` and `max_overflow` are set from configuration

### Requirement: Request-scoped database connection

The system SHALL provide request-scoped database connections via Flask's application context.

#### Scenario: Connection obtained during request
- **WHEN** a request handler calls `get_db()`
- **THEN** a database connection is returned from the pool

#### Scenario: Connection released after request
- **WHEN** a request completes
- **THEN** the database connection is returned to the pool automatically
80	openspec/specs/deployment-config/spec.md	Normal file
@@ -0,0 +1,80 @@
## ADDED Requirements

### Requirement: Gunicorn configuration file

The system SHALL provide a `gunicorn.conf.py` file with production-ready defaults.

#### Scenario: Default configuration for single worker
- **WHEN** gunicorn loads the config file
- **THEN** `workers` is set to 1
- **AND** `threads` is set to 4
- **AND** `worker_class` is set to `gthread`

#### Scenario: Bind address is configurable
- **WHEN** gunicorn starts
- **THEN** it binds to `0.0.0.0:8080` by default
- **AND** bind address can be overridden via environment variable

### Requirement: WSGI entry point

The system SHALL provide a WSGI-compatible entry point for gunicorn.

#### Scenario: Gunicorn can import the app
- **WHEN** gunicorn is started with `gunicorn "mes_dashboard:create_app()"`
- **THEN** the application starts successfully
- **AND** all routes are accessible

### Requirement: Development startup script

The system SHALL provide a development startup script.

#### Scenario: Run in development mode
- **WHEN** `python -m mes_dashboard` is executed
- **THEN** the Flask development server starts
- **AND** debug mode is enabled
- **AND** auto-reload is enabled

### Requirement: Production startup script

The system SHALL provide scripts for production deployment.

#### Scenario: Start with gunicorn on Linux
- **WHEN** `./scripts/start_server.sh` is executed
- **THEN** gunicorn starts with the config file settings
- **AND** logs are written to stdout

#### Scenario: Start on Windows
- **WHEN** `scripts\start_server.bat` is executed
- **THEN** the server starts using waitress (Windows-compatible WSGI server)
- **OR** gunicorn if running in WSL

### Requirement: Environment-based configuration

The system SHALL load configuration based on environment.

#### Scenario: Development environment
- **WHEN** `FLASK_ENV=development` (or not set)
- **THEN** development configuration is loaded
- **AND** `DEBUG=True`
- **AND** connection pool size is smaller (5)

#### Scenario: Production environment
- **WHEN** `FLASK_ENV=production`
- **THEN** production configuration is loaded
- **AND** `DEBUG=False`
- **AND** connection pool size can be larger

### Requirement: Cache backend abstraction

The system SHALL provide an abstract cache interface with a no-op default implementation.

#### Scenario: NoOpCache is used by default
- **WHEN** cache is accessed without Redis configuration
- **THEN** the `NoOpCache` backend is used
- **AND** `get()` always returns `None`
- **AND** `set()` does nothing

#### Scenario: Cache interface is extensible
- **WHEN** a `RedisCache` implementation is added in the future
- **THEN** it can implement the same `CacheBackend` interface
- **AND** switching requires only a configuration change
67	openspec/specs/package-structure/spec.md	Normal file
@@ -0,0 +1,67 @@
## ADDED Requirements

### Requirement: Standard Python package layout

The system SHALL use src layout for the Python package structure.

#### Scenario: Package is importable after installation
- **WHEN** `pip install -e .` is executed in the project root
- **THEN** `import mes_dashboard` succeeds
- **AND** `from mes_dashboard.app import create_app` succeeds

#### Scenario: No sys.path manipulation required
- **WHEN** any module within the package imports another module
- **THEN** standard relative or absolute imports are used
- **AND** no `sys.path.insert` or similar hacks are present

### Requirement: Package directory structure

The system SHALL organize code in the following structure:

```
src/mes_dashboard/
├── __init__.py
├── app.py              # create_app factory
├── config/
│   ├── __init__.py
│   ├── settings.py     # Config classes
│   ├── database.py     # DB connection settings
│   ├── tables.py       # Table metadata
│   └── workcenter_groups.py
├── core/
│   ├── __init__.py
│   ├── database.py     # Engine, get_db
│   ├── cache.py        # Cache abstraction
│   └── utils.py
├── services/
│   ├── __init__.py
│   └── *.py            # Business logic services
├── routes/
│   ├── __init__.py
│   └── *.py            # Flask blueprints
└── templates/
    └── *.html
```

#### Scenario: Config modules are importable
- **WHEN** importing `from mes_dashboard.config.settings import Config`
- **THEN** the Config class is available

#### Scenario: Services can import from core
- **WHEN** a service module imports `from mes_dashboard.core.database import get_db`
- **THEN** the import succeeds without errors

### Requirement: pyproject.toml configuration

The system SHALL provide a `pyproject.toml` file for package metadata and dependencies.

#### Scenario: Package metadata is defined
- **WHEN** `pyproject.toml` is read
- **THEN** the package name is `mes-dashboard`
- **AND** the Python version requirement is specified (>=3.9)
- **AND** all dependencies are listed

#### Scenario: Editable install works
- **WHEN** `pip install -e .` is executed
- **THEN** the package is installed in editable mode
- **AND** changes to source files are immediately reflected
41
pyproject.toml
Normal file
@@ -0,0 +1,41 @@
[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "mes-dashboard"
version = "0.1.0"
description = "MES Dashboard Portal"
readme = "README.md"
requires-python = ">=3.9"
license = { text = "MIT" }
authors = [
    { name = "MES Dashboard Team" }
]
keywords = ["flask", "mes", "dashboard"]
classifiers = [
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3.9",
    "License :: OSI Approved :: MIT License",
]
dependencies = [
    "oracledb>=2.0.0",
    "flask>=3.0.0",
    "pandas>=2.0.0",
    "sqlalchemy>=2.0.0",
    "openpyxl>=3.0.0",
    "python-dotenv>=1.0.0",
    "gunicorn>=21.2.0",
    "waitress>=2.1.2; platform_system == 'Windows'",
]

[tool.setuptools]
package-dir = {"" = "src"}
include-package-data = true

[tool.setuptools.packages.find]
where = ["src"]

[tool.setuptools.package-data]
mes_dashboard = ["templates/**/*"]
@@ -2,3 +2,7 @@ oracledb>=2.0.0
flask>=3.0.0
pandas>=2.0.0
sqlalchemy>=2.0.0
openpyxl>=3.0.0
python-dotenv>=1.0.0
gunicorn>=21.2.0
waitress>=2.1.2; platform_system=="Windows"
8
scripts/start_server.sh
Normal file
@@ -0,0 +1,8 @@
#!/usr/bin/env bash
set -euo pipefail

ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
export PYTHONPATH="${ROOT}/src:${PYTHONPATH:-}"

cd "$ROOT"
exec gunicorn --config gunicorn.conf.py "mes_dashboard:create_app()"
@@ -21,14 +21,22 @@ if exist "%ROOT%\venv\Scripts\python.exe" (
    exit /b 1
)

set "PYTHONPATH=%ROOT%\src"
set "WAITRESS=%ROOT%\venv\Scripts\waitress-serve.exe"

echo Starting server...
echo URL: http://localhost:5000
echo URL: http://localhost:8080
echo Press Ctrl+C to stop
echo.
echo ========================================
echo.

"%PYTHON%" "%ROOT%\apps\portal.py"
if exist "%WAITRESS%" (
    "%WAITRESS%" --listen=0.0.0.0:8080 mes_dashboard:create_app
) else (
    echo [WARN] waitress-serve not found, falling back to development server
    "%PYTHON%" -m mes_dashboard
)

echo.
echo ========================================
@@ -1,9 +0,0 @@
@echo off
chcp 65001 >nul
echo ========================================
echo WIP 報表已整合到統一入口
echo ========================================
echo.
echo 將轉到: scripts\啟動Dashboard.bat
echo.
call "%~dp0啟動Dashboard.bat"
@@ -1,9 +0,0 @@
@echo off
chcp 65001 >nul
echo ========================================
echo 數據查詢工具已整合到統一入口
echo ========================================
echo.
echo 將轉到: scripts\啟動Dashboard.bat
echo.
call "%~dp0啟動Dashboard.bat"
5
src/mes_dashboard/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""MES Dashboard package."""

from .app import create_app

__all__ = ["create_app"]
12
src/mes_dashboard/__main__.py
Normal file
@@ -0,0 +1,12 @@
"""Development entry point for MES Dashboard."""

from mes_dashboard.app import create_app


def main() -> None:
    app = create_app()
    app.run(debug=True, use_reloader=True, host="0.0.0.0", port=8080)


if __name__ == '__main__':
    main()
103
src/mes_dashboard/app.py
Normal file
@@ -0,0 +1,103 @@
# -*- coding: utf-8 -*-
"""Flask application factory for MES Dashboard."""

from __future__ import annotations

from flask import Flask, jsonify, render_template, request

from mes_dashboard.config.tables import TABLES_CONFIG
from mes_dashboard.config.settings import get_config
from mes_dashboard.core.cache import NoOpCache
from mes_dashboard.core.database import get_table_data, get_table_columns, get_engine, init_db
from mes_dashboard.routes import register_routes


def create_app(config_name: str | None = None) -> Flask:
    """Create and configure the Flask app instance."""
    app = Flask(__name__, template_folder="templates")

    config_class = get_config(config_name)
    app.config.from_object(config_class)

    # Default cache backend (no-op)
    app.extensions["cache"] = NoOpCache()

    # Initialize database teardown and pool
    init_db(app)
    with app.app_context():
        get_engine()

    # Register API routes
    register_routes(app)

    # ========================================================
    # Page Routes
    # ========================================================

    @app.route('/')
    def portal_index():
        """Portal home with tabs."""
        return render_template('portal.html')

    @app.route('/tables')
    def tables_page():
        """Table viewer page."""
        return render_template('index.html', tables_config=TABLES_CONFIG)

    @app.route('/wip')
    def wip_page():
        """WIP report page."""
        return render_template('wip_report.html')

    @app.route('/resource')
    def resource_page():
        """Resource status report page."""
        return render_template('resource_status.html')

    @app.route('/wip-overview')
    def wip_overview_page():
        """WIP overview dashboard page."""
        return render_template('wip_overview.html')

    @app.route('/excel-query')
    def excel_query_page():
        """Excel batch query tool page."""
        return render_template('excel_query.html')

    # ========================================================
    # Table Query APIs (for table_data_viewer)
    # ========================================================

    @app.route('/api/query_table', methods=['POST'])
    def query_table():
        """API: query table data with optional column filters."""
        data = request.get_json()
        table_name = data.get('table_name')
        limit = data.get('limit', 1000)
        time_field = data.get('time_field')
        filters = data.get('filters')

        if not table_name:
            return jsonify({'error': '請指定表名'}), 400

        result = get_table_data(table_name, limit, time_field, filters)
        return jsonify(result)

    @app.route('/api/get_table_columns', methods=['POST'])
    def api_get_table_columns():
        """API: get column names for a table."""
        data = request.get_json()
        table_name = data.get('table_name')

        if not table_name:
            return jsonify({'error': '請指定表名'}), 400

        columns = get_table_columns(table_name)
        return jsonify({'columns': columns})

    @app.route('/api/get_table_info', methods=['GET'])
    def get_table_info():
        """API: get tables config."""
        return jsonify(TABLES_CONFIG)

    return app
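The commit message mentions `tests/test_app_factory.py` (not shown in this diff). A key benefit of the factory pattern above is that each test can build a fresh, fully configured app and drive it through Flask's test client. The sketch below is hypothetical and self-contained (a stripped-down factory with one route, not the project's actual code), assuming only that Flask is installed:

```python
from flask import Flask, jsonify


def create_app(config=None):
    # Minimal factory mirroring the pattern above (hypothetical config handling).
    app = Flask(__name__)
    app.config.update(config or {})

    @app.route("/api/get_table_info")
    def get_table_info():
        return jsonify({"tables": []})

    return app


app = create_app({"TESTING": True})
with app.test_client() as client:
    resp = client.get("/api/get_table_info")
    print(resp.status_code)  # 200
```

Because `create_app` takes its configuration as an argument, tests never need to mutate module-level globals, and two tests can hold differently configured apps at the same time.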
43
src/mes_dashboard/config/__init__.py
Normal file
@@ -0,0 +1,43 @@
"""Configuration modules for MES Dashboard."""

from .database import DB_CONFIG, CONNECTION_STRING
from .tables import TABLES_CONFIG
from .constants import (
    EXCLUDED_LOCATIONS,
    EXCLUDED_ASSET_STATUSES,
    EQUIPMENT_TYPE_FILTER,
    CACHE_TTL_DEFAULT,
    CACHE_TTL_FILTER_OPTIONS,
    CACHE_TTL_PIVOT_COLUMNS,
    CACHE_TTL_KPI,
    CACHE_TTL_TREND,
    DEFAULT_DAYS_BACK,
    DEFAULT_WIP_DAYS_BACK,
    DEFAULT_PAGE_SIZE,
    MAX_PAGE_SIZE,
    STATUS_DISPLAY_NAMES,
    WIP_EXCLUDED_STATUS,
)
from .workcenter_groups import WORKCENTER_GROUPS, get_workcenter_group

__all__ = [
    "DB_CONFIG",
    "CONNECTION_STRING",
    "TABLES_CONFIG",
    "EXCLUDED_LOCATIONS",
    "EXCLUDED_ASSET_STATUSES",
    "EQUIPMENT_TYPE_FILTER",
    "CACHE_TTL_DEFAULT",
    "CACHE_TTL_FILTER_OPTIONS",
    "CACHE_TTL_PIVOT_COLUMNS",
    "CACHE_TTL_KPI",
    "CACHE_TTL_TREND",
    "DEFAULT_DAYS_BACK",
    "DEFAULT_WIP_DAYS_BACK",
    "DEFAULT_PAGE_SIZE",
    "MAX_PAGE_SIZE",
    "STATUS_DISPLAY_NAMES",
    "WIP_EXCLUDED_STATUS",
    "WORKCENTER_GROUPS",
    "get_workcenter_group",
]
83
src/mes_dashboard/config/constants.py
Normal file
@@ -0,0 +1,83 @@
# -*- coding: utf-8 -*-
"""Constants and configuration values for MES Dashboard.

Centralized location for all constant values used across the application.
"""

# ============================================================
# Location / Area Exclusions
# ============================================================

# Locations to exclude from equipment queries
EXCLUDED_LOCATIONS = [
    'ATEC',
    'F區',
    'F區焊接站',
    '報廢',
    '實驗室',
    '山東',
    '成型站_F區',
    '焊接F區',
    '無錫',
    '熒茂',
]

# Asset statuses to exclude
EXCLUDED_ASSET_STATUSES = ['Disapproved']


# ============================================================
# Equipment Type Filters
# ============================================================

# SQL condition for filtering valid equipment types
EQUIPMENT_TYPE_FILTER = """
    ((OBJECTCATEGORY = 'ASSEMBLY' AND OBJECTTYPE = 'ASSEMBLY')
     OR (OBJECTCATEGORY = 'WAFERSORT' AND OBJECTTYPE = 'WAFERSORT'))
"""

# Equipment flag filter templates
EQUIPMENT_FLAG_FILTERS = {
    'isProduction': "NVL(PJ_ISPRODUCTION, 0) = 1",
    'isKey': "NVL(PJ_ISKEY, 0) = 1",
    'isMonitor': "NVL(PJ_ISMONITOR, 0) = 1",
}


# ============================================================
# Cache TTL Settings (in seconds)
# ============================================================

CACHE_TTL_DEFAULT = 60          # Default cache TTL: 1 minute
CACHE_TTL_FILTER_OPTIONS = 600  # Filter options: 10 minutes
CACHE_TTL_PIVOT_COLUMNS = 300   # Pivot columns: 5 minutes
CACHE_TTL_KPI = 60              # KPI data: 1 minute
CACHE_TTL_TREND = 300           # Trend data: 5 minutes


# ============================================================
# Query Defaults
# ============================================================

DEFAULT_DAYS_BACK = 365     # Default days to look back for queries
DEFAULT_WIP_DAYS_BACK = 90  # Default days for WIP queries
DEFAULT_PAGE_SIZE = 100     # Default pagination size
MAX_PAGE_SIZE = 500         # Maximum allowed page size


# ============================================================
# Status Definitions
# ============================================================

# Equipment status codes and their display names
STATUS_DISPLAY_NAMES = {
    'PRD': '生產中',
    'SBY': '待機',
    'UDT': '非計畫停機',
    'SDT': '計畫停機',
    'EGT': '工程時間',
    'NST': '未排單',
}

# WIP status codes to exclude (completed/scrapped)
WIP_EXCLUDED_STATUS = (8, 128)
38
src/mes_dashboard/config/database.py
Normal file
@@ -0,0 +1,38 @@
# -*- coding: utf-8 -*-
"""Database configuration for MES Dashboard.

Centralized database connection settings used by all modules.
Loads credentials from environment variables (.env file).
"""

import os
from pathlib import Path

# Load .env file if python-dotenv is available
try:
    from dotenv import load_dotenv

    # Find .env file in project root
    env_path = Path(__file__).resolve().parents[3] / '.env'
    load_dotenv(env_path)
except ImportError:
    pass  # python-dotenv not installed, rely on system environment variables

# Database connection settings from environment variables
DB_HOST = os.getenv('DB_HOST', '10.1.1.58')
DB_PORT = os.getenv('DB_PORT', '1521')
DB_SERVICE = os.getenv('DB_SERVICE', 'DWDB')
DB_USER = os.getenv('DB_USER', '')
DB_PASSWORD = os.getenv('DB_PASSWORD', '')

# Oracle Database connection config
DB_CONFIG = {
    'user': DB_USER,
    'password': DB_PASSWORD,
    'dsn': f'{DB_HOST}:{DB_PORT}/{DB_SERVICE}'
}

# SQLAlchemy connection string
CONNECTION_STRING = (
    f"oracle+oracledb://{DB_USER}:{DB_PASSWORD}@{DB_HOST}:{DB_PORT}/?service_name={DB_SERVICE}"
)
53
src/mes_dashboard/config/settings.py
Normal file
@@ -0,0 +1,53 @@
"""Application configuration classes for MES Dashboard."""

from __future__ import annotations

import os
from typing import Type


def _int_env(name: str, default: int) -> int:
    try:
        return int(os.getenv(name, str(default)))
    except (TypeError, ValueError):
        return default


class Config:
    """Base configuration."""

    DEBUG = False
    TESTING = False
    ENV = "production"

    # Database pool defaults (can be overridden by env)
    DB_POOL_SIZE = _int_env("DB_POOL_SIZE", 5)
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 10)


class DevelopmentConfig(Config):
    """Development configuration."""

    DEBUG = True
    ENV = "development"

    DB_POOL_SIZE = _int_env("DB_POOL_SIZE", 5)
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 10)


class ProductionConfig(Config):
    """Production configuration."""

    DEBUG = False
    ENV = "production"

    DB_POOL_SIZE = _int_env("DB_POOL_SIZE", 10)
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 20)


def get_config(env: str | None = None) -> Type[Config]:
    """Select config class based on environment name."""
    value = (env or os.getenv("FLASK_ENV", "development")).lower()
    if value in {"prod", "production"}:
        return ProductionConfig
    return DevelopmentConfig
@@ -1,22 +1,7 @@
"""
MES table query tool
View the last 1000 rows of each table to verify table structure and contents
"""
# -*- coding: utf-8 -*-
"""Table configuration metadata for MES Dashboard."""

import oracledb
import pandas as pd
from flask import Flask, render_template, request, jsonify
from datetime import datetime
import json

# Database connection config
DB_CONFIG = {
    'user': 'MBU1_R',
    'password': 'Pj2481mbu1',
    'dsn': '10.1.1.58:1521/DWDB'
}

# 16 core tables config (with table-type categories)
# 16 core tables config (with categories)
TABLES_CONFIG = {
    '現況快照表': [
        {
@@ -51,7 +36,7 @@ TABLES_CONFIG = {
    '歷史累積表': [
        {
            'name': 'DW_MES_RESOURCESTATUS',
            'display_name': 'RESOURCESTATUS (資源狀態歷史) ⭐',
            'display_name': 'RESOURCESTATUS (資源狀態歷史)',
            'row_count': 65139825,
            'time_field': 'OLDLASTSTATUSCHANGEDATE',
            'description': '設備狀態變更歷史表 - 狀態切換與原因'
@@ -65,7 +50,7 @@ TABLES_CONFIG = {
        },
        {
            'name': 'DW_MES_LOTWIPHISTORY',
            'display_name': 'LOTWIPHISTORY (批次流轉歷史) ⭐',
            'display_name': 'LOTWIPHISTORY (批次流轉歷史)',
            'row_count': 53085425,
            'time_field': 'TRACKINTIMESTAMP',
            'description': '在製流轉歷史表 - 批次進出站與流程軌跡'
@@ -116,7 +101,7 @@ TABLES_CONFIG = {
            'name': 'DW_MES_MAINTENANCE',
            'display_name': 'MAINTENANCE (設備維護歷史)',
            'row_count': 50954850,
            'time_field': 'CREATEDATE',
            'time_field': 'TXNDATE',
            'description': '設備保養/維護紀錄表 - 保養計畫與點檢數據'
        }
    ],
@@ -137,134 +122,3 @@ TABLES_CONFIG = {
        }
    ]
}

app = Flask(__name__)

def get_db_connection():
    """Create a database connection."""
    try:
        connection = oracledb.connect(**DB_CONFIG)
        return connection
    except Exception as e:
        print(f"Database connection failed: {e}")
        return None

def get_table_data(table_name, limit=1000, time_field=None):
    """
    Query the last N rows of a table.

    Args:
        table_name: table name
        limit: number of rows to return
        time_field: time column (used for sorting)

    Returns:
        dict: containing columns, data, row_count
    """
    connection = get_db_connection()
    if not connection:
        return {'error': '數據庫連接失敗'}

    try:
        cursor = connection.cursor()

        # Build the query SQL
        if time_field:
            # If a time column exists, sort by it in descending order
            sql = f"""
                SELECT * FROM (
                    SELECT * FROM {table_name}
                    WHERE {time_field} IS NOT NULL
                    ORDER BY {time_field} DESC
                ) WHERE ROWNUM <= {limit}
            """
        else:
            # No time column: take the first N rows
            sql = f"""
                SELECT * FROM {table_name}
                WHERE ROWNUM <= {limit}
            """

        # Execute the query
        cursor.execute(sql)

        # Get the column names
        columns = [desc[0] for desc in cursor.description]

        # Fetch the rows
        rows = cursor.fetchall()

        # Convert to a JSON-serializable format
        data = []
        for row in rows:
            row_dict = {}
            for i, col in enumerate(columns):
                value = row[i]
                # Handle date types
                if isinstance(value, datetime):
                    row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
                # Handle None
                elif value is None:
                    row_dict[col] = None
                # Handle numbers
                elif isinstance(value, (int, float)):
                    row_dict[col] = value
                # Everything else becomes a string
                else:
                    row_dict[col] = str(value)
            data.append(row_dict)

        cursor.close()
        connection.close()

        return {
            'columns': columns,
            'data': data,
            'row_count': len(data)
        }

    except Exception as e:
        if connection:
            connection.close()
        return {'error': f'查詢失敗: {str(e)}'}

@app.route('/')
def index():
    """Home page - list all tables."""
    return render_template('index.html', tables_config=TABLES_CONFIG)

@app.route('/api/query_table', methods=['POST'])
def query_table():
    """API: query data from the given table."""
    data = request.get_json()
    table_name = data.get('table_name')
    limit = data.get('limit', 1000)
    time_field = data.get('time_field')

    if not table_name:
        return jsonify({'error': '請指定表名'}), 400

    result = get_table_data(table_name, limit, time_field)
    return jsonify(result)

@app.route('/api/get_table_info', methods=['GET'])
def get_table_info():
    """API: return config info for all tables."""
    return jsonify(TABLES_CONFIG)

if __name__ == '__main__':
    # Check the database connection
    print("Testing database connection...")
    conn = get_db_connection()
    if conn:
        print("✓ Database connection succeeded!")
        conn.close()
        print("\nStarting web server...")
        print("Visit: http://localhost:5000")
        print("\nTips:")
        print("- Click a table name to view its last 1000 rows")
        print("- Large tables are automatically sorted by their time column")
        print("- Press Ctrl+C to stop the server")
        app.run(debug=True, host='0.0.0.0', port=5000)
    else:
        print("✗ Database connection failed, please check the configuration")
138
src/mes_dashboard/config/workcenter_groups.py
Normal file
@@ -0,0 +1,138 @@
# -*- coding: utf-8 -*-
"""Workcenter grouping configuration for MES Dashboard.

Defines how individual workcenters are grouped and their display order.
This configuration is used across WIP reports, resource status, and dashboard.
"""

from typing import Tuple, Optional

# ============================================================
# Workcenter Group Definitions
# ============================================================
# Order determines display sequence (left to right in tables, top to bottom in charts)
# Patterns are matched case-insensitively
# Exclude patterns take precedence over include patterns

WORKCENTER_GROUPS = {
    '切割': {
        'order': 0,
        'patterns': ['切割'],
        'exclude': ['元件切割', 'PKG_SAW']  # 元件切割 is a separate group
    },
    '焊接_DB': {
        'order': 1,
        'patterns': ['焊接_DB', '焊_DB_料', '焊_DB']
    },
    '焊接_WB': {
        'order': 2,
        'patterns': ['焊接_WB', '焊_WB_料', '焊_WB']
    },
    '焊接_DW': {
        'order': 3,
        'patterns': ['焊接_DW', '焊_DW', '焊_DW_料']
    },
    '成型': {
        'order': 4,
        'patterns': ['成型', '成型_料']
    },
    '去膠': {
        'order': 5,
        'patterns': ['去膠']
    },
    '水吹砂': {
        'order': 6,
        'patterns': ['水吹砂']
    },
    '電鍍': {
        'order': 7,
        'patterns': ['掛鍍', '滾鍍', '條鍍', '電鍍', '補鍍', 'TOTAI', 'BANDL']
    },
    '移印': {
        'order': 8,
        'patterns': ['移印']
    },
    '切彎腳': {
        'order': 9,
        'patterns': ['切彎腳']
    },
    '元件切割': {
        'order': 10,
        'patterns': ['元件切割', 'PKG_SAW']
    },
    '測試': {
        'order': 11,
        'patterns': ['TMTT', '測試']
    }
}

# Group order for sorting (exported for frontend use)
GROUP_ORDER = {name: config['order'] for name, config in WORKCENTER_GROUPS.items()}


def get_workcenter_group(workcenter_name: Optional[str]) -> Tuple[Optional[str], int]:
    """Map workcenter name to its group name and order.

    Args:
        workcenter_name: The original workcenter name from database

    Returns:
        Tuple of (group_name, order) where:
        - group_name: The merged group name (e.g., '焊接_DB') or None if unmatched
        - order: The display order (0-11 for defined groups, 999 for unmatched)

    Examples:
        >>> get_workcenter_group('焊接_DB')
        ('焊接_DB', 1)
        >>> get_workcenter_group('焊_DB_料')
        ('焊接_DB', 1)
        >>> get_workcenter_group('切割')
        ('切割', 0)
        >>> get_workcenter_group('元件切割')
        ('元件切割', 10)
        >>> get_workcenter_group('Unknown_WC')
        (None, 999)
    """
    if not workcenter_name:
        return None, 999

    wc_upper = workcenter_name.upper()

    for group_name, config in WORKCENTER_GROUPS.items():
        # Check exclusions first (important for '切割' vs '元件切割')
        if 'exclude' in config:
            excluded = False
            for excl in config['exclude']:
                if excl.upper() in wc_upper:
                    excluded = True
                    break
            if excluded:
                continue

        # Check patterns
        for pattern in config['patterns']:
            if pattern.upper() in wc_upper:
                return group_name, config['order']

    return None, 999  # Unmatched workcenters


def get_all_group_names() -> list:
    """Get all group names in order.

    Returns:
        List of group names sorted by their order.
    """
    return sorted(WORKCENTER_GROUPS.keys(), key=lambda x: WORKCENTER_GROUPS[x]['order'])


def get_group_order(group_name: str) -> int:
    """Get the order number for a group name.

    Args:
        group_name: The group name to look up

    Returns:
        Order number (0-11) or 999 if not found
    """
    return GROUP_ORDER.get(group_name, 999)
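The exclude-before-patterns rule in `get_workcenter_group` is the subtle part: it is what keeps '元件切割' (component sawing) out of the broader '切割' (sawing) group even though '切割' is a substring of it. A trimmed, standalone sketch of just that matching order, using two of the groups above:

```python
# Trimmed sketch of the matching rules: exclusions are checked before patterns.
GROUPS = {
    '切割': {'order': 0, 'patterns': ['切割'], 'exclude': ['元件切割', 'PKG_SAW']},
    '元件切割': {'order': 10, 'patterns': ['元件切割', 'PKG_SAW']},
}


def match(name: str):
    wc = name.upper()
    for group, cfg in GROUPS.items():
        # An excluded substring disqualifies this group entirely.
        if any(excl.upper() in wc for excl in cfg.get('exclude', [])):
            continue
        if any(pat.upper() in wc for pat in cfg['patterns']):
            return group, cfg['order']
    return None, 999


print(match('切割'))        # ('切割', 0)
print(match('元件切割'))    # ('元件切割', 10) — excluded from '切割', caught later
print(match('PKG_SAW_01'))  # ('元件切割', 10)
```

Without the `continue` on exclusion, '元件切割' would match the '切割' patterns first and land in the wrong group.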
39
src/mes_dashboard/core/__init__.py
Normal file
@@ -0,0 +1,39 @@
"""Core utilities module for MES Dashboard."""

from .database import (
    get_db_connection,
    get_engine,
    get_db,
    read_sql_df,
    get_table_data,
    get_table_columns,
    init_db,
)
from .cache import cache_get, cache_set, make_cache_key, CacheBackend, NoOpCache
from .utils import (
    get_days_back,
    build_filter_conditions,
    build_equipment_filter_sql,
    convert_datetime_fields,
    format_api_response,
)

__all__ = [
    "get_db_connection",
    "get_engine",
    "get_db",
    "read_sql_df",
    "get_table_data",
    "get_table_columns",
    "init_db",
    "cache_get",
    "cache_set",
    "make_cache_key",
    "CacheBackend",
    "NoOpCache",
    "get_days_back",
    "build_filter_conditions",
    "build_equipment_filter_sql",
    "convert_datetime_fields",
    "format_api_response",
]
55
src/mes_dashboard/core/cache.py
Normal file
@@ -0,0 +1,55 @@
"""Cache abstraction for MES Dashboard."""

from __future__ import annotations

import json
from typing import Any, Optional, Protocol

from flask import current_app

from mes_dashboard.config.constants import CACHE_TTL_DEFAULT


class CacheBackend(Protocol):
    """Protocol for cache backends."""

    def get(self, key: str) -> Optional[Any]:
        ...

    def set(self, key: str, value: Any, ttl: int) -> None:
        ...


class NoOpCache:
    """No-op cache backend (default)."""

    def get(self, key: str) -> Optional[Any]:
        return None

    def set(self, key: str, value: Any, ttl: int) -> None:
        return None


def get_cache() -> CacheBackend:
    """Return the configured cache backend or a no-op default."""
    try:
        cache = current_app.extensions.get("cache")
    except RuntimeError:
        cache = None
    return cache if cache is not None else NoOpCache()


def cache_get(key: str) -> Optional[Any]:
    """Get value from cache backend."""
    return get_cache().get(key)


def cache_set(key: str, value: Any, ttl: int = CACHE_TTL_DEFAULT) -> None:
    """Set value on cache backend."""
    get_cache().set(key, value, ttl)


def make_cache_key(prefix: str, days_back: Optional[int] = None, filters: Optional[dict] = None) -> str:
    """Generate a cache key from prefix and parameters."""
    filters_key = json.dumps(filters, sort_keys=True, ensure_ascii=False) if filters else ""
    return f"{prefix}:{days_back}:{filters_key}"
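The `make_cache_key` helper above serializes filters with `sort_keys=True`, which makes the key deterministic regardless of dict insertion order; that property is what lets two requests with logically identical filters hit the same cache entry. A self-contained sketch (re-declaring the function locally so it runs without the package installed):

```python
import json


def make_cache_key(prefix, days_back=None, filters=None):
    # Same key scheme as cache.py: prefix, days_back, then filters as sorted JSON.
    filters_key = json.dumps(filters, sort_keys=True, ensure_ascii=False) if filters else ""
    return f"{prefix}:{days_back}:{filters_key}"


# The key is independent of dict insertion order thanks to sort_keys=True.
k1 = make_cache_key("kpi", 30, {"area": "A", "line": "L1"})
k2 = make_cache_key("kpi", 30, {"line": "L1", "area": "A"})
print(k1 == k2)                 # True
print(make_cache_key("trend"))  # trend:None:
```

`ensure_ascii=False` keeps Chinese filter values (e.g. workcenter names) readable in the key instead of `\u` escapes, which helps when inspecting keys in a future Redis backend.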
207
src/mes_dashboard/core/database.py
Normal file
@@ -0,0 +1,207 @@
# -*- coding: utf-8 -*-
"""Database connection and query utilities for MES Dashboard."""

from __future__ import annotations

from typing import Optional, Dict, Any, Tuple

import oracledb
import pandas as pd
from flask import g, current_app
from sqlalchemy import create_engine, text

from mes_dashboard.config.database import DB_CONFIG, CONNECTION_STRING
from mes_dashboard.config.settings import DevelopmentConfig

# ============================================================
# SQLAlchemy Engine (Singleton with connection pooling)
# ============================================================

_ENGINE = None


def _get_pool_settings() -> Tuple[int, int]:
    """Return pool size and max overflow from app config or defaults."""
    try:
        pool_size = current_app.config.get("DB_POOL_SIZE", DevelopmentConfig.DB_POOL_SIZE)
        max_overflow = current_app.config.get("DB_MAX_OVERFLOW", DevelopmentConfig.DB_MAX_OVERFLOW)
    except RuntimeError:
        pool_size = DevelopmentConfig.DB_POOL_SIZE
        max_overflow = DevelopmentConfig.DB_MAX_OVERFLOW
    return pool_size, max_overflow


def get_engine():
    """Get SQLAlchemy engine with connection pooling (singleton pattern)."""
    global _ENGINE
    if _ENGINE is None:
        pool_size, max_overflow = _get_pool_settings()
        _ENGINE = create_engine(
            CONNECTION_STRING,
            pool_size=pool_size,
            max_overflow=max_overflow,
            pool_pre_ping=True,
        )
    return _ENGINE


# ============================================================
# Request-scoped Connection
# ============================================================


def get_db():
    """Get request-scoped database connection via Flask g."""
    if "db" not in g:
        g.db = get_engine().connect()
    return g.db


def close_db(_exc: Optional[BaseException] = None) -> None:
    """Close request-scoped connection."""
    db = g.pop("db", None)
    if db is not None:
        db.close()


def init_db(app) -> None:
    """Register database teardown handlers on the Flask app."""
    app.teardown_appcontext(close_db)


# ============================================================
# Direct Connection Helpers
# ============================================================


def get_db_connection():
    """Create a direct oracledb connection.

    Used for operations that need direct cursor access.
    """
    try:
        return oracledb.connect(**DB_CONFIG)
    except Exception as exc:
        print(f"Database connection failed: {exc}")
        return None


def read_sql_df(sql: str, params: Optional[Dict[str, Any]] = None) -> pd.DataFrame:
    """Execute SQL query and return results as a DataFrame."""
    engine = get_engine()
    with engine.connect() as conn:
        df = pd.read_sql(text(sql), conn, params=params)
    df.columns = [str(c).upper() for c in df.columns]
    return df


# ============================================================
# Table Utilities
# ============================================================


def get_table_columns(table_name: str) -> list:
    """Get column names for a table."""
    connection = get_db_connection()
    if not connection:
        return []

    try:
        cursor = connection.cursor()
        cursor.execute(f"SELECT * FROM {table_name} WHERE ROWNUM <= 1")
        columns = [desc[0] for desc in cursor.description]
        cursor.close()
        connection.close()
        return columns
    except Exception:
        if connection:
            connection.close()
        return []


def get_table_data(
    table_name: str,
    limit: int = 1000,
    time_field: Optional[str] = None,
    filters: Optional[Dict[str, str]] = None,
) -> Dict[str, Any]:
    """Fetch rows from a table with optional filtering and sorting."""
    from datetime import datetime

    connection = get_db_connection()
    if not connection:
        return {'error': 'Database connection failed'}

    try:
        cursor = connection.cursor()

        where_conditions = []
        bind_params = {}

        if filters:
            for col, val in filters.items():
                if val and val.strip():
                    safe_col = ''.join(c for c in col if c.isalnum() or c == '_')
                    param_name = f"p_{safe_col}"
                    where_conditions.append(
                        f"UPPER(TO_CHAR({safe_col})) LIKE UPPER(:{param_name})"
                    )
                    bind_params[param_name] = f"%{val.strip()}%"

        if time_field:
            time_condition = f"{time_field} IS NOT NULL"
            if where_conditions:
                all_conditions = " AND ".join([time_condition] + where_conditions)
            else:
                all_conditions = time_condition

            sql = f"""
                SELECT * FROM (
                    SELECT * FROM {table_name}
                    WHERE {all_conditions}
                    ORDER BY {time_field} DESC
                ) WHERE ROWNUM <= :row_limit
            """
        else:
            if where_conditions:
                all_conditions = " AND ".join(where_conditions)
                sql = f"""
                    SELECT * FROM (
                        SELECT * FROM {table_name}
                        WHERE {all_conditions}
                    ) WHERE ROWNUM <= :row_limit
                """
            else:
                sql = f"""
                    SELECT * FROM {table_name}
                    WHERE ROWNUM <= :row_limit
|
||||
"""
|
||||
|
||||
bind_params['row_limit'] = limit
|
||||
cursor.execute(sql, bind_params)
|
||||
columns = [desc[0] for desc in cursor.description]
|
||||
rows = cursor.fetchall()
|
||||
|
||||
data = []
|
||||
for row in rows:
|
||||
row_dict = {}
|
||||
for i, col in enumerate(columns):
|
||||
value = row[i]
|
||||
if isinstance(value, datetime):
|
||||
row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
|
||||
else:
|
||||
row_dict[col] = value
|
||||
data.append(row_dict)
|
||||
|
||||
cursor.close()
|
||||
connection.close()
|
||||
|
||||
return {
|
||||
'columns': columns,
|
||||
'data': data,
|
||||
'row_count': len(data)
|
||||
}
|
||||
except Exception as exc:
|
||||
if connection:
|
||||
connection.close()
|
||||
return {'error': f'查詢失敗: {str(exc)}'}
|
||||
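The WHERE-clause construction in `get_table_data` is the security-relevant part of this file: column names cannot be bound as parameters, so they are reduced to identifier characters, while search values always travel as bind parameters. A minimal standalone sketch of that logic (`build_like_conditions` is a hypothetical name, not part of the module):

```python
from typing import Dict, List, Tuple

def build_like_conditions(filters: Dict[str, str]) -> Tuple[List[str], Dict[str, str]]:
    """Mirror of get_table_data's filter handling: sanitize the column
    name into [A-Za-z0-9_], bind the user value as :p_<col>."""
    conditions, binds = [], {}
    for col, val in filters.items():
        if val and val.strip():
            safe_col = ''.join(c for c in col if c.isalnum() or c == '_')
            param = f"p_{safe_col}"
            conditions.append(f"UPPER(TO_CHAR({safe_col})) LIKE UPPER(:{param})")
            binds[param] = f"%{val.strip()}%"
    return conditions, binds

# A hostile column name loses its punctuation; an empty filter is skipped.
conds, binds = build_like_conditions({'LOT_ID; DROP': 'abc ', 'QTY': ''})
print(conds)
print(binds)
```

Note this only neutralizes identifiers; the value side is safe because it is passed to `cursor.execute` as a bind parameter, never interpolated.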
208
src/mes_dashboard/core/utils.py
Normal file
@@ -0,0 +1,208 @@
# -*- coding: utf-8 -*-
"""Utility functions for MES Dashboard.

Common helper functions used across services.
"""

from datetime import datetime
from typing import Any, Dict, List, Optional

import pandas as pd

from mes_dashboard.config.constants import (
    DEFAULT_DAYS_BACK,
    EQUIPMENT_FLAG_FILTERS,
    EXCLUDED_LOCATIONS,
    EXCLUDED_ASSET_STATUSES,
)


# ============================================================
# Parameter Extraction
# ============================================================


def get_days_back(filters: Optional[Dict] = None, default: int = DEFAULT_DAYS_BACK) -> int:
    """Extract days_back parameter from filters dict."""
    if filters:
        return int(filters.get('days_back', default))
    return default


# ============================================================
# SQL Filter Building
# ============================================================


def build_filter_conditions(
    filters: Optional[Dict],
    field_mapping: Optional[Dict[str, str]] = None,
) -> List[str]:
    """Build SQL WHERE conditions from filters dict."""
    if not filters:
        return []

    conditions = []

    if field_mapping:
        for filter_key, column_name in field_mapping.items():
            values = filters.get(filter_key)
            if values and len(values) > 0:
                if isinstance(values, list):
                    value_list = "', '".join(str(v) for v in values)
                    conditions.append(f"{column_name} IN ('{value_list}')")
                else:
                    conditions.append(f"{column_name} = '{values}'")

    return conditions


def build_equipment_filter_sql(filters: Optional[Dict]) -> List[str]:
    """Build SQL conditions for equipment flag filters."""
    if not filters:
        return []

    conditions = []

    for flag_key, sql_condition in EQUIPMENT_FLAG_FILTERS.items():
        if filters.get(flag_key):
            conditions.append(sql_condition)

    return conditions


def build_location_filter_sql(
    filters: Optional[Dict],
    column_name: str = 'LOCATIONNAME',
) -> Optional[str]:
    """Build SQL condition for location filtering."""
    if not filters:
        return None

    locations = filters.get('locations')
    if locations and len(locations) > 0:
        loc_list = "', '".join(locations)
        return f"{column_name} IN ('{loc_list}')"

    return None


def build_asset_status_filter_sql(
    filters: Optional[Dict],
    column_name: str = 'PJ_ASSETSSTATUS',
) -> Optional[str]:
    """Build SQL condition for asset status filtering."""
    if not filters:
        return None

    statuses = filters.get('assetsStatuses')
    if statuses and len(statuses) > 0:
        status_list = "', '".join(statuses)
        return f"{column_name} IN ('{status_list}')"

    return None


def build_exclusion_sql(
    locations: Optional[List[str]] = None,
    asset_statuses: Optional[List[str]] = None,
    location_column: str = 'LOCATIONNAME',
    status_column: str = 'PJ_ASSETSSTATUS',
) -> List[str]:
    """Build SQL conditions for excluding specific locations and statuses."""
    conditions = []

    loc_list = locations if locations is not None else EXCLUDED_LOCATIONS
    if loc_list:
        locs = "', '".join(loc_list)
        conditions.append(f"{location_column} NOT IN ('{locs}')")

    status_list = asset_statuses if asset_statuses is not None else EXCLUDED_ASSET_STATUSES
    if status_list:
        stats = "', '".join(status_list)
        conditions.append(f"{status_column} NOT IN ('{stats}')")

    return conditions


# ============================================================
# Data Transformation
# ============================================================


def convert_datetime_fields(
    df: pd.DataFrame,
    columns: Optional[List[str]] = None,
    format_str: str = '%Y-%m-%d %H:%M:%S',
) -> pd.DataFrame:
    """Convert datetime columns in DataFrame to formatted strings."""
    if df.empty:
        return df

    if columns is None:
        columns = df.select_dtypes(include=['datetime64']).columns.tolist()

    for col in columns:
        if col in df.columns:
            df[col] = df[col].apply(
                lambda x: x.strftime(format_str) if pd.notna(x) and hasattr(x, 'strftime') else None
            )

    return df


def row_to_dict(row: Any, columns: List[str]) -> Dict[str, Any]:
    """Convert a database row to dictionary with datetime handling."""
    row_dict = {}
    for i, col in enumerate(columns):
        value = row[i]
        if isinstance(value, datetime):
            row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
        else:
            row_dict[col] = value
    return row_dict


# ============================================================
# API Response Formatting
# ============================================================


def format_api_response(
    success: bool,
    data: Any = None,
    error: Optional[str] = None,
    count: Optional[int] = None,
    **extra,
) -> Dict[str, Any]:
    """Create standardized API response dict."""
    response = {'success': success}

    if data is not None:
        response['data'] = data

    if error is not None:
        response['error'] = error

    if count is not None:
        response['count'] = count

    response.update(extra)

    return response


def safe_int(value: Any, default: int = 0) -> int:
    """Safely convert value to int."""
    try:
        return int(value) if value is not None else default
    except (ValueError, TypeError):
        return default


def safe_float(value: Any, default: float = 0.0) -> float:
    """Safely convert value to float."""
    try:
        return float(value) if value is not None else default
    except (ValueError, TypeError):
        return default
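To make the response helper's contract concrete: only fields that are not `None` appear in the payload, and keyword extras pass straight through. A self-contained sketch (the function body is reproduced from the module so the snippet runs on its own):

```python
from typing import Any, Dict, Optional

def format_api_response(success: bool, data: Any = None, error: Optional[str] = None,
                        count: Optional[int] = None, **extra) -> Dict[str, Any]:
    """Copy of the helper above: omit None fields, merge **extra."""
    response = {'success': success}
    if data is not None:
        response['data'] = data
    if error is not None:
        response['error'] = error
    if count is not None:
        response['count'] = count
    response.update(extra)
    return response

# Success payload with a pagination extra; failure payload carries only the error.
ok = format_api_response(True, data=[{'LOT_ID': 'A1'}], count=1, offset=0)
err = format_api_response(False, error='查詢失敗')
print(ok)
print(err)
```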
26
src/mes_dashboard/routes/__init__.py
Normal file
@@ -0,0 +1,26 @@
# -*- coding: utf-8 -*-
"""API routes module for MES Dashboard.

Contains Flask Blueprints for different API endpoints.
"""

from .wip_routes import wip_bp
from .resource_routes import resource_bp
from .dashboard_routes import dashboard_bp
from .excel_query_routes import excel_query_bp


def register_routes(app) -> None:
    """Register all API blueprints on the Flask app."""
    app.register_blueprint(wip_bp)
    app.register_blueprint(resource_bp)
    app.register_blueprint(dashboard_bp)
    app.register_blueprint(excel_query_bp)


__all__ = [
    'wip_bp',
    'resource_bp',
    'dashboard_bp',
    'excel_query_bp',
    'register_routes',
]
113
src/mes_dashboard/routes/dashboard_routes.py
Normal file
@@ -0,0 +1,113 @@
# -*- coding: utf-8 -*-
"""Dashboard API routes for MES Dashboard.

Contains Flask Blueprint for dashboard/KPI-related API endpoints.
"""

from flask import Blueprint, jsonify, request

from mes_dashboard.core.cache import cache_get, cache_set, make_cache_key
from mes_dashboard.core.utils import get_days_back
from mes_dashboard.services.dashboard_service import (
    query_dashboard_kpi,
    query_workcenter_cards,
    query_resource_detail_with_job,
    query_ou_trend,
    query_utilization_heatmap,
)

# Create Blueprint
dashboard_bp = Blueprint('dashboard', __name__, url_prefix='/api/dashboard')


@dashboard_bp.route('/kpi', methods=['POST'])
def api_dashboard_kpi():
    """API: Dashboard KPI data."""
    data = request.get_json() or {}
    filters = data.get('filters')

    days_back = get_days_back(filters)
    cache_key = make_cache_key("dashboard_kpi", days_back, filters)
    kpi = cache_get(cache_key)
    if kpi is None:
        kpi = query_dashboard_kpi(filters)
        if kpi:
            cache_set(cache_key, kpi)
    if kpi:
        return jsonify({'success': True, 'data': kpi})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@dashboard_bp.route('/workcenter_cards', methods=['POST'])
def api_dashboard_workcenter_cards():
    """API: Workcenter cards data."""
    data = request.get_json() or {}
    filters = data.get('filters')

    days_back = get_days_back(filters)
    cache_key = make_cache_key("dashboard_workcenter_cards", days_back, filters)
    cards = cache_get(cache_key)
    if cards is None:
        cards = query_workcenter_cards(filters)
        if cards is not None:
            cache_set(cache_key, cards)
    if cards is not None:
        return jsonify({'success': True, 'data': cards})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@dashboard_bp.route('/detail', methods=['POST'])
def api_dashboard_detail():
    """API: Resource detail with JOB info."""
    data = request.get_json() or {}
    filters = data.get('filters')
    limit = data.get('limit', 200)
    offset = data.get('offset', 0)

    df, max_status_time = query_resource_detail_with_job(filters, limit, offset)
    if df is not None:
        records = df.to_dict(orient='records')
        return jsonify({
            'success': True,
            'data': records,
            'count': len(records),
            'offset': offset,
            'max_status_time': max_status_time
        })
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@dashboard_bp.route('/ou_trend', methods=['POST'])
def api_dashboard_ou_trend():
    """API: OU% trend data for line chart."""
    data = request.get_json() or {}
    filters = data.get('filters')
    days = data.get('days', 7)

    cache_key = make_cache_key("dashboard_ou_trend", days, filters)
    trend = cache_get(cache_key)
    if trend is None:
        trend = query_ou_trend(days, filters)
        if trend is not None:
            cache_set(cache_key, trend, ttl=300)  # 5 min cache
    if trend is not None:
        return jsonify({'success': True, 'data': trend})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@dashboard_bp.route('/utilization_heatmap', methods=['POST'])
def api_dashboard_utilization_heatmap():
    """API: Utilization heatmap data."""
    data = request.get_json() or {}
    filters = data.get('filters')
    days = data.get('days', 7)

    cache_key = make_cache_key("dashboard_heatmap", days, filters)
    heatmap = cache_get(cache_key)
    if heatmap is None:
        heatmap = query_utilization_heatmap(days, filters)
        if heatmap is not None:
            cache_set(cache_key, heatmap, ttl=300)  # 5 min cache
    if heatmap is not None:
        return jsonify({'success': True, 'data': heatmap})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
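Every handler in this file follows the same read-through pattern against `mes_dashboard.core.cache` (which is not shown in this diff; per the commit message a `NoOpCache` backend may store nothing at all). A minimal in-memory sketch of the contract the routes assume, with hypothetical implementations of `make_cache_key`, `cache_get`, and `cache_set`:

```python
import json
import time

_store = {}

def make_cache_key(prefix, *parts):
    # Stable key: JSON-serialize the variable parts (filters dicts included).
    return prefix + ':' + json.dumps(parts, sort_keys=True, default=str)

def cache_set(key, value, ttl=60):
    _store[key] = (value, time.time() + ttl)

def cache_get(key):
    hit = _store.get(key)
    if hit is None:
        return None
    value, expires = hit
    if time.time() > expires:
        del _store[key]
        return None
    return value

def get_kpi(filters, query_fn):
    # Read-through, exactly as the route handlers do it.
    key = make_cache_key('dashboard_kpi', filters)
    kpi = cache_get(key)
    if kpi is None:
        kpi = query_fn(filters)
        if kpi:
            cache_set(key, kpi)
    return kpi

calls = []
def fake_query(filters):
    calls.append(1)
    return {'ou': 87.5}

a = get_kpi({'days_back': 30}, fake_query)
b = get_kpi({'days_back': 30}, fake_query)  # second call served from cache
```

With a `NoOpCache` backend, `cache_get` would always return `None` and the query would simply run every time, which is why the handlers are written to work with or without a real cache.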
193
src/mes_dashboard/routes/excel_query_routes.py
Normal file
@@ -0,0 +1,193 @@
# -*- coding: utf-8 -*-
"""API routes for Excel batch query functionality.

Provides endpoints for:
- Excel file upload and parsing
- Column value extraction
- Batch query execution
- CSV export
"""

import io
import os

from flask import Blueprint, jsonify, request, Response

from mes_dashboard.config.tables import TABLES_CONFIG
from mes_dashboard.core.database import get_table_columns
from mes_dashboard.services.excel_query_service import (
    parse_excel,
    get_column_unique_values,
    execute_batch_query,
    generate_csv_content,
)


excel_query_bp = Blueprint('excel_query', __name__, url_prefix='/api/excel-query')

# Store uploaded Excel data in memory (session-based in production)
_uploaded_excel_cache = {}


@excel_query_bp.route('/upload', methods=['POST'])
def upload_excel():
    """Upload and parse Excel file.

    Returns column list and preview data.
    """
    if 'file' not in request.files:
        return jsonify({'error': '未選擇檔案'}), 400

    file = request.files['file']
    if file.filename == '':
        return jsonify({'error': '未選擇檔案'}), 400

    # Check file extension
    allowed_extensions = {'.xlsx', '.xls'}
    ext = os.path.splitext(file.filename)[1].lower()
    if ext not in allowed_extensions:
        return jsonify({'error': '只支援 .xlsx 或 .xls 檔案'}), 400

    # Parse Excel
    result = parse_excel(file)
    if 'error' in result:
        return jsonify(result), 400

    # Cache the file content for later use
    file.seek(0)
    _uploaded_excel_cache['current'] = file.read()
    _uploaded_excel_cache['filename'] = file.filename

    return jsonify(result)


@excel_query_bp.route('/column-values', methods=['POST'])
def get_column_values():
    """Get unique values from selected Excel column."""
    data = request.get_json()
    column_name = data.get('column_name')

    if not column_name:
        return jsonify({'error': '請指定欄位名稱'}), 400

    if 'current' not in _uploaded_excel_cache:
        return jsonify({'error': '請先上傳 Excel 檔案'}), 400

    # Create file-like object from cached content
    file_like = io.BytesIO(_uploaded_excel_cache['current'])

    result = get_column_unique_values(file_like, column_name)
    if 'error' in result:
        return jsonify(result), 400

    return jsonify(result)


@excel_query_bp.route('/tables', methods=['GET'])
def get_tables():
    """Get available tables for querying."""
    tables = []
    for category, table_list in TABLES_CONFIG.items():
        for table in table_list:
            tables.append({
                'name': table['name'],
                'display_name': table['display_name'],
                'category': category
            })
    return jsonify({'tables': tables})


@excel_query_bp.route('/table-columns', methods=['POST'])
def get_table_cols():
    """Get columns for a specific table."""
    data = request.get_json()
    table_name = data.get('table_name')

    if not table_name:
        return jsonify({'error': '請指定資料表名稱'}), 400

    columns = get_table_columns(table_name)
    if not columns:
        return jsonify({'error': f'無法取得資料表 {table_name} 的欄位'}), 400

    return jsonify({'columns': columns})


@excel_query_bp.route('/execute', methods=['POST'])
def execute_query():
    """Execute batch query with Excel values.

    Expects JSON body:
    {
        "table_name": "DW_MES_WIP",
        "search_column": "LOT_ID",
        "return_columns": ["LOT_ID", "SPEC", "QTY"],
        "search_values": ["val1", "val2", ...]
    }
    """
    data = request.get_json()

    table_name = data.get('table_name')
    search_column = data.get('search_column')
    return_columns = data.get('return_columns')
    search_values = data.get('search_values')

    # Validation
    if not table_name:
        return jsonify({'error': '請指定資料表'}), 400
    if not search_column:
        return jsonify({'error': '請指定查詢欄位'}), 400
    if not return_columns or not isinstance(return_columns, list):
        return jsonify({'error': '請指定回傳欄位'}), 400
    if not search_values or not isinstance(search_values, list):
        return jsonify({'error': '無查詢值'}), 400

    result = execute_batch_query(
        table_name=table_name,
        search_column=search_column,
        return_columns=return_columns,
        search_values=search_values
    )

    if 'error' in result:
        return jsonify(result), 400

    return jsonify(result)


@excel_query_bp.route('/export-csv', methods=['POST'])
def export_csv():
    """Export query results as CSV file.

    Same parameters as /execute endpoint.
    """
    data = request.get_json()

    table_name = data.get('table_name')
    search_column = data.get('search_column')
    return_columns = data.get('return_columns')
    search_values = data.get('search_values')

    # Validation
    if not all([table_name, search_column, return_columns, search_values]):
        return jsonify({'error': '缺少必要參數'}), 400

    result = execute_batch_query(
        table_name=table_name,
        search_column=search_column,
        return_columns=return_columns,
        search_values=search_values
    )

    if 'error' in result:
        return jsonify(result), 400

    # Generate CSV
    csv_content = generate_csv_content(result['data'], result['columns'])

    return Response(
        csv_content,
        mimetype='text/csv; charset=utf-8',
        headers={
            'Content-Disposition': 'attachment; filename=query_result.csv'
        }
    )
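The `/execute` docstring documents the expected JSON shape, and the handler rejects anything malformed before touching the database. The same checks can be expressed as a standalone validator (a sketch; `validate_execute_payload` is a hypothetical helper, not part of the module):

```python
def validate_execute_payload(data: dict):
    """Mirror of the /execute validation: return an error message or None."""
    if not data.get('table_name'):
        return '請指定資料表'
    if not data.get('search_column'):
        return '請指定查詢欄位'
    if not isinstance(data.get('return_columns'), list) or not data['return_columns']:
        return '請指定回傳欄位'
    if not isinstance(data.get('search_values'), list) or not data['search_values']:
        return '無查詢值'
    return None

# The documented payload shape passes; a partial payload is rejected
# with the same message the route would return with HTTP 400.
payload = {
    'table_name': 'DW_MES_WIP',
    'search_column': 'LOT_ID',
    'return_columns': ['LOT_ID', 'SPEC', 'QTY'],
    'search_values': ['val1', 'val2'],
}
print(validate_execute_payload(payload))
print(validate_execute_payload({'table_name': 'DW_MES_WIP'}))
```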
152
src/mes_dashboard/routes/resource_routes.py
Normal file
@@ -0,0 +1,152 @@
# -*- coding: utf-8 -*-
"""Resource (Equipment) API routes for MES Dashboard.

Contains Flask Blueprint for resource/equipment-related API endpoints.
"""

from flask import Blueprint, jsonify, request

from mes_dashboard.core.database import get_db_connection
from mes_dashboard.core.cache import cache_get, cache_set, make_cache_key
from mes_dashboard.core.utils import get_days_back
from mes_dashboard.services.resource_service import (
    query_resource_status_summary,
    query_resource_by_status,
    query_resource_by_workcenter,
    query_resource_detail,
    query_resource_workcenter_status_matrix,
    query_resource_filter_options,
)

# Create Blueprint
resource_bp = Blueprint('resource', __name__, url_prefix='/api/resource')


@resource_bp.route('/summary')
def api_resource_summary():
    """API: Resource status summary."""
    days_back = request.args.get('days_back', 30, type=int)
    cache_key = make_cache_key("resource_summary", days_back)
    summary = cache_get(cache_key)
    if summary is None:
        summary = query_resource_status_summary(days_back)
        if summary:
            cache_set(cache_key, summary)
    if summary:
        return jsonify({'success': True, 'data': summary})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/by_status')
def api_resource_by_status():
    """API: Resource count by status."""
    days_back = request.args.get('days_back', 30, type=int)
    cache_key = make_cache_key("resource_by_status", days_back)
    data = cache_get(cache_key)
    if data is None:
        df = query_resource_by_status(days_back)
        if df is not None:
            data = df.to_dict(orient='records')
            cache_set(cache_key, data)
        else:
            data = None
    if data is not None:
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/by_workcenter')
def api_resource_by_workcenter():
    """API: Resource count by workcenter."""
    days_back = request.args.get('days_back', 30, type=int)
    cache_key = make_cache_key("resource_by_workcenter", days_back)
    data = cache_get(cache_key)
    if data is None:
        df = query_resource_by_workcenter(days_back)
        if df is not None:
            data = df.to_dict(orient='records')
            cache_set(cache_key, data)
        else:
            data = None
    if data is not None:
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/workcenter_status_matrix')
def api_resource_workcenter_status_matrix():
    """API: Resource count matrix by workcenter and status category."""
    days_back = request.args.get('days_back', 30, type=int)
    cache_key = make_cache_key("resource_workcenter_matrix", days_back)
    data = cache_get(cache_key)
    if data is None:
        df = query_resource_workcenter_status_matrix(days_back)
        if df is not None:
            data = df.to_dict(orient='records')
            cache_set(cache_key, data)
        else:
            data = None
    if data is not None:
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/detail', methods=['POST'])
def api_resource_detail():
    """API: Resource detail with filters."""
    data = request.get_json() or {}
    filters = data.get('filters')
    limit = data.get('limit', 500)
    offset = data.get('offset', 0)
    days_back = get_days_back(filters)

    df = query_resource_detail(filters, limit, offset, days_back)
    if df is not None:
        records = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': records, 'count': len(records), 'offset': offset})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/filter_options')
def api_resource_filter_options():
    """API: Get filter options."""
    days_back = request.args.get('days_back', 30, type=int)
    cache_key = make_cache_key("resource_filter_options", days_back)
    options = cache_get(cache_key)
    if options is None:
        options = query_resource_filter_options(days_back)
        if options:
            cache_set(cache_key, options)
    if options:
        return jsonify({'success': True, 'data': options})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@resource_bp.route('/status_values')
def api_resource_status_values():
    """API: Get all distinct status values with counts (for verification)."""
    connection = get_db_connection()
    if not connection:
        return jsonify({'success': False, 'error': '數據庫連接失敗'}), 500

    try:
        sql = """
            SELECT DISTINCT NEWSTATUSNAME, COUNT(*) as CNT
            FROM DW_MES_RESOURCESTATUS
            WHERE NEWSTATUSNAME IS NOT NULL
              AND LASTSTATUSCHANGEDATE >= SYSDATE - 30
            GROUP BY NEWSTATUSNAME
            ORDER BY CNT DESC
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        rows = cursor.fetchall()
        cursor.close()
        connection.close()

        data = [{'status': row[0], 'count': row[1]} for row in rows]
        return jsonify({'success': True, 'data': data})
    except Exception as exc:
        if connection:
            connection.close()
        return jsonify({'success': False, 'error': str(exc)}), 500
117
src/mes_dashboard/routes/wip_routes.py
Normal file
@@ -0,0 +1,117 @@
# -*- coding: utf-8 -*-
"""WIP API routes for MES Dashboard.

Contains Flask Blueprint for WIP-related API endpoints.
"""

from flask import Blueprint, jsonify, request

from mes_dashboard.services.wip_service import (
    query_wip_summary,
    query_wip_by_spec_workcenter,
    query_wip_by_product_line,
    query_wip_by_status,
    query_wip_by_mfgorder,
    query_wip_distribution_filter_options,
    query_wip_distribution_pivot_columns,
    query_wip_distribution,
)

# Create Blueprint
wip_bp = Blueprint('wip', __name__, url_prefix='/api/wip')


@wip_bp.route('/summary')
def api_wip_summary():
    """API: Current WIP summary."""
    summary = query_wip_summary()
    if summary:
        return jsonify({'success': True, 'data': summary})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/by_spec_workcenter')
def api_wip_by_spec_workcenter():
    """API: Current WIP by spec/workcenter."""
    df = query_wip_by_spec_workcenter()
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data, 'count': len(data)})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/by_product_line')
def api_wip_by_product_line():
    """API: Current WIP by product line."""
    df = query_wip_by_product_line()
    if df is not None:
        data = df.to_dict(orient='records')
        if not df.empty:
            product_line_summary = df.groupby('PRODUCTLINENAME_LEF').agg({
                'LOT_COUNT': 'sum',
                'TOTAL_QTY': 'sum',
                'TOTAL_QTY2': 'sum'
            }).reset_index()
            summary = product_line_summary.to_dict(orient='records')
        else:
            summary = []
        return jsonify({'success': True, 'data': data, 'summary': summary, 'count': len(data)})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/by_status')
def api_wip_by_status():
    """API: Current WIP by status."""
    df = query_wip_by_status()
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/by_mfgorder')
def api_wip_by_mfgorder():
    """API: Current WIP by mfg order (Top N)."""
    limit = request.args.get('limit', 100, type=int)
    df = query_wip_by_mfgorder(limit)
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution/filter_options')
def api_wip_distribution_filter_options():
    """API: Get WIP distribution filter options."""
    days_back = request.args.get('days_back', 90, type=int)
    options = query_wip_distribution_filter_options(days_back)
    if options:
        return jsonify({'success': True, 'data': options})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution/pivot_columns', methods=['POST'])
def api_wip_distribution_pivot_columns():
    """API: Get WIP distribution pivot columns."""
    data = request.get_json() or {}
    filters = data.get('filters')
    days_back = data.get('days_back', 90)
    columns = query_wip_distribution_pivot_columns(filters, days_back)
    if columns is not None:
        return jsonify({'success': True, 'data': columns})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution', methods=['POST'])
def api_wip_distribution():
    """API: Query WIP distribution main data."""
    data = request.get_json() or {}
    filters = data.get('filters')
    limit = min(data.get('limit', 500), 1000)  # Max 1000 records
    offset = data.get('offset', 0)
    days_back = data.get('days_back', 90)

    result = query_wip_distribution(filters, limit, offset, days_back)
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
1
src/mes_dashboard/services/__init__.py
Normal file
@@ -0,0 +1 @@
"""Service modules for MES Dashboard."""
710
src/mes_dashboard/services/dashboard_service.py
Normal file
@@ -0,0 +1,710 @@
|
||||
# -*- coding: utf-8 -*-
"""Dashboard and KPI query services for MES Dashboard.

Provides functions to query dashboard KPIs, workcenter cards,
resource details with job info, OU trends, and utilization heatmap.
"""

import pandas as pd
from typing import Optional, Dict, List, Any, Tuple

from mes_dashboard.core.database import get_db_connection, read_sql_df
from mes_dashboard.core.utils import get_days_back, build_equipment_filter_sql
from mes_dashboard.config.constants import (
    EXCLUDED_LOCATIONS,
    EXCLUDED_ASSET_STATUSES,
    DEFAULT_DAYS_BACK,
)
from mes_dashboard.config.workcenter_groups import WORKCENTER_GROUPS, get_workcenter_group
from mes_dashboard.services.resource_service import get_resource_latest_status_subquery


# ============================================================
# Dashboard KPI Queries
# ============================================================

def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
    """Query overall KPI for dashboard header.

    Status categories:
    - RUN: PRD (Production)
    - DOWN: UDT + SDT (Down Time)
    - IDLE: SBY + NST (Idle)
    - ENG: EGT (Engineering Time)

    OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100

    Args:
        filters: Optional filter values

    Returns:
        Dict with KPI data or None if query fails.
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        days_back = get_days_back(filters)
        base_sql = get_resource_latest_status_subquery(days_back)

        # Build filter conditions
        where_conditions = []
        if filters:
            # Equipment flag filters
            where_conditions.extend(build_equipment_filter_sql(filters))

            # Multi-select location filter
            if filters.get('locations') and len(filters['locations']) > 0:
                loc_list = "', '".join(filters['locations'])
                where_conditions.append(f"LOCATIONNAME IN ('{loc_list}')")

            # Multi-select asset status filter
            if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
                status_list = "', '".join(filters['assetsStatuses'])
                where_conditions.append(f"PJ_ASSETSSTATUS IN ('{status_list}')")

        where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"

        sql = f"""
            SELECT
                COUNT(*) as TOTAL,
                SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST_COUNT,
                SUM(CASE WHEN NEWSTATUSNAME NOT IN ('PRD','SBY','UDT','SDT','EGT','NST') THEN 1 ELSE 0 END) as OTHER_COUNT
            FROM ({base_sql}) rs
            WHERE {where_clause}
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        row = cursor.fetchone()
        cursor.close()
        connection.close()

        if not row:
            return None

        total = row[0] or 0
        prd = row[1] or 0
        sby = row[2] or 0
        udt = row[3] or 0
        sdt = row[4] or 0
        egt = row[5] or 0
        nst = row[6] or 0
        other = row[7] or 0

        # Status categories
        run_count = prd         # RUN = PRD
        down_count = udt + sdt  # DOWN = UDT + SDT
        idle_count = sby + nst  # IDLE = SBY + NST
        eng_count = egt         # ENG = EGT

        # OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
        # Denominator excludes NST and OTHER
        operational = prd + sby + egt + sdt + udt
        ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0

        # Run% = PRD / Total * 100
        run_pct = round(prd / total * 100, 1) if total > 0 else 0

        return {
            'total': total,
            'prd': prd,
            'sby': sby,
            'udt': udt,
            'sdt': sdt,
            'egt': egt,
            'nst': nst,
            'other': other,
            # Four main indicators
            'run': run_count,
            'down': down_count,
            'idle': idle_count,
            'eng': eng_count,
            # Percentages
            'ou_pct': ou_pct,
            'run_pct': run_pct
        }
    except Exception as exc:
        if connection:
            connection.close()
        print(f"KPI query failed: {exc}")
        return None


# ============================================================
# Workcenter Cards
# ============================================================


def query_workcenter_cards(filters: Optional[Dict] = None) -> Optional[List[Dict]]:
    """Query workcenter status cards for dashboard with grouping.

    Workcenter groups order:
    0: Cutting (切割)
    1: DB Bonding (焊接_DB)
    2: WB Bonding (焊接_WB)
    3: DW Bonding (焊接_DW)
    4: Molding (成型)
    5: Deflash (去膠)
    6: Blast (水吹砂)
    7: Plating (電鍍)
    8: Marking (移印)
    9: Trim/Form (切彎腳)
    10: PKG SAW (元件切割)
    11: Test (測試)

    Args:
        filters: Optional filter values

    Returns:
        List of workcenter card data or None if query fails.
    """
    try:
        days_back = get_days_back(filters)
        base_sql = get_resource_latest_status_subquery(days_back)

        # Build filter conditions
        where_conditions = []
        if filters:
            where_conditions.extend(build_equipment_filter_sql(filters))

            if filters.get('locations') and len(filters['locations']) > 0:
                loc_list = "', '".join(filters['locations'])
                where_conditions.append(f"LOCATIONNAME IN ('{loc_list}')")

            if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
                status_list = "', '".join(filters['assetsStatuses'])
                where_conditions.append(f"PJ_ASSETSSTATUS IN ('{status_list}')")

        where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"

        sql = f"""
            SELECT
                WORKCENTERNAME,
                COUNT(*) as TOTAL,
                SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD,
                SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY,
                SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT,
                SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT,
                SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT,
                SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST
            FROM ({base_sql}) rs
            WHERE WORKCENTERNAME IS NOT NULL AND {where_clause}
            GROUP BY WORKCENTERNAME
        """
        df = read_sql_df(sql)

        # Group workcenters
        grouped_data = {}
        ungrouped_data = []

        for _, row in df.iterrows():
            wc_name = row['WORKCENTERNAME']
            group_name, order = get_workcenter_group(wc_name)

            if group_name:
                if group_name not in grouped_data:
                    grouped_data[group_name] = {
                        'order': order,
                        'original_wcs': [],
                        'total': 0,
                        'prd': 0,
                        'sby': 0,
                        'udt': 0,
                        'sdt': 0,
                        'egt': 0,
                        'nst': 0
                    }
                grouped_data[group_name]['original_wcs'].append(wc_name)
                grouped_data[group_name]['total'] += int(row['TOTAL'])
                grouped_data[group_name]['prd'] += int(row['PRD'])
                grouped_data[group_name]['sby'] += int(row['SBY'])
                grouped_data[group_name]['udt'] += int(row['UDT'])
                grouped_data[group_name]['sdt'] += int(row['SDT'])
                grouped_data[group_name]['egt'] += int(row['EGT'])
                grouped_data[group_name]['nst'] += int(row['NST'])
            else:
                # Ungrouped workcenter
                ungrouped_data.append({
                    'workcenter': wc_name,
                    'original_wcs': [wc_name],
                    'order': 999,
                    'total': int(row['TOTAL']),
                    'prd': int(row['PRD']),
                    'sby': int(row['SBY']),
                    'udt': int(row['UDT']),
                    'sdt': int(row['SDT']),
                    'egt': int(row['EGT']),
                    'nst': int(row['NST'])
                })

        # Calculate OU% and build result
        result = []

        # Add grouped workcenters
        for group_name, data in grouped_data.items():
            prd = data['prd']
            sby = data['sby']
            egt = data['egt']
            sdt = data['sdt']
            udt = data['udt']
            total = data['total']

            # OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
            operational = prd + sby + egt + sdt + udt
            ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0
            run_pct = round(prd / total * 100, 1) if total > 0 else 0

            result.append({
                'workcenter': group_name,
                'original_wcs': data['original_wcs'],
                'order': data['order'],
                'total': total,
                'prd': prd,
                'sby': sby,
                'udt': udt,
                'sdt': sdt,
                'egt': egt,
                'nst': data['nst'],
                'ou_pct': ou_pct,
                'run_pct': run_pct,
                'down': udt + sdt,
                'idle': sby + data['nst'],
                'eng': egt
            })

        # Add ungrouped workcenters
        for data in ungrouped_data:
            prd = data['prd']
            sby = data['sby']
            egt = data['egt']
            sdt = data['sdt']
            udt = data['udt']
            total = data['total']

            operational = prd + sby + egt + sdt + udt
            ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0
            run_pct = round(prd / total * 100, 1) if total > 0 else 0

            result.append({
                'workcenter': data['workcenter'],
                'original_wcs': data['original_wcs'],
                'order': data['order'],
                'total': total,
                'prd': prd,
                'sby': sby,
                'udt': udt,
                'sdt': sdt,
                'egt': egt,
                'nst': data['nst'],
                'ou_pct': ou_pct,
                'run_pct': run_pct,
                'down': udt + sdt,
                'idle': sby + data['nst'],
                'eng': egt
            })

        # Sort by order
        result.sort(key=lambda x: (x['order'], -x['total']))

        return result
    except Exception as exc:
        print(f"Workcenter cards query failed: {exc}")
        return None


# ============================================================
# Resource Detail with Job Info
# ============================================================


def query_resource_detail_with_job(
    filters: Optional[Dict] = None,
    limit: int = 200,
    offset: int = 0
) -> Tuple[Optional[pd.DataFrame], Optional[str]]:
    """Query resource detail with JOB info for SDT/UDT drill-down.

    Field sources:
    - PJ_LOTID: From DW_MES_RESOURCE.PJ_LOTID
    - SYMPTOMCODENAME: From DW_MES_JOB via JOBID
    - CAUSECODENAME: From DW_MES_JOB via JOBID
    - DOWN_MINUTES: Calculated from MAX(LASTSTATUSCHANGEDATE) - resource's LASTSTATUSCHANGEDATE

    Args:
        filters: Optional filter values
        limit: Maximum rows to return
        offset: Offset for pagination

    Returns:
        Tuple of (DataFrame with detail records, max_status_time string) or (None, None) if query fails.
    """
    try:
        days_back = get_days_back(filters)

        # Build exclusion filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (r.LOCATIONNAME IS NULL OR r.LOCATIONNAME NOT IN ('{excluded_locations}'))"

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (r.PJ_ASSETSSTATUS IS NULL OR r.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"

        where_conditions = []
        if filters:
            # Support workcenter group filter
            if filters.get('workcenter'):
                wc_filter = filters['workcenter']
                # Check if it's a merged group
                if wc_filter in WORKCENTER_GROUPS:
                    patterns = WORKCENTER_GROUPS[wc_filter]['patterns']
                    pattern_conditions = []
                    for p in patterns:
                        pattern_conditions.append(f"UPPER(rs.WORKCENTERNAME) LIKE '%{p.upper()}%'")
                    where_conditions.append(f"({' OR '.join(pattern_conditions)})")
                else:
                    where_conditions.append(f"rs.WORKCENTERNAME = '{wc_filter}'")

            if filters.get('original_wcs'):
                # If original workcenter list provided, use IN query
                wcs = filters['original_wcs']
                wc_list = "', '".join(wcs)
                where_conditions.append(f"rs.WORKCENTERNAME IN ('{wc_list}')")

            if filters.get('status'):
                where_conditions.append(f"rs.NEWSTATUSNAME = '{filters['status']}'")

            # Equipment flag filters
            if filters.get('isProduction'):
                where_conditions.append("NVL(rs.PJ_ISPRODUCTION, 0) = 1")
            if filters.get('isKey'):
                where_conditions.append("NVL(rs.PJ_ISKEY, 0) = 1")
            if filters.get('isMonitor'):
                where_conditions.append("NVL(rs.PJ_ISMONITOR, 0) = 1")

            # Multi-select location filter
            if filters.get('locations') and len(filters['locations']) > 0:
                loc_list = "', '".join(filters['locations'])
                where_conditions.append(f"rs.LOCATIONNAME IN ('{loc_list}')")

            # Multi-select asset status filter
            if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
                status_list = "', '".join(filters['assetsStatuses'])
                where_conditions.append(f"rs.PJ_ASSETSSTATUS IN ('{status_list}')")

        # Default to showing only DOWN status (UDT, SDT)
        where_conditions.append("rs.NEWSTATUSNAME IN ('UDT', 'SDT')")

        where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"

        # Left join with JOB table for SDT/UDT details
        start_row = offset + 1
        end_row = offset + limit
        sql = f"""
            WITH latest_txn AS (
                SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
                FROM DW_MES_RESOURCESTATUS
            ),
            base_data AS (
                SELECT *
                FROM (
                    SELECT
                        r.RESOURCEID,
                        r.RESOURCENAME,
                        r.OBJECTCATEGORY,
                        r.OBJECTTYPE,
                        r.RESOURCEFAMILYNAME,
                        r.WORKCENTERNAME,
                        r.LOCATIONNAME,
                        r.VENDORNAME,
                        r.VENDORMODEL,
                        r.PJ_DEPARTMENT,
                        r.PJ_ASSETSSTATUS,
                        r.PJ_ISPRODUCTION,
                        r.PJ_ISKEY,
                        r.PJ_ISMONITOR,
                        r.PJ_LOTID,
                        r.DESCRIPTION,
                        s.NEWSTATUSNAME,
                        s.NEWREASONNAME,
                        s.LASTSTATUSCHANGEDATE,
                        s.OLDSTATUSNAME,
                        s.OLDREASONNAME,
                        s.AVAILABILITY,
                        s.JOBID,
                        s.TXNDATE,
                        ROW_NUMBER() OVER (
                            PARTITION BY r.RESOURCEID
                            ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
                                     COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
                        ) AS rn
                    FROM DW_MES_RESOURCE r
                    JOIN DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
                    CROSS JOIN latest_txn lt
                    WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
                        OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
                      AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {days_back}
                      {location_filter}
                      {asset_status_filter}
                )
                WHERE rn = 1
            ),
            max_time AS (
                SELECT MAX(LASTSTATUSCHANGEDATE) AS MAX_STATUS_TIME FROM base_data
            )
            SELECT * FROM (
                SELECT
                    rs.RESOURCENAME,
                    rs.WORKCENTERNAME,
                    rs.RESOURCEFAMILYNAME,
                    rs.NEWSTATUSNAME,
                    rs.NEWREASONNAME,
                    rs.LASTSTATUSCHANGEDATE,
                    rs.PJ_DEPARTMENT,
                    rs.VENDORNAME,
                    rs.VENDORMODEL,
                    rs.PJ_ISPRODUCTION,
                    rs.PJ_ISKEY,
                    rs.PJ_ISMONITOR,
                    j.JOBID,
                    rs.PJ_LOTID,
                    j.JOBORDERNAME,
                    j.JOBSTATUS,
                    j.SYMPTOMCODENAME,
                    j.CAUSECODENAME,
                    j.REPAIRCODENAME,
                    j.CREATEDATE as JOB_CREATEDATE,
                    j.FIRSTCLOCKONDATE,
                    mt.MAX_STATUS_TIME,
                    ROUND((mt.MAX_STATUS_TIME - rs.LASTSTATUSCHANGEDATE) * 24 * 60, 0) as DOWN_MINUTES,
                    ROW_NUMBER() OVER (
                        ORDER BY
                            CASE rs.NEWSTATUSNAME
                                WHEN 'UDT' THEN 1
                                WHEN 'SDT' THEN 2
                                ELSE 3
                            END,
                            rs.LASTSTATUSCHANGEDATE DESC NULLS LAST
                    ) AS rn
                FROM base_data rs
                CROSS JOIN max_time mt
                LEFT JOIN DW_MES_JOB j ON j.RESOURCEID = rs.RESOURCEID
                    AND j.CREATEDATE = rs.LASTSTATUSCHANGEDATE
                WHERE {where_clause}
            ) WHERE rn BETWEEN {start_row} AND {end_row}
        """
        df = read_sql_df(sql)

        # Get max_status_time for Last Update display
        max_status_time = None
        if 'MAX_STATUS_TIME' in df.columns and len(df) > 0:
            max_status_time = df['MAX_STATUS_TIME'].iloc[0]
            if pd.notna(max_status_time):
                max_status_time = max_status_time.strftime('%Y-%m-%d %H:%M:%S')

        # Convert datetime columns
        datetime_cols = ['LASTSTATUSCHANGEDATE', 'JOB_CREATEDATE', 'FIRSTCLOCKONDATE', 'MAX_STATUS_TIME']
        for col in datetime_cols:
            if col in df.columns:
                df[col] = df[col].apply(
                    lambda x: x.strftime('%Y-%m-%d %H:%M:%S') if pd.notna(x) else None
                )

        return df, max_status_time
    except Exception as exc:
        print(f"Detail query failed: {exc}")
        return None, None


# ============================================================
# OU Trend
# ============================================================


def query_ou_trend(days: int = 7, filters: Optional[Dict] = None) -> Optional[List[Dict]]:
    """Query OU% trend by date using RESOURCESTATUS_SHIFT table.

    Uses HOURS field to calculate actual time-based OU%.
    OU% = PRD_HOURS / (PRD + SBY + EGT + SDT + UDT) * 100

    Args:
        days: Number of days to query (default 7)
        filters: Optional filters (isProduction, isKey, isMonitor)

    Returns:
        List of {date, ou_pct, prd_hours, total_hours} records or None if query fails.
    """
    try:
        # Build location and asset status filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (ss.LOCATIONNAME IS NULL OR ss.LOCATIONNAME NOT IN ('{excluded_locations}'))"

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (ss.PJ_ASSETSSTATUS IS NULL OR ss.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"

        # Build filter conditions for equipment flags
        flag_conditions = []
        if filters:
            if filters.get('isProduction'):
                flag_conditions.append("r.PJ_ISPRODUCTION = 1")
            if filters.get('isKey'):
                flag_conditions.append("r.PJ_ISKEY = 1")
            if filters.get('isMonitor'):
                flag_conditions.append("r.PJ_ISMONITOR = 1")

        flag_filter = ""
        if flag_conditions:
            flag_filter = "AND " + " AND ".join(flag_conditions)

        sql = f"""
            SELECT
                TRUNC(ss.TXNDATE) as DATA_DATE,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'SBY' THEN ss.HOURS ELSE 0 END) as SBY_HOURS,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'UDT' THEN ss.HOURS ELSE 0 END) as UDT_HOURS,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'SDT' THEN ss.HOURS ELSE 0 END) as SDT_HOURS,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'EGT' THEN ss.HOURS ELSE 0 END) as EGT_HOURS,
                SUM(ss.HOURS) as TOTAL_HOURS
            FROM DW_MES_RESOURCESTATUS_SHIFT ss
            JOIN DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
            WHERE ss.TXNDATE >= TRUNC(SYSDATE) - {days}
              AND ss.TXNDATE < TRUNC(SYSDATE)
              AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
                OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
              {location_filter}
              {asset_status_filter}
              {flag_filter}
            GROUP BY TRUNC(ss.TXNDATE)
            ORDER BY DATA_DATE
        """
        df = read_sql_df(sql)

        result = []
        for _, row in df.iterrows():
            prd = float(row['PRD_HOURS'] or 0)
            sby = float(row['SBY_HOURS'] or 0)
            udt = float(row['UDT_HOURS'] or 0)
            sdt = float(row['SDT_HOURS'] or 0)
            egt = float(row['EGT_HOURS'] or 0)

            # OU% denominator: PRD + SBY + EGT + SDT + UDT (excludes NST)
            denominator = prd + sby + egt + sdt + udt
            ou_pct = round((prd / denominator * 100), 2) if denominator > 0 else 0

            result.append({
                'date': row['DATA_DATE'].strftime('%Y-%m-%d') if pd.notna(row['DATA_DATE']) else None,
                'ou_pct': ou_pct,
                'prd_hours': round(prd, 1),
                'sby_hours': round(sby, 1),
                'udt_hours': round(udt, 1),
                'sdt_hours': round(sdt, 1),
                'egt_hours': round(egt, 1),
                'total_hours': round(float(row['TOTAL_HOURS'] or 0), 1)
            })

        return result
    except Exception as exc:
        print(f"OU trend query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


# ============================================================
# Utilization Heatmap
# ============================================================


def query_utilization_heatmap(days: int = 7, filters: Optional[Dict] = None) -> Optional[List[Dict]]:
    """Query equipment utilization heatmap data by workcenter and date.

    Uses HOURS field to calculate PRD% per workcenter per day.

    Args:
        days: Number of days to query (default 7)
        filters: Optional filters (isProduction, isKey, isMonitor)

    Returns:
        List of {workcenter, date, prd_pct, prd_hours, avail_hours} records or None if query fails.
    """
    try:
        # Build location and asset status filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (ss.LOCATIONNAME IS NULL OR ss.LOCATIONNAME NOT IN ('{excluded_locations}'))"

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (ss.PJ_ASSETSSTATUS IS NULL OR ss.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"

        # Build filter conditions for equipment flags
        flag_conditions = []
        if filters:
            if filters.get('isProduction'):
                flag_conditions.append("r.PJ_ISPRODUCTION = 1")
            if filters.get('isKey'):
                flag_conditions.append("r.PJ_ISKEY = 1")
            if filters.get('isMonitor'):
                flag_conditions.append("r.PJ_ISMONITOR = 1")

        flag_filter = ""
        if flag_conditions:
            flag_filter = "AND " + " AND ".join(flag_conditions)

        sql = f"""
            SELECT
                ss.WORKCENTERNAME,
                TRUNC(ss.TXNDATE) as DATA_DATE,
                SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
                SUM(CASE WHEN ss.OLDSTATUSNAME IN ('PRD', 'SBY', 'UDT', 'SDT', 'EGT') THEN ss.HOURS ELSE 0 END) as AVAIL_HOURS
            FROM DW_MES_RESOURCESTATUS_SHIFT ss
            JOIN DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
            WHERE ss.TXNDATE >= TRUNC(SYSDATE) - {days}
              AND ss.TXNDATE < TRUNC(SYSDATE)
              AND ss.WORKCENTERNAME IS NOT NULL
              AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
                OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
              {location_filter}
              {asset_status_filter}
              {flag_filter}
            GROUP BY ss.WORKCENTERNAME, TRUNC(ss.TXNDATE)
            ORDER BY ss.WORKCENTERNAME, DATA_DATE
        """
        df = read_sql_df(sql)

        # Group by workcenter for heatmap format
        result = []
        for _, row in df.iterrows():
            prd = float(row['PRD_HOURS'] or 0)
            avail = float(row['AVAIL_HOURS'] or 0)
            prd_pct = round((prd / avail * 100), 2) if avail > 0 else 0

            wc_name = row['WORKCENTERNAME']
            # Apply workcenter grouping
            group_name, _ = get_workcenter_group(wc_name)

            result.append({
                'workcenter': wc_name,
                'group': group_name,
                'date': row['DATA_DATE'].strftime('%Y-%m-%d') if pd.notna(row['DATA_DATE']) else None,
                'prd_pct': prd_pct,
                'prd_hours': round(prd, 1),
                'avail_hours': round(avail, 1)
            })

        return result
    except Exception as exc:
        print(f"Utilization heatmap query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None
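The OU% formula documented in this file excludes NST (and OTHER) from the denominator: OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100. A quick arithmetic check with made-up counts (not real production data); `ou_pct` here is a standalone re-statement of the inline calculation, not a function in the committed file:

```python
def ou_pct(prd, sby, egt, sdt, udt):
    """OU% as defined in dashboard_service: PRD over operational time."""
    operational = prd + sby + egt + sdt + udt  # NST/OTHER excluded
    return round(prd / operational * 100, 1) if operational > 0 else 0

print(ou_pct(80, 10, 2, 3, 5))  # 80.0  (80 of 100 operational units)
print(ou_pct(0, 0, 0, 0, 0))    # 0     (guarded against division by zero)
```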
199
src/mes_dashboard/services/excel_query_service.py
Normal file
@@ -0,0 +1,199 @@
# -*- coding: utf-8 -*-
"""Excel batch query service for MES Dashboard.

Provides Excel parsing, batch query execution, and CSV export functions.
Supports large datasets (7000+ rows) by splitting queries into batches.
"""

import re
from datetime import datetime
from typing import Any, Dict, List, Tuple

import pandas as pd

from mes_dashboard.core.database import get_db_connection


# Oracle IN clause limit
BATCH_SIZE = 1000


def parse_excel(file_storage) -> Dict[str, Any]:
    """Parse uploaded Excel file and return column info.

    Args:
        file_storage: Flask FileStorage object

    Returns:
        Dict with 'columns' list and 'preview' data, or 'error' if failed.
    """
    try:
        df = pd.read_excel(file_storage)
        columns = [str(col) for col in df.columns.tolist()]
        preview_df = df.head(5).copy()
        preview_df.columns = columns
        preview = preview_df.to_dict('records')

        return {
            'columns': columns,
            'preview': preview,
            'total_rows': len(df)
        }
    except Exception as exc:
        return {'error': f'Excel 解析失敗: {str(exc)}'}


def get_column_unique_values(file_storage, column_name: str) -> Dict[str, Any]:
    """Get unique values from a specific Excel column.

    Args:
        file_storage: Flask FileStorage object
        column_name: Name of the column to extract

    Returns:
        Dict with 'values' list and 'count', or 'error' if failed.
    """
    try:
        df = pd.read_excel(file_storage)
        df.columns = [str(col) for col in df.columns]

        if column_name not in df.columns:
            return {'error': f'欄位 {column_name} 不存在'}

        values = df[column_name].dropna().drop_duplicates()
        values_list = [str(v).strip() for v in values.tolist() if str(v).strip()]

        return {
            'values': values_list,
            'count': len(values_list)
        }
    except Exception as exc:
        return {'error': f'讀取欄位失敗: {str(exc)}'}


def sanitize_column_name(name: str) -> str:
    """Sanitize column name to prevent SQL injection."""
    return re.sub(r'[^a-zA-Z0-9_]', '', name)


def validate_table_name(table_name: str) -> bool:
    """Validate table name format."""
    return bool(re.match(r'^[A-Za-z_][A-Za-z0-9_]*$', table_name))


def execute_batch_query(
    table_name: str,
    search_column: str,
    return_columns: List[str],
    search_values: List[str]
) -> Dict[str, Any]:
    """Execute batch query with IN clause, splitting into batches for large datasets.

    Handles Oracle's 1000-value limit by executing multiple queries and merging results.

    Args:
        table_name: Target table name
        search_column: Column to search (WHERE ... IN)
        return_columns: Columns to return in SELECT
        search_values: Values to search for (can be 7000+)

    Returns:
        Dict with 'columns', 'data', 'row_count', or 'error' if failed.
    """
    # Validate inputs
    if not validate_table_name(table_name):
        return {'error': f'無效的資料表名稱: {table_name}'}

    safe_search_col = sanitize_column_name(search_column)
    safe_return_cols = [sanitize_column_name(col) for col in return_columns]

    if not safe_search_col:
        return {'error': '查詢欄位名稱無效'}
    if not safe_return_cols:
        return {'error': '回傳欄位名稱無效'}

    connection = get_db_connection()
    if not connection:
        return {'error': '資料庫連接失敗'}

    try:
        cursor = connection.cursor()
        all_data = []
        columns = None
        columns_str = ', '.join(safe_return_cols)

        # Calculate batch count for progress info
        total_batches = (len(search_values) + BATCH_SIZE - 1) // BATCH_SIZE

        # Process in batches
        for batch_idx in range(0, len(search_values), BATCH_SIZE):
            batch_values = search_values[batch_idx:batch_idx + BATCH_SIZE]

            # Build placeholders and params for this batch
            placeholders = ', '.join([f':v{j}' for j in range(len(batch_values))])
            params = {f'v{j}': str(v) for j, v in enumerate(batch_values)}

            sql = f"""
                SELECT {columns_str}
                FROM {table_name}
                WHERE {safe_search_col} IN ({placeholders})
            """

            cursor.execute(sql, params)

            # Get column names from first batch
            if columns is None:
                columns = [desc[0] for desc in cursor.description]

            rows = cursor.fetchall()

            # Convert rows to dicts
            for row in rows:
                row_dict = {}
                for i, col in enumerate(columns):
                    value = row[i]
                    if isinstance(value, datetime):
                        row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
                    else:
                        row_dict[col] = value
                all_data.append(row_dict)

        cursor.close()
        connection.close()

        return {
            'columns': columns or safe_return_cols,
            'data': all_data,
            'row_count': len(all_data),
            'search_count': len(search_values),
            'batch_count': total_batches
        }

    except Exception as exc:
        if connection:
            connection.close()
        return {'error': f'查詢失敗: {str(exc)}'}


def generate_csv_content(data: List[Dict], columns: List[str]) -> str:
    """Generate CSV content from query results.

    Args:
        data: List of row dictionaries
        columns: Column names for header

    Returns:
        CSV content as string (UTF-8 with BOM for Excel compatibility)
    """
    import csv
    import io

    output = io.StringIO()
    # Add BOM for Excel UTF-8 compatibility
    output.write('\ufeff')

    writer = csv.DictWriter(output, fieldnames=columns, extrasaction='ignore')
    writer.writeheader()
    writer.writerows(data)

    return output.getvalue()
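`execute_batch_query` above works around Oracle's 1000-element IN-list limit by slicing the search values into `BATCH_SIZE` chunks and merging the results. The chunking logic itself, isolated for clarity; `iter_batches` is a hypothetical helper mirroring the loop in the service, not part of the committed file:

```python
BATCH_SIZE = 1000  # Oracle IN clause limit, as in excel_query_service

def iter_batches(values, size=BATCH_SIZE):
    """Yield consecutive slices of at most `size` elements."""
    for start in range(0, len(values), size):
        yield values[start:start + size]

vals = [str(i) for i in range(2500)]
batches = list(iter_batches(vals))
print(len(batches))      # 3 batches: 1000 + 1000 + 500
print(len(batches[-1]))  # 500
# Ceiling division used by the service for the progress count agrees:
print((len(vals) + BATCH_SIZE - 1) // BATCH_SIZE)  # 3
```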
397
src/mes_dashboard/services/resource_service.py
Normal file
@@ -0,0 +1,397 @@
# -*- coding: utf-8 -*-
"""Resource (Equipment) query services for MES Dashboard.

Provides functions to query equipment status from DW_MES_RESOURCE and DW_MES_RESOURCESTATUS tables.
"""

import pandas as pd
from typing import Optional, Dict, List, Any

from mes_dashboard.core.database import get_db_connection, read_sql_df
from mes_dashboard.config.constants import (
    EXCLUDED_LOCATIONS,
    EXCLUDED_ASSET_STATUSES,
    DEFAULT_DAYS_BACK,
)


# ============================================================
# Resource Base Subquery
# ============================================================


def get_resource_latest_status_subquery(days_back: int = 30) -> str:
    """Returns subquery to get latest status per resource.

    Filter conditions:
    - (OBJECTCATEGORY = 'ASSEMBLY' AND OBJECTTYPE = 'ASSEMBLY') OR
      (OBJECTCATEGORY = 'WAFERSORT' AND OBJECTTYPE = 'WAFERSORT')
    - Excludes specified locations and asset statuses

    Uses ROW_NUMBER() for performance.
    Only scans recent status changes (default 30 days).
    Includes JOBID for SDT/UDT drill-down.
    Includes PJ_LOTID from RESOURCE table.

    Args:
        days_back: Number of days to look back

    Returns:
        SQL subquery string for latest resource status.
    """
    # Build exclusion filters
    location_filter = ""
    if EXCLUDED_LOCATIONS:
        excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
        location_filter = f"AND (r.LOCATIONNAME IS NULL OR r.LOCATIONNAME NOT IN ('{excluded_locations}'))"

    asset_status_filter = ""
    if EXCLUDED_ASSET_STATUSES:
        excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
        asset_status_filter = f"AND (r.PJ_ASSETSSTATUS IS NULL OR r.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"

    return f"""
        WITH latest_txn AS (
            SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
            FROM DW_MES_RESOURCESTATUS
        )
        SELECT *
        FROM (
            SELECT
                r.RESOURCEID,
                r.RESOURCENAME,
                r.OBJECTCATEGORY,
                r.OBJECTTYPE,
                r.RESOURCEFAMILYNAME,
                r.WORKCENTERNAME,
                r.LOCATIONNAME,
                r.VENDORNAME,
                r.VENDORMODEL,
                r.PJ_DEPARTMENT,
                r.PJ_ASSETSSTATUS,
                r.PJ_ISPRODUCTION,
                r.PJ_ISKEY,
                r.PJ_ISMONITOR,
                r.PJ_LOTID,
                r.DESCRIPTION,
                s.NEWSTATUSNAME,
                s.NEWREASONNAME,
                s.LASTSTATUSCHANGEDATE,
                s.OLDSTATUSNAME,
                s.OLDREASONNAME,
                s.AVAILABILITY,
                s.JOBID,
                s.TXNDATE,
                ROW_NUMBER() OVER (
                    PARTITION BY r.RESOURCEID
                    ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
|
||||
COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
|
||||
) AS rn
|
||||
FROM DW_MES_RESOURCE r
|
||||
JOIN DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
|
||||
CROSS JOIN latest_txn lt
|
||||
WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
|
||||
OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
|
||||
AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {days_back}
|
||||
{location_filter}
|
||||
{asset_status_filter}
|
||||
)
|
||||
WHERE rn = 1
|
||||
"""
|
||||
|
||||
|
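The `ROW_NUMBER() OVER (PARTITION BY … ORDER BY … DESC)` / `rn = 1` idiom in this subquery is the standard latest-row-per-group selection. A hedged pandas equivalent (sample data invented for illustration) shows the same pick:

```python
import pandas as pd

# Illustrative status history: two resources, several status changes each
df = pd.DataFrame({
    'RESOURCEID': ['R1', 'R1', 'R2', 'R2', 'R2'],
    'NEWSTATUSNAME': ['SBY', 'PRD', 'PRD', 'UDT', 'SBY'],
    'TXNDATE': pd.to_datetime([
        '2024-01-01', '2024-01-03', '2024-01-02', '2024-01-04', '2024-01-05'
    ]),
})

# Equivalent of PARTITION BY RESOURCEID ORDER BY TXNDATE DESC ... WHERE rn = 1:
# sort newest-first, then keep the first row seen for each resource
latest = (df.sort_values('TXNDATE', ascending=False)
            .drop_duplicates('RESOURCEID')
            .set_index('RESOURCEID'))
```

Each resource ends up with exactly one row, carrying its most recent status.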
# ============================================================
# Resource Summary Queries
# ============================================================

def query_resource_status_summary(days_back: int = 30) -> Optional[Dict]:
    """Query resource status summary statistics.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with summary stats or None if query fails.
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        sql = f"""
            SELECT
                COUNT(*) as TOTAL_COUNT,
                COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
                COUNT(DISTINCT RESOURCEFAMILYNAME) as FAMILY_COUNT,
                COUNT(DISTINCT PJ_DEPARTMENT) as DEPT_COUNT
            FROM ({get_resource_latest_status_subquery(days_back)}) rs
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        result = cursor.fetchone()
        cursor.close()
        connection.close()

        if not result:
            return None
        return {
            'total_count': result[0] or 0,
            'workcenter_count': result[1] or 0,
            'family_count': result[2] or 0,
            'dept_count': result[3] or 0
        }
    except Exception as exc:
        if connection:
            connection.close()
        print(f"Resource summary query failed: {exc}")
        return None


def query_resource_by_status(days_back: int = 30) -> Optional[pd.DataFrame]:
    """Query resource count grouped by status.

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with status counts or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                NEWSTATUSNAME,
                COUNT(*) as COUNT
            FROM ({get_resource_latest_status_subquery(days_back)}) rs
            WHERE NEWSTATUSNAME IS NOT NULL
            GROUP BY NEWSTATUSNAME
            ORDER BY COUNT DESC
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"Resource by status query failed: {exc}")
        return None


def query_resource_by_workcenter(days_back: int = 30) -> Optional[pd.DataFrame]:
    """Query resource count grouped by workcenter and status.

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with workcenter/status counts or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                WORKCENTERNAME,
                NEWSTATUSNAME,
                COUNT(*) as COUNT
            FROM ({get_resource_latest_status_subquery(days_back)}) rs
            WHERE WORKCENTERNAME IS NOT NULL
            GROUP BY WORKCENTERNAME, NEWSTATUSNAME
            ORDER BY WORKCENTERNAME, COUNT DESC
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"Resource by workcenter query failed: {exc}")
        return None


def query_resource_detail(
    filters: Optional[Dict] = None,
    limit: int = 500,
    offset: int = 0,
    days_back: int = 30
) -> Optional[pd.DataFrame]:
    """Query resource detail with optional filters.

    Args:
        filters: Optional filter values
        limit: Maximum rows to return
        offset: Offset for pagination
        days_back: Number of days to look back

    Returns:
        DataFrame with resource details or None if query fails.
    """
    try:
        base_sql = get_resource_latest_status_subquery(days_back)

        where_conditions = []
        if filters:
            if filters.get('workcenter'):
                where_conditions.append(f"WORKCENTERNAME = '{filters['workcenter']}'")
            if filters.get('status'):
                where_conditions.append(f"NEWSTATUSNAME = '{filters['status']}'")
            if filters.get('family'):
                where_conditions.append(f"RESOURCEFAMILYNAME = '{filters['family']}'")
            if filters.get('department'):
                where_conditions.append(f"PJ_DEPARTMENT = '{filters['department']}'")

            # Equipment flag filters
            if filters.get('isProduction') is not None:
                where_conditions.append(
                    f"NVL(PJ_ISPRODUCTION, 0) = {1 if filters['isProduction'] else 0}"
                )
            if filters.get('isKey') is not None:
                where_conditions.append(
                    f"NVL(PJ_ISKEY, 0) = {1 if filters['isKey'] else 0}"
                )
            if filters.get('isMonitor') is not None:
                where_conditions.append(
                    f"NVL(PJ_ISMONITOR, 0) = {1 if filters['isMonitor'] else 0}"
                )

        where_clause = " AND " + " AND ".join(where_conditions) if where_conditions else ""

        start_row = offset + 1
        end_row = offset + limit
        sql = f"""
            SELECT * FROM (
                SELECT
                    RESOURCENAME,
                    WORKCENTERNAME,
                    RESOURCEFAMILYNAME,
                    NEWSTATUSNAME,
                    NEWREASONNAME,
                    LASTSTATUSCHANGEDATE,
                    PJ_DEPARTMENT,
                    VENDORNAME,
                    VENDORMODEL,
                    PJ_ASSETSSTATUS,
                    AVAILABILITY,
                    PJ_ISPRODUCTION,
                    PJ_ISKEY,
                    PJ_ISMONITOR,
                    ROW_NUMBER() OVER (
                        ORDER BY LASTSTATUSCHANGEDATE DESC NULLS LAST
                    ) AS rn
                FROM ({base_sql}) rs
                WHERE 1=1 {where_clause}
            ) WHERE rn BETWEEN {start_row} AND {end_row}
        """
        df = read_sql_df(sql)

        # Convert datetime to string
        if 'LASTSTATUSCHANGEDATE' in df.columns:
            df['LASTSTATUSCHANGEDATE'] = df['LASTSTATUSCHANGEDATE'].apply(
                lambda x: x.strftime('%Y-%m-%d %H:%M:%S') if pd.notna(x) else None
            )

        return df
    except Exception as exc:
        print(f"Resource detail query failed: {exc}")
        return None

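`query_resource_detail` (and `query_wip_distribution` further below) page results with Oracle's `ROW_NUMBER() ... WHERE rn BETWEEN start AND end` idiom. The offset/limit-to-row-window mapping is simple but easy to get off by one, since `ROW_NUMBER()` is 1-based and `BETWEEN` is inclusive; a small sketch of the arithmetic:

```python
def row_window(offset, limit):
    # ROW_NUMBER() is 1-based and BETWEEN is inclusive, so a page of
    # size `limit` starting at `offset` covers rows offset+1 .. offset+limit
    return offset + 1, offset + limit

# e.g. the second page of 500 rows maps to rows 501..1000
start_row, end_row = row_window(500, 500)
```

The window size is always `end_row - start_row + 1 == limit`, regardless of the offset.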
def query_resource_workcenter_status_matrix(days_back: int = 30) -> Optional[pd.DataFrame]:
    """Query resource count matrix by workcenter and status category.

    Status values in database:
    - PRD: Productive
    - SBY: Standby
    - UDT: Unscheduled Down Time
    - SDT: Scheduled Down Time
    - EGT: Engineering Time
    - NST: Not Scheduled Time

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with workcenter/status matrix or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                WORKCENTERNAME,
                CASE NEWSTATUSNAME
                    WHEN 'PRD' THEN 'PRD'
                    WHEN 'SBY' THEN 'SBY'
                    WHEN 'UDT' THEN 'UDT'
                    WHEN 'SDT' THEN 'SDT'
                    WHEN 'EGT' THEN 'EGT'
                    WHEN 'NST' THEN 'NST'
                    WHEN 'SCRAP' THEN 'SCRAP'
                    ELSE 'OTHER'
                END as STATUS_CATEGORY,
                NEWSTATUSNAME,
                COUNT(*) as COUNT
            FROM ({get_resource_latest_status_subquery(days_back)}) rs
            WHERE WORKCENTERNAME IS NOT NULL
            GROUP BY WORKCENTERNAME,
                CASE NEWSTATUSNAME
                    WHEN 'PRD' THEN 'PRD'
                    WHEN 'SBY' THEN 'SBY'
                    WHEN 'UDT' THEN 'UDT'
                    WHEN 'SDT' THEN 'SDT'
                    WHEN 'EGT' THEN 'EGT'
                    WHEN 'NST' THEN 'NST'
                    WHEN 'SCRAP' THEN 'SCRAP'
                    ELSE 'OTHER'
                END,
                NEWSTATUSNAME
            ORDER BY WORKCENTERNAME, STATUS_CATEGORY
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"Resource status matrix query failed: {exc}")
        return None


def query_resource_filter_options(days_back: int = 30) -> Optional[Dict]:
    """Get available filter options for resource queries.

    Optimized: combines multiple queries into fewer database calls.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with filter options or None if query fails.
    """
    try:
        # Query from latest status data
        sql_latest = f"""
            SELECT
                WORKCENTERNAME,
                NEWSTATUSNAME,
                RESOURCEFAMILYNAME,
                PJ_DEPARTMENT
            FROM ({get_resource_latest_status_subquery(days_back)}) rs
        """
        latest_df = read_sql_df(sql_latest)

        # Query from resource table for location and asset status
        sql_resource = """
            SELECT
                LOCATIONNAME,
                PJ_ASSETSSTATUS
            FROM DW_MES_RESOURCE r
            WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
                   OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
        """
        resource_df = read_sql_df(sql_resource)

        # Extract unique values
        workcenters = sorted(latest_df['WORKCENTERNAME'].dropna().unique().tolist())
        statuses = sorted(latest_df['NEWSTATUSNAME'].dropna().unique().tolist())
        families = sorted(latest_df['RESOURCEFAMILYNAME'].dropna().unique().tolist())
        departments = sorted(latest_df['PJ_DEPARTMENT'].dropna().unique().tolist())
        locations = sorted(resource_df['LOCATIONNAME'].dropna().unique().tolist())
        assets_statuses = sorted(resource_df['PJ_ASSETSSTATUS'].dropna().unique().tolist())

        return {
            'workcenters': workcenters,
            'statuses': statuses,
            'families': families,
            'departments': departments,
            'locations': locations,
            'assets_statuses': assets_statuses
        }
    except Exception as exc:
        print(f"Resource filter options query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None
464
src/mes_dashboard/services/wip_service.py
Normal file
@@ -0,0 +1,464 @@
# -*- coding: utf-8 -*-
"""WIP (Work In Progress) query services for MES Dashboard.

Provides functions to query WIP data from DW_MES_WIP table.
"""

import pandas as pd
from typing import Optional, Dict, List, Any

from mes_dashboard.core.database import get_db_connection, read_sql_df
from mes_dashboard.config.workcenter_groups import get_workcenter_group
from mes_dashboard.config.constants import DEFAULT_WIP_DAYS_BACK, WIP_EXCLUDED_STATUS


# ============================================================
# WIP Base Subquery
# ============================================================

def get_current_wip_subquery(days_back: int = DEFAULT_WIP_DAYS_BACK) -> str:
    """Returns subquery to get latest record per CONTAINER (current WIP snapshot).

    Uses ROW_NUMBER() analytic function for better performance.
    Only scans recent data (default 90 days) to reduce scan size.
    Filters out completed (8) and scrapped (128) status.
    Excludes DUMMY orders (MFGORDERNAME = 'DUMMY').

    Logic explanation:
    - PARTITION BY CONTAINERNAME: Groups records by each LOT
    - ORDER BY TXNDATE DESC: Orders by transaction time (newest first)
    - rn = 1: Takes only the latest record for each LOT
    - This gives us the current/latest status of each LOT

    Args:
        days_back: Number of days to look back (default 90)

    Returns:
        SQL subquery string for current WIP snapshot.
    """
    excluded_status = ', '.join(str(s) for s in WIP_EXCLUDED_STATUS)
    return f"""
        SELECT *
        FROM (
            SELECT w.*,
                   ROW_NUMBER() OVER (PARTITION BY w.CONTAINERNAME ORDER BY w.TXNDATE DESC) as rn
            FROM DW_MES_WIP w
            WHERE w.TXNDATE >= SYSDATE - {days_back}
              AND w.STATUS NOT IN ({excluded_status})
              AND (w.MFGORDERNAME IS NULL OR w.MFGORDERNAME <> 'DUMMY')
        )
        WHERE rn = 1
    """


# ============================================================
# WIP Summary Queries
# ============================================================

def query_wip_summary(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[Dict]:
    """Query current WIP summary statistics.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with summary stats or None if query fails.
    """
    connection = get_db_connection()
    if not connection:
        return None

    try:
        sql = f"""
            SELECT
                COUNT(CONTAINERNAME) as TOTAL_LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2,
                COUNT(DISTINCT SPECNAME) as SPEC_COUNT,
                COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
                COUNT(DISTINCT PRODUCTLINENAME_LEF) as PRODUCT_LINE_COUNT
            FROM ({get_current_wip_subquery(days_back)}) wip
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        result = cursor.fetchone()
        cursor.close()
        connection.close()

        if not result:
            return None
        return {
            'total_lot_count': result[0] or 0,
            'total_qty': result[1] or 0,
            'total_qty2': result[2] or 0,
            'spec_count': result[3] or 0,
            'workcenter_count': result[4] or 0,
            'product_line_count': result[5] or 0
        }
    except Exception as exc:
        if connection:
            connection.close()
        print(f"WIP summary query failed: {exc}")
        return None


def query_wip_by_spec_workcenter(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
    """Query current WIP grouped by spec and workcenter.

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with WIP by spec/workcenter or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                SPECNAME,
                WORKCENTERNAME,
                COUNT(CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2
            FROM ({get_current_wip_subquery(days_back)}) wip
            WHERE SPECNAME IS NOT NULL
              AND WORKCENTERNAME IS NOT NULL
            GROUP BY SPECNAME, WORKCENTERNAME
            ORDER BY TOTAL_QTY DESC
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"WIP by spec/workcenter query failed: {exc}")
        return None


def query_wip_by_product_line(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
    """Query current WIP grouped by product line.

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with WIP by product line or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                PRODUCTLINENAME_LEF,
                SPECNAME,
                WORKCENTERNAME,
                COUNT(CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2
            FROM ({get_current_wip_subquery(days_back)}) wip
            WHERE PRODUCTLINENAME_LEF IS NOT NULL
            GROUP BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
            ORDER BY TOTAL_QTY DESC
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"WIP by product line query failed: {exc}")
        return None


def query_wip_by_status(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
    """Query current WIP grouped by status.

    Args:
        days_back: Number of days to look back

    Returns:
        DataFrame with WIP by status or None if query fails.
    """
    try:
        sql = f"""
            SELECT
                STATUS,
                COUNT(CONTAINERNAME) as LOT_COUNT,
                SUM(QTY) as TOTAL_QTY
            FROM ({get_current_wip_subquery(days_back)}) wip
            GROUP BY STATUS
            ORDER BY LOT_COUNT DESC
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"WIP by status query failed: {exc}")
        return None


def query_wip_by_mfgorder(days_back: int = DEFAULT_WIP_DAYS_BACK, top_n: int = 100) -> Optional[pd.DataFrame]:
    """Query current WIP grouped by manufacturing order (top N).

    Args:
        days_back: Number of days to look back
        top_n: Number of top orders to return

    Returns:
        DataFrame with WIP by MFG order or None if query fails.
    """
    try:
        sql = f"""
            SELECT * FROM (
                SELECT
                    MFGORDERNAME,
                    COUNT(CONTAINERNAME) as LOT_COUNT,
                    SUM(QTY) as TOTAL_QTY,
                    SUM(QTY2) as TOTAL_QTY2
                FROM ({get_current_wip_subquery(days_back)}) wip
                WHERE MFGORDERNAME IS NOT NULL
                GROUP BY MFGORDERNAME
                ORDER BY TOTAL_QTY DESC
            ) WHERE ROWNUM <= {top_n}
        """
        return read_sql_df(sql)
    except Exception as exc:
        print(f"WIP by MFG order query failed: {exc}")
        return None


# ============================================================
# WIP Distribution Table Functions
# ============================================================

def query_wip_distribution_filter_options(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[Dict]:
    """Get filter options for WIP distribution table.

    Returns available values for packages, types, areas, and lot statuses.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with filter options or None if query fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        sql = f"""
            SELECT
                PRODUCTLINENAME_LEF,
                PJ_TYPE,
                PJ_PRODUCEREGION,
                HOLDREASONNAME
            FROM ({base_sql}) wip
        """
        df = read_sql_df(sql)

        # Extract unique values and sort
        packages = sorted([x for x in df['PRODUCTLINENAME_LEF'].dropna().unique().tolist() if x])
        types = sorted([x for x in df['PJ_TYPE'].dropna().unique().tolist() if x])
        areas = sorted([x for x in df['PJ_PRODUCEREGION'].dropna().unique().tolist() if x])

        # Lot status: based on HOLDREASONNAME - has value=Hold, no value=Active
        lot_statuses = ['Active', 'Hold']

        return {
            'packages': packages,
            'types': types,
            'areas': areas,
            'lot_statuses': lot_statuses
        }
    except Exception as exc:
        print(f"WIP filter options query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


def _build_wip_distribution_where_clause(filters: Optional[Dict]) -> str:
    """Build WHERE clause for WIP distribution queries.

    Args:
        filters: Dict with filter values

    Returns:
        SQL WHERE clause conditions string.
    """
    where_conditions = []

    if filters:
        if filters.get('packages') and len(filters['packages']) > 0:
            pkg_list = "', '".join(filters['packages'])
            where_conditions.append(f"PRODUCTLINENAME_LEF IN ('{pkg_list}')")

        if filters.get('types') and len(filters['types']) > 0:
            type_list = "', '".join(filters['types'])
            where_conditions.append(f"PJ_TYPE IN ('{type_list}')")

        if filters.get('areas') and len(filters['areas']) > 0:
            area_list = "', '".join(filters['areas'])
            where_conditions.append(f"PJ_PRODUCEREGION IN ('{area_list}')")

        # Lot status filter: Active = HOLDREASONNAME IS NULL, Hold = IS NOT NULL
        if filters.get('lot_statuses') and len(filters['lot_statuses']) > 0:
            status_conds = []
            if 'Active' in filters['lot_statuses']:
                status_conds.append("HOLDREASONNAME IS NULL")
            if 'Hold' in filters['lot_statuses']:
                status_conds.append("HOLDREASONNAME IS NOT NULL")
            if status_conds:
                where_conditions.append(f"({' OR '.join(status_conds)})")

        if filters.get('search'):
            search_term = filters['search'].replace("'", "''")
            where_conditions.append(
                f"(UPPER(MFGORDERNAME) LIKE UPPER('%{search_term}%') "
                f"OR UPPER(CONTAINERNAME) LIKE UPPER('%{search_term}%'))"
            )

    return " AND ".join(where_conditions) if where_conditions else "1=1"

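The builder above ANDs together independent filter groups, ORs the mutually exclusive lot statuses, and doubles single quotes in the free-text search term so the LIKE literal stays valid. A trimmed standalone sketch of the same assembly logic (illustrative only, not the module's function, and covering only a subset of the filters):

```python
def build_where(filters):
    # Trimmed re-implementation of the WHERE-clause assembly pattern
    conds = []
    if filters.get('packages'):
        pkg_list = "', '".join(filters['packages'])
        conds.append(f"PRODUCTLINENAME_LEF IN ('{pkg_list}')")
    # Mutually exclusive statuses combine with OR inside one parenthesized group
    status_conds = []
    if 'Active' in filters.get('lot_statuses', []):
        status_conds.append("HOLDREASONNAME IS NULL")
    if 'Hold' in filters.get('lot_statuses', []):
        status_conds.append("HOLDREASONNAME IS NOT NULL")
    if status_conds:
        conds.append(f"({' OR '.join(status_conds)})")
    if filters.get('search'):
        term = filters['search'].replace("'", "''")  # escape quotes for the SQL literal
        conds.append(f"UPPER(CONTAINERNAME) LIKE UPPER('%{term}%')")
    return " AND ".join(conds) if conds else "1=1"

clause = build_where({'packages': ['BGA'], 'lot_statuses': ['Active'], 'search': "O'BRIEN"})
```

With no filters the builder returns `1=1` so callers can always append it after `WHERE`. Note the quote-doubling only applies to the search term here, mirroring the original; the other values are interpolated as-is.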
def query_wip_distribution_pivot_columns(
    filters: Optional[Dict] = None,
    days_back: int = DEFAULT_WIP_DAYS_BACK
) -> Optional[List[Dict]]:
    """Get pivot columns for WIP distribution table.

    Returns Workcenter|Spec combinations that have data.

    Args:
        filters: Optional filter values
        days_back: Number of days to look back

    Returns:
        List of pivot column dicts or None if query fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        where_clause = _build_wip_distribution_where_clause(filters)

        sql = f"""
            SELECT
                WORKCENTERNAME,
                SPECNAME as WC_SPEC,
                COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT
            FROM ({base_sql}) wip
            WHERE WORKCENTERNAME IS NOT NULL
              AND {where_clause}
            GROUP BY WORKCENTERNAME, SPECNAME
            ORDER BY LOT_COUNT DESC
        """
        df = read_sql_df(sql)

        # Convert to pivot column list with WORKCENTER_GROUPS mapping
        pivot_columns = []
        for _, row in df.iterrows():
            wc = row['WORKCENTERNAME'] or ''
            spec = row['WC_SPEC'] or ''
            group_name, order = get_workcenter_group(wc)
            display_wc = group_name if group_name else wc

            pivot_columns.append({
                'key': f"{wc}|{spec}",
                'workcenter': wc,
                'workcenter_group': display_wc,
                'order': order,
                'spec': spec,
                'count': int(row['LOT_COUNT'] or 0)
            })

        return pivot_columns
    except Exception as exc:
        print(f"WIP pivot columns query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


def query_wip_distribution(
    filters: Optional[Dict] = None,
    limit: int = 500,
    offset: int = 0,
    days_back: int = DEFAULT_WIP_DAYS_BACK
) -> Optional[Dict]:
    """Query WIP distribution table main data.

    Returns lot details with their Workcenter|Spec positions.

    Args:
        filters: Optional filter values
        limit: Maximum rows to return
        offset: Offset for pagination
        days_back: Number of days to look back

    Returns:
        Dict with 'rows', 'total_count', 'offset', 'limit' or None if fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        where_clause = _build_wip_distribution_where_clause(filters)

        # Get total count first
        count_sql = f"""
            SELECT COUNT(DISTINCT CONTAINERNAME) as TOTAL_COUNT
            FROM ({base_sql}) wip
            WHERE {where_clause}
        """
        count_df = read_sql_df(count_sql)
        total_count = int(count_df['TOTAL_COUNT'].iloc[0]) if len(count_df) > 0 else 0

        # Paginated main data query
        start_row = offset + 1
        end_row = offset + limit
        sql = f"""
            SELECT * FROM (
                SELECT
                    MFGORDERNAME,
                    CONTAINERNAME,
                    SPECNAME,
                    PRODUCTLINENAME_LEF,
                    WAFERLOT,
                    PJ_TYPE,
                    PJ_PRODUCEREGION,
                    EQUIPMENTS,
                    WORKCENTERNAME,
                    STATUS,
                    HOLDREASONNAME,
                    QTY,
                    QTY2,
                    TXNDATE,
                    ROW_NUMBER() OVER (ORDER BY TXNDATE DESC, MFGORDERNAME, CONTAINERNAME) as rn
                FROM ({base_sql}) wip
                WHERE {where_clause}
            ) WHERE rn BETWEEN {start_row} AND {end_row}
        """
        df = read_sql_df(sql)

        # Convert to response format
        rows = []
        for _, row in df.iterrows():
            wc = row['WORKCENTERNAME'] or ''
            spec = row['SPECNAME'] or ''
            pivot_key = f"{wc}|{spec}"

            # Lot status: HOLDREASONNAME has value = Hold, no value = Active
            hold_reason = row['HOLDREASONNAME']
            lot_status = 'Hold' if (pd.notna(hold_reason) and hold_reason) else 'Active'

            rows.append({
                'MFGORDERNAME': row['MFGORDERNAME'],
                'CONTAINERNAME': row['CONTAINERNAME'],
                'SPECNAME': row['SPECNAME'],
                'PRODUCTLINENAME_LEF': row['PRODUCTLINENAME_LEF'],
                'WAFERLOT': row['WAFERLOT'],
                'PJ_TYPE': row['PJ_TYPE'],
                'PJ_PRODUCEREGION': row['PJ_PRODUCEREGION'],
                'EQUIPMENTS': row['EQUIPMENTS'],
                'WORKCENTERNAME': row['WORKCENTERNAME'],
                'LOT_STATUS': lot_status,
                'HOLDREASONNAME': hold_reason if pd.notna(hold_reason) else None,
                'QTY': int(row['QTY']) if pd.notna(row['QTY']) else 0,
                'QTY2': int(row['QTY2']) if pd.notna(row['QTY2']) else 0,
                'pivot_key': pivot_key
            })

        return {
            'rows': rows,
            'total_count': total_count,
            'offset': offset,
            'limit': limit
        }
    except Exception as exc:
        print(f"WIP distribution query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None
829
src/mes_dashboard/templates/excel_query.html
Normal file
@@ -0,0 +1,829 @@
<!DOCTYPE html>
<html lang="zh-TW">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Excel 批次查詢工具</title>
    <style>
        * {
            margin: 0;
            padding: 0;
            box-sizing: border-box;
        }

        body {
            font-family: 'Microsoft JhengHei', Arial, sans-serif;
            background: #f5f5f5;
            padding: 20px;
        }

        .container {
            max-width: 1400px;
            margin: 0 auto;
            background: white;
            border-radius: 8px;
            box-shadow: 0 2px 8px rgba(0,0,0,0.1);
        }

        .header {
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            color: white;
            padding: 30px;
            border-radius: 8px 8px 0 0;
        }

        .header h1 {
            font-size: 28px;
            margin-bottom: 10px;
        }

        .header p {
            opacity: 0.9;
            font-size: 14px;
        }

        .content {
            padding: 30px;
        }

        .step-section {
            margin-bottom: 25px;
            padding: 20px;
            border: 1px solid #e0e0e0;
            border-radius: 8px;
            background: #fafafa;
        }

        .step-section.disabled {
            opacity: 0.5;
            pointer-events: none;
        }

        .step-title {
            font-size: 16px;
            font-weight: bold;
            color: #667eea;
            margin-bottom: 15px;
            display: flex;
            align-items: center;
            gap: 10px;
        }

        .step-number {
            background: #667eea;
            color: white;
            width: 28px;
            height: 28px;
            border-radius: 50%;
            display: flex;
            align-items: center;
            justify-content: center;
            font-size: 14px;
        }

        .upload-area {
            display: flex;
            gap: 15px;
            align-items: center;
            flex-wrap: wrap;
        }

        input[type="file"] {
            padding: 10px;
            border: 2px dashed #ccc;
            border-radius: 6px;
            cursor: pointer;
        }

        input[type="file"]:hover {
            border-color: #667eea;
        }

        .btn {
            padding: 10px 20px;
            border: none;
            border-radius: 6px;
            cursor: pointer;
            font-size: 14px;
            font-weight: bold;
            transition: all 0.2s;
        }

        .btn-primary {
            background: #667eea;
            color: white;
        }

        .btn-primary:hover {
            background: #5a6fd6;
        }

        .btn-success {
            background: #28a745;
            color: white;
        }

        .btn-success:hover {
            background: #218838;
        }

        .btn:disabled {
            background: #ccc;
            cursor: not-allowed;
        }

        select {
            padding: 10px 15px;
            border: 1px solid #ddd;
            border-radius: 6px;
            font-size: 14px;
            min-width: 200px;
            background: white;
        }

        select:focus {
            outline: none;
            border-color: #667eea;
        }

        .info-box {
            background: #e3e8ff;
            color: #667eea;
            padding: 10px 15px;
            border-radius: 6px;
            font-size: 13px;
            margin-top: 10px;
        }

        .info-box.warning {
            background: #fff3cd;
            color: #856404;
        }

        .checkbox-group {
            display: flex;
            flex-wrap: wrap;
            gap: 10px;
            margin-top: 10px;
            max-height: 200px;
            overflow-y: auto;
            padding: 10px;
            border: 1px solid #e0e0e0;
            border-radius: 6px;
            background: white;
        }

        .checkbox-item {
            display: flex;
            align-items: center;
            gap: 5px;
            padding: 5px 10px;
            background: #f8f9fa;
            border-radius: 4px;
            font-size: 13px;
        }

        .checkbox-item:hover {
            background: #e3e8ff;
        }

        .checkbox-item input[type="checkbox"] {
            cursor: pointer;
        }

        .select-all-bar {
            display: flex;
            gap: 10px;
            margin-bottom: 10px;
        }

        .select-all-bar button {
            padding: 5px 10px;
            font-size: 12px;
            border: 1px solid #ddd;
            background: white;
            border-radius: 4px;
            cursor: pointer;
        }

        .select-all-bar button:hover {
            background: #f0f0f0;
        }

        .result-section {
            margin-top: 30px;
            display: none;
        }

        .result-section.active {
            display: block;
        }

        .result-header {
            background: #667eea;
            color: white;
            padding: 15px 20px;
            border-radius: 6px 6px 0 0;
            display: flex;
            justify-content: space-between;
            align-items: center;
        }

        .result-stats {
            display: flex;
            gap: 20px;
            font-size: 14px;
        }

        .table-container {
            overflow-x: auto;
            max-height: 500px;
            border: 1px solid #e0e0e0;
            border-top: none;
            border-radius: 0 0 6px 6px;
        }

        table {
            width: 100%;
            border-collapse: collapse;
            font-size: 13px;
        }

        thead {
            position: sticky;
            top: 0;
            background: #f8f9fa;
            z-index: 10;
        }

        th {
            padding: 12px 10px;
            text-align: left;
            border-bottom: 2px solid #dee2e6;
            font-weight: 600;
            color: #495057;
            white-space: nowrap;
        }

        td {
            padding: 10px;
            border-bottom: 1px solid #f0f0f0;
            color: #333;
        }

        tr:hover {
            background: #f8f9fa;
        }

        .loading {
            text-align: center;
            padding: 40px;
            color: #667eea;
        }

        .loading-spinner {
            display: inline-block;
            width: 30px;
            height: 30px;
            border: 3px solid #f3f3f3;
            border-top: 3px solid #667eea;
            border-radius: 50%;
            animation: spin 1s linear infinite;
            margin-bottom: 10px;
        }

        @keyframes spin {
            0% { transform: rotate(0deg); }
            100% { transform: rotate(360deg); }
        }

        .error {
            background: #fee;
            color: #c33;
            padding: 15px;
            border-radius: 6px;
            margin-top: 10px;
        }

        .preview-table {
            margin-top: 15px;
            font-size: 12px;
        }

        .preview-table th {
            background: #e9ecef;
            padding: 8px;
        }

        .preview-table td {
            padding: 8px;
max-width: 150px;
|
||||
overflow: hidden;
|
||||
text-overflow: ellipsis;
|
||||
white-space: nowrap;
|
||||
}
|
||||
|
||||
.row {
|
||||
display: flex;
|
||||
gap: 20px;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.col {
|
||||
flex: 1;
|
||||
min-width: 300px;
|
||||
}
|
||||
|
||||
label {
|
||||
display: block;
|
||||
margin-bottom: 8px;
|
||||
font-weight: 500;
|
||||
color: #333;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<div class="header">
|
||||
<h1>Excel 批次查詢工具</h1>
|
||||
<p>上傳 Excel 檔案,批次查詢資料庫並匯出結果</p>
|
||||
</div>
|
||||
|
||||
<div class="content">
|
||||
<!-- Step 1: Upload Excel -->
|
||||
<div class="step-section" id="step1">
|
||||
<div class="step-title">
|
||||
<span class="step-number">1</span>
|
||||
上傳 Excel 檔案
|
||||
</div>
|
||||
<div class="upload-area">
|
||||
<input type="file" id="excelFile" accept=".xlsx,.xls">
|
||||
<button class="btn btn-primary" onclick="uploadExcel()">上傳</button>
|
||||
</div>
|
||||
<div id="uploadInfo"></div>
|
||||
<div id="previewTable"></div>
|
||||
</div>
|
||||
|
||||
<!-- Step 2: Select Excel Column -->
|
||||
<div class="step-section disabled" id="step2">
|
||||
<div class="step-title">
|
||||
<span class="step-number">2</span>
|
||||
選擇 Excel 欄位(作為查詢值)
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col">
|
||||
<label>選擇欄位:</label>
|
||||
<select id="excelColumn" onchange="loadColumnValues()">
|
||||
<option value="">-- 請選擇 --</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div id="columnInfo"></div>
|
||||
</div>
|
||||
|
||||
<!-- Step 3: Select Target Table -->
|
||||
<div class="step-section disabled" id="step3">
|
||||
<div class="step-title">
|
||||
<span class="step-number">3</span>
|
||||
選擇目標資料表
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col">
|
||||
<label>資料表:</label>
|
||||
<select id="targetTable" onchange="loadTableColumns()">
|
||||
<option value="">-- 請選擇 --</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div id="tableInfo"></div>
|
||||
</div>
|
||||
|
||||
<!-- Step 4: Select Columns -->
|
||||
<div class="step-section disabled" id="step4">
|
||||
<div class="step-title">
|
||||
<span class="step-number">4</span>
|
||||
選擇查詢欄位與回傳欄位
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col">
|
||||
<label>查詢欄位(WHERE IN):</label>
|
||||
<select id="searchColumn">
|
||||
<option value="">-- 請選擇 --</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div style="margin-top: 15px;">
|
||||
<label>回傳欄位(可多選):</label>
|
||||
<div class="select-all-bar">
|
||||
<button onclick="selectAllColumns()">全選</button>
|
||||
<button onclick="deselectAllColumns()">取消全選</button>
|
||||
</div>
|
||||
<div class="checkbox-group" id="returnColumns"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Step 5: Execute -->
|
||||
<div class="step-section disabled" id="step5">
|
||||
<div class="step-title">
|
||||
<span class="step-number">5</span>
|
||||
執行查詢
|
||||
</div>
|
||||
<div style="display: flex; gap: 15px;">
|
||||
<button class="btn btn-primary" onclick="executeQuery()">查詢預覽</button>
|
||||
<button class="btn btn-success" onclick="exportCSV()">匯出 CSV</button>
|
||||
</div>
|
||||
<div id="executeInfo"></div>
|
||||
</div>
|
||||
|
||||
<!-- Result Section -->
|
||||
<div class="result-section" id="resultSection">
|
||||
<div class="result-header">
|
||||
<h3>查詢結果</h3>
|
||||
<div class="result-stats" id="resultStats"></div>
|
||||
</div>
|
||||
<div class="table-container">
|
||||
<div id="resultTable"></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
    <script>
        // State
        let excelColumns = [];
        let searchValues = [];
        let tableColumns = [];
        let queryResult = null;

        // Step 1: Upload Excel
        async function uploadExcel() {
            const fileInput = document.getElementById('excelFile');
            const file = fileInput.files[0];

            if (!file) {
                alert('請選擇檔案');
                return;
            }

            const formData = new FormData();
            formData.append('file', file);

            document.getElementById('uploadInfo').innerHTML = '<div class="loading"><div class="loading-spinner"></div><br>上傳中...</div>';

            try {
                const response = await fetch('/api/excel-query/upload', {
                    method: 'POST',
                    body: formData
                });
                const data = await response.json();

                if (data.error) {
                    document.getElementById('uploadInfo').innerHTML = `<div class="error">${data.error}</div>`;
                    return;
                }

                excelColumns = data.columns;
                document.getElementById('uploadInfo').innerHTML = `
                    <div class="info-box">
                        檔案上傳成功!共 ${data.total_rows} 行,${data.columns.length} 欄
                    </div>
                `;

                // Show preview table
                renderPreviewTable(data.columns, data.preview);

                // Populate Excel column dropdown
                const select = document.getElementById('excelColumn');
                select.innerHTML = '<option value="">-- 請選擇 --</option>';
                excelColumns.forEach(col => {
                    select.innerHTML += `<option value="${col}">${col}</option>`;
                });

                // Enable step 2
                document.getElementById('step2').classList.remove('disabled');

                // Load available tables
                loadTables();

            } catch (error) {
                document.getElementById('uploadInfo').innerHTML = `<div class="error">上傳失敗: ${error.message}</div>`;
            }
        }

        function renderPreviewTable(columns, data) {
            if (!data || data.length === 0) return;

            let html = '<table class="preview-table"><thead><tr>';
            columns.forEach(col => {
                html += `<th>${col}</th>`;
            });
            html += '</tr></thead><tbody>';

            data.forEach(row => {
                html += '<tr>';
                columns.forEach(col => {
                    const val = row[col] !== null && row[col] !== undefined ? row[col] : '';
                    html += `<td title="${val}">${val}</td>`;
                });
                html += '</tr>';
            });
            html += '</tbody></table>';

            document.getElementById('previewTable').innerHTML = html;
        }

        // Step 2: Load column values
        async function loadColumnValues() {
            const column = document.getElementById('excelColumn').value;
            if (!column) {
                searchValues = [];
                document.getElementById('columnInfo').innerHTML = '';
                return;
            }

            document.getElementById('columnInfo').innerHTML = '<div class="loading"><div class="loading-spinner"></div><br>讀取中...</div>';

            try {
                const response = await fetch('/api/excel-query/column-values', {
                    method: 'POST',
                    headers: { 'Content-Type': 'application/json' },
                    body: JSON.stringify({ column_name: column })
                });
                const data = await response.json();

                if (data.error) {
                    document.getElementById('columnInfo').innerHTML = `<div class="error">${data.error}</div>`;
                    return;
                }

                searchValues = data.values;
                const warningClass = data.count > 1000 ? ' warning' : '';
                document.getElementById('columnInfo').innerHTML = `
                    <div class="info-box${warningClass}">
                        共 ${data.count} 個不重複值
                        ${data.count > 1000 ? '(將分批查詢,每批 1000 筆)' : ''}
                    </div>
                `;

                // Enable step 3
                document.getElementById('step3').classList.remove('disabled');

            } catch (error) {
                document.getElementById('columnInfo').innerHTML = `<div class="error">讀取失敗: ${error.message}</div>`;
            }
        }

        // Load available tables
        async function loadTables() {
            try {
                const response = await fetch('/api/excel-query/tables');
                const data = await response.json();

                const select = document.getElementById('targetTable');
                select.innerHTML = '<option value="">-- 請選擇 --</option>';

                data.tables.forEach(table => {
                    select.innerHTML += `<option value="${table.name}">${table.display_name} (${table.name})</option>`;
                });
            } catch (error) {
                console.error('Failed to load tables:', error);
            }
        }

        // Step 3: Load table columns
        async function loadTableColumns() {
            const tableName = document.getElementById('targetTable').value;
            if (!tableName) {
                tableColumns = [];
                document.getElementById('tableInfo').innerHTML = '';
                return;
            }

            document.getElementById('tableInfo').innerHTML = '<div class="loading"><div class="loading-spinner"></div><br>讀取欄位...</div>';

            try {
                const response = await fetch('/api/excel-query/table-columns', {
                    method: 'POST',
                    headers: { 'Content-Type': 'application/json' },
                    body: JSON.stringify({ table_name: tableName })
                });
                const data = await response.json();

                if (data.error) {
                    document.getElementById('tableInfo').innerHTML = `<div class="error">${data.error}</div>`;
                    return;
                }

                tableColumns = data.columns;
                document.getElementById('tableInfo').innerHTML = `
                    <div class="info-box">共 ${data.columns.length} 個欄位</div>
                `;

                // Populate search column dropdown
                const searchSelect = document.getElementById('searchColumn');
                searchSelect.innerHTML = '<option value="">-- 請選擇 --</option>';
                tableColumns.forEach(col => {
                    searchSelect.innerHTML += `<option value="${col}">${col}</option>`;
                });

                // Populate return columns checkboxes
                const container = document.getElementById('returnColumns');
                container.innerHTML = '';
                tableColumns.forEach(col => {
                    container.innerHTML += `
                        <label class="checkbox-item">
                            <input type="checkbox" value="${col}" checked>
                            ${col}
                        </label>
                    `;
                });

                // Enable step 4 and 5
                document.getElementById('step4').classList.remove('disabled');
                document.getElementById('step5').classList.remove('disabled');

            } catch (error) {
                document.getElementById('tableInfo').innerHTML = `<div class="error">讀取失敗: ${error.message}</div>`;
            }
        }

        function selectAllColumns() {
            document.querySelectorAll('#returnColumns input[type="checkbox"]').forEach(cb => cb.checked = true);
        }

        function deselectAllColumns() {
            document.querySelectorAll('#returnColumns input[type="checkbox"]').forEach(cb => cb.checked = false);
        }

        function getSelectedReturnColumns() {
            const checkboxes = document.querySelectorAll('#returnColumns input[type="checkbox"]:checked');
            return Array.from(checkboxes).map(cb => cb.value);
        }

        function getQueryParams() {
            return {
                table_name: document.getElementById('targetTable').value,
                search_column: document.getElementById('searchColumn').value,
                return_columns: getSelectedReturnColumns(),
                search_values: searchValues
            };
        }

        function validateQuery() {
            const params = getQueryParams();

            if (!params.table_name) {
                alert('請選擇資料表');
                return false;
            }
            if (!params.search_column) {
                alert('請選擇查詢欄位');
                return false;
            }
            if (params.return_columns.length === 0) {
                alert('請至少選擇一個回傳欄位');
                return false;
            }
            if (params.search_values.length === 0) {
                alert('無查詢值,請先選擇 Excel 欄位');
                return false;
            }
            return true;
        }

        // Step 5: Execute query
        async function executeQuery() {
            if (!validateQuery()) return;

            const params = getQueryParams();
            const batchCount = Math.ceil(params.search_values.length / 1000);

            document.getElementById('executeInfo').innerHTML = `
                <div class="loading">
                    <div class="loading-spinner"></div><br>
                    查詢中... (${params.search_values.length} 筆,${batchCount} 批次)
                </div>
            `;
            document.getElementById('resultSection').classList.remove('active');

            try {
                const response = await fetch('/api/excel-query/execute', {
                    method: 'POST',
                    headers: { 'Content-Type': 'application/json' },
                    body: JSON.stringify(params)
                });
                const data = await response.json();

                if (data.error) {
                    document.getElementById('executeInfo').innerHTML = `<div class="error">${data.error}</div>`;
                    return;
                }

                queryResult = data;
                document.getElementById('executeInfo').innerHTML = `
                    <div class="info-box">
                        查詢完成!搜尋 ${data.search_count} 筆,找到 ${data.row_count} 筆結果
                    </div>
                `;

                renderResult(data);

            } catch (error) {
                document.getElementById('executeInfo').innerHTML = `<div class="error">查詢失敗: ${error.message}</div>`;
            }
        }

        function renderResult(data) {
            const section = document.getElementById('resultSection');
            const statsDiv = document.getElementById('resultStats');
            const tableDiv = document.getElementById('resultTable');

            statsDiv.innerHTML = `
                <span>搜尋值: ${data.search_count}</span>
                <span>結果: ${data.row_count} 筆</span>
                ${data.batch_count > 1 ? `<span>批次: ${data.batch_count}</span>` : ''}
            `;

            if (data.data.length === 0) {
                tableDiv.innerHTML = '<div style="padding: 40px; text-align: center; color: #999;">查無資料</div>';
            } else {
                let html = '<table><thead><tr>';
                data.columns.forEach(col => {
                    html += `<th>${col}</th>`;
                });
                html += '</tr></thead><tbody>';

                // Show first 1000 rows in preview
                const previewData = data.data.slice(0, 1000);
                previewData.forEach(row => {
                    html += '<tr>';
                    data.columns.forEach(col => {
                        const val = row[col] !== null && row[col] !== undefined ? row[col] : '<i style="color:#999">NULL</i>';
                        html += `<td>${val}</td>`;
                    });
                    html += '</tr>';
                });
                html += '</tbody></table>';

                if (data.data.length > 1000) {
                    html += `<div style="padding: 15px; text-align: center; color: #666; background: #f8f9fa;">
                        顯示前 1000 筆,完整資料請匯出 CSV
                    </div>`;
                }

                tableDiv.innerHTML = html;
            }

            section.classList.add('active');
            section.scrollIntoView({ behavior: 'smooth' });
        }

        // Export CSV
        async function exportCSV() {
            if (!validateQuery()) return;

            const params = getQueryParams();
            const batchCount = Math.ceil(params.search_values.length / 1000);

            document.getElementById('executeInfo').innerHTML = `
                <div class="loading">
                    <div class="loading-spinner"></div><br>
                    匯出中... (${params.search_values.length} 筆,${batchCount} 批次)
                </div>
            `;

            try {
                const response = await fetch('/api/excel-query/export-csv', {
                    method: 'POST',
                    headers: { 'Content-Type': 'application/json' },
                    body: JSON.stringify(params)
                });

                if (!response.ok) {
                    const data = await response.json();
                    document.getElementById('executeInfo').innerHTML = `<div class="error">${data.error || '匯出失敗'}</div>`;
                    return;
                }

                // Download file
                const blob = await response.blob();
                const url = window.URL.createObjectURL(blob);
                const a = document.createElement('a');
                a.href = url;
                a.download = 'query_result.csv';
                document.body.appendChild(a);
                a.click();
                document.body.removeChild(a);
                window.URL.revokeObjectURL(url);

                document.getElementById('executeInfo').innerHTML = `
                    <div class="info-box">CSV 匯出完成!</div>
                `;

            } catch (error) {
                document.getElementById('executeInfo').innerHTML = `<div class="error">匯出失敗: ${error.message}</div>`;
            }
        }
    </script>
</body>
</html>
File diff suppressed because it is too large
@@ -97,14 +97,18 @@

    <div class="tabs">
        <button class="tab active" data-target="wipFrame">WIP 在制品報表</button>
        <button class="tab" data-target="wipOverviewFrame">WIP 即時概況</button>
        <button class="tab" data-target="resourceFrame">機台狀態報表</button>
        <button class="tab" data-target="tableFrame">數據表查詢工具</button>
        <button class="tab" data-target="excelQueryFrame">Excel 批次查詢</button>
    </div>

    <div class="panel">
        <iframe id="wipFrame" class="active" src="/wip" title="WIP 在制品報表"></iframe>
        <iframe id="wipOverviewFrame" src="/wip-overview" title="WIP 即時概況"></iframe>
        <iframe id="resourceFrame" src="/resource" title="機台狀態報表"></iframe>
        <iframe id="tableFrame" src="/tables" title="數據表查詢工具"></iframe>
        <iframe id="excelQueryFrame" src="/excel-query" title="Excel 批次查詢"></iframe>
    </div>
</div>
File diff suppressed because it is too large

606	src/mes_dashboard/templates/wip_overview.html	Normal file

@@ -0,0 +1,606 @@
<!DOCTYPE html>
<html lang="zh-TW">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>各站 WIP 即時概況</title>
    <script src="https://cdn.jsdelivr.net/npm/echarts@5.4.3/dist/echarts.min.js"></script>
    <style>
        :root {
            --bg: #f5f7fa;
            --card-bg: #ffffff;
            --text: #222;
            --muted: #666;
            --border: #e2e6ef;
            --primary: #667eea;
            --primary-dark: #5568d3;
            --shadow: 0 2px 10px rgba(0,0,0,0.08);
            --shadow-strong: 0 4px 15px rgba(102, 126, 234, 0.2);
            --success: #22c55e;
            --danger: #ef4444;
            --warning: #f59e0b;
        }

        * {
            margin: 0;
            padding: 0;
            box-sizing: border-box;
        }

        body {
            font-family: 'Microsoft JhengHei', Arial, sans-serif;
            background: var(--bg);
            color: var(--text);
            min-height: 100vh;
        }

        .dashboard {
            max-width: 1900px;
            margin: 0 auto;
            padding: 20px;
        }

        /* Header */
        .header {
            display: flex;
            justify-content: space-between;
            align-items: center;
            flex-wrap: wrap;
            gap: 12px;
            padding: 18px 22px;
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            border-radius: 10px;
            margin-bottom: 16px;
            box-shadow: var(--shadow-strong);
        }

        .header h1 {
            font-size: 24px;
            color: #fff;
        }

        .header-right {
            display: flex;
            align-items: center;
            gap: 16px;
            flex-wrap: wrap;
        }

        .last-update {
            color: rgba(255, 255, 255, 0.8);
            font-size: 13px;
        }

        .btn {
            padding: 9px 20px;
            border: none;
            border-radius: 8px;
            font-size: 13px;
            font-weight: 600;
            cursor: pointer;
            transition: all 0.2s ease;
        }

        .btn-light {
            background: rgba(255,255,255,0.2);
            color: white;
        }

        .btn-light:hover {
            background: rgba(255,255,255,0.3);
        }

        /* Summary Cards */
        .summary-row {
            display: grid;
            grid-template-columns: repeat(6, 1fr);
            gap: 14px;
            margin-bottom: 16px;
        }

        .summary-card {
            background: var(--card-bg);
            border-radius: 10px;
            padding: 16px 20px;
            text-align: center;
            border: 1px solid var(--border);
            box-shadow: var(--shadow);
        }

        .summary-label {
            font-size: 12px;
            color: var(--muted);
            margin-bottom: 6px;
        }

        .summary-value {
            font-size: 26px;
            font-weight: bold;
            color: var(--primary);
        }

        /* Chart Grid */
        .chart-grid {
            display: grid;
            grid-template-columns: repeat(2, 1fr);
            gap: 16px;
            margin-bottom: 16px;
        }

        .chart-card {
            background: var(--card-bg);
            border-radius: 10px;
            box-shadow: var(--shadow);
            overflow: hidden;
        }

        .chart-header {
            padding: 14px 20px;
            border-bottom: 1px solid var(--border);
            background: #fafbfc;
        }

        .chart-title {
            font-size: 15px;
            font-weight: 600;
            color: var(--text);
        }

        .chart-container {
            padding: 16px;
            height: 350px;
        }

        /* Loading */
        .loading-overlay {
            position: fixed;
            top: 0;
            left: 0;
            right: 0;
            bottom: 0;
            background: rgba(255, 255, 255, 0.9);
            display: flex;
            align-items: center;
            justify-content: center;
            z-index: 100;
        }

        .loading-spinner {
            display: inline-block;
            width: 24px;
            height: 24px;
            border: 3px solid var(--border);
            border-top-color: var(--primary);
            border-radius: 50%;
            animation: spin 0.8s linear infinite;
            margin-right: 10px;
        }

        @keyframes spin {
            to { transform: rotate(360deg); }
        }

        .placeholder {
            display: flex;
            align-items: center;
            justify-content: center;
            height: 100%;
            color: var(--muted);
        }

        /* Responsive */
        @media (max-width: 1400px) {
            .summary-row {
                grid-template-columns: repeat(3, 1fr);
            }
        }

        @media (max-width: 1000px) {
            .chart-grid {
                grid-template-columns: 1fr;
            }
        }

        @media (max-width: 768px) {
            .summary-row {
                grid-template-columns: repeat(2, 1fr);
            }
        }
    </style>
</head>
<body>
    <div class="dashboard">
        <!-- Header -->
        <div class="header">
            <h1>各站 WIP 即時概況</h1>
            <div class="header-right">
                <span id="lastUpdate" class="last-update"></span>
                <button class="btn btn-light" onclick="loadAllData()">重新整理</button>
            </div>
        </div>

        <!-- Summary Cards -->
        <div class="summary-row">
            <div class="summary-card">
                <div class="summary-label">總 LOT 數</div>
                <div class="summary-value" id="totalLots">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">總數量 (QTY)</div>
                <div class="summary-value" id="totalQty">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">總片數 (QTY2)</div>
                <div class="summary-value" id="totalQty2">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">SPEC 數</div>
                <div class="summary-value" id="specCount">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">工站數</div>
                <div class="summary-value" id="wcCount">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">產品線數</div>
                <div class="summary-value" id="plCount">-</div>
            </div>
        </div>

        <!-- Chart Grid -->
        <div class="chart-grid">
            <div class="chart-card">
                <div class="chart-header">
                    <div class="chart-title">各站 WIP 分布</div>
                </div>
                <div class="chart-container" id="chartWorkcenter"></div>
            </div>
            <div class="chart-card">
                <div class="chart-header">
                    <div class="chart-title">狀態分布</div>
                </div>
                <div class="chart-container" id="chartStatus"></div>
            </div>
            <div class="chart-card">
                <div class="chart-header">
                    <div class="chart-title">產品線分布</div>
                </div>
                <div class="chart-container" id="chartProductLine"></div>
            </div>
            <div class="chart-card">
                <div class="chart-header">
                    <div class="chart-title">Top 20 工單</div>
                </div>
                <div class="chart-container" id="chartMfgOrder"></div>
            </div>
        </div>
    </div>

    <!-- Loading Overlay -->
    <div class="loading-overlay" id="loadingOverlay">
        <span class="loading-spinner"></span>
        <span>載入中...</span>
    </div>
    <script>
        // Chart instances
        let chartWorkcenter = null;
        let chartStatus = null;
        let chartProductLine = null;
        let chartMfgOrder = null;

        // Format number with commas
        function formatNumber(num) {
            if (num === null || num === undefined || num === '-') return '-';
            return num.toLocaleString('zh-TW');
        }

        // Status name mapping
        const STATUS_MAP = {
            1: 'Queue',
            2: 'Run',
            4: 'Hold'
        };

        // Initialize charts
        function initCharts() {
            chartWorkcenter = echarts.init(document.getElementById('chartWorkcenter'));
            chartStatus = echarts.init(document.getElementById('chartStatus'));
            chartProductLine = echarts.init(document.getElementById('chartProductLine'));
            chartMfgOrder = echarts.init(document.getElementById('chartMfgOrder'));

            // Handle resize
            window.addEventListener('resize', () => {
                chartWorkcenter.resize();
                chartStatus.resize();
                chartProductLine.resize();
                chartMfgOrder.resize();
            });
        }

        // Load summary data
        async function loadSummary() {
            try {
                const response = await fetch('/api/wip/summary');
                const result = await response.json();

                if (result.success) {
                    const data = result.data;
                    document.getElementById('totalLots').textContent = formatNumber(data.total_lot_count);
                    document.getElementById('totalQty').textContent = formatNumber(data.total_qty);
                    document.getElementById('totalQty2').textContent = formatNumber(data.total_qty2);
                    document.getElementById('specCount').textContent = formatNumber(data.spec_count);
                    document.getElementById('wcCount').textContent = formatNumber(data.workcenter_count);
                    document.getElementById('plCount').textContent = formatNumber(data.product_line_count);
                }
            } catch (error) {
                console.error('Summary load failed:', error);
            }
        }

        // Load workcenter chart
        async function loadWorkcenterChart() {
            try {
                const response = await fetch('/api/wip/by_spec_workcenter');
                const result = await response.json();

                if (result.success && result.data.length > 0) {
                    // Aggregate by workcenter
                    const wcMap = {};
                    result.data.forEach(row => {
                        const wc = row.WORKCENTERNAME || '(Unknown)';
                        if (!wcMap[wc]) wcMap[wc] = 0;
                        wcMap[wc] += row.TOTAL_QTY || 0;
                    });

                    // Sort by value desc and take top 15
                    const sorted = Object.entries(wcMap)
                        .sort((a, b) => b[1] - a[1])
                        .slice(0, 15);

                    const categories = sorted.map(x => x[0]);
                    const values = sorted.map(x => x[1]);

                    chartWorkcenter.setOption({
                        tooltip: {
                            trigger: 'axis',
                            axisPointer: { type: 'shadow' }
                        },
                        grid: {
                            left: '3%',
                            right: '4%',
                            bottom: '3%',
                            top: '3%',
                            containLabel: true
                        },
                        xAxis: {
                            type: 'value'
                        },
                        yAxis: {
                            type: 'category',
                            data: categories.reverse(),
                            axisLabel: {
                                fontSize: 11,
                                width: 100,
                                overflow: 'truncate'
                            }
                        },
                        series: [{
                            type: 'bar',
                            data: values.reverse(),
                            itemStyle: {
                                color: new echarts.graphic.LinearGradient(0, 0, 1, 0, [
                                    { offset: 0, color: '#667eea' },
                                    { offset: 1, color: '#764ba2' }
                                ])
                            },
                            label: {
                                show: true,
                                position: 'right',
                                fontSize: 11,
                                formatter: params => formatNumber(params.value)
                            }
                        }]
                    });
                } else {
                    chartWorkcenter.setOption({
                        title: { text: '無資料', left: 'center', top: 'center', textStyle: { color: '#999' } }
                    });
                }
            } catch (error) {
                console.error('Workcenter chart load failed:', error);
            }
        }

        // Load status chart
        async function loadStatusChart() {
            try {
                const response = await fetch('/api/wip/by_status');
                const result = await response.json();

                if (result.success && result.data.length > 0) {
                    const pieData = result.data.map(row => ({
                        name: STATUS_MAP[row.STATUS] || `Status ${row.STATUS}`,
                        value: row.TOTAL_QTY || 0
                    }));

                    chartStatus.setOption({
                        tooltip: {
                            trigger: 'item',
                            formatter: '{b}: {c} ({d}%)'
                        },
                        legend: {
                            orient: 'vertical',
                            right: '5%',
                            top: 'center'
                        },
                        series: [{
                            type: 'pie',
                            radius: ['40%', '70%'],
                            center: ['40%', '50%'],
                            avoidLabelOverlap: true,
                            itemStyle: {
                                borderRadius: 6,
                                borderColor: '#fff',
                                borderWidth: 2
                            },
                            label: {
                                show: true,
                                formatter: '{b}\n{d}%'
                            },
                            data: pieData,
                            color: ['#667eea', '#22c55e', '#ef4444', '#f59e0b', '#8b5cf6']
                        }]
                    });
                } else {
                    chartStatus.setOption({
                        title: { text: '無資料', left: 'center', top: 'center', textStyle: { color: '#999' } }
                    });
                }
            } catch (error) {
                console.error('Status chart load failed:', error);
            }
        }

        // Load product line chart
        async function loadProductLineChart() {
            try {
                const response = await fetch('/api/wip/by_product_line');
                const result = await response.json();

                if (result.success && result.summary && result.summary.length > 0) {
                    const pieData = result.summary
                        .sort((a, b) => b.TOTAL_QTY - a.TOTAL_QTY)
                        .slice(0, 10)
                        .map(row => ({
                            name: row.PRODUCTLINENAME_LEF || '(Unknown)',
                            value: row.TOTAL_QTY || 0
                        }));

                    chartProductLine.setOption({
                        tooltip: {
                            trigger: 'item',
                            formatter: '{b}: {c} ({d}%)'
                        },
                        legend: {
                            orient: 'vertical',
                            right: '5%',
                            top: 'center',
                            formatter: name => name.length > 12 ? name.slice(0, 12) + '...' : name
                        },
                        series: [{
                            type: 'pie',
                            radius: ['40%', '70%'],
                            center: ['40%', '50%'],
                            avoidLabelOverlap: true,
                            itemStyle: {
                                borderRadius: 6,
                                borderColor: '#fff',
                                borderWidth: 2
                            },
                            label: {
                                show: false
                            },
                            data: pieData,
                            color: ['#667eea', '#764ba2', '#22c55e', '#f59e0b', '#ef4444', '#8b5cf6', '#06b6d4', '#ec4899', '#84cc16', '#6366f1']
                        }]
                    });
                } else {
                    chartProductLine.setOption({
                        title: { text: '無資料', left: 'center', top: 'center', textStyle: { color: '#999' } }
                    });
                }
            } catch (error) {
                console.error('Product line chart load failed:', error);
            }
        }

        // Load mfg order chart
        async function loadMfgOrderChart() {
            try {
                const response = await fetch('/api/wip/by_mfgorder?limit=20');
                const result = await response.json();

                if (result.success && result.data.length > 0) {
                    const categories = result.data.map(row => row.MFGORDERNAME || '(Unknown)');
                    const values = result.data.map(row => row.TOTAL_QTY || 0);

                    chartMfgOrder.setOption({
                        tooltip: {
                            trigger: 'axis',
                            axisPointer: { type: 'shadow' }
                        },
                        grid: {
                            left: '3%',
                            right: '4%',
                            bottom: '3%',
                            top: '3%',
                            containLabel: true
                        },
                        xAxis: {
                            type: 'value'
                        },
                        yAxis: {
type: 'category',
|
||||
data: categories.reverse(),
|
||||
axisLabel: {
|
||||
fontSize: 10,
|
||||
width: 100,
|
||||
overflow: 'truncate'
|
||||
}
|
||||
},
|
||||
series: [{
|
||||
type: 'bar',
|
||||
data: values.reverse(),
|
||||
itemStyle: {
|
||||
color: new echarts.graphic.LinearGradient(0, 0, 1, 0, [
|
||||
{ offset: 0, color: '#22c55e' },
|
||||
{ offset: 1, color: '#16a34a' }
|
||||
])
|
||||
},
|
||||
label: {
|
||||
show: true,
|
||||
position: 'right',
|
||||
fontSize: 10,
|
||||
formatter: params => formatNumber(params.value)
|
||||
}
|
||||
}]
|
||||
});
|
||||
} else {
|
||||
chartMfgOrder.setOption({
|
||||
title: { text: '無資料', left: 'center', top: 'center', textStyle: { color: '#999' } }
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Mfg order chart load failed:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Load all data
|
||||
async function loadAllData() {
|
||||
document.getElementById('loadingOverlay').style.display = 'flex';
|
||||
|
||||
try {
|
||||
await Promise.all([
|
||||
loadSummary(),
|
||||
loadWorkcenterChart(),
|
||||
loadStatusChart(),
|
||||
loadProductLineChart(),
|
||||
loadMfgOrderChart()
|
||||
]);
|
||||
|
||||
document.getElementById('lastUpdate').textContent =
|
||||
`Last Update: ${new Date().toLocaleString('zh-TW')}`;
|
||||
} finally {
|
||||
document.getElementById('loadingOverlay').style.display = 'none';
|
||||
}
|
||||
}
|
||||
|
||||
// Initialize on load
|
||||
window.onload = function() {
|
||||
initCharts();
|
||||
loadAllData();
|
||||
};
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
File diff suppressed because it is too large
Load Diff

49
tests/test_app_factory.py
Normal file

@@ -0,0 +1,49 @@
import unittest

from mes_dashboard.app import create_app
from mes_dashboard.core.cache import NoOpCache
import mes_dashboard.core.database as db


class AppFactoryTests(unittest.TestCase):
    def setUp(self):
        db._ENGINE = None

    def test_create_app_default_config(self):
        app = create_app()
        self.assertTrue(app.config.get("DEBUG"))
        self.assertEqual(app.config.get("ENV"), "development")
        self.assertIsInstance(app.extensions.get("cache"), NoOpCache)

    def test_create_app_production_config(self):
        app = create_app("production")
        self.assertFalse(app.config.get("DEBUG"))
        self.assertEqual(app.config.get("ENV"), "production")

    def test_create_app_independent_instances(self):
        app1 = create_app()
        db._ENGINE = None
        app2 = create_app()
        self.assertIsNot(app1, app2)

    def test_routes_registered(self):
        app = create_app()
        rules = {rule.rule for rule in app.url_map.iter_rules()}
        expected = {
            "/",
            "/tables",
            "/wip",
            "/resource",
            "/wip-overview",
            "/excel-query",
            "/api/wip/summary",
            "/api/resource/summary",
            "/api/dashboard/kpi",
            "/api/excel-query/upload",
        }
        missing = expected - rules
        self.assertFalse(missing, f"Missing routes: {sorted(missing)}")


if __name__ == "__main__":
    unittest.main()
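The tests above rely on the core property of the Application Factory pattern: every `create_app()` call builds a fresh, independently configured instance, so tests never share state through a module-level app. The sketch below illustrates that property with self-contained stand-ins; the class and config names are illustrative, not the actual `mes_dashboard` implementation.

```python
# Minimal, self-contained sketch of the Application Factory pattern the
# tests above exercise. These classes are illustrative stand-ins, NOT the
# actual mes_dashboard implementation.

CONFIGS = {
    "development": {"DEBUG": True, "ENV": "development"},
    "production": {"DEBUG": False, "ENV": "production"},
}


class NoOpCache:
    """Cache placeholder that stores nothing (swap in a real backend later)."""

    def get(self, key):
        return None

    def set(self, key, value):
        pass


class App:
    """Tiny stand-in for flask.Flask, with config and extensions dicts."""

    def __init__(self):
        self.config = {}
        self.extensions = {}


def create_app(config_name="development"):
    """Build a fresh app per call -- no module-level singleton."""
    app = App()
    app.config.update(CONFIGS[config_name])
    app.extensions["cache"] = NoOpCache()
    return app


dev = create_app()
prod = create_app("production")
```

Because every call returns a new instance, each test case can build its own isolated app, which is what `test_create_app_independent_instances` verifies.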
@@ -1,19 +1,19 @@
"""
Generate the MES database reference document
(for report-development reference)
"""

import json
from pathlib import Path
from datetime import datetime

# Load the table schema information
ROOT_DIR = Path(__file__).resolve().parent.parent
DATA_FILE = ROOT_DIR / 'data' / 'table_schema_info.json'
with open(DATA_FILE, 'r', encoding='utf-8') as f:
    table_info = json.load(f)

# Table purpose descriptions (inferred from table names)
TABLE_DESCRIPTIONS = {
    'DW_MES_CONTAINER': '容器/批次主檔 - 目前在製容器狀態、數量與流程資訊',
    'DW_MES_HOLDRELEASEHISTORY': 'Hold/Release 歷史表 - 批次停工與解除紀錄',
@@ -33,307 +33,307 @@ TABLE_DESCRIPTIONS = {
    'DW_MES_RESOURCE': '資源表 - 設備/載具等資源基本資料(OBJECTCATEGORY=ASSEMBLY 時,RESOURCENAME 為設備編號)'
}

# Notes for common fields
COMMON_FIELD_NOTES = {
    'ID': '唯一标识符',
    'NAME': '名称',
    'STATUS': '状态',
    'TIMESTAMP': '时间戳',
    'CREATEDATE': '创建日期',
    'UPDATEDATE': '更新日期',
    'LOTID': '批次ID',
    'CONTAINERID': '容器ID',
    'RESOURCEID': '资源ID',
    'EQUIPMENTID': '设备ID',
    'OPERATIONID': '工序ID',
    'JOBID': '工单ID',
    'PRODUCTID': '产品ID',
    'CUSTOMERID': '客户ID',
    'QTY': '数量',
    'QUANTITY': '数量'
}
def generate_markdown():
    """Generate the Markdown document."""

    md = []

    # Title and introduction
    md.append("# MES 数据库报表开发参考文档\n")
    md.append(f"**生成时间**: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}\n")
    md.append("---\n")

    # Table of contents
    md.append("## 目录\n")
    md.append("1. [数据库连接信息](#数据库连接信息)")
    md.append("2. [数据库概览](#数据库概览)")
    md.append("3. [表结构详细说明](#表结构详细说明)")
    md.append("4. [报表开发注意事项](#报表开发注意事项)")
    md.append("5. [常用查询示例](#常用查询示例)\n")
    md.append("---\n")

    # 1. Database connection information
    md.append("## 数据库连接信息\n")
    md.append("### 连接参数\n")
    md.append("| 参数 | 值 |")
    md.append("|------|------|")
    md.append("| 数据库类型 | Oracle Database 19c Enterprise Edition |")
    md.append("| 主机地址 | 10.1.1.58 |")
    md.append("| 端口 | 1521 |")
    md.append("| 服务名 | DWDB |")
    md.append("| 用户名 | MBU1_R |")
    md.append("| 密码 | Pj2481mbu1 |\n")

    md.append("### Python 连接示例\n")
    md.append("```python")
    md.append("import oracledb")
    md.append("")
    md.append("# 连接配置")
    md.append("DB_CONFIG = {")
    md.append("    'user': 'MBU1_R',")
    md.append("    'password': 'Pj2481mbu1',")
    md.append("    'dsn': '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=10.1.1.58)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=DWDB)))'")
    md.append("}")
    md.append("")
    md.append("# 建立连接")
    md.append("connection = oracledb.connect(**DB_CONFIG)")
    md.append("cursor = connection.cursor()")
    md.append("")
    md.append("# 执行查询")
    md.append("cursor.execute('SELECT * FROM DW_MES_WIP WHERE ROWNUM <= 10')")
    md.append("results = cursor.fetchall()")
    md.append("")
    md.append("# 关闭连接")
    md.append("cursor.close()")
    md.append("connection.close()")
    md.append("```\n")

    md.append("### JDBC 连接字符串\n")
    md.append("```")
    md.append("jdbc:oracle:thin:@10.1.1.58:1521:DWDB")
    md.append("```\n")

    # 2. Database overview
    md.append("---\n")
    md.append("## 数据库概览\n")
    md.append("### 表统计信息\n")
    md.append("| # | 表名 | 用途 | 数据量 |")
    md.append("|---|------|------|--------|")

    for idx, (table_name, info) in enumerate(sorted(table_info.items()), 1):
        if 'error' not in info:
            row_count = f"{info['row_count']:,}"
            description = TABLE_DESCRIPTIONS.get(table_name, '待补充')
            md.append(f"| {idx} | `{table_name}` | {description} | {row_count} |")

    md.append("")

    # Compute the total row count
    total_rows = sum(info['row_count'] for info in table_info.values() if 'error' not in info)
    md.append(f"**总数据量**: {total_rows:,} 行\n")

    # 3. Detailed table structures
    md.append("---\n")
    md.append("## 表结构详细说明\n")

    for table_name in sorted(table_info.keys()):
        info = table_info[table_name]

        if 'error' in info:
            continue

        md.append(f"### {table_name}\n")

        # Table description
        md.append(f"**用途**: {TABLE_DESCRIPTIONS.get(table_name, '待补充')}\n")
        md.append(f"**数据量**: {info['row_count']:,} 行\n")

        if info.get('table_comment'):
            md.append(f"**表注释**: {info['table_comment']}\n")

        # Column list
        md.append("#### 字段列表\n")
        md.append("| # | 字段名 | 数据类型 | 长度 | 可空 | 说明 |")
        md.append("|---|--------|----------|------|------|------|")

        schema = info.get('schema', [])
        for col in schema:
            col_num = col['column_id']
            col_name = col['column_name']

            # Build the data-type display string
            if col['data_type'] in ['VARCHAR2', 'CHAR']:
                data_type = f"{col['data_type']}({col['data_length']})"
            elif col['data_type'] == 'NUMBER' and col['data_precision']:
                if col['data_scale']:
                    data_type = f"NUMBER({col['data_precision']},{col['data_scale']})"
                else:
                    data_type = f"NUMBER({col['data_precision']})"
            else:
                data_type = col['data_type']

            nullable = "是" if col['nullable'] == 'Y' else "否"

            # Look up the column comment
            column_comments = info.get('column_comments', {})
            comment = column_comments.get(col_name, '')

            # If the DB has no comment, fall back to the common field notes
            if not comment:
                for key, value in COMMON_FIELD_NOTES.items():
                    if key in col_name:
                        comment = value
                        break

            md.append(f"| {col_num} | `{col_name}` | {data_type} | {col.get('data_length', '-')} | {nullable} | {comment} |")

        md.append("")

        # Index information
        indexes = info.get('indexes', [])
        if indexes:
            md.append("#### 索引\n")
            md.append("| 索引名 | 类型 | 字段 |")
            md.append("|--------|------|------|")
            for idx_info in indexes:
                idx_type = "唯一索引" if idx_info[1] == 'UNIQUE' else "普通索引"
                md.append(f"| `{idx_info[0]}` | {idx_type} | {idx_info[2]} |")
            md.append("")

        md.append("---\n")

    # 4. Report-development notes
    md.append("## 报表开发注意事项\n")
    md.append("### 性能优化建议\n")
    md.append("1. **大数据量表查询优化**")
    md.append("   - 以下表数据量较大,查询时务必添加时间范围限制:")

    large_tables = [(name, info['row_count']) for name, info in table_info.items()
                    if 'error' not in info and info['row_count'] > 10000000]
    large_tables.sort(key=lambda x: x[1], reverse=True)

    for table_name, count in large_tables:
        md.append(f"   - `{table_name}`: {count:,} 行")

    md.append("")
    md.append("2. **索引使用**")
    md.append("   - 查询时尽量使用已建立索引的字段作为查询条件")
    md.append("   - 避免在索引字段上使用函数,会导致索引失效")
    md.append("")
    md.append("3. **连接池配置**")
    md.append("   - 建议使用连接池管理数据库连接")
    md.append("   - 推荐连接池大小:5-10 个连接")
    md.append("")
    md.append("4. **查询超时设置**")
    md.append("   - 建议设置查询超时时间为 30-60 秒")
    md.append("   - 避免长时间运行的查询影响系统性能")
    md.append("")

    md.append("### 数据时效性\n")
    md.append("- **实时数据表**: `DW_MES_WIP`(含歷史累積), `DW_MES_RESOURCESTATUS`")
    md.append("- **历史数据表**: 带有 `HISTORY` 后缀的表")
    md.append("- **主数据表**: `DW_MES_RESOURCE`, `DW_MES_CONTAINER`")
    md.append("")

    md.append("### 常用时间字段\n")
    md.append("大多数历史表包含以下时间相关字段:")
    md.append("- `CREATEDATE` / `CREATETIMESTAMP`: 记录创建时间")
    md.append("- `UPDATEDATE` / `UPDATETIMESTAMP`: 记录更新时间")
    md.append("- `TRANSACTIONDATE`: 交易发生时间")
    md.append("")

    md.append("### 数据权限\n")
    md.append("- 当前账号 `MBU1_R` 为只读账号")
    md.append("- 仅可执行 SELECT 查询")
    md.append("- 无法进行 INSERT, UPDATE, DELETE 操作")
    md.append("")

    # 5. Common query examples
    md.append("---\n")
    md.append("## 常用查询示例\n")

    md.append("### 1. 查询当前在制品数量\n")
    md.append("```sql")
    md.append("SELECT COUNT(*) as WIP_COUNT")
    md.append("FROM DW_MES_WIP")
    md.append("WHERE CURRENTSTATUSID IS NOT NULL;")
    md.append("```\n")

    md.append("### 2. 查询设备状态统计\n")
    md.append("```sql")
    md.append("SELECT")
    md.append("    CURRENTSTATUSID,")
    md.append("    COUNT(*) as COUNT")
    md.append("FROM DW_MES_RESOURCESTATUS")
    md.append("GROUP BY CURRENTSTATUSID")
    md.append("ORDER BY COUNT DESC;")
    md.append("```\n")

    md.append("### 3. 查询最近 7 天的批次历史\n")
    md.append("```sql")
    md.append("SELECT *")
    md.append("FROM DW_MES_LOTWIPHISTORY")
    md.append("WHERE CREATEDATE >= SYSDATE - 7")
    md.append("ORDER BY CREATEDATE DESC;")
    md.append("```\n")

    md.append("### 4. 查询工单完成情况\n")
    md.append("```sql")
    md.append("SELECT")
    md.append("    JOBID,")
    md.append("    JOBSTATUS,")
    md.append("    COUNT(*) as COUNT")
    md.append("FROM DW_MES_JOB")
    md.append("GROUP BY JOBID, JOBSTATUS")
    md.append("ORDER BY JOBID;")
    md.append("```\n")

    md.append("### 5. 按日期统计生产数量\n")
    md.append("```sql")
    md.append("SELECT")
    md.append("    TRUNC(CREATEDATE) as PRODUCTION_DATE,")
    md.append("    COUNT(*) as LOT_COUNT")
    md.append("FROM DW_MES_HM_LOTMOVEOUT")
    md.append("WHERE CREATEDATE >= SYSDATE - 30")
    md.append("GROUP BY TRUNC(CREATEDATE)")
    md.append("ORDER BY PRODUCTION_DATE DESC;")
    md.append("```\n")

    md.append("### 6. 联表查询示例(批次与容器)\n")
    md.append("```sql")
    md.append("SELECT")
    md.append("    w.LOTID,")
    md.append("    w.CONTAINERNAME,")
    md.append("    c.CURRENTSTATUSID,")
    md.append("    c.CUSTOMERID")
    md.append("FROM DW_MES_WIP w")
    md.append("LEFT JOIN DW_MES_CONTAINER c ON w.CONTAINERID = c.CONTAINERID")
    md.append("WHERE w.CREATEDATE >= SYSDATE - 1")
    md.append("ORDER BY w.CREATEDATE DESC;")
    md.append("```\n")

    md.append("---\n")
    md.append("## 附录\n")
    md.append("### 文档更新记录\n")
    md.append(f"- {datetime.now().strftime('%Y-%m-%d')}: 初始版本创建")
    md.append("")
    md.append("### 联系方式\n")
    md.append("如有疑问或需要补充信息,请联系数据库管理员。\n")

    return '\n'.join(md)


if __name__ == "__main__":
    print("Generating documentation...")
    markdown_content = generate_markdown()

    output_file = ROOT_DIR / 'docs' / 'MES_Database_Reference.md'
    with open(output_file, 'w', encoding='utf-8') as f:
        f.write(markdown_content)

    print(f"[OK] Documentation generated: {output_file}")
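One subtlety in the comment fallback inside `generate_markdown`: `COMMON_FIELD_NOTES` keys are matched as *substrings* of the column name in dictionary insertion order, so a generic key like `'ID'` (declared first) shadows more specific keys such as `'CONTAINERID'`. The sketch below isolates that lookup; the column names used are hypothetical examples, not taken from the schema file.

```python
# Isolated version of the comment fallback used in generate_markdown().
# Keys are tried in insertion order and matched as substrings, so the
# generic 'ID' entry wins over 'CONTAINERID' for any column containing 'ID'.
COMMON_FIELD_NOTES = {
    'ID': '唯一标识符',
    'CONTAINERID': '容器ID',
    'QTY': '数量',
}


def field_comment(col_name, column_comments):
    """Return the DB comment if present, else the first matching common note."""
    comment = column_comments.get(col_name, '')
    if not comment:
        for key, value in COMMON_FIELD_NOTES.items():
            if key in col_name:
                comment = value
                break
    return comment


generic = field_comment('CONTAINERID', {})                        # 'ID' matches first
explicit = field_comment('CONTAINERID', {'CONTAINERID': '容器ID'})  # DB comment wins
qty = field_comment('TOTAL_QTY', {})                              # falls through to 'QTY'
```

If the more specific note should win, ordering the dict from longest key to shortest (or sorting candidates by key length) would fix the shadowing without changing the call sites.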
@@ -5,6 +5,7 @@

import sys
import io
import os
import oracledb
import json
from pathlib import Path
@@ -12,11 +13,25 @@ from pathlib import Path
# Force UTF-8 output
sys.stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace')

# Database connection info
# Load .env file
try:
    from dotenv import load_dotenv
    env_path = Path(__file__).resolve().parent.parent / '.env'
    load_dotenv(env_path)
except ImportError:
    pass

# Database connection info (read from environment variables)
DB_HOST = os.getenv('DB_HOST', '10.1.1.58')
DB_PORT = os.getenv('DB_PORT', '1521')
DB_SERVICE = os.getenv('DB_SERVICE', 'DWDB')
DB_USER = os.getenv('DB_USER', '')
DB_PASSWORD = os.getenv('DB_PASSWORD', '')

DB_CONFIG = {
    'user': 'MBU1_R',
    'password': 'Pj2481mbu1',
    'dsn': '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=10.1.1.58)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=DWDB)))'
    'user': DB_USER,
    'password': DB_PASSWORD,
    'dsn': f'(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST={DB_HOST})(PORT={DB_PORT})))(CONNECT_DATA=(SERVICE_NAME={DB_SERVICE})))'
}

# MES table list
@@ -5,17 +5,33 @@ Oracle Database Connection Test Script

import sys
import io
import os
import oracledb
from datetime import datetime
from pathlib import Path

# Force UTF-8 output
sys.stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8', errors='replace')

# Database connection info
# Load .env file
try:
    from dotenv import load_dotenv
    env_path = Path(__file__).resolve().parent.parent / '.env'
    load_dotenv(env_path)
except ImportError:
    pass

# Database connection info (read from environment variables)
DB_HOST = os.getenv('DB_HOST', '10.1.1.58')
DB_PORT = os.getenv('DB_PORT', '1521')
DB_SERVICE = os.getenv('DB_SERVICE', 'DWDB')
DB_USER = os.getenv('DB_USER', '')
DB_PASSWORD = os.getenv('DB_PASSWORD', '')

DB_CONFIG = {
    'user': 'MBU1_R',
    'password': 'Pj2481mbu1',
    'dsn': '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=10.1.1.58)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=DWDB)))'
    'user': DB_USER,
    'password': DB_PASSWORD,
    'dsn': f'(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST={DB_HOST})(PORT={DB_PORT})))(CONNECT_DATA=(SERVICE_NAME={DB_SERVICE})))'
}

# MES table list
@@ -45,9 +61,9 @@ def test_connection():
    print("Oracle Database Connection Test")
    print("=" * 60)
    print(f"Test Time: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}")
    print(f"Host: 10.1.1.58:1521")
    print(f"Service Name: DWDB")
    print(f"User: {DB_CONFIG['user']}")
    print(f"Host: {DB_HOST}:{DB_PORT}")
    print(f"Service Name: {DB_SERVICE}")
    print(f"User: {DB_USER}")
    print("=" * 60)

    try:
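The hunks above replace hard-coded credentials with environment-driven configuration: defaults via `os.getenv`, an f-string DSN descriptor, and empty-string fallbacks for the user and password so real credentials must come from `.env`. A self-contained sketch of that DSN construction (the `build_dsn` helper is an assumption added for testability, not a function from the scripts):

```python
import os


def build_dsn(host, port, service):
    """Oracle connect descriptor, shaped like the one in the refactored scripts."""
    return (
        '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)'
        f'(HOST={host})(PORT={port})))'
        f'(CONNECT_DATA=(SERVICE_NAME={service})))'
    )


# Defaults mirror the diff; real values come from .env via python-dotenv.
DB_HOST = os.getenv('DB_HOST', '10.1.1.58')
DB_PORT = os.getenv('DB_PORT', '1521')
DB_SERVICE = os.getenv('DB_SERVICE', 'DWDB')

DB_CONFIG = {
    'user': os.getenv('DB_USER', ''),          # intentionally no baked-in default
    'password': os.getenv('DB_PASSWORD', ''),  # ditto -- credentials live in .env
    'dsn': build_dsn(DB_HOST, DB_PORT, DB_SERVICE),
}
```

Keeping the descriptor in one helper also means both scripts can share it instead of duplicating the f-string.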