feat: WIP advanced filtering and database connection stability improvements

New features:
- WORKORDER/LOT ID fuzzy search with an autocomplete dropdown
- DUMMY lots excluded by default (overridable via the include_dummy parameter)
- WIP Detail page supporting combinations of four filter conditions
- Search API endpoint GET /api/wip/meta/search

Stability improvements:
- Replaced the connection pool with NullPool to stop the firewall from dropping long-idle connections
- Added a gunicorn worker timeout (60s) to prevent stuck workers
- Frontend fetchWithTimeout mechanism with a 30-second API timeout
- Improved the stop logic in start_server.sh to handle orphaned worker processes

Other:
- Bundled ECharts locally to avoid CDN tracking-protection warnings
- Added unit tests for the WIP filtering features
- Updated the MES core table analysis report with a section on DWH.DW_PJ_LOT_V

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
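The NullPool switch described above can be sketched as follows. This is a minimal illustration assuming SQLAlchemy is the database layer; the `make_engine` name and the DSN are placeholders, not the project's actual code:

```python
from sqlalchemy import create_engine
from sqlalchemy.pool import NullPool

def make_engine(dsn: str):
    # NullPool opens a fresh connection on every checkout and closes it on
    # release, so no idle pooled connection is left for a firewall to drop.
    return create_engine(dsn, poolclass=NullPool)

engine = make_engine("sqlite://")  # placeholder DSN; the real app targets Oracle
with engine.connect() as conn:
    value = conn.exec_driver_sql("SELECT 1").scalar()
```

The trade-off is one connection handshake per request, which is acceptable here because dashboard queries are infrequent relative to the cost of debugging half-dead pooled connections.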
.gitignore (vendored)
@@ -50,3 +50,4 @@ htmlcov/
.ipynb_checkpoints/
# Note: openspec/ is tracked (not ignored)
tmp/
@@ -1,29 +1,37 @@
# MES Core Tables Detailed Analysis Report

**Generated**: 2026-01-14 (last updated: 2026-01-27)
**Scope**: 17 MES core tables (including 1 DWH real-time view)
**Sources**: MES_Database_Reference.md, analysis of live DWH.DW_PJ_LOT_V data

---
## Table of Contents

1. [Table Classification Overview](#table-classification-overview)
2. [Real-time Data Table Analysis](#real-time-data-table-analysis)
3. [Snapshot Table Analysis](#snapshot-table-analysis)
4. [Historical Accumulation Table Analysis](#historical-accumulation-table-analysis)
5. [Inter-table Relationship Diagram](#inter-table-relationship-diagram)
6. [Key Business Scenario Query Strategies](#key-business-scenario-query-strategies)

---
## Table Classification Overview

### Real-time Data Tables (Real-time Views)
Real-time WIP views pulled from the DWH over a DB link, refreshed automatically every 5 minutes

| Table | Row Count | Primary Use | Update Mechanism |
|------|--------|---------|---------|
| **DWH.DW_PJ_LOT_V** | ~9,000-12,000 | Real-time WIP distribution (70 columns) | Synced from the DWH every 5 minutes |

### Snapshot Tables
Tables that store current state; rows are updated or overwritten in place

| Table | Row Count | Primary Use | Update Mechanism |
|------|--------|---------|---------|
| **DW_MES_WIP** | 77,470,834 | Current WIP (includes historical accumulation) | Updated as production progresses |
| **DW_MES_RESOURCE** | 90,620 | Resource master (equipment/stations) | Updated on change |
| **DW_MES_CONTAINER** | 5,185,532 | Current container state | Updated as lots move |
| **DW_MES_JOB** | 1,239,659 | Current equipment-repair job state | Updated on repair job status change |
@@ -53,13 +61,260 @@

---

## Real-time Data Table Analysis

### DWH.DW_PJ_LOT_V (Real-time WIP Lot View) ⭐⭐⭐

**Table type**: Real-time view

**Business definition**: A real-time WIP view provided by the DWH, pulled from `PJ_LOT_MV@DWDB_MESDB` over a DB link and refreshed automatically every 5 minutes. Its 70 columns cover full lot status, station location, equipment, Hold reasons, and more, making it the primary data source for the WIP Dashboard.

**Data source**: `PJ_LOT_MV@DWDB_MESDB` (DB link)

**Row count**: roughly 9,000 - 12,000 (fluctuates)

#### Column Overview (70 columns)

| Category | Columns | Description |
|------|--------|------|
| Lot identification | 5 | LOTID, CONTAINERID, WORKORDER, FIRSTNAME, NO |
| Quantities | 6 | QTY, QTY2, STARTQTY, STARTQTY2, MOVEINQTY, MOVEINQTY2 |
| Status | 4 | STATUS, CURRENTHOLDCOUNT, STARTREASON, OWNER |
| Time | 7 | STARTDATE, UTS, MOVEINTIMESTAMP, SYS_DATE, AGEBYDAYS, REMAINTIME, OCCURRENCEDATE |
| Station/flow | 12 | WORKCENTER*, SPEC*, STEP, WORKFLOWNAME, LOCATIONNAME |
| Product/package | 8 | PRODUCT, PRODUCTLINENAME, PACKAGE_LEF, MATERIALTYPE, PJ_TYPE, PJ_FUNCTION, BOP |
| Hold | 8 | HOLDREASONNAME, HOLDEMP, HOLDLOCATION, RELEASETIME, RELEASEEMP, RELEASEREASON, COMMENT_HOLD |
| Equipment | 4 | EQUIPMENTNAME, EQUIPMENTS, EQUIPMENTCOUNT, DEPTNAME |
| Materials | 6 | LEADFRAMENAME, LEADFRAMEOPTION, WAFERNAME, WAFERLOT, COMNAME, DATECODE |
| Comments/other | 10 | CONTAINERCOMMENTS, COMMENT_*, PRIORITYCODENAME, JOB*, PB_FUNCTION, TMTT_R, WAFER_FACTOR |
#### Key Time Columns

| Column | Type | Purpose | Notes |
|--------|------|------|------|
| `SYS_DATE` | TIMESTAMP | Data refresh time | View sync timestamp; use it to confirm data freshness |
| `STARTDATE` | TIMESTAMP | Lot start time | When the lot was released to production |
| `MOVEINTIMESTAMP` | TIMESTAMP | Move-in time at current station | When the lot entered the current step |
| `UTS` | VARCHAR2 | Planned completion date | Stored as the string 'YYYY/MM/DD' |
| `AGEBYDAYS` | NUMBER | Lot age in days | Days since STARTDATE (fractional) |
| `REMAINTIME` | NUMBER | Remaining time | Days until planned completion (fractional) |
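Because `UTS` is a string rather than a DATE, it needs explicit conversion before any date arithmetic. A small sketch (the helper names are illustrative, not the project's API):

```python
from datetime import datetime

def parse_uts(uts: str) -> datetime:
    # UTS is stored as VARCHAR2 text 'YYYY/MM/DD', not a DATE column
    return datetime.strptime(uts, "%Y/%m/%d")

def days_until_due(uts: str, now: datetime) -> float:
    # Fractional days until the planned completion date
    return (parse_uts(uts) - now).total_seconds() / 86400.0
```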
#### Key Business Columns

##### Lot Identification

| Column | Type | Description | Example |
|--------|------|------|--------|
| `LOTID` | VARCHAR2(40) | Lot number (business key) | `GA26011704-A00-003` |
| `CONTAINERID` | VARCHAR2(40) | Container ID (system key) | `48810480002ab0b4` |
| `WORKORDER` | VARCHAR2(40) | Work order number | `GA26011704` |
| `FIRSTNAME` | VARCHAR2(100) | First-piece lot number | `PSMS-4473#RFTLD3` |
| `NO` | NUMBER | Sequence number (for result ordering) | 1, 2, 3... |
##### Status Columns

| Column | Type | Description | Observed Values |
|--------|------|------|-----------|
| `STATUS` | VARCHAR2(20) | Lot status | `ACTIVE` (~98.7%), `HOLD` (~1.3%) |
| `OWNER` | VARCHAR2(40) | Owner/purpose | `量產`, `重工RW`, `代工`, `點測`, `樣品`, `餘晶`, `工程`, `久存`, `PROD`, `降規` |
| `MATERIALTYPE` | VARCHAR2(40) | Material type | `成品` (~99%), `Wafer` (~1%) |
| `STARTREASON` | VARCHAR2(40) | Start reason | `NORMAL`, `RW`, etc. |
##### Quantity Columns

| Column | Type | Description | Range |
|--------|------|------|---------|
| `QTY` | NUMBER | Current quantity (primary unit) | 1 - 3,000,000+ |
| `QTY2` | NUMBER | Current quantity (secondary unit) | Usually 0 |
| `STARTQTY` | NUMBER | Starting quantity | Usually ≥ QTY |
| `MOVEINQTY` | NUMBER | Move-in quantity | Quantity at station entry |
##### Station/Flow Columns

| Column | Type | Description | Example |
|--------|------|------|--------|
| `WORKCENTERNAME` | VARCHAR2(40) | Work center name | `成型`, `TMTT`, `電鍍`, `焊接` |
| `WORKCENTER_GROUP` | VARCHAR2(40) | Work center group | Same as WORKCENTERNAME or a grouping of it |
| `WORKCENTER_SHORT` | VARCHAR2(20) | Station short name | `Mold`, `TMTT`, `DB`, `WB` |
| `WORKCENTERSEQUENCE` | VARCHAR2(10) | Station sequence | `130`, `300`, etc. (larger = later in the line) |
| `SPECNAME` | VARCHAR2(100) | Operation spec name | `成型烘烤`, `PRE TMTT` |
| `STEP` | VARCHAR2(100) | Current step | Usually identical to SPECNAME |
| `WORKFLOWNAME` | VARCHAR2(100) | Process flow name | `PCC_SOT-223`, `UAC_SOD-523` |
##### Product/Package Columns

| Column | Type | Description | Example |
|--------|------|------|--------|
| `PRODUCT` | VARCHAR2(100) | Full product name | `PJW5P06A_R2_00701` |
| `PRODUCTLINENAME` | VARCHAR2(40) | Product line / package type | `SOT-223`, `SOD-523` |
| `PACKAGE_LEF` | VARCHAR2(40) | Package model | `SOT-223`, `SOD-523` |
| `PJ_TYPE` | VARCHAR2(40) | Product model | `PJW5P06A`, `RB521S30-NC` |
| `PJ_FUNCTION` | VARCHAR2(40) | Product function class | `MOSFET`, `SKY` |
| `BOP` | VARCHAR2(40) | BOP code | `PCC15`, `UAC10` |
##### Hold Columns

| Column | Type | Description | Example |
|--------|------|------|--------|
| `HOLDREASONNAME` | VARCHAR2(100) | Hold reason | `S2品質異常單(PE)`, `特殊需求管控` |
| `CURRENTHOLDCOUNT` | NUMBER | Current hold count | 0 = not on hold, ≥1 = on hold |
| `HOLDEMP` | VARCHAR2(40) | Hold operator | Employee name |
| `HOLDLOCATION` | VARCHAR2(40) | Hold location | Usually NULL |
| `RELEASETIME` | TIMESTAMP | Planned release time | NULL = not set |
| `RELEASEEMP` | VARCHAR2(40) | Release operator | NULL = not yet released |
| `RELEASEREASON` | VARCHAR2(200) | Release reason | NULL = not yet released |
| `COMMENT_HOLD` | VARCHAR2(4000) | Hold comment | Detailed explanation of the hold |
##### Equipment Columns (Important)

| Column | Type | Description | Used by Stations |
|--------|------|------|---------|
| `EQUIPMENTNAME` | VARCHAR2(40) | Equipment name (single) | TMTT (82%), 切彎腳 (69%), PKG_SAW |
| `EQUIPMENTS` | VARCHAR2(4000) | Equipment list (comma-separated) | Other stations (molding, bonding, plating, marking, etc.) |
| `EQUIPMENTCOUNT` | NUMBER | Equipment count | 0 = no equipment bound yet |

**⚠️ Important**: `EQUIPMENTNAME` and `EQUIPMENTS` are **mutually exclusive**:
- **TMTT, 切彎腳, PKG_SAW** stations use `EQUIPMENTNAME` (single equipment)
- **All other stations** (molding, bonding, plating, marking, etc.) use `EQUIPMENTS` (equipment list)
- Only about 100 rows have both columns populated (all at TMTT stations)
- **Recommended query**: use `COALESCE(EQUIPMENTNAME, EQUIPMENTS)` for a unified equipment value
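The same COALESCE rule can be applied in application code, sketched below. The dict-row shape is an assumption for illustration; note that Oracle treats empty strings as NULL, which `or` mirrors here:

```python
def equipment_info(row: dict):
    # Mirrors COALESCE(EQUIPMENTNAME, EQUIPMENTS): single-equipment stations
    # (TMTT, 切彎腳, PKG_SAW) populate EQUIPMENTNAME, while other stations
    # populate the comma-separated EQUIPMENTS list.
    return row.get("EQUIPMENTNAME") or row.get("EQUIPMENTS")
```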
##### Priority Column

| Column | Value | Description |
|--------|-----|------|
| `PRIORITYCODENAME` | `1.超特急` | Highest priority |
| | `2.特急` | High priority |
| | `3.急件` | Medium-high priority (~3%) |
| | `4.一般` | Normal priority (~96%) |

##### Hold Reason Distribution (Reference Data)

| HOLDREASONNAME | Description | Typical Share |
|----------------|------|---------|
| `特殊需求管控` | Special process or customer requirement | Most common |
| `S2品質異常單(PE)` | Quality exception raised by PE | Common |
| `現場品質異常單(PQC)` | Quality exception raised by PQC | Common |
| `自行暫停` | Self-initiated pause | Occasional |
| `治具不足HOLD` | Fixture shortage | Occasional |
| Other | Line-change pause, production-control pause, etc. | Rare |
#### Query Strategies

**1. Real-time WIP distribution by station**
```sql
SELECT
    WORKCENTER_GROUP,
    WORKCENTER_SHORT,
    COUNT(*) as LOT_COUNT,
    SUM(QTY) as TOTAL_QTY,
    SUM(CASE WHEN STATUS = 'HOLD' THEN 1 ELSE 0 END) as HOLD_LOTS,
    SUM(CASE WHEN STATUS = 'HOLD' THEN QTY ELSE 0 END) as HOLD_QTY
FROM DWH.DW_PJ_LOT_V
WHERE OWNER NOT IN ('DUMMY')  -- exclude DUMMY lots
GROUP BY WORKCENTER_GROUP, WORKCENTER_SHORT, WORKCENTERSEQUENCE_GROUP
ORDER BY TO_NUMBER(WORKCENTERSEQUENCE_GROUP);
```
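An in-memory sketch of the same aggregation, useful for unit-testing report logic against fixture rows. The dict-row shape is assumed for illustration, not the service's actual types:

```python
from collections import defaultdict

def wip_distribution(rows):
    # Per work-center lot/qty totals with a HOLD breakdown, mirroring query 1.
    stats = defaultdict(lambda: {"lots": 0, "qty": 0, "hold_lots": 0, "hold_qty": 0})
    for r in rows:
        if r["OWNER"] == "DUMMY":  # exclude DUMMY lots, as in the WHERE clause
            continue
        s = stats[r["WORKCENTER_GROUP"]]
        s["lots"] += 1
        s["qty"] += r["QTY"]
        if r["STATUS"] == "HOLD":
            s["hold_lots"] += 1
            s["hold_qty"] += r["QTY"]
    return dict(stats)
```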
**2. WIP cross-analysis (station x package)**
```sql
SELECT
    WORKCENTER_GROUP,
    PRODUCTLINENAME,
    COUNT(*) as LOT_COUNT,
    SUM(QTY) as TOTAL_QTY
FROM DWH.DW_PJ_LOT_V
WHERE OWNER NOT IN ('DUMMY')
GROUP BY WORKCENTER_GROUP, PRODUCTLINENAME
ORDER BY WORKCENTER_GROUP, LOT_COUNT DESC;
```

**3. Held lot list**
```sql
SELECT
    LOTID,
    PRODUCT,
    WORKCENTERNAME,
    SPECNAME,
    QTY,
    HOLDREASONNAME,
    HOLDEMP,
    COMMENT_HOLD,
    AGEBYDAYS
FROM DWH.DW_PJ_LOT_V
WHERE STATUS = 'HOLD'
ORDER BY AGEBYDAYS DESC;
```
**4. Equipment usage query (unifying EQUIPMENTNAME/EQUIPMENTS)**
```sql
SELECT
    LOTID,
    WORKCENTERNAME,
    COALESCE(EQUIPMENTNAME, EQUIPMENTS) as EQUIPMENT_INFO,
    EQUIPMENTCOUNT,
    QTY
FROM DWH.DW_PJ_LOT_V
WHERE COALESCE(EQUIPMENTNAME, EQUIPMENTS) IS NOT NULL
ORDER BY WORKCENTERNAME;
```

**5. Lot detail query**
```sql
SELECT
    LOTID,
    CONTAINERID,
    WORKORDER,
    PRODUCT,
    PJ_TYPE,
    PJ_FUNCTION,
    PRODUCTLINENAME,
    WORKCENTERNAME,
    SPECNAME,
    STATUS,
    QTY,
    STARTQTY,
    AGEBYDAYS,
    REMAINTIME,
    UTS,
    PRIORITYCODENAME,
    OWNER,
    COALESCE(EQUIPMENTNAME, EQUIPMENTS) as EQUIPMENT,
    SYS_DATE
FROM DWH.DW_PJ_LOT_V
WHERE LOTID LIKE 'GA26011%'  -- work order filter
ORDER BY TO_NUMBER(WORKCENTERSEQUENCE);  -- WORKCENTERSEQUENCE is VARCHAR2; sort numerically
```
#### Relationships to Other Tables

| Related Table | Join Column | Purpose |
|--------|---------|------|
| DW_MES_CONTAINER | CONTAINERID | Richer container details |
| DW_MES_LOTWIPHISTORY | CONTAINERID | Lot movement history |
| DW_MES_HOLDRELEASEHISTORY | CONTAINERID | Hold/Release history |

#### Important Notes

⚠️ **Refresh frequency**: synced from the DWH every 5 minutes; check `SYS_DATE` to confirm data freshness

⚠️ **DUMMY lot filtering**: production reports should exclude test lots where `OWNER = 'DUMMY'`

⚠️ **Equipment column choice**: use `COALESCE(EQUIPMENTNAME, EQUIPMENTS)` to handle the per-station split in equipment columns

⚠️ **Time columns**: `UTS` is a VARCHAR2 in 'YYYY/MM/DD' format and must be converted before date arithmetic

⚠️ **No database comments**: this view has no Oracle column comments (ALL_COL_COMMENTS is empty); refer to this document for column descriptions

---
## Snapshot Table Analysis

### 1. DW_MES_WIP (Work-in-Process Table) ⭐⭐⭐

**Table type**: Snapshot table (with historical accumulation)

**Business definition**: Stores current WIP state but in practice accumulates history; constrain queries with a time condition (e.g. `TXNDATE`)

#### Key Time Columns

@@ -1134,7 +1389,7 @@ ORDER BY TOTAL_OUTPUT DESC;

1. WIP flow main line (core business process)

DW_MES_WIP (snapshot, includes historical accumulation)
↓ CONTAINERID
DW_MES_CONTAINER (container master)
↓ CONTAINERID
@@ -1196,49 +1451,49 @@ ORDER BY TOTAL_OUTPUT DESC;
DW_MES_MAINTENANCE (maintenance records)
```

### Detailed Join Key Reference

| Main Table | Related Table | Join Column | Cardinality | Description |
|------|--------|---------|---------|------|
| **DW_MES_WIP** | DW_MES_CONTAINER | CONTAINERID | 1:1 | WIP to container |
| **DW_MES_CONTAINER** | DW_MES_LOTWIPHISTORY | CONTAINERID | 1:N | Container movement history |
| **DW_MES_LOTWIPHISTORY** | DW_MES_LOTWIPDATAHISTORY | WIPLOTHISTORYID | 1:N | Data collection for movement records |
| **DW_MES_LOTWIPHISTORY** | DW_MES_HM_LOTMOVEOUT | CONTAINERID + HISTORYMAINLINEID | 1:N | Move-out events for a movement |
| **DW_MES_LOTWIPHISTORY** | DW_MES_LOTREJECTHISTORY | HISTORYMAINLINEID | 1:N | Reject records for a movement |
| **DW_MES_LOTWIPHISTORY** | DW_MES_LOTMATERIALSHISTORY | CONTAINERID | 1:N | Material consumption for a movement |
| **DW_MES_WIP** | DW_MES_HOLDRELEASEHISTORY | CONTAINERID | 1:N | Hold history for WIP |
| **DW_MES_RESOURCE** | DW_MES_RESOURCESTATUS | RESOURCEID = HISTORYID | 1:N | Resource status history |
| **DW_MES_RESOURCE** | DW_MES_RESOURCESTATUS_SHIFT | RESOURCEID = HISTORYID | 1:N | Resource shift rollups |
| **DW_MES_RESOURCE** | DW_MES_MAINTENANCE | RESOURCEID | 1:N | Resource maintenance records |
| **DW_MES_RESOURCE** | DW_MES_PARTREQUESTORDER | RESOURCEID | 1:N | Repair part requests for a resource |
| **DW_MES_RESOURCE** | DW_MES_HM_LOTMOVEOUT | RESOURCEID | 1:N | Move-out events at a resource |
| **DW_MES_JOB** | DW_MES_JOBTXNHISTORY | JOBID | 1:N | Job transaction history |
| **DW_MES_JOB** | DW_MES_RESOURCE | RESOURCEID | N:1 | Job to resource |
| **DW_MES_JOB** | DW_MES_PARTREQUESTORDER | JOBID | 1:N | Part requests for a job |
| **DW_MES_CONTAINER** | DW_MES_PJ_COMBINEDASSYLOTS | CONTAINERID | 1:N | Combined assembly for a container |
### Relationships Confirmed by Reference Comments

The following relationships come from column comments (maintainer annotations) in `MES_Database_Reference.md`:

| Table | Column | Comment | Inferred Relationship / Use |
|------|------|------|----------------|
| **DW_MES_RESOURCESTATUS** | HISTORYID | RESOURCEID | Joins to `DW_MES_RESOURCE.RESOURCEID` |
| **DW_MES_RESOURCESTATUS_SHIFT** | HISTORYID | RESOURCEID | Joins to `DW_MES_RESOURCE.RESOURCEID` |
| **DW_MES_JOB** | PARTREQUESTORDERNAME | DW_MES_PARTREQUESTORDER | Job part-request details can be read from `DW_MES_PARTREQUESTORDER` |
| **DW_MES_WIP** | RELEASETIME / RELEASEEMP / RELEASEREASON | DW_MES_HOLDRELEASEHISTORY | WIP release info originates from the Hold/Release history |

### Column Provenance Notes (Derived Within the Same Table)

The following comments show columns derived from key columns of the same table (not cross-table); prefer the ID columns when querying:

| Table | Column | Comment |
|------|------|------|
| **DW_MES_CONTAINER** | MFGORDERNAME / PJ_BOP / PJ_PRODUCEREGION / PRODUCTBOMBASEID | MFGORDERID |
| **DW_MES_WIP** | STARTREASONNAME / MFGORDERNAME / FIRSTNAME / OWNERNAME / PRIORITYCODENAME / PRODUCTBOMBASEID / PRODUCTNAME / PRODUCTLINENAME / PJ_BOP / PJ_PRODUCEREGION / PJ_TYPE / PJ_FUNCTION | CONTAINERID |
| **DW_MES_WIP** | WOQTY / WOPLANNEDCOMPLETIONDATE | CONTAINERID -> MFGORDERID |

### Key Join Column Notes

#### CONTAINERID
- Unique identifier for a lot/container (16-character CHAR)

@@ -1836,8 +2091,9 @@ WHERE RN > 0;

---

**Document version**: v1.1
**Last updated**: 2026-01-27
**Changes**: Added detailed analysis of the DWH.DW_PJ_LOT_V real-time WIP view (70 columns)
**Suggested review cadence**: Quarterly or on schema change
@@ -1,6 +1,11 @@
import os

bind = os.getenv("GUNICORN_BIND", "0.0.0.0:8080")
workers = int(os.getenv("GUNICORN_WORKERS", "2"))  # 2 workers for redundancy
threads = int(os.getenv("GUNICORN_THREADS", "4"))
worker_class = "gthread"

# Timeout settings - critical for dashboard stability
timeout = 60  # Worker timeout: 60 seconds max per request
graceful_timeout = 30  # Graceful shutdown timeout
keepalive = 5  # Keep-alive connections timeout
@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-01-26
@@ -0,0 +1,147 @@
## Context

The existing WIP report reads the `DW_MES_WIP` transaction-history table and needs a ROW_NUMBER() window function to compute each lot's latest state. IT has created the real-time view `DWH.DW_PJ_LOT_V`, refreshed every 5 minutes, which directly provides the current WIP snapshot.

**Problems with the current architecture**:
- Complex queries (ROW_NUMBER + 90-day range scan)
- No precomputed WORKCENTER_GROUP
- No auto-refresh on the frontend

**Advantages of the new view**:
- Direct queries, no window functions needed
- Built-in WORKCENTER_GROUP and WORKCENTERSEQUENCE_GROUP ordering columns
- Complete HOLD information (HOLDEMP, COMMENT_HOLD, etc.)
- SYS_DATE column marks the data refresh time

## Goals / Non-Goals

**Goals:**
- Use `DWH.DW_PJ_LOT_V` as the sole data source
- Build two dashboards: Overview (executives) + Detail (production line)
- Seamless auto-refresh (10-minute interval, no page flicker)
- Delete the old WIP report code

**Non-Goals:**
- Aging analysis (AGEBYDAYS) - out of scope for this change
- Historical trend analysis - show real-time data only
- Export - later
- Internationalization

## Decisions

### 1. Data Source: Use the Schema Prefix

**Decision**: Query `DWH.DW_PJ_LOT_V` (with the schema prefix)

**Rationale**: Querying `DW_PJ_LOT_V` directly raises ORA-00942; the owner schema must be included.

**Alternative**: Create a synonym - requires DBA privileges; the prefix is simpler.

### 2. Ordering Columns: Use the View's Built-in Columns

**Decision**:
- Order WORKCENTER_GROUP by `WORKCENTERSEQUENCE_GROUP`
- Order SPECNAME by `SPECSEQUENCE`

**Rationale**: The view precomputes these columns, so there is no need to maintain the project's workcenter_groups.py mapping.

**Removed**: `src/mes_dashboard/config/workcenter_groups.py` is no longer needed.

### 3. On-Equipment Logic

**Decision**:
- `EQUIPMENTNAME IS NOT NULL` → on equipment
- `EQUIPMENTNAME IS NULL` → waiting for material

**Rationale**: Confirmed to be the correct rule.

### 4. Auto-Refresh Architecture

**Decision**: Pure frontend polling + partial DOM updates

```
┌─────────────────────────────────────────────────────────┐
│ Frontend Auto-Refresh Architecture                      │
├─────────────────────────────────────────────────────────┤
│                                                         │
│ setInterval(() => {                                     │
│   fetch('/api/wip/...')                                 │
│     .then(newData => {                                  │
│       diffAndUpdate(currentData, newData);              │
│     });                                                 │
│ }, 10 * 60 * 1000); // 10 minutes                       │
│                                                         │
│ diffAndUpdate():                                        │
│   1. Diff new data against old                          │
│   2. Update only changed DOM elements                   │
│   3. Use CSS transitions for value changes              │
│   4. Fade table rows in/out on insert/delete            │
│                                                         │
└─────────────────────────────────────────────────────────┘
```

**Rationale**:
- No WebSocket needed (data only refreshes every 5 minutes)
- Pure polling is simple and reliable
- Partial updates avoid flicker

**Alternatives**:
- Server-Sent Events - overengineering
- WebSocket - overengineering
- Full page reload - poor user experience

### 5. API Design

**Decision**: RESTful JSON API

```
GET /api/wip/overview/summary    → KPI summary
GET /api/wip/overview/matrix     → station × product-line matrix
GET /api/wip/overview/hold       → Hold summary
GET /api/wip/detail/{workcenter} → station detail data
GET /api/wip/meta/workcenters    → station list (for dropdowns)
GET /api/wip/meta/packages       → package list
```

**Rationale**: Separate APIs let the frontend refresh each section independently.
### 6. Frontend Framework

**Decision**: Vanilla JavaScript + CSS transitions

**Rationale**:
- The project does not use React/Vue
- The dashboard is simple enough not to need a framework
- Fewer dependencies, lower learning curve

### 7. Update Time Display

**Decision**: Show the value of the `SYS_DATE` column

**Format**: `Last updated: 2026-01-26 19:18:29`

**Rationale**: This is the view's actual refresh time, which is more meaningful than stating "updates every 5 minutes".

## Risks / Trade-offs

| Risk | Mitigation |
|------|------------|
| View schema changes | Confirm column stability with IT; add a column-mapping layer |
| Performance on large data | Paginate; select only the needed columns |
| Frontend state complexity | Keep data structures simple; avoid over-abstraction |
| No rollback after deleting old reports | Git version control; revert is possible |

## File Structure

```
src/mes_dashboard/
├── routes/
│   └── wip.py                 # WIP routes (new/rewritten)
├── services/
│   └── wip_service.py         # WIP data service (rewritten)
├── templates/
│   ├── wip_overview.html      # executive overview (new)
│   └── wip_detail.html        # production-line detail view (new)
└── config/
    └── (delete workcenter_groups.py)
```
@@ -0,0 +1,47 @@
## Why

The existing WIP report uses the `DW_MES_WIP` history table and needs a complex ROW_NUMBER() calculation to obtain a current snapshot. IT has created a new real-time WIP view `DWH.DW_PJ_LOT_V` (5-minute refresh) that greatly simplifies queries and provides richer columns. The WIP Dashboard needs to be rebuilt on this data source, with two reports aimed at different audiences.

## What Changes

- **Delete** existing WIP report files (`wip_report.html`, old queries in `wip_service.py`)
- **Add** WIP Overview Dashboard - executive overview
  - WORKCENTER_GROUP × PRODUCTLINENAME matrix (showing QTY)
  - Hold summary table
  - Clicking a station navigates to Detail
- **Add** WIP Detail Dashboard - per-station production-line view
  - Filter by WORKCENTER_GROUP
  - SPECNAME expanded horizontally (ordered by SPECSEQUENCE)
  - Shows lot, equipment, and status information
- **Add** auto-refresh
  - Frontend updates seamlessly every 10 minutes (no full page reload)
  - Partial DOM updates avoid flicker
- **Remove** aging analysis (AGEBYDAYS)

## Capabilities

### New Capabilities

- `wip-overview`: executive WIP overview dashboard - station × product-line matrix, Hold summary, KPI cards
- `wip-detail`: per-station WIP detail dashboard - lot details, equipment status, and spec distribution for a single station
- `wip-data-service`: WIP query service - backend API backed by `DWH.DW_PJ_LOT_V`
- `auto-refresh`: frontend auto-refresh - seamless DOM updates without flicker

### Modified Capabilities

(None - entirely new; no existing specs are modified)

## Impact

- **Deleted files**:
  - `src/mes_dashboard/templates/wip_report.html`
  - `src/mes_dashboard/templates/wip_overview.html` (if present)
  - old query functions in `src/mes_dashboard/services/wip_service.py`
  - `src/mes_dashboard/config/workcenter_groups.py` (replaced by the view's built-in WORKCENTER_GROUP)
- **New files**:
  - `src/mes_dashboard/templates/wip_overview.html`
  - `src/mes_dashboard/templates/wip_detail.html`
  - `src/mes_dashboard/services/wip_service.py` (rewritten)
  - `src/mes_dashboard/routes/wip.py` (new routes)
- **Data source**: `DW_MES_WIP` → `DWH.DW_PJ_LOT_V`
- **API changes**: new API endpoints replace the existing ones
@@ -0,0 +1,82 @@
## ADDED Requirements

### Requirement: Auto-Refresh Interval

The system SHALL automatically refresh dashboard data every 10 minutes.

#### Scenario: Auto-refresh triggers
- **WHEN** 10 minutes have elapsed since page load
- **THEN** the system calls the API for fresh data
- **AND** updates the page display
- **AND** repeats this every 10 minutes

### Requirement: Seamless Updates

The system SHALL NOT reload the whole page on refresh; only changed data sections are updated.

#### Scenario: Value updates
- **WHEN** an auto-refresh returns new data
- **AND** a KPI value has changed
- **THEN** the system updates only that value's DOM element
- **AND** shows the change with a CSS transition (0.3s fade)
- **AND** the page does not flicker or jump

#### Scenario: Table data updates
- **WHEN** an auto-refresh returns new data
- **AND** table rows were added or removed
- **THEN** the system fades rows in/out
- **AND** existing rows update only the changed cells

### Requirement: Update Time Sync

The system SHALL update the "last updated" display after every refresh.

#### Scenario: Update time display
- **WHEN** an auto-refresh completes
- **THEN** the system sets the top-right "last updated" time to the new data's sys_date

### Requirement: Refresh Status Indicator

The system SHALL show a subtle loading indicator while refreshing.

#### Scenario: Refresh in progress
- **WHEN** the system starts an API call for new data
- **THEN** a small spinner or loading dot appears next to "last updated"
- **AND** the spinner does not cover content or interfere with the user

#### Scenario: Refresh complete
- **WHEN** the API responds successfully
- **THEN** the system hides the loading indicator
- **AND** optionally shows brief success feedback (e.g. a green checkmark that fades after 1 second)

### Requirement: Error Handling

The system SHALL degrade gracefully when a refresh fails, without disturbing the data currently shown.

#### Scenario: API error
- **WHEN** an auto-refresh API call fails
- **THEN** the system keeps showing the existing data
- **AND** shows a subtle error hint (e.g. a red dot) next to "last updated"
- **AND** the next refresh is still scheduled on the 10-minute cadence

### Requirement: Manual Refresh

The system SHALL provide a manual refresh button so users can fetch fresh data immediately.

#### Scenario: Manual refresh
- **WHEN** the user clicks the refresh button
- **THEN** the system immediately calls the API for fresh data
- **AND** resets the 10-minute auto-refresh timer

### Requirement: Page Visibility Handling

The system SHALL pause auto-refresh while the page is hidden and refresh immediately when it becomes visible again.

#### Scenario: Page hidden
- **WHEN** the user switches tabs or minimizes the window
- **THEN** the system pauses the auto-refresh timer

#### Scenario: Page visible again
- **WHEN** the user returns to the tab
- **THEN** the system refreshes once immediately
- **AND** restarts the 10-minute auto-refresh timer
@@ -0,0 +1,167 @@
## ADDED Requirements

### Requirement: Data Source

The system SHALL use the `DWH.DW_PJ_LOT_V` view as the sole source of WIP data.

#### Scenario: Querying data
- **WHEN** any WIP API is called
- **THEN** the system queries `DWH.DW_PJ_LOT_V` (with the schema prefix)

### Requirement: Overview Summary API

The system SHALL provide `GET /api/wip/overview/summary` returning a KPI summary.

#### Scenario: Fetch the KPI summary
- **WHEN** `GET /api/wip/overview/summary` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": {
    "total_lots": 9073,
    "total_qty": 858878718,
    "hold_lots": 120,
    "hold_qty": 8213395,
    "sys_date": "2026-01-26 19:18:29"
  }
}
```

### Requirement: Overview Matrix API

The system SHALL provide `GET /api/wip/overview/matrix` returning the station × product-line matrix.

#### Scenario: Fetch matrix data
- **WHEN** `GET /api/wip/overview/matrix` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": {
    "workcenters": ["切割", "焊接_DB", ...],
    "packages": ["SOT-23", "SOD-323", ...],
    "matrix": {
      "切割": {"SOT-23": 50200000, "SOD-323": 42100000, ...},
      ...
    },
    "workcenter_totals": {"切割": 234334583, ...},
    "package_totals": {"SOT-23": 172340257, ...},
    "grand_total": 858878718
  }
}
```
- **AND** workcenters are ordered by WORKCENTERSEQUENCE_GROUP
- **AND** packages are ordered by total QTY descending

### Requirement: Overview Hold API

The system SHALL provide `GET /api/wip/overview/hold` returning the Hold summary.

#### Scenario: Fetch the Hold summary
- **WHEN** `GET /api/wip/overview/hold` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": {
    "items": [
      {"reason": "特殊需求管控", "lots": 44, "qty": 4235060},
      {"reason": "YieldLimit", "lots": 21, "qty": 1084443},
      ...
    ]
  }
}
```
- **AND** items are ordered by lot count descending

### Requirement: Detail API

The system SHALL provide `GET /api/wip/detail/{workcenter}` returning per-station detail data.

#### Scenario: Fetch station detail data
- **WHEN** `GET /api/wip/detail/焊接_DB?package=&status=&page=1&page_size=100` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": {
    "workcenter": "焊接_DB",
    "summary": {
      "total_lots": 859,
      "on_equipment_lots": 312,
      "waiting_lots": 547,
      "hold_lots": 15
    },
    "specs": ["Spec1", "Spec2", ...],
    "lots": [
      {
        "lot_id": "GA25102485-A00-004",
        "equipment": "GSMP-0054",
        "status": "ACTIVE",
        "hold_reason": null,
        "qty": 750,
        "package": "SOT-23",
        "spec": "鈦昇"
      },
      ...
    ],
    "pagination": {
      "page": 1,
      "page_size": 100,
      "total_count": 859,
      "total_pages": 9
    },
    "sys_date": "2026-01-26 19:18:29"
  }
}
```
- **AND** specs are ordered by SPECSEQUENCE
- **AND** lots are ordered by LOTID
- **AND** the frontend shows qty in the matching spec column (not a standalone column)
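The pagination block above can be computed as sketched below; the field names follow the JSON shape, while the helper name is illustrative:

```python
import math

def pagination(total_count: int, page: int, page_size: int = 100):
    # total_pages rounds up, e.g. 859 rows at 100 per page -> 9 pages
    return {
        "page": page,
        "page_size": page_size,
        "total_count": total_count,
        "total_pages": math.ceil(total_count / page_size),
    }
```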
#### Scenario: Filter by package
- **WHEN** `GET /api/wip/detail/焊接_DB?package=SOT-23` is called
- **THEN** the system returns only lots with PRODUCTLINENAME = 'SOT-23'

#### Scenario: Filter by status
- **WHEN** `GET /api/wip/detail/焊接_DB?status=HOLD` is called
- **THEN** the system returns only lots with STATUS = 'HOLD'

### Requirement: Workcenters Meta API

The system SHALL provide `GET /api/wip/meta/workcenters` returning the station list.

#### Scenario: Fetch the station list
- **WHEN** `GET /api/wip/meta/workcenters` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": [
    {"name": "切割", "lot_count": 1377},
    {"name": "焊接_DB", "lot_count": 859},
    ...
  ]
}
```
- **AND** entries are ordered by WORKCENTERSEQUENCE_GROUP

### Requirement: Packages Meta API

The system SHALL provide `GET /api/wip/meta/packages` returning the package list.

#### Scenario: Fetch the package list
- **WHEN** `GET /api/wip/meta/packages` is called
- **THEN** the system returns JSON:
```json
{
  "success": true,
  "data": [
    {"name": "SOT-23", "lot_count": 2234},
    {"name": "SOD-323", "lot_count": 1392},
    ...
  ]
}
```
- **AND** entries are ordered by lot_count descending
@@ -0,0 +1,66 @@
## ADDED Requirements

### Requirement: WIP Detail Dashboard Page

The system SHALL provide a `/wip-detail` route showing the production-line WIP detail dashboard for a single station.

#### Scenario: Load the Detail page
- **WHEN** the user visits `/wip-detail?workcenter=焊接_DB`
- **THEN** the system shows detailed WIP for that WORKCENTER_GROUP

#### Scenario: Visit without parameters
- **WHEN** the user visits `/wip-detail` without a workcenter parameter
- **THEN** the system shows the first WORKCENTER_GROUP (ordered by WORKCENTERSEQUENCE_GROUP)

### Requirement: Filters

The system SHALL provide filters: a Package dropdown and a Status dropdown. (No workcenter filter is needed because each WORKCENTER_GROUP has its own Detail page)

#### Scenario: Package filter
- **WHEN** the user selects a specific package
- **THEN** the system shows only lots with that PRODUCTLINENAME

#### Scenario: Status filter
- **WHEN** the user selects a status (All/Active/Hold)
- **THEN** the system filters by the STATUS column

### Requirement: Station Summary Cards

The system SHALL show four summary cards below the filters: total lots, on-equipment lots, waiting lots, and Hold lots.

#### Scenario: Show the station summary
- **WHEN** the Detail page has loaded
- **THEN** the system shows:
  - Total lots: lot count at the station
  - On-equipment lots: count where EQUIPMENTNAME IS NOT NULL
  - Waiting lots: count where EQUIPMENTNAME IS NULL
  - Hold lots: count where STATUS = 'HOLD'

### Requirement: Lot Detail Table

The system SHALL show the station's lot detail table with SPECNAME expanded horizontally and quantities placed in the matching spec column.

#### Scenario: Show lot details
- **WHEN** the Detail page loads data
- **THEN** the system shows a table with:
  - Fixed columns: LOTID, EQUIPMENTNAME (NULL shown as "waiting"), STATUS (HOLD shown in red with HOLDREASONNAME), PRODUCTLINENAME
  - Dynamic columns: all SPECNAMEs at the station (ordered by SPECSEQUENCE)
  - The matching spec column shows the lot's QTY (thousands separators); other spec columns stay empty

#### Scenario: Hold status display
- **WHEN** a lot's STATUS = 'HOLD'
- **THEN** the system shows the status cell in red
- **AND** shows HOLDREASONNAME in parentheses (e.g. "Hold (YieldLimit)")

#### Scenario: Table pagination
- **WHEN** there are more than 100 lots
- **THEN** the system shows pagination controls
- **AND** shows 100 rows per page

### Requirement: Data Update Time Display

The system SHALL show the data's last update time (from the SYS_DATE column).

#### Scenario: Show the update time
- **WHEN** the data has loaded
- **THEN** the system shows "Last updated: YYYY-MM-DD HH:MM:SS" in the top-right corner
@@ -0,0 +1,57 @@
## ADDED Requirements

### Requirement: WIP Overview Dashboard page

The system SHALL provide a `/wip-overview` route that displays an executive-level WIP overview dashboard.

#### Scenario: Load the Overview page

- **WHEN** the user visits `/wip-overview`
- **THEN** the system displays the WIP Overview Dashboard, including the KPI summary, the workstation × product-line matrix, and the Hold summary

### Requirement: KPI summary cards

The system SHALL display 4 KPI summary cards at the top of the page: total Lots, total QTY, Hold Lots, and Hold QTY.

#### Scenario: Display the KPI summary

- **WHEN** the Overview page finishes loading
- **THEN** the system displays:
  - Total Lots count
  - Total QTY (thousands-separated)
  - Hold Lots count (highlighted in red)
  - Hold QTY (highlighted in red)

### Requirement: Workstation × product-line matrix

The system SHALL display a QTY matrix table of WORKCENTER_GROUP × PRODUCTLINENAME.

#### Scenario: Display the matrix table

- **WHEN** the Overview page loads data
- **THEN** the system displays the matrix table:
  - Vertical axis: WORKCENTER_GROUP (ordered by WORKCENTERSEQUENCE_GROUP)
  - Horizontal axis: PRODUCTLINENAME (sorted by quantity, showing the top N major Packages)
  - Cells: QTY (thousands-separated)
  - Rightmost column: the workstation's total QTY
  - Bottom row: the Package's total QTY
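The matrix with its row and column totals reduces to one aggregation pass over the lot records. The sketch below illustrates the idea with plain dictionaries; it is an assumption about shape, not the actual query, which the spec says runs against DWH.DW_PJ_LOT_V.

```python
from collections import defaultdict

def build_matrix(rows):
    """Aggregate QTY into a WORKCENTER_GROUP x PRODUCTLINENAME matrix,
    accumulating per-workstation and per-package totals in the same pass."""
    cells = defaultdict(int)       # (workcenter, productline) -> QTY
    row_totals = defaultdict(int)  # workcenter -> total QTY (rightmost column)
    col_totals = defaultdict(int)  # productline -> total QTY (bottom row)
    for r in rows:
        wc, pl, qty = r["WORKCENTER_GROUP"], r["PRODUCTLINENAME"], r["QTY"]
        cells[(wc, pl)] += qty
        row_totals[wc] += qty
        col_totals[pl] += qty
    return cells, row_totals, col_totals
```

Sorting `col_totals` descending and truncating gives the "top N major Packages" axis the scenario describes.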
#### Scenario: Click-through to a workstation

- **WHEN** the user clicks a WORKCENTER_GROUP row in the matrix
- **THEN** the system navigates to `/wip-detail?workcenter={WORKCENTER_GROUP}`

### Requirement: Hold summary table

The system SHALL display a Hold reason distribution table.

#### Scenario: Display the Hold summary

- **WHEN** the Overview page loads data
- **THEN** the system displays the Hold summary table:
  - Columns: HOLDREASONNAME, Lots count, QTY
  - Sorted by Lots count, descending
  - Only rows that actually have holds are shown

### Requirement: Data refresh timestamp

The system SHALL display the data's last update time (from the SYS_DATE column).

#### Scenario: Display the update time

- **WHEN** data finishes loading
- **THEN** the system displays "Last updated: YYYY-MM-DD HH:MM:SS" in the top-right corner of the page
@@ -0,0 +1,83 @@
## 1. Clean up legacy code

- [x] 1.1 Delete `src/mes_dashboard/templates/wip_report.html`
- [x] 1.2 Keep `src/mes_dashboard/config/workcenter_groups.py` (used by dashboard_service.py)
- [x] 1.3 Rewrite `src/mes_dashboard/services/wip_service.py` (using DWH.DW_PJ_LOT_V)
- [x] 1.4 Rewrite `src/mes_dashboard/routes/wip_routes.py` (new API endpoints)

## 2. Backend data service

- [x] 2.1 Build the new `wip_service.py` - base query functions (connecting to `DWH.DW_PJ_LOT_V`)
- [x] 2.2 Implement `get_wip_summary()` - KPI summary query
- [x] 2.3 Implement `get_wip_matrix()` - workstation × product-line matrix query
- [x] 2.4 Implement `get_wip_hold_summary()` - Hold summary query
- [x] 2.5 Implement `get_wip_detail()` - workstation detail query (with pagination and filters)
- [x] 2.6 Implement `get_workcenters()` - workstation list query
- [x] 2.7 Implement `get_packages()` - Package list query

## 3. Backend API routes

- [x] 3.1 Create the `routes/wip.py` Blueprint
- [x] 3.2 Implement `GET /api/wip/overview/summary`
- [x] 3.3 Implement `GET /api/wip/overview/matrix`
- [x] 3.4 Implement `GET /api/wip/overview/hold`
- [x] 3.5 Implement `GET /api/wip/detail/<workcenter>`
- [x] 3.6 Implement `GET /api/wip/meta/workcenters`
- [x] 3.7 Implement `GET /api/wip/meta/packages`
- [x] 3.8 Register the Blueprint in the app factory (reusing the existing register_routes)

## 4. Frontend - WIP Overview page

- [x] 4.1 Create the `templates/wip_overview.html` base structure
- [x] 4.2 Implement the KPI summary card section
- [x] 4.3 Implement the workstation × product-line matrix table
- [x] 4.4 Implement matrix data loading and rendering
- [x] 4.5 Implement click-through from a workstation to its Detail page
- [x] 4.6 Implement the Hold summary table
- [x] 4.7 Implement the update-time display

## 5. Frontend - WIP Detail page

- [x] 5.1 Create the `templates/wip_detail.html` base structure
- [x] 5.2 Implement the filter section (Package/Status dropdowns)
- [x] 5.3 Implement the workstation summary cards (total/on-machine/waiting/Hold)
- [x] 5.4 Implement the lot detail table (fixed columns: Lot ID/equipment/status/Package + dynamic Spec columns)
- [x] 5.5 Implement horizontal Spec expansion (ordered by SPECSEQUENCE; QTY shown under the matching Spec)
- [x] 5.6 Implement red Hold highlighting with the Hold Reason display
- [x] 5.7 Implement pagination
- [x] 5.8 Wire the filters to data loading

## 6. Frontend - auto-refresh mechanism

- [x] 6.1 Embed the auto-refresh JavaScript in each page (not a standalone module)
- [x] 6.2 Implement a 10-minute refresh timer (setInterval)
- [x] 6.3 Implement partial DOM updates (avoiding full-page re-render)
- [x] 6.4 Implement CSS transitions for value changes
- [x] 6.5 Implement a subtle loading indicator (spinner + success/error indication)
- [x] 6.6 Implement error handling (keep existing data, show an error indicator)
- [x] 6.7 Implement a manual refresh button
- [x] 6.8 Implement page-visibility handling (visibilitychange event)

## 7. Styling

- [x] 7.1 Design the Overview page CSS
- [x] 7.2 Design the Detail page CSS
- [x] 7.3 Design the responsive layout (RWD)
- [x] 7.4 Design the transition/animation effects

## 8. Page routing integration

- [x] 8.1 Add the `/wip-overview` page route
- [x] 8.2 Add the `/wip-detail` page route
- [x] 8.3 Update the navigation menu links
- [x] 8.4 Remove the old `/wip` route

## 9. Testing and verification

- [x] 9.1 Verify the backend API functions work correctly
- [x] 9.2 Test Overview page loading and data display
- [x] 9.3 Test Detail page loading and filtering
- [x] 9.4 Test the auto-refresh mechanism
- [x] 9.5 Test manual refresh
- [x] 9.6 Test page-visibility handling
- [x] 9.7 Test graceful degradation on API errors
@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-01-26
@@ -0,0 +1,101 @@
## Context

The WIP Dashboard has completed its base rebuild (wip-dashboard-rebuild) using `DWH.DW_PJ_LOT_V` as the data source. It currently offers Package and Status filters but no search over WORKORDER or LOT ID. The data also contains DUMMY lots that skew the statistics.

Existing architecture:
- Backend: `wip_service.py` provides the query functions; `wip_routes.py` provides the API endpoints
- Frontend: `wip_overview.html` and `wip_detail.html` use vanilla JavaScript

## Goals / Non-Goals

**Goals:**
- Exclude DUMMY lots by default to improve data accuracy
- Provide fuzzy search over WORKORDER and LOT ID
- Use autocomplete dropdowns to improve the user experience
- Keep the existing auto-refresh mechanism working

**Non-Goals:**
- No full-blown full-text search engine
- No persistence of user filter preferences (session/cookie)
- No database schema changes or new indexes

## Decisions

### 1. DUMMY exclusion strategy

**Decision**: add a `LOTID NOT LIKE '%DUMMY%'` condition to the backend SQL queries

**Rationale**:
- Simple and direct; no frontend cooperation needed
- Centralized handling, ensuring consistency across all APIs
- DUMMY data can still be inspected by overriding with the `include_dummy=true` parameter

**Alternative**: frontend filtering → poor performance; unsuitable at this data volume
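Centralizing the exclusion in one helper keeps every query composing it the same way. The sketch below illustrates the decision above; the helper name and base query are assumptions, not the project's actual code.

```python
def dummy_filter(include_dummy: bool = False) -> str:
    """SQL fragment that excludes DUMMY lots by default.

    Appended to every WIP query's WHERE clause; the API's
    include_dummy=true override disables it.
    """
    if include_dummy:
        return ""
    return " AND LOTID NOT LIKE '%DUMMY%'"

# Hypothetical base query, for illustration only:
base_sql = "SELECT LOTID, QTY FROM DWH.DW_PJ_LOT_V WHERE QTY > 0"
sql = base_sql + dummy_filter()  # default: DUMMY lots excluded
```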
### 2. Fuzzy search implementation

**Decision**: use SQL `LIKE '%keyword%'` driven by API parameters

**Rationale**:
- No extra dependency (such as Elasticsearch)
- The data volume (about 9,000 lots) is within acceptable range
- Simple to implement, low maintenance cost

**Alternatives**:
- Oracle Text full-text search → over-engineered
- Frontend fuzzy matching → requires loading all data; poor performance
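A safe rendering of this decision binds the keyword as a parameter and escapes the LIKE wildcards, so user input containing `%` or `_` matches literally. The function and SQL below are an illustrative sketch (Oracle-style syntax assumed), not the actual `wip_service.py` code.

```python
def like_pattern(q: str) -> str:
    """Escape LIKE wildcards in user input, then wrap for substring match."""
    escaped = q.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
    return f"%{escaped}%"

# The pattern is bound as :pattern (never string-interpolated):
SEARCH_SQL = (
    "SELECT DISTINCT WORKORDER FROM DWH.DW_PJ_LOT_V "
    "WHERE WORKORDER LIKE :pattern ESCAPE '\\' "
    "AND LOTID NOT LIKE '%DUMMY%' "
    "FETCH FIRST :limit ROWS ONLY"
)
```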
### 3. Dropdown data-loading strategy

**Decision**: load options on search, rather than preloading everything

**API design**:

```
GET /api/wip/meta/search?type=workorder&q=GA26&limit=20
GET /api/wip/meta/search?type=lotid&q=GA26011&limit=20
```

**Rationale**:
- WORKORDER and LOTID counts are large; preloading all of them is impractical
- Search fires after the user types 2-3 characters and returns the top 20 matches
- Reduces initial load time and memory use

**Alternative**: preload everything → with 9,000+ LOTIDs, a heavy memory burden on the browser
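The endpoint's validation rules (minimum 2 characters, capped limit, a fixed set of types) can be kept framework-agnostic, which also makes them unit-testable. Below is a hedged sketch of that logic; the real route in `wip_routes.py` presumably wraps something equivalent around the database search.

```python
def handle_search(params, search_fn, max_limit=20):
    """Validate search params and dispatch to a search callable.

    params: dict-like query args (type, q, limit).
    search_fn: callable(kind, q, limit) -> list of matching strings.
    Returns (status_code, payload) as the route would serialize them.
    """
    kind = params.get("type", "")
    q = (params.get("q") or "").strip()
    if kind not in ("workorder", "lotid"):
        return 400, {"error": "type must be 'workorder' or 'lotid'"}
    if len(q) < 2:
        # Spec: fewer than 2 characters returns an empty items array
        return 200, {"items": []}
    try:
        limit = min(int(params.get("limit", max_limit)), max_limit)
    except ValueError:
        limit = max_limit
    return 200, {"items": search_fn(kind, q, limit)}
```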
### 4. Frontend autocomplete implementation

**Decision**: use the HTML5 `<datalist>` element with custom JavaScript

**Rationale**:
- No third-party library needed (e.g. Select2, Choices.js)
- Consistent with the existing architecture (vanilla JS)
- Native browser support, good performance

**Alternative**: a third-party library → extra dependency and larger bundle size
### 5. Filter UI layout

**WIP Overview (matrix page)**:
- Add the WORKORDER and LOTID search boxes above the existing filter area
- After filters apply, the matrix and summaries update together

**WIP Detail**:
- Add the WORKORDER and LOTID search boxes next to the existing Package/Status filters
- Keep the existing filter logic; the new conditions combine with AND

## Risks / Trade-offs

| Risk | Mitigation |
|------|------------|
| SQL LIKE performance | Cap result count (limit=20); require at least 2 input characters |
| `datalist` browser compatibility | Target browsers (Chrome/Edge) support it well; IE is out of scope |
| Search latency hurting the experience | 300 ms debounce; show a loading indicator |
| DUMMY exclusion breaking niche use cases | Provide the `include_dummy` parameter for advanced use |

## API change summary

| Endpoint | Change |
|------|------|
| `/api/wip/overview/*` | Added `workorder`, `lotid`, `include_dummy` parameters |
| `/api/wip/detail/<wc>` | Added `workorder`, `lotid`, `include_dummy` parameters |
| `/api/wip/meta/search` | **New** - fuzzy search for WORKORDER/LOTID |
@@ -0,0 +1,28 @@
## Why

The WIP Dashboard currently lacks an effective filtering mechanism: users cannot quickly locate a specific Lot or Work Order. The data also contains DUMMY lots (used for testing) that distort analysis of real production data. Filtering needs to be enhanced to improve both usability and data accuracy.

## What Changes

- Exclude records whose LOT ID contains "DUMMY" by default (applies to all WIP queries)
- Add a WORKORDER filter with fuzzy search and a dropdown
- Add a LOT ID filter with fuzzy search and a dropdown
- Preload option lists on page load for the dropdowns
- Provide input boxes so users can fuzzy-search for a specific LOT or WORKORDER quickly

## Capabilities

### New Capabilities

- `wip-advanced-filter`: advanced WIP filtering - WORKORDER and LOT ID fuzzy search, default DUMMY exclusion, autocomplete dropdowns

### Modified Capabilities

- `wip-service`: extend the existing WIP query service with DUMMY exclusion logic and the new filter parameters

## Impact

- **Backend service**: `wip_service.py` - add DUMMY exclusion to all query functions; add the `get_workorders()` and `get_lot_ids()` APIs
- **Backend routes**: `wip_routes.py` - add the meta API endpoint
- **Frontend pages**: `wip_overview.html`, `wip_detail.html` - add the filter UI and an autocomplete component
- **Database**: no schema changes; query logic only
@@ -0,0 +1,109 @@
## ADDED Requirements

### Requirement: DUMMY lots excluded by default

The system SHALL, by default, exclude records whose LOTID contains the string "DUMMY", across all WIP query endpoints.

#### Scenario: Default query excludes DUMMY

- **WHEN** the user loads the WIP Overview or Detail page without specifying the include_dummy parameter
- **THEN** the returned data contains no records whose LOTID contains "DUMMY"

#### Scenario: Explicitly include DUMMY

- **WHEN** the API request includes the parameter `include_dummy=true`
- **THEN** the returned data contains all records (including DUMMY lots)

---

### Requirement: WORKORDER fuzzy search

The system SHALL provide WORKORDER fuzzy search, letting users find matching Work Orders from a partial string.

#### Scenario: Search WORKORDER

- **WHEN** the user types "GA26" in the WORKORDER search box and triggers a search
- **THEN** the system returns the first 20 distinct results whose WORKORDER contains "GA26"

#### Scenario: Minimum input length

- **WHEN** the user types fewer than 2 characters
- **THEN** no search is triggered and the dropdown stays empty

#### Scenario: Apply the WORKORDER filter

- **WHEN** the user selects a specific WORKORDER and applies the filter
- **THEN** the page shows only the lot data for that WORKORDER

---

### Requirement: LOT ID fuzzy search

The system SHALL provide LOT ID fuzzy search, letting users find matching Lots from a partial string.

#### Scenario: Search LOT ID

- **WHEN** the user types "GA26011" in the LOT ID search box and triggers a search
- **THEN** the system returns the first 20 results whose LOTID contains "GA26011"

#### Scenario: Apply the LOT ID filter

- **WHEN** the user selects a specific LOT ID and applies the filter
- **THEN** the page shows only that lot's details

---

### Requirement: Autocomplete dropdown

The system SHALL provide an autocomplete dropdown that shows search results for the user to pick from.

#### Scenario: Show search results

- **WHEN** the search API returns results
- **THEN** the dropdown shows the matching options, at most 20

#### Scenario: Pick an option

- **WHEN** the user clicks an option in the dropdown
- **THEN** that value is filled into the search box and can be used for subsequent filtering

#### Scenario: Input debouncing

- **WHEN** the user types characters in rapid succession
- **THEN** the system waits for 300 ms of no new input before calling the search API

---

### Requirement: Search API endpoint

The system SHALL provide a unified search API endpoint for the frontend to query WORKORDER and LOT ID.

#### Scenario: Query the WORKORDER list

- **WHEN** the frontend requests `GET /api/wip/meta/search?type=workorder&q=GA26&limit=20`
- **THEN** the system returns the WORKORDER list as JSON, with an `items` array

#### Scenario: Query the LOT ID list

- **WHEN** the frontend requests `GET /api/wip/meta/search?type=lotid&q=GA26011&limit=20`
- **THEN** the system returns the LOT ID list as JSON, with an `items` array

#### Scenario: Empty query string

- **WHEN** the frontend calls the search API with `q` empty or shorter than 2 characters
- **THEN** the system returns an empty `items` array

---

### Requirement: WIP Overview filter integration

The WIP Overview page SHALL integrate the WORKORDER and LOT ID filters.

#### Scenario: Filters shown on the page

- **WHEN** the user loads the WIP Overview page
- **THEN** the page shows the WORKORDER and LOT ID search boxes

#### Scenario: Filters affect all sections

- **WHEN** the user applies a WORKORDER or LOT ID filter
- **THEN** the KPI summary, matrix table, and Hold summary all update according to the filter

---

### Requirement: WIP Detail filter integration

The WIP Detail page SHALL integrate the WORKORDER and LOT ID filters.

#### Scenario: Filters shown on the page

- **WHEN** the user loads the WIP Detail page
- **THEN** the page shows the WORKORDER and LOT ID search boxes alongside the existing Package/Status filters

#### Scenario: Combined filter conditions

- **WHEN** the user sets Package, Status, and WORKORDER filters at the same time
- **THEN** the system combines all conditions with AND logic and returns only data matching all of them
@@ -0,0 +1,57 @@
## 1. Backend - DUMMY exclusion logic

- [x] 1.1 Add the DUMMY exclusion condition to `get_wip_summary()`
- [x] 1.2 Add the DUMMY exclusion condition to `get_wip_matrix()`
- [x] 1.3 Add the DUMMY exclusion condition to `get_wip_hold_summary()`
- [x] 1.4 Add the DUMMY exclusion condition to `get_wip_detail()`
- [x] 1.5 Add the DUMMY exclusion condition to `get_workcenters()`
- [x] 1.6 Add the DUMMY exclusion condition to `get_packages()`
- [x] 1.7 Add `include_dummy` parameter support (optional override of the default)

## 2. Backend - search API

- [x] 2.1 Implement the `search_workorders(q, limit)` function
- [x] 2.2 Implement the `search_lot_ids(q, limit)` function
- [x] 2.3 Add the `GET /api/wip/meta/search` route
- [x] 2.4 Add input validation (minimum 2 characters, limit cap)

## 3. Backend - filter parameter extensions

- [x] 3.1 Extend `get_wip_summary()` with workorder/lotid parameters
- [x] 3.2 Extend `get_wip_matrix()` with workorder/lotid parameters
- [x] 3.3 Extend `get_wip_hold_summary()` with workorder/lotid parameters
- [x] 3.4 Extend `get_wip_detail()` with workorder/lotid parameters
- [x] 3.5 Update the corresponding API routes to accept the new parameters

## 4. Frontend - autocomplete component

- [x] 4.1 Implement the autocomplete search-box JavaScript
- [x] 4.2 Implement the debounce mechanism (300 ms)
- [x] 4.3 Implement the dropdown UI (datalist or custom)
- [x] 4.4 Implement the loading indicator

## 5. Frontend - WIP Overview integration

- [x] 5.1 Add the WORKORDER search-box HTML
- [x] 5.2 Add the LOT ID search-box HTML
- [x] 5.3 Wire the filters into the data-loading logic
- [x] 5.4 Update the KPI/matrix/Hold refresh functions to accept the new filter parameters
- [x] 5.5 Add a clear-filters button

## 6. Frontend - WIP Detail integration

- [x] 6.1 Add the WORKORDER search-box HTML
- [x] 6.2 Add the LOT ID search-box HTML
- [x] 6.3 Combine the new filters with the existing Package/Status filter logic
- [x] 6.4 Update the data-loading functions to accept the new filter parameters
- [x] 6.5 Extend the clear-filters feature to cover the new filters

## 7. Testing and verification

- [x] 7.1 Verify DUMMY exclusion works across all APIs (unit tests)
- [x] 7.2 Verify the search API returns correct results (unit tests)
- [x] 7.3 Test Overview page filtering (manual)
- [x] 7.4 Test Detail page filtering (manual)
- [x] 7.5 Test combined multi-filter conditions (unit tests)
- [x] 7.6 Test autocomplete debouncing (manual)
- [x] 7.7 Write unit tests for the search functions
openspec/specs/wip-advanced-filter/spec.md (new file, 109 lines)
@@ -0,0 +1,109 @@
## ADDED Requirements

### Requirement: DUMMY lots excluded by default

The system SHALL, by default, exclude records whose LOTID contains the string "DUMMY", across all WIP query endpoints.

#### Scenario: Default query excludes DUMMY

- **WHEN** the user loads the WIP Overview or Detail page without specifying the include_dummy parameter
- **THEN** the returned data contains no records whose LOTID contains "DUMMY"

#### Scenario: Explicitly include DUMMY

- **WHEN** the API request includes the parameter `include_dummy=true`
- **THEN** the returned data contains all records (including DUMMY lots)

---

### Requirement: WORKORDER fuzzy search

The system SHALL provide WORKORDER fuzzy search, letting users find matching Work Orders from a partial string.

#### Scenario: Search WORKORDER

- **WHEN** the user types "GA26" in the WORKORDER search box and triggers a search
- **THEN** the system returns the first 20 distinct results whose WORKORDER contains "GA26"

#### Scenario: Minimum input length

- **WHEN** the user types fewer than 2 characters
- **THEN** no search is triggered and the dropdown stays empty

#### Scenario: Apply the WORKORDER filter

- **WHEN** the user selects a specific WORKORDER and applies the filter
- **THEN** the page shows only the lot data for that WORKORDER

---

### Requirement: LOT ID fuzzy search

The system SHALL provide LOT ID fuzzy search, letting users find matching Lots from a partial string.

#### Scenario: Search LOT ID

- **WHEN** the user types "GA26011" in the LOT ID search box and triggers a search
- **THEN** the system returns the first 20 results whose LOTID contains "GA26011"

#### Scenario: Apply the LOT ID filter

- **WHEN** the user selects a specific LOT ID and applies the filter
- **THEN** the page shows only that lot's details

---

### Requirement: Autocomplete dropdown

The system SHALL provide an autocomplete dropdown that shows search results for the user to pick from.

#### Scenario: Show search results

- **WHEN** the search API returns results
- **THEN** the dropdown shows the matching options, at most 20

#### Scenario: Pick an option

- **WHEN** the user clicks an option in the dropdown
- **THEN** that value is filled into the search box and can be used for subsequent filtering

#### Scenario: Input debouncing

- **WHEN** the user types characters in rapid succession
- **THEN** the system waits for 300 ms of no new input before calling the search API

---

### Requirement: Search API endpoint

The system SHALL provide a unified search API endpoint for the frontend to query WORKORDER and LOT ID.

#### Scenario: Query the WORKORDER list

- **WHEN** the frontend requests `GET /api/wip/meta/search?type=workorder&q=GA26&limit=20`
- **THEN** the system returns the WORKORDER list as JSON, with an `items` array

#### Scenario: Query the LOT ID list

- **WHEN** the frontend requests `GET /api/wip/meta/search?type=lotid&q=GA26011&limit=20`
- **THEN** the system returns the LOT ID list as JSON, with an `items` array

#### Scenario: Empty query string

- **WHEN** the frontend calls the search API with `q` empty or shorter than 2 characters
- **THEN** the system returns an empty `items` array

---

### Requirement: WIP Overview filter integration

The WIP Overview page SHALL integrate the WORKORDER and LOT ID filters.

#### Scenario: Filters shown on the page

- **WHEN** the user loads the WIP Overview page
- **THEN** the page shows the WORKORDER and LOT ID search boxes

#### Scenario: Filters affect all sections

- **WHEN** the user applies a WORKORDER or LOT ID filter
- **THEN** the KPI summary, matrix table, and Hold summary all update according to the filter

---

### Requirement: WIP Detail filter integration

The WIP Detail page SHALL integrate the WORKORDER and LOT ID filters.

#### Scenario: Filters shown on the page

- **WHEN** the user loads the WIP Detail page
- **THEN** the page shows the WORKORDER and LOT ID search boxes alongside the existing Package/Status filters

#### Scenario: Combined filter conditions

- **WHEN** the user sets Package, Status, and WORKORDER filters at the same time
- **THEN** the system combines all conditions with AND logic and returns only data matching all of them
@@ -1,8 +1,447 @@
#!/usr/bin/env bash
set -euo pipefail
#
# MES Dashboard Server Management Script
# Usage: ./start_server.sh [start|stop|restart|status|logs]
#
set -uo pipefail

# ============================================================
# Configuration
# ============================================================
ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
export PYTHONPATH="${ROOT}/src:${PYTHONPATH:-}"
CONDA_ENV="mes-dashboard"
APP_NAME="mes-dashboard"
PID_FILE="${ROOT}/tmp/gunicorn.pid"
LOG_DIR="${ROOT}/logs"
ACCESS_LOG="${LOG_DIR}/access.log"
ERROR_LOG="${LOG_DIR}/error.log"
STARTUP_LOG="${LOG_DIR}/startup.log"
DEFAULT_PORT="${GUNICORN_BIND:-0.0.0.0:8080}"
PORT=$(echo "$DEFAULT_PORT" | cut -d: -f2)

cd "$ROOT"
exec gunicorn --config gunicorn.conf.py "mes_dashboard:create_app()"
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# ============================================================
# Helper Functions
# ============================================================
log_info() {
    echo -e "${BLUE}[INFO]${NC} $1"
}

log_success() {
    echo -e "${GREEN}[OK]${NC} $1"
}

log_warn() {
    echo -e "${YELLOW}[WARN]${NC} $1"
}

log_error() {
    echo -e "${RED}[ERROR]${NC} $1"
}

timestamp() {
    date '+%Y-%m-%d %H:%M:%S'
}

# ============================================================
# Environment Check Functions
# ============================================================
check_conda() {
    if ! command -v conda &> /dev/null; then
        log_error "Conda not found. Please install Miniconda/Anaconda."
        return 1
    fi

    # Source conda
    source "$(conda info --base)/etc/profile.d/conda.sh"

    # Check if environment exists
    if ! conda env list | grep -q "^${CONDA_ENV} "; then
        log_error "Conda environment '${CONDA_ENV}' not found."
        log_info "Create it with: conda create -n ${CONDA_ENV} python=3.11"
        return 1
    fi

    log_success "Conda environment '${CONDA_ENV}' found"
    return 0
}

check_dependencies() {
    conda activate "$CONDA_ENV"

    local missing=()

    # Check critical packages
    python -c "import flask" 2>/dev/null || missing+=("flask")
    python -c "import gunicorn" 2>/dev/null || missing+=("gunicorn")
    python -c "import pandas" 2>/dev/null || missing+=("pandas")
    python -c "import oracledb" 2>/dev/null || missing+=("oracledb")

    if [ ${#missing[@]} -gt 0 ]; then
        log_error "Missing dependencies: ${missing[*]}"
        log_info "Install with: pip install ${missing[*]}"
        return 1
    fi

    log_success "All dependencies installed"
    return 0
}

check_env_file() {
    if [ ! -f "${ROOT}/.env" ]; then
        if [ -f "${ROOT}/.env.example" ]; then
            log_warn ".env file not found, but .env.example exists"
            log_info "Copy and configure: cp .env.example .env"
        else
            log_warn ".env file not found (optional but recommended)"
        fi
        return 0
    fi

    log_success ".env file found"
    return 0
}

check_port() {
    if lsof -i ":${PORT}" -sTCP:LISTEN &>/dev/null; then
        local pid=$(lsof -t -i ":${PORT}" -sTCP:LISTEN 2>/dev/null | head -1)
        log_error "Port ${PORT} is already in use (PID: ${pid})"
        log_info "Stop the existing process or change GUNICORN_BIND"
        return 1
    fi

    log_success "Port ${PORT} is available"
    return 0
}

check_database() {
    conda activate "$CONDA_ENV"
    export PYTHONPATH="${ROOT}/src:${PYTHONPATH:-}"

    if python -c "
from sqlalchemy import text
from mes_dashboard.core.database import get_engine
engine = get_engine()
with engine.connect() as conn:
    conn.execute(text('SELECT 1 FROM DUAL'))
" 2>/dev/null; then
        log_success "Database connection OK"
        return 0
    else
        log_warn "Database connection failed (service may still start)"
        return 0 # Non-fatal, allow startup
    fi
}

run_all_checks() {
    log_info "Running environment checks..."
    echo ""

    check_conda || return 1
    check_dependencies || return 1
    check_env_file
    check_port || return 1
    check_database

    echo ""
    log_success "All checks passed"
    return 0
}

# ============================================================
# Service Management Functions
# ============================================================
ensure_dirs() {
    mkdir -p "${LOG_DIR}"
    mkdir -p "${ROOT}/tmp"
}

get_pid() {
    if [ -f "$PID_FILE" ]; then
        local pid=$(cat "$PID_FILE" 2>/dev/null)
        if [ -n "$pid" ] && kill -0 "$pid" 2>/dev/null; then
            echo "$pid"
            return 0
        fi
    fi

    # Fallback: find by port
    local pid=$(lsof -t -i ":${PORT}" -sTCP:LISTEN 2>/dev/null | head -1)
    if [ -n "$pid" ]; then
        echo "$pid"
        return 0
    fi

    return 1
}

is_running() {
    get_pid &>/dev/null
}

do_start() {
    local foreground=false

    if [ "${1:-}" = "-f" ] || [ "${1:-}" = "--foreground" ]; then
        foreground=true
    fi

    if is_running; then
        local pid=$(get_pid)
        log_warn "Server is already running (PID: ${pid})"
        return 1
    fi

    # Run checks
    run_all_checks || return 1

    echo ""
    log_info "Starting ${APP_NAME} server..."

    ensure_dirs
    conda activate "$CONDA_ENV"
    export PYTHONPATH="${ROOT}/src:${PYTHONPATH:-}"
    cd "$ROOT"

    # Log startup
    echo "[$(timestamp)] Starting server" >> "$STARTUP_LOG"

    if [ "$foreground" = true ]; then
        log_info "Running in foreground mode (Ctrl+C to stop)"
        exec gunicorn \
            --config gunicorn.conf.py \
            --pid "$PID_FILE" \
            --access-logfile "$ACCESS_LOG" \
            --error-logfile "$ERROR_LOG" \
            --capture-output \
            "mes_dashboard:create_app()"
    else
        gunicorn \
            --config gunicorn.conf.py \
            --pid "$PID_FILE" \
            --access-logfile "$ACCESS_LOG" \
            --error-logfile "$ERROR_LOG" \
            --capture-output \
            --daemon \
            "mes_dashboard:create_app()"

        sleep 1

        if is_running; then
            local pid=$(get_pid)
            log_success "Server started successfully (PID: ${pid})"
            log_info "Access URL: http://localhost:${PORT}"
            log_info "Logs: ${LOG_DIR}/"
            echo "[$(timestamp)] Server started (PID: ${pid})" >> "$STARTUP_LOG"
        else
            log_error "Failed to start server"
            log_info "Check error log: ${ERROR_LOG}"
            echo "[$(timestamp)] Server start failed" >> "$STARTUP_LOG"
            return 1
        fi
    fi
}

do_stop() {
    if ! is_running; then
        log_warn "Server is not running"
        return 0
    fi

    local pid=$(get_pid)
    log_info "Stopping server (PID: ${pid})..."

    # Find all gunicorn processes (master + workers)
    local all_pids=$(pgrep -f "gunicorn.*mes_dashboard" 2>/dev/null | tr '\n' ' ')

    # Graceful shutdown with SIGTERM
    kill -TERM "$pid" 2>/dev/null

    # Wait for graceful shutdown (max 10 seconds)
    local count=0
    while kill -0 "$pid" 2>/dev/null && [ $count -lt 10 ]; do
        sleep 1
        count=$((count + 1))
        echo -n "."
    done
    echo ""

    # Force kill if still running (including orphaned workers)
    if kill -0 "$pid" 2>/dev/null || [ -n "$(pgrep -f 'gunicorn.*mes_dashboard' 2>/dev/null)" ]; then
        log_warn "Graceful shutdown timeout, forcing..."
        # Kill all gunicorn processes related to mes_dashboard
        pkill -9 -f "gunicorn.*mes_dashboard" 2>/dev/null
        sleep 1
    fi

    # Cleanup PID file
    rm -f "$PID_FILE"

    # Verify all processes are stopped
    if [ -z "$(pgrep -f 'gunicorn.*mes_dashboard' 2>/dev/null)" ]; then
        log_success "Server stopped"
        echo "[$(timestamp)] Server stopped (PID: ${pid})" >> "$STARTUP_LOG"
    else
        log_error "Failed to stop server"
        return 1
    fi
}

do_restart() {
    log_info "Restarting ${APP_NAME} server..."
    do_stop
    sleep 1
    do_start "$@"
}

do_status() {
    echo ""
    echo "=========================================="
    echo " ${APP_NAME} Server Status"
    echo "=========================================="
    echo ""

    if is_running; then
        local pid=$(get_pid)
        echo -e " Status: ${GREEN}RUNNING${NC}"
        echo " PID: ${pid}"
        echo " Port: ${PORT}"
        echo " URL: http://localhost:${PORT}"
        echo ""

        # Show process info
        if command -v ps &>/dev/null; then
            echo " Process Info:"
            ps -p "$pid" -o pid,ppid,%cpu,%mem,etime,cmd --no-headers 2>/dev/null | \
                awk '{printf " PID: %s | CPU: %s%% | MEM: %s%% | Uptime: %s\n", $1, $3, $4, $5}'
        fi

        # Show recent log entries
        if [ -f "$ERROR_LOG" ]; then
            echo ""
            echo " Recent Errors (last 3):"
            tail -3 "$ERROR_LOG" 2>/dev/null | sed 's/^/ /'
        fi
    else
        echo -e " Status: ${RED}STOPPED${NC}"
        echo ""
        echo " Start with: $0 start"
    fi

    echo ""
    echo "=========================================="
}

do_logs() {
    local log_type="${1:-all}"
    local lines="${2:-50}"

    case "$log_type" in
        access)
            if [ -f "$ACCESS_LOG" ]; then
                log_info "Access log (last ${lines} lines):"
                tail -n "$lines" "$ACCESS_LOG"
            else
                log_warn "Access log not found"
            fi
            ;;
        error)
            if [ -f "$ERROR_LOG" ]; then
                log_info "Error log (last ${lines} lines):"
                tail -n "$lines" "$ERROR_LOG"
            else
                log_warn "Error log not found"
            fi
            ;;
        follow)
            log_info "Following logs (Ctrl+C to stop)..."
            tail -f "$ACCESS_LOG" "$ERROR_LOG" 2>/dev/null
            ;;
        *)
            log_info "=== Error Log (last 20 lines) ==="
            tail -20 "$ERROR_LOG" 2>/dev/null || echo "(empty)"
            echo ""
            log_info "=== Access Log (last 20 lines) ==="
            tail -20 "$ACCESS_LOG" 2>/dev/null || echo "(empty)"
            ;;
    esac
}

do_check() {
    run_all_checks
}

show_help() {
    echo ""
    echo "Usage: $0 <command> [options]"
    echo ""
    echo "Commands:"
    echo "  start [-f]     Start the server (-f for foreground mode)"
    echo "  stop           Stop the server gracefully"
    echo "  restart        Restart the server"
    echo "  status         Show server status"
    echo "  logs [type]    View logs (access|error|follow|all)"
    echo "  check          Run environment checks only"
    echo "  help           Show this help message"
    echo ""
    echo "Examples:"
    echo "  $0 start           # Start in background"
    echo "  $0 start -f        # Start in foreground"
    echo "  $0 logs follow     # Follow logs in real-time"
    echo "  $0 logs error 100  # Show last 100 error log lines"
    echo ""
    echo "Environment Variables:"
    echo "  GUNICORN_BIND     Bind address (default: 0.0.0.0:8080)"
    echo "  GUNICORN_WORKERS  Number of workers (default: 1)"
    echo "  GUNICORN_THREADS  Threads per worker (default: 4)"
    echo ""
}

# ============================================================
# Main
# ============================================================
main() {
    local command="${1:-}"
    shift || true

    case "$command" in
        start)
            do_start "$@"
            ;;
        stop)
            do_stop
            ;;
        restart)
            do_restart "$@"
            ;;
        status)
            do_status
            ;;
        logs)
            do_logs "$@"
            ;;
        check)
            do_check
            ;;
        help|--help|-h)
            show_help
            ;;
        "")
            # Default: start in foreground for backward compatibility
            do_start
            ;;
        *)
            log_error "Unknown command: ${command}"
            show_help
            exit 1
            ;;
    esac
}

main "$@"
@@ -8,7 +8,7 @@ from flask import Flask, jsonify, render_template, request
from mes_dashboard.config.tables import TABLES_CONFIG
from mes_dashboard.config.settings import get_config
from mes_dashboard.core.cache import NoOpCache
from mes_dashboard.core.database import get_table_data, get_table_columns, get_engine, init_db
from mes_dashboard.core.database import get_table_data, get_table_columns, get_engine, init_db, start_keepalive
from mes_dashboard.routes import register_routes


@@ -26,6 +26,7 @@ def create_app(config_name: str | None = None) -> Flask:
    init_db(app)
    with app.app_context():
        get_engine()
        start_keepalive()  # Keep database connections alive

    # Register API routes
    register_routes(app)
@@ -44,21 +45,21 @@ def create_app(config_name: str | None = None) -> Flask:
        """Table viewer page."""
        return render_template('index.html', tables_config=TABLES_CONFIG)

    @app.route('/wip')
    def wip_page():
        """WIP report page."""
        return render_template('wip_report.html')
    @app.route('/wip-overview')
    def wip_overview_page():
        """WIP Overview Dashboard - for executives."""
        return render_template('wip_overview.html')

    @app.route('/wip-detail')
    def wip_detail_page():
        """WIP Detail Dashboard - for production lines."""
        return render_template('wip_detail.html')

    @app.route('/resource')
    def resource_page():
        """Resource status report page."""
        return render_template('resource_status.html')

    @app.route('/wip-overview')
    def wip_overview_page():
        """WIP overview dashboard page."""
        return render_template('wip_overview.html')

    @app.route('/excel-query')
    def excel_query_page():
        """Excel batch query tool page."""
@@ -31,8 +31,9 @@ class DevelopmentConfig(Config):
    DEBUG = True
    ENV = "development"

    DB_POOL_SIZE = _int_env("DB_POOL_SIZE", 5)
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 10)
    # Smaller pool to ensure keep-alive covers all connections
    DB_POOL_SIZE = _int_env("DB_POOL_SIZE", 2)
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 3)


class ProductionConfig(Config):
@@ -45,9 +46,22 @@ class ProductionConfig(Config):
    DB_MAX_OVERFLOW = _int_env("DB_MAX_OVERFLOW", 20)


class TestingConfig(Config):
    """Testing configuration."""

    DEBUG = True
    TESTING = True
    ENV = "testing"

    DB_POOL_SIZE = 1
    DB_MAX_OVERFLOW = 0


def get_config(env: str | None = None) -> Type[Config]:
    """Select config class based on environment name."""
    value = (env or os.getenv("FLASK_ENV", "development")).lower()
    if value in {"prod", "production"}:
        return ProductionConfig
    if value in {"test", "testing"}:
        return TestingConfig
    return DevelopmentConfig
@@ -1,8 +1,17 @@
# -*- coding: utf-8 -*-
"""Table configuration metadata for MES Dashboard."""

# 16 core tables config (with categories)
# 17 core tables config (with categories)
TABLES_CONFIG = {
    '即時數據表 (DWH)': [
        {
            'name': 'DWH.DW_PJ_LOT_V',
            'display_name': 'WIP 即時批次 (DW_PJ_LOT_V)',
            'row_count': 10000,  # 動態變化,約 9000-12000
            'time_field': 'SYS_DATE',
            'description': 'DWH 即時 WIP View - 每 5 分鐘更新,包含完整批次狀態、工站、設備、Hold 原因等 70 欄位'
        }
    ],
    '現況快照表': [
        {
            'name': 'DW_MES_WIP',
@@ -3,44 +3,44 @@

from __future__ import annotations

from typing import Optional, Dict, Any, Tuple
from typing import Optional, Dict, Any

import oracledb
import pandas as pd
from flask import g, current_app
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

from mes_dashboard.config.database import DB_CONFIG, CONNECTION_STRING
from mes_dashboard.config.settings import DevelopmentConfig

# ============================================================
# SQLAlchemy Engine (Singleton with connection pooling)
# SQLAlchemy Engine (NullPool - no connection pooling)
# ============================================================
# Using NullPool for dashboard applications that need long-term stability.
# Each query creates a new connection and closes it immediately after use.
# This avoids issues with idle connections being dropped by firewalls/NAT.

_ENGINE = None


def _get_pool_settings() -> Tuple[int, int]:
    """Return pool size and max overflow from app config or defaults."""
    try:
        pool_size = current_app.config.get("DB_POOL_SIZE", DevelopmentConfig.DB_POOL_SIZE)
        max_overflow = current_app.config.get("DB_MAX_OVERFLOW", DevelopmentConfig.DB_MAX_OVERFLOW)
    except RuntimeError:
        pool_size = DevelopmentConfig.DB_POOL_SIZE
        max_overflow = DevelopmentConfig.DB_MAX_OVERFLOW
    return pool_size, max_overflow


def get_engine():
    """Get SQLAlchemy engine with connection pooling (singleton pattern)."""
    """Get SQLAlchemy engine without connection pooling.

    Uses NullPool to create fresh connections for each request.
    This is more reliable for long-running dashboard applications
    where idle connections may be dropped by network infrastructure.
    """
    global _ENGINE
    if _ENGINE is None:
        pool_size, max_overflow = _get_pool_settings()
        _ENGINE = create_engine(
            CONNECTION_STRING,
            pool_size=pool_size,
            max_overflow=max_overflow,
            pool_pre_ping=True,
            poolclass=NullPool,  # No connection pooling - fresh connection each time
            connect_args={
                "tcp_connect_timeout": 15,  # TCP connect timeout 15s
                "retry_count": 2,  # Retry twice on connection failure
                "retry_delay": 1,  # 1s delay between retries
            }
        )
    return _ENGINE

@@ -69,6 +69,22 @@ def init_db(app) -> None:
    app.teardown_appcontext(close_db)


# ============================================================
# Keep-Alive (No-op with NullPool)
# ============================================================
# Keep-alive is not needed with NullPool since each query creates
# a fresh connection. These functions are kept for API compatibility.

def start_keepalive():
    """No-op: Keep-alive not needed with NullPool."""
    print("[DB] Using NullPool - no keep-alive needed")


def stop_keepalive():
    """No-op: Keep-alive not needed with NullPool."""
    pass


# ============================================================
# Direct Connection Helpers
# ============================================================
@@ -80,7 +96,12 @@ def get_db_connection():

    Used for operations that need direct cursor access.
    """
    try:
        return oracledb.connect(**DB_CONFIG)
        return oracledb.connect(
            **DB_CONFIG,
            tcp_connect_timeout=10,  # TCP connect timeout 10s
            retry_count=1,  # Retry once on connection failure
            retry_delay=1,  # 1s delay between retries
        )
    except Exception as exc:
        print(f"Database connection failed: {exc}")
        return None
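The `NullPool` behavior this commit switches to can be exercised without an Oracle instance. A small sketch using an in-memory SQLite URL as a stand-in for `CONNECTION_STRING` (the Oracle-specific `connect_args` are omitted, since they only apply to `oracledb`):

```python
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

# Stand-in for the Oracle CONNECTION_STRING: a throwaway in-memory SQLite DB.
engine = create_engine("sqlite://", poolclass=NullPool)

with engine.connect() as conn:
    value = conn.execute(text("SELECT 1")).scalar()

# With NullPool, checkin really closes the DBAPI connection, so no idle
# socket is left around for a firewall or NAT device to silently drop.
status = engine.pool.status()
```

The trade-off is one connection handshake per request, which the 5-minute dashboard refresh cadence can easily absorb.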
@@ -2,116 +2,239 @@
"""WIP API routes for MES Dashboard.

Contains Flask Blueprint for WIP-related API endpoints.
Uses DWH.DW_PJ_LOT_V view for real-time WIP data.
"""

from flask import Blueprint, jsonify, request

from mes_dashboard.services.wip_service import (
    query_wip_summary,
    query_wip_by_spec_workcenter,
    query_wip_by_product_line,
    query_wip_by_status,
    query_wip_by_mfgorder,
    query_wip_distribution_filter_options,
    query_wip_distribution_pivot_columns,
    query_wip_distribution,
from flask import Blueprint, jsonify, request

from mes_dashboard.services.wip_service import (
    get_wip_summary,
    get_wip_matrix,
    get_wip_hold_summary,
    get_wip_detail,
    get_workcenters,
    get_packages,
    search_workorders,
    search_lot_ids,
)

# Create Blueprint
wip_bp = Blueprint('wip', __name__, url_prefix='/api/wip')


@wip_bp.route('/summary')
def api_wip_summary():
    """API: Current WIP summary."""
    summary = query_wip_summary()
    if summary:
        return jsonify({'success': True, 'data': summary})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
def _parse_bool(value: str) -> bool:
    """Parse boolean from query string."""
    return value.lower() in ('true', '1', 'yes') if value else False


@wip_bp.route('/by_spec_workcenter')
def api_wip_by_spec_workcenter():
    """API: Current WIP by spec/workcenter."""
    df = query_wip_by_spec_workcenter()
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data, 'count': len(data)})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
# ============================================================
# Overview APIs
# ============================================================

@wip_bp.route('/overview/summary')
def api_overview_summary():
    """API: Get WIP KPI summary for overview dashboard.

@wip_bp.route('/by_product_line')
def api_wip_by_product_line():
    """API: Current WIP by product line."""
    df = query_wip_by_product_line()
    if df is not None:
        data = df.to_dict(orient='records')
        if not df.empty:
            product_line_summary = df.groupby('PRODUCTLINENAME_LEF').agg({
                'LOT_COUNT': 'sum',
                'TOTAL_QTY': 'sum',
                'TOTAL_QTY2': 'sum'
            }).reset_index()
            summary = product_line_summary.to_dict(orient='records')
        else:
            summary = []
        return jsonify({'success': True, 'data': data, 'summary': summary, 'count': len(data)})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
    Query Parameters:
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with total_lots, total_qty, hold_lots, hold_qty, sys_date
    """
    workorder = request.args.get('workorder', '').strip() or None
    lotid = request.args.get('lotid', '').strip() or None
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

@wip_bp.route('/by_status')
def api_wip_by_status():
    """API: Current WIP by status."""
    df = query_wip_by_status()
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/by_mfgorder')
def api_wip_by_mfgorder():
    """API: Current WIP by mfg order (Top N)."""
    limit = request.args.get('limit', 100, type=int)
    df = query_wip_by_mfgorder(limit)
    if df is not None:
        data = df.to_dict(orient='records')
        return jsonify({'success': True, 'data': data})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution/filter_options')
def api_wip_distribution_filter_options():
    """API: Get WIP distribution filter options."""
    days_back = request.args.get('days_back', 90, type=int)
    options = query_wip_distribution_filter_options(days_back)
    if options:
        return jsonify({'success': True, 'data': options})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution/pivot_columns', methods=['POST'])
def api_wip_distribution_pivot_columns():
    """API: Get WIP distribution pivot columns."""
    data = request.get_json() or {}
    filters = data.get('filters')
    days_back = data.get('days_back', 90)
    columns = query_wip_distribution_pivot_columns(filters, days_back)
    if columns is not None:
        return jsonify({'success': True, 'data': columns})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/distribution', methods=['POST'])
def api_wip_distribution():
    """API: Query WIP distribution main data."""
    data = request.get_json() or {}
    filters = data.get('filters')
    limit = min(data.get('limit', 500), 1000)  # Max 1000 records
    offset = data.get('offset', 0)
    days_back = data.get('days_back', 90)

    result = query_wip_distribution(filters, limit, offset, days_back)
    result = get_wip_summary(
        include_dummy=include_dummy,
        workorder=workorder,
        lotid=lotid
    )
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/overview/matrix')
def api_overview_matrix():
    """API: Get workcenter x product line matrix for overview dashboard.

    Query Parameters:
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with workcenters, packages, matrix, workcenter_totals,
        package_totals, grand_total
    """
    workorder = request.args.get('workorder', '').strip() or None
    lotid = request.args.get('lotid', '').strip() or None
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

    result = get_wip_matrix(
        include_dummy=include_dummy,
        workorder=workorder,
        lotid=lotid
    )
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/overview/hold')
def api_overview_hold():
    """API: Get hold summary grouped by hold reason.

    Query Parameters:
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with items list containing reason, lots, qty
    """
    workorder = request.args.get('workorder', '').strip() or None
    lotid = request.args.get('lotid', '').strip() or None
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

    result = get_wip_hold_summary(
        include_dummy=include_dummy,
        workorder=workorder,
        lotid=lotid
    )
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


# ============================================================
# Detail APIs
# ============================================================

@wip_bp.route('/detail/<workcenter>')
def api_detail(workcenter: str):
    """API: Get WIP detail for a specific workcenter group.

    Args:
        workcenter: WORKCENTER_GROUP name (URL path parameter)

    Query Parameters:
        package: Optional PRODUCTLINENAME filter
        status: Optional STATUS filter ('ACTIVE', 'HOLD')
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)
        include_dummy: Include DUMMY lots (default: false)
        page: Page number (default 1)
        page_size: Records per page (default 100, max 500)

    Returns:
        JSON with workcenter, summary, specs, lots, pagination, sys_date
    """
    package = request.args.get('package', '').strip() or None
    status = request.args.get('status', '').strip() or None
    workorder = request.args.get('workorder', '').strip() or None
    lotid = request.args.get('lotid', '').strip() or None
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))
    page = request.args.get('page', 1, type=int)
    page_size = min(request.args.get('page_size', 100, type=int), 500)

    if page < 1:
        page = 1

    result = get_wip_detail(
        workcenter=workcenter,
        package=package,
        status=status,
        workorder=workorder,
        lotid=lotid,
        include_dummy=include_dummy,
        page=page,
        page_size=page_size
    )

    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


# ============================================================
# Meta APIs
# ============================================================

@wip_bp.route('/meta/workcenters')
def api_meta_workcenters():
    """API: Get list of workcenter groups with lot counts.

    Query Parameters:
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with list of {name, lot_count} sorted by sequence
    """
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

    result = get_workcenters(include_dummy=include_dummy)
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/meta/packages')
def api_meta_packages():
    """API: Get list of packages (product lines) with lot counts.

    Query Parameters:
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with list of {name, lot_count} sorted by count desc
    """
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

    result = get_packages(include_dummy=include_dummy)
    if result is not None:
        return jsonify({'success': True, 'data': result})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500


@wip_bp.route('/meta/search')
def api_meta_search():
    """API: Search for WORKORDER or LOTID values.

    Query Parameters:
        type: Search type ('workorder' or 'lotid')
        q: Search query (minimum 2 characters)
        limit: Maximum results (default: 20, max: 50)
        include_dummy: Include DUMMY lots (default: false)

    Returns:
        JSON with items list containing matching values
    """
    search_type = request.args.get('type', '').strip().lower()
    q = request.args.get('q', '').strip()
    limit = min(request.args.get('limit', 20, type=int), 50)
    include_dummy = _parse_bool(request.args.get('include_dummy', ''))

    # Validate search type
    if search_type not in ('workorder', 'lotid'):
        return jsonify({
            'success': False,
            'error': 'Invalid type. Use "workorder" or "lotid"'
        }), 400

    # Validate query length
    if len(q) < 2:
        return jsonify({'success': True, 'data': {'items': []}})

    # Perform search
    if search_type == 'workorder':
        result = search_workorders(q=q, limit=limit, include_dummy=include_dummy)
    else:
        result = search_lot_ids(q=q, limit=limit, include_dummy=include_dummy)

    if result is not None:
        return jsonify({'success': True, 'data': {'items': result}})
    return jsonify({'success': False, 'error': '查詢失敗'}), 500
@@ -1,464 +1,590 @@
# -*- coding: utf-8 -*-
"""WIP (Work In Progress) query services for MES Dashboard.

Provides functions to query WIP data from DW_MES_WIP table.
Provides functions to query WIP data from DWH.DW_PJ_LOT_V view.
This view provides real-time WIP information updated every 5 minutes.
"""

import pandas as pd
from typing import Optional, Dict, List, Any

from mes_dashboard.core.database import get_db_connection, read_sql_df
from mes_dashboard.config.workcenter_groups import get_workcenter_group
from mes_dashboard.config.constants import DEFAULT_WIP_DAYS_BACK, WIP_EXCLUDED_STATUS
from typing import Optional, Dict, List, Any

import pandas as pd

from mes_dashboard.core.database import read_sql_df


# ============================================================
# WIP Base Subquery
# ============================================================

def get_current_wip_subquery(days_back: int = DEFAULT_WIP_DAYS_BACK) -> str:
    """Returns subquery to get latest record per CONTAINER (current WIP snapshot).

    Uses ROW_NUMBER() analytic function for better performance.
    Only scans recent data (default 90 days) to reduce scan size.
    Filters out completed (8) and scrapped (128) status.
    Excludes DUMMY orders (MFGORDERNAME = 'DUMMY').

    Logic explanation:
    - PARTITION BY CONTAINERNAME: Groups records by each LOT
    - ORDER BY TXNDATE DESC: Orders by transaction time (newest first)
    - rn = 1: Takes only the latest record for each LOT
    - This gives us the current/latest status of each LOT

    Args:
        days_back: Number of days to look back (default 90)

    Returns:
        SQL subquery string for current WIP snapshot.
    """
    excluded_status = ', '.join(str(s) for s in WIP_EXCLUDED_STATUS)
    return f"""
        SELECT *
        FROM (
            SELECT w.*,
                   ROW_NUMBER() OVER (PARTITION BY w.CONTAINERNAME ORDER BY w.TXNDATE DESC) as rn
            FROM DW_MES_WIP w
            WHERE w.TXNDATE >= SYSDATE - {days_back}
              AND w.STATUS NOT IN ({excluded_status})
              AND (w.MFGORDERNAME IS NULL OR w.MFGORDERNAME <> 'DUMMY')
        )
        WHERE rn = 1
    """

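The "latest record per lot" rule documented above (ROW_NUMBER partitioned by CONTAINERNAME, ordered by TXNDATE descending, keep rn = 1) has a direct pandas equivalent, which makes it easy to sanity-check on toy data. Column names follow the subquery; the rows are made up:

```python
import pandas as pd

# Toy WIP history: two transactions for LOT-A, one for LOT-B.
df = pd.DataFrame({
    'CONTAINERNAME': ['LOT-A', 'LOT-A', 'LOT-B'],
    'TXNDATE': pd.to_datetime(['2026-01-01', '2026-01-02', '2026-01-01']),
    'STATUS': [1, 2, 1],
})

# Equivalent of ROW_NUMBER() OVER (PARTITION BY CONTAINERNAME
#                                  ORDER BY TXNDATE DESC) ... WHERE rn = 1:
# sort newest-first, then keep the first row seen per lot.
latest = (df.sort_values('TXNDATE', ascending=False)
            .drop_duplicates('CONTAINERNAME')
            .reset_index(drop=True))
```

Each lot survives exactly once, carrying the status of its most recent transaction.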
# ============================================================
# WIP Summary Queries
# ============================================================

def query_wip_summary(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[Dict]:
    """Query current WIP summary statistics.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with summary stats or None if query fails.
    """
    connection = get_db_connection()
    if not connection:
def _safe_value(val):
    """Convert pandas NaN/NaT to None for JSON serialization."""
    if pd.isna(val):
        return None
    return val


def _escape_sql(value: str) -> str:
    """Escape single quotes in SQL string values."""
    if value is None:
        return None
    return value.replace("'", "''")


def _build_base_conditions(
    include_dummy: bool = False,
    workorder: Optional[str] = None,
    lotid: Optional[str] = None
) -> List[str]:
    """Build base WHERE conditions for WIP queries.

    Args:
        include_dummy: If False (default), exclude LOTID containing 'DUMMY'
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)

    Returns:
        List of SQL condition strings
    """
    conditions = []

    # DUMMY exclusion (default behavior)
    if not include_dummy:
        conditions.append("LOTID NOT LIKE '%DUMMY%'")

    # WORKORDER filter (fuzzy match)
    if workorder:
        conditions.append(f"WORKORDER LIKE '%{_escape_sql(workorder)}%'")

    # LOTID filter (fuzzy match)
    if lotid:
        conditions.append(f"LOTID LIKE '%{_escape_sql(lotid)}%'")

    return conditions


# ============================================================
# Data Source Configuration
# ============================================================
# The view DWH.DW_PJ_LOT_V must be accessed with schema prefix
WIP_VIEW = "DWH.DW_PJ_LOT_V"


# ============================================================
# Overview API Functions
# ============================================================

def get_wip_summary(
    include_dummy: bool = False,
    workorder: Optional[str] = None,
    lotid: Optional[str] = None
) -> Optional[Dict[str, Any]]:
    """Get WIP KPI summary for overview dashboard.

    Args:
        include_dummy: If True, include DUMMY lots (default: False)
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)

    Returns:
        Dict with summary stats:
        - total_lots: Total number of lots
        - total_qty: Total quantity
        - hold_lots: Number of hold lots
        - hold_qty: Hold quantity
        - sys_date: Data timestamp
    """
    try:
        conditions = _build_base_conditions(include_dummy, workorder, lotid)
        where_clause = f"WHERE {' AND '.join(conditions)}" if conditions else ""

        sql = f"""
            SELECT
                COUNT(CONTAINERNAME) as TOTAL_LOT_COUNT,
                COUNT(*) as TOTAL_LOTS,
                SUM(QTY) as TOTAL_QTY,
                SUM(QTY2) as TOTAL_QTY2,
                COUNT(DISTINCT SPECNAME) as SPEC_COUNT,
                COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
                COUNT(DISTINCT PRODUCTLINENAME_LEF) as PRODUCT_LINE_COUNT
            FROM ({get_current_wip_subquery(days_back)}) wip
                SUM(CASE WHEN STATUS = 'HOLD' THEN 1 ELSE 0 END) as HOLD_LOTS,
                SUM(CASE WHEN STATUS = 'HOLD' THEN QTY ELSE 0 END) as HOLD_QTY,
                MAX(SYS_DATE) as SYS_DATE
            FROM {WIP_VIEW}
            {where_clause}
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        result = cursor.fetchone()
        cursor.close()
        connection.close()
        df = read_sql_df(sql)

        if not result:
        if df is None or df.empty:
            return None

        row = df.iloc[0]
        return {
            'total_lot_count': result[0] or 0,
            'total_qty': result[1] or 0,
            'total_qty2': result[2] or 0,
            'spec_count': result[3] or 0,
            'workcenter_count': result[4] or 0,
            'product_line_count': result[5] or 0
            'total_lots': int(row['TOTAL_LOTS'] or 0),
            'total_qty': int(row['TOTAL_QTY'] or 0),
            'hold_lots': int(row['HOLD_LOTS'] or 0),
            'hold_qty': int(row['HOLD_QTY'] or 0),
            'sys_date': str(row['SYS_DATE']) if row['SYS_DATE'] else None
        }
    except Exception as exc:
        if connection:
            connection.close()
        print(f"WIP summary query failed: {exc}")
        return None

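The conditional aggregates in the new summary SQL (`SUM(CASE WHEN STATUS = 'HOLD' ...)`) can be checked against an in-memory SQLite table. The schema and rows below are made up; only the aggregation shape mirrors the view query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wip (LOTID TEXT, STATUS TEXT, QTY INTEGER)")
conn.executemany("INSERT INTO wip VALUES (?, ?, ?)", [
    ("L1", "ACTIVE", 100),
    ("L2", "HOLD",   40),
    ("L3", "HOLD",   10),
])

# Same pattern as get_wip_summary: overall totals plus HOLD-only totals
# computed in a single pass with conditional aggregation.
row = conn.execute("""
    SELECT COUNT(*)                                           AS total_lots,
           SUM(QTY)                                           AS total_qty,
           SUM(CASE WHEN STATUS = 'HOLD' THEN 1 ELSE 0 END)   AS hold_lots,
           SUM(CASE WHEN STATUS = 'HOLD' THEN QTY ELSE 0 END) AS hold_qty
    FROM wip
""").fetchone()
# row == (3, 150, 2, 50)
```

One scan of the view yields both KPI pairs, instead of two separate filtered queries.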
def query_wip_by_spec_workcenter(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
|
||||
"""Query current WIP grouped by spec and workcenter.
|
||||
def get_wip_matrix(
|
||||
include_dummy: bool = False,
|
||||
workorder: Optional[str] = None,
|
||||
lotid: Optional[str] = None
|
||||
) -> Optional[Dict[str, Any]]:
|
||||
"""Get workcenter x product line matrix for overview dashboard.
|
||||
|
||||
Args:
|
||||
days_back: Number of days to look back
|
||||
include_dummy: If True, include DUMMY lots (default: False)
|
||||
workorder: Optional WORKORDER filter (fuzzy match)
|
||||
lotid: Optional LOTID filter (fuzzy match)
|
||||
|
||||
Returns:
|
||||
DataFrame with WIP by spec/workcenter or None if query fails.
|
||||
Dict with matrix data:
|
||||
- workcenters: List of workcenter groups (sorted by WORKCENTERSEQUENCE_GROUP)
|
||||
- packages: List of product lines (sorted by total QTY desc)
|
||||
- matrix: Dict of {workcenter: {package: qty}}
|
||||
- workcenter_totals: Dict of {workcenter: total_qty}
|
||||
- package_totals: Dict of {package: total_qty}
|
||||
- grand_total: Overall total
|
||||
"""
|
||||
try:
|
||||
conditions = _build_base_conditions(include_dummy, workorder, lotid)
|
||||
conditions.append("WORKCENTER_GROUP IS NOT NULL")
|
||||
conditions.append("PRODUCTLINENAME IS NOT NULL")
|
||||
where_clause = f"WHERE {' AND '.join(conditions)}"
|
||||
|
||||
sql = f"""
|
||||
SELECT
|
||||
SPECNAME,
|
||||
WORKCENTERNAME,
|
||||
COUNT(CONTAINERNAME) as LOT_COUNT,
|
||||
SUM(QTY) as TOTAL_QTY,
|
||||
SUM(QTY2) as TOTAL_QTY2
|
||||
FROM ({get_current_wip_subquery(days_back)}) wip
|
||||
WHERE SPECNAME IS NOT NULL
|
||||
AND WORKCENTERNAME IS NOT NULL
|
||||
GROUP BY SPECNAME, WORKCENTERNAME
|
||||
ORDER BY TOTAL_QTY DESC
|
||||
"""
|
||||
return read_sql_df(sql)
|
||||
except Exception as exc:
|
||||
print(f"WIP by spec/workcenter query failed: {exc}")
|
||||
return None
|
||||
|
||||
|
||||
def query_wip_by_product_line(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
|
||||
"""Query current WIP grouped by product line.
|
||||
|
||||
Args:
|
||||
days_back: Number of days to look back
|
||||
|
||||
Returns:
|
||||
DataFrame with WIP by product line or None if query fails.
|
||||
"""
|
||||
try:
|
||||
sql = f"""
|
||||
SELECT
|
||||
PRODUCTLINENAME_LEF,
|
||||
SPECNAME,
|
||||
WORKCENTERNAME,
|
||||
COUNT(CONTAINERNAME) as LOT_COUNT,
|
||||
SUM(QTY) as TOTAL_QTY,
|
||||
SUM(QTY2) as TOTAL_QTY2
|
||||
FROM ({get_current_wip_subquery(days_back)}) wip
|
||||
WHERE PRODUCTLINENAME_LEF IS NOT NULL
|
||||
GROUP BY PRODUCTLINENAME_LEF, SPECNAME, WORKCENTERNAME
|
||||
ORDER BY TOTAL_QTY DESC
|
||||
"""
|
||||
return read_sql_df(sql)
|
||||
except Exception as exc:
|
||||
print(f"WIP by product line query failed: {exc}")
|
||||
return None
|
||||
|
||||
|
||||
def query_wip_by_status(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[pd.DataFrame]:
|
||||
"""Query current WIP grouped by status.
|
||||
|
||||
Args:
|
||||
days_back: Number of days to look back
|
||||
|
||||
Returns:
|
||||
DataFrame with WIP by status or None if query fails.
|
||||
"""
|
||||
try:
|
||||
sql = f"""
|
||||
SELECT
|
||||
STATUS,
|
||||
COUNT(CONTAINERNAME) as LOT_COUNT,
|
||||
SUM(QTY) as TOTAL_QTY
|
||||
FROM ({get_current_wip_subquery(days_back)}) wip
|
||||
GROUP BY STATUS
|
||||
ORDER BY LOT_COUNT DESC
|
||||
"""
|
||||
return read_sql_df(sql)
|
||||
except Exception as exc:
|
||||
print(f"WIP by status query failed: {exc}")
|
||||
return None
|
||||
|
||||
|
||||
def query_wip_by_mfgorder(days_back: int = DEFAULT_WIP_DAYS_BACK, top_n: int = 100) -> Optional[pd.DataFrame]:
|
||||
"""Query current WIP grouped by manufacturing order (top N).
|
||||
|
||||
Args:
|
||||
days_back: Number of days to look back
|
||||
top_n: Number of top orders to return
|
||||
|
||||
Returns:
|
||||
DataFrame with WIP by MFG order or None if query fails.
|
||||
"""
|
||||
try:
|
||||
sql = f"""
|
||||
SELECT * FROM (
|
||||
SELECT
|
||||
MFGORDERNAME,
|
||||
COUNT(CONTAINERNAME) as LOT_COUNT,
|
||||
SUM(QTY) as TOTAL_QTY,
|
||||
SUM(QTY2) as TOTAL_QTY2
|
||||
FROM ({get_current_wip_subquery(days_back)}) wip
|
||||
WHERE MFGORDERNAME IS NOT NULL
|
||||
GROUP BY MFGORDERNAME
|
||||
ORDER BY TOTAL_QTY DESC
|
||||
) WHERE ROWNUM <= {top_n}
|
||||
"""
|
||||
return read_sql_df(sql)
|
||||
except Exception as exc:
|
||||
print(f"WIP by MFG order query failed: {exc}")
|
||||
return None
|
||||
|
||||
|
||||
# ============================================================
|
||||
# WIP Distribution Table Functions
|
||||
# ============================================================
|
||||
|
||||
def query_wip_distribution_filter_options(days_back: int = DEFAULT_WIP_DAYS_BACK) -> Optional[Dict]:
    """Get filter options for WIP distribution table.

    Returns available values for packages, types, areas, and lot statuses.

    Args:
        days_back: Number of days to look back

    Returns:
        Dict with filter options or None if query fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        sql = f"""
            SELECT
                PRODUCTLINENAME_LEF,
                PJ_TYPE,
                PJ_PRODUCEREGION,
                HOLDREASONNAME
            FROM ({base_sql}) wip
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return {'packages': [], 'types': [], 'areas': [], 'lot_statuses': []}

        # Extract unique values and sort
        packages = sorted([x for x in df['PRODUCTLINENAME_LEF'].dropna().unique().tolist() if x])
        types = sorted([x for x in df['PJ_TYPE'].dropna().unique().tolist() if x])
        areas = sorted([x for x in df['PJ_PRODUCEREGION'].dropna().unique().tolist() if x])

        # Lot status: based on HOLDREASONNAME - has value=Hold, no value=Active
        lot_statuses = ['Active', 'Hold']

        return {
            'packages': packages,
            'types': types,
            'areas': areas,
            'lot_statuses': lot_statuses
        }
    except Exception as exc:
        print(f"WIP filter options query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


def get_wip_matrix(include_dummy: bool = False) -> Optional[Dict[str, Any]]:
    # NOTE: name and signature are assumed; the original header is not shown
    # in this diff. The body below is as committed.
    """Build the Workcenter x Package WIP matrix from the real-time view."""
    try:
        conditions = _build_base_conditions(include_dummy)
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT
                WORKCENTER_GROUP,
                WORKCENTERSEQUENCE_GROUP,
                PRODUCTLINENAME,
                SUM(QTY) as QTY
            FROM {WIP_VIEW}
            {where_clause}
            GROUP BY WORKCENTER_GROUP, WORKCENTERSEQUENCE_GROUP, PRODUCTLINENAME
            ORDER BY WORKCENTERSEQUENCE_GROUP, PRODUCTLINENAME
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return {
                'workcenters': [],
                'packages': [],
                'matrix': {},
                'workcenter_totals': {},
                'package_totals': {},
                'grand_total': 0
            }

        # Build matrix
        matrix = {}
        workcenter_totals = {}
        package_totals = {}

        # Get unique workcenters sorted by sequence
        wc_order = df.drop_duplicates('WORKCENTER_GROUP')[['WORKCENTER_GROUP', 'WORKCENTERSEQUENCE_GROUP']]
        wc_order = wc_order.sort_values('WORKCENTERSEQUENCE_GROUP')
        workcenters = wc_order['WORKCENTER_GROUP'].tolist()

        # Build matrix and totals
        for _, row in df.iterrows():
            wc = row['WORKCENTER_GROUP']
            pkg = row['PRODUCTLINENAME']
            qty = int(row['QTY'] or 0)

            if wc not in matrix:
                matrix[wc] = {}
            matrix[wc][pkg] = qty

            workcenter_totals[wc] = workcenter_totals.get(wc, 0) + qty
            package_totals[pkg] = package_totals.get(pkg, 0) + qty

        # Sort packages by total qty desc
        packages = sorted(package_totals.keys(), key=lambda x: package_totals[x], reverse=True)

        grand_total = sum(workcenter_totals.values())

        return {
            'workcenters': workcenters,
            'packages': packages,
            'matrix': matrix,
            'workcenter_totals': workcenter_totals,
            'package_totals': package_totals,
            'grand_total': grand_total
        }
    except Exception as exc:
        print(f"WIP matrix query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None

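The matrix-assembly loop above is plain dictionary bookkeeping, so it can be exercised without a database. A minimal sketch of the same pivot step (the sample workcenter/package rows are invented for illustration):

```python
# Sketch of the Workcenter x Package pivot used by the WIP matrix code above.
# Input rows mimic the (WORKCENTER_GROUP, PRODUCTLINENAME, QTY) query result.

def build_matrix(rows):
    matrix, wc_totals, pkg_totals = {}, {}, {}
    for wc, pkg, qty in rows:
        matrix.setdefault(wc, {})[pkg] = qty            # one cell per (wc, pkg)
        wc_totals[wc] = wc_totals.get(wc, 0) + qty      # row totals
        pkg_totals[pkg] = pkg_totals.get(pkg, 0) + qty  # column totals
    # Packages ordered by total qty, descending, as in the original loop
    packages = sorted(pkg_totals, key=pkg_totals.get, reverse=True)
    return {
        'matrix': matrix,
        'workcenter_totals': wc_totals,
        'package_totals': pkg_totals,
        'packages': packages,
        'grand_total': sum(wc_totals.values()),
    }

result = build_matrix([('DB', 'QFN', 100), ('DB', 'BGA', 50), ('WB', 'QFN', 30)])
```

Keeping this step pure (no SQL, no pandas) is what lets the unit tests added in this commit cover the pivot logic directly.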
def _build_wip_distribution_where_clause(filters: Optional[Dict]) -> str:
    """Build WHERE clause for WIP distribution queries.

    Args:
        filters: Dict with filter values

    Returns:
        SQL WHERE clause conditions string.
    """
    where_conditions = []

    if filters:
        if filters.get('packages') and len(filters['packages']) > 0:
            pkg_list = "', '".join(filters['packages'])
            where_conditions.append(f"PRODUCTLINENAME_LEF IN ('{pkg_list}')")

        if filters.get('types') and len(filters['types']) > 0:
            type_list = "', '".join(filters['types'])
            where_conditions.append(f"PJ_TYPE IN ('{type_list}')")

        if filters.get('areas') and len(filters['areas']) > 0:
            area_list = "', '".join(filters['areas'])
            where_conditions.append(f"PJ_PRODUCEREGION IN ('{area_list}')")

        # Lot status filter: Active = HOLDREASONNAME IS NULL, Hold = IS NOT NULL
        if filters.get('lot_statuses') and len(filters['lot_statuses']) > 0:
            status_conds = []
            if 'Active' in filters['lot_statuses']:
                status_conds.append("HOLDREASONNAME IS NULL")
            if 'Hold' in filters['lot_statuses']:
                status_conds.append("HOLDREASONNAME IS NOT NULL")
            if status_conds:
                where_conditions.append(f"({' OR '.join(status_conds)})")

        if filters.get('search'):
            search_term = filters['search'].replace("'", "''")
            where_conditions.append(
                f"(UPPER(MFGORDERNAME) LIKE UPPER('%{search_term}%') "
                f"OR UPPER(CONTAINERNAME) LIKE UPPER('%{search_term}%'))"
            )

    return " AND ".join(where_conditions) if where_conditions else "1=1"


def query_wip_distribution_pivot_columns(
    filters: Optional[Dict] = None,
    days_back: int = DEFAULT_WIP_DAYS_BACK
) -> Optional[List[Dict]]:
    """Get pivot columns for WIP distribution table.

    Returns Workcenter|Spec combinations that have data.

    Args:
        filters: Optional filter values
        days_back: Number of days to look back

    Returns:
        List of pivot column dicts or None if query fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        where_clause = _build_wip_distribution_where_clause(filters)

        sql = f"""
            SELECT
                WORKCENTERNAME,
                SPECNAME as WC_SPEC,
                COUNT(DISTINCT CONTAINERNAME) as LOT_COUNT
            FROM ({base_sql}) wip
            WHERE WORKCENTERNAME IS NOT NULL
              AND {where_clause}
            GROUP BY WORKCENTERNAME, SPECNAME
            ORDER BY LOT_COUNT DESC
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return []

        # Convert to pivot column list with WORKCENTER_GROUPS mapping
        pivot_columns = []
        for _, row in df.iterrows():
            wc = row['WORKCENTERNAME'] or ''
            spec = row['WC_SPEC'] or ''
            group_name, order = get_workcenter_group(wc)
            display_wc = group_name if group_name else wc

            pivot_columns.append({
                'key': f"{wc}|{spec}",
                'workcenter': wc,
                'workcenter_group': display_wc,
                'order': order,
                'spec': spec,
                'count': int(row['LOT_COUNT'] or 0)
            })

        return pivot_columns
    except Exception as exc:
        print(f"WIP pivot columns query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


def get_wip_hold_summary(
    include_dummy: bool = False,
    workorder: Optional[str] = None,
    lotid: Optional[str] = None
) -> Optional[Dict[str, Any]]:
    """Get hold summary grouped by hold reason.

    Args:
        include_dummy: If True, include DUMMY lots (default: False)
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)

    Returns:
        Dict with hold items sorted by lots desc:
        - items: List of {reason, lots, qty}
    """
    try:
        conditions = _build_base_conditions(include_dummy, workorder, lotid)
        conditions.append("STATUS = 'HOLD'")
        conditions.append("HOLDREASONNAME IS NOT NULL")
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT
                HOLDREASONNAME as REASON,
                COUNT(*) as LOTS,
                SUM(QTY) as QTY
            FROM {WIP_VIEW}
            {where_clause}
            GROUP BY HOLDREASONNAME
            ORDER BY COUNT(*) DESC
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return {'items': []}

        items = []
        for _, row in df.iterrows():
            items.append({
                'reason': row['REASON'],
                'lots': int(row['LOTS'] or 0),
                'qty': int(row['QTY'] or 0)
            })

        return {'items': items}
    except Exception as exc:
        print(f"WIP hold summary query failed: {exc}")
        return None

def query_wip_distribution(
    filters: Optional[Dict] = None,
    limit: int = 500,
    offset: int = 0,
    days_back: int = DEFAULT_WIP_DAYS_BACK
) -> Optional[Dict]:
    """Query WIP distribution table main data.

    Returns lot details with their Workcenter|Spec positions.

    Args:
        filters: Optional filter values
        limit: Maximum rows to return
        offset: Offset for pagination
        days_back: Number of days to look back

    Returns:
        Dict with 'rows', 'total_count', 'offset', 'limit' or None if fails.
    """
    try:
        base_sql = get_current_wip_subquery(days_back)
        where_clause = _build_wip_distribution_where_clause(filters)

        # Get total count first
        count_sql = f"""
            SELECT COUNT(DISTINCT CONTAINERNAME) as TOTAL_COUNT
            FROM ({base_sql}) wip
            WHERE {where_clause}
        """
        count_df = read_sql_df(count_sql)
        total_count = int(count_df['TOTAL_COUNT'].iloc[0]) if len(count_df) > 0 else 0

        # Paginated main data query
        start_row = offset + 1
        end_row = offset + limit
        sql = f"""
            SELECT * FROM (
                SELECT
                    MFGORDERNAME,
                    CONTAINERNAME,
                    SPECNAME,
                    PRODUCTLINENAME_LEF,
                    WAFERLOT,
                    PJ_TYPE,
                    PJ_PRODUCEREGION,
                    EQUIPMENTS,
                    WORKCENTERNAME,
                    HOLDREASONNAME,
                    QTY,
                    QTY2,
                    TXNDATE,
                    ROW_NUMBER() OVER (ORDER BY TXNDATE DESC, MFGORDERNAME, CONTAINERNAME) as rn
                FROM ({base_sql}) wip
                WHERE {where_clause}
            ) WHERE rn BETWEEN {start_row} AND {end_row}
        """
        df = read_sql_df(sql)

        # Convert to response format
        rows = []
        for _, row in df.iterrows():
            wc = row['WORKCENTERNAME'] or ''
            spec = row['SPECNAME'] or ''
            pivot_key = f"{wc}|{spec}"

            # Lot status: HOLDREASONNAME has value = Hold, no value = Active
            hold_reason = row['HOLDREASONNAME']
            lot_status = 'Hold' if (pd.notna(hold_reason) and hold_reason) else 'Active'

            rows.append({
                'MFGORDERNAME': row['MFGORDERNAME'],
                'CONTAINERNAME': row['CONTAINERNAME'],
                'SPECNAME': row['SPECNAME'],
                'PRODUCTLINENAME_LEF': row['PRODUCTLINENAME_LEF'],
                'WAFERLOT': row['WAFERLOT'],
                'PJ_TYPE': row['PJ_TYPE'],
                'PJ_PRODUCEREGION': row['PJ_PRODUCEREGION'],
                'EQUIPMENTS': row['EQUIPMENTS'],
                'WORKCENTERNAME': row['WORKCENTERNAME'],
                'LOT_STATUS': lot_status,
                'HOLDREASONNAME': hold_reason if pd.notna(hold_reason) else None,
                'QTY': int(row['QTY']) if pd.notna(row['QTY']) else 0,
                'QTY2': int(row['QTY2']) if pd.notna(row['QTY2']) else 0,
                'pivot_key': pivot_key
            })

        return {
            'rows': rows,
            'total_count': total_count,
            'offset': offset,
            'limit': limit
        }
    except Exception as exc:
        print(f"WIP distribution query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None


# ============================================================
# Detail API Functions
# ============================================================


def get_wip_detail(
    workcenter: str,
    package: Optional[str] = None,
    status: Optional[str] = None,
    workorder: Optional[str] = None,
    lotid: Optional[str] = None,
    include_dummy: bool = False,
    page: int = 1,
    page_size: int = 100
) -> Optional[Dict[str, Any]]:
    """Get WIP detail for a specific workcenter group.

    Args:
        workcenter: WORKCENTER_GROUP name
        package: Optional PRODUCTLINENAME filter
        status: Optional STATUS filter ('ACTIVE', 'HOLD')
        workorder: Optional WORKORDER filter (fuzzy match)
        lotid: Optional LOTID filter (fuzzy match)
        include_dummy: If True, include DUMMY lots (default: False)
        page: Page number (1-based)
        page_size: Number of records per page

    Returns:
        Dict with:
        - workcenter: The workcenter group name
        - summary: {total_lots, on_equipment_lots, waiting_lots, hold_lots}
        - specs: List of spec names (sorted by SPECSEQUENCE)
        - lots: List of lot details
        - pagination: {page, page_size, total_count, total_pages}
        - sys_date: Data timestamp
    """
    try:
        # Build WHERE conditions
        conditions = _build_base_conditions(include_dummy, workorder, lotid)
        conditions.append(f"WORKCENTER_GROUP = '{_escape_sql(workcenter)}'")

        if package:
            conditions.append(f"PRODUCTLINENAME = '{_escape_sql(package)}'")

        if status:
            conditions.append(f"STATUS = '{_escape_sql(status)}'")

        where_clause = f"WHERE {' AND '.join(conditions)}"

        # Get summary
        summary_sql = f"""
            SELECT
                COUNT(*) as TOTAL_LOTS,
                SUM(CASE WHEN EQUIPMENTNAME IS NOT NULL THEN 1 ELSE 0 END) as ON_EQUIPMENT_LOTS,
                SUM(CASE WHEN EQUIPMENTNAME IS NULL THEN 1 ELSE 0 END) as WAITING_LOTS,
                SUM(CASE WHEN STATUS = 'HOLD' THEN 1 ELSE 0 END) as HOLD_LOTS,
                MAX(SYS_DATE) as SYS_DATE
            FROM {WIP_VIEW}
            {where_clause}
        """
        summary_df = read_sql_df(summary_sql)

        if summary_df is None or summary_df.empty:
            return None

        summary_row = summary_df.iloc[0]
        total_count = int(summary_row['TOTAL_LOTS'] or 0)
        sys_date = str(summary_row['SYS_DATE']) if summary_row['SYS_DATE'] else None

        summary = {
            'total_lots': total_count,
            'on_equipment_lots': int(summary_row['ON_EQUIPMENT_LOTS'] or 0),
            'waiting_lots': int(summary_row['WAITING_LOTS'] or 0),
            'hold_lots': int(summary_row['HOLD_LOTS'] or 0)
        }

        # Get unique specs for this workcenter (sorted by SPECSEQUENCE)
        specs_sql = f"""
            SELECT DISTINCT SPECNAME, SPECSEQUENCE
            FROM {WIP_VIEW}
            {where_clause}
              AND SPECNAME IS NOT NULL
            ORDER BY SPECSEQUENCE
        """
        specs_df = read_sql_df(specs_sql)
        specs = specs_df['SPECNAME'].tolist() if specs_df is not None and not specs_df.empty else []

        # Get paginated lot details
        offset = (page - 1) * page_size
        lots_sql = f"""
            SELECT * FROM (
                SELECT
                    LOTID,
                    EQUIPMENTNAME,
                    STATUS,
                    HOLDREASONNAME,
                    QTY,
                    PRODUCTLINENAME,
                    SPECNAME,
                    ROW_NUMBER() OVER (ORDER BY LOTID) as RN
                FROM {WIP_VIEW}
                {where_clause}
            )
            WHERE RN > {offset} AND RN <= {offset + page_size}
            ORDER BY RN
        """
        lots_df = read_sql_df(lots_sql)

        lots = []
        if lots_df is not None and not lots_df.empty:
            for _, row in lots_df.iterrows():
                lots.append({
                    'lot_id': _safe_value(row['LOTID']),
                    'equipment': _safe_value(row['EQUIPMENTNAME']),
                    'status': _safe_value(row['STATUS']),
                    'hold_reason': _safe_value(row['HOLDREASONNAME']),
                    'qty': int(row['QTY'] or 0),
                    'package': _safe_value(row['PRODUCTLINENAME']),
                    'spec': _safe_value(row['SPECNAME'])
                })

        total_pages = (total_count + page_size - 1) // page_size if total_count > 0 else 1

        return {
            'workcenter': workcenter,
            'summary': summary,
            'specs': specs,
            'lots': lots,
            'pagination': {
                'page': page,
                'page_size': page_size,
                'total_count': total_count,
                'total_pages': total_pages
            },
            'sys_date': sys_date
        }
    except Exception as exc:
        print(f"WIP detail query failed: {exc}")
        import traceback
        traceback.print_exc()
        return None
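The ROW_NUMBER() pagination in get_wip_detail relies on the predicate `RN > offset AND RN <= offset + page_size`; the off-by-one-prone arithmetic can be isolated and tested on its own. A small sketch (helper names here are illustrative, not part of the module):

```python
# Page arithmetic behind the "WHERE RN > {offset} AND RN <= {offset + page_size}"
# predicate and the total_pages computation used in get_wip_detail.

def page_window(page: int, page_size: int):
    """Map a 1-based page to (low_exclusive, high_inclusive) ROW_NUMBER bounds."""
    offset = (page - 1) * page_size
    return offset, offset + page_size

def total_pages(total_count: int, page_size: int) -> int:
    """Ceiling division, reporting at least one page even for an empty result."""
    return (total_count + page_size - 1) // page_size if total_count > 0 else 1
```

Because ROW_NUMBER() is 1-based, page 1 maps to the half-open window (0, 100], so no row is ever skipped or duplicated between adjacent pages.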
||||
# ============================================================
# Meta API Functions
# ============================================================

def get_workcenters(include_dummy: bool = False) -> Optional[List[Dict[str, Any]]]:
    """Get list of workcenter groups with lot counts.

    Args:
        include_dummy: If True, include DUMMY lots (default: False)

    Returns:
        List of {name, lot_count} sorted by WORKCENTERSEQUENCE_GROUP
    """
    try:
        conditions = _build_base_conditions(include_dummy)
        conditions.append("WORKCENTER_GROUP IS NOT NULL")
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT
                WORKCENTER_GROUP,
                WORKCENTERSEQUENCE_GROUP,
                COUNT(*) as LOT_COUNT
            FROM {WIP_VIEW}
            {where_clause}
            GROUP BY WORKCENTER_GROUP, WORKCENTERSEQUENCE_GROUP
            ORDER BY WORKCENTERSEQUENCE_GROUP
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return []

        result = []
        for _, row in df.iterrows():
            result.append({
                'name': row['WORKCENTER_GROUP'],
                'lot_count': int(row['LOT_COUNT'] or 0)
            })

        return result
    except Exception as exc:
        print(f"Workcenters query failed: {exc}")
        return None

def get_packages(include_dummy: bool = False) -> Optional[List[Dict[str, Any]]]:
    """Get list of packages (product lines) with lot counts.

    Args:
        include_dummy: If True, include DUMMY lots (default: False)

    Returns:
        List of {name, lot_count} sorted by lot_count desc
    """
    try:
        conditions = _build_base_conditions(include_dummy)
        conditions.append("PRODUCTLINENAME IS NOT NULL")
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT
                PRODUCTLINENAME,
                COUNT(*) as LOT_COUNT
            FROM {WIP_VIEW}
            {where_clause}
            GROUP BY PRODUCTLINENAME
            ORDER BY COUNT(*) DESC
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return []

        result = []
        for _, row in df.iterrows():
            result.append({
                'name': row['PRODUCTLINENAME'],
                'lot_count': int(row['LOT_COUNT'] or 0)
            })

        return result
    except Exception as exc:
        print(f"Packages query failed: {exc}")
        return None

# ============================================================
# Search API Functions
# ============================================================

def search_workorders(
    q: str,
    limit: int = 20,
    include_dummy: bool = False
) -> Optional[List[str]]:
    """Search for WORKORDER values matching the query.

    Args:
        q: Search query (minimum 2 characters)
        limit: Maximum number of results (default: 20, max: 50)
        include_dummy: If True, include DUMMY lots (default: False)

    Returns:
        List of matching WORKORDER values (distinct)
    """
    try:
        # Validate input
        if not q or len(q) < 2:
            return []

        limit = min(limit, 50)  # Cap at 50

        conditions = _build_base_conditions(include_dummy)
        conditions.append(f"WORKORDER LIKE '%{_escape_sql(q)}%'")
        conditions.append("WORKORDER IS NOT NULL")
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT DISTINCT WORKORDER
            FROM {WIP_VIEW}
            {where_clause}
            ORDER BY WORKORDER
            FETCH FIRST {limit} ROWS ONLY
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return []

        return df['WORKORDER'].tolist()
    except Exception as exc:
        print(f"Search workorders failed: {exc}")
        return None

def search_lot_ids(
    q: str,
    limit: int = 20,
    include_dummy: bool = False
) -> Optional[List[str]]:
    """Search for LOTID values matching the query.

    Args:
        q: Search query (minimum 2 characters)
        limit: Maximum number of results (default: 20, max: 50)
        include_dummy: If True, include DUMMY lots (default: False)

    Returns:
        List of matching LOTID values
    """
    try:
        # Validate input
        if not q or len(q) < 2:
            return []

        limit = min(limit, 50)  # Cap at 50

        conditions = _build_base_conditions(include_dummy)
        conditions.append(f"LOTID LIKE '%{_escape_sql(q)}%'")
        where_clause = f"WHERE {' AND '.join(conditions)}"

        sql = f"""
            SELECT LOTID
            FROM {WIP_VIEW}
            {where_clause}
            ORDER BY LOTID
            FETCH FIRST {limit} ROWS ONLY
        """
        df = read_sql_df(sql)

        if df is None or df.empty:
            return []

        return df['LOTID'].tolist()
    except Exception as exc:
        print(f"Search lot IDs failed: {exc}")
        return None

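One caveat with the search functions above: `_escape_sql` presumably only guards quoting, so `%` and `_` typed by the user still behave as LIKE wildcards inside the pattern. If literal matching were wanted, the pattern could be escaped and paired with an `ESCAPE` clause (standard Oracle SQL); a hedged sketch, with the helper names invented here:

```python
# Hypothetical helper: escape LIKE wildcards so user input matches literally.
# The resulting pattern would be used with an escape clause, e.g.:
#   WORKORDER LIKE '%ABC\_01%' ESCAPE '\'

def escape_like(term: str) -> str:
    """Backslash-escape LIKE wildcards (and the escape character itself)."""
    return (term.replace('\\', '\\\\')
                .replace('%', '\\%')
                .replace('_', '\\_'))

def like_pattern(term: str) -> str:
    """Wrap an escaped term for a contains-style fuzzy match."""
    return f"%{escape_like(term)}%"
```

Whether wildcard passthrough is a bug or a feature depends on the autocomplete UX; if users are expected to type raw fragments of WORKORDER/LOTID values, escaping is the safer default.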
src/mes_dashboard/static/js/echarts.min.js (vendored, Normal file, 45) — file diff suppressed because one or more lines are too long
@@ -92,20 +92,20 @@
     <div class="shell">
       <div class="header">
         <h1>MES 報表入口</h1>
-        <p>統一入口:WIP 報表、機台狀態報表與數據表查詢工具</p>
+        <p>統一入口:WIP 即時看板、機台狀態報表與數據表查詢工具</p>
       </div>

       <div class="tabs">
-        <button class="tab active" data-target="wipFrame">WIP 在制品報表</button>
-        <button class="tab" data-target="wipOverviewFrame">WIP 即時概況</button>
+        <button class="tab active" data-target="wipOverviewFrame">WIP 即時概況</button>
+        <button class="tab" data-target="wipDetailFrame">WIP 工站明細</button>
         <button class="tab" data-target="resourceFrame">機台狀態報表</button>
         <button class="tab" data-target="tableFrame">數據表查詢工具</button>
         <button class="tab" data-target="excelQueryFrame">Excel 批次查詢</button>
       </div>

       <div class="panel">
-        <iframe id="wipFrame" class="active" src="/wip" title="WIP 在制品報表"></iframe>
-        <iframe id="wipOverviewFrame" src="/wip-overview" title="WIP 即時概況"></iframe>
+        <iframe id="wipOverviewFrame" class="active" src="/wip-overview" title="WIP 即時概況"></iframe>
+        <iframe id="wipDetailFrame" src="/wip-detail" title="WIP 工站明細"></iframe>
         <iframe id="resourceFrame" src="/resource" title="機台狀態報表"></iframe>
         <iframe id="tableFrame" src="/tables" title="數據表查詢工具"></iframe>
         <iframe id="excelQueryFrame" src="/excel-query" title="Excel 批次查詢"></iframe>

@@ -4,7 +4,7 @@
     <meta charset="UTF-8">
     <meta name="viewport" content="width=device-width, initial-scale=1.0">
     <title>全廠機況 Dashboard</title>
-    <script src="https://cdn.jsdelivr.net/npm/echarts@5.4.3/dist/echarts.min.js"></script>
+    <script src="/static/js/echarts.min.js"></script>

     <style>
         :root {

src/mes_dashboard/templates/wip_detail.html (Normal file, 1108) — file diff suppressed because it is too large
@@ -1,837 +0,0 @@
<!DOCTYPE html>
<html lang="zh-TW">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>WIP 即時分布表</title>
    <style>
        :root {
            --bg: #f5f7fa;
            --card-bg: #ffffff;
            --text: #222;
            --muted: #666;
            --border: #e2e6ef;
            --primary: #667eea;
            --primary-dark: #5568d3;
            --shadow: 0 2px 10px rgba(0,0,0,0.08);
            --shadow-strong: 0 4px 15px rgba(102, 126, 234, 0.2);
            --success: #22c55e;
            --danger: #ef4444;
            --warning: #f59e0b;
        }

        * {
            margin: 0;
            padding: 0;
            box-sizing: border-box;
        }

        body {
            font-family: 'Microsoft JhengHei', Arial, sans-serif;
            background: var(--bg);
            color: var(--text);
            min-height: 100vh;
        }

        .dashboard {
            max-width: 1900px;
            margin: 0 auto;
            padding: 20px;
        }

        /* Header */
        .header {
            display: flex;
            justify-content: space-between;
            align-items: center;
            flex-wrap: wrap;
            gap: 12px;
            padding: 18px 22px;
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            border-radius: 10px;
            margin-bottom: 16px;
            box-shadow: var(--shadow-strong);
        }

        .header h1 {
            font-size: 24px;
            color: #fff;
        }

        .header-right {
            display: flex;
            align-items: center;
            gap: 16px;
            flex-wrap: wrap;
        }

        .last-update {
            color: rgba(255, 255, 255, 0.8);
            font-size: 13px;
        }

        /* Filters */
        .filters {
            background: var(--card-bg);
            padding: 16px 20px;
            border-radius: 10px;
            margin-bottom: 16px;
            box-shadow: var(--shadow);
            display: flex;
            gap: 16px;
            align-items: flex-end;
            flex-wrap: wrap;
        }

        .filter-group {
            display: flex;
            flex-direction: column;
            gap: 6px;
        }

        .filter-group label {
            font-size: 12px;
            font-weight: 600;
            color: var(--muted);
        }

        .filter-group select,
        .filter-group input {
            padding: 8px 12px;
            border: 1px solid var(--border);
            border-radius: 6px;
            font-size: 13px;
            min-width: 150px;
        }

        .filter-group select:focus,
        .filter-group input:focus {
            outline: none;
            border-color: var(--primary);
            box-shadow: 0 0 0 2px rgba(102, 126, 234, 0.2);
        }

        .btn {
            padding: 9px 20px;
            border: none;
            border-radius: 8px;
            font-size: 13px;
            font-weight: 600;
            cursor: pointer;
            transition: all 0.2s ease;
        }

        .btn-primary {
            background: var(--primary);
            color: white;
        }

        .btn-primary:hover {
            background: var(--primary-dark);
        }

        .btn-primary:disabled {
            background: #a6b0f5;
            cursor: not-allowed;
        }

        .btn-secondary {
            background: #6c757d;
            color: white;
        }

        .btn-secondary:hover {
            background: #5a6268;
        }

        /* Summary Cards */
        .summary-row {
            display: grid;
            grid-template-columns: repeat(4, 1fr);
            gap: 14px;
            margin-bottom: 16px;
        }

        .summary-card {
            background: var(--card-bg);
            border-radius: 10px;
            padding: 16px 20px;
            text-align: center;
            border: 1px solid var(--border);
            box-shadow: var(--shadow);
        }

        .summary-label {
            font-size: 12px;
            color: var(--muted);
            margin-bottom: 6px;
        }

        .summary-value {
            font-size: 28px;
            font-weight: bold;
            color: var(--primary);
        }

        /* Table Section */
        .table-section {
            background: var(--card-bg);
            border-radius: 10px;
            box-shadow: var(--shadow);
            overflow: hidden;
        }

        .table-header {
            display: flex;
            justify-content: space-between;
            align-items: center;
            padding: 14px 20px;
            border-bottom: 1px solid var(--border);
            background: #fafbfc;
        }

        .table-title {
            font-size: 16px;
            font-weight: 600;
            color: var(--text);
        }

        .table-info {
            font-size: 13px;
            color: var(--muted);
        }

        .table-container {
            overflow-x: auto;
            overflow-y: auto;
            max-height: 600px;
        }

        table {
            width: 100%;
            border-collapse: collapse;
            font-size: 13px;
        }

        thead {
            position: sticky;
            top: 0;
            z-index: 10;
        }

        th {
            background: #f8f9fa;
            padding: 10px 12px;
            text-align: left;
            border-bottom: 2px solid var(--border);
            font-weight: 600;
            white-space: nowrap;
        }

        /* Fixed (sticky) columns */
        th.fixed-col,
        td.fixed-col {
            position: sticky;
            background: #fff;
            z-index: 5;
        }

        th.fixed-col {
            background: #f8f9fa;
            z-index: 11;
        }

        th.fixed-col:nth-child(1), td.fixed-col:nth-child(1) { left: 0; min-width: 100px; }
        th.fixed-col:nth-child(2), td.fixed-col:nth-child(2) { left: 100px; min-width: 130px; }
        th.fixed-col:nth-child(3), td.fixed-col:nth-child(3) { left: 230px; min-width: 90px; }
        th.fixed-col:nth-child(4), td.fixed-col:nth-child(4) { left: 320px; min-width: 100px; }
        th.fixed-col:nth-child(5), td.fixed-col:nth-child(5) { left: 420px; min-width: 70px; }
        th.fixed-col:nth-child(6), td.fixed-col:nth-child(6) { left: 490px; min-width: 50px; }
        th.fixed-col:nth-child(7), td.fixed-col:nth-child(7) { left: 540px; min-width: 120px; }
        th.fixed-col:nth-child(8), td.fixed-col:nth-child(8) { left: 660px; min-width: 90px; border-right: 2px solid var(--primary); }

        td {
            padding: 10px 12px;
            border-bottom: 1px solid #f0f0f0;
            white-space: nowrap;
        }

        tbody tr:hover td {
            background: #f8f9fc;
        }

        tbody tr:hover td.fixed-col {
            background: #f0f2ff;
        }

        /* Workcenter group headers */
        th.wc-group {
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            color: white;
            text-align: center;
            font-size: 12px;
        }

        th.spec-col {
            background: #e8ebff;
            text-align: center;
            font-size: 11px;
            max-width: 80px;
            overflow: hidden;
            text-overflow: ellipsis;
        }

        /* Pivot data cells */
        td.pivot-cell {
            text-align: center;
            font-weight: 600;
        }

        td.pivot-cell.has-data {
            background: #d4edda;
            color: #155724;
        }

        td.pivot-cell.no-data {
            color: #ccc;
        }

        /* Pagination */
        .pagination {
            display: flex;
            justify-content: center;
            align-items: center;
            gap: 12px;
            padding: 16px;
            border-top: 1px solid var(--border);
        }

        .pagination button {
            padding: 8px 16px;
            border: 1px solid var(--border);
            background: white;
            border-radius: 6px;
            cursor: pointer;
            font-size: 13px;
        }

        .pagination button:hover:not(:disabled) {
            border-color: var(--primary);
            color: var(--primary);
        }

        .pagination button:disabled {
            opacity: 0.5;
            cursor: not-allowed;
        }

        .pagination .page-info {
            font-size: 13px;
            color: var(--muted);
        }

        /* Loading */
        .loading-overlay {
            position: absolute;
            top: 0;
            left: 0;
            right: 0;
            bottom: 0;
            background: rgba(255, 255, 255, 0.9);
            display: flex;
            align-items: center;
            justify-content: center;
            z-index: 20;
        }

        .loading-spinner {
            display: inline-block;
            width: 24px;
            height: 24px;
            border: 3px solid var(--border);
            border-top-color: var(--primary);
            border-radius: 50%;
            animation: spin 0.8s linear infinite;
            margin-right: 10px;
        }

        @keyframes spin {
            to { transform: rotate(360deg); }
        }

        .placeholder {
            text-align: center;
            padding: 60px 20px;
            color: var(--muted);
        }

        /* Responsive */
        @media (max-width: 1200px) {
            .summary-row {
                grid-template-columns: repeat(2, 1fr);
            }
        }

        @media (max-width: 768px) {
            .filters {
                flex-direction: column;
                align-items: stretch;
            }
            .filter-group select,
            .filter-group input {
                width: 100%;
            }
            .summary-row {
                grid-template-columns: 1fr;
            }
        }
    </style>
</head>
<body>
    <div class="dashboard">
        <!-- Header -->
        <div class="header">
            <h1>WIP 即時分布表</h1>
            <div class="header-right">
                <span id="lastUpdate" class="last-update"></span>
            </div>
        </div>

        <!-- Filters -->
        <div class="filters">
            <div class="filter-group">
                <label>Package (產品線)</label>
                <select id="filterPackage">
                    <option value="">全部</option>
                </select>
            </div>
            <div class="filter-group">
                <label>Type (類型)</label>
                <select id="filterType">
                    <option value="">全部</option>
                </select>
            </div>
            <div class="filter-group">
                <label>Area (廠區)</label>
                <select id="filterArea">
                    <option value="">全部</option>
                </select>
            </div>
            <div class="filter-group">
                <label>Lot Status (狀態)</label>
                <select id="filterLotStatus">
                    <option value="">全部</option>
                </select>
            </div>
            <div class="filter-group">
                <label>搜尋 (GA/Lot ID)</label>
                <input type="text" id="filterSearch" placeholder="輸入關鍵字...">
            </div>
            <button id="btnQuery" class="btn btn-primary" onclick="loadData()">查詢</button>
            <button class="btn btn-secondary" onclick="clearFilters()">清除篩選</button>
        </div>

        <!-- Summary Cards -->
        <div class="summary-row">
            <div class="summary-card">
                <div class="summary-label">總 LOT 數</div>
                <div class="summary-value" id="totalLots">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">總數量 (QTY)</div>
                <div class="summary-value" id="totalQty">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">HOLD LOT 數</div>
                <div class="summary-value" id="holdLots" style="color: var(--danger);">-</div>
            </div>
            <div class="summary-card">
                <div class="summary-label">HOLD 數量</div>
                <div class="summary-value" id="holdQty" style="color: var(--danger);">-</div>
            </div>
        </div>

        <!-- Table Section -->
        <div class="table-section" style="position: relative;">
            <div class="table-header">
                <div class="table-title">WIP 分布明細</div>
                <div class="table-info" id="tableInfo">請點擊「查詢」載入資料</div>
            </div>
            <div class="table-container" id="tableContainer">
                <div class="placeholder">請點擊「查詢」載入資料</div>
            </div>
            <div class="pagination" id="pagination" style="display: none;">
                <button id="btnPrev" onclick="prevPage()">上一頁</button>
                <span class="page-info" id="pageInfo">第 1 頁</span>
                <button id="btnNext" onclick="nextPage()">下一頁</button>
            </div>
            <div class="loading-overlay" id="loadingOverlay" style="display: none;">
                <span class="loading-spinner"></span>
                <span>載入中...</span>
            </div>
        </div>
    </div>

<script>
// State management
const state = {
    filters: { packages: [], types: [], areas: [], search: '' },
    pivotColumns: [],
    groupedPivotColumns: {}, // grouped by Workcenter
    data: [],
    page: 1,
    pageSize: 100,
    totalCount: 0,
    isLoading: false
};

// Workcenter ordering is handled by the backend WORKCENTER_GROUPS; the frontend uses the `order` value returned by the API

function formatNumber(num) {
    if (num === null || num === undefined || num === '-') return '-';
    return num.toLocaleString('zh-TW');
}

// Lot status rules:
// Active = HOLDREASONNAME IS NULL (normal, in production)
// Hold = HOLDREASONNAME IS NOT NULL (suspended/locked)

// Collect the current filter values
function getFilters() {
    const filters = {};
    const pkg = document.getElementById('filterPackage').value;
    const type = document.getElementById('filterType').value;
    const area = document.getElementById('filterArea').value;
    const lotStatus = document.getElementById('filterLotStatus').value;
    const search = document.getElementById('filterSearch').value.trim();

    if (pkg) filters.packages = [pkg];
    if (type) filters.types = [type];
    if (area) filters.areas = [area];
    if (lotStatus) filters.lot_statuses = [lotStatus];
    if (search) filters.search = search;

    return Object.keys(filters).length > 0 ? filters : null;
}

// Load the filter dropdown options
async function loadFilterOptions() {
    try {
        const response = await fetch('/api/wip/distribution/filter_options');
        const result = await response.json();

        if (result.success) {
            populateSelect('filterPackage', result.data.packages);
            populateSelect('filterType', result.data.types);
            populateSelect('filterArea', result.data.areas);
            populateSelect('filterLotStatus', result.data.lot_statuses);
        }
    } catch (error) {
        console.error('篩選選項載入失敗:', error);
    }
}

function populateSelect(elementId, options) {
    const select = document.getElementById(elementId);
    const currentValue = select.value;
    select.innerHTML = '<option value="">全部</option>';
    options.forEach(opt => {
        const option = document.createElement('option');
        option.value = opt;
        option.textContent = opt;
        select.appendChild(option);
    });
    select.value = currentValue;
}

// Load the pivot columns
async function loadPivotColumns() {
    try {
        const response = await fetch('/api/wip/distribution/pivot_columns', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ filters: getFilters() })
        });
        const result = await response.json();

        if (result.success) {
            state.pivotColumns = result.data;
            // groupPivotColumns is called from loadData once the data has been fetched
        }
    } catch (error) {
        console.error('Pivot 欄位載入失敗:', error);
    }
}

// Group pivot columns by workcenter group (using the workcenter_group and order values returned by the backend).
// Only station/spec combinations that actually have data are shown.
function groupPivotColumns() {
    // Collect the pivot_keys actually present in the current data
    const existingKeys = new Set();
    state.data.forEach(row => {
        if (row.pivot_key) {
            existingKeys.add(row.pivot_key);
        }
    });

    // Keep only the pivot columns that have data
    const filteredColumns = state.pivotColumns.filter(col => existingKeys.has(col.key));

    const groups = {};
    filteredColumns.forEach(col => {
        // Use the merged group name computed by the backend
        const groupName = col.workcenter_group || col.workcenter || '(未知)';
        const order = col.order !== undefined ? col.order : 999;

        if (!groups[groupName]) {
            groups[groupName] = { workcenter: groupName, specs: [], order: order };
        }
        groups[groupName].specs.push(col);
    });

    // Sort by workcenter order (smaller order values appear on the left)
    state.groupedPivotColumns = Object.values(groups).sort((a, b) => {
        if (a.order !== b.order) return a.order - b.order;
        return a.workcenter.localeCompare(b.workcenter);
    });
}

// Load the main data
async function loadData() {
    if (state.isLoading) return;
    state.isLoading = true;
    state.page = 1;

    showLoading();

    try {
        // Load the pivot columns, then the main data
        await loadPivotColumns();

        const response = await fetch('/api/wip/distribution', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
                filters: getFilters(),
                limit: state.pageSize,
                offset: 0
            })
        });
        const result = await response.json();

        if (result.success) {
            state.data = result.data.rows;
            state.totalCount = result.data.total_count;

            // Filter and group the pivot columns based on the current data (only stations with data)
            groupPivotColumns();
            renderTable();
            updateSummary();
            updatePagination();

            document.getElementById('lastUpdate').textContent =
                `Last Update: ${new Date().toLocaleString('zh-TW')}`;
        } else {
            showError(result.error || '查詢失敗');
        }
    } catch (error) {
        console.error('數據載入失敗:', error);
        showError('數據載入失敗');
    } finally {
        state.isLoading = false;
        hideLoading();
    }
}

// Render the table
function renderTable() {
    const container = document.getElementById('tableContainer');

    if (state.data.length === 0) {
        container.innerHTML = '<div class="placeholder">查無資料</div>';
        return;
    }

    let html = '<table><thead>';

    // Row 1: workcenter group headers
    html += '<tr>';
    // Eight fixed columns, each spanning both header rows
    html += '<th class="fixed-col" rowspan="2">GA</th>';
    html += '<th class="fixed-col" rowspan="2">Lot ID</th>';
    html += '<th class="fixed-col" rowspan="2">Package</th>';
    html += '<th class="fixed-col" rowspan="2">Wafer Lot</th>';
    html += '<th class="fixed-col" rowspan="2">Type</th>';
    html += '<th class="fixed-col" rowspan="2">Area</th>';
    html += '<th class="fixed-col" rowspan="2">Lot Status</th>';
    html += '<th class="fixed-col" rowspan="2">Equipment</th>';

    // Workcenter group headers
    state.groupedPivotColumns.forEach(group => {
        html += `<th class="wc-group" colspan="${group.specs.length}">${group.workcenter}</th>`;
    });
    html += '</tr>';

    // Row 2: spec sub-headers
    html += '<tr>';
    state.groupedPivotColumns.forEach(group => {
        group.specs.forEach(spec => {
            const displaySpec = spec.spec || '-';
            html += `<th class="spec-col" title="${group.workcenter} | ${displaySpec}">${displaySpec}</th>`;
        });
    });
    html += '</tr></thead><tbody>';

    // Collect the known pivot keys
    const pivotKeySet = new Set();
    state.pivotColumns.forEach(col => pivotKeySet.add(col.key));

    // Data rows
    state.data.forEach(row => {
        html += '<tr>';
        // Fixed columns
        html += `<td class="fixed-col">${row.MFGORDERNAME || '-'}</td>`;
        html += `<td class="fixed-col">${row.CONTAINERNAME || '-'}</td>`;
        html += `<td class="fixed-col">${row.PRODUCTLINENAME_LEF || '-'}</td>`;
        html += `<td class="fixed-col">${row.WAFERLOT || '-'}</td>`;
        html += `<td class="fixed-col">${row.PJ_TYPE || '-'}</td>`;
        html += `<td class="fixed-col">${row.PJ_PRODUCEREGION || '-'}</td>`;
        // Lot Status: Active (green) / Hold (red, with the hold reason)
        const lotStatusClass = row.LOT_STATUS === 'Hold' ? 'style="color: var(--danger); font-weight: bold;"' : 'style="color: var(--success);"';
        const lotStatusText = row.LOT_STATUS === 'Hold' ? `Hold (${row.HOLDREASONNAME || ''})` : 'Active';
        html += `<td class="fixed-col" ${lotStatusClass}>${lotStatusText}</td>`;
        html += `<td class="fixed-col">${row.EQUIPMENTS || '-'}</td>`;

        // Pivot columns
        state.groupedPivotColumns.forEach(group => {
            group.specs.forEach(spec => {
                const isMatch = row.pivot_key === spec.key;
                if (isMatch) {
                    html += `<td class="pivot-cell has-data">✓</td>`;
                } else {
                    html += `<td class="pivot-cell no-data">-</td>`;
                }
            });
        });

        html += '</tr>';
    });

    html += '</tbody></table>';
    container.innerHTML = html;

    document.getElementById('tableInfo').textContent =
        `共 ${formatNumber(state.totalCount)} 筆,顯示第 ${(state.page - 1) * state.pageSize + 1} - ${Math.min(state.page * state.pageSize, state.totalCount)} 筆`;
}

// Update the summary cards
function updateSummary() {
    document.getElementById('totalLots').textContent = formatNumber(state.totalCount);

    // Total QTY
    const totalQty = state.data.reduce((sum, row) => sum + (row.QTY || 0), 0);
    document.getElementById('totalQty').textContent = formatNumber(totalQty);

    // HOLD lot count and HOLD quantity
    const holdRows = state.data.filter(row => row.LOT_STATUS === 'Hold');
    const holdLots = holdRows.length;
    const holdQty = holdRows.reduce((sum, row) => sum + (row.QTY || 0), 0);

    document.getElementById('holdLots').textContent = formatNumber(holdLots);
    document.getElementById('holdQty').textContent = formatNumber(holdQty);
}

// Update the pagination controls
function updatePagination() {
    const pagination = document.getElementById('pagination');
    const totalPages = Math.ceil(state.totalCount / state.pageSize);

    if (totalPages <= 1) {
        pagination.style.display = 'none';
        return;
    }

    pagination.style.display = 'flex';
    document.getElementById('pageInfo').textContent = `第 ${state.page} 頁 / 共 ${totalPages} 頁`;
    document.getElementById('btnPrev').disabled = state.page <= 1;
    document.getElementById('btnNext').disabled = state.page >= totalPages;
}

async function prevPage() {
    if (state.page <= 1) return;
    state.page--;
    await loadPageData();
}

async function nextPage() {
    const totalPages = Math.ceil(state.totalCount / state.pageSize);
    if (state.page >= totalPages) return;
    state.page++;
    await loadPageData();
}

async function loadPageData() {
    showLoading();
    try {
        const response = await fetch('/api/wip/distribution', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
                filters: getFilters(),
                limit: state.pageSize,
                offset: (state.page - 1) * state.pageSize
            })
        });
        const result = await response.json();

        if (result.success) {
            state.data = result.data.rows;
            // Re-filter the pivot columns based on the current page's data
            groupPivotColumns();
            renderTable();
            updatePagination();
        }
    } catch (error) {
        console.error('分頁載入失敗:', error);
    } finally {
        hideLoading();
    }
}

function clearFilters() {
    document.getElementById('filterPackage').value = '';
    document.getElementById('filterType').value = '';
    document.getElementById('filterArea').value = '';
    document.getElementById('filterLotStatus').value = '';
    document.getElementById('filterSearch').value = '';
}

function showLoading() {
    document.getElementById('loadingOverlay').style.display = 'flex';
    document.getElementById('btnQuery').disabled = true;
}

function hideLoading() {
    document.getElementById('loadingOverlay').style.display = 'none';
    document.getElementById('btnQuery').disabled = false;
}

function showError(message) {
    document.getElementById('tableContainer').innerHTML =
        `<div class="placeholder" style="color: var(--danger);">${message}</div>`;
}

// Trigger the query on Enter
document.getElementById('filterSearch').addEventListener('keypress', function(e) {
    if (e.key === 'Enter') {
        loadData();
    }
});

// Page initialization
window.onload = function() {
    loadFilterOptions();
};
</script>
</body>
</html>

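The Active/Hold convention used throughout the dashboard (a lot is Hold exactly when HOLDREASONNAME is non-null) can be sketched as a standalone helper. `classify_lot_status` is a hypothetical name for illustration, not a function in this codebase:

```python
def classify_lot_status(hold_reason):
    """Classify a lot from its HOLDREASONNAME value (hypothetical helper).

    Active = HOLDREASONNAME IS NULL (normal, in production)
    Hold   = HOLDREASONNAME IS NOT NULL (suspended/locked)
    """
    return 'Active' if hold_reason is None else 'Hold'
```

The frontend applies the same rule when coloring the Lot Status column green (Active) or red (Hold, with the hold reason appended).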
tests/conftest.py (new file, 61 lines)
@@ -0,0 +1,61 @@
# -*- coding: utf-8 -*-
"""Pytest configuration and fixtures for MES Dashboard tests."""

import pytest
import sys
import os

# Add the src directory to Python path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', 'src'))

import mes_dashboard.core.database as db
from mes_dashboard.app import create_app


@pytest.fixture
def app():
    """Create application for testing."""
    db._ENGINE = None
    app = create_app('testing')
    app.config['TESTING'] = True
    return app


@pytest.fixture
def client(app):
    """Create test client."""
    return app.test_client()


@pytest.fixture
def runner(app):
    """Create test CLI runner."""
    return app.test_cli_runner()


def pytest_configure(config):
    """Add custom markers."""
    config.addinivalue_line(
        "markers", "integration: mark test as integration test (requires database)"
    )


def pytest_addoption(parser):
    """Add custom command line options."""
    parser.addoption(
        "--run-integration",
        action="store_true",
        default=False,
        help="Run integration tests that require database connection"
    )


def pytest_collection_modifyitems(config, items):
    """Skip integration tests unless --run-integration is specified."""
    if config.getoption("--run-integration"):
        return

    skip_integration = pytest.mark.skip(reason="need --run-integration option to run")
    for item in items:
        if "integration" in item.keywords:
            item.add_marker(skip_integration)
@@ -32,11 +32,16 @@ class AppFactoryTests(unittest.TestCase):
        expected = {
            "/",
            "/tables",
            "/wip",
            "/resource",
            "/wip-overview",
            "/wip-detail",
            "/excel-query",
            "/api/wip/summary",
            "/api/wip/overview/summary",
            "/api/wip/overview/matrix",
            "/api/wip/overview/hold",
            "/api/wip/detail/<workcenter>",
            "/api/wip/meta/workcenters",
            "/api/wip/meta/packages",
            "/api/resource/summary",
            "/api/dashboard/kpi",
            "/api/excel-query/upload",
tests/test_wip_routes.py (new file, 333 lines)
@@ -0,0 +1,333 @@
# -*- coding: utf-8 -*-
"""Unit tests for WIP API routes.

Tests the WIP API endpoints in wip_routes.py.
"""

import unittest
from unittest.mock import patch
import json

from mes_dashboard.app import create_app
import mes_dashboard.core.database as db


class TestWipRoutesBase(unittest.TestCase):
    """Base class for WIP routes tests."""

    def setUp(self):
        """Set up test client."""
        db._ENGINE = None
        self.app = create_app('testing')
        self.app.config['TESTING'] = True
        self.client = self.app.test_client()


class TestOverviewSummaryRoute(TestWipRoutesBase):
    """Test GET /api/wip/overview/summary endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_wip_summary')
    def test_returns_success_with_data(self, mock_get_summary):
        """Should return success=True with summary data."""
        mock_get_summary.return_value = {
            'total_lots': 9073,
            'total_qty': 858878718,
            'hold_lots': 120,
            'hold_qty': 8213395,
            'sys_date': '2026-01-26 19:18:29'
        }

        response = self.client.get('/api/wip/overview/summary')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertEqual(data['data']['total_lots'], 9073)
        self.assertEqual(data['data']['hold_lots'], 120)

    @patch('mes_dashboard.routes.wip_routes.get_wip_summary')
    def test_returns_error_on_failure(self, mock_get_summary):
        """Should return success=False and 500 on failure."""
        mock_get_summary.return_value = None

        response = self.client.get('/api/wip/overview/summary')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])
        self.assertIn('error', data)


class TestOverviewMatrixRoute(TestWipRoutesBase):
    """Test GET /api/wip/overview/matrix endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_wip_matrix')
    def test_returns_success_with_matrix(self, mock_get_matrix):
        """Should return success=True with matrix data."""
        mock_get_matrix.return_value = {
            'workcenters': ['切割', '焊接_DB'],
            'packages': ['SOT-23', 'SOD-323'],
            'matrix': {'切割': {'SOT-23': 50000000}},
            'workcenter_totals': {'切割': 50000000},
            'package_totals': {'SOT-23': 50000000},
            'grand_total': 50000000
        }

        response = self.client.get('/api/wip/overview/matrix')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertIn('workcenters', data['data'])
        self.assertIn('packages', data['data'])
        self.assertIn('matrix', data['data'])

    @patch('mes_dashboard.routes.wip_routes.get_wip_matrix')
    def test_returns_error_on_failure(self, mock_get_matrix):
        """Should return success=False and 500 on failure."""
        mock_get_matrix.return_value = None

        response = self.client.get('/api/wip/overview/matrix')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])


class TestOverviewHoldRoute(TestWipRoutesBase):
    """Test GET /api/wip/overview/hold endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_wip_hold_summary')
    def test_returns_success_with_hold_items(self, mock_get_hold):
        """Should return success=True with hold items."""
        mock_get_hold.return_value = {
            'items': [
                {'reason': '特殊需求管控', 'lots': 44, 'qty': 4235060},
                {'reason': 'YieldLimit', 'lots': 21, 'qty': 1084443}
            ]
        }

        response = self.client.get('/api/wip/overview/hold')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertEqual(len(data['data']['items']), 2)

    @patch('mes_dashboard.routes.wip_routes.get_wip_hold_summary')
    def test_returns_error_on_failure(self, mock_get_hold):
        """Should return success=False and 500 on failure."""
        mock_get_hold.return_value = None

        response = self.client.get('/api/wip/overview/hold')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])


class TestDetailRoute(TestWipRoutesBase):
    """Test GET /api/wip/detail/<workcenter> endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_wip_detail')
    def test_returns_success_with_detail(self, mock_get_detail):
        """Should return success=True with detail data."""
        mock_get_detail.return_value = {
            'workcenter': '焊接_DB',
            'summary': {
                'total_lots': 859,
                'on_equipment_lots': 312,
                'waiting_lots': 547,
                'hold_lots': 15
            },
            'specs': ['Spec1', 'Spec2'],
            'lots': [
                {'lot_id': 'GA25102485', 'equipment': 'GSMP-0054',
                 'status': 'ACTIVE', 'hold_reason': None,
                 'qty': 750, 'package': 'SOT-23', 'spec': 'Spec1'}
            ],
            'pagination': {
                'page': 1, 'page_size': 100,
                'total_count': 859, 'total_pages': 9
            },
            'sys_date': '2026-01-26 19:18:29'
        }

        response = self.client.get('/api/wip/detail/焊接_DB')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertEqual(data['data']['workcenter'], '焊接_DB')
        self.assertIn('summary', data['data'])
        self.assertIn('lots', data['data'])
        self.assertIn('pagination', data['data'])

    @patch('mes_dashboard.routes.wip_routes.get_wip_detail')
    def test_passes_query_parameters(self, mock_get_detail):
        """Should pass query parameters to the service function."""
        mock_get_detail.return_value = {
            'workcenter': '焊接_DB',
            'summary': {'total_lots': 100, 'on_equipment_lots': 50,
                        'waiting_lots': 50, 'hold_lots': 0},
            'specs': [],
            'lots': [],
            'pagination': {'page': 2, 'page_size': 50,
                           'total_count': 100, 'total_pages': 2},
            'sys_date': None
        }

        response = self.client.get(
            '/api/wip/detail/焊接_DB?package=SOT-23&status=ACTIVE&page=2&page_size=50'
        )

        mock_get_detail.assert_called_once_with(
            workcenter='焊接_DB',
            package='SOT-23',
            status='ACTIVE',
            workorder=None,
            lotid=None,
            include_dummy=False,
            page=2,
            page_size=50
        )

    @patch('mes_dashboard.routes.wip_routes.get_wip_detail')
    def test_limits_page_size_to_500(self, mock_get_detail):
        """Page size should be capped at 500."""
        mock_get_detail.return_value = {
            'workcenter': '切割',
            'summary': {'total_lots': 0, 'on_equipment_lots': 0,
                        'waiting_lots': 0, 'hold_lots': 0},
            'specs': [],
            'lots': [],
            'pagination': {'page': 1, 'page_size': 500,
                           'total_count': 0, 'total_pages': 1},
            'sys_date': None
        }

        response = self.client.get('/api/wip/detail/切割?page_size=1000')

        # Should be capped at 500
        call_args = mock_get_detail.call_args
        self.assertEqual(call_args.kwargs['page_size'], 500)

    @patch('mes_dashboard.routes.wip_routes.get_wip_detail')
    def test_handles_page_less_than_one(self, mock_get_detail):
        """A page number less than 1 should be clamped to 1."""
        mock_get_detail.return_value = {
            'workcenter': '切割',
            'summary': {'total_lots': 0, 'on_equipment_lots': 0,
                        'waiting_lots': 0, 'hold_lots': 0},
            'specs': [],
            'lots': [],
            'pagination': {'page': 1, 'page_size': 100,
                           'total_count': 0, 'total_pages': 1},
            'sys_date': None
        }

        response = self.client.get('/api/wip/detail/切割?page=0')

        call_args = mock_get_detail.call_args
        self.assertEqual(call_args.kwargs['page'], 1)

    @patch('mes_dashboard.routes.wip_routes.get_wip_detail')
    def test_returns_error_on_failure(self, mock_get_detail):
        """Should return success=False and 500 on failure."""
        mock_get_detail.return_value = None

        response = self.client.get('/api/wip/detail/不存在的工站')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])


class TestMetaWorkcentersRoute(TestWipRoutesBase):
    """Test GET /api/wip/meta/workcenters endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_workcenters')
    def test_returns_success_with_workcenters(self, mock_get_wcs):
        """Should return success=True with a workcenters list."""
        mock_get_wcs.return_value = [
            {'name': '切割', 'lot_count': 1377},
            {'name': '焊接_DB', 'lot_count': 859}
        ]

        response = self.client.get('/api/wip/meta/workcenters')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertEqual(len(data['data']), 2)
        self.assertEqual(data['data'][0]['name'], '切割')

    @patch('mes_dashboard.routes.wip_routes.get_workcenters')
    def test_returns_error_on_failure(self, mock_get_wcs):
        """Should return success=False and 500 on failure."""
        mock_get_wcs.return_value = None

        response = self.client.get('/api/wip/meta/workcenters')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])


class TestMetaPackagesRoute(TestWipRoutesBase):
    """Test GET /api/wip/meta/packages endpoint."""

    @patch('mes_dashboard.routes.wip_routes.get_packages')
    def test_returns_success_with_packages(self, mock_get_pkgs):
        """Should return success=True with a packages list."""
        mock_get_pkgs.return_value = [
            {'name': 'SOT-23', 'lot_count': 2234},
            {'name': 'SOD-323', 'lot_count': 1392}
        ]

        response = self.client.get('/api/wip/meta/packages')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 200)
        self.assertTrue(data['success'])
        self.assertEqual(len(data['data']), 2)
        self.assertEqual(data['data'][0]['name'], 'SOT-23')

    @patch('mes_dashboard.routes.wip_routes.get_packages')
    def test_returns_error_on_failure(self, mock_get_pkgs):
        """Should return success=False and 500 on failure."""
        mock_get_pkgs.return_value = None

        response = self.client.get('/api/wip/meta/packages')
        data = json.loads(response.data)

        self.assertEqual(response.status_code, 500)
        self.assertFalse(data['success'])


class TestPageRoutes(TestWipRoutesBase):
    """Test page routes for WIP dashboards."""

    def test_wip_overview_page_exists(self):
        """GET /wip-overview should return 200."""
        response = self.client.get('/wip-overview')
        self.assertEqual(response.status_code, 200)

    def test_wip_detail_page_exists(self):
        """GET /wip-detail should return 200."""
        response = self.client.get('/wip-detail')
        self.assertEqual(response.status_code, 200)

    def test_wip_detail_page_with_workcenter(self):
        """GET /wip-detail?workcenter=xxx should return 200."""
        response = self.client.get('/wip-detail?workcenter=焊接_DB')
        self.assertEqual(response.status_code, 200)

    def test_old_wip_route_removed(self):
        """GET /wip should return 404 (route removed)."""
        response = self.client.get('/wip')
        self.assertEqual(response.status_code, 404)


if __name__ == "__main__":
    unittest.main()
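The route tests above all assert the same response envelope: `{'success': True, 'data': ...}` with HTTP 200 on success, and `{'success': False, 'error': ...}` with HTTP 500 when the service layer returns None. A minimal sketch of that contract, using a hypothetical `api_envelope` helper rather than the actual route code:

```python
def api_envelope(payload, error_message='query failed'):
    """Wrap a service-layer result in the (body, status) contract the tests assert on."""
    if payload is None:
        # Service failure: success=False plus an error message, HTTP 500
        return {'success': False, 'error': error_message}, 500
    # Normal path: success=True with the payload under 'data', HTTP 200
    return {'success': True, 'data': payload}, 200
```

Keeping the envelope identical across all WIP endpoints is what lets the frontend check `result.success` uniformly before touching `result.data`.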
tests/test_wip_service.py (new file, 874 lines)
@@ -0,0 +1,874 @@
# -*- coding: utf-8 -*-
"""Unit tests for WIP service layer.

Tests the WIP query functions that use the DWH.DW_PJ_LOT_V view.
"""

import unittest
from unittest.mock import patch, MagicMock
import pandas as pd

from mes_dashboard.services.wip_service import (
    WIP_VIEW,
    _escape_sql,
    _build_base_conditions,
    get_wip_summary,
    get_wip_matrix,
    get_wip_hold_summary,
    get_wip_detail,
    get_workcenters,
    get_packages,
    search_workorders,
    search_lot_ids,
)


class TestWipServiceConfig(unittest.TestCase):
    """Test WIP service configuration."""

    def test_wip_view_has_schema_prefix(self):
        """WIP_VIEW should include the DWH schema prefix."""
        self.assertEqual(WIP_VIEW, "DWH.DW_PJ_LOT_V")
        self.assertTrue(WIP_VIEW.startswith("DWH."))


class TestEscapeSql(unittest.TestCase):
    """Test _escape_sql function for SQL injection prevention."""

    def test_escapes_single_quotes(self):
        """Should escape single quotes."""
        self.assertEqual(_escape_sql("O'Brien"), "O''Brien")

    def test_escapes_multiple_quotes(self):
        """Should escape multiple single quotes."""
        self.assertEqual(_escape_sql("It's Bob's"), "It''s Bob''s")

    def test_handles_none(self):
        """Should return None for None input."""
        self.assertIsNone(_escape_sql(None))

    def test_no_change_for_safe_string(self):
        """Should not modify strings without quotes."""
        self.assertEqual(_escape_sql("GA26012345"), "GA26012345")


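An implementation consistent with the `_escape_sql` tests above doubles every single quote (the standard way to embed a quote in a SQL string literal) and passes None through. This is a sketch inferred from the test expectations, not the actual source of `_escape_sql`:

```python
def escape_sql(value):
    """Double single quotes so the value is safe inside a SQL string literal.

    Sketch of the behavior pinned down by the TestEscapeSql cases:
    None passes through unchanged; every ' becomes ''.
    """
    if value is None:
        return None
    return value.replace("'", "''")
```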
class TestBuildBaseConditions(unittest.TestCase):
    """Test _build_base_conditions function."""

    def test_default_excludes_dummy(self):
        """Default behavior should exclude DUMMY lots."""
        conditions = _build_base_conditions()
        self.assertIn("LOTID NOT LIKE '%DUMMY%'", conditions)

    def test_include_dummy_true(self):
        """include_dummy=True should not add the DUMMY exclusion."""
        conditions = _build_base_conditions(include_dummy=True)
        self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", conditions)

    def test_workorder_filter(self):
        """Should add a WORKORDER LIKE condition."""
        conditions = _build_base_conditions(workorder='GA26')
        self.assertTrue(any("WORKORDER LIKE '%GA26%'" in c for c in conditions))

    def test_lotid_filter(self):
        """Should add a LOTID LIKE condition."""
        conditions = _build_base_conditions(lotid='12345')
        self.assertTrue(any("LOTID LIKE '%12345%'" in c for c in conditions))

    def test_multiple_conditions(self):
        """Should combine multiple conditions."""
        conditions = _build_base_conditions(
            include_dummy=False,
            workorder='GA26',
            lotid='A00'
        )
        # Should have 3 conditions: DUMMY exclusion, workorder, lotid
        self.assertEqual(len(conditions), 3)

    def test_escapes_sql_in_workorder(self):
        """Should escape SQL special characters in workorder."""
        conditions = _build_base_conditions(workorder="test'value")
        # The quote should be escaped
        self.assertTrue(any("test''value" in c for c in conditions))

    def test_escapes_sql_in_lotid(self):
        """Should escape SQL special characters in lotid."""
        conditions = _build_base_conditions(lotid="lot'id")
        # The quote should be escaped
        self.assertTrue(any("lot''id" in c for c in conditions))


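Taken together, the tests above pin down the condition builder's behavior: DUMMY lots are excluded unless `include_dummy=True`, and the workorder/lotid values become escaped fuzzy `LIKE` filters. A sketch consistent with those expectations (inferred from the tests, not the actual `_build_base_conditions` source):

```python
def build_base_conditions(include_dummy=False, workorder=None, lotid=None):
    """Build the list of WHERE-clause fragments shared by the WIP queries (sketch)."""
    def esc(value):
        # Double single quotes before embedding in a string literal
        return value.replace("'", "''")

    conditions = []
    if not include_dummy:
        # DUMMY lots are excluded by default; include_dummy=True overrides this
        conditions.append("LOTID NOT LIKE '%DUMMY%'")
    if workorder:
        # Fuzzy WORKORDER match for the autocomplete-style search
        conditions.append("WORKORDER LIKE '%{}%'".format(esc(workorder)))
    if lotid:
        # Fuzzy LOTID match
        conditions.append("LOTID LIKE '%{}%'".format(esc(lotid)))
    return conditions
```

The caller would join these fragments with `AND` into the final WHERE clause.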
class TestGetWipSummary(unittest.TestCase):
    """Test get_wip_summary function."""

    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_returns_summary_dict_on_success(self, mock_read_sql):
        """Should return a dict with summary fields when the query succeeds."""
        mock_df = pd.DataFrame({
            'TOTAL_LOTS': [9073],
            'TOTAL_QTY': [858878718],
            'HOLD_LOTS': [120],
            'HOLD_QTY': [8213395],
            'SYS_DATE': ['2026-01-26 19:18:29']
        })
        mock_read_sql.return_value = mock_df

        result = get_wip_summary()

        self.assertIsNotNone(result)
        self.assertEqual(result['total_lots'], 9073)
        self.assertEqual(result['total_qty'], 858878718)
        self.assertEqual(result['hold_lots'], 120)
        self.assertEqual(result['hold_qty'], 8213395)
        self.assertIn('sys_date', result)

    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_returns_none_on_empty_result(self, mock_read_sql):
        """Should return None when the query returns an empty DataFrame."""
        mock_read_sql.return_value = pd.DataFrame()

        result = get_wip_summary()

        self.assertIsNone(result)

    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_returns_none_on_exception(self, mock_read_sql):
        """Should return None when the query raises an exception."""
        mock_read_sql.side_effect = Exception("Database error")

        result = get_wip_summary()

        self.assertIsNone(result)

    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_handles_null_values(self, mock_read_sql):
        """Should handle NULL values gracefully."""
        mock_df = pd.DataFrame({
            'TOTAL_LOTS': [None],
            'TOTAL_QTY': [None],
            'HOLD_LOTS': [None],
            'HOLD_QTY': [None],
            'SYS_DATE': [None]
        })
        mock_read_sql.return_value = mock_df

        result = get_wip_summary()

        self.assertIsNotNone(result)
        self.assertEqual(result['total_lots'], 0)
        self.assertEqual(result['total_qty'], 0)
        self.assertEqual(result['hold_lots'], 0)
        self.assertEqual(result['hold_qty'], 0)
|
||||
|
||||
|
||||
class TestGetWipMatrix(unittest.TestCase):
|
||||
"""Test get_wip_matrix function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_matrix_structure(self, mock_read_sql):
|
||||
"""Should return dict with matrix structure."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割', '焊接_DB'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1, 2],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323', 'SOT-23'],
|
||||
'QTY': [50000000, 30000000, 40000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_wip_matrix()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertIn('workcenters', result)
|
||||
self.assertIn('packages', result)
|
||||
self.assertIn('matrix', result)
|
||||
self.assertIn('workcenter_totals', result)
|
||||
self.assertIn('package_totals', result)
|
||||
self.assertIn('grand_total', result)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_workcenters_sorted_by_sequence(self, mock_read_sql):
|
||||
"""Workcenters should be sorted by WORKCENTERSEQUENCE_GROUP."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['焊接_DB', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [2, 1],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOT-23'],
|
||||
'QTY': [40000000, 50000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_wip_matrix()
|
||||
|
||||
self.assertEqual(result['workcenters'], ['切割', '焊接_DB'])
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_packages_sorted_by_qty_desc(self, mock_read_sql):
|
||||
"""Packages should be sorted by total QTY descending."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1],
|
||||
'PRODUCTLINENAME': ['SOD-323', 'SOT-23'],
|
||||
'QTY': [30000000, 50000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_wip_matrix()
|
||||
|
||||
self.assertEqual(result['packages'][0], 'SOT-23') # Higher QTY first
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_structure_on_empty_result(self, mock_read_sql):
|
||||
"""Should return empty structure when no data."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = get_wip_matrix()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(result['workcenters'], [])
|
||||
self.assertEqual(result['packages'], [])
|
||||
self.assertEqual(result['grand_total'], 0)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_calculates_totals_correctly(self, mock_read_sql):
|
||||
"""Should calculate workcenter and package totals correctly."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323'],
|
||||
'QTY': [50000000, 30000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_wip_matrix()
|
||||
|
||||
self.assertEqual(result['workcenter_totals']['切割'], 80000000)
|
||||
self.assertEqual(result['package_totals']['SOT-23'], 50000000)
|
||||
self.assertEqual(result['grand_total'], 80000000)
|
||||
|
||||
|
||||
class TestGetWipHoldSummary(unittest.TestCase):
|
||||
"""Test get_wip_hold_summary function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_hold_items(self, mock_read_sql):
|
||||
"""Should return list of hold items."""
|
||||
mock_df = pd.DataFrame({
|
||||
'REASON': ['YieldLimit', '特殊需求管控'],
|
||||
'LOTS': [21, 44],
|
||||
'QTY': [1084443, 4235060]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_wip_hold_summary()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertIn('items', result)
|
||||
self.assertEqual(len(result['items']), 2)
|
||||
self.assertEqual(result['items'][0]['reason'], 'YieldLimit')
|
||||
self.assertEqual(result['items'][0]['lots'], 21)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_items_on_no_holds(self, mock_read_sql):
|
||||
"""Should return empty items list when no holds."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = get_wip_hold_summary()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(result['items'], [])
|
||||
|
||||
|
||||
class TestGetWipDetail(unittest.TestCase):
|
||||
"""Test get_wip_detail function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_detail_structure(self, mock_read_sql):
|
||||
"""Should return dict with detail structure."""
|
||||
# Mock for summary query
|
||||
summary_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [859],
|
||||
'ON_EQUIPMENT_LOTS': [312],
|
||||
'WAITING_LOTS': [547],
|
||||
'HOLD_LOTS': [15],
|
||||
'SYS_DATE': ['2026-01-26 19:18:29']
|
||||
})
|
||||
# Mock for specs query
|
||||
specs_df = pd.DataFrame({
|
||||
'SPECNAME': ['Spec1', 'Spec2'],
|
||||
'SPECSEQUENCE': [1, 2]
|
||||
})
|
||||
# Mock for lots query
|
||||
lots_df = pd.DataFrame({
|
||||
'LOTID': ['GA25102485-A00-004'],
|
||||
'EQUIPMENTNAME': ['GSMP-0054'],
|
||||
'STATUS': ['ACTIVE'],
|
||||
'HOLDREASONNAME': [None],
|
||||
'QTY': [750],
|
||||
'PRODUCTLINENAME': ['SOT-23'],
|
||||
'SPECNAME': ['Spec1']
|
||||
})
|
||||
|
||||
mock_read_sql.side_effect = [summary_df, specs_df, lots_df]
|
||||
|
||||
result = get_wip_detail('焊接_DB')
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(result['workcenter'], '焊接_DB')
|
||||
self.assertIn('summary', result)
|
||||
self.assertIn('specs', result)
|
||||
self.assertIn('lots', result)
|
||||
self.assertIn('pagination', result)
|
||||
self.assertIn('sys_date', result)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_summary_contains_required_fields(self, mock_read_sql):
|
||||
"""Summary should contain total/on_equipment/waiting/hold lots."""
|
||||
summary_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [100],
|
||||
'ON_EQUIPMENT_LOTS': [60],
|
||||
'WAITING_LOTS': [40],
|
||||
'HOLD_LOTS': [5],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
specs_df = pd.DataFrame({'SPECNAME': [], 'SPECSEQUENCE': []})
|
||||
lots_df = pd.DataFrame()
|
||||
|
||||
mock_read_sql.side_effect = [summary_df, specs_df, lots_df]
|
||||
|
||||
result = get_wip_detail('切割')
|
||||
|
||||
self.assertEqual(result['summary']['total_lots'], 100)
|
||||
self.assertEqual(result['summary']['on_equipment_lots'], 60)
|
||||
self.assertEqual(result['summary']['waiting_lots'], 40)
|
||||
self.assertEqual(result['summary']['hold_lots'], 5)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_pagination_calculated_correctly(self, mock_read_sql):
|
||||
"""Pagination should be calculated correctly."""
|
||||
summary_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [250],
|
||||
'ON_EQUIPMENT_LOTS': [100],
|
||||
'WAITING_LOTS': [150],
|
||||
'HOLD_LOTS': [0],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
specs_df = pd.DataFrame({'SPECNAME': [], 'SPECSEQUENCE': []})
|
||||
lots_df = pd.DataFrame()
|
||||
|
||||
mock_read_sql.side_effect = [summary_df, specs_df, lots_df]
|
||||
|
||||
result = get_wip_detail('切割', page=2, page_size=100)
|
||||
|
||||
self.assertEqual(result['pagination']['page'], 2)
|
||||
self.assertEqual(result['pagination']['page_size'], 100)
|
||||
self.assertEqual(result['pagination']['total_count'], 250)
|
||||
self.assertEqual(result['pagination']['total_pages'], 3)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_none_on_empty_summary(self, mock_read_sql):
|
||||
"""Should return None when summary query returns empty."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = get_wip_detail('不存在的工站')
|
||||
|
||||
self.assertIsNone(result)
|
||||
|
||||
|
||||
class TestGetWorkcenters(unittest.TestCase):
|
||||
"""Test get_workcenters function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_workcenter_list(self, mock_read_sql):
|
||||
"""Should return list of workcenters with lot counts."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '焊接_DB'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 2],
|
||||
'LOT_COUNT': [1377, 859]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_workcenters()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(len(result), 2)
|
||||
self.assertEqual(result[0]['name'], '切割')
|
||||
self.assertEqual(result[0]['lot_count'], 1377)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_on_no_data(self, mock_read_sql):
|
||||
"""Should return empty list when no workcenters."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = get_workcenters()
|
||||
|
||||
self.assertEqual(result, [])
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_none_on_exception(self, mock_read_sql):
|
||||
"""Should return None on exception."""
|
||||
mock_read_sql.side_effect = Exception("Database error")
|
||||
|
||||
result = get_workcenters()
|
||||
|
||||
self.assertIsNone(result)
|
||||
|
||||
|
||||
class TestGetPackages(unittest.TestCase):
|
||||
"""Test get_packages function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_package_list(self, mock_read_sql):
|
||||
"""Should return list of packages with lot counts."""
|
||||
mock_df = pd.DataFrame({
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323'],
|
||||
'LOT_COUNT': [2234, 1392]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = get_packages()
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(len(result), 2)
|
||||
self.assertEqual(result[0]['name'], 'SOT-23')
|
||||
self.assertEqual(result[0]['lot_count'], 2234)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_on_no_data(self, mock_read_sql):
|
||||
"""Should return empty list when no packages."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = get_packages()
|
||||
|
||||
self.assertEqual(result, [])
|
||||
|
||||
|
||||
class TestSearchWorkorders(unittest.TestCase):
|
||||
"""Test search_workorders function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_matching_workorders(self, mock_read_sql):
|
||||
"""Should return list of matching WORKORDER values."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKORDER': ['GA26012001', 'GA26012002', 'GA26012003']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = search_workorders('GA26')
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(len(result), 3)
|
||||
self.assertEqual(result[0], 'GA26012001')
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_for_short_query(self, mock_read_sql):
|
||||
"""Should return empty list for query < 2 characters."""
|
||||
result = search_workorders('G')
|
||||
|
||||
self.assertEqual(result, [])
|
||||
mock_read_sql.assert_not_called()
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_for_empty_query(self, mock_read_sql):
|
||||
"""Should return empty list for empty query."""
|
||||
result = search_workorders('')
|
||||
|
||||
self.assertEqual(result, [])
|
||||
mock_read_sql.assert_not_called()
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_on_no_matches(self, mock_read_sql):
|
||||
"""Should return empty list when no matches found."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = search_workorders('NONEXISTENT')
|
||||
|
||||
self.assertEqual(result, [])
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_respects_limit_parameter(self, mock_read_sql):
|
||||
"""Should respect the limit parameter."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKORDER': ['GA26012001', 'GA26012002']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = search_workorders('GA26', limit=2)
|
||||
|
||||
self.assertEqual(len(result), 2)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_caps_limit_at_50(self, mock_read_sql):
|
||||
"""Should cap limit at 50."""
|
||||
mock_df = pd.DataFrame({'WORKORDER': ['GA26012001']})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
search_workorders('GA26', limit=100)
|
||||
|
||||
# Verify SQL contains FETCH FIRST 50
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn('FETCH FIRST 50 ROWS ONLY', call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_none_on_exception(self, mock_read_sql):
|
||||
"""Should return None on exception."""
|
||||
mock_read_sql.side_effect = Exception("Database error")
|
||||
|
||||
result = search_workorders('GA26')
|
||||
|
||||
self.assertIsNone(result)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""Should exclude DUMMY lots by default."""
|
||||
mock_df = pd.DataFrame({'WORKORDER': []})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
search_workorders('GA26')
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_includes_dummy_when_specified(self, mock_read_sql):
|
||||
"""Should include DUMMY lots when include_dummy=True."""
|
||||
mock_df = pd.DataFrame({'WORKORDER': []})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
search_workorders('GA26', include_dummy=True)
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
|
||||
class TestSearchLotIds(unittest.TestCase):
|
||||
"""Test search_lot_ids function."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_matching_lotids(self, mock_read_sql):
|
||||
"""Should return list of matching LOTID values."""
|
||||
mock_df = pd.DataFrame({
|
||||
'LOTID': ['GA26012345-A00-001', 'GA26012345-A00-002']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
result = search_lot_ids('GA26012345')
|
||||
|
||||
self.assertIsNotNone(result)
|
||||
self.assertEqual(len(result), 2)
|
||||
self.assertEqual(result[0], 'GA26012345-A00-001')
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_for_short_query(self, mock_read_sql):
|
||||
"""Should return empty list for query < 2 characters."""
|
||||
result = search_lot_ids('G')
|
||||
|
||||
self.assertEqual(result, [])
|
||||
mock_read_sql.assert_not_called()
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_empty_list_on_no_matches(self, mock_read_sql):
|
||||
"""Should return empty list when no matches found."""
|
||||
mock_read_sql.return_value = pd.DataFrame()
|
||||
|
||||
result = search_lot_ids('NONEXISTENT')
|
||||
|
||||
self.assertEqual(result, [])
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_returns_none_on_exception(self, mock_read_sql):
|
||||
"""Should return None on exception."""
|
||||
mock_read_sql.side_effect = Exception("Database error")
|
||||
|
||||
result = search_lot_ids('GA26')
|
||||
|
||||
self.assertIsNone(result)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""Should exclude DUMMY lots by default."""
|
||||
mock_df = pd.DataFrame({'LOTID': []})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
search_lot_ids('GA26')
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
|
||||
class TestDummyExclusionInAllFunctions(unittest.TestCase):
|
||||
"""Test DUMMY exclusion is applied in all WIP functions."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_summary_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""get_wip_summary should exclude DUMMY by default."""
|
||||
mock_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [100], 'TOTAL_QTY': [1000],
|
||||
'HOLD_LOTS': [10], 'HOLD_QTY': [100],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_summary()
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_summary_includes_dummy_when_specified(self, mock_read_sql):
|
||||
"""get_wip_summary should include DUMMY when specified."""
|
||||
mock_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [100], 'TOTAL_QTY': [1000],
|
||||
'HOLD_LOTS': [10], 'HOLD_QTY': [100],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_summary(include_dummy=True)
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_matrix_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""get_wip_matrix should exclude DUMMY by default."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1],
|
||||
'PRODUCTLINENAME': ['SOT-23'],
|
||||
'QTY': [1000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_matrix()
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_hold_summary_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""get_wip_hold_summary should exclude DUMMY by default."""
|
||||
mock_df = pd.DataFrame({
|
||||
'REASON': ['YieldLimit'], 'LOTS': [10], 'QTY': [1000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_hold_summary()
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_workcenters_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""get_workcenters should exclude DUMMY by default."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1],
|
||||
'LOT_COUNT': [100]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_workcenters()
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_packages_excludes_dummy_by_default(self, mock_read_sql):
|
||||
"""get_packages should exclude DUMMY by default."""
|
||||
mock_df = pd.DataFrame({
|
||||
'PRODUCTLINENAME': ['SOT-23'], 'LOT_COUNT': [100]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_packages()
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
|
||||
class TestMultipleFilterConditions(unittest.TestCase):
|
||||
"""Test multiple filter conditions work together."""
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_summary_with_all_filters(self, mock_read_sql):
|
||||
"""get_wip_summary should combine all filter conditions."""
|
||||
mock_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [50], 'TOTAL_QTY': [500],
|
||||
'HOLD_LOTS': [5], 'HOLD_QTY': [50],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_summary(workorder='GA26', lotid='A00')
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("WORKORDER LIKE '%GA26%'", call_args)
|
||||
self.assertIn("LOTID LIKE '%A00%'", call_args)
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_matrix_with_all_filters(self, mock_read_sql):
|
||||
"""get_wip_matrix should combine all filter conditions."""
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1],
|
||||
'PRODUCTLINENAME': ['SOT-23'],
|
||||
'QTY': [500]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
|
||||
get_wip_matrix(workorder='GA26', lotid='A00', include_dummy=True)
|
||||
|
||||
call_args = mock_read_sql.call_args[0][0]
|
||||
self.assertIn("WORKORDER LIKE '%GA26%'", call_args)
|
||||
self.assertIn("LOTID LIKE '%A00%'", call_args)
|
||||
# Should NOT contain DUMMY exclusion since include_dummy=True
|
||||
self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
@patch('mes_dashboard.services.wip_service.read_sql_df')
|
||||
def test_get_wip_detail_with_all_filters(self, mock_read_sql):
|
||||
"""get_wip_detail should combine all filter conditions."""
|
||||
summary_df = pd.DataFrame({
|
||||
'TOTAL_LOTS': [10], 'ON_EQUIPMENT_LOTS': [5],
|
||||
'WAITING_LOTS': [5], 'HOLD_LOTS': [1],
|
||||
'SYS_DATE': ['2026-01-26']
|
||||
})
|
||||
specs_df = pd.DataFrame({'SPECNAME': [], 'SPECSEQUENCE': []})
|
||||
lots_df = pd.DataFrame()
|
||||
|
||||
mock_read_sql.side_effect = [summary_df, specs_df, lots_df]
|
||||
|
||||
get_wip_detail(
|
||||
workcenter='切割',
|
||||
package='SOT-23',
|
||||
status='ACTIVE',
|
||||
workorder='GA26',
|
||||
lotid='A00'
|
||||
)
|
||||
|
||||
# Check the first call (summary query) contains all conditions
|
||||
call_args = mock_read_sql.call_args_list[0][0][0]
|
||||
self.assertIn("WORKCENTER_GROUP = '切割'", call_args)
|
||||
self.assertIn("PRODUCTLINENAME = 'SOT-23'", call_args)
|
||||
self.assertIn("STATUS = 'ACTIVE'", call_args)
|
||||
self.assertIn("WORKORDER LIKE '%GA26%'", call_args)
|
||||
self.assertIn("LOTID LIKE '%A00%'", call_args)
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
|
||||
|
||||
|
||||
import pytest
|
||||
|
||||
|
||||
class TestWipServiceIntegration:
|
||||
"""Integration tests that hit the actual database.
|
||||
|
||||
These tests are skipped by default. Run with:
|
||||
python -m pytest tests/test_wip_service.py -k Integration --run-integration
|
||||
"""
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_wip_summary_integration(self):
|
||||
"""Integration test for get_wip_summary."""
|
||||
result = get_wip_summary()
|
||||
assert result is not None
|
||||
assert result['total_lots'] > 0
|
||||
assert 'sys_date' in result
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_wip_matrix_integration(self):
|
||||
"""Integration test for get_wip_matrix."""
|
||||
result = get_wip_matrix()
|
||||
assert result is not None
|
||||
assert len(result['workcenters']) > 0
|
||||
assert result['grand_total'] > 0
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_wip_hold_summary_integration(self):
|
||||
"""Integration test for get_wip_hold_summary."""
|
||||
result = get_wip_hold_summary()
|
||||
assert result is not None
|
||||
assert 'items' in result
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_wip_detail_integration(self):
|
||||
"""Integration test for get_wip_detail."""
|
||||
# First get a valid workcenter
|
||||
workcenters = get_workcenters()
|
||||
assert workcenters is not None and len(workcenters) > 0
|
||||
|
||||
wc_name = workcenters[0]['name']
|
||||
result = get_wip_detail(wc_name, page=1, page_size=10)
|
||||
|
||||
assert result is not None
|
||||
assert result['workcenter'] == wc_name
|
||||
assert 'summary' in result
|
||||
assert 'lots' in result
|
||||
assert 'pagination' in result
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_workcenters_integration(self):
|
||||
"""Integration test for get_workcenters."""
|
||||
result = get_workcenters()
|
||||
assert result is not None
|
||||
assert len(result) > 0
|
||||
assert 'name' in result[0]
|
||||
assert 'lot_count' in result[0]
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_get_packages_integration(self):
|
||||
"""Integration test for get_packages."""
|
||||
result = get_packages()
|
||||
assert result is not None
|
||||
assert len(result) > 0
|
||||
assert 'name' in result[0]
|
||||
assert 'lot_count' in result[0]
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_search_workorders_integration(self):
|
||||
"""Integration test for search_workorders."""
|
||||
# Use a common prefix that likely exists
|
||||
result = search_workorders('GA')
|
||||
assert result is not None
|
||||
# Should return a list (possibly empty if no GA* workorders)
|
||||
assert isinstance(result, list)
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_search_lot_ids_integration(self):
|
||||
"""Integration test for search_lot_ids."""
|
||||
# Use a common prefix that likely exists
|
||||
result = search_lot_ids('GA')
|
||||
assert result is not None
|
||||
assert isinstance(result, list)
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_dummy_exclusion_integration(self):
|
||||
"""Integration test to verify DUMMY exclusion works."""
|
||||
# Get summary with and without DUMMY
|
||||
result_without_dummy = get_wip_summary(include_dummy=False)
|
||||
result_with_dummy = get_wip_summary(include_dummy=True)
|
||||
|
||||
assert result_without_dummy is not None
|
||||
assert result_with_dummy is not None
|
||||
|
||||
# If there are DUMMY lots, with_dummy should have more
|
||||
# (or equal if no DUMMY lots exist)
|
||||
assert result_with_dummy['total_lots'] >= result_without_dummy['total_lots']
|
||||
|
||||
@pytest.mark.integration
|
||||
def test_workorder_filter_integration(self):
|
||||
"""Integration test for workorder filter."""
|
||||
# Get all data first
|
||||
all_result = get_wip_summary()
|
||||
assert all_result is not None
|
||||
|
||||
# Search for a workorder that exists
|
||||
workorders = search_workorders('GA', limit=1)
|
||||
if workorders and len(workorders) > 0:
|
||||
# Filter by that workorder
|
||||
filtered_result = get_wip_summary(workorder=workorders[0])
|
||||
assert filtered_result is not None
|
||||
# Filtered count should be less than or equal to total
|
||||
assert filtered_result['total_lots'] <= all_result['total_lots']
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main()
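# Note: ``--run-integration`` is not a built-in pytest flag; the integration
# class above assumes a conftest.py that registers the option and skips
# ``integration``-marked tests when it is absent. A minimal sketch of such a
# conftest.py (hypothetical; not part of this file, shown here as comments
# because pytest hooks must live in conftest.py or a plugin, not a test module):
#
#     import pytest
#
#     def pytest_addoption(parser):
#         parser.addoption("--run-integration", action="store_true",
#                          default=False, help="run tests marked 'integration'")
#
#     def pytest_collection_modifyitems(config, items):
#         if config.getoption("--run-integration"):
#             return
#         skip = pytest.mark.skip(reason="needs --run-integration option")
#         for item in items:
#             if "integration" in item.keywords:
#                 item.add_marker(skip)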