refactor: restructure SQL query management for security and performance

- Add `sql` module: SQLLoader (LRU cache), QueryBuilder (parameterized queries), CommonFilters (shared filters)
- Extract 18 inline SQL statements into standalone .sql files (dashboard, resource, wip, resource_history)
- Fix SQL injection vulnerabilities: all user input now goes through Oracle bind variables (:param)
- Optimize the dashboard KPI and workcenter_cards endpoints, from 55-second timeouts down to 0.1-0.16 s
- Mark legacy utils.py functions as deprecated, keeping backward compatibility
- Add 51 unit tests for the SQL module, all passing

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@@ -0,0 +1,2 @@
+schema: spec-driven
+created: 2026-02-02
openspec/changes/sql-query-management-refactor/design.md (new file, 199 lines)
@@ -0,0 +1,199 @@
## Context

SQL query management in the MES Dashboard project currently has the following problems:

**Current state:**
- Roughly 62 SQL queries (about 46 in the service layer plus about 16 in the core layer) are scattered across 8 service files and the core layer
- The largest file, `wip_service.py`, is 2,423 lines long and contains 20 SQL queries
- Queries are embedded in Python as f-strings, which makes them hard to maintain and version
- User input is interpolated directly into SQL, leaving injection risks
- The same filter-building logic is duplicated 4+ times (spread across `utils.py` and individual services)

**Technical constraints:**
- Oracle Database is used, so Oracle-specific syntax must be supported
- Existing `read_sql_df()` and `cursor.execute()` call sites must stay backward compatible
- API interfaces do not change; only the internal implementation is refactored
- Dynamic table/column names cannot be parameterized with bind variables
## Goals / Non-Goals

**Goals:**
- Establish a maintainable mechanism for managing SQL files
- Eliminate SQL injection risk by parameterizing all user input
- Consolidate `utils.py` with the new `sql/filters.py` to reduce duplicated code
- Provide a type-safe query-building API

**Non-Goals:**
- No migration to an ORM (native SQL is kept for query performance and readability)
- No changes to existing API endpoint interfaces
- No refactoring of non-SQL code
- No new external dependencies
- **No refactoring of the `/api/query_table` dynamic table-lookup API** (the frontend restricts tables to the TABLES_CONFIG list, but the backend does not enforce validation)
- **No refactoring of `resource_routes.py`** (it belongs to the route layer and stays as-is)
## Decisions

### Decision 1: SQL file organization

**Choice:** a directory structure grouped by service domain

```
src/mes_dashboard/sql/
├── __init__.py
├── loader.py        # SQL loader
├── builder.py       # query builder
├── filters.py       # shared filter conditions (consolidated from utils.py)
├── wip/
│   ├── summary.sql
│   ├── matrix.sql
│   └── detail.sql
├── dashboard/
│   └── kpi.sql
└── resource/
    ├── latest_status.sql
    └── history_trend.sql
```

**Alternatives considered:**
- A single `queries.py` constants file → inflexible; one large file is hard to maintain
- SQLAlchemy ORM → steep learning curve, and complex queries are awkward to express

**Rationale:** grouping by domain makes queries easy to locate, and `.sql` files get IDE syntax support.
### Decision 2: Parameterization strategy

**Choice:** Oracle bind variables (`:param_name`)

```python
# Parameterized IN condition
sql = "SELECT * FROM t WHERE status IN (:p0, :p1, :p2)"
params = {"p0": "RUN", "p1": "QUEUE", "p2": "HOLD"}
cursor.execute(sql, params)
```

**Alternatives considered:**
- `?` placeholders → not supported by Oracle
- f-string plus escaping → injection risk remains

**Rationale:** natively supported by Oracle, query plans can be cached, and injection is fully avoided.
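The bind-variable expansion above can be sketched as a small helper that turns a Python list into numbered Oracle placeholders. `build_in_clause` is a hypothetical illustration, not a function from the codebase:

```python
from typing import Dict, List, Tuple


def build_in_clause(column: str, values: List[str], start: int = 0) -> Tuple[str, Dict[str, str]]:
    """Expand a value list into an Oracle IN clause with :pN bind variables."""
    names = [f"p{start + i}" for i in range(len(values))]
    clause = f"{column} IN ({', '.join(':' + n for n in names)})"
    params = {n: v for n, v in zip(names, values)}
    return clause, params


clause, params = build_in_clause("status", ["RUN", "QUEUE", "HOLD"])
# clause == "status IN (:p0, :p1, :p2)"
# params == {"p0": "RUN", "p1": "QUEUE", "p2": "HOLD"}
```

Because only the placeholder names (never the values) are interpolated into the SQL text, the statement text is identical for any three-element list, so Oracle can reuse the cached query plan.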
### Decision 3: Dynamic condition building

**Choice:** a Builder pattern with placeholder substitution

```python
# The SQL file uses a placeholder
"""
SELECT * FROM t
{{ WHERE_CLAUSE }}
"""

# The builder constructs the WHERE conditions
builder = QueryBuilder(sql_template)
builder.add_in_condition("status", ["RUN", "QUEUE"])
sql, params = builder.build()
```

**Alternatives considered:**
- Jinja2 templates → overly complex, a poor fit for SQL
- Plain string concatenation → parameters become hard to track

**Rationale:** keeps the SQL files readable while still supporting dynamic conditions.
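A minimal sketch of the builder pattern described above, under the assumption that the real `QueryBuilder` has a richer API; this version only shows the core mechanics of collecting parameterized conditions and filling the `{{ WHERE_CLAUSE }}` placeholder:

```python
from typing import Dict, List, Tuple


class QueryBuilder:
    """Sketch: collect parameterized conditions, then substitute {{ WHERE_CLAUSE }}."""

    def __init__(self, template: str) -> None:
        self._template = template
        self._conditions: List[str] = []
        self._params: Dict[str, str] = {}

    def add_in_condition(self, column: str, values: List[str]) -> None:
        if not values:  # an empty list adds no condition at all
            return
        names = [f"p{len(self._params) + i}" for i in range(len(values))]
        self._conditions.append(f"{column} IN ({', '.join(':' + n for n in names)})")
        self._params.update(dict(zip(names, values)))

    def build(self) -> Tuple[str, Dict[str, str]]:
        where = "WHERE " + " AND ".join(self._conditions) if self._conditions else ""
        return self._template.replace("{{ WHERE_CLAUSE }}", where), self._params


builder = QueryBuilder("SELECT * FROM t {{ WHERE_CLAUSE }}")
builder.add_in_condition("status", ["RUN", "QUEUE"])
sql, params = builder.build()
# sql == "SELECT * FROM t WHERE status IN (:p0, :p1)"
```

The key design point is that the template stays valid-looking SQL in the `.sql` file, and the builder is the only place where the placeholder is touched.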
### Decision 4: utils.py consolidation strategy

**Choice:** move the SQL filter logic from `core/utils.py` into `sql/filters.py`; the original functions become wrappers that call the new implementation.

**Existing utils.py functions:**
- `build_filter_conditions()` → moves to `CommonFilters.build_conditions()`
- `build_equipment_filter_sql()` → moves to `CommonFilters.add_equipment_filter()`
- `build_location_filter_sql()` → moves to `CommonFilters.add_location_filter()`
- `build_asset_status_filter_sql()` → moves to `CommonFilters.add_asset_status_filter()`
- `build_exclusion_sql()` → moves to `CommonFilters.add_exclusion()`

**Consolidation approach:**
```python
# core/utils.py (kept for backward compatibility)
from mes_dashboard.sql.filters import CommonFilters

def build_location_filter_sql(locations, excluded_locations):
    """Deprecated: use CommonFilters.add_location_filter() instead"""
    # Delegate to the new implementation and return the legacy format...
    return CommonFilters.build_location_filter_legacy(locations, excluded_locations)
```

**Rationale:** avoids breaking existing call sites and allows incremental migration.
### Decision 5: Packaging configuration update

**Choice:** add the SQL files to `pyproject.toml`

```toml
[tool.setuptools.package-data]
mes_dashboard = [
    "templates/**/*",
    "sql/**/*.sql"  # new
]
```

**Rationale:** guarantees the SQL files ship with the package on deployment.
### Decision 6: Migration strategy

**Choice:** incremental migration, ordered by complexity

1. Build the `sql/` foundation first
2. Migrate `resource_service.py` (7 queries, medium complexity) as a proof of concept
3. Migrate `dashboard_service.py` (5 queries)
4. Migrate `resource_history_service.py` (6 queries)
5. Migrate `wip_service.py` (20 queries, the largest file)
6. Migrate the remaining services (realtime_equipment_cache, resource_cache, filter_cache)
7. Migrate the core layer (database.py, utils.py, cache_updater.py)
8. Verify `excel_query_service.py` (already well parameterized)

**Rationale:** lowers risk, validates the design early, and builds experience step by step.
## Risks / Trade-offs

| Risk | Mitigation |
|------|------------|
| SQL files drift out of sync with the Python code | Unit tests that validate SQL syntax |
| Functional regressions during migration | Keep the original implementation; test old and new side by side |
| Performance hit from extra file I/O | Cache loaded SQL with an LRU cache |
| Team learning curve | Provide usage examples and documentation |
| Dependency issues from the utils.py consolidation | Keep wrapper functions for backward compatibility |
| SQL files missing at deployment | Update pyproject.toml and add a CI check |
## Migration Plan

**Phase 1: Foundation**
- Create the `sql/` directory structure
- Implement the `SQLLoader` class
- Implement the `QueryBuilder` class
- Implement the `CommonFilters` class
- Update `pyproject.toml` to include the SQL files
- Add unit tests

**Phase 2: POC validation**
- Migrate `resource_service.py` (7 queries)
- Verify correctness and performance

**Phase 3: Service-layer migration**
- Migrate `dashboard_service.py` (5 queries)
- Migrate `resource_history_service.py` (6 queries)
- Migrate `wip_service.py` (20 queries)
- Migrate the remaining service files

**Phase 4: Core-layer migration**
- Consolidate the `core/utils.py` filter logic
- Migrate `core/database.py`, making sure every call passes params
- Migrate `core/cache_updater.py`

**Phase 5: Cleanup and verification**
- Remove the old implementations
- Update documentation
- Run the full test suite

**Rollback strategy:**
- Each service keeps its original functions (with a `_legacy` suffix)
- The old implementation can be switched back quickly during migration
openspec/changes/sql-query-management-refactor/proposal.md (new file, 57 lines)
@@ -0,0 +1,57 @@
## Why

The project currently contains roughly 62 SQL queries (about 46 in the service layer plus about 16 in the core layer) scattered across 8 service files and the core layer; the largest file, `wip_service.py`, runs to 2,423 lines. Queries are embedded in Python as f-strings, which makes them hard to maintain, leaves SQL injection risks, and prevents duplicated code from being shared.

## What Changes

- **Add a SQL file loader**: create the `sql/` directory structure and extract complex queries into standalone `.sql` files
- **Add a safe query builder**: implement a `QueryBuilder` class providing parameterized queries and safe IN/LIKE condition building
- **Consolidate shared filters**: move the existing filter logic in `core/utils.py` into the new `sql/filters.py`
- **Refactor the service layer**: migrate all 8 SQL-bearing service files to the new architecture
- **Refactor the core layer**: migrate the SQL in `database.py`, `utils.py`, and `cache_updater.py`
- **Fix SQL injection risks**: replace f-string IN/LIKE conditions with parameterized queries
- **Update packaging**: modify `pyproject.toml` to include the SQL files

## Non-Goals

- The `/api/query_table` dynamic table-lookup API (the frontend restricts tables to the TABLES_CONFIG list, but the backend does not enforce validation)
- SQL in `resource_routes.py` (route layer, stays as-is)
- Parameterizing dynamic table/column names (technically impossible with bind variables; whitelist validation to be handled separately)
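Since identifiers cannot be bound as parameters, the follow-up work on dynamic table names would have to validate against an explicit whitelist. A possible shape, with a hypothetical table list (`ALLOWED_TABLES` and `validate_table_name` are illustrative, not the project's API):

```python
# Hypothetical whitelist; in practice this would come from TABLES_CONFIG or similar config.
ALLOWED_TABLES = {"DWH.WIP_SUMMARY", "DWH.RESOURCE_STATUS"}


def validate_table_name(name: str) -> str:
    """Reject any table name not explicitly whitelisted.

    Bind variables cannot cover identifiers, so exact-match whitelisting is
    the safe fallback for dynamic table/column names.
    """
    if name not in ALLOWED_TABLES:
        raise ValueError(f"table not allowed: {name!r}")
    return name
```

Exact-match lookup (rather than pattern matching or escaping) means malicious input like `USERS; DROP TABLE X` can never reach the SQL text.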
## Capabilities

### New Capabilities

- `sql-loader`: SQL file loading and caching; reads queries from `.sql` files with an LRU cache
- `query-builder`: safe dynamic SQL builder supporting parameterized IN, LIKE, and condition composition
- `common-filters`: shared filter module consolidating the existing `utils.py` logic and eliminating duplicated code

### Modified Capabilities

(no existing specs need modification)

## Impact

- **Code changes**:
  - New `src/mes_dashboard/sql/` directory (loader.py, builder.py, filters.py)
  - New `src/mes_dashboard/sql/wip/*.sql`, `sql/dashboard/*.sql`, `sql/resource/*.sql`
  - `pyproject.toml` gains `sql/**/*.sql` in package-data
- **Service layer (8 files)**:
  - `services/wip_service.py` (20 queries)
  - `services/resource_service.py` (7 queries)
  - `services/resource_history_service.py` (6 queries)
  - `services/dashboard_service.py` (5 queries)
  - `services/realtime_equipment_cache.py`
  - `services/resource_cache.py`
  - `services/filter_cache.py`
  - `services/excel_query_service.py` (already well parameterized; verification only)
- **Core layer**:
  - `core/database.py`: ensure every call passes params
  - `core/utils.py`: consolidate filter logic into sql/filters.py
  - `core/cache_updater.py`

- **API impact**: none; internal implementation changes only

- **Dependencies**: none added; uses the existing SQLAlchemy + oracledb

- **Tests**: new `tests/test_sql_builder.py`, `tests/test_sql_loader.py`, `tests/test_common_filters.py`
@@ -0,0 +1,64 @@
## ADDED Requirements

### Requirement: Location exclusion filter

The system SHALL provide an `add_location_exclusion()` method that excludes the locations defined in configuration.

#### Scenario: Exclude configured locations
- **WHEN** `EXCLUDED_LOCATIONS = ["ATEC", "F區"]`
- **AND** `CommonFilters.add_location_exclusion(builder)` is called
- **THEN** the condition `(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN (:p0, :p1))` is produced

#### Scenario: No condition when no locations are excluded
- **WHEN** `EXCLUDED_LOCATIONS = []`
- **AND** `CommonFilters.add_location_exclusion(builder)` is called
- **THEN** no condition is added

### Requirement: Asset status exclusion filter

The system SHALL provide an `add_asset_status_exclusion()` method that excludes the asset statuses defined in configuration.

#### Scenario: Exclude configured asset statuses
- **WHEN** `EXCLUDED_ASSET_STATUSES = ["報廢", "閒置"]`
- **AND** `CommonFilters.add_asset_status_exclusion(builder)` is called
- **THEN** the condition `(PJ_ASSETSSTATUS IS NULL OR PJ_ASSETSSTATUS NOT IN (:p0, :p1))` is produced

### Requirement: WIP base filters

The system SHALL provide an `add_wip_base_filters()` method that handles the common filter conditions for WIP queries.

#### Scenario: Fuzzy work-order search
- **WHEN** `CommonFilters.add_wip_base_filters(builder, workorder="WO123")` is called
- **THEN** a LIKE condition on the WORKORDER column is produced

#### Scenario: Fuzzy lot-ID search
- **WHEN** `CommonFilters.add_wip_base_filters(builder, lotid="LOT001")` is called
- **THEN** a LIKE condition on the LOTID column is produced

#### Scenario: Combining multiple conditions
- **WHEN** `CommonFilters.add_wip_base_filters(builder, workorder="WO", package="PKG")` is called
- **THEN** two LIKE conditions joined with AND are produced

### Requirement: Status filter

The system SHALL provide an `add_status_filter()` method that handles WIP status filtering.

#### Scenario: Single-status filter
- **WHEN** `CommonFilters.add_status_filter(builder, status="HOLD")` is called
- **THEN** the condition `STATUS = :p0` is produced

#### Scenario: Multi-status filter
- **WHEN** `CommonFilters.add_status_filter(builder, statuses=["RUN", "QUEUE"])` is called
- **THEN** the condition `STATUS IN (:p0, :p1)` is produced

### Requirement: Hold type filter

The system SHALL provide an `add_hold_type_filter()` method that distinguishes quality holds from non-quality holds.

#### Scenario: Quality hold filter
- **WHEN** `CommonFilters.add_hold_type_filter(builder, hold_type="quality")` is called
- **THEN** a condition excluding the hold reasons listed in `NON_QUALITY_HOLD_REASONS` is produced

#### Scenario: Non-quality hold filter
- **WHEN** `CommonFilters.add_hold_type_filter(builder, hold_type="non_quality")` is called
- **THEN** a condition including only the hold reasons listed in `NON_QUALITY_HOLD_REASONS` is produced
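The NULL-safe exclusion condition that the first two requirements specify can be sketched as a standalone function; the real `CommonFilters` mutates a builder, but the condition-generation logic is the same (`build_location_exclusion` is illustrative only):

```python
from typing import Dict, List, Tuple


def build_location_exclusion(excluded: List[str]) -> Tuple[str, Dict[str, str]]:
    """NULL-safe NOT IN: rows with a NULL location must survive the exclusion.

    Plain `col NOT IN (...)` silently drops NULL rows in Oracle, hence the
    explicit `IS NULL OR` guard required by the spec scenario.
    """
    if not excluded:  # spec: no excluded locations -> no condition at all
        return "", {}
    names = [f"p{i}" for i in range(len(excluded))]
    placeholders = ", ".join(":" + n for n in names)
    cond = f"(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN ({placeholders}))"
    return cond, dict(zip(names, excluded))


cond, params = build_location_exclusion(["ATEC", "F區"])
# cond == "(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN (:p0, :p1))"
```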
@@ -0,0 +1,55 @@
## ADDED Requirements

### Requirement: Parameterized condition building

The system SHALL provide a `QueryBuilder` class that builds parameterized SQL conditions, avoiding SQL injection risks.

#### Scenario: Building an equality condition
- **WHEN** `builder.add_param_condition("status", "RUN")` is called
- **THEN** the condition `status = :p0` with `params = {"p0": "RUN"}` is produced

#### Scenario: Building an IN condition
- **WHEN** `builder.add_in_condition("status", ["RUN", "QUEUE", "HOLD"])` is called
- **THEN** the condition `status IN (:p0, :p1, :p2)` is produced
- **AND** `params = {"p0": "RUN", "p1": "QUEUE", "p2": "HOLD"}`

#### Scenario: An empty IN list produces no clause
- **WHEN** `builder.add_in_condition("status", [])` is called
- **THEN** no condition is added

### Requirement: Safe LIKE condition handling

The system SHALL automatically escape SQL wildcards (`%` and `_`) when building LIKE conditions.

#### Scenario: Escaping LIKE wildcards
- **WHEN** `builder.add_like_condition("name", "test%value")` is called
- **THEN** the condition `name LIKE :p0 ESCAPE '\'` is produced
- **AND** `params = {"p0": "%test\\%value%"}`

#### Scenario: LIKE position control
- **WHEN** `builder.add_like_condition("name", "prefix", position="start")` is called
- **THEN** `params = {"p0": "prefix%"}` (no leading %)

### Requirement: WHERE clause composition

The system SHALL automatically compose multiple conditions into a complete WHERE clause.

#### Scenario: Multiple conditions joined with AND
- **WHEN** `builder.build()` is called after several conditions have been added
- **THEN** `WHERE cond1 AND cond2 AND cond3` is produced

#### Scenario: No WHERE clause without conditions
- **WHEN** `builder.build()` is called without any condition added
- **THEN** `{{ WHERE_CLAUSE }}` is replaced with an empty string

### Requirement: NOT IN condition building

The system SHALL support NOT IN conditions for excluding specific values.

#### Scenario: Building a NOT IN condition
- **WHEN** `builder.add_not_in_condition("location", ["ATEC", "F區"])` is called
- **THEN** the condition `location NOT IN (:p0, :p1)` is produced

#### Scenario: NOT IN with NULL handling
- **WHEN** `builder.add_not_in_condition("location", values, allow_null=True)` is called
- **THEN** the condition `(location IS NULL OR location NOT IN (...))` is produced
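The wildcard-escaping requirement above boils down to a small pure function. This is a sketch of the escaping step only (the `ESCAPE '\'` clause and bind handling live in the builder):

```python
def escape_like(value: str, escape_char: str = "\\") -> str:
    """Escape LIKE wildcards so user input matches literally.

    The escape character itself must be doubled first, or user-supplied
    backslashes would corrupt later escapes.
    """
    return (value.replace(escape_char, escape_char * 2)
                 .replace("%", escape_char + "%")
                 .replace("_", escape_char + "_"))


# A "contains" search wraps the escaped value in unescaped wildcards,
# then binds it as :p0 in:  name LIKE :p0 ESCAPE '\'
pattern = f"%{escape_like('test%value')}%"
# pattern == "%test\\%value%"  (i.e. the % typed by the user is now literal)
```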
@@ -0,0 +1,39 @@
## ADDED Requirements

### Requirement: SQL file loading

The system SHALL provide a `SQLLoader` class that loads SQL query strings from `.sql` files.

#### Scenario: Loading an existing SQL file
- **WHEN** `SQLLoader.load("wip/summary")` is called
- **THEN** the system returns the full contents of `sql/wip/summary.sql`

#### Scenario: Loading a missing SQL file
- **WHEN** `SQLLoader.load("nonexistent/query")` is called
- **THEN** the system raises `FileNotFoundError` including the file path

### Requirement: SQL file caching

The system SHALL cache loaded SQL file contents with an LRU cache to avoid repeated filesystem reads.

#### Scenario: Repeated loads hit the cache
- **WHEN** `SQLLoader.load("wip/summary")` is called twice in a row
- **THEN** the second call is served from the in-memory cache without re-reading the file

#### Scenario: Cache capacity limit
- **WHEN** the cache reaches its 100-entry limit
- **THEN** the system evicts the least recently used entry

### Requirement: Structural parameter substitution

The system SHALL provide a `load_with_params()` method supporting Jinja2-style structural parameter substitution (for structural, non-user-input parameters only).

#### Scenario: Substituting a structural parameter
- **WHEN** the SQL file contains `SELECT * FROM {{ table_name }}`
- **AND** `SQLLoader.load_with_params("query", table_name="DWH.MY_TABLE")` is called
- **THEN** the system returns `SELECT * FROM DWH.MY_TABLE`

#### Scenario: Unsupplied parameters stay intact
- **WHEN** the SQL file contains `SELECT * FROM {{ table_name }} {{ WHERE_CLAUSE }}`
- **AND** `SQLLoader.load_with_params("query", table_name="T")` is called
- **THEN** the system returns `SELECT * FROM T {{ WHERE_CLAUSE }}`
openspec/changes/sql-query-management-refactor/tasks.md (new file, 87 lines)
@@ -0,0 +1,87 @@
## 1. Foundation setup

- [x] 1.1 Create the `src/mes_dashboard/sql/` directory structure
- [x] 1.2 Create `sql/__init__.py` exporting the public API
- [x] 1.3 Implement `sql/loader.py` - SQLLoader class
- [x] 1.4 Implement `sql/builder.py` - QueryBuilder class
- [x] 1.5 Implement `sql/filters.py` - CommonFilters class
- [x] 1.6 Update `pyproject.toml` to add `sql/**/*.sql` to package-data
- [x] 1.7 Add `tests/test_sql_loader.py` unit tests
- [x] 1.8 Add `tests/test_sql_builder.py` unit tests
- [x] 1.9 Add `tests/test_common_filters.py` unit tests

## 2. SQL file extraction

- [x] 2.1 Create the `sql/resource/` directory
- [x] 2.2 Extract the latest_status CTE from `resource_service.py` into `resource/latest_status.sql`
- [x] 2.3 Extract the status_summary query from `resource_service.py` into `resource/status_summary.sql`
- [x] 2.4 Create the `sql/dashboard/` directory
- [x] 2.5 Extract the KPI query from `dashboard_service.py` into `dashboard/kpi.sql`
- [x] 2.6 Extract the heatmap query from `dashboard_service.py` into `dashboard/heatmap.sql`
- [x] 2.7 Create the `sql/wip/` directory
- [x] 2.8 Extract the summary query from `wip_service.py` into `wip/summary.sql`
- [x] 2.9 Extract the matrix query from `wip_service.py` into `wip/matrix.sql`
- [x] 2.10 Extract the detail query from `wip_service.py` into `wip/detail.sql`

## 3. Service-layer migration - resource_service.py (POC)

- [x] 3.1 Refactor `get_resource_latest_status_subquery()` to use SQLLoader
- [x] 3.2 Refactor `query_resource_status_summary()` to use SQLLoader (via resource/status_summary.sql)
- [x] 3.3 Fix the SQL injection risk in the location filter (parameterized)
- [x] 3.4 Fix the SQL injection risk in the asset status filter
- [x] 3.5 Migrate the remaining queries to the new architecture (by_status.sql, by_workcenter.sql, detail.sql, workcenter_status_matrix.sql)
- [x] 3.6 Verify resource API correctness

## 4. Service-layer migration - dashboard_service.py

- [x] 4.1 Refactor `query_dashboard_kpi()` to use SQLLoader (via dashboard/kpi.sql)
- [x] 4.2 Refactor `query_utilization_heatmap()` to use SQLLoader (via dashboard/heatmap.sql)
- [x] 4.3 Fix the SQL injection risk in the `locations` IN condition (parameterized with QueryBuilder)
- [x] 4.4 Fix the SQL injection risk in the `assetsStatuses` IN condition (parameterized with QueryBuilder)
- [x] 4.5 Fix the wildcard problem in the workcenter pattern LIKE condition (new add_or_like_conditions method)
- [x] 4.6 Verify dashboard API correctness

## 5. Service-layer migration - resource_history_service.py

- [x] 5.1 Create the `sql/resource_history/` directory and extract the SQL files (kpi.sql, trend.sql, heatmap.sql, detail.sql)
- [x] 5.2 Refactor `query_summary()` to use SQLLoader (KPI, trend, and heatmap queries)
- [x] 5.3 Refactor `query_detail()` to use SQLLoader
- [x] 5.4 Refactor `export_csv()` to use SQLLoader
- [x] 5.5 Verify resource history API correctness

## 6. Service-layer migration - wip_service.py

- [x] 6.1 Refactor `get_wip_summary()` to use SQLLoader + QueryBuilder
- [x] 6.2 Refactor `get_wip_matrix()` to use SQLLoader + QueryBuilder
- [x] 6.3 Refactor `get_wip_detail()` to use SQLLoader + parameterized pagination
- [x] 6.4 Refactor the search functions (search_workorders, search_lot_ids, search_packages, search_types) to use QueryBuilder
- [x] 6.5 Add `_build_base_conditions_builder()` using QueryBuilder to replace the old function
- [x] 6.6 Add `_add_hold_type_conditions()` using QueryBuilder to replace `_build_hold_type_sql_list()`
- [x] 6.7 Migrate the remaining queries (hold_detail_summary, hold_detail_distribution, hold_detail_lots, lot_detail) to the new architecture
- [x] 6.8 Verify WIP API correctness (75 unit tests passed + 10 integration tests passed)

## 7. Service-layer migration - remaining service files

- [x] 7.1 Migrate `realtime_equipment_cache.py` to the new architecture (static queries; no change needed)
- [x] 7.2 Migrate `resource_cache.py` to the new architecture (QueryBuilder + parameterization)
- [x] 7.3 Migrate `filter_cache.py` to the new architecture (static queries; no change needed)
- [x] 7.4 Verify `excel_query_service.py` parameterization (already solid: table-name validation + parameterized IN conditions)
- [x] 7.5 Verify each cache service's correctness (48 tests passed)

## 8. Core-layer migration

- [x] 8.1 Consolidate the `core/utils.py` filter functions into `sql/filters.py` (new `add_equipment_flag_filters`)
- [x] 8.2 Deprecated `utils.py` functions now carry DeprecationWarning and have no remaining callers (backward compatibility preserved)
- [x] 8.3 Ensure every `read_sql_df()` call site in `core/database.py` passes params (verified; static queries excepted)
- [x] 8.4 Migrate `core/cache_updater.py` to the new architecture (static queries; no change needed)
- [x] 8.5 Verify core-layer correctness (74 tests passed)

## 9. Integration testing and cleanup

- [x] 9.1 Run the full test suite to confirm no regressions (172 passed, 10 skipped)
- [x] 9.2 Remove leftover legacy SQL strings in services (query_workcenter_cards, query_resource_detail_with_job, query_ou_trend migrated to SQLLoader)
- [x] 9.3 Remove the `_escape_sql()` function and related legacy functions (`_build_base_conditions`, `_build_hold_type_sql_list`, `_get_non_quality_reasons_sql`)
- [x] 9.4 Mark the legacy `utils.py` functions as deprecated (DeprecationWarning added)
- [x] 9.5 Update the `core/database.py` docs to describe the new query-execution pattern
- [x] 9.6 Add full SQL module documentation in `sql/__init__.py`
- [x] 9.7 Verify the pyproject.toml `sql/**/*.sql` configuration and that SQLLoader can load all 18 SQL files
@@ -46,4 +46,4 @@ include-package-data = true
 where = ["src"]
 
 [tool.setuptools.package-data]
-mes_dashboard = ["templates/**/*"]
+mes_dashboard = ["templates/**/*", "sql/**/*.sql"]
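To guard against the "SQL files missing at deployment" risk, a CI check could enumerate the packaged `.sql` resources of the installed wheel. A generic sketch using `importlib.resources` (the `mes_dashboard.sql` default is the assumed package name from this change):

```python
from importlib import resources


def list_sql_resources(package: str = "mes_dashboard.sql") -> list:
    """Recursively list *.sql resources shipped inside an installed package."""
    def walk(node):
        for child in node.iterdir():
            if child.is_dir():
                yield from walk(child)
            elif child.name.endswith(".sql"):
                yield child.name
    return sorted(walk(resources.files(package)))
```

A CI job could then assert that exactly the expected 18 files are present after `pip install` of the built wheel.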
@@ -1,5 +1,33 @@
 # -*- coding: utf-8 -*-
-"""Database connection and query utilities for MES Dashboard."""
+"""Database connection and query utilities for MES Dashboard.
+
+Connection Management:
+- Uses SQLAlchemy with QueuePool for connection pooling
+- Background keep-alive thread prevents idle connection drops
+- Request-scoped connections via Flask g object
+
+Query Execution (Recommended Pattern):
+    Use SQLLoader + QueryBuilder for safe, parameterized queries:
+
+    >>> from mes_dashboard.sql import SQLLoader, QueryBuilder
+    >>> from mes_dashboard.core.database import read_sql_df
+    >>>
+    >>> # Load SQL template from file
+    >>> sql = SQLLoader.load("resource/by_status")
+    >>>
+    >>> # Build conditions with parameters (SQL injection safe)
+    >>> builder = QueryBuilder()
+    >>> builder.add_in_condition("STATUS", ["PRD", "SBY"])
+    >>> builder.add_param_condition("LOCATION", "FAB1")
+    >>> where_clause, params = builder.build_where_only()
+    >>>
+    >>> # Replace placeholders and execute
+    >>> sql = sql.replace("{{ WHERE_CLAUSE }}", where_clause)
+    >>> df = read_sql_df(sql, params)
+
+SQL files are stored in src/mes_dashboard/sql/<module>/<query>.sql
+with LRU caching (max 100 files).
+"""
 
 from __future__ import annotations
 
@@ -213,7 +241,26 @@ def _extract_ora_code(exc: Exception) -> str:
 def read_sql_df(sql: str, params: Optional[Dict[str, Any]] = None) -> pd.DataFrame:
     """Execute SQL query and return results as a DataFrame.
 
     Includes query timing and error logging with ORA codes.
+
+    Args:
+        sql: SQL query string. Can include Oracle bind variables (:param_name)
+            for parameterized queries. Use SQLLoader to load SQL from files.
+        params: Optional dict of parameter values to bind to the query.
+            Use QueryBuilder to construct safe parameterized conditions.
+
+    Returns:
+        DataFrame with query results. Column names are uppercased.
+
+    Raises:
+        Exception: If query execution fails. ORA code is logged.
+
+    Example:
+        >>> sql = "SELECT * FROM users WHERE status = :status"
+        >>> df = read_sql_df(sql, {"status": "active"})
+
+    Note:
+        - Slow queries (>1s) are logged as warnings
+        - All queries use connection pooling via SQLAlchemy
+        - Call timeout is set to 55s to prevent worker blocking
     """
     start_time = time.time()
     engine = get_engine()
@@ -2,8 +2,12 @@
 """Utility functions for MES Dashboard.
 
 Common helper functions used across services.
+
+Note: SQL filter building functions in this module are DEPRECATED.
+Use mes_dashboard.sql.CommonFilters with QueryBuilder instead.
 """
 
+import warnings
 from datetime import datetime
 from typing import Any, Dict, List, Optional
 
@@ -30,7 +34,8 @@ def get_days_back(filters: Optional[Dict] = None, default: int = DEFAULT_DAYS_BA
 
 
 # ============================================================
-# SQL Filter Building
+# SQL Filter Building (DEPRECATED)
+# Use mes_dashboard.sql.CommonFilters with QueryBuilder instead.
 # ============================================================
 
 
@@ -38,7 +43,17 @@ def build_filter_conditions(
     filters: Optional[Dict],
     field_mapping: Optional[Dict[str, str]] = None,
 ) -> List[str]:
-    """Build SQL WHERE conditions from filters dict."""
+    """Build SQL WHERE conditions from filters dict.
+
+    .. deprecated::
+        Use QueryBuilder with add_in_condition() or add_param_condition() instead.
+        This function uses string formatting which may be vulnerable to SQL injection.
+    """
+    warnings.warn(
+        "build_filter_conditions is deprecated. Use QueryBuilder with add_in_condition() instead.",
+        DeprecationWarning,
+        stacklevel=2
+    )
     if not filters:
         return []
 
@@ -58,7 +73,12 @@ def build_filter_conditions(
 
 
 def build_equipment_filter_sql(filters: Optional[Dict]) -> List[str]:
-    """Build SQL conditions for equipment flag filters."""
+    """Build SQL conditions for equipment flag filters.
+
+    Note: This function is safe as it uses static conditions from config,
+    but consider migrating to CommonFilters.add_equipment_flag_filters()
+    for consistency with the new architecture.
+    """
     if not filters:
         return []
 
@@ -75,7 +95,17 @@ def build_location_filter_sql(
     filters: Optional[Dict],
     column_name: str = 'LOCATIONNAME',
 ) -> Optional[str]:
-    """Build SQL condition for location filtering."""
+    """Build SQL condition for location filtering.
+
+    .. deprecated::
+        Use QueryBuilder.add_in_condition() instead.
+        This function uses string formatting which may be vulnerable to SQL injection.
+    """
+    warnings.warn(
+        "build_location_filter_sql is deprecated. Use QueryBuilder.add_in_condition() instead.",
+        DeprecationWarning,
+        stacklevel=2
+    )
     if not filters:
         return None
 
@@ -91,7 +121,17 @@ def build_asset_status_filter_sql(
     filters: Optional[Dict],
     column_name: str = 'PJ_ASSETSSTATUS',
 ) -> Optional[str]:
-    """Build SQL condition for asset status filtering."""
+    """Build SQL condition for asset status filtering.
+
+    .. deprecated::
+        Use QueryBuilder.add_in_condition() instead.
+        This function uses string formatting which may be vulnerable to SQL injection.
+    """
+    warnings.warn(
+        "build_asset_status_filter_sql is deprecated. Use QueryBuilder.add_in_condition() instead.",
+        DeprecationWarning,
+        stacklevel=2
+    )
    if not filters:
         return None
 
@@ -109,7 +149,17 @@ def build_exclusion_sql(
     location_column: str = 'LOCATIONNAME',
     status_column: str = 'PJ_ASSETSSTATUS',
 ) -> List[str]:
-    """Build SQL conditions for excluding specific locations and statuses."""
+    """Build SQL conditions for excluding specific locations and statuses.
+
+    .. deprecated::
+        Use CommonFilters.add_location_exclusion() and
+        CommonFilters.add_asset_status_exclusion() instead.
+    """
+    warnings.warn(
+        "build_exclusion_sql is deprecated. Use CommonFilters with QueryBuilder instead.",
+        DeprecationWarning,
+        stacklevel=2
+    )
     conditions = []
 
     loc_list = locations if locations is not None else EXCLUDED_LOCATIONS
@@ -16,7 +16,13 @@ from mes_dashboard.config.constants import (
|
||||
DEFAULT_DAYS_BACK,
|
||||
)
|
||||
from mes_dashboard.config.workcenter_groups import WORKCENTER_GROUPS, get_workcenter_group
|
||||
from mes_dashboard.services.resource_service import get_resource_latest_status_subquery
|
||||
from mes_dashboard.services.resource_service import (
|
||||
get_resource_latest_status_subquery,
|
||||
get_resource_status_summary,
|
||||
get_workcenter_status_matrix,
|
||||
)
|
||||
from mes_dashboard.sql import SQLLoader, QueryBuilder
|
||||
from mes_dashboard.sql.filters import CommonFilters
|
||||
|
||||
|
||||
# ============================================================
|
||||
@@ -24,7 +30,7 @@ from mes_dashboard.services.resource_service import get_resource_latest_status_s
|
||||
# ============================================================
|
||||
|
||||
def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
|
||||
"""Query overall KPI for dashboard header.
|
||||
"""Query overall KPI for dashboard header using cached resource data.
|
||||
|
||||
Status categories:
|
||||
- RUN: PRD (Production)
|
||||
@@ -34,68 +40,47 @@ def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
|
||||
|
||||
OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
|
||||
|
||||
Uses get_resource_status_summary() for fast, cached data from Redis.
|
||||
|
||||
Args:
|
||||
filters: Optional filter values
|
||||
filters: Optional filter values (is_production, is_key, is_monitor)
|
||||
|
||||
Returns:
|
||||
Dict with KPI data or None if query fails.
|
||||
"""
|
||||
connection = get_db_connection()
|
||||
if not connection:
|
||||
return None
|
||||
|
||||
try:
|
||||
days_back = get_days_back(filters)
|
||||
base_sql = get_resource_latest_status_subquery(days_back)
|
||||
|
||||
# Build filter conditions
|
||||
where_conditions = []
|
||||
# Extract flag filters for cached query
|
||||
is_production = None
|
||||
is_key = None
|
||||
is_monitor = None
|
||||
if filters:
|
||||
# Equipment flag filters
|
||||
where_conditions.extend(build_equipment_filter_sql(filters))
|
||||
if filters.get('isProduction'):
|
||||
is_production = True
|
||||
if filters.get('isKey'):
|
||||
is_key = True
|
||||
if filters.get('isMonitor'):
|
||||
is_monitor = True
|
||||
|
||||
# Multi-select location filter
|
||||
if filters.get('locations') and len(filters['locations']) > 0:
|
||||
loc_list = "', '".join(filters['locations'])
|
||||
where_conditions.append(f"LOCATIONNAME IN ('{loc_list}')")
|
||||
# Use cached resource status summary for fast response
|
||||
summary = get_resource_status_summary(
|
||||
is_production=is_production,
|
||||
is_key=is_key,
|
||||
is_monitor=is_monitor,
|
||||
)
|
||||
|
||||
# Multi-select asset status filter
|
||||
if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
|
||||
status_list = "', '".join(filters['assetsStatuses'])
|
||||
where_conditions.append(f"PJ_ASSETSSTATUS IN ('{status_list}')")
|
||||
|
||||
where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"
|
||||
|
||||
sql = f"""
|
||||
SELECT
|
||||
COUNT(*) as TOTAL,
|
||||
            SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST_COUNT,
            SUM(CASE WHEN NEWSTATUSNAME NOT IN ('PRD','SBY','UDT','SDT','EGT','NST') THEN 1 ELSE 0 END) as OTHER_COUNT
        FROM ({base_sql}) rs
        WHERE {where_clause}
        """
        cursor = connection.cursor()
        cursor.execute(sql)
        row = cursor.fetchone()
        cursor.close()
        connection.close()

        if not row:
        if not summary or summary.get('total_count', 0) == 0:
            return None

        total = row[0] or 0
        prd = row[1] or 0
        sby = row[2] or 0
        udt = row[3] or 0
        sdt = row[4] or 0
        egt = row[5] or 0
        nst = row[6] or 0
        other = row[7] or 0
        # Extract counts from summary
        by_status = summary.get('by_status', {})
        total = summary.get('total_count', 0)
        prd = by_status.get('PRD', 0)
        sby = by_status.get('SBY', 0)
        udt = by_status.get('UDT', 0)
        sdt = by_status.get('SDT', 0)
        egt = by_status.get('EGT', 0)
        nst = by_status.get('NST', 0)
        other = by_status.get('OTHER', 0)

        # Status categories
        run_count = prd  # RUN = PRD
@@ -103,10 +88,8 @@ def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
        idle_count = sby + nst  # IDLE = SBY + NST
        eng_count = egt  # ENG = EGT

        # OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
        # Denominator excludes NST and OTHER
        operational = prd + sby + egt + sdt + udt
        ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0
        # OU% from cached summary (already calculated)
        ou_pct = summary.get('ou_pct', 0)

        # Run% = PRD / Total * 100
        run_pct = round(prd / total * 100, 1) if total > 0 else 0
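The OU% and Run% formulas above can be checked in isolation. A minimal sketch with hypothetical status counts (the variable names mirror the service's, but the numbers are made up):

```python
# Hypothetical status counts; names mirror the service's variables.
counts = {'PRD': 42, 'SBY': 10, 'EGT': 3, 'SDT': 2, 'UDT': 3, 'NST': 5, 'OTHER': 1}

prd = counts['PRD']
total = sum(counts.values())
# OU% denominator excludes NST and OTHER (non-scheduled / uncategorized time)
operational = sum(counts[s] for s in ('PRD', 'SBY', 'EGT', 'SDT', 'UDT'))
ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0
run_pct = round(prd / total * 100, 1) if total > 0 else 0
print(ou_pct, run_pct)  # 70.0 63.6
```

Note the guards against a zero denominator: a filter combination that matches no equipment yields 0 rather than raising `ZeroDivisionError`.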
@@ -130,8 +113,6 @@ def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
            'run_pct': run_pct
        }
    except Exception as exc:
        if connection:
            connection.close()
        print(f"KPI query failed: {exc}")
        return None

@@ -143,116 +124,50 @@ def query_dashboard_kpi(filters: Optional[Dict] = None) -> Optional[Dict]:
def query_workcenter_cards(filters: Optional[Dict] = None) -> Optional[List[Dict]]:
    """Query workcenter status cards for dashboard with grouping.

    Workcenter groups order:
        0: Cutting (切割)
        1: DB Bonding (焊接_DB)
        2: WB Bonding (焊接_WB)
        3: DW Bonding (焊接_DW)
        4: Molding (成型)
        5: Deflash (去膠)
        6: Blast (水吹砂)
        7: Plating (電鍍)
        8: Marking (移印)
        9: Trim/Form (切彎腳)
        10: PKG SAW (元件切割)
        11: Test (測試)
    Uses cached resource data from Redis for fast response times.
    Data is pre-grouped by workcenter group in the cache.

    Args:
        filters: Optional filter values
        filters: Optional filter values (isProduction, isKey, isMonitor)

    Returns:
        List of workcenter card data or None if query fails.
    """
    try:
        days_back = get_days_back(filters)
        base_sql = get_resource_latest_status_subquery(days_back)

        # Build filter conditions
        where_conditions = []
        # Extract flag filters for cached query
        is_production = None
        is_key = None
        is_monitor = None
        if filters:
            where_conditions.extend(build_equipment_filter_sql(filters))
            if filters.get('isProduction'):
                is_production = True
            if filters.get('isKey'):
                is_key = True
            if filters.get('isMonitor'):
                is_monitor = True

            if filters.get('locations') and len(filters['locations']) > 0:
                loc_list = "', '".join(filters['locations'])
                where_conditions.append(f"LOCATIONNAME IN ('{loc_list}')")
        # Use cached workcenter matrix for fast response
        matrix = get_workcenter_status_matrix(
            is_production=is_production,
            is_key=is_key,
            is_monitor=is_monitor,
        )

            if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
                status_list = "', '".join(filters['assetsStatuses'])
                where_conditions.append(f"PJ_ASSETSSTATUS IN ('{status_list}')")
        if not matrix:
            return None

        where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"

        sql = f"""
        SELECT
            WORKCENTERNAME,
            COUNT(*) as TOTAL,
            SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD,
            SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY,
            SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT,
            SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT,
            SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT,
            SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST
        FROM ({base_sql}) rs
        WHERE WORKCENTERNAME IS NOT NULL AND {where_clause}
        GROUP BY WORKCENTERNAME
        """
        df = read_sql_df(sql)

        # Group workcenters
        grouped_data = {}
        ungrouped_data = []

        for _, row in df.iterrows():
            wc_name = row['WORKCENTERNAME']
            group_name, order = get_workcenter_group(wc_name)

            if group_name:
                if group_name not in grouped_data:
                    grouped_data[group_name] = {
                        'order': order,
                        'original_wcs': [],
                        'total': 0,
                        'prd': 0,
                        'sby': 0,
                        'udt': 0,
                        'sdt': 0,
                        'egt': 0,
                        'nst': 0
                    }
                grouped_data[group_name]['original_wcs'].append(wc_name)
                grouped_data[group_name]['total'] += int(row['TOTAL'])
                grouped_data[group_name]['prd'] += int(row['PRD'])
                grouped_data[group_name]['sby'] += int(row['SBY'])
                grouped_data[group_name]['udt'] += int(row['UDT'])
                grouped_data[group_name]['sdt'] += int(row['SDT'])
                grouped_data[group_name]['egt'] += int(row['EGT'])
                grouped_data[group_name]['nst'] += int(row['NST'])
            else:
                # Ungrouped workcenter
                ungrouped_data.append({
                    'workcenter': wc_name,
                    'original_wcs': [wc_name],
                    'order': 999,
                    'total': int(row['TOTAL']),
                    'prd': int(row['PRD']),
                    'sby': int(row['SBY']),
                    'udt': int(row['UDT']),
                    'sdt': int(row['SDT']),
                    'egt': int(row['EGT']),
                    'nst': int(row['NST'])
                })

        # Calculate OU% and build result
        # Transform matrix data to expected card format
        result = []

        # Add grouped workcenters
        for group_name, data in grouped_data.items():
            prd = data['prd']
            sby = data['sby']
            egt = data['egt']
            sdt = data['sdt']
            udt = data['udt']
            total = data['total']
        for row in matrix:
            group_name = row['workcenter_group']
            order = row['workcenter_sequence']
            total = row['total']
            prd = row['PRD']
            sby = row['SBY']
            udt = row['UDT']
            sdt = row['SDT']
            egt = row['EGT']
            nst = row['NST']

            # OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
            operational = prd + sby + egt + sdt + udt
@@ -261,54 +176,23 @@ def query_workcenter_cards(filters: Optional[Dict] = None) -> Optional[List[Dict

            result.append({
                'workcenter': group_name,
                'original_wcs': data['original_wcs'],
                'order': data['order'],
                'original_wcs': [],  # Not available from cache (aggregated by group)
                'order': order,
                'total': total,
                'prd': prd,
                'sby': sby,
                'udt': udt,
                'sdt': sdt,
                'egt': egt,
                'nst': data['nst'],
                'nst': nst,
                'ou_pct': ou_pct,
                'run_pct': run_pct,
                'down': udt + sdt,
                'idle': sby + data['nst'],
                'idle': sby + nst,
                'eng': egt
            })

        # Add ungrouped workcenters
        for data in ungrouped_data:
            prd = data['prd']
            sby = data['sby']
            egt = data['egt']
            sdt = data['sdt']
            udt = data['udt']
            total = data['total']

            operational = prd + sby + egt + sdt + udt
            ou_pct = round(prd / operational * 100, 1) if operational > 0 else 0
            run_pct = round(prd / total * 100, 1) if total > 0 else 0

            result.append({
                'workcenter': data['workcenter'],
                'original_wcs': data['original_wcs'],
                'order': data['order'],
                'total': total,
                'prd': prd,
                'sby': sby,
                'udt': udt,
                'sdt': sdt,
                'egt': egt,
                'nst': data['nst'],
                'ou_pct': ou_pct,
                'run_pct': run_pct,
                'down': udt + sdt,
                'idle': sby + data['nst'],
                'eng': egt
            })

        # Sort by order
        # Sort by order (already sorted by sequence, but ensure consistency)
        result.sort(key=lambda x: (x['order'], -x['total']))

        return result
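The matrix-to-card transformation above can be sketched end to end. This is a simplified stand-in, assuming cached matrix rows shaped like the ones the loop reads (`workcenter_group`, `workcenter_sequence`, per-status counts); the row values here are invented:

```python
# Hypothetical cached matrix rows, shaped like the Redis workcenter matrix.
matrix = [
    {'workcenter_group': 'Molding', 'workcenter_sequence': 4,
     'total': 50, 'PRD': 30, 'SBY': 10, 'UDT': 4, 'SDT': 2, 'EGT': 2, 'NST': 2},
]

cards = []
for row in matrix:
    prd, sby, udt = row['PRD'], row['SBY'], row['UDT']
    sdt, egt, nst = row['SDT'], row['EGT'], row['NST']
    total = row['total']
    # OU% denominator: all operational states, excluding NST
    operational = prd + sby + egt + sdt + udt
    cards.append({
        'workcenter': row['workcenter_group'],
        'order': row['workcenter_sequence'],
        'total': total,
        'ou_pct': round(prd / operational * 100, 1) if operational > 0 else 0,
        'run_pct': round(prd / total * 100, 1) if total > 0 else 0,
        'down': udt + sdt,    # DOWN = UDT + SDT
        'idle': sby + nst,    # IDLE = SBY + NST
        'eng': egt,           # ENG = EGT
    })
# Same tie-breaking as the service: group order first, then larger fleets first
cards.sort(key=lambda c: (c['order'], -c['total']))
```

Reading from the pre-aggregated cache instead of the 55-second live query is what drives the endpoint down to sub-second responses; the Python side only reshapes and sorts.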
@@ -345,18 +229,21 @@ def query_resource_detail_with_job(
    try:
        days_back = get_days_back(filters)

        # Build exclusion filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (r.LOCATIONNAME IS NULL OR r.LOCATIONNAME NOT IN ('{excluded_locations}'))"
        # Build exclusion filters using CommonFilters (legacy format for SQL placeholders)
        location_filter = CommonFilters.build_location_filter_legacy(
            excluded_locations=list(EXCLUDED_LOCATIONS) if EXCLUDED_LOCATIONS else None
        )
        if location_filter:
            location_filter = f"AND {location_filter.replace('LOCATIONNAME', 'r.LOCATIONNAME')}"

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (r.PJ_ASSETSSTATUS IS NULL OR r.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"
        asset_status_filter = CommonFilters.build_asset_status_filter_legacy(
            excluded_statuses=list(EXCLUDED_ASSET_STATUSES) if EXCLUDED_ASSET_STATUSES else None
        )
        if asset_status_filter:
            asset_status_filter = f"AND {asset_status_filter.replace('PJ_ASSETSSTATUS', 'r.PJ_ASSETSSTATUS')}"
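The `CommonFilters` internals are not shown in this diff; a plausible minimal sketch of what `build_location_filter_legacy` returns, and how the caller aliases the column, might look like this (the helper body and the excluded values are assumptions — only the exclusion lists come from trusted config, never from user input):

```python
from typing import List, Optional

def build_location_filter_legacy(excluded_locations: Optional[List[str]] = None) -> str:
    """Sketch: NULL-tolerant NOT IN exclusion clause (values from trusted config)."""
    if not excluded_locations:
        return ""
    quoted = "', '".join(excluded_locations)
    return f"(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN ('{quoted}'))"

clause = build_location_filter_legacy(['WAREHOUSE', 'SCRAP'])
# The caller then prefixes "AND " and rewrites the column to its table alias:
aliased = f"AND {clause.replace('LOCATIONNAME', 'r.LOCATIONNAME')}"
```

The "legacy" format exists so the clause can be spliced into `{{ LOCATION_FILTER }}` placeholders in the extracted `.sql` files without changing their shape.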
        where_conditions = []
        # Build filter conditions using QueryBuilder for safety
        builder = QueryBuilder()
        if filters:
            # Support workcenter group filter
            if filters.get('workcenter'):
@@ -364,142 +251,58 @@ def query_resource_detail_with_job(
                # Check if it's a merged group
                if wc_filter in WORKCENTER_GROUPS:
                    patterns = WORKCENTER_GROUPS[wc_filter]['patterns']
                    pattern_conditions = []
                    for p in patterns:
                        pattern_conditions.append(f"UPPER(rs.WORKCENTERNAME) LIKE '%{p.upper()}%'")
                    where_conditions.append(f"({' OR '.join(pattern_conditions)})")
                    # Use parameterized OR LIKE conditions (safe escaping)
                    builder.add_or_like_conditions(
                        'rs.WORKCENTERNAME',
                        patterns,
                        case_insensitive=True,
                    )
                else:
                    where_conditions.append(f"rs.WORKCENTERNAME = '{wc_filter}'")
                    builder.add_param_condition('rs.WORKCENTERNAME', wc_filter)

            if filters.get('original_wcs'):
                # If original workcenter list provided, use IN query
                wcs = filters['original_wcs']
                wc_list = "', '".join(wcs)
                where_conditions.append(f"rs.WORKCENTERNAME IN ('{wc_list}')")
                builder.add_in_condition('rs.WORKCENTERNAME', list(filters['original_wcs']))

            if filters.get('status'):
                where_conditions.append(f"rs.NEWSTATUSNAME = '{filters['status']}'")
                builder.add_param_condition('rs.NEWSTATUSNAME', filters['status'])

            # Equipment flag filters
            # Equipment flag filters (safe - boolean values)
            if filters.get('isProduction'):
                where_conditions.append("NVL(rs.PJ_ISPRODUCTION, 0) = 1")
                builder.add_condition("NVL(rs.PJ_ISPRODUCTION, 0) = 1")
            if filters.get('isKey'):
                where_conditions.append("NVL(rs.PJ_ISKEY, 0) = 1")
                builder.add_condition("NVL(rs.PJ_ISKEY, 0) = 1")
            if filters.get('isMonitor'):
                where_conditions.append("NVL(rs.PJ_ISMONITOR, 0) = 1")
                builder.add_condition("NVL(rs.PJ_ISMONITOR, 0) = 1")

            # Multi-select location filter
            # Multi-select location filter (parameterized)
            if filters.get('locations') and len(filters['locations']) > 0:
                loc_list = "', '".join(filters['locations'])
                where_conditions.append(f"rs.LOCATIONNAME IN ('{loc_list}')")
                builder.add_in_condition('rs.LOCATIONNAME', list(filters['locations']))

            # Multi-select asset status filter
            # Multi-select asset status filter (parameterized)
            if filters.get('assetsStatuses') and len(filters['assetsStatuses']) > 0:
                status_list = "', '".join(filters['assetsStatuses'])
                where_conditions.append(f"rs.PJ_ASSETSSTATUS IN ('{status_list}')")
                builder.add_in_condition('rs.PJ_ASSETSSTATUS', list(filters['assetsStatuses']))

        # Default to showing only DOWN status (UDT, SDT)
        where_conditions.append("rs.NEWSTATUSNAME IN ('UDT', 'SDT')")
        builder.add_in_condition('rs.NEWSTATUSNAME', ['UDT', 'SDT'])

        where_clause = " AND ".join(where_conditions) if where_conditions else "1=1"
        conditions_sql = builder.get_conditions_sql()
        params = builder.params.copy()
        where_clause = conditions_sql if conditions_sql else "1=1"
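The diff only shows the `QueryBuilder` call sites, not its implementation. A minimal sketch of the likely core — accumulating condition fragments and Oracle-style named bind variables (`:p0`, `:p1`, …) — could look like this (method names are taken from the call sites above; the bodies are assumptions):

```python
class QueryBuilder:
    """Sketch of the builder's likely core: conditions plus named bind params."""
    def __init__(self):
        self.conditions = []
        self.params = {}

    def add_condition(self, raw_sql):
        # Raw fragment from trusted code/config, never from user input
        self.conditions.append(raw_sql)

    def add_param_condition(self, column, value):
        name = f"p{len(self.params)}"
        self.conditions.append(f"{column} = :{name}")
        self.params[name] = value

    def add_in_condition(self, column, values):
        # One bind variable per list element; user values never touch the SQL text
        names = []
        for v in values:
            name = f"p{len(self.params)}"
            self.params[name] = v
            names.append(f":{name}")
        self.conditions.append(f"{column} IN ({', '.join(names)})")

    def get_conditions_sql(self):
        return " AND ".join(self.conditions)

b = QueryBuilder()
b.add_param_condition('rs.NEWSTATUSNAME', 'UDT')
b.add_in_condition('rs.LOCATIONNAME', ['FAB1', 'FAB2'])
```

This is the heart of the injection fix: the generated SQL text contains only column names and `:pN` placeholders, and the user-supplied values travel separately in `params` to `cursor.execute()` / `read_sql_df()`.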
        # Left join with JOB table for SDT/UDT details
        # Add pagination parameters
        start_row = offset + 1
        end_row = offset + limit
        sql = f"""
        WITH latest_txn AS (
            SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
            FROM DWH.DW_MES_RESOURCESTATUS
        ),
        base_data AS (
            SELECT *
            FROM (
                SELECT
                    r.RESOURCEID,
                    r.RESOURCENAME,
                    r.OBJECTCATEGORY,
                    r.OBJECTTYPE,
                    r.RESOURCEFAMILYNAME,
                    r.WORKCENTERNAME,
                    r.LOCATIONNAME,
                    r.VENDORNAME,
                    r.VENDORMODEL,
                    r.PJ_DEPARTMENT,
                    r.PJ_ASSETSSTATUS,
                    r.PJ_ISPRODUCTION,
                    r.PJ_ISKEY,
                    r.PJ_ISMONITOR,
                    r.PJ_LOTID,
                    r.DESCRIPTION,
                    s.NEWSTATUSNAME,
                    s.NEWREASONNAME,
                    s.LASTSTATUSCHANGEDATE,
                    s.OLDSTATUSNAME,
                    s.OLDREASONNAME,
                    s.AVAILABILITY,
                    s.JOBID,
                    s.TXNDATE,
                    ROW_NUMBER() OVER (
                        PARTITION BY r.RESOURCEID
                        ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
                            COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
                    ) AS rn
                FROM DWH.DW_MES_RESOURCE r
                JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
                CROSS JOIN latest_txn lt
                WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
                    OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
                AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {days_back}
                {location_filter}
                {asset_status_filter}
            )
            WHERE rn = 1
        ),
        max_time AS (
            SELECT MAX(LASTSTATUSCHANGEDATE) AS MAX_STATUS_TIME FROM base_data
        )
        SELECT * FROM (
            SELECT
                rs.RESOURCENAME,
                rs.WORKCENTERNAME,
                rs.RESOURCEFAMILYNAME,
                rs.NEWSTATUSNAME,
                rs.NEWREASONNAME,
                rs.LASTSTATUSCHANGEDATE,
                rs.PJ_DEPARTMENT,
                rs.VENDORNAME,
                rs.VENDORMODEL,
                rs.PJ_ISPRODUCTION,
                rs.PJ_ISKEY,
                rs.PJ_ISMONITOR,
                j.JOBID,
                rs.PJ_LOTID,
                j.JOBORDERNAME,
                j.JOBSTATUS,
                j.SYMPTOMCODENAME,
                j.CAUSECODENAME,
                j.REPAIRCODENAME,
                j.CREATEDATE as JOB_CREATEDATE,
                j.FIRSTCLOCKONDATE,
                mt.MAX_STATUS_TIME,
                ROUND((mt.MAX_STATUS_TIME - rs.LASTSTATUSCHANGEDATE) * 24 * 60, 0) as DOWN_MINUTES,
                ROW_NUMBER() OVER (
                    ORDER BY
                        CASE rs.NEWSTATUSNAME
                            WHEN 'UDT' THEN 1
                            WHEN 'SDT' THEN 2
                            ELSE 3
                        END,
                        rs.LASTSTATUSCHANGEDATE DESC NULLS LAST
                ) AS rn
            FROM base_data rs
            CROSS JOIN max_time mt
            LEFT JOIN DWH.DW_MES_JOB j ON j.RESOURCEID = rs.RESOURCEID
                AND j.CREATEDATE = rs.LASTSTATUSCHANGEDATE
            WHERE {where_clause}
        ) WHERE rn BETWEEN {start_row} AND {end_row}
        """
        df = read_sql_df(sql)
        params['start_row'] = start_row
        params['end_row'] = end_row

        # Load SQL from file and replace placeholders
        sql = SQLLoader.load("dashboard/resource_detail_with_job")
        sql = sql.replace("{{ DAYS_BACK }}", str(days_back))
        sql = sql.replace("{{ LOCATION_FILTER }}", location_filter if location_filter else "")
        sql = sql.replace("{{ ASSET_STATUS_FILTER }}", asset_status_filter if asset_status_filter else "")
        sql = sql.replace("{{ WHERE_CLAUSE }}", where_clause)
        df = read_sql_df(sql, params)

        # Get max_status_time for Last Update display
        max_status_time = None
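The `SQLLoader.load(...)` plus `{{ PLACEHOLDER }}` pattern used above can be sketched. The real loader resolves paths inside the `sql` package and, per the commit message, caches with an LRU; here a temp directory stands in for `src/mes_dashboard/sql` (the directory layout and file content are illustrative assumptions):

```python
import tempfile
from functools import lru_cache
from pathlib import Path

# Hypothetical layout; the real loader resolves paths inside the sql package.
SQL_DIR = Path(tempfile.mkdtemp())
(SQL_DIR / "dashboard").mkdir()
(SQL_DIR / "dashboard" / "ou_trend.sql").write_text(
    "SELECT 1 FROM DUAL WHERE TXNDATE >= TRUNC(SYSDATE) - :days {{ FLAG_FILTER }}",
    encoding="utf-8",
)

@lru_cache(maxsize=128)
def load(name: str) -> str:
    """Load a .sql file once; repeated loads hit the LRU cache."""
    return (SQL_DIR / f"{name}.sql").read_text(encoding="utf-8")

sql = load("dashboard/ou_trend")
# Placeholders take only trusted, code-generated fragments; user values
# stay in bind variables like :days and never enter the SQL text.
sql = sql.replace("{{ FLAG_FILTER }}", "AND NVL(PJ_ISKEY, 0) = 1")
```

Splitting the two channels this way is the key design choice: placeholders handle the parts Oracle cannot bind (dynamic fragments, column/filter blocks), while everything user-controlled rides in `:param` binds.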
@@ -540,18 +343,20 @@ def query_ou_trend(days: int = 7, filters: Optional[Dict] = None) -> Optional[Li
        List of {date, ou_pct, prd_hours, total_hours} records or None if query fails.
    """
    try:
        # Build location and asset status filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (ss.LOCATIONNAME IS NULL OR ss.LOCATIONNAME NOT IN ('{excluded_locations}'))"
        # Build exclusion filters using CommonFilters (legacy format for SQL placeholders)
        location_filter = CommonFilters.build_location_filter_legacy(
            excluded_locations=list(EXCLUDED_LOCATIONS) if EXCLUDED_LOCATIONS else None
        )
        if location_filter:
            location_filter = f"AND {location_filter.replace('LOCATIONNAME', 'ss.LOCATIONNAME')}"

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (ss.PJ_ASSETSSTATUS IS NULL OR ss.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"
        asset_status_filter = CommonFilters.build_asset_status_filter_legacy(
            excluded_statuses=list(EXCLUDED_ASSET_STATUSES) if EXCLUDED_ASSET_STATUSES else None
        )
        if asset_status_filter:
            asset_status_filter = f"AND {asset_status_filter.replace('PJ_ASSETSSTATUS', 'ss.PJ_ASSETSSTATUS')}"

        # Build filter conditions for equipment flags
        # Build filter conditions for equipment flags (safe - boolean values)
        flag_conditions = []
        if filters:
            if filters.get('isProduction'):
@@ -565,28 +370,13 @@ def query_ou_trend(days: int = 7, filters: Optional[Dict] = None) -> Optional[Li
        if flag_conditions:
            flag_filter = "AND " + " AND ".join(flag_conditions)

        sql = f"""
        SELECT
            TRUNC(ss.TXNDATE) as DATA_DATE,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'SBY' THEN ss.HOURS ELSE 0 END) as SBY_HOURS,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'UDT' THEN ss.HOURS ELSE 0 END) as UDT_HOURS,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'SDT' THEN ss.HOURS ELSE 0 END) as SDT_HOURS,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'EGT' THEN ss.HOURS ELSE 0 END) as EGT_HOURS,
            SUM(ss.HOURS) as TOTAL_HOURS
        FROM DWH.DW_MES_RESOURCESTATUS_SHIFT ss
        JOIN DWH.DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
        WHERE ss.TXNDATE >= TRUNC(SYSDATE) - {days}
        AND ss.TXNDATE < TRUNC(SYSDATE)
        AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
            OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
        {location_filter}
        {asset_status_filter}
        {flag_filter}
        GROUP BY TRUNC(ss.TXNDATE)
        ORDER BY DATA_DATE
        """
        df = read_sql_df(sql)
        # Load SQL from file and replace placeholders
        sql = SQLLoader.load("dashboard/ou_trend")
        sql = sql.replace("{{ LOCATION_FILTER }}", location_filter if location_filter else "")
        sql = sql.replace("{{ ASSET_STATUS_FILTER }}", asset_status_filter if asset_status_filter else "")
        sql = sql.replace("{{ FLAG_FILTER }}", flag_filter)

        df = read_sql_df(sql, {'days': days})

        result = []
        for _, row in df.iterrows():
@@ -636,18 +426,24 @@ def query_utilization_heatmap(days: int = 7, filters: Optional[Dict] = None) ->
        List of {workcenter, date, prd_pct, prd_hours, avail_hours} records or None if query fails.
    """
    try:
        # Build location and asset status filters
        location_filter = ""
        if EXCLUDED_LOCATIONS:
            excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
            location_filter = f"AND (ss.LOCATIONNAME IS NULL OR ss.LOCATIONNAME NOT IN ('{excluded_locations}'))"
        # Build exclusion filters using CommonFilters (legacy format for SQL placeholders)
        location_filter = CommonFilters.build_location_filter_legacy(
            excluded_locations=list(EXCLUDED_LOCATIONS) if EXCLUDED_LOCATIONS else None
        )
        if location_filter:
            location_filter = f"AND {location_filter.replace('LOCATIONNAME', 'r.LOCATIONNAME')}"
        else:
            location_filter = ""

        asset_status_filter = ""
        if EXCLUDED_ASSET_STATUSES:
            excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
            asset_status_filter = f"AND (ss.PJ_ASSETSSTATUS IS NULL OR ss.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"
        asset_status_filter = CommonFilters.build_asset_status_filter_legacy(
            excluded_statuses=list(EXCLUDED_ASSET_STATUSES) if EXCLUDED_ASSET_STATUSES else None
        )
        if asset_status_filter:
            asset_status_filter = f"AND {asset_status_filter.replace('PJ_ASSETSSTATUS', 'r.PJ_ASSETSSTATUS')}"
        else:
            asset_status_filter = ""

        # Build filter conditions for equipment flags
        # Build filter conditions for equipment flags (safe - boolean values)
        flag_conditions = []
        if filters:
            if filters.get('isProduction'):
@@ -661,26 +457,13 @@ def query_utilization_heatmap(days: int = 7, filters: Optional[Dict] = None) ->
        if flag_conditions:
            flag_filter = "AND " + " AND ".join(flag_conditions)

        sql = f"""
        SELECT
            ss.WORKCENTERNAME,
            TRUNC(ss.TXNDATE) as DATA_DATE,
            SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
            SUM(CASE WHEN ss.OLDSTATUSNAME IN ('PRD', 'SBY', 'UDT', 'SDT', 'EGT') THEN ss.HOURS ELSE 0 END) as AVAIL_HOURS
        FROM DWH.DW_MES_RESOURCESTATUS_SHIFT ss
        JOIN DWH.DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
        WHERE ss.TXNDATE >= TRUNC(SYSDATE) - {days}
        AND ss.TXNDATE < TRUNC(SYSDATE)
        AND ss.WORKCENTERNAME IS NOT NULL
        AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
            OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
        {location_filter}
        {asset_status_filter}
        {flag_filter}
        GROUP BY ss.WORKCENTERNAME, TRUNC(ss.TXNDATE)
        ORDER BY ss.WORKCENTERNAME, DATA_DATE
        """
        df = read_sql_df(sql)
        # Load SQL from file and replace placeholders
        sql = SQLLoader.load("dashboard/heatmap")
        sql = sql.replace("{{ LOCATION_FILTER }}", location_filter)
        sql = sql.replace("{{ ASSET_STATUS_FILTER }}", asset_status_filter)
        sql = sql.replace("{{ FLAG_FILTER }}", flag_filter)

        df = read_sql_df(sql, {'days': days})

        # Group by workcenter for heatmap format
        result = []

@@ -28,6 +28,7 @@ from mes_dashboard.config.constants import (
    EXCLUDED_ASSET_STATUSES,
    EQUIPMENT_TYPE_FILTER,
)
from mes_dashboard.sql import QueryBuilder

logger = logging.getLogger('mes_dashboard.resource_cache')

@@ -48,28 +49,37 @@ def _get_key(key: str) -> str:
# Internal: Oracle Load Functions
# ============================================================

def _build_filter_sql() -> str:
    """Build SQL WHERE clause for global filters."""
    conditions = [EQUIPMENT_TYPE_FILTER.strip()]
def _build_filter_builder() -> QueryBuilder:
    """Build QueryBuilder with global filter conditions.

    Returns:
        QueryBuilder instance with filter conditions applied.
    """
    builder = QueryBuilder()

    # Equipment type filter - raw SQL condition from config
    builder.add_condition(EQUIPMENT_TYPE_FILTER.strip())

    # Workcenter filter - exclude resources without WORKCENTERNAME
    conditions.append("WORKCENTERNAME IS NOT NULL")
    builder.add_is_not_null("WORKCENTERNAME")

    # Location filter
    # Location filter - exclude locations, allow NULL
    if EXCLUDED_LOCATIONS:
        locations_list = ", ".join(f"'{loc}'" for loc in EXCLUDED_LOCATIONS)
        conditions.append(
            f"(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN ({locations_list}))"
        builder.add_not_in_condition(
            "LOCATIONNAME",
            list(EXCLUDED_LOCATIONS),
            allow_null=True
        )

    # Asset status filter
    # Asset status filter - exclude statuses, allow NULL
    if EXCLUDED_ASSET_STATUSES:
        status_list = ", ".join(f"'{s}'" for s in EXCLUDED_ASSET_STATUSES)
        conditions.append(
            f"(PJ_ASSETSSTATUS IS NULL OR PJ_ASSETSSTATUS NOT IN ({status_list}))"
        builder.add_not_in_condition(
            "PJ_ASSETSSTATUS",
            list(EXCLUDED_ASSET_STATUSES),
            allow_null=True
        )

    return " AND ".join(conditions)
    return builder
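The `allow_null=True` flag on `add_not_in_condition` matters because SQL's `NOT IN` never matches rows where the column is NULL. A sketch of what such a method plausibly generates (written as a free function here; the real method lives on `QueryBuilder`, whose internals this diff does not show):

```python
def add_not_in_condition(conditions, params, column, values, allow_null=False):
    """Sketch: NULL-tolerant NOT IN with named bind variables."""
    names = []
    for v in values:
        name = f"p{len(params)}"
        params[name] = v
        names.append(f":{name}")
    clause = f"{column} NOT IN ({', '.join(names)})"
    if allow_null:
        # NOT IN never matches NULL rows, so allow them back in explicitly
        clause = f"({column} IS NULL OR {clause})"
    conditions.append(clause)

conds, params = [], {}
add_not_in_condition(conds, params, "LOCATIONNAME", ["WH", "SCRAP"], allow_null=True)
```

This reproduces the old hand-written `(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN (...))` semantics while moving the values into bind parameters.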


def _load_from_oracle() -> Optional[pd.DataFrame]:
@@ -78,14 +88,12 @@ def _load_from_oracle() -> Optional[pd.DataFrame]:
    Returns:
        DataFrame with all columns, or None if query failed.
    """
    filter_sql = _build_filter_sql()
    sql = f"""
    SELECT *
    FROM DWH.DW_MES_RESOURCE
    WHERE {filter_sql}
    """
    builder = _build_filter_builder()
    builder.base_sql = "SELECT * FROM DWH.DW_MES_RESOURCE {{ WHERE_CLAUSE }}"
    sql, params = builder.build()

    try:
        df = read_sql_df(sql)
        df = read_sql_df(sql, params)
        if df is not None:
            logger.info(f"Loaded {len(df)} resources from Oracle")
        return df
@@ -100,14 +108,12 @@ def _get_version_from_oracle() -> Optional[str]:
    Returns:
        Version string (ISO format), or None if query failed.
    """
    filter_sql = _build_filter_sql()
    sql = f"""
    SELECT MAX(LASTCHANGEDATE) as VERSION
    FROM DWH.DW_MES_RESOURCE
    WHERE {filter_sql}
    """
    builder = _build_filter_builder()
    builder.base_sql = "SELECT MAX(LASTCHANGEDATE) as VERSION FROM DWH.DW_MES_RESOURCE {{ WHERE_CLAUSE }}"
    sql, params = builder.build()

    try:
        df = read_sql_df(sql)
        df = read_sql_df(sql, params)
        if df is not None and not df.empty:
            version = df.iloc[0]['VERSION']
            if version is not None:

@@ -23,6 +23,7 @@ from typing import Optional, Dict, List, Any, Generator
import pandas as pd

from mes_dashboard.core.database import read_sql_df
from mes_dashboard.sql import QueryBuilder, SQLLoader

logger = logging.getLogger('mes_dashboard.resource_history')

@@ -265,71 +266,28 @@ def query_summary(
    # Build SQL components
    date_trunc = _get_date_trunc(granularity)

    # Base CTE with resource filter
    base_cte = f"""
    WITH shift_data AS (
        SELECT /*+ MATERIALIZE */ HISTORYID, TXNDATE, OLDSTATUSNAME, HOURS
        FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
        WHERE TXNDATE >= TO_DATE('{start_date}', 'YYYY-MM-DD')
        AND TXNDATE < TO_DATE('{end_date}', 'YYYY-MM-DD') + 1
        AND {historyid_filter}
    )
    """
    # Common parameters for all queries (dates are parameterized for safety)
    params = {'start_date': start_date, 'end_date': end_date}

    # KPI query - aggregate all
    kpi_sql = f"""
    {base_cte}
    SELECT
        SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
        COUNT(DISTINCT HISTORYID) as MACHINE_COUNT
    FROM shift_data
    """
    # Load SQL templates and replace placeholders
    kpi_sql = SQLLoader.load("resource_history/kpi")
    kpi_sql = kpi_sql.replace("{{ HISTORYID_FILTER }}", historyid_filter)

    # Trend query - group by date
    trend_sql = f"""
    {base_cte}
    SELECT
        {date_trunc} as DATA_DATE,
        SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
        COUNT(DISTINCT HISTORYID) as MACHINE_COUNT
    FROM shift_data
    GROUP BY {date_trunc}
    ORDER BY DATA_DATE
    """
    trend_sql = SQLLoader.load("resource_history/trend")
    trend_sql = trend_sql.replace("{{ HISTORYID_FILTER }}", historyid_filter)
    trend_sql = trend_sql.replace("{{ DATE_TRUNC }}", date_trunc)

    # Heatmap/Comparison query - group by HISTORYID and date, merge dimension in Python
    heatmap_raw_sql = f"""
    {base_cte}
    SELECT
        HISTORYID,
        {date_trunc} as DATA_DATE,
        SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS
    FROM shift_data
    GROUP BY HISTORYID, {date_trunc}
    ORDER BY HISTORYID, DATA_DATE
    """
    heatmap_raw_sql = SQLLoader.load("resource_history/heatmap")
    heatmap_raw_sql = heatmap_raw_sql.replace("{{ HISTORYID_FILTER }}", historyid_filter)
    heatmap_raw_sql = heatmap_raw_sql.replace("{{ DATE_TRUNC }}", date_trunc)

    # Execute queries in parallel
    # Execute queries in parallel with params
    results = {}
    with ThreadPoolExecutor(max_workers=3) as executor:
        futures = {
            executor.submit(read_sql_df, kpi_sql): 'kpi',
            executor.submit(read_sql_df, trend_sql): 'trend',
            executor.submit(read_sql_df, heatmap_raw_sql): 'heatmap_raw',
            executor.submit(read_sql_df, kpi_sql, params): 'kpi',
            executor.submit(read_sql_df, trend_sql, params): 'trend',
            executor.submit(read_sql_df, heatmap_raw_sql, params): 'heatmap_raw',
        }
        for future in as_completed(futures):
            query_name = futures[future]
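The fan-out pattern above (future-to-name mapping, results collected as each query completes) can be demonstrated without a database. `fake_query` is a hypothetical stand-in for `read_sql_df(sql, params)`:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fake_query(sql, params):
    """Stand-in for read_sql_df(sql, params); hypothetical, for illustration."""
    return f"rows for {sql} with {sorted(params)}"

params = {'start_date': '2026-01-01', 'end_date': '2026-01-31'}
queries = {'kpi': 'KPI_SQL', 'trend': 'TREND_SQL', 'heatmap_raw': 'HEATMAP_SQL'}

results = {}
with ThreadPoolExecutor(max_workers=3) as executor:
    # Same fan-out pattern as the service: future -> query name
    futures = {executor.submit(fake_query, sql, params): name
               for name, sql in queries.items()}
    for future in as_completed(futures):
        results[futures[future]] = future.result()
```

Since all three queries share one `params` dict, the wall time is bounded by the slowest query rather than the sum of the three.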
@@ -423,30 +381,14 @@ def query_detail(
    resource_lookup = _build_resource_lookup(resources)
    historyid_filter = _build_historyid_filter(resources)

    # Query SHIFT data grouped by HISTORYID
    detail_sql = f"""
    WITH shift_data AS (
        SELECT /*+ MATERIALIZE */ HISTORYID, OLDSTATUSNAME, HOURS
        FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
        WHERE TXNDATE >= TO_DATE('{start_date}', 'YYYY-MM-DD')
        AND TXNDATE < TO_DATE('{end_date}', 'YYYY-MM-DD') + 1
        AND {historyid_filter}
    )
    SELECT
        HISTORYID,
        SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
        SUM(HOURS) as TOTAL_HOURS
    FROM shift_data
    GROUP BY HISTORYID
    ORDER BY HISTORYID
    """
    # Query SHIFT data grouped by HISTORYID (dates parameterized for safety)
    params = {'start_date': start_date, 'end_date': end_date}

    detail_df = read_sql_df(detail_sql)
    # Load SQL template and replace placeholder
    detail_sql = SQLLoader.load("resource_history/detail")
    detail_sql = detail_sql.replace("{{ HISTORYID_FILTER }}", historyid_filter)

    detail_df = read_sql_df(detail_sql, params)

    # Build detail data with dimension merge from cache
    data = _build_detail_from_raw_df(detail_df, resource_lookup)
@@ -525,29 +467,14 @@ def export_csv(
    from mes_dashboard.services.filter_cache import get_workcenter_mapping
    wc_mapping = get_workcenter_mapping() or {}

    # Query SHIFT data grouped by HISTORYID
    sql = f"""
    WITH shift_data AS (
        SELECT /*+ MATERIALIZE */ HISTORYID, OLDSTATUSNAME, HOURS
        FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
        WHERE TXNDATE >= TO_DATE('{start_date}', 'YYYY-MM-DD')
        AND TXNDATE < TO_DATE('{end_date}', 'YYYY-MM-DD') + 1
        AND {historyid_filter}
    )
    SELECT
        HISTORYID,
        SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
        SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
        SUM(HOURS) as TOTAL_HOURS
    FROM shift_data
    GROUP BY HISTORYID
    ORDER BY HISTORYID
    """
    df = read_sql_df(sql)
    # Query SHIFT data grouped by HISTORYID (dates parameterized for safety)
    params = {'start_date': start_date, 'end_date': end_date}

    # Load SQL template and replace placeholder (reuse detail.sql)
    sql = SQLLoader.load("resource_history/detail")
    sql = sql.replace("{{ HISTORYID_FILTER }}", historyid_filter)

    df = read_sql_df(sql, params)

    # Write CSV header
    output = io.StringIO()
|
||||
|
||||
@@ -15,6 +15,8 @@ from mes_dashboard.config.constants import (
DEFAULT_DAYS_BACK,
STATUS_CATEGORIES,
)
from mes_dashboard.sql import SQLLoader, QueryBuilder
from mes_dashboard.sql.filters import CommonFilters
from mes_dashboard.services.resource_cache import get_all_resources
from mes_dashboard.services.realtime_equipment_cache import (
get_all_equipment_status,
@@ -78,65 +80,25 @@ def get_resource_latest_status_subquery(days_back: int = 30) -> str:
Returns:
SQL subquery string for latest resource status.
"""
# Build exclusion filters
location_filter = ""
if EXCLUDED_LOCATIONS:
excluded_locations = "', '".join(EXCLUDED_LOCATIONS)
location_filter = f"AND (r.LOCATIONNAME IS NULL OR r.LOCATIONNAME NOT IN ('{excluded_locations}'))"
# Build exclusion filters using CommonFilters (legacy format for SQL placeholders)
location_filter = CommonFilters.build_location_filter_legacy(
excluded_locations=list(EXCLUDED_LOCATIONS) if EXCLUDED_LOCATIONS else None
)
if location_filter:
location_filter = f"AND {location_filter.replace('LOCATIONNAME', 'r.LOCATIONNAME')}"

asset_status_filter = ""
if EXCLUDED_ASSET_STATUSES:
excluded_assets = "', '".join(EXCLUDED_ASSET_STATUSES)
asset_status_filter = f"AND (r.PJ_ASSETSSTATUS IS NULL OR r.PJ_ASSETSSTATUS NOT IN ('{excluded_assets}'))"
asset_status_filter = CommonFilters.build_asset_status_filter_legacy(
excluded_statuses=list(EXCLUDED_ASSET_STATUSES) if EXCLUDED_ASSET_STATUSES else None
)
if asset_status_filter:
asset_status_filter = f"AND {asset_status_filter.replace('PJ_ASSETSSTATUS', 'r.PJ_ASSETSSTATUS')}"

return f"""
WITH latest_txn AS (
SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
FROM DWH.DW_MES_RESOURCESTATUS
)
SELECT *
FROM (
SELECT
r.RESOURCEID,
r.RESOURCENAME,
r.OBJECTCATEGORY,
r.OBJECTTYPE,
r.RESOURCEFAMILYNAME,
r.WORKCENTERNAME,
r.LOCATIONNAME,
r.VENDORNAME,
r.VENDORMODEL,
r.PJ_DEPARTMENT,
r.PJ_ASSETSSTATUS,
r.PJ_ISPRODUCTION,
r.PJ_ISKEY,
r.PJ_ISMONITOR,
r.PJ_LOTID,
r.DESCRIPTION,
s.NEWSTATUSNAME,
s.NEWREASONNAME,
s.LASTSTATUSCHANGEDATE,
s.OLDSTATUSNAME,
s.OLDREASONNAME,
s.AVAILABILITY,
s.JOBID,
s.TXNDATE,
ROW_NUMBER() OVER (
PARTITION BY r.RESOURCEID
ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
) AS rn
FROM DWH.DW_MES_RESOURCE r
JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
CROSS JOIN latest_txn lt
WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {days_back}
{location_filter}
{asset_status_filter}
)
WHERE rn = 1
"""
return SQLLoader.load_with_params(
"resource/latest_status",
days_back=days_back,
LOCATION_FILTER=location_filter,
ASSET_STATUS_FILTER=asset_status_filter,
)


# ============================================================
@@ -152,36 +114,25 @@ def query_resource_status_summary(days_back: int = 30) -> Optional[Dict]:
Returns:
Dict with summary stats or None if query fails.
"""
connection = get_db_connection()
if not connection:
return None

try:
sql = f"""
SELECT
COUNT(*) as TOTAL_COUNT,
COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
COUNT(DISTINCT RESOURCEFAMILYNAME) as FAMILY_COUNT,
COUNT(DISTINCT PJ_DEPARTMENT) as DEPT_COUNT
FROM ({get_resource_latest_status_subquery(days_back)}) rs
"""
cursor = connection.cursor()
cursor.execute(sql)
result = cursor.fetchone()
cursor.close()
connection.close()
base_sql = get_resource_latest_status_subquery(days_back)

if not result:
# Load SQL from file and replace placeholder
sql = SQLLoader.load("resource/status_summary")
sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_sql)

df = read_sql_df(sql)
if df is None or df.empty:
return None

row = df.iloc[0]
return {
'total_count': result[0] or 0,
'workcenter_count': result[1] or 0,
'family_count': result[2] or 0,
'dept_count': result[3] or 0
'total_count': int(row['TOTAL_COUNT'] or 0),
'workcenter_count': int(row['WORKCENTER_COUNT'] or 0),
'family_count': int(row['FAMILY_COUNT'] or 0),
'dept_count': int(row['DEPT_COUNT'] or 0)
}
except Exception as exc:
if connection:
connection.close()
print(f"Resource summary query failed: {exc}")
return None
@@ -196,15 +147,9 @@ def query_resource_by_status(days_back: int = 30) -> Optional[pd.DataFrame]:
DataFrame with status counts or None if query fails.
"""
try:
sql = f"""
SELECT
NEWSTATUSNAME,
COUNT(*) as COUNT
FROM ({get_resource_latest_status_subquery(days_back)}) rs
WHERE NEWSTATUSNAME IS NOT NULL
GROUP BY NEWSTATUSNAME
ORDER BY COUNT DESC
"""
base_sql = get_resource_latest_status_subquery(days_back)
sql = SQLLoader.load("resource/by_status")
sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_sql)
return read_sql_df(sql)
except Exception as exc:
print(f"Resource by status query failed: {exc}")
@@ -221,16 +166,9 @@ def query_resource_by_workcenter(days_back: int = 30) -> Optional[pd.DataFrame]:
DataFrame with workcenter/status counts or None if query fails.
"""
try:
sql = f"""
SELECT
WORKCENTERNAME,
NEWSTATUSNAME,
COUNT(*) as COUNT
FROM ({get_resource_latest_status_subquery(days_back)}) rs
WHERE WORKCENTERNAME IS NOT NULL
GROUP BY WORKCENTERNAME, NEWSTATUSNAME
ORDER BY WORKCENTERNAME, COUNT DESC
"""
base_sql = get_resource_latest_status_subquery(days_back)
sql = SQLLoader.load("resource/by_workcenter")
sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_sql)
return read_sql_df(sql)
except Exception as exc:
print(f"Resource by workcenter query failed: {exc}")
@@ -257,60 +195,50 @@ def query_resource_detail(
try:
base_sql = get_resource_latest_status_subquery(days_back)

where_conditions = []
# Use QueryBuilder for safe parameterized conditions
builder = QueryBuilder()
if filters:
if filters.get('workcenter'):
where_conditions.append(f"WORKCENTERNAME = '{filters['workcenter']}'")
builder.add_param_condition('WORKCENTERNAME', filters['workcenter'])
if filters.get('status'):
where_conditions.append(f"NEWSTATUSNAME = '{filters['status']}'")
builder.add_param_condition('NEWSTATUSNAME', filters['status'])
if filters.get('family'):
where_conditions.append(f"RESOURCEFAMILYNAME = '{filters['family']}'")
builder.add_param_condition('RESOURCEFAMILYNAME', filters['family'])
if filters.get('department'):
where_conditions.append(f"PJ_DEPARTMENT = '{filters['department']}'")
builder.add_param_condition('PJ_DEPARTMENT', filters['department'])

# Equipment flag filters
# Equipment flag filters (boolean to 0/1)
if filters.get('isProduction') is not None:
where_conditions.append(
builder.add_condition(
f"NVL(PJ_ISPRODUCTION, 0) = {1 if filters['isProduction'] else 0}"
)
if filters.get('isKey') is not None:
where_conditions.append(
builder.add_condition(
f"NVL(PJ_ISKEY, 0) = {1 if filters['isKey'] else 0}"
)
if filters.get('isMonitor') is not None:
where_conditions.append(
builder.add_condition(
f"NVL(PJ_ISMONITOR, 0) = {1 if filters['isMonitor'] else 0}"
)

where_clause = " AND " + " AND ".join(where_conditions) if where_conditions else ""
# Build WHERE clause and get parameters
conditions_sql = builder.get_conditions_sql()
params = builder.params.copy()

# Add pagination parameters
start_row = offset + 1
end_row = offset + limit
sql = f"""
SELECT * FROM (
SELECT
RESOURCENAME,
WORKCENTERNAME,
RESOURCEFAMILYNAME,
NEWSTATUSNAME,
NEWREASONNAME,
LASTSTATUSCHANGEDATE,
PJ_DEPARTMENT,
VENDORNAME,
VENDORMODEL,
PJ_ASSETSSTATUS,
AVAILABILITY,
PJ_ISPRODUCTION,
PJ_ISKEY,
PJ_ISMONITOR,
ROW_NUMBER() OVER (
ORDER BY LASTSTATUSCHANGEDATE DESC NULLS LAST
) AS rn
FROM ({base_sql}) rs
WHERE 1=1 {where_clause}
) WHERE rn BETWEEN {start_row} AND {end_row}
"""
df = read_sql_df(sql)
params['start_row'] = start_row
params['end_row'] = end_row

where_clause = f" AND {conditions_sql}" if conditions_sql else ""

# Load SQL from file and replace placeholders
sql = SQLLoader.load("resource/detail")
sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_sql)
sql = sql.replace("{{ WHERE_CLAUSE }}", where_clause)

df = read_sql_df(sql, params)

# Convert datetime to string
if 'LASTSTATUSCHANGEDATE' in df.columns:
@@ -342,37 +270,9 @@ def query_resource_workcenter_status_matrix(days_back: int = 30) -> Optional[pd.
DataFrame with workcenter/status matrix or None if query fails.
"""
try:
sql = f"""
SELECT
WORKCENTERNAME,
CASE NEWSTATUSNAME
WHEN 'PRD' THEN 'PRD'
WHEN 'SBY' THEN 'SBY'
WHEN 'UDT' THEN 'UDT'
WHEN 'SDT' THEN 'SDT'
WHEN 'EGT' THEN 'EGT'
WHEN 'NST' THEN 'NST'
WHEN 'SCRAP' THEN 'SCRAP'
ELSE 'OTHER'
END as STATUS_CATEGORY,
NEWSTATUSNAME,
COUNT(*) as COUNT
FROM ({get_resource_latest_status_subquery(days_back)}) rs
WHERE WORKCENTERNAME IS NOT NULL
GROUP BY WORKCENTERNAME,
CASE NEWSTATUSNAME
WHEN 'PRD' THEN 'PRD'
WHEN 'SBY' THEN 'SBY'
WHEN 'UDT' THEN 'UDT'
WHEN 'SDT' THEN 'SDT'
WHEN 'EGT' THEN 'EGT'
WHEN 'NST' THEN 'NST'
WHEN 'SCRAP' THEN 'SCRAP'
ELSE 'OTHER'
END,
NEWSTATUSNAME
ORDER BY WORKCENTERNAME, STATUS_CATEGORY
"""
base_sql = get_resource_latest_status_subquery(days_back)
sql = SQLLoader.load("resource/workcenter_status_matrix")
sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_sql)
return read_sql_df(sql)
except Exception as exc:
print(f"Resource status matrix query failed: {exc}")
@@ -407,24 +307,9 @@ def query_resource_filter_options(days_back: int = 30) -> Optional[Dict]:
locations = get_locations()
assets_statuses = get_distinct_values('PJ_ASSETSSTATUS')

# Query only dynamic status data from Oracle
# Note: Can't wrap CTE in subquery, so use inline approach
sql_statuses = f"""
WITH latest_txn AS (
SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
FROM DWH.DW_MES_RESOURCESTATUS
)
SELECT DISTINCT s.NEWSTATUSNAME
FROM DWH.DW_MES_RESOURCE r
JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
CROSS JOIN latest_txn lt
WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {days_back}
AND s.NEWSTATUSNAME IS NOT NULL
ORDER BY s.NEWSTATUSNAME
"""
status_df = read_sql_df(sql_statuses)
# Query only dynamic status data from Oracle using SQLLoader
sql_statuses = SQLLoader.load("resource/distinct_statuses")
status_df = read_sql_df(sql_statuses, {'days_back': days_back})
statuses = status_df['NEWSTATUSNAME'].tolist() if status_df is not None else []

return {
@@ -15,6 +15,8 @@ import pandas as pd

from mes_dashboard.core.database import read_sql_df
from mes_dashboard.core.cache import get_cached_wip_data, get_cached_sys_date
from mes_dashboard.sql import SQLLoader, QueryBuilder
from mes_dashboard.sql.filters import CommonFilters, NON_QUALITY_HOLD_REASONS

logger = logging.getLogger('mes_dashboard.wip_service')

@@ -29,89 +31,82 @@ def _safe_value(val):
return val


def _escape_sql(value: str) -> str:
"""Escape single quotes in SQL string values."""
if value is None:
return None
return value.replace("'", "''")


def _build_base_conditions(
def _build_base_conditions_builder(
include_dummy: bool = False,
workorder: Optional[str] = None,
lotid: Optional[str] = None
) -> List[str]:
"""Build base WHERE conditions for WIP queries.
lotid: Optional[str] = None,
builder: Optional[QueryBuilder] = None
) -> QueryBuilder:
"""Build base WHERE conditions for WIP queries using QueryBuilder.

Args:
include_dummy: If False (default), exclude LOTID containing 'DUMMY'
workorder: Optional WORKORDER filter (fuzzy match)
lotid: Optional LOTID filter (fuzzy match)
builder: Optional existing QueryBuilder to add conditions to

Returns:
List of SQL condition strings
QueryBuilder with base conditions and parameters
"""
conditions = []
if builder is None:
builder = QueryBuilder()

# Exclude raw materials (NULL WORKORDER)
conditions.append("WORKORDER IS NOT NULL")
builder.add_is_not_null("WORKORDER")

# DUMMY exclusion (default behavior)
if not include_dummy:
conditions.append("LOTID NOT LIKE '%DUMMY%'")
builder.add_condition("LOTID NOT LIKE '%DUMMY%'")

# WORKORDER filter (fuzzy match)
if workorder:
conditions.append(f"WORKORDER LIKE '%{_escape_sql(workorder)}%'")
builder.add_like_condition("WORKORDER", workorder, position="both")

# LOTID filter (fuzzy match)
if lotid:
conditions.append(f"LOTID LIKE '%{_escape_sql(lotid)}%'")
builder.add_like_condition("LOTID", lotid, position="both")

return conditions
return builder


# ============================================================
# Hold Type Classification
# ============================================================
# Non-quality hold reasons (all other reasons are quality holds)
NON_QUALITY_HOLD_REASONS = {
'IQC檢驗(久存品驗證)(QC)',
'大中/安波幅50pcs樣品留樣(PD)',
'工程驗證(PE)',
'工程驗證(RD)',
'指定機台生產',
'特殊需求(X-Ray全檢)',
'特殊需求管控',
'第一次量產QC品質確認(QC)',
'需綁尾數(PD)',
'樣品需求留存打樣(樣品)',
'盤點(收線)需求',
}
# NON_QUALITY_HOLD_REASONS is imported from sql.filters


def is_quality_hold(reason: str) -> bool:
"""Check if a hold reason is quality-related.

Wrapper for CommonFilters.is_quality_hold for backwards compatibility.
"""
return CommonFilters.is_quality_hold(reason)


def _add_hold_type_conditions(
builder: QueryBuilder,
hold_type: Optional[str] = None
) -> QueryBuilder:
"""Add hold type filter conditions to QueryBuilder.

Args:
reason: The HOLDREASONNAME value
builder: QueryBuilder to add conditions to
hold_type: 'quality' for quality holds, 'non-quality' for non-quality holds

Returns:
True if this is a quality hold, False if non-quality hold
QueryBuilder with hold type conditions added
"""
if reason is None:
return True  # Default to quality if reason is unknown
return reason not in NON_QUALITY_HOLD_REASONS


def _build_hold_type_sql_list() -> str:
"""Build SQL IN clause list for non-quality hold reasons.

Returns:
Comma-separated string of escaped reason names for SQL IN clause
"""
escaped = [f"'{_escape_sql(r)}'" for r in NON_QUALITY_HOLD_REASONS]
return ', '.join(escaped)
if hold_type == 'quality':
# Quality hold: HOLDREASONNAME is NULL or NOT in non-quality list
builder.add_not_in_condition(
"HOLDREASONNAME",
list(NON_QUALITY_HOLD_REASONS),
allow_null=True
)
elif hold_type == 'non-quality':
# Non-quality hold: HOLDREASONNAME is in non-quality list
builder.add_in_condition("HOLDREASONNAME", list(NON_QUALITY_HOLD_REASONS))
return builder


# ============================================================
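The `QueryBuilder` API is not shown in this diff, only called; a minimal sketch consistent with the calls above (`add_condition`, `add_is_not_null`, `add_param_condition`, `add_like_condition`, `build_where_only`) might look like the following. Bind-variable naming and the exact method signatures are assumptions:

```python
# Hypothetical QueryBuilder sketch matching the calls in this commit. The real
# implementation lives in mes_dashboard/sql/builder.py; names of bind variables
# and the tuple returned by build_where_only() are assumptions.
from typing import Any, Dict, List, Tuple


class QueryBuilder:
    """Accumulates WHERE conditions plus Oracle bind variables (:name)."""

    def __init__(self) -> None:
        self.conditions: List[str] = []
        self.params: Dict[str, Any] = {}

    def add_condition(self, raw_sql: str) -> None:
        # Raw fragment: caller guarantees it contains no user input.
        self.conditions.append(raw_sql)

    def add_is_not_null(self, column: str) -> None:
        self.conditions.append(f"{column} IS NOT NULL")

    def add_param_condition(self, column: str, value: Any) -> None:
        # Equality against a bind variable: the value never enters the SQL text.
        bind = column.lower()
        self.conditions.append(f"{column} = :{bind}")
        self.params[bind] = value

    def add_like_condition(self, column: str, value: str, position: str = "both") -> None:
        # Fuzzy match: wildcards wrap the *value*, so quotes in it stay data.
        bind = column.lower()
        self.conditions.append(f"{column} LIKE :{bind}")
        self.params[bind] = f"%{value}%" if position == "both" else f"{value}%"

    def build_where_only(self) -> Tuple[str, Dict[str, Any]]:
        where = f"WHERE {' AND '.join(self.conditions)}" if self.conditions else ""
        return where, self.params


builder = QueryBuilder()
builder.add_is_not_null("WORKORDER")
builder.add_like_condition("LOTID", "A'1")   # embedded quote is harmless data
builder.add_param_condition("STATUS", "HOLD")
where_clause, params = builder.build_where_only()
```

This is the injection fix in miniature: the old `_escape_sql` string-splicing is replaced by SQL text that only ever contains column names and `:bind` placeholders, with user input delivered separately in `params`.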
@@ -317,49 +312,24 @@ def _get_wip_summary_from_oracle(
) -> Optional[Dict[str, Any]]:
"""Get WIP summary directly from Oracle (fallback)."""
try:
conditions = _build_base_conditions(include_dummy, workorder, lotid)
if package:
conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
if pj_type:
conditions.append(f"PJ_TYPE = '{_escape_sql(pj_type)}'")
where_clause = f"WHERE {' AND '.join(conditions)}" if conditions else ""
non_quality_list = _build_hold_type_sql_list()
# Build conditions using QueryBuilder
builder = _build_base_conditions_builder(include_dummy, workorder, lotid)

sql = f"""
SELECT
COUNT(*) as TOTAL_LOTS,
SUM(QTY) as TOTAL_QTY_PCS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN 1 ELSE 0 END) as RUN_LOTS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN QTY ELSE 0 END) as RUN_QTY_PCS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN 1 ELSE 0 END) as HOLD_LOTS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN QTY ELSE 0 END) as HOLD_QTY_PCS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
AND (HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({non_quality_list}))
THEN 1 ELSE 0 END) as QUALITY_HOLD_LOTS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
AND (HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({non_quality_list}))
THEN QTY ELSE 0 END) as QUALITY_HOLD_QTY_PCS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
AND HOLDREASONNAME IN ({non_quality_list})
THEN 1 ELSE 0 END) as NON_QUALITY_HOLD_LOTS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
AND HOLDREASONNAME IN ({non_quality_list})
THEN QTY ELSE 0 END) as NON_QUALITY_HOLD_QTY_PCS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) = 0 THEN 1 ELSE 0 END) as QUEUE_LOTS,
SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
AND COALESCE(CURRENTHOLDCOUNT, 0) = 0 THEN QTY ELSE 0 END) as QUEUE_QTY_PCS,
MAX(SYS_DATE) as DATA_UPDATE_DATE
FROM {WIP_VIEW}
{where_clause}
"""
df = read_sql_df(sql)
if package:
builder.add_param_condition("PACKAGE_LEF", package)
if pj_type:
builder.add_param_condition("PJ_TYPE", pj_type)

# Load SQL template and build query
base_sql = SQLLoader.load("wip/summary")
builder.base_sql = base_sql

# Replace NON_QUALITY_REASONS placeholder (must be literal values for CASE expressions)
non_quality_list = CommonFilters.get_non_quality_reasons_sql()
builder.base_sql = builder.base_sql.replace("{{ NON_QUALITY_REASONS }}", non_quality_list)

sql, params = builder.build()
df = read_sql_df(sql, params)

if df is None or df.empty:
return None
@@ -541,46 +511,35 @@ def _get_wip_matrix_from_oracle(
) -> Optional[Dict[str, Any]]:
"""Get WIP matrix directly from Oracle (fallback)."""
try:
conditions = _build_base_conditions(include_dummy, workorder, lotid)
conditions.append("WORKCENTER_GROUP IS NOT NULL")
conditions.append("PACKAGE_LEF IS NOT NULL")
# Build conditions using QueryBuilder
builder = _build_base_conditions_builder(include_dummy, workorder, lotid)
builder.add_is_not_null("WORKCENTER_GROUP")
builder.add_is_not_null("PACKAGE_LEF")

if package:
conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
builder.add_param_condition("PACKAGE_LEF", package)
if pj_type:
conditions.append(f"PJ_TYPE = '{_escape_sql(pj_type)}'")
builder.add_param_condition("PJ_TYPE", pj_type)

# WIP status filter
if status:
status_upper = status.upper()
if status_upper == 'RUN':
conditions.append("EQUIPMENTCOUNT > 0")
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) > 0")
elif status_upper == 'HOLD':
conditions.append("EQUIPMENTCOUNT = 0 AND CURRENTHOLDCOUNT > 0")
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) > 0")
# Hold type sub-filter
if hold_type:
non_quality_list = _build_hold_type_sql_list()
if hold_type == 'quality':
conditions.append(
f"(HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({non_quality_list}))"
)
elif hold_type == 'non-quality':
conditions.append(f"HOLDREASONNAME IN ({non_quality_list})")
_add_hold_type_conditions(builder, hold_type)
elif status_upper == 'QUEUE':
conditions.append("EQUIPMENTCOUNT = 0 AND CURRENTHOLDCOUNT = 0")
where_clause = f"WHERE {' AND '.join(conditions)}"
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) = 0")

sql = f"""
SELECT
WORKCENTER_GROUP,
WORKCENTERSEQUENCE_GROUP,
PACKAGE_LEF,
SUM(QTY) as QTY
FROM {WIP_VIEW}
{where_clause}
GROUP BY WORKCENTER_GROUP, WORKCENTERSEQUENCE_GROUP, PACKAGE_LEF
ORDER BY WORKCENTERSEQUENCE_GROUP, PACKAGE_LEF
"""
df = read_sql_df(sql)
# Load SQL template and build query
base_sql = SQLLoader.load("wip/matrix")
builder.base_sql = base_sql
sql, params = builder.build()

df = read_sql_df(sql, params)

if df is None or df.empty:
return {
@@ -664,10 +623,12 @@ def _get_wip_hold_summary_from_oracle(
) -> Optional[Dict[str, Any]]:
"""Get WIP hold summary directly from Oracle (fallback)."""
try:
conditions = _build_base_conditions(include_dummy, workorder, lotid)
conditions.append("STATUS = 'HOLD'")
conditions.append("HOLDREASONNAME IS NOT NULL")
where_clause = f"WHERE {' AND '.join(conditions)}"
# Build conditions using QueryBuilder
builder = _build_base_conditions_builder(include_dummy, workorder, lotid)
builder.add_param_condition("STATUS", "HOLD")
builder.add_is_not_null("HOLDREASONNAME")

where_clause, params = builder.build_where_only()

sql = f"""
SELECT
@@ -679,7 +640,7 @@ def _get_wip_hold_summary_from_oracle(
GROUP BY HOLDREASONNAME
ORDER BY COUNT(*) DESC
"""
df = read_sql_df(sql)
df = read_sql_df(sql, params)

if df is None or df.empty:
return {'items': []}
@@ -844,42 +805,36 @@ def _get_wip_detail_from_oracle(
) -> Optional[Dict[str, Any]]:
"""Get WIP detail directly from Oracle (fallback)."""
try:
# Build WHERE conditions
conditions = _build_base_conditions(include_dummy, workorder, lotid)
conditions.append(f"WORKCENTER_GROUP = '{_escape_sql(workcenter)}'")
# Build WHERE conditions using QueryBuilder
builder = _build_base_conditions_builder(include_dummy, workorder, lotid)
builder.add_param_condition("WORKCENTER_GROUP", workcenter)

if package:
conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
builder.add_param_condition("PACKAGE_LEF", package)

# WIP status filter (RUN/QUEUE/HOLD based on EQUIPMENTCOUNT and CURRENTHOLDCOUNT)
if status:
status_upper = status.upper()
if status_upper == 'RUN':
conditions.append("COALESCE(EQUIPMENTCOUNT, 0) > 0")
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) > 0")
elif status_upper == 'HOLD':
conditions.append("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) > 0")
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) > 0")
# Hold type sub-filter
if hold_type:
non_quality_list = _build_hold_type_sql_list()
if hold_type == 'quality':
conditions.append(
f"(HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({non_quality_list}))"
)
elif hold_type == 'non-quality':
conditions.append(f"HOLDREASONNAME IN ({non_quality_list})")
_add_hold_type_conditions(builder, hold_type)
elif status_upper == 'QUEUE':
conditions.append("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) = 0")
builder.add_condition("COALESCE(EQUIPMENTCOUNT, 0) = 0 AND COALESCE(CURRENTHOLDCOUNT, 0) = 0")

where_clause = f"WHERE {' AND '.join(conditions)}"
where_clause, params = builder.build_where_only()

# Get summary with RUN/QUEUE/HOLD classification (IT standard)
# Note: summary always uses base_conditions (without hold_type filter) to show full breakdown
summary_conditions = _build_base_conditions(include_dummy, workorder, lotid)
summary_conditions.append(f"WORKCENTER_GROUP = '{_escape_sql(workcenter)}'")
# Build summary conditions (without status/hold_type filter for full breakdown)
summary_builder = _build_base_conditions_builder(include_dummy, workorder, lotid)
summary_builder.add_param_condition("WORKCENTER_GROUP", workcenter)
if package:
summary_conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
summary_where = f"WHERE {' AND '.join(summary_conditions)}"
non_quality_list = _build_hold_type_sql_list()
summary_builder.add_param_condition("PACKAGE_LEF", package)

summary_where, summary_params = summary_builder.build_where_only()
non_quality_list = CommonFilters.get_non_quality_reasons_sql()

summary_sql = f"""
SELECT
@@ -902,7 +857,7 @@ def _get_wip_detail_from_oracle(
{summary_where}
"""

summary_df = read_sql_df(summary_sql)
summary_df = read_sql_df(summary_sql, summary_params)

if summary_df is None or summary_df.empty:
return None
@@ -919,7 +874,6 @@ def _get_wip_detail_from_oracle(
non_quality_hold_lots = int(summary_row['NON_QUALITY_HOLD_LOTS'] or 0)

# Determine filtered count based on status filter
# When a status filter is applied, use the corresponding count for pagination
if status:
status_upper = status.upper()
if status_upper == 'RUN':
@@ -927,7 +881,6 @@ def _get_wip_detail_from_oracle(
elif status_upper == 'QUEUE':
filtered_count = queue_lots
elif status_upper == 'HOLD':
# Further filter by hold_type if specified
if hold_type == 'quality':
filtered_count = quality_hold_lots
elif hold_type == 'non-quality':
@@ -957,33 +910,20 @@ def _get_wip_detail_from_oracle(
ORDER BY SPECSEQUENCE
"""

specs_df = read_sql_df(specs_sql)
specs_df = read_sql_df(specs_sql, params)
specs = specs_df['SPECNAME'].tolist() if specs_df is not None and not specs_df.empty else []

# Get paginated lot details with WIP Status (IT standard)
# Get paginated lot details using SQL file with bind variables
offset = (page - 1) * page_size
lots_sql = f"""
SELECT * FROM (
SELECT
LOTID,
EQUIPMENTS,
STATUS,
HOLDREASONNAME,
QTY,
PACKAGE_LEF,
SPECNAME,
CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN 'RUN'
WHEN COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN 'HOLD'
ELSE 'QUEUE' END AS WIP_STATUS,
ROW_NUMBER() OVER (ORDER BY LOTID) as RN
FROM {WIP_VIEW}
{where_clause}
)
WHERE RN > {offset} AND RN <= {offset + page_size}
ORDER BY RN
"""
base_detail_sql = SQLLoader.load("wip/detail")
detail_sql = base_detail_sql.replace("{{ WHERE_CLAUSE }}", where_clause)

lots_df = read_sql_df(lots_sql)
# Add pagination params to existing params
detail_params = params.copy()
detail_params['offset'] = offset
detail_params['limit'] = page_size

lots_df = read_sql_df(detail_sql, detail_params)

lots = []
if lots_df is not None and not lots_df.empty:
@@ -1014,7 +954,7 @@ def _get_wip_detail_from_oracle(
'sys_date': sys_date
}
except Exception as exc:
print(f"WIP detail query failed: {exc}")
logger.error(f"WIP detail query failed: {exc}")
import traceback
traceback.print_exc()
return None
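The pagination change in the detail hunk above moves the `ROW_NUMBER()` bounds from f-string literals to `:offset`/`:limit` bind variables, so only the WHERE fragment (already safe, since QueryBuilder emitted it) is spliced into the template as text. A sketch of the assembly step; the template text here is a trimmed stand-in for `wip/detail.sql`, not the real file:

```python
# Hypothetical sketch of the bind-variable pagination in wip/detail.sql.
# WIP_DETAIL_TEMPLATE and build_detail_query are illustrative stand-ins;
# only the {{ WHERE_CLAUSE }} / :offset / :limit mechanics come from the diff.
from typing import Any, Dict, Tuple

WIP_DETAIL_TEMPLATE = """
SELECT * FROM (
    SELECT LOTID, QTY,
           ROW_NUMBER() OVER (ORDER BY LOTID) AS RN
    FROM WIP_VIEW
    {{ WHERE_CLAUSE }}
)
WHERE RN > :offset AND RN <= :offset + :limit
ORDER BY RN
""".strip()


def build_detail_query(
    where_clause: str,
    params: Dict[str, Any],
    page: int,
    page_size: int,
) -> Tuple[str, Dict[str, Any]]:
    # Trusted fragment goes in as text; everything user-controlled stays a bind.
    sql = WIP_DETAIL_TEMPLATE.replace("{{ WHERE_CLAUSE }}", where_clause)
    detail_params = dict(params)
    detail_params["offset"] = (page - 1) * page_size
    detail_params["limit"] = page_size
    return sql, detail_params


sql, detail_params = build_detail_query(
    "WHERE WORKCENTER_GROUP = :workcenter_group",
    {"workcenter_group": "DB"},
    page=2,
    page_size=50,
)
```

Page 2 with a page size of 50 thus yields `offset=50`, `limit=50`, and the database sees the same statement text for every page, which helps Oracle reuse the parsed cursor.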
@@ -1067,9 +1007,9 @@ def get_workcenters(include_dummy: bool = False) -> Optional[List[Dict[str, Any]
 def _get_workcenters_from_oracle(include_dummy: bool = False) -> Optional[List[Dict[str, Any]]]:
     """Get workcenters directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy)
-        conditions.append("WORKCENTER_GROUP IS NOT NULL")
-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        builder = _build_base_conditions_builder(include_dummy)
+        builder.add_is_not_null("WORKCENTER_GROUP")
+        where_clause, params = builder.build_where_only()

         sql = f"""
             SELECT
@@ -1081,7 +1021,7 @@ def _get_workcenters_from_oracle(include_dummy: bool = False) -> Optional[List[D
             GROUP BY WORKCENTER_GROUP, WORKCENTERSEQUENCE_GROUP
             ORDER BY WORKCENTERSEQUENCE_GROUP
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1142,9 +1082,9 @@ def get_packages(include_dummy: bool = False) -> Optional[List[Dict[str, Any]]]:
 def _get_packages_from_oracle(include_dummy: bool = False) -> Optional[List[Dict[str, Any]]]:
     """Get packages directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy)
-        conditions.append("PACKAGE_LEF IS NOT NULL")
-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        builder = _build_base_conditions_builder(include_dummy)
+        builder.add_is_not_null("PACKAGE_LEF")
+        where_clause, params = builder.build_where_only()

         sql = f"""
             SELECT
@@ -1155,7 +1095,7 @@ def _get_packages_from_oracle(include_dummy: bool = False) -> Optional[List[Dict
             GROUP BY PACKAGE_LEF
             ORDER BY COUNT(*) DESC
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1245,26 +1185,27 @@ def _search_workorders_from_oracle(
 ) -> Optional[List[str]]:
     """Search workorders directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy, lotid=lotid)
-        conditions.append(f"WORKORDER LIKE '%{_escape_sql(q)}%'")
-        conditions.append("WORKORDER IS NOT NULL")
+        builder = _build_base_conditions_builder(include_dummy, lotid=lotid)
+        builder.add_like_condition("WORKORDER", q, position="both")
+        builder.add_is_not_null("WORKORDER")

         # Apply cross-filters
         if package:
-            conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
+            builder.add_param_condition("PACKAGE_LEF", package)
         if pj_type:
-            conditions.append(f"PJ_TYPE = '{_escape_sql(pj_type)}'")
+            builder.add_param_condition("PJ_TYPE", pj_type)

-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        where_clause, params = builder.build_where_only()
+        params['row_limit'] = limit

         sql = f"""
             SELECT DISTINCT WORKORDER
             FROM {WIP_VIEW}
             {where_clause}
             ORDER BY WORKORDER
-            FETCH FIRST {limit} ROWS ONLY
+            FETCH FIRST :row_limit ROWS ONLY
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1342,25 +1283,26 @@ def _search_lot_ids_from_oracle(
 ) -> Optional[List[str]]:
     """Search lot IDs directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy, workorder=workorder)
-        conditions.append(f"LOTID LIKE '%{_escape_sql(q)}%'")
+        builder = _build_base_conditions_builder(include_dummy, workorder=workorder)
+        builder.add_like_condition("LOTID", q, position="both")

         # Apply cross-filters
         if package:
-            conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
+            builder.add_param_condition("PACKAGE_LEF", package)
         if pj_type:
-            conditions.append(f"PJ_TYPE = '{_escape_sql(pj_type)}'")
+            builder.add_param_condition("PJ_TYPE", pj_type)

-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        where_clause, params = builder.build_where_only()
+        params['row_limit'] = limit

         sql = f"""
             SELECT LOTID
             FROM {WIP_VIEW}
             {where_clause}
             ORDER BY LOTID
-            FETCH FIRST {limit} ROWS ONLY
+            FETCH FIRST :row_limit ROWS ONLY
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1443,24 +1385,25 @@ def _search_packages_from_oracle(
 ) -> Optional[List[str]]:
     """Search packages directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy, workorder=workorder, lotid=lotid)
-        conditions.append(f"PACKAGE_LEF LIKE '%{_escape_sql(q)}%'")
-        conditions.append("PACKAGE_LEF IS NOT NULL")
+        builder = _build_base_conditions_builder(include_dummy, workorder=workorder, lotid=lotid)
+        builder.add_like_condition("PACKAGE_LEF", q, position="both")
+        builder.add_is_not_null("PACKAGE_LEF")

         # Apply cross-filter
         if pj_type:
-            conditions.append(f"PJ_TYPE = '{_escape_sql(pj_type)}'")
+            builder.add_param_condition("PJ_TYPE", pj_type)

-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        where_clause, params = builder.build_where_only()
+        params['row_limit'] = limit

         sql = f"""
             SELECT DISTINCT PACKAGE_LEF
             FROM {WIP_VIEW}
             {where_clause}
             ORDER BY PACKAGE_LEF
-            FETCH FIRST {limit} ROWS ONLY
+            FETCH FIRST :row_limit ROWS ONLY
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1543,24 +1486,25 @@ def _search_types_from_oracle(
 ) -> Optional[List[str]]:
     """Search types directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy, workorder=workorder, lotid=lotid)
-        conditions.append(f"PJ_TYPE LIKE '%{_escape_sql(q)}%'")
-        conditions.append("PJ_TYPE IS NOT NULL")
+        builder = _build_base_conditions_builder(include_dummy, workorder=workorder, lotid=lotid)
+        builder.add_like_condition("PJ_TYPE", q, position="both")
+        builder.add_is_not_null("PJ_TYPE")

         # Apply cross-filter
         if package:
-            conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
+            builder.add_param_condition("PACKAGE_LEF", package)

-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        where_clause, params = builder.build_where_only()
+        params['row_limit'] = limit

         sql = f"""
             SELECT DISTINCT PJ_TYPE
             FROM {WIP_VIEW}
             {where_clause}
             ORDER BY PJ_TYPE
-            FETCH FIRST {limit} ROWS ONLY
+            FETCH FIRST :row_limit ROWS ONLY
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return []
@@ -1632,11 +1576,11 @@ def _get_hold_detail_summary_from_oracle(
 ) -> Optional[Dict[str, Any]]:
     """Get hold detail summary directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy)
-        conditions.append("STATUS = 'HOLD'")
-        conditions.append("CURRENTHOLDCOUNT > 0")
-        conditions.append(f"HOLDREASONNAME = '{_escape_sql(reason)}'")
-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        builder = _build_base_conditions_builder(include_dummy)
+        builder.add_param_condition("STATUS", "HOLD")
+        builder.add_condition("CURRENTHOLDCOUNT > 0")
+        builder.add_param_condition("HOLDREASONNAME", reason)
+        where_clause, params = builder.build_where_only()

         sql = f"""
             SELECT
@@ -1648,7 +1592,7 @@ def _get_hold_detail_summary_from_oracle(
             FROM {WIP_VIEW}
             {where_clause}
         """
-        df = read_sql_df(sql)
+        df = read_sql_df(sql, params)

         if df is None or df.empty:
             return None
@@ -1808,11 +1752,11 @@ def _get_hold_detail_distribution_from_oracle(
 ) -> Optional[Dict[str, Any]]:
     """Get hold detail distribution directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy)
-        conditions.append("STATUS = 'HOLD'")
-        conditions.append("CURRENTHOLDCOUNT > 0")
-        conditions.append(f"HOLDREASONNAME = '{_escape_sql(reason)}'")
-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        builder = _build_base_conditions_builder(include_dummy)
+        builder.add_param_condition("STATUS", "HOLD")
+        builder.add_condition("CURRENTHOLDCOUNT > 0")
+        builder.add_param_condition("HOLDREASONNAME", reason)
+        where_clause, params = builder.build_where_only()

         # Get total for percentage calculation
         total_sql = f"""
@@ -1820,7 +1764,7 @@ def _get_hold_detail_distribution_from_oracle(
             FROM {WIP_VIEW}
             {where_clause}
         """
-        total_df = read_sql_df(total_sql)
+        total_df = read_sql_df(total_sql, params)
         total_lots = int(total_df.iloc[0]['TOTAL_LOTS'] or 0) if total_df is not None else 0

         if total_lots == 0:
@@ -1842,7 +1786,7 @@ def _get_hold_detail_distribution_from_oracle(
             GROUP BY WORKCENTER_GROUP
             ORDER BY COUNT(*) DESC
         """
-        wc_df = read_sql_df(wc_sql)
+        wc_df = read_sql_df(wc_sql, params)
         by_workcenter = []
         if wc_df is not None and not wc_df.empty:
             for _, row in wc_df.iterrows():
@@ -1866,7 +1810,7 @@ def _get_hold_detail_distribution_from_oracle(
             GROUP BY PACKAGE_LEF
             ORDER BY COUNT(*) DESC
         """
-        pkg_df = read_sql_df(pkg_sql)
+        pkg_df = read_sql_df(pkg_sql, params)
         by_package = []
         if pkg_df is not None and not pkg_df.empty:
             for _, row in pkg_df.iterrows():
@@ -1898,7 +1842,7 @@ def _get_hold_detail_distribution_from_oracle(
                 ELSE '7+'
             END
         """
-        age_df = read_sql_df(age_sql)
+        age_df = read_sql_df(age_sql, params)

         # Define age ranges in order
         age_labels = {
@@ -2056,27 +2000,27 @@ def _get_hold_detail_lots_from_oracle(
 ) -> Optional[Dict[str, Any]]:
     """Get hold detail lots directly from Oracle (fallback)."""
     try:
-        conditions = _build_base_conditions(include_dummy)
-        conditions.append("STATUS = 'HOLD'")
-        conditions.append("CURRENTHOLDCOUNT > 0")
-        conditions.append(f"HOLDREASONNAME = '{_escape_sql(reason)}'")
+        builder = _build_base_conditions_builder(include_dummy)
+        builder.add_param_condition("STATUS", "HOLD")
+        builder.add_condition("CURRENTHOLDCOUNT > 0")
+        builder.add_param_condition("HOLDREASONNAME", reason)

         # Optional filters
         if workcenter:
-            conditions.append(f"WORKCENTER_GROUP = '{_escape_sql(workcenter)}'")
+            builder.add_param_condition("WORKCENTER_GROUP", workcenter)
         if package:
-            conditions.append(f"PACKAGE_LEF = '{_escape_sql(package)}'")
+            builder.add_param_condition("PACKAGE_LEF", package)
         if age_range:
             if age_range == '0-1':
-                conditions.append("AGEBYDAYS >= 0 AND AGEBYDAYS < 1")
+                builder.add_condition("AGEBYDAYS >= 0 AND AGEBYDAYS < 1")
             elif age_range == '1-3':
-                conditions.append("AGEBYDAYS >= 1 AND AGEBYDAYS < 3")
+                builder.add_condition("AGEBYDAYS >= 1 AND AGEBYDAYS < 3")
             elif age_range == '3-7':
-                conditions.append("AGEBYDAYS >= 3 AND AGEBYDAYS < 7")
+                builder.add_condition("AGEBYDAYS >= 3 AND AGEBYDAYS < 7")
             elif age_range == '7+':
-                conditions.append("AGEBYDAYS >= 7")
+                builder.add_condition("AGEBYDAYS >= 7")

-        where_clause = f"WHERE {' AND '.join(conditions)}"
+        where_clause, params = builder.build_where_only()

         # Get total count
         count_sql = f"""
@@ -2084,11 +2028,15 @@ def _get_hold_detail_lots_from_oracle(
             FROM {WIP_VIEW}
             {where_clause}
         """
-        count_df = read_sql_df(count_sql)
+        count_df = read_sql_df(count_sql, params)
         total = int(count_df.iloc[0]['TOTAL'] or 0) if count_df is not None else 0

-        # Get paginated lots
+        # Get paginated lots with bind variables
         offset = (page - 1) * page_size
+        lots_params = params.copy()
+        lots_params['offset'] = offset
+        lots_params['limit'] = page_size
+
         lots_sql = f"""
             SELECT * FROM (
                 SELECT
@@ -2106,10 +2054,10 @@ def _get_hold_detail_lots_from_oracle(
                 FROM {WIP_VIEW}
                 {where_clause}
             )
-            WHERE RN > {offset} AND RN <= {offset + page_size}
+            WHERE RN > :offset AND RN <= :offset + :limit
             ORDER BY RN
         """
-        lots_df = read_sql_df(lots_sql)
+        lots_df = read_sql_df(lots_sql, lots_params)

         lots = []
         if lots_df is not None and not lots_df.empty:
@@ -2297,9 +2245,9 @@ def _get_lot_detail_from_oracle(lotid: str) -> Optional[Dict[str, Any]]:
             WAFER_FACTOR,
             SYS_DATE
         FROM {WIP_VIEW}
-        WHERE LOTID = '{_escape_sql(lotid)}'
+        WHERE LOTID = :lotid
     """
-    df = read_sql_df(sql)
+    df = read_sql_df(sql, {'lotid': lotid})

     if df is None or df.empty:
         return None
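The `ROW_NUMBER` pagination pattern adopted above (`RN > :offset AND RN <= :offset + :limit`, with values supplied as bind variables instead of f-string interpolation) can be sanity-checked outside Oracle. The sketch below uses SQLite purely as a stand-in, since its `:name` parameter syntax happens to match Oracle's bind-variable style; `WIP_VIEW` and `read_sql_df` are not involved and the table is invented for illustration.

```python
import sqlite3

# In-memory stand-in for the Oracle WIP view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lots (LOTID TEXT)")
conn.executemany("INSERT INTO lots VALUES (?)",
                 [(f"LOT{i:02d}",) for i in range(1, 11)])

# Same shape as the refactored query: an inner ROW_NUMBER,
# then a banded filter on bound :offset / :limit values.
SQL = """
SELECT LOTID FROM (
    SELECT LOTID, ROW_NUMBER() OVER (ORDER BY LOTID) AS RN
    FROM lots
)
WHERE RN > :offset AND RN <= :offset + :limit
ORDER BY RN
"""

page, page_size = 2, 3
params = {"offset": (page - 1) * page_size, "limit": page_size}
rows = [r[0] for r in conn.execute(SQL, params)]
print(rows)  # → ['LOT04', 'LOT05', 'LOT06']
```

Because the page window arrives as data rather than SQL text, the statement text is identical for every page, which also lets Oracle reuse the cached execution plan.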
91
src/mes_dashboard/sql/__init__.py
Normal file
@@ -0,0 +1,91 @@
"""
SQL Query Management Module

Provides safe SQL query loading, building, and common filters.

Architecture Overview:
    This module provides three main components for SQL query management:

    1. SQLLoader - Load SQL templates from .sql files with LRU caching
    2. QueryBuilder - Build parameterized WHERE conditions safely
    3. CommonFilters - Reusable filter patterns for common queries

Directory Structure:
    src/mes_dashboard/sql/
    ├── __init__.py          # Public API exports
    ├── loader.py            # SQLLoader implementation
    ├── builder.py           # QueryBuilder implementation
    ├── filters.py           # CommonFilters implementation
    ├── dashboard/           # Dashboard-related SQL files
    │   ├── kpi.sql
    │   ├── heatmap.sql
    │   ├── workcenter_cards.sql
    │   └── resource_detail_with_job.sql
    ├── resource/            # Resource status SQL files
    │   ├── latest_status.sql
    │   ├── status_summary.sql
    │   ├── by_status.sql
    │   ├── by_workcenter.sql
    │   ├── detail.sql
    │   └── workcenter_status_matrix.sql
    ├── resource_history/    # Resource history SQL files
    │   ├── kpi.sql
    │   ├── trend.sql
    │   ├── heatmap.sql
    │   └── detail.sql
    └── wip/                 # WIP (Work In Progress) SQL files
        ├── summary.sql
        ├── matrix.sql
        └── detail.sql

SQL File Format:
    SQL files use placeholders for dynamic parts:

    - {{ PLACEHOLDER }} - Replaced via str.replace() before execution
    - :param_name - Oracle bind variables (filled by params dict)

Example SQL file (resource/by_status.sql):
    -- Resource count by status
    -- Placeholders:
    --   {{ LATEST_STATUS_SUBQUERY }} - Base CTE for latest status
    -- Parameters:
    --   (from QueryBuilder)
    SELECT NEWSTATUSNAME, COUNT(*) as COUNT
    FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
    WHERE 1=1 {{ WHERE_CLAUSE }}
    GROUP BY NEWSTATUSNAME

Usage Example:
    >>> from mes_dashboard.sql import SQLLoader, QueryBuilder
    >>> from mes_dashboard.core.database import read_sql_df
    >>>
    >>> # Load SQL template
    >>> sql = SQLLoader.load("resource/by_status")
    >>>
    >>> # Build parameterized conditions
    >>> builder = QueryBuilder()
    >>> builder.add_in_condition("LOCATIONNAME", ["FAB1", "FAB2"])
    >>> builder.add_like_condition("WORKCENTERNAME", "ASSY", position="start")
    >>> where_clause, params = builder.build_where_only()
    >>>
    >>> # Replace placeholders and execute
    >>> sql = sql.replace("{{ LATEST_STATUS_SUBQUERY }}", base_cte)
    >>> sql = sql.replace("{{ WHERE_CLAUSE }}", where_clause)
    >>> df = read_sql_df(sql, params)

SQL Injection Prevention:
    - Always use QueryBuilder for user-provided values
    - Use :param_name bind variables for all dynamic values
    - Placeholders {{ }} are only for static, pre-defined SQL fragments
    - Never interpolate user input directly into SQL strings
"""

from .loader import SQLLoader
from .builder import QueryBuilder
from .filters import CommonFilters

__all__ = [
    "SQLLoader",
    "QueryBuilder",
    "CommonFilters",
]
263
src/mes_dashboard/sql/builder.py
Normal file
@@ -0,0 +1,263 @@
"""
Query Builder

Provides safe SQL query building with parameterized conditions.
"""

from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional, Tuple


@dataclass
class QueryBuilder:
    """
    Safe SQL query builder with parameterized conditions.

    Builds WHERE clauses with Oracle bind variables (:param_name)
    to prevent SQL injection.
    """

    base_sql: str = ""
    conditions: List[str] = field(default_factory=list)
    params: Dict[str, Any] = field(default_factory=dict)
    _param_counter: int = field(default=0, repr=False)

    def _next_param(self) -> str:
        """Generate next parameter name."""
        name = f"p{self._param_counter}"
        self._param_counter += 1
        return name

    def add_condition(self, condition: str) -> "QueryBuilder":
        """
        Add a fixed condition (no parameters).

        Args:
            condition: SQL condition string

        Returns:
            self for method chaining
        """
        self.conditions.append(condition)
        return self

    def add_param_condition(
        self,
        column: str,
        value: Any,
        operator: str = "=",
    ) -> "QueryBuilder":
        """
        Add a parameterized condition.

        Args:
            column: Column name
            value: Value to compare
            operator: Comparison operator (default: "=")

        Returns:
            self for method chaining
        """
        param_name = self._next_param()
        self.conditions.append(f"{column} {operator} :{param_name}")
        self.params[param_name] = value
        return self

    def add_in_condition(
        self,
        column: str,
        values: List[Any],
    ) -> "QueryBuilder":
        """
        Add an IN condition with parameterized values.

        Args:
            column: Column name
            values: List of values for IN clause

        Returns:
            self for method chaining
        """
        if not values:
            return self

        param_names = []
        for val in values:
            param_name = self._next_param()
            param_names.append(f":{param_name}")
            self.params[param_name] = val

        self.conditions.append(f"{column} IN ({', '.join(param_names)})")
        return self

    def add_not_in_condition(
        self,
        column: str,
        values: List[Any],
        allow_null: bool = False,
    ) -> "QueryBuilder":
        """
        Add a NOT IN condition with parameterized values.

        Args:
            column: Column name
            values: List of values to exclude
            allow_null: If True, also allows NULL values

        Returns:
            self for method chaining
        """
        if not values:
            return self

        param_names = []
        for val in values:
            param_name = self._next_param()
            param_names.append(f":{param_name}")
            self.params[param_name] = val

        not_in_clause = f"{column} NOT IN ({', '.join(param_names)})"

        if allow_null:
            self.conditions.append(f"({column} IS NULL OR {not_in_clause})")
        else:
            self.conditions.append(not_in_clause)

        return self

    def add_like_condition(
        self,
        column: str,
        value: str,
        position: str = "both",
    ) -> "QueryBuilder":
        """
        Add a LIKE condition with escaped wildcards.

        Args:
            column: Column name
            value: Search value (wildcards will be escaped)
            position: Where to add wildcards:
                - "both": %value%
                - "start": value%
                - "end": %value

        Returns:
            self for method chaining
        """
        # Escape SQL LIKE wildcards
        escaped = value.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")

        if position == "both":
            pattern = f"%{escaped}%"
        elif position == "start":
            pattern = f"{escaped}%"
        elif position == "end":
            pattern = f"%{escaped}"
        else:
            pattern = escaped

        param_name = self._next_param()
        self.conditions.append(f"{column} LIKE :{param_name} ESCAPE '\\'")
        self.params[param_name] = pattern

        return self

    def add_or_like_conditions(
        self,
        column: str,
        values: List[str],
        position: str = "both",
        case_insensitive: bool = False,
    ) -> "QueryBuilder":
        """
        Add multiple LIKE conditions combined with OR.

        Args:
            column: Column name
            values: List of search values (wildcards will be escaped)
            position: Where to add wildcards (both/start/end)
            case_insensitive: If True, use UPPER() for case-insensitive matching

        Returns:
            self for method chaining
        """
        if not values:
            return self

        like_conditions = []
        col_expr = f"UPPER({column})" if case_insensitive else column

        for val in values:
            # Escape SQL LIKE wildcards
            escaped = val.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
            if case_insensitive:
                escaped = escaped.upper()

            if position == "both":
                pattern = f"%{escaped}%"
            elif position == "start":
                pattern = f"{escaped}%"
            elif position == "end":
                pattern = f"%{escaped}"
            else:
                pattern = escaped

            param_name = self._next_param()
            like_conditions.append(f"{col_expr} LIKE :{param_name} ESCAPE '\\'")
            self.params[param_name] = pattern

        self.conditions.append(f"({' OR '.join(like_conditions)})")
        return self

    def add_is_null(self, column: str) -> "QueryBuilder":
        """Add IS NULL condition."""
        self.conditions.append(f"{column} IS NULL")
        return self

    def add_is_not_null(self, column: str) -> "QueryBuilder":
        """Add IS NOT NULL condition."""
        self.conditions.append(f"{column} IS NOT NULL")
        return self

    def build(self) -> Tuple[str, Dict[str, Any]]:
        """
        Build the final SQL with WHERE clause.

        Replaces {{ WHERE_CLAUSE }} placeholder in base_sql.
        If no conditions, placeholder is replaced with empty string.

        Returns:
            Tuple of (sql_string, params_dict)
        """
        if self.conditions:
            where_clause = f"WHERE {' AND '.join(self.conditions)}"
        else:
            where_clause = ""

        sql = self.base_sql.replace("{{ WHERE_CLAUSE }}", where_clause)
        return sql, self.params.copy()

    def build_where_only(self) -> Tuple[str, Dict[str, Any]]:
        """
        Build only the WHERE clause (without base SQL).

        Returns:
            Tuple of (where_clause, params_dict)
        """
        if self.conditions:
            where_clause = f"WHERE {' AND '.join(self.conditions)}"
        else:
            where_clause = ""
        return where_clause, self.params.copy()

    def get_conditions_sql(self) -> str:
        """Get conditions as AND-joined string (without WHERE keyword)."""
        return " AND ".join(self.conditions) if self.conditions else ""

    def reset(self) -> "QueryBuilder":
        """Reset conditions and params, keep base_sql."""
        self.conditions = []
        self.params = {}
        self._param_counter = 0
        return self
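As a quick end-to-end check of the pattern builder.py implements, here is a trimmed-down, standalone mirror of its three most-used methods (the real class lives in src/mes_dashboard/sql/builder.py; `MiniQueryBuilder` below only reproduces its behaviour for illustration):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple


@dataclass
class MiniQueryBuilder:
    # Trimmed-down mirror of QueryBuilder: auto-numbered :p0, :p1, ... bind names.
    conditions: List[str] = field(default_factory=list)
    params: Dict[str, Any] = field(default_factory=dict)
    _counter: int = 0

    def _next(self) -> str:
        name = f"p{self._counter}"
        self._counter += 1
        return name

    def add_param_condition(self, column: str, value: Any, operator: str = "=") -> "MiniQueryBuilder":
        p = self._next()
        self.conditions.append(f"{column} {operator} :{p}")
        self.params[p] = value
        return self

    def add_in_condition(self, column: str, values: List[Any]) -> "MiniQueryBuilder":
        names = []
        for v in values:
            p = self._next()
            names.append(f":{p}")
            self.params[p] = v
        self.conditions.append(f"{column} IN ({', '.join(names)})")
        return self

    def add_like_condition(self, column: str, value: str, position: str = "both") -> "MiniQueryBuilder":
        # Escape LIKE wildcards so user input cannot widen the match.
        escaped = value.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
        pattern = {"both": f"%{escaped}%", "start": f"{escaped}%", "end": f"%{escaped}"}[position]
        p = self._next()
        self.conditions.append(f"{column} LIKE :{p} ESCAPE '\\'")
        self.params[p] = pattern
        return self

    def build_where_only(self) -> Tuple[str, Dict[str, Any]]:
        if not self.conditions:
            return "", dict(self.params)
        return f"WHERE {' AND '.join(self.conditions)}", dict(self.params)


b = MiniQueryBuilder()
b.add_param_condition("STATUS", "HOLD")
b.add_in_condition("LOCATIONNAME", ["FAB1", "FAB2"])
b.add_like_condition("WORKORDER", "WO_24", position="start")
where_clause, params = b.build_where_only()
print(where_clause)
print(params)
```

Note that the user-typed `WO_24` ends up only in the `params` dict, with its `_` wildcard escaped; the SQL text itself contains nothing user-controlled.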
31
src/mes_dashboard/sql/dashboard/heatmap.sql
Normal file
@@ -0,0 +1,31 @@
-- Utilization Heatmap Query
-- Returns equipment utilization data by workcenter and date
--
-- Calculates PRD% = PRD_HOURS / AVAIL_HOURS * 100
-- where AVAIL_HOURS = PRD + SBY + UDT + SDT + EGT (excludes NST)
--
-- Parameters:
--   :days - Number of days to look back
--
-- Dynamic placeholders:
--   LOCATION_FILTER - Location exclusion filter
--   ASSET_STATUS_FILTER - Asset status exclusion filter
--   FLAG_FILTER - Equipment flag filters (isProduction, isKey, isMonitor)

SELECT
    ss.WORKCENTERNAME,
    TRUNC(ss.TXNDATE) as DATA_DATE,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN ss.OLDSTATUSNAME IN ('PRD', 'SBY', 'UDT', 'SDT', 'EGT') THEN ss.HOURS ELSE 0 END) as AVAIL_HOURS
FROM DWH.DW_MES_RESOURCESTATUS_SHIFT ss
JOIN DWH.DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
WHERE ss.TXNDATE >= TRUNC(SYSDATE) - :days
  AND ss.TXNDATE < TRUNC(SYSDATE)
  AND ss.WORKCENTERNAME IS NOT NULL
  AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
       OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
  {{ LOCATION_FILTER }}
  {{ ASSET_STATUS_FILTER }}
  {{ FLAG_FILTER }}
GROUP BY ss.WORKCENTERNAME, TRUNC(ss.TXNDATE)
ORDER BY ss.WORKCENTERNAME, DATA_DATE
26
src/mes_dashboard/sql/dashboard/kpi.sql
Normal file
@@ -0,0 +1,26 @@
-- Dashboard KPI Query
-- Returns overall KPI statistics for dashboard header
--
-- Status categories:
--   RUN: PRD (Production)
--   DOWN: UDT + SDT (Down Time)
--   IDLE: SBY + NST (Idle)
--   ENG: EGT (Engineering Time)
--
-- OU% = PRD / (PRD + SBY + EGT + SDT + UDT) * 100
--
-- Placeholders:
--   LATEST_STATUS_SUBQUERY - Base subquery for latest resource status
--   WHERE_CLAUSE - Additional filter conditions

SELECT
    COUNT(*) as TOTAL,
    SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME NOT IN ('PRD','SBY','UDT','SDT','EGT','NST') THEN 1 ELSE 0 END) as OTHER_COUNT
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
{{ WHERE_CLAUSE }}
64
src/mes_dashboard/sql/dashboard/kpi_standalone.sql
Normal file
@@ -0,0 +1,64 @@
-- Dashboard KPI Standalone Query
-- Returns overall KPI statistics for dashboard header
-- This is a self-contained query with CTE for optimal performance
--
-- Placeholders:
--   DAYS_BACK - Number of days to look back
--   LOCATION_FILTER - Location exclusion filter (AND ...)
--   ASSET_STATUS_FILTER - Asset status exclusion filter (AND ...)
--   WHERE_CLAUSE - Additional filter conditions

WITH resource_latest_status AS (
    SELECT *
    FROM (
        SELECT
            r.RESOURCEID,
            r.RESOURCENAME,
            r.OBJECTCATEGORY,
            r.OBJECTTYPE,
            r.RESOURCEFAMILYNAME,
            r.WORKCENTERNAME,
            r.LOCATIONNAME,
            r.VENDORNAME,
            r.VENDORMODEL,
            r.PJ_DEPARTMENT,
            r.PJ_ASSETSSTATUS,
            r.PJ_ISPRODUCTION,
            r.PJ_ISKEY,
            r.PJ_ISMONITOR,
            r.PJ_LOTID,
            r.DESCRIPTION,
            s.NEWSTATUSNAME,
            s.NEWREASONNAME,
            s.LASTSTATUSCHANGEDATE,
            s.OLDSTATUSNAME,
            s.OLDREASONNAME,
            s.AVAILABILITY,
            s.JOBID,
            s.TXNDATE,
            ROW_NUMBER() OVER (
                PARTITION BY r.RESOURCEID
                ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
                         COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
            ) AS rn
        FROM DWH.DW_MES_RESOURCE r
        JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
        WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
               OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
          AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= SYSDATE - {{ DAYS_BACK }}
          {{ LOCATION_FILTER }}
          {{ ASSET_STATUS_FILTER }}
    )
    WHERE rn = 1
)
SELECT
    COUNT(*) as TOTAL,
    SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST_COUNT,
    SUM(CASE WHEN NEWSTATUSNAME NOT IN ('PRD','SBY','UDT','SDT','EGT','NST') THEN 1 ELSE 0 END) as OTHER_COUNT
FROM resource_latest_status
{{ WHERE_CLAUSE }}
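The `resource_latest_status` CTE above is a standard latest-row-per-group idiom: number each resource's status rows newest-first with `ROW_NUMBER()`, then keep only `rn = 1`. A SQLite stand-in (table and sample rows invented for illustration; SQLite 3.25+ is assumed for window-function support) shows the shape:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE status (RESOURCEID TEXT, NEWSTATUSNAME TEXT, CHANGEDATE TEXT)")
conn.executemany(
    "INSERT INTO status VALUES (?, ?, ?)",
    [
        ("EQ01", "SBY", "2026-02-01 08:00"),
        ("EQ01", "PRD", "2026-02-01 09:30"),  # latest row for EQ01
        ("EQ02", "UDT", "2026-02-01 07:15"),  # only (and latest) row for EQ02
    ],
)

# Same dedup idiom as the CTE: partition per resource, order newest-first,
# keep rank 1.
SQL = """
SELECT RESOURCEID, NEWSTATUSNAME FROM (
    SELECT RESOURCEID, NEWSTATUSNAME,
           ROW_NUMBER() OVER (
               PARTITION BY RESOURCEID
               ORDER BY CHANGEDATE DESC
           ) AS rn
    FROM status
)
WHERE rn = 1
ORDER BY RESOURCEID
"""
latest = dict(conn.execute(SQL).fetchall())
print(latest)  # → {'EQ01': 'PRD', 'EQ02': 'UDT'}
```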
29
src/mes_dashboard/sql/dashboard/ou_trend.sql
Normal file
@@ -0,0 +1,29 @@
-- OU (Operating Utilization) Trend Query
-- Returns daily OU% for the past N days
--
-- Placeholders:
--   LOCATION_FILTER - Location exclusion filter
--   ASSET_STATUS_FILTER - Asset status exclusion filter
--   FLAG_FILTER - Equipment flag filter (isProduction, isKey, isMonitor)
-- Parameters:
--   :days - Number of days to look back

SELECT
    TRUNC(ss.TXNDATE) as DATA_DATE,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'PRD' THEN ss.HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'SBY' THEN ss.HOURS ELSE 0 END) as SBY_HOURS,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'UDT' THEN ss.HOURS ELSE 0 END) as UDT_HOURS,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'SDT' THEN ss.HOURS ELSE 0 END) as SDT_HOURS,
    SUM(CASE WHEN ss.OLDSTATUSNAME = 'EGT' THEN ss.HOURS ELSE 0 END) as EGT_HOURS,
    SUM(ss.HOURS) as TOTAL_HOURS
FROM DWH.DW_MES_RESOURCESTATUS_SHIFT ss
JOIN DWH.DW_MES_RESOURCE r ON ss.HISTORYID = r.RESOURCEID
WHERE ss.TXNDATE >= TRUNC(SYSDATE) - :days
  AND ss.TXNDATE < TRUNC(SYSDATE)
  AND ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
       OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
  {{ LOCATION_FILTER }}
  {{ ASSET_STATUS_FILTER }}
  {{ FLAG_FILTER }}
GROUP BY TRUNC(ss.TXNDATE)
ORDER BY DATA_DATE
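From the hourly sums this query returns, the service derives OU% as PRD / (PRD + SBY + UDT + SDT + EGT) * 100, the KPI definition quoted in kpi.sql. The arithmetic can be checked in a few lines (the row values below are made up):

```python
# Hypothetical one-day row from the ou_trend query.
row = {"PRD_HOURS": 18.0, "SBY_HOURS": 3.0, "UDT_HOURS": 1.5,
       "SDT_HOURS": 0.5, "EGT_HOURS": 1.0}

# Available hours: every tracked state except NST (not scheduled).
avail = sum(row[k] for k in ("PRD_HOURS", "SBY_HOURS", "UDT_HOURS",
                             "SDT_HOURS", "EGT_HOURS"))
ou_pct = round(row["PRD_HOURS"] / avail * 100, 1) if avail else 0.0
print(ou_pct)  # → 75.0
```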
101
src/mes_dashboard/sql/dashboard/resource_detail_with_job.sql
Normal file
@@ -0,0 +1,101 @@
-- Resource detail with JOB info for SDT/UDT drill-down
-- Placeholders:
--   DAYS_BACK - Number of days to look back
--   LOCATION_FILTER - Location exclusion filter (e.g., "AND r.LOCATIONNAME NOT IN (...)")
--   ASSET_STATUS_FILTER - Asset status exclusion filter
--   WHERE_CLAUSE - Dynamic WHERE conditions for final SELECT
-- Parameters:
--   :start_row - Pagination start row
--   :end_row - Pagination end row

WITH latest_txn AS (
    SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
    FROM DWH.DW_MES_RESOURCESTATUS
),
base_data AS (
    SELECT *
    FROM (
        SELECT
            r.RESOURCEID,
            r.RESOURCENAME,
            r.OBJECTCATEGORY,
            r.OBJECTTYPE,
            r.RESOURCEFAMILYNAME,
            r.WORKCENTERNAME,
            r.LOCATIONNAME,
            r.VENDORNAME,
            r.VENDORMODEL,
            r.PJ_DEPARTMENT,
            r.PJ_ASSETSSTATUS,
            r.PJ_ISPRODUCTION,
            r.PJ_ISKEY,
            r.PJ_ISMONITOR,
            r.PJ_LOTID,
            r.DESCRIPTION,
            s.NEWSTATUSNAME,
            s.NEWREASONNAME,
            s.LASTSTATUSCHANGEDATE,
            s.OLDSTATUSNAME,
            s.OLDREASONNAME,
            s.AVAILABILITY,
            s.JOBID,
            s.TXNDATE,
            ROW_NUMBER() OVER (
                PARTITION BY r.RESOURCEID
                ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
                         COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
            ) AS rn
        FROM DWH.DW_MES_RESOURCE r
        JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
        CROSS JOIN latest_txn lt
        WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
               OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
          AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - {{ DAYS_BACK }}
          {{ LOCATION_FILTER }}
          {{ ASSET_STATUS_FILTER }}
    )
    WHERE rn = 1
),
max_time AS (
    SELECT MAX(LASTSTATUSCHANGEDATE) AS MAX_STATUS_TIME FROM base_data
)
SELECT * FROM (
    SELECT
        rs.RESOURCENAME,
        rs.WORKCENTERNAME,
        rs.RESOURCEFAMILYNAME,
        rs.NEWSTATUSNAME,
        rs.NEWREASONNAME,
        rs.LASTSTATUSCHANGEDATE,
        rs.PJ_DEPARTMENT,
        rs.VENDORNAME,
        rs.VENDORMODEL,
        rs.PJ_ISPRODUCTION,
        rs.PJ_ISKEY,
        rs.PJ_ISMONITOR,
        j.JOBID,
        rs.PJ_LOTID,
        j.JOBORDERNAME,
        j.JOBSTATUS,
        j.SYMPTOMCODENAME,
        j.CAUSECODENAME,
        j.REPAIRCODENAME,
        j.CREATEDATE as JOB_CREATEDATE,
        j.FIRSTCLOCKONDATE,
mt.MAX_STATUS_TIME,
|
||||
ROUND((mt.MAX_STATUS_TIME - rs.LASTSTATUSCHANGEDATE) * 24 * 60, 0) as DOWN_MINUTES,
|
||||
ROW_NUMBER() OVER (
|
||||
ORDER BY
|
||||
CASE rs.NEWSTATUSNAME
|
||||
WHEN 'UDT' THEN 1
|
||||
WHEN 'SDT' THEN 2
|
||||
ELSE 3
|
||||
END,
|
||||
rs.LASTSTATUSCHANGEDATE DESC NULLS LAST
|
||||
) AS rn
|
||||
FROM base_data rs
|
||||
CROSS JOIN max_time mt
|
||||
LEFT JOIN DWH.DW_MES_JOB j ON j.RESOURCEID = rs.RESOURCEID
|
||||
AND j.CREATEDATE = rs.LASTSTATUSCHANGEDATE
|
||||
WHERE {{ WHERE_CLAUSE }}
|
||||
) WHERE rn BETWEEN :start_row AND :end_row
|
||||
src/mes_dashboard/sql/dashboard/workcenter_cards.sql (Normal file)
@@ -0,0 +1,17 @@
-- Workcenter status cards aggregation
-- Placeholders:
-- LATEST_STATUS_SUBQUERY - Base subquery for latest resource status
-- WHERE_CLAUSE - Dynamic WHERE conditions

SELECT
    WORKCENTERNAME,
    COUNT(*) as TOTAL,
    SUM(CASE WHEN NEWSTATUSNAME = 'PRD' THEN 1 ELSE 0 END) as PRD,
    SUM(CASE WHEN NEWSTATUSNAME = 'SBY' THEN 1 ELSE 0 END) as SBY,
    SUM(CASE WHEN NEWSTATUSNAME = 'UDT' THEN 1 ELSE 0 END) as UDT,
    SUM(CASE WHEN NEWSTATUSNAME = 'SDT' THEN 1 ELSE 0 END) as SDT,
    SUM(CASE WHEN NEWSTATUSNAME = 'EGT' THEN 1 ELSE 0 END) as EGT,
    SUM(CASE WHEN NEWSTATUSNAME = 'NST' THEN 1 ELSE 0 END) as NST
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
WHERE {{ WHERE_CLAUSE }}
GROUP BY WORKCENTERNAME
src/mes_dashboard/sql/filters.py (Normal file)
@@ -0,0 +1,287 @@
"""
Common SQL Filters

Provides reusable filter-building methods for common query patterns.
"""

from typing import Any, Dict, List, Optional, Union

from mes_dashboard.config import EXCLUDED_ASSET_STATUSES, EXCLUDED_LOCATIONS
from mes_dashboard.config.constants import EQUIPMENT_FLAG_FILTERS

from .builder import QueryBuilder


# Non-quality hold reasons (canonical source, used by wip_service.py).
# All other hold reasons are considered quality holds.
NON_QUALITY_HOLD_REASONS = {
    "IQC檢驗(久存品驗證)(QC)",
    "大中/安波幅50pcs樣品留樣(PD)",
    "工程驗證(PE)",
    "工程驗證(RD)",
    "指定機台生產",
    "特殊需求(X-Ray全檢)",
    "特殊需求管控",
    "第一次量產QC品質確認(QC)",
    "需綁尾數(PD)",
    "樣品需求留存打樣(樣品)",
    "盤點(收線)需求",
}

class CommonFilters:
    """Common SQL filter builders."""

    # =========================================================
    # Location & Asset Status Filters
    # =========================================================

    @staticmethod
    def add_location_exclusion(
        builder: QueryBuilder,
        column: str = "LOCATIONNAME",
    ) -> QueryBuilder:
        """
        Add location exclusion filter.

        Excludes locations defined in EXCLUDED_LOCATIONS config.
        Allows NULL values.

        Args:
            builder: QueryBuilder instance
            column: Column name (default: LOCATIONNAME)

        Returns:
            QueryBuilder for method chaining
        """
        if EXCLUDED_LOCATIONS:
            builder.add_not_in_condition(column, EXCLUDED_LOCATIONS, allow_null=True)
        return builder

    @staticmethod
    def add_asset_status_exclusion(
        builder: QueryBuilder,
        column: str = "PJ_ASSETSSTATUS",
    ) -> QueryBuilder:
        """
        Add asset status exclusion filter.

        Excludes statuses defined in EXCLUDED_ASSET_STATUSES config.
        Allows NULL values.

        Args:
            builder: QueryBuilder instance
            column: Column name (default: PJ_ASSETSSTATUS)

        Returns:
            QueryBuilder for method chaining
        """
        if EXCLUDED_ASSET_STATUSES:
            builder.add_not_in_condition(
                column, EXCLUDED_ASSET_STATUSES, allow_null=True
            )
        return builder

    # =========================================================
    # WIP Base Filters
    # =========================================================

    @staticmethod
    def add_wip_base_filters(
        builder: QueryBuilder,
        workorder: Optional[str] = None,
        lotid: Optional[str] = None,
        package: Optional[str] = None,
        pj_type: Optional[str] = None,
    ) -> QueryBuilder:
        """
        Add WIP base filters (fuzzy search).

        Args:
            builder: QueryBuilder instance
            workorder: Workorder filter (LIKE %value%)
            lotid: Lot ID filter (LIKE %value%)
            package: Package filter (LIKE %value%)
            pj_type: PJ type filter (LIKE %value%)

        Returns:
            QueryBuilder for method chaining
        """
        if workorder:
            builder.add_like_condition("WORKORDER", workorder)
        if lotid:
            builder.add_like_condition("LOTID", lotid)
        if package:
            builder.add_like_condition("PACKAGE_LEF", package)
        if pj_type:
            builder.add_like_condition("PJ_TYPE", pj_type)
        return builder

    # =========================================================
    # Status Filters
    # =========================================================

    @staticmethod
    def add_status_filter(
        builder: QueryBuilder,
        status: Optional[str] = None,
        statuses: Optional[List[str]] = None,
        column: str = "STATUS",
    ) -> QueryBuilder:
        """
        Add status filter.

        Args:
            builder: QueryBuilder instance
            status: Single status value
            statuses: List of status values
            column: Column name (default: STATUS)

        Returns:
            QueryBuilder for method chaining
        """
        if status:
            builder.add_param_condition(column, status)
        elif statuses:
            builder.add_in_condition(column, statuses)
        return builder

    # =========================================================
    # Hold Type Filters
    # =========================================================

    @staticmethod
    def add_hold_type_filter(
        builder: QueryBuilder,
        hold_type: Optional[str] = None,
        column: str = "HOLDREASONNAME",
    ) -> QueryBuilder:
        """
        Add hold type filter (quality vs non-quality).

        Args:
            builder: QueryBuilder instance
            hold_type: "quality" or "non_quality"
            column: Column name (default: HOLDREASONNAME)

        Returns:
            QueryBuilder for method chaining
        """
        if hold_type == "quality":
            # Quality holds: exclude non-quality reasons
            builder.add_not_in_condition(column, list(NON_QUALITY_HOLD_REASONS))
        elif hold_type == "non_quality":
            # Non-quality holds: only non-quality reasons
            builder.add_in_condition(column, list(NON_QUALITY_HOLD_REASONS))
        return builder

    @staticmethod
    def is_quality_hold(reason: Optional[str]) -> bool:
        """Check if a hold reason is quality-related."""
        return reason not in NON_QUALITY_HOLD_REASONS

    @staticmethod
    def get_non_quality_reasons_sql() -> str:
        """Get non-quality hold reasons as a SQL-safe literal list.

        Used for embedding in SQL CASE expressions where bind parameters
        cannot be used. Values come from a constant set (not user input).

        Returns:
            SQL-safe string for an IN clause, e.g., "'reason1', 'reason2', ..."
        """
        # Escape single quotes in values (replace ' with '')
        escaped = ["'" + r.replace("'", "''") + "'" for r in NON_QUALITY_HOLD_REASONS]
        return ", ".join(escaped)

    # =========================================================
    # Equipment/Resource Filters
    # =========================================================

    @staticmethod
    def add_equipment_filter(
        builder: QueryBuilder,
        resource_ids: Optional[List[str]] = None,
        workcenters: Optional[List[str]] = None,
    ) -> QueryBuilder:
        """
        Add equipment/resource filters.

        Args:
            builder: QueryBuilder instance
            resource_ids: List of resource IDs
            workcenters: List of workcenter names

        Returns:
            QueryBuilder for method chaining
        """
        if resource_ids:
            builder.add_in_condition("RESOURCEID", resource_ids)
        if workcenters:
            builder.add_in_condition("WORKCENTERNAME", workcenters)
        return builder

    @staticmethod
    def add_equipment_flag_filters(
        builder: QueryBuilder,
        filters: Optional[Dict] = None,
    ) -> QueryBuilder:
        """
        Add equipment flag filters (isProduction, isKey, isMonitor).

        These are safe boolean conditions from EQUIPMENT_FLAG_FILTERS config.

        Args:
            builder: QueryBuilder instance
            filters: Dict with flag keys (isProduction, isKey, isMonitor)

        Returns:
            QueryBuilder for method chaining
        """
        if not filters:
            return builder

        for flag_key, sql_condition in EQUIPMENT_FLAG_FILTERS.items():
            if filters.get(flag_key):
                builder.add_condition(sql_condition)

        return builder

    # =========================================================
    # Legacy Compatibility (for core/utils.py wrapper)
    # =========================================================

    @staticmethod
    def build_location_filter_legacy(
        locations: Optional[List[str]] = None,
        excluded_locations: Optional[List[str]] = None,
    ) -> str:
        """
        Build location filter SQL string (legacy format).

        Deprecated: Use add_location_exclusion() with QueryBuilder instead.
        """
        conditions = []
        if locations:
            loc_list = ", ".join(f"'{loc}'" for loc in locations)
            conditions.append(f"LOCATIONNAME IN ({loc_list})")
        if excluded_locations:
            exc_list = ", ".join(f"'{loc}'" for loc in excluded_locations)
            conditions.append(
                f"(LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN ({exc_list}))"
            )
        return " AND ".join(conditions) if conditions else ""

    @staticmethod
    def build_asset_status_filter_legacy(
        excluded_statuses: Optional[List[str]] = None,
    ) -> str:
        """
        Build asset status filter SQL string (legacy format).

        Deprecated: Use add_asset_status_exclusion() with QueryBuilder instead.
        """
        if not excluded_statuses:
            return ""
        exc_list = ", ".join(f"'{s}'" for s in excluded_statuses)
        return f"(PJ_ASSETSSTATUS IS NULL OR PJ_ASSETSSTATUS NOT IN ({exc_list}))"
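The quote-escaping used by `get_non_quality_reasons_sql()` can be sketched standalone; the sample values below are hypothetical, not the real hold reasons:

```python
# Sketch of the escaping used by CommonFilters.get_non_quality_reasons_sql():
# values come from a fixed constant set (never user input), so doubling each
# single quote is enough to form a safe Oracle string-literal list.
def to_sql_literal_list(values):
    """Render trusted constants as "'a', 'b'" for an IN (...) clause."""
    return ", ".join("'" + v.replace("'", "''") + "'" for v in sorted(values))

print(to_sql_literal_list({"O'Brien test", "plain reason"}))
# → 'O''Brien test', 'plain reason'
```

This is safe only because the values are a trusted constant set; user-supplied values must still go through bind variables.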
src/mes_dashboard/sql/loader.py (Normal file)
@@ -0,0 +1,66 @@
"""
SQL File Loader

Provides SQL file loading with LRU caching and structural parameter substitution.
"""

from functools import lru_cache
from pathlib import Path


class SQLLoader:
    """SQL file loader with LRU caching."""

    _sql_dir: Path = Path(__file__).parent

    @classmethod
    @lru_cache(maxsize=100)
    def load(cls, name: str) -> str:
        """
        Load SQL file content.

        Args:
            name: SQL file path without extension, e.g., "wip/summary"

        Returns:
            SQL file content as string

        Raises:
            FileNotFoundError: If SQL file does not exist
        """
        path = cls._sql_dir / f"{name}.sql"
        if not path.exists():
            raise FileNotFoundError(f"SQL file not found: {path}")
        return path.read_text(encoding="utf-8")

    @classmethod
    def load_with_params(cls, name: str, **kwargs) -> str:
        """
        Load SQL file and substitute structural parameters.

        Uses Jinja2-style placeholders: {{ param_name }}
        Only use for structural parameters (table names, column lists),
        NOT for user input values.

        Args:
            name: SQL file path without extension
            **kwargs: Parameters to substitute

        Returns:
            SQL content with substituted parameters
        """
        sql = cls.load(name)
        for key, value in kwargs.items():
            sql = sql.replace(f"{{{{ {key} }}}}", str(value))
        return sql

    @classmethod
    def clear_cache(cls) -> None:
        """Clear the LRU cache."""
        cls.load.cache_clear()

    @classmethod
    def cache_info(cls):
        """Get cache statistics."""
        return cls.load.cache_info()
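A minimal stand-in for `SQLLoader.load_with_params()`, showing the two steps — read a `.sql` file, then substitute `{{ ... }}` structural placeholders via plain string replacement. The file name and placeholder names are illustrative only:

```python
# Sketch of SQLLoader.load_with_params() behavior, using a temp directory in
# place of the package's sql/ directory. Placeholders are for trusted SQL
# fragments only; user values should still be passed as bind variables.
import tempfile
from pathlib import Path

sql_dir = Path(tempfile.mkdtemp())
(sql_dir / "demo.sql").write_text(
    "SELECT * FROM ({{ SUBQUERY }}) rs WHERE 1=1 {{ WHERE_CLAUSE }}",
    encoding="utf-8",
)

def load_with_params(name: str, **kwargs) -> str:
    """Load <name>.sql and replace each {{ key }} with its given fragment."""
    sql = (sql_dir / f"{name}.sql").read_text(encoding="utf-8")
    for key, value in kwargs.items():
        sql = sql.replace(f"{{{{ {key} }}}}", str(value))
    return sql

rendered = load_with_params(
    "demo",
    SUBQUERY="SELECT 1 AS X FROM DUAL",
    WHERE_CLAUSE="AND rs.X = :x",  # a bind name, not a raw value
)
print(rendered)
```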
src/mes_dashboard/sql/resource/by_status.sql (Normal file)
@@ -0,0 +1,11 @@
-- Resource count by status
-- Placeholders:
-- LATEST_STATUS_SUBQUERY - Base subquery for latest resource status

SELECT
    NEWSTATUSNAME,
    COUNT(*) as COUNT
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
WHERE NEWSTATUSNAME IS NOT NULL
GROUP BY NEWSTATUSNAME
ORDER BY COUNT DESC
src/mes_dashboard/sql/resource/by_workcenter.sql (Normal file)
@@ -0,0 +1,12 @@
-- Resource count by workcenter and status
-- Placeholders:
-- LATEST_STATUS_SUBQUERY - Base subquery for latest resource status

SELECT
    WORKCENTERNAME,
    NEWSTATUSNAME,
    COUNT(*) as COUNT
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
WHERE WORKCENTERNAME IS NOT NULL
GROUP BY WORKCENTERNAME, NEWSTATUSNAME
ORDER BY WORKCENTERNAME, COUNT DESC
src/mes_dashboard/sql/resource/detail.sql (Normal file)
@@ -0,0 +1,30 @@
-- Resource detail with pagination
-- Placeholders:
-- LATEST_STATUS_SUBQUERY - Base subquery for latest resource status
-- WHERE_CLAUSE - Dynamic WHERE conditions (e.g., AND ...)
-- Parameters:
-- :start_row - Pagination start row
-- :end_row - Pagination end row

SELECT * FROM (
    SELECT
        RESOURCENAME,
        WORKCENTERNAME,
        RESOURCEFAMILYNAME,
        NEWSTATUSNAME,
        NEWREASONNAME,
        LASTSTATUSCHANGEDATE,
        PJ_DEPARTMENT,
        VENDORNAME,
        VENDORMODEL,
        PJ_ASSETSSTATUS,
        AVAILABILITY,
        PJ_ISPRODUCTION,
        PJ_ISKEY,
        PJ_ISMONITOR,
        ROW_NUMBER() OVER (
            ORDER BY LASTSTATUSCHANGEDATE DESC NULLS LAST
        ) AS rn
    FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
    WHERE 1=1 {{ WHERE_CLAUSE }}
) WHERE rn BETWEEN :start_row AND :end_row
src/mes_dashboard/sql/resource/distinct_statuses.sql (Normal file)
@@ -0,0 +1,19 @@
-- Resource Distinct Statuses Query
-- Returns distinct status names from recent resource status changes
--
-- Parameters:
-- :days_back - Number of days to look back for status changes

WITH latest_txn AS (
    SELECT MAX(COALESCE(TXNDATE, LASTSTATUSCHANGEDATE)) AS MAX_TXNDATE
    FROM DWH.DW_MES_RESOURCESTATUS
)
SELECT DISTINCT s.NEWSTATUSNAME
FROM DWH.DW_MES_RESOURCE r
JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
CROSS JOIN latest_txn lt
WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
       OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
  AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= lt.MAX_TXNDATE - :days_back
  AND s.NEWSTATUSNAME IS NOT NULL
ORDER BY s.NEWSTATUSNAME
src/mes_dashboard/sql/resource/latest_status.sql (Normal file)
@@ -0,0 +1,52 @@
-- Resource Latest Status Query
-- Returns the latest status for each resource using ROW_NUMBER()
--
-- Dynamic placeholders:
-- days_back - Number of days to look back for status changes
-- LOCATION_FILTER - Location exclusion filter (AND ...)
-- ASSET_STATUS_FILTER - Asset status exclusion filter (AND ...)
--
-- Note: This query is designed to be embedded as a subquery (no CTE/WITH clause)
-- The MAX_TXNDATE calculation is done inline using a scalar subquery with KEEP FIRST

SELECT *
FROM (
    SELECT
        r.RESOURCEID,
        r.RESOURCENAME,
        r.OBJECTCATEGORY,
        r.OBJECTTYPE,
        r.RESOURCEFAMILYNAME,
        r.WORKCENTERNAME,
        r.LOCATIONNAME,
        r.VENDORNAME,
        r.VENDORMODEL,
        r.PJ_DEPARTMENT,
        r.PJ_ASSETSSTATUS,
        r.PJ_ISPRODUCTION,
        r.PJ_ISKEY,
        r.PJ_ISMONITOR,
        r.PJ_LOTID,
        r.DESCRIPTION,
        s.NEWSTATUSNAME,
        s.NEWREASONNAME,
        s.LASTSTATUSCHANGEDATE,
        s.OLDSTATUSNAME,
        s.OLDREASONNAME,
        s.AVAILABILITY,
        s.JOBID,
        s.TXNDATE,
        ROW_NUMBER() OVER (
            PARTITION BY r.RESOURCEID
            ORDER BY s.LASTSTATUSCHANGEDATE DESC NULLS LAST,
                     COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) DESC
        ) AS rn
    FROM DWH.DW_MES_RESOURCE r
    JOIN DWH.DW_MES_RESOURCESTATUS s ON r.RESOURCEID = s.HISTORYID
    WHERE ((r.OBJECTCATEGORY = 'ASSEMBLY' AND r.OBJECTTYPE = 'ASSEMBLY')
           OR (r.OBJECTCATEGORY = 'WAFERSORT' AND r.OBJECTTYPE = 'WAFERSORT'))
      AND COALESCE(s.TXNDATE, s.LASTSTATUSCHANGEDATE) >= SYSDATE - {{ days_back }}
      {{ LOCATION_FILTER }}
      {{ ASSET_STATUS_FILTER }}
)
WHERE rn = 1
src/mes_dashboard/sql/resource/status_summary.sql (Normal file)
@@ -0,0 +1,11 @@
-- Resource Status Summary Query
-- Returns aggregate statistics for resources
--
-- This query wraps the latest_status subquery

SELECT
    COUNT(*) as TOTAL_COUNT,
    COUNT(DISTINCT WORKCENTERNAME) as WORKCENTER_COUNT,
    COUNT(DISTINCT RESOURCEFAMILYNAME) as FAMILY_COUNT,
    COUNT(DISTINCT PJ_DEPARTMENT) as DEPT_COUNT
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
src/mes_dashboard/sql/resource/workcenter_status_matrix.sql (Normal file)
@@ -0,0 +1,33 @@
-- Resource workcenter × status matrix
-- Placeholders:
-- LATEST_STATUS_SUBQUERY - Base subquery for latest resource status

SELECT
    WORKCENTERNAME,
    CASE NEWSTATUSNAME
        WHEN 'PRD' THEN 'PRD'
        WHEN 'SBY' THEN 'SBY'
        WHEN 'UDT' THEN 'UDT'
        WHEN 'SDT' THEN 'SDT'
        WHEN 'EGT' THEN 'EGT'
        WHEN 'NST' THEN 'NST'
        WHEN 'SCRAP' THEN 'SCRAP'
        ELSE 'OTHER'
    END as STATUS_CATEGORY,
    NEWSTATUSNAME,
    COUNT(*) as COUNT
FROM ({{ LATEST_STATUS_SUBQUERY }}) rs
WHERE WORKCENTERNAME IS NOT NULL
GROUP BY WORKCENTERNAME,
         CASE NEWSTATUSNAME
             WHEN 'PRD' THEN 'PRD'
             WHEN 'SBY' THEN 'SBY'
             WHEN 'UDT' THEN 'UDT'
             WHEN 'SDT' THEN 'SDT'
             WHEN 'EGT' THEN 'EGT'
             WHEN 'NST' THEN 'NST'
             WHEN 'SCRAP' THEN 'SCRAP'
             ELSE 'OTHER'
         END,
         NEWSTATUSNAME
ORDER BY WORKCENTERNAME, STATUS_CATEGORY
src/mes_dashboard/sql/resource_history/detail.sql (Normal file)
@@ -0,0 +1,27 @@
-- Detail Query for Resource History
-- Aggregates status hours by resource for detail table and CSV export
-- Placeholders:
-- HISTORYID_FILTER - Resource ID filter condition (e.g., HISTORYID IN (...))
-- Parameters:
-- :start_date - Start date (YYYY-MM-DD)
-- :end_date - End date (YYYY-MM-DD)

WITH shift_data AS (
    SELECT /*+ MATERIALIZE */ HISTORYID, OLDSTATUSNAME, HOURS
    FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
    WHERE TXNDATE >= TO_DATE(:start_date, 'YYYY-MM-DD')
      AND TXNDATE < TO_DATE(:end_date, 'YYYY-MM-DD') + 1
      AND {{ HISTORYID_FILTER }}
)
SELECT
    HISTORYID,
    SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
    SUM(HOURS) as TOTAL_HOURS
FROM shift_data
GROUP BY HISTORYID
ORDER BY HISTORYID
src/mes_dashboard/sql/resource_history/heatmap.sql (Normal file)
@@ -0,0 +1,27 @@
-- Heatmap Query for Resource History
-- Aggregates status hours by resource and date for heatmap visualization
-- Placeholders:
-- HISTORYID_FILTER - Resource ID filter condition (e.g., HISTORYID IN (...))
-- DATE_TRUNC - Date truncation expression (e.g., TRUNC(TXNDATE, 'MM'))
-- Parameters:
-- :start_date - Start date (YYYY-MM-DD)
-- :end_date - End date (YYYY-MM-DD)

WITH shift_data AS (
    SELECT /*+ MATERIALIZE */ HISTORYID, TXNDATE, OLDSTATUSNAME, HOURS
    FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
    WHERE TXNDATE >= TO_DATE(:start_date, 'YYYY-MM-DD')
      AND TXNDATE < TO_DATE(:end_date, 'YYYY-MM-DD') + 1
      AND {{ HISTORYID_FILTER }}
)
SELECT
    HISTORYID,
    {{ DATE_TRUNC }} as DATA_DATE,
    SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS
FROM shift_data
GROUP BY HISTORYID, {{ DATE_TRUNC }}
ORDER BY HISTORYID, DATA_DATE
src/mes_dashboard/sql/resource_history/kpi.sql (Normal file)
@@ -0,0 +1,24 @@
-- KPI Query for Resource History
-- Aggregates status hours across all filtered resources
-- Placeholders:
-- HISTORYID_FILTER - Resource ID filter condition (e.g., HISTORYID IN (...))
-- Parameters:
-- :start_date - Start date (YYYY-MM-DD)
-- :end_date - End date (YYYY-MM-DD)

WITH shift_data AS (
    SELECT /*+ MATERIALIZE */ HISTORYID, TXNDATE, OLDSTATUSNAME, HOURS
    FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
    WHERE TXNDATE >= TO_DATE(:start_date, 'YYYY-MM-DD')
      AND TXNDATE < TO_DATE(:end_date, 'YYYY-MM-DD') + 1
      AND {{ HISTORYID_FILTER }}
)
SELECT
    SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
    COUNT(DISTINCT HISTORYID) as MACHINE_COUNT
FROM shift_data
src/mes_dashboard/sql/resource_history/trend.sql (Normal file)
@@ -0,0 +1,28 @@
-- Trend Query for Resource History
-- Aggregates status hours by date for trend visualization
-- Placeholders:
-- HISTORYID_FILTER - Resource ID filter condition (e.g., HISTORYID IN (...))
-- DATE_TRUNC - Date truncation expression (e.g., TRUNC(TXNDATE, 'MM'))
-- Parameters:
-- :start_date - Start date (YYYY-MM-DD)
-- :end_date - End date (YYYY-MM-DD)

WITH shift_data AS (
    SELECT /*+ MATERIALIZE */ HISTORYID, TXNDATE, OLDSTATUSNAME, HOURS
    FROM DWH.DW_MES_RESOURCESTATUS_SHIFT
    WHERE TXNDATE >= TO_DATE(:start_date, 'YYYY-MM-DD')
      AND TXNDATE < TO_DATE(:end_date, 'YYYY-MM-DD') + 1
      AND {{ HISTORYID_FILTER }}
)
SELECT
    {{ DATE_TRUNC }} as DATA_DATE,
    SUM(CASE WHEN OLDSTATUSNAME = 'PRD' THEN HOURS ELSE 0 END) as PRD_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SBY' THEN HOURS ELSE 0 END) as SBY_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'UDT' THEN HOURS ELSE 0 END) as UDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'SDT' THEN HOURS ELSE 0 END) as SDT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'EGT' THEN HOURS ELSE 0 END) as EGT_HOURS,
    SUM(CASE WHEN OLDSTATUSNAME = 'NST' THEN HOURS ELSE 0 END) as NST_HOURS,
    COUNT(DISTINCT HISTORYID) as MACHINE_COUNT
FROM shift_data
GROUP BY {{ DATE_TRUNC }}
ORDER BY DATA_DATE
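The resource_history queries split trusted structure from user data: `{{ DATE_TRUNC }}` and `{{ HISTORYID_FILTER }}` are rendered into the SQL text as trusted fragments, while dates and IDs travel separately as Oracle bind variables. A sketch, using a trimmed hypothetical template rather than the real file:

```python
# Trusted fragments are substituted into the SQL text; user-supplied values
# stay out of the text and go through binds. The template below is a
# simplified, hypothetical version of trend.sql.
TEMPLATE = (
    "SELECT {{ DATE_TRUNC }} as DATA_DATE, SUM(HOURS) as TOTAL_HOURS\n"
    "FROM DWH.DW_MES_RESOURCESTATUS_SHIFT\n"
    "WHERE TXNDATE >= TO_DATE(:start_date, 'YYYY-MM-DD')\n"
    "  AND {{ HISTORYID_FILTER }}\n"
    "GROUP BY {{ DATE_TRUNC }}"
)

def render(template: str, **placeholders) -> str:
    """Replace {{ key }} placeholders with trusted SQL fragments."""
    for key, value in placeholders.items():
        template = template.replace(f"{{{{ {key} }}}}", str(value))
    return template

sql = render(
    TEMPLATE,
    DATE_TRUNC="TRUNC(TXNDATE)",
    HISTORYID_FILTER="HISTORYID IN (:id0, :id1)",  # bind names, never raw values
)
binds = {"start_date": "2026-01-01", "id0": "R-001", "id1": "R-002"}
# cursor.execute(sql, binds)  # hypothetical cursor; values never enter the SQL text
print(sql)
```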
src/mes_dashboard/sql/wip/detail.sql (Normal file)
@@ -0,0 +1,30 @@
-- WIP Detail Query
-- Returns paginated lot details for a specific workcenter group
--
-- Uses ROW_NUMBER() for efficient pagination
--
-- Parameters:
-- :offset - Starting row offset (0-based)
-- :limit - Number of rows to return
--
-- Dynamic placeholders:
-- WHERE_CLAUSE - Filter conditions

SELECT * FROM (
    SELECT
        LOTID,
        EQUIPMENTS,
        STATUS,
        HOLDREASONNAME,
        QTY,
        PACKAGE_LEF,
        SPECNAME,
        CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN 'RUN'
             WHEN COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN 'HOLD'
             ELSE 'QUEUE' END AS WIP_STATUS,
        ROW_NUMBER() OVER (ORDER BY LOTID) as RN
    FROM DWH.DW_MES_LOT_V
    {{ WHERE_CLAUSE }}
)
WHERE RN > :offset AND RN <= :offset + :limit
ORDER BY RN
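The bind arithmetic for the `RN > :offset AND RN <= :offset + :limit` scheme above can be sketched as follows; the helper name and its caller-side page numbers are hypothetical:

```python
# A 1-based page number maps onto the 0-based :offset / :limit binds used by
# wip/detail.sql's ROW_NUMBER() pagination.
def page_to_binds(page: int, page_size: int) -> dict:
    """Translate a 1-based page number into pagination bind variables."""
    return {"offset": (page - 1) * page_size, "limit": page_size}

print(page_to_binds(1, 50))  # rows 1..50
print(page_to_binds(3, 50))  # rows 101..150
```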
src/mes_dashboard/sql/wip/matrix.sql (Normal file)
@@ -0,0 +1,18 @@
-- WIP Matrix Query
-- Returns workcenter x product line (package) matrix
--
-- Aggregates QTY by WORKCENTER_GROUP and PACKAGE_LEF
-- Used for the overview dashboard matrix visualization
--
-- Dynamic placeholders:
-- WHERE_CLAUSE - Filter conditions including status and hold type

SELECT
    WORKCENTER_GROUP,
    WORKCENTERSEQUENCE_GROUP,
    PACKAGE_LEF,
    SUM(QTY) as QTY
FROM DWH.DW_MES_LOT_V
{{ WHERE_CLAUSE }}
GROUP BY WORKCENTER_GROUP, WORKCENTERSEQUENCE_GROUP, PACKAGE_LEF
ORDER BY WORKCENTERSEQUENCE_GROUP, PACKAGE_LEF
src/mes_dashboard/sql/wip/summary.sql (Normal file)
@@ -0,0 +1,48 @@
-- WIP Summary Query
-- Returns overall WIP KPI statistics
--
-- WIP Status Logic:
-- RUN: EQUIPMENTCOUNT > 0
-- HOLD: EQUIPMENTCOUNT = 0 AND CURRENTHOLDCOUNT > 0
-- QUEUE: EQUIPMENTCOUNT = 0 AND CURRENTHOLDCOUNT = 0
--
-- Hold Type Logic:
-- Quality Hold: Not in NON_QUALITY_HOLD_REASONS
-- Non-Quality Hold: In NON_QUALITY_HOLD_REASONS
--
-- Dynamic placeholders:
-- WHERE_CLAUSE - Filter conditions
-- NON_QUALITY_REASONS - List of non-quality hold reason values

SELECT
    COUNT(*) as TOTAL_LOTS,
    SUM(QTY) as TOTAL_QTY_PCS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN 1 ELSE 0 END) as RUN_LOTS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) > 0 THEN QTY ELSE 0 END) as RUN_QTY_PCS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN 1 ELSE 0 END) as HOLD_LOTS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0 THEN QTY ELSE 0 END) as HOLD_QTY_PCS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
             AND (HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({{ NON_QUALITY_REASONS }}))
             THEN 1 ELSE 0 END) as QUALITY_HOLD_LOTS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
             AND (HOLDREASONNAME IS NULL OR HOLDREASONNAME NOT IN ({{ NON_QUALITY_REASONS }}))
             THEN QTY ELSE 0 END) as QUALITY_HOLD_QTY_PCS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
             AND HOLDREASONNAME IN ({{ NON_QUALITY_REASONS }})
             THEN 1 ELSE 0 END) as NON_QUALITY_HOLD_LOTS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) > 0
             AND HOLDREASONNAME IN ({{ NON_QUALITY_REASONS }})
             THEN QTY ELSE 0 END) as NON_QUALITY_HOLD_QTY_PCS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) = 0 THEN 1 ELSE 0 END) as QUEUE_LOTS,
    SUM(CASE WHEN COALESCE(EQUIPMENTCOUNT, 0) = 0
             AND COALESCE(CURRENTHOLDCOUNT, 0) = 0 THEN QTY ELSE 0 END) as QUEUE_QTY_PCS,
    MAX(SYS_DATE) as DATA_UPDATE_DATE
FROM DWH.DW_MES_LOT_V
{{ WHERE_CLAUSE }}
@@ -86,7 +86,7 @@ class StressTestResult:
 @pytest.fixture(scope="session")
 def base_url() -> str:
     """Get the base URL for stress testing."""
-    return os.environ.get('STRESS_TEST_URL', 'http://127.0.0.1:5000')
+    return os.environ.get('STRESS_TEST_URL', 'http://127.0.0.1:8080')


 @pytest.fixture(scope="session")
tests/test_common_filters.py — new file, 186 lines
@@ -0,0 +1,186 @@
"""Tests for Common Filters."""

import pytest
from unittest.mock import patch

from mes_dashboard.sql.builder import QueryBuilder
from mes_dashboard.sql.filters import CommonFilters, NON_QUALITY_HOLD_REASONS


class TestCommonFilters:
    """Test CommonFilters class."""

    def test_add_location_exclusion(self):
        """Test location exclusion filter."""
        builder = QueryBuilder()

        with patch(
            "mes_dashboard.sql.filters.EXCLUDED_LOCATIONS", ["ATEC", "F區"]
        ):
            CommonFilters.add_location_exclusion(builder)

        assert len(builder.conditions) == 1
        assert "LOCATIONNAME IS NULL OR LOCATIONNAME NOT IN" in builder.conditions[0]
        assert builder.params["p0"] == "ATEC"
        assert builder.params["p1"] == "F區"

    def test_add_location_exclusion_empty(self):
        """Test location exclusion with empty list."""
        builder = QueryBuilder()

        with patch("mes_dashboard.sql.filters.EXCLUDED_LOCATIONS", []):
            CommonFilters.add_location_exclusion(builder)

        assert len(builder.conditions) == 0

    def test_add_location_exclusion_custom_column(self):
        """Test location exclusion with custom column name."""
        builder = QueryBuilder()

        with patch(
            "mes_dashboard.sql.filters.EXCLUDED_LOCATIONS", ["TEST"]
        ):
            CommonFilters.add_location_exclusion(builder, column="LOC_NAME")

        assert "LOC_NAME IS NULL OR LOC_NAME NOT IN" in builder.conditions[0]

    def test_add_asset_status_exclusion(self):
        """Test asset status exclusion filter."""
        builder = QueryBuilder()

        with patch(
            "mes_dashboard.sql.filters.EXCLUDED_ASSET_STATUSES", ["報廢", "閒置"]
        ):
            CommonFilters.add_asset_status_exclusion(builder)

        assert len(builder.conditions) == 1
        assert "PJ_ASSETSSTATUS IS NULL OR PJ_ASSETSSTATUS NOT IN" in builder.conditions[0]

    def test_add_asset_status_exclusion_empty(self):
        """Test asset status exclusion with empty list."""
        builder = QueryBuilder()

        with patch("mes_dashboard.sql.filters.EXCLUDED_ASSET_STATUSES", []):
            CommonFilters.add_asset_status_exclusion(builder)

        assert len(builder.conditions) == 0

    def test_add_wip_base_filters_workorder(self):
        """Test WIP base filter for workorder."""
        builder = QueryBuilder()
        CommonFilters.add_wip_base_filters(builder, workorder="WO123")

        assert len(builder.conditions) == 1
        assert "WORKORDER LIKE" in builder.conditions[0]
        assert "%WO123%" in builder.params["p0"]

    def test_add_wip_base_filters_lotid(self):
        """Test WIP base filter for lot ID."""
        builder = QueryBuilder()
        CommonFilters.add_wip_base_filters(builder, lotid="LOT001")

        assert len(builder.conditions) == 1
        assert "LOTID LIKE" in builder.conditions[0]

    def test_add_wip_base_filters_multiple(self):
        """Test WIP base filter with multiple parameters."""
        builder = QueryBuilder()
        CommonFilters.add_wip_base_filters(
            builder, workorder="WO", package="PKG", pj_type="TYPE"
        )

        assert len(builder.conditions) == 3
        assert any("WORKORDER LIKE" in c for c in builder.conditions)
        assert any("PACKAGE_LEF LIKE" in c for c in builder.conditions)
        assert any("PJ_TYPE LIKE" in c for c in builder.conditions)

    def test_add_status_filter_single(self):
        """Test status filter with single status."""
        builder = QueryBuilder()
        CommonFilters.add_status_filter(builder, status="HOLD")

        assert len(builder.conditions) == 1
        assert "STATUS = :p0" in builder.conditions[0]
        assert builder.params["p0"] == "HOLD"

    def test_add_status_filter_multiple(self):
        """Test status filter with multiple statuses."""
        builder = QueryBuilder()
        CommonFilters.add_status_filter(builder, statuses=["RUN", "QUEUE"])

        assert len(builder.conditions) == 1
        assert "STATUS IN (:p0, :p1)" in builder.conditions[0]
        assert builder.params["p0"] == "RUN"
        assert builder.params["p1"] == "QUEUE"

    def test_add_hold_type_filter_quality(self):
        """Test hold type filter for quality holds."""
        builder = QueryBuilder()
        CommonFilters.add_hold_type_filter(builder, hold_type="quality")

        assert len(builder.conditions) == 1
        assert "HOLDREASONNAME NOT IN" in builder.conditions[0]

    def test_add_hold_type_filter_non_quality(self):
        """Test hold type filter for non-quality holds."""
        builder = QueryBuilder()
        CommonFilters.add_hold_type_filter(builder, hold_type="non_quality")

        assert len(builder.conditions) == 1
        assert "HOLDREASONNAME IN" in builder.conditions[0]

    def test_is_quality_hold(self):
        """Test is_quality_hold helper function."""
        # Quality hold (not in non-quality list)
        assert CommonFilters.is_quality_hold("品質異常") is True

        # Non-quality hold (in list)
        non_quality_reason = list(NON_QUALITY_HOLD_REASONS)[0]
        assert CommonFilters.is_quality_hold(non_quality_reason) is False

    def test_add_equipment_filter_resource_ids(self):
        """Test equipment filter with resource IDs."""
        builder = QueryBuilder()
        CommonFilters.add_equipment_filter(builder, resource_ids=["R001", "R002"])

        assert len(builder.conditions) == 1
        assert "RESOURCEID IN" in builder.conditions[0]

    def test_add_equipment_filter_workcenters(self):
        """Test equipment filter with workcenters."""
        builder = QueryBuilder()
        CommonFilters.add_equipment_filter(builder, workcenters=["WC1", "WC2"])

        assert len(builder.conditions) == 1
        assert "WORKCENTERNAME IN" in builder.conditions[0]

    def test_build_location_filter_legacy(self):
        """Test legacy location filter builder."""
        result = CommonFilters.build_location_filter_legacy(
            locations=["LOC1", "LOC2"],
            excluded_locations=["EXC1"],
        )

        assert "LOCATIONNAME IN ('LOC1', 'LOC2')" in result
        assert "LOCATIONNAME NOT IN ('EXC1')" in result

    def test_build_asset_status_filter_legacy(self):
        """Test legacy asset status filter builder."""
        result = CommonFilters.build_asset_status_filter_legacy(
            excluded_statuses=["報廢", "閒置"]
        )

        assert "PJ_ASSETSSTATUS NOT IN" in result
        assert "'報廢'" in result
        assert "'閒置'" in result

    def test_build_asset_status_filter_legacy_empty(self):
        """Test legacy asset status filter with empty list."""
        result = CommonFilters.build_asset_status_filter_legacy(excluded_statuses=[])

        assert result == ""

    def test_non_quality_hold_reasons_exists(self):
        """Test that NON_QUALITY_HOLD_REASONS is defined and has content."""
        assert len(NON_QUALITY_HOLD_REASONS) > 0
        assert isinstance(NON_QUALITY_HOLD_REASONS, set)
@@ -456,30 +456,42 @@ class TestRefreshCache:
         mock_sync.assert_called_once()


-class TestBuildFilterSql:
-    """Test _build_filter_sql function."""
+class TestBuildFilterBuilder:
+    """Test _build_filter_builder function."""

     def test_includes_equipment_type_filter(self):
         """Test includes equipment type filter."""
         import mes_dashboard.services.resource_cache as rc

-        sql = rc._build_filter_sql()
+        builder = rc._build_filter_builder()
+        builder.base_sql = "SELECT * FROM DWH.DW_MES_RESOURCE {{ WHERE_CLAUSE }}"
+        sql, params = builder.build()

         assert 'OBJECTCATEGORY' in sql
         assert 'ASSEMBLY' in sql or 'WAFERSORT' in sql

     def test_includes_location_filter(self):
-        """Test includes location exclusion filter."""
+        """Test includes location exclusion filter with parameterization."""
         import mes_dashboard.services.resource_cache as rc

-        sql = rc._build_filter_sql()
+        builder = rc._build_filter_builder()
+        builder.base_sql = "SELECT * FROM DWH.DW_MES_RESOURCE {{ WHERE_CLAUSE }}"
+        sql, params = builder.build()

+        # Check SQL contains LOCATIONNAME condition
         assert 'LOCATIONNAME' in sql
+        # Parameterized query should have bind variables
+        assert len(params) > 0

     def test_includes_asset_status_filter(self):
-        """Test includes asset status exclusion filter."""
+        """Test includes asset status exclusion filter with parameterization."""
         import mes_dashboard.services.resource_cache as rc

-        sql = rc._build_filter_sql()
+        builder = rc._build_filter_builder()
+        builder.base_sql = "SELECT * FROM DWH.DW_MES_RESOURCE {{ WHERE_CLAUSE }}"
+        sql, params = builder.build()

+        # Check SQL contains PJ_ASSETSSTATUS condition
         assert 'PJ_ASSETSSTATUS' in sql
+        # Parameterized query should have bind variables
+        assert len(params) > 0
@@ -24,7 +24,7 @@ from mes_dashboard.services.resource_history_service import (
     _calc_ou_pct,
     _calc_availability_pct,
     _build_kpi_from_df,
-    _build_detail_from_df,
+    _build_detail_from_raw_df,
     MAX_QUERY_DAYS,
 )
@@ -62,7 +62,7 @@ class TestGetDateTrunc(unittest.TestCase):
     def test_day_granularity(self):
         """Day granularity should use TRUNC without format."""
         result = _get_date_trunc('day')
-        self.assertIn('TRUNC(ss.TXNDATE)', result)
+        self.assertIn('TRUNC(TXNDATE)', result)
         self.assertNotIn('IW', result)

     def test_week_granularity(self):
@@ -83,7 +83,7 @@ class TestGetDateTrunc(unittest.TestCase):
     def test_unknown_granularity(self):
         """Unknown granularity should default to day."""
         result = _get_date_trunc('unknown')
-        self.assertIn('TRUNC(ss.TXNDATE)', result)
+        self.assertIn('TRUNC(TXNDATE)', result)
         self.assertNotIn("'IW'", result)

@@ -209,15 +209,18 @@ class TestBuildDetailFromDf(unittest.TestCase):
     def test_empty_dataframe(self):
         """Empty DataFrame should return empty list."""
         df = pd.DataFrame()
-        result = _build_detail_from_df(df)
+        resource_lookup = {}
+        result = _build_detail_from_raw_df(df, resource_lookup)
         self.assertEqual(result, [])

-    def test_normal_dataframe(self):
+    @patch('mes_dashboard.services.filter_cache.get_workcenter_mapping')
+    def test_normal_dataframe(self, mock_wc_mapping):
         """Normal DataFrame should build correct detail data."""
+        mock_wc_mapping.return_value = {
+            'WC01': {'group': 'Group01', 'sequence': 1}
+        }
         df = pd.DataFrame([{
             'WORKCENTERNAME': 'WC01',
             'RESOURCEFAMILYNAME': 'FAM01',
             'RESOURCENAME': 'RES01',
+            'HISTORYID': 'RES01',
             'PRD_HOURS': 80,
             'SBY_HOURS': 10,
             'UDT_HOURS': 5,
@@ -226,10 +229,18 @@ class TestBuildDetailFromDf(unittest.TestCase):
             'NST_HOURS': 10,
             'TOTAL_HOURS': 110
         }])
-        result = _build_detail_from_df(df)
+        resource_lookup = {
+            'RES01': {
+                'RESOURCEID': 'RES01',
+                'WORKCENTERNAME': 'WC01',
+                'RESOURCEFAMILYNAME': 'FAM01',
+                'RESOURCENAME': 'RES01'
+            }
+        }
+        result = _build_detail_from_raw_df(df, resource_lookup)

         self.assertEqual(len(result), 1)
-        self.assertEqual(result[0]['workcenter'], 'WC01')
+        self.assertEqual(result[0]['workcenter'], 'Group01')
         self.assertEqual(result[0]['family'], 'FAM01')
         self.assertEqual(result[0]['resource'], 'RES01')
         self.assertEqual(result[0]['machine_count'], 1)
@@ -350,14 +361,23 @@ class TestQueryDetail(unittest.TestCase):
         self.assertIsNotNone(result)
         self.assertIn('error', result)

+    @patch('mes_dashboard.services.filter_cache.get_workcenter_mapping')
+    @patch('mes_dashboard.services.resource_history_service._get_filtered_resources')
     @patch('mes_dashboard.services.resource_history_service.read_sql_df')
-    def test_successful_query(self, mock_read_sql):
+    def test_successful_query(self, mock_read_sql, mock_get_resources, mock_wc_mapping):
         """Successful query should return data with total count."""
-        # Mock detail query
+        # Mock filtered resources
+        mock_get_resources.return_value = [
+            {'RESOURCEID': 'RES01', 'WORKCENTERNAME': 'WC01',
+             'RESOURCEFAMILYNAME': 'FAM01', 'RESOURCENAME': 'RES01'}
+        ]
+        mock_wc_mapping.return_value = {
+            'WC01': {'group': 'Group01', 'sequence': 1}
+        }
+
+        # Mock detail query with HISTORYID column
         detail_df = pd.DataFrame([{
             'WORKCENTERNAME': 'WC01',
             'RESOURCEFAMILYNAME': 'FAM01',
             'RESOURCENAME': 'RES01',
+            'HISTORYID': 'RES01',
             'PRD_HOURS': 80, 'SBY_HOURS': 10, 'UDT_HOURS': 5,
             'SDT_HOURS': 3, 'EGT_HOURS': 2, 'NST_HOURS': 10,
             'TOTAL_HOURS': 110
tests/test_sql_builder.py — new file, 238 lines
@@ -0,0 +1,238 @@
"""Tests for Query Builder."""

import pytest

from mes_dashboard.sql.builder import QueryBuilder


class TestQueryBuilder:
    """Test QueryBuilder class."""

    def test_add_param_condition(self):
        """Test adding a parameterized condition."""
        builder = QueryBuilder()
        builder.add_param_condition("status", "RUN")

        assert len(builder.conditions) == 1
        assert "status = :p0" in builder.conditions[0]
        assert builder.params["p0"] == "RUN"

    def test_add_param_condition_with_operator(self):
        """Test adding a parameterized condition with custom operator."""
        builder = QueryBuilder()
        builder.add_param_condition("count", 10, operator=">=")

        assert "count >= :p0" in builder.conditions[0]
        assert builder.params["p0"] == 10

    def test_add_in_condition(self):
        """Test adding an IN condition."""
        builder = QueryBuilder()
        builder.add_in_condition("status", ["RUN", "QUEUE", "HOLD"])

        assert len(builder.conditions) == 1
        assert "status IN (:p0, :p1, :p2)" in builder.conditions[0]
        assert builder.params["p0"] == "RUN"
        assert builder.params["p1"] == "QUEUE"
        assert builder.params["p2"] == "HOLD"

    def test_add_in_condition_empty_list(self):
        """Test that empty list doesn't add condition."""
        builder = QueryBuilder()
        builder.add_in_condition("status", [])

        assert len(builder.conditions) == 0
        assert len(builder.params) == 0

    def test_add_not_in_condition(self):
        """Test adding a NOT IN condition."""
        builder = QueryBuilder()
        builder.add_not_in_condition("location", ["ATEC", "F區"])

        assert len(builder.conditions) == 1
        assert "location NOT IN (:p0, :p1)" in builder.conditions[0]
        assert builder.params["p0"] == "ATEC"
        assert builder.params["p1"] == "F區"

    def test_add_not_in_condition_with_null(self):
        """Test NOT IN condition allowing NULL values."""
        builder = QueryBuilder()
        builder.add_not_in_condition("location", ["ATEC"], allow_null=True)

        assert len(builder.conditions) == 1
        assert "(location IS NULL OR location NOT IN (:p0))" in builder.conditions[0]

    def test_add_like_condition_both(self):
        """Test LIKE condition with wildcards on both sides."""
        builder = QueryBuilder()
        builder.add_like_condition("name", "test")

        assert "name LIKE :p0 ESCAPE '\\'" in builder.conditions[0]
        assert builder.params["p0"] == "%test%"

    def test_add_like_condition_start(self):
        """Test LIKE condition with wildcard at end only."""
        builder = QueryBuilder()
        builder.add_like_condition("name", "prefix", position="start")

        assert builder.params["p0"] == "prefix%"

    def test_add_like_condition_end(self):
        """Test LIKE condition with wildcard at start only."""
        builder = QueryBuilder()
        builder.add_like_condition("name", "suffix", position="end")

        assert builder.params["p0"] == "%suffix"

    def test_add_like_condition_escapes_wildcards(self):
        """Test that LIKE condition escapes SQL wildcards."""
        builder = QueryBuilder()
        builder.add_like_condition("name", "test%value")

        assert builder.params["p0"] == "%test\\%value%"

    def test_add_like_condition_escapes_underscore(self):
        """Test that LIKE condition escapes underscores."""
        builder = QueryBuilder()
        builder.add_like_condition("name", "test_value")

        assert builder.params["p0"] == "%test\\_value%"

    def test_build_with_conditions(self):
        """Test building SQL with multiple conditions."""
        builder = QueryBuilder("SELECT * FROM t {{ WHERE_CLAUSE }}")
        builder.add_param_condition("status", "RUN")
        builder.add_in_condition("type", ["A", "B"])

        sql, params = builder.build()

        assert "WHERE" in sql
        assert "status = :p0" in sql
        assert "type IN (:p1, :p2)" in sql
        assert "AND" in sql
        assert params["p0"] == "RUN"
        assert params["p1"] == "A"
        assert params["p2"] == "B"

    def test_build_without_conditions(self):
        """Test building SQL with no conditions."""
        builder = QueryBuilder("SELECT * FROM t {{ WHERE_CLAUSE }}")
        sql, params = builder.build()

        assert "WHERE" not in sql
        assert "{{ WHERE_CLAUSE }}" not in sql
        assert params == {}

    def test_build_where_only(self):
        """Test building only the WHERE clause."""
        builder = QueryBuilder()
        builder.add_param_condition("status", "RUN")

        where_clause, params = builder.build_where_only()

        assert where_clause.startswith("WHERE")
        assert "status = :p0" in where_clause

    def test_get_conditions_sql(self):
        """Test getting conditions as string."""
        builder = QueryBuilder()
        builder.add_param_condition("a", 1)
        builder.add_param_condition("b", 2)

        conditions = builder.get_conditions_sql()

        assert "a = :p0 AND b = :p1" == conditions

    def test_reset(self):
        """Test resetting the builder."""
        builder = QueryBuilder("SELECT * FROM t")
        builder.add_param_condition("status", "RUN")
        builder.reset()

        assert len(builder.conditions) == 0
        assert len(builder.params) == 0
        assert builder._param_counter == 0
        assert builder.base_sql == "SELECT * FROM t"

    def test_method_chaining(self):
        """Test that methods support chaining."""
        builder = (
            QueryBuilder("SELECT * FROM t {{ WHERE_CLAUSE }}")
            .add_param_condition("status", "RUN")
            .add_in_condition("type", ["A", "B"])
            .add_like_condition("name", "test")
        )

        assert len(builder.conditions) == 3

    def test_add_is_null(self):
        """Test adding IS NULL condition."""
        builder = QueryBuilder()
        builder.add_is_null("deleted_at")

        assert "deleted_at IS NULL" in builder.conditions[0]

    def test_add_is_not_null(self):
        """Test adding IS NOT NULL condition."""
        builder = QueryBuilder()
        builder.add_is_not_null("updated_at")

        assert "updated_at IS NOT NULL" in builder.conditions[0]

    def test_add_condition_fixed(self):
        """Test adding a fixed condition."""
        builder = QueryBuilder()
        builder.add_condition("1=1")

        assert "1=1" in builder.conditions[0]
        assert len(builder.params) == 0

    def test_add_or_like_conditions(self):
        """Test adding multiple LIKE conditions combined with OR."""
        builder = QueryBuilder()
        builder.add_or_like_conditions("name", ["foo", "bar", "baz"])

        assert len(builder.conditions) == 1
        condition = builder.conditions[0]
        assert "name LIKE :p0 ESCAPE '\\'" in condition
        assert "name LIKE :p1 ESCAPE '\\'" in condition
        assert "name LIKE :p2 ESCAPE '\\'" in condition
        assert " OR " in condition
        assert condition.startswith("(")
        assert condition.endswith(")")
        assert builder.params["p0"] == "%foo%"
        assert builder.params["p1"] == "%bar%"
        assert builder.params["p2"] == "%baz%"

    def test_add_or_like_conditions_case_insensitive(self):
        """Test OR LIKE conditions with case insensitive matching."""
        builder = QueryBuilder()
        builder.add_or_like_conditions("name", ["Foo", "BAR"], case_insensitive=True)

        condition = builder.conditions[0]
        assert "UPPER(name)" in condition
        assert builder.params["p0"] == "%FOO%"
        assert builder.params["p1"] == "%BAR%"

    def test_add_or_like_conditions_escapes_wildcards(self):
        """Test OR LIKE conditions escape SQL wildcards."""
        builder = QueryBuilder()
        builder.add_or_like_conditions("name", ["test%val", "foo_bar"])

        assert builder.params["p0"] == "%test\\%val%"
        assert builder.params["p1"] == "%foo\\_bar%"

    def test_add_or_like_conditions_empty_list(self):
        """Test that empty list doesn't add condition."""
        builder = QueryBuilder()
        builder.add_or_like_conditions("name", [])

        assert len(builder.conditions) == 0
        assert len(builder.params) == 0

    def test_add_or_like_conditions_position(self):
        """Test OR LIKE conditions with different positions."""
        builder = QueryBuilder()
        builder.add_or_like_conditions("name", ["test"], position="start")

        assert builder.params["p0"] == "test%"
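The tests above pin down the builder's public contract (auto-numbered `:pN` bind variables, a `conditions` list, a `{{ WHERE_CLAUSE }}` placeholder). A minimal sketch consistent with that contract — internals are assumptions, and the real `src/mes_dashboard/sql/builder.py` may differ in detail:

```python
class QueryBuilder:
    """Hypothetical minimal QueryBuilder: collects parameterized conditions."""

    def __init__(self, base_sql=""):
        self.base_sql = base_sql
        self.conditions = []
        self.params = {}
        self._param_counter = 0

    def _next_param(self, value):
        # Allocate the next :pN bind variable and record its value.
        name = f"p{self._param_counter}"
        self._param_counter += 1
        self.params[name] = value
        return name

    def add_param_condition(self, column, value, operator="="):
        self.conditions.append(f"{column} {operator} :{self._next_param(value)}")
        return self  # enable method chaining

    def add_in_condition(self, column, values):
        if values:  # an empty list adds no condition at all
            ph = ", ".join(f":{self._next_param(v)}" for v in values)
            self.conditions.append(f"{column} IN ({ph})")
        return self

    def build(self):
        # Render the WHERE clause into the {{ WHERE_CLAUSE }} placeholder.
        where = f"WHERE {' AND '.join(self.conditions)}" if self.conditions else ""
        sql = self.base_sql.replace("{{ WHERE_CLAUSE }}", where).strip()
        return sql, dict(self.params)
```

Usage mirrors `test_build_with_conditions`: chain conditions, then pass the returned `(sql, params)` pair to the database driver so values travel as bind variables.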
tests/test_sql_loader.py — new file, 109 lines
@@ -0,0 +1,109 @@
"""Tests for SQL Loader."""

import pytest
from pathlib import Path
from unittest.mock import patch, MagicMock

from mes_dashboard.sql.loader import SQLLoader


class TestSQLLoader:
    """Test SQLLoader class."""

    def setup_method(self):
        """Clear cache before each test."""
        SQLLoader.clear_cache()

    def test_load_existing_file(self, tmp_path):
        """Test loading an existing SQL file."""
        # Create a temporary SQL file
        sql_dir = tmp_path / "wip"
        sql_dir.mkdir()
        sql_file = sql_dir / "summary.sql"
        sql_file.write_text("SELECT * FROM DWH.DW_MES_LOT_V")

        # Patch the _sql_dir to use our temp directory
        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            result = SQLLoader.load("wip/summary")
            assert result == "SELECT * FROM DWH.DW_MES_LOT_V"

    def test_load_nonexistent_file(self):
        """Test loading a non-existent SQL file raises FileNotFoundError."""
        with pytest.raises(FileNotFoundError) as exc_info:
            SQLLoader.load("nonexistent/query")
        assert "SQL file not found" in str(exc_info.value)

    def test_load_uses_cache(self, tmp_path):
        """Test that repeated loads use the cache."""
        # Create a temporary SQL file
        sql_dir = tmp_path / "test"
        sql_dir.mkdir()
        sql_file = sql_dir / "cached.sql"
        sql_file.write_text("SELECT 1")

        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            SQLLoader.clear_cache()

            # First load
            result1 = SQLLoader.load("test/cached")
            info1 = SQLLoader.cache_info()

            # Second load (should hit cache)
            result2 = SQLLoader.load("test/cached")
            info2 = SQLLoader.cache_info()

            assert result1 == result2
            assert info1.misses == 1
            assert info2.hits == 1

    def test_load_with_params_substitutes_values(self, tmp_path):
        """Test structural parameter substitution."""
        sql_dir = tmp_path
        sql_file = sql_dir / "query.sql"
        sql_file.write_text("SELECT * FROM {{ table_name }}")

        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            result = SQLLoader.load_with_params("query", table_name="DWH.MY_TABLE")
            assert result == "SELECT * FROM DWH.MY_TABLE"

    def test_load_with_params_preserves_unsubstituted(self, tmp_path):
        """Test that unsubstituted parameters remain unchanged."""
        sql_dir = tmp_path
        sql_file = sql_dir / "query.sql"
        sql_file.write_text("SELECT * FROM {{ table_name }} {{ WHERE_CLAUSE }}")

        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            result = SQLLoader.load_with_params("query", table_name="T")
            assert result == "SELECT * FROM T {{ WHERE_CLAUSE }}"

    def test_clear_cache(self, tmp_path):
        """Test cache clearing."""
        sql_dir = tmp_path
        sql_file = sql_dir / "test.sql"
        sql_file.write_text("SELECT 1")

        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            SQLLoader.load("test")
            info_before = SQLLoader.cache_info()
            assert info_before.currsize > 0

            SQLLoader.clear_cache()
            info_after = SQLLoader.cache_info()
            assert info_after.currsize == 0

    def test_cache_info(self, tmp_path):
        """Test cache_info returns valid statistics."""
        sql_dir = tmp_path
        sql_file = sql_dir / "test.sql"
        sql_file.write_text("SELECT 1")

        with patch.object(SQLLoader, "_sql_dir", tmp_path):
            SQLLoader.clear_cache()
            SQLLoader.load("test")
            info = SQLLoader.cache_info()

            assert hasattr(info, "hits")
            assert hasattr(info, "misses")
            assert hasattr(info, "maxsize")
            assert hasattr(info, "currsize")
            assert info.maxsize == 100
@@ -11,8 +11,6 @@ import pandas as pd
|
||||
|
||||
from mes_dashboard.services.wip_service import (
|
||||
WIP_VIEW,
|
||||
_escape_sql,
|
||||
_build_base_conditions,
|
||||
get_wip_summary,
|
||||
get_wip_matrix,
|
||||
get_wip_hold_summary,
|
||||
@@ -39,63 +37,7 @@ class TestWipServiceConfig(unittest.TestCase):
|
||||
|
||||
def test_wip_view_configured(self):
|
||||
"""WIP_VIEW should be configured correctly."""
|
||||
self.assertEqual(WIP_VIEW, "DW_MES_LOT_V")
|
||||
|
||||
|
||||
class TestEscapeSql(unittest.TestCase):
|
||||
"""Test _escape_sql function for SQL injection prevention."""
|
||||
|
||||
def test_escapes_single_quotes(self):
|
||||
"""Should escape single quotes."""
|
||||
self.assertEqual(_escape_sql("O'Brien"), "O''Brien")
|
||||
|
||||
def test_escapes_multiple_quotes(self):
|
||||
"""Should escape multiple single quotes."""
|
||||
self.assertEqual(_escape_sql("It's Bob's"), "It''s Bob''s")
|
||||
|
||||
def test_handles_none(self):
|
||||
"""Should return None for None input."""
|
||||
self.assertIsNone(_escape_sql(None))
|
||||
|
||||
def test_no_change_for_safe_string(self):
|
||||
"""Should not modify strings without quotes."""
|
||||
self.assertEqual(_escape_sql("GA26012345"), "GA26012345")
|
||||
|
||||
|
||||
class TestBuildBaseConditions(unittest.TestCase):
|
||||
"""Test _build_base_conditions function."""
|
||||
|
||||
def test_default_excludes_dummy(self):
|
||||
"""Default behavior should exclude DUMMY lots."""
|
||||
conditions = _build_base_conditions()
|
||||
self.assertIn("LOTID NOT LIKE '%DUMMY%'", conditions)
|
||||
|
||||
def test_include_dummy_true(self):
|
||||
"""include_dummy=True should not add DUMMY exclusion."""
|
||||
conditions = _build_base_conditions(include_dummy=True)
|
||||
self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", conditions)
|
||||
|
||||
def test_workorder_filter(self):
|
||||
"""Should add WORKORDER LIKE condition."""
|
||||
conditions = _build_base_conditions(workorder='GA26')
|
||||
self.assertTrue(any("WORKORDER LIKE '%GA26%'" in c for c in conditions))
|
||||
|
||||
def test_lotid_filter(self):
|
||||
"""Should add LOTID LIKE condition."""
|
||||
conditions = _build_base_conditions(lotid='12345')
|
||||
self.assertTrue(any("LOTID LIKE '%12345%'" in c for c in conditions))
|
||||
|
||||
def test_escapes_sql_in_workorder(self):
|
||||
"""Should escape SQL special characters in workorder."""
|
||||
conditions = _build_base_conditions(workorder="test'value")
|
||||
# Should have escaped the quote
|
||||
self.assertTrue(any("test''value" in c for c in conditions))
|
||||
|
||||
def test_escapes_sql_in_lotid(self):
|
||||
"""Should escape SQL special characters in lotid."""
|
||||
conditions = _build_base_conditions(lotid="lot'id")
|
||||
# Should have escaped the quote
|
||||
self.assertTrue(any("lot''id" in c for c in conditions))
|
||||
self.assertEqual(WIP_VIEW, "DWH.DW_MES_LOT_V")
|
||||
|
||||
|
||||
class TestGetWipSummary(unittest.TestCase):
|
||||
@@ -133,7 +75,7 @@ class TestGetWipMatrix(unittest.TestCase):
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割', '焊接_DB'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1, 2],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323', 'SOT-23'],
|
||||
'PACKAGE_LEF': ['SOT-23', 'SOD-323', 'SOT-23'],
|
||||
'QTY': [50000000, 30000000, 40000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
@@ -155,7 +97,7 @@ class TestGetWipMatrix(unittest.TestCase):
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['焊接_DB', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [2, 1],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOT-23'],
|
||||
'PACKAGE_LEF': ['SOT-23', 'SOT-23'],
|
||||
'QTY': [40000000, 50000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
@@ -171,7 +113,7 @@ class TestGetWipMatrix(unittest.TestCase):
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1],
|
||||
'PRODUCTLINENAME': ['SOD-323', 'SOT-23'],
|
||||
'PACKAGE_LEF': ['SOD-323', 'SOT-23'],
|
||||
'QTY': [30000000, 50000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
@@ -200,7 +142,7 @@ class TestGetWipMatrix(unittest.TestCase):
|
||||
mock_df = pd.DataFrame({
|
||||
'WORKCENTER_GROUP': ['切割', '切割'],
|
||||
'WORKCENTERSEQUENCE_GROUP': [1, 1],
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323'],
|
||||
'PACKAGE_LEF': ['SOT-23', 'SOD-323'],
|
||||
'QTY': [50000000, 30000000]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
@@ -310,7 +252,7 @@ class TestGetPackages(unittest.TestCase):
|
||||
def test_returns_package_list(self, mock_read_sql):
|
||||
"""Should return list of packages with lot counts."""
|
||||
mock_df = pd.DataFrame({
|
||||
'PRODUCTLINENAME': ['SOT-23', 'SOD-323'],
|
||||
'PACKAGE_LEF': ['SOT-23', 'SOD-323'],
|
||||
'LOT_COUNT': [2234, 1392]
|
||||
})
|
||||
mock_read_sql.return_value = mock_df
|
||||
@@ -401,9 +343,10 @@ class TestSearchWorkorders(unittest.TestCase):

        search_workorders('GA26', limit=100)

        # Verify SQL contains FETCH FIRST 50
        call_args = mock_read_sql.call_args[0][0]
        self.assertIn('FETCH FIRST 50 ROWS ONLY', call_args)
        # Verify params contain row_limit=50 (capped from 100)
        call_args = mock_read_sql.call_args
        params = call_args[0][1] if len(call_args[0]) > 1 else call_args[1].get('params', {})
        self.assertEqual(params.get('row_limit'), 50)

    @disable_cache
    @patch('mes_dashboard.services.wip_service.read_sql_df')
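The hunk above moves the row-limit check off the SQL text and onto the bind parameters: the cap now travels as `row_limit` in a params dict rather than as a literal `50` in the query string. A minimal standalone sketch of that pattern, assuming a hypothetical `search` helper and `MAX_ROWS` cap (illustrative names, not repository code):

```python
from unittest.mock import MagicMock

MAX_ROWS = 50  # illustrative cap, mirroring the row_limit=50 behaviour


def search(read_sql_df, keyword, limit=MAX_ROWS):
    """Hypothetical helper: both the pattern and the limit are bind variables."""
    sql = (
        "SELECT LOTID FROM WIP WHERE WORKORDER LIKE :kw "
        "FETCH FIRST :row_limit ROWS ONLY"
    )
    params = {"kw": f"%{keyword}%", "row_limit": min(limit, MAX_ROWS)}
    return read_sql_df(sql, params)


# Inspect the call the same way the new assertions do: unpack call_args.
mock_read = MagicMock()
search(mock_read, "GA26", limit=100)
sql_arg, params_arg = mock_read.call_args[0]
assert ":row_limit" in sql_arg        # placeholder stays in the SQL text
assert params_arg["row_limit"] == 50  # capped from 100
```

Because the limit is a bind variable, the SQL string is identical for every call, which also keeps Oracle's statement cache effective.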
@@ -554,7 +497,7 @@ class TestDummyExclusionInAllFunctions(unittest.TestCase):
        mock_df = pd.DataFrame({
            'WORKCENTER_GROUP': ['切割'],
            'WORKCENTERSEQUENCE_GROUP': [1],
            'PRODUCTLINENAME': ['SOT-23'],
            'PACKAGE_LEF': ['SOT-23'],
            'QTY': [1000]
        })
        mock_read_sql.return_value = mock_df
@@ -599,7 +542,7 @@ class TestDummyExclusionInAllFunctions(unittest.TestCase):
    def test_get_packages_excludes_dummy_by_default(self, mock_read_sql):
        """get_packages should exclude DUMMY by default."""
        mock_df = pd.DataFrame({
            'PRODUCTLINENAME': ['SOT-23'], 'LOT_COUNT': [100]
            'PACKAGE_LEF': ['SOT-23'], 'LOT_COUNT': [100]
        })
        mock_read_sql.return_value = mock_df

@@ -615,7 +558,7 @@ class TestMultipleFilterConditions(unittest.TestCase):
    @disable_cache
    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_get_wip_summary_with_all_filters(self, mock_read_sql):
        """get_wip_summary should combine all filter conditions."""
        """get_wip_summary should combine all filter conditions via parameterized queries."""
        mock_df = pd.DataFrame({
            'TOTAL_LOTS': [50],
            'TOTAL_QTY_PCS': [500],
@@ -625,36 +568,54 @@ class TestMultipleFilterConditions(unittest.TestCase):
            'QUEUE_QTY_PCS': [50],
            'HOLD_LOTS': [5],
            'HOLD_QTY_PCS': [50],
            'QUALITY_HOLD_LOTS': [3],
            'QUALITY_HOLD_QTY_PCS': [30],
            'NON_QUALITY_HOLD_LOTS': [2],
            'NON_QUALITY_HOLD_QTY_PCS': [20],
            'DATA_UPDATE_DATE': ['2026-01-26']
        })
        mock_read_sql.return_value = mock_df

        get_wip_summary(workorder='GA26', lotid='A00')

        call_args = mock_read_sql.call_args[0][0]
        self.assertIn("WORKORDER LIKE '%GA26%'", call_args)
        self.assertIn("LOTID LIKE '%A00%'", call_args)
        self.assertIn("LOTID NOT LIKE '%DUMMY%'", call_args)
        # Check SQL contains parameterized LIKE conditions
        call_args = mock_read_sql.call_args
        sql = call_args[0][0]
        params = call_args[0][1] if len(call_args[0]) > 1 else {}

        self.assertIn("WORKORDER LIKE", sql)
        self.assertIn("LOTID LIKE", sql)
        self.assertIn("LOTID NOT LIKE '%DUMMY%'", sql)
        # Verify params contain the search patterns
        self.assertTrue(any('%GA26%' in str(v) for v in params.values()))
        self.assertTrue(any('%A00%' in str(v) for v in params.values()))

    @disable_cache
    @patch('mes_dashboard.services.wip_service.read_sql_df')
    def test_get_wip_matrix_with_all_filters(self, mock_read_sql):
        """get_wip_matrix should combine all filter conditions."""
        """get_wip_matrix should combine all filter conditions via parameterized queries."""
        mock_df = pd.DataFrame({
            'WORKCENTER_GROUP': ['切割'],
            'WORKCENTERSEQUENCE_GROUP': [1],
            'PRODUCTLINENAME': ['SOT-23'],
            'PACKAGE_LEF': ['SOT-23'],
            'QTY': [500]
        })
        mock_read_sql.return_value = mock_df

        get_wip_matrix(workorder='GA26', lotid='A00', include_dummy=True)

        call_args = mock_read_sql.call_args[0][0]
        self.assertIn("WORKORDER LIKE '%GA26%'", call_args)
        self.assertIn("LOTID LIKE '%A00%'", call_args)
        # Check SQL contains parameterized LIKE conditions
        call_args = mock_read_sql.call_args
        sql = call_args[0][0]
        params = call_args[0][1] if len(call_args[0]) > 1 else {}

        self.assertIn("WORKORDER LIKE", sql)
        self.assertIn("LOTID LIKE", sql)
        # Should NOT contain DUMMY exclusion since include_dummy=True
        self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", call_args)
        self.assertNotIn("LOTID NOT LIKE '%DUMMY%'", sql)
        # Verify params contain the search patterns
        self.assertTrue(any('%GA26%' in str(v) for v in params.values()))
        self.assertTrue(any('%A00%' in str(v) for v in params.values()))

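The migrated assertions above check the shape of the SQL with `assertIn` while the user-supplied values travel in a params dict; only the static `NOT LIKE '%DUMMY%'` fragment, which contains no user input, stays inlined. A minimal sketch of the filter-building side these tests exercise (`build_like_filters` is a hypothetical name; the repository's `CommonFilters` API may differ):

```python
def build_like_filters(workorder=None, lotid=None, include_dummy=False):
    """Hypothetical builder: SQL fragments keep Oracle bind placeholders,
    user-supplied values go into a separate params dict."""
    clauses, params = [], {}
    if workorder:
        clauses.append("WORKORDER LIKE :workorder")
        params["workorder"] = f"%{workorder}%"
    if lotid:
        clauses.append("LOTID LIKE :lotid")
        params["lotid"] = f"%{lotid}%"
    if not include_dummy:
        # Static fragment with no user input -> safe to inline as a literal.
        clauses.append("LOTID NOT LIKE '%DUMMY%'")
    return " AND ".join(clauses), params


sql, params = build_like_filters(workorder="GA26", lotid="A00")
assert "WORKORDER LIKE :workorder" in sql      # placeholder, not the value
assert params == {"workorder": "%GA26%", "lotid": "%A00%"}
assert "DUMMY" in sql                          # excluded by default
```

This split is exactly what the updated tests verify: `assertIn` on the stable SQL text, then a separate check that the `%GA26%` / `%A00%` patterns appear only among the bound parameter values.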