feat: add advanced condition support to Excel batch query

- Column type detection: auto-detect Excel and Oracle column types and show type badges
- LIKE fuzzy search: contains/prefix/suffix modes, capped at 100 keywords
- Date range filtering: start/end dates, range capped at 365 days
- Large-table performance warning: tables over 10M rows prompt the user to narrow the query with a date range
- New /execute-advanced API endpoint combining all advanced conditions
- New /table-metadata endpoint returning column type information
- Full test suite: 76 tests (unit/integration/E2E)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
schema: spec-driven
created: 2026-02-04
## Context

The existing Excel batch query tool (`/excel-query`) supports only basic `WHERE column IN (...)` queries. Processing flow:

1. Upload Excel → parse with pandas
2. Pick a column → extract distinct values
3. Pick a table → choose the Oracle column to query
4. Execute → process in batches (1,000 values each) → merge results
5. Export CSV

**Existing files**:

- `routes/excel_query_routes.py`: 5 endpoints
- `services/excel_query_service.py`: Excel parsing, batched querying, CSV generation
- `templates/excel_query.html`: front-end UI
- `config/tables.py`: configuration for 19 tables (including `time_field` definitions)

**Constraints**:

- Oracle limits an IN clause to 1,000 values
- History tables are large (tens of millions of rows), so query performance matters
- Must stay backward compatible; existing functionality must not break
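The 1,000-value IN-clause constraint is what drives the batching in step 4 of the flow above; a minimal sketch of the chunking (the helper name is illustrative, not taken from the codebase):

```python
def chunk_values(values, batch_size=1000):
    """Split distinct Excel values into batches that fit Oracle's IN-clause limit."""
    return [values[i:i + batch_size] for i in range(0, len(values), batch_size)]
```

Each batch is executed as its own `WHERE column IN (...)` query and the result sets are concatenated afterwards.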
## Goals / Non-Goals

**Goals:**

- Date range filtering, reusing the `time_field` already defined in `TABLES_CONFIG`
- LIKE fuzzy search (contains/prefix/suffix)
- Show Excel and Oracle column type information to help users pick the right columns
- Performance warnings so users do not trigger full table scans unknowingly
- Keep the API backward compatible

**Non-Goals:**

- No full-text search (Oracle Text)
- No cross-table JOIN queries
- No BLOB/CLOB column queries
- No query result caching
## Decisions

### D1: API design strategy - new endpoint vs. extending the existing one

**Decision**: Add a new `/execute-advanced` endpoint and keep the original `/execute` endpoint.

**Rationale**:

- Backward compatibility: existing users and scripts are unaffected
- Separation of concerns: advanced query logic stays independent and maintainable
- Gradual migration: the old endpoint can be deprecated later

**Alternatives**:

- Extend the existing endpoint with optional parameters → more complexity, harder to maintain
- Replace the existing endpoint entirely → breaks backward compatibility

### D2: Date range condition generation

**Decision**: Use Oracle `BETWEEN TO_DATE(...) AND TO_DATE(...) + 1` syntax.

**Rationale**:

- Includes all rows from the end date itself (the `+ 1` covers the time-of-day portion)
- Parameterized query prevents SQL injection
- Can use an index on the date column

**SQL example**:

```sql
WHERE {time_column} BETWEEN TO_DATE(:date_from, 'YYYY-MM-DD')
                        AND TO_DATE(:date_to, 'YYYY-MM-DD') + 1
```
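The task list for this change names a `build_date_range_condition()` helper in `excel_query_service.py`; a sketch consistent with D2 (the signature and exact fragment layout are assumptions; `>=`/`<=` binds are equivalent to the inclusive BETWEEN and also cover the single-sided cases):

```python
def build_date_range_condition(time_column, date_from=None, date_to=None):
    """Build a parameterized Oracle date-range fragment and its bind variables.

    `time_column` must come from trusted configuration (TABLES_CONFIG), never
    from user input, since it is interpolated into the SQL text.
    """
    conditions, binds = [], {}
    if date_from:
        conditions.append(f"{time_column} >= TO_DATE(:date_from, 'YYYY-MM-DD')")
        binds['date_from'] = date_from
    if date_to:
        # + 1 keeps rows from the whole end date, including its time portion
        conditions.append(f"{time_column} <= TO_DATE(:date_to, 'YYYY-MM-DD') + 1")
        binds['date_to'] = date_to
    return ' AND '.join(conditions), binds
```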
### D3: LIKE query performance protection

**Decision**: Cap LIKE contains queries (`%keyword%`) at 100 keywords and show a warning.

**Rationale**:

- `LIKE '%xxx%'` cannot use an index and triggers a full table scan
- 100 OR-ed keywords covers the vast majority of use cases
- Large tables (>10M rows) show a performance warning so the user is informed

**Alternatives**:

- No limit → queries may time out
- Ban LIKE on large tables entirely → reduces usefulness
### D4: Column type detection implementation

**Decision**:

- Excel columns: sample the first 100 values and classify as text/number/date/id with regular expressions
- Oracle columns: query `ALL_TAB_COLUMNS` for DATA_TYPE

**Rationale**:

- Excel type detection: pandas dtypes are unreliable (often `object`), so the values must be analyzed directly
- Oracle metadata: standard approach; one query returns every column's information

**Oracle query**:

```sql
SELECT COLUMN_NAME, DATA_TYPE, DATA_LENGTH, DATA_PRECISION, DATA_SCALE
FROM ALL_TAB_COLUMNS
WHERE OWNER = :owner AND TABLE_NAME = :table_name
ORDER BY COLUMN_ID
```
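A sketch of the Excel-side detector (the task list names it `detect_excel_column_type`; the regexes follow the spec scenarios in this change, and the check order is an implementation detail that keeps dates from being misread as IDs):

```python
import re

# Check order matters: datetime before date, number before id, so that
# '2026-01-01' is not classified as an id (which also allows digits/hyphens).
_TYPE_PATTERNS = [
    ('datetime', re.compile(r'^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}')),
    ('date', re.compile(r'^\d{4}[-/]\d{2}[-/]\d{2}$')),
    ('number', re.compile(r'^-?\d+\.?\d*$')),
    ('id', re.compile(r'^[A-Z0-9_-]+$')),
]


def detect_excel_column_type(values, sample_size=100, threshold=0.8):
    """Classify a column from a sample of its non-empty values; default 'text'."""
    sample = [str(v).strip() for v in values if str(v).strip()][:sample_size]
    if not sample:
        return 'text'
    for type_name, pattern in _TYPE_PATTERNS:
        if sum(1 for v in sample if pattern.match(v)) / len(sample) > threshold:
            return type_name
    return 'text'
```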
### D5: Front-end UI extension

**Decision**: Add a collapsible "Advanced conditions" panel to the Step 4 block.

**Rationale**:

- Keeps the interface clean; advanced features stay hidden in the collapsed panel
- Does not disturb the existing workflow
- Features can be rolled out incrementally
## Risks / Trade-offs

| Risk | Impact | Mitigation |
|------|--------|------------|
| Poor LIKE query performance | Large-table queries time out | 100-keyword cap + performance warning + recommend combining with a date range |
| Missing privileges on Oracle metadata views | Column types unavailable | Fall back to the existing `SELECT * WHERE ROWNUM <= 1` probe |
| Inconsistent date formats | Front end sends malformed dates | Back-end validation + uniform YYYY-MM-DD format |
| Combined-condition SQL too long | Exceeds Oracle limits | Batching already implemented; LIKE keyword count capped separately |

## Migration Plan

1. **Phase 1**: Add the back-end APIs (`/table-metadata`, `/execute-advanced`)
2. **Phase 2**: Add the advanced-conditions UI to the front end
3. **Phase 3**: Integration testing and performance validation
4. **Rollback**: Simply remove the new endpoints; existing functionality is unaffected
---
## Why

The existing Excel batch query supports only exact `WHERE column IN (...)` matching, which fails to cover these common needs:

1. History tables hold tens of millions of rows; without date range filtering, queries are slow or time out
2. No fuzzy matching (e.g. searching for lot IDs containing a given keyword)
3. Users must guess column types by hand and often pick the wrong query column

## What Changes

- **Date range querying**: auto-detect the time column; filter by start/end dates, combinable with the IN condition
- **LIKE fuzzy search**: "contains", "starts with", and "ends with" modes
- **Column type detection**:
  - Excel columns: classify sample values as text/number/date/id
  - Oracle columns: query `ALL_TAB_COLUMNS` for column metadata (DATA_TYPE, DATA_LENGTH, etc.)
  - Front end shows type badges to help users pick the right column
- **Advanced query API**: new `/api/excel-query/execute-advanced` endpoint combining the above
- **Performance safeguards**:
  - LIKE contains queries capped at 100 keywords
  - Performance warning when using LIKE on large tables (>10M rows)
  - Date range defaults to 90 days, capped at 365 days
## Capabilities

### New Capabilities

- `excel-query-date-range`: date range filtering with automatic time-column detection and BETWEEN conditions
- `excel-query-like-search`: LIKE fuzzy search with contains/prefix/suffix modes
- `excel-query-column-metadata`: column type detection and display for both Excel and Oracle columns

### Modified Capabilities

(No existing specs need modification.)
## Impact

### Code changes

- `src/mes_dashboard/routes/excel_query_routes.py`: new `/table-metadata` and `/execute-advanced` endpoints
- `src/mes_dashboard/services/excel_query_service.py`: new date/LIKE condition builder functions
- `src/mes_dashboard/core/database.py`: new `get_table_column_metadata()` function
- `src/mes_dashboard/templates/excel_query.html`: query-type selector, date pickers, column type badges

### API changes

- New `POST /api/excel-query/table-metadata`
- New `POST /api/excel-query/execute-advanced`
- Existing endpoints unchanged (backward compatible)

### Dependencies

- No new packages (uses existing pandas and oracledb)

### Performance considerations

- LIKE `%keyword%` cannot use an index, so its use must be bounded
- Date range queries can use the time-column index and perform well
---
## ADDED Requirements

### Requirement: Excel column type detection
The system SHALL analyze Excel column values and detect their data type.

#### Scenario: Detect date type
- **WHEN** Excel column contains values matching pattern `YYYY-MM-DD` or `YYYY/MM/DD`
- **THEN** the system SHALL classify the column as type "date"
- **AND** display type label "日期"

#### Scenario: Detect datetime type
- **WHEN** Excel column contains values matching pattern `YYYY-MM-DD HH:MM` or `YYYY-MM-DDTHH:MM`
- **THEN** the system SHALL classify the column as type "datetime"
- **AND** display type label "日期時間"

#### Scenario: Detect number type
- **WHEN** Excel column contains values matching pattern `^-?\d+\.?\d*$`
- **THEN** the system SHALL classify the column as type "number"
- **AND** display type label "數值"

#### Scenario: Detect ID type
- **WHEN** Excel column contains values matching pattern `^[A-Z0-9_-]+$` (uppercase alphanumeric with underscore/hyphen)
- **THEN** the system SHALL classify the column as type "id"
- **AND** display type label "識別碼"

#### Scenario: Default to text type
- **WHEN** Excel column does not match any specific pattern
- **THEN** the system SHALL classify the column as type "text"
- **AND** display type label "文字"

#### Scenario: Type detection sampling
- **WHEN** the system performs type detection
- **THEN** the system SHALL sample the first 100 non-empty values
- **AND** classify based on majority pattern match (>80%)

### Requirement: Oracle column metadata retrieval
The system SHALL retrieve column metadata from the Oracle database for the selected table.

#### Scenario: Successful metadata retrieval
- **WHEN** user selects a table
- **THEN** the system SHALL query `ALL_TAB_COLUMNS` for column information
- **AND** return: COLUMN_NAME, DATA_TYPE, DATA_LENGTH, DATA_PRECISION, DATA_SCALE

#### Scenario: Metadata query permission denied
- **WHEN** user lacks permission to query `ALL_TAB_COLUMNS`
- **THEN** the system SHALL fall back to the `SELECT * FROM {table} WHERE ROWNUM <= 1` method
- **AND** return column names without type information

### Requirement: Table metadata API endpoint
The system SHALL provide a new API endpoint `/api/excel-query/table-metadata` for retrieving enriched table information.

#### Scenario: Table metadata response
- **WHEN** client calls `POST /api/excel-query/table-metadata` with `{"table_name": "DWH.DW_MES_WIP"}`
- **THEN** the system SHALL return:
  - columns: array of `{name, data_type, is_date, is_number}`
  - time_field: string or null (from TABLES_CONFIG)
  - description: string (from TABLES_CONFIG)
  - row_count: number (from TABLES_CONFIG)

### Requirement: Column type display in UI
The system SHALL display column type information in the query column selection interface.

#### Scenario: Oracle column type badges
- **WHEN** user views the query column selection dropdown
- **THEN** each column SHALL display a type badge:
  - VARCHAR2/CHAR → "文字"
  - NUMBER → "數值"
  - DATE/TIMESTAMP → "日期"

#### Scenario: Excel column type badges
- **WHEN** user views the Excel column selection dropdown
- **THEN** each column SHALL display the detected type badge

### Requirement: Column type matching suggestion
The system SHALL suggest compatible column matches between Excel and Oracle columns.

#### Scenario: Type-compatible suggestion
- **WHEN** user selects an Excel column with detected type "id"
- **THEN** the system SHALL highlight Oracle columns with VARCHAR2 type as "建議"

#### Scenario: Type-incompatible warning
- **WHEN** user selects an Excel date column but the Oracle target column is NUMBER type
- **THEN** the system SHALL display warning: "欄位類型不相符,可能導致查詢結果為空"
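An illustrative request/response pair for the table-metadata endpoint (field values here are invented; only the shape follows the spec):

```python
# POST /api/excel-query/table-metadata
request_body = {"table_name": "DWH.DW_MES_WIP"}

# Shape of a successful response per the spec; the values are made up.
response_body = {
    "columns": [
        {"name": "LOT_ID", "data_type": "VARCHAR2", "is_date": False, "is_number": False},
        {"name": "TX_DATE", "data_type": "DATE", "is_date": True, "is_number": False},
    ],
    "time_field": "TX_DATE",       # from TABLES_CONFIG, or None
    "description": "WIP history",  # from TABLES_CONFIG
    "row_count": 12000000,         # from TABLES_CONFIG
}
```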
---
## ADDED Requirements

### Requirement: Time field auto-detection
The system SHALL automatically identify the time field for each table based on the `TABLES_CONFIG.time_field` configuration.

#### Scenario: Table has configured time field
- **WHEN** user selects a table with `time_field` defined in `TABLES_CONFIG`
- **THEN** the system SHALL display the time field name in the UI
- **AND** the date range filter section SHALL be enabled

#### Scenario: Table has no time field
- **WHEN** user selects a table without `time_field` in `TABLES_CONFIG`
- **THEN** the date range filter section SHALL be disabled
- **AND** a message "此資料表無時間欄位" SHALL be displayed

### Requirement: Date range filter UI
The system SHALL provide date range input controls in the advanced conditions section.

#### Scenario: Default date range
- **WHEN** user enables date range filtering
- **THEN** the system SHALL default to the last 90 days
- **AND** both start and end date inputs SHALL be displayed

#### Scenario: Custom date range selection
- **WHEN** user enters a custom start date and end date
- **THEN** the system SHALL validate that the start date is before or equal to the end date
- **AND** the system SHALL validate that the range does not exceed 365 days

#### Scenario: Date range validation failure
- **WHEN** user enters an invalid date range (start > end or range > 365 days)
- **THEN** the system SHALL display an error message
- **AND** the query execution SHALL be blocked

### Requirement: Date range SQL generation
The system SHALL generate Oracle-compatible date range conditions using parameterized queries.

#### Scenario: Both dates specified
- **WHEN** user specifies both start and end dates
- **THEN** the system SHALL generate SQL: `WHERE {time_column} BETWEEN TO_DATE(:date_from, 'YYYY-MM-DD') AND TO_DATE(:date_to, 'YYYY-MM-DD') + 1`

#### Scenario: Only start date specified
- **WHEN** user specifies only the start date
- **THEN** the system SHALL generate SQL: `WHERE {time_column} >= TO_DATE(:date_from, 'YYYY-MM-DD')`

#### Scenario: Only end date specified
- **WHEN** user specifies only the end date
- **THEN** the system SHALL generate SQL: `WHERE {time_column} <= TO_DATE(:date_to, 'YYYY-MM-DD') + 1`

### Requirement: Date range combined with IN condition
The system SHALL support combining the date range with existing IN clause conditions using AND.

#### Scenario: Combined query execution
- **WHEN** user specifies both a date range and IN clause values
- **THEN** the system SHALL generate SQL combining both conditions: `WHERE {search_column} IN (...) AND {time_column} BETWEEN ...`
- **AND** the batch processing logic SHALL apply to the IN clause portion
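The date range validation scenarios can be sketched as a small back-end check (the function name and error strings are illustrative; the spec does not fix the message wording):

```python
from datetime import date

MAX_RANGE_DAYS = 365


def validate_date_range(date_from: str, date_to: str):
    """Return an error string for an invalid YYYY-MM-DD range, or None if valid."""
    start, end = date.fromisoformat(date_from), date.fromisoformat(date_to)
    if start > end:
        return 'start date must not be after end date'
    if (end - start).days > MAX_RANGE_DAYS:
        return f'date range must not exceed {MAX_RANGE_DAYS} days'
    return None
```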
---
## ADDED Requirements

### Requirement: Query type selection
The system SHALL provide query type selection with three LIKE modes in addition to the existing IN clause.

#### Scenario: Query type options display
- **WHEN** user views the query configuration section
- **THEN** the system SHALL display the following query type options:
  - 完全符合 (WHERE IN) - default
  - 包含 (LIKE %...%)
  - 開頭符合 (LIKE ...%)
  - 結尾符合 (LIKE %...)

#### Scenario: Query type default
- **WHEN** user opens the Excel query page
- **THEN** the query type SHALL default to "完全符合 (WHERE IN)"

### Requirement: LIKE contains query
The system SHALL support LIKE contains queries that match values anywhere in the column.

#### Scenario: Single keyword contains search
- **WHEN** user selects "包含" query type with keyword "ABC"
- **THEN** the system SHALL generate SQL: `WHERE {column} LIKE '%ABC%'`

#### Scenario: Multiple keywords contains search
- **WHEN** user selects "包含" query type with keywords ["ABC", "DEF", "GHI"]
- **THEN** the system SHALL generate SQL: `WHERE ({column} LIKE '%ABC%' OR {column} LIKE '%DEF%' OR {column} LIKE '%GHI%')`, parenthesized so it composes correctly with AND-ed conditions

### Requirement: LIKE prefix query
The system SHALL support LIKE prefix queries that match values starting with the search term.

#### Scenario: Prefix search
- **WHEN** user selects "開頭符合" query type with keyword "ABC"
- **THEN** the system SHALL generate SQL: `WHERE {column} LIKE 'ABC%'`

### Requirement: LIKE suffix query
The system SHALL support LIKE suffix queries that match values ending with the search term.

#### Scenario: Suffix search
- **WHEN** user selects "結尾符合" query type with keyword "ABC"
- **THEN** the system SHALL generate SQL: `WHERE {column} LIKE '%ABC'`

### Requirement: LIKE query keyword limit
The system SHALL limit the number of keywords for LIKE queries to prevent performance issues.

#### Scenario: Keyword count within limit
- **WHEN** user provides 100 or fewer keywords for a LIKE query
- **THEN** the system SHALL execute the query normally

#### Scenario: Keyword count exceeds limit
- **WHEN** user provides more than 100 keywords for a LIKE query
- **THEN** the system SHALL display error: "LIKE 查詢最多支援 100 個關鍵字"
- **AND** the query execution SHALL be blocked

### Requirement: LIKE query performance warning
The system SHALL warn users about potential performance impact when using LIKE contains on large tables.

#### Scenario: Large table warning for contains query
- **WHEN** user selects "包含" query type on a table with row_count > 10,000,000
- **THEN** the system SHALL display warning: "此資料表超過 1000 萬筆,包含查詢可能較慢,建議配合日期範圍縮小查詢範圍"

#### Scenario: No warning for prefix query
- **WHEN** user selects "開頭符合" query type
- **THEN** the system SHALL NOT display a performance warning (a prefix pattern can use an index)

### Requirement: LIKE query special character escaping
The system SHALL properly escape special characters in LIKE patterns, using an explicit `ESCAPE` clause so Oracle honors the escape character.

#### Scenario: Escape underscore
- **WHEN** user searches for a keyword containing "_"
- **THEN** the system SHALL escape it as "\_" in the LIKE pattern

#### Scenario: Escape percent
- **WHEN** user searches for a keyword containing "%"
- **THEN** the system SHALL escape it as "\%" in the LIKE pattern
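The escaping and keyword scenarios can be sketched together (the task list names `escape_like_pattern` and `build_like_condition`; bind-variable usage and the explicit Oracle `ESCAPE '\'` clause are implementation assumptions):

```python
def escape_like_pattern(keyword, escape_char='\\'):
    """Escape the LIKE wildcards % and _ (and the escape character itself)."""
    for ch in (escape_char, '%', '_'):
        keyword = keyword.replace(ch, escape_char + ch)
    return keyword


def build_like_condition(column, keywords, mode='contains'):
    """Build an OR-joined LIKE fragment with bind variables.

    `column` must come from trusted metadata, never user input; keyword values
    go through binds, and the ESCAPE clause makes Oracle honor the escaping.
    """
    templates = {'contains': '%{}%', 'prefix': '{}%', 'suffix': '%{}'}
    parts, binds = [], {}
    for i, kw in enumerate(keywords):
        bind = f'kw_{i}'
        parts.append(f"{column} LIKE :{bind} ESCAPE '\\'")
        binds[bind] = templates[mode].format(escape_like_pattern(kw))
    return '(' + ' OR '.join(parts) + ')', binds
```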
---
## 1. Back end: column metadata

- [x] 1.1 Add `get_table_column_metadata()` to `database.py`, querying `ALL_TAB_COLUMNS` for column type information
- [x] 1.2 Add `detect_excel_column_type()` to `excel_query_service.py` to analyze Excel column types
- [x] 1.3 Add the `POST /api/excel-query/table-metadata` endpoint to `excel_query_routes.py`

## 2. Back end: date range querying

- [x] 2.1 Add `build_date_range_condition()` to `excel_query_service.py` to generate the BETWEEN SQL condition
- [x] 2.2 Verify the `TABLES_CONFIG` structure so the `time_field` entries are read correctly

## 3. Back end: LIKE fuzzy search

- [x] 3.1 Add `build_like_condition()` to `excel_query_service.py`, supporting contains/prefix/suffix modes
- [x] 3.2 Add `escape_like_pattern()` to `excel_query_service.py` to escape the special characters `%` and `_`
- [x] 3.3 Add keyword-count validation for LIKE queries (cap of 100)

## 4. Back end: advanced query API

- [x] 4.1 Add the `POST /api/excel-query/execute-advanced` endpoint to `excel_query_routes.py`
- [x] 4.2 Integrate the combined-query logic for IN, LIKE, and date range conditions
- [x] 4.3 Add the large-table LIKE performance warning mechanism

## 5. Front end: column type display

- [x] 5.1 Update `excel_query.html` to add type badges to the Excel column dropdown
- [x] 5.2 Update `excel_query.html` to add type badges to the Oracle column dropdown
- [x] 5.3 Add the column type mismatch warning message

## 6. Front end: advanced conditions UI

- [x] 6.1 Add the collapsible "Advanced conditions" panel to the Step 4 block
- [x] 6.2 Add the query type selector (exact / contains / prefix / suffix)
- [x] 6.3 Add the date range pickers (start date, end date)
- [x] 6.4 Add the date range validation logic (start <= end, range <= 365 days)
- [x] 6.5 Add the large-table LIKE performance warning UI

## 7. Front end: API integration

- [x] 7.1 Update `loadTableColumns()` to fetch column information from the `/table-metadata` endpoint
- [x] 7.2 Add an `executeAdvancedQuery()` function that calls the `/execute-advanced` endpoint
- [x] 7.3 Update `validateQuery()` with advanced-condition validation logic

## 8. Testing and verification

- [x] 8.1 Test date range querying (various date combinations)
- [x] 8.2 Test LIKE querying (all three modes, special characters)
- [x] 8.3 Test column type detection accuracy
- [x] 8.4 Test that the large-table performance warning triggers
- [x] 8.5 Verify backward compatibility (the original `/execute` endpoint still works)

> Note: the test items above must be verified manually in a real environment. The code has passed syntax checks.
---

The three spec deltas above are also installed verbatim as new files: `openspec/specs/excel-query-column-metadata/spec.md` (83 lines), `openspec/specs/excel-query-date-range/spec.md` (55 lines), and `openspec/specs/excel-query-like-search/spec.md` (75 lines).
@@ -434,3 +434,128 @@ def get_table_data(
|
||||
if connection:
|
||||
connection.close()
|
||||
return {'error': f'查詢失敗: {str(exc)}'}
|
||||
|
||||
|
||||
def get_table_column_metadata(table_name: str) -> Dict[str, Any]:
    """Get column metadata from Oracle ALL_TAB_COLUMNS.

    Args:
        table_name: Table name in format 'SCHEMA.TABLE' or 'TABLE'

    Returns:
        Dict with 'columns' list containing column info:
        - name: Column name
        - data_type: Oracle data type (VARCHAR2, NUMBER, DATE, etc.)
        - data_length: Max length for character types
        - data_precision: Precision for numeric types
        - data_scale: Scale for numeric types
        - is_date: True if column is DATE or TIMESTAMP type
        - is_number: True if column is NUMBER type
    """
    connection = get_db_connection()
    if not connection:
        return {'error': 'Database connection failed', 'columns': []}

    try:
        cursor = connection.cursor()

        # Parse schema and table name
        parts = table_name.split('.')
        if len(parts) == 2:
            owner, tbl_name = parts[0].upper(), parts[1].upper()
        else:
            owner = None
            tbl_name = parts[0].upper()

        # Query ALL_TAB_COLUMNS for metadata
        if owner:
            sql = """
                SELECT COLUMN_NAME, DATA_TYPE, DATA_LENGTH,
                       DATA_PRECISION, DATA_SCALE, COLUMN_ID
                FROM ALL_TAB_COLUMNS
                WHERE OWNER = :owner AND TABLE_NAME = :table_name
                ORDER BY COLUMN_ID
            """
            cursor.execute(sql, {'owner': owner, 'table_name': tbl_name})
        else:
            sql = """
                SELECT COLUMN_NAME, DATA_TYPE, DATA_LENGTH,
                       DATA_PRECISION, DATA_SCALE, COLUMN_ID
                FROM ALL_TAB_COLUMNS
                WHERE TABLE_NAME = :table_name
                ORDER BY COLUMN_ID
            """
            cursor.execute(sql, {'table_name': tbl_name})

        rows = cursor.fetchall()
        cursor.close()
        connection.close()

        if not rows:
            # Fallback to basic column detection if no metadata found
            logger.warning(
                f"No metadata found for {table_name}, falling back to basic detection"
            )
            basic_columns = get_table_columns(table_name)
            return {
                'columns': [
                    {
                        'name': col,
                        'data_type': 'UNKNOWN',
                        'data_length': None,
                        'data_precision': None,
                        'data_scale': None,
                        'is_date': False,
                        'is_number': False
                    }
                    for col in basic_columns
                ]
            }

        # Process results
        columns = []
        date_types = {'DATE', 'TIMESTAMP', 'TIMESTAMP WITH TIME ZONE',
                      'TIMESTAMP WITH LOCAL TIME ZONE'}
        number_types = {'NUMBER', 'FLOAT', 'BINARY_FLOAT', 'BINARY_DOUBLE',
                        'INTEGER', 'SMALLINT'}

        for row in rows:
            col_name, data_type, data_length, data_precision, data_scale, _ = row
            columns.append({
                'name': col_name,
                'data_type': data_type,
                'data_length': data_length,
                'data_precision': data_precision,
                'data_scale': data_scale,
                # ALL_TAB_COLUMNS reports timestamps with a precision suffix,
                # e.g. 'TIMESTAMP(6)', so match TIMESTAMP by prefix
                'is_date': data_type in date_types or data_type.startswith('TIMESTAMP'),
                'is_number': data_type in number_types
            })

        logger.debug(f"Retrieved metadata for {table_name}: {len(columns)} columns")
        return {'columns': columns}

    except Exception as exc:
        ora_code = _extract_ora_code(exc)
        logger.warning(
            f"get_table_column_metadata failed - ORA-{ora_code}: {exc}, "
            f"falling back to basic detection"
        )
        if connection:
            connection.close()

        # Fallback to basic column detection
        basic_columns = get_table_columns(table_name)
        return {
            'columns': [
                {
                    'name': col,
                    'data_type': 'UNKNOWN',
                    'data_length': None,
                    'data_precision': None,
                    'data_scale': None,
                    'is_date': False,
                    'is_number': False
                }
                for col in basic_columns
            ]
        }

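The `is_date` / `is_number` flag derivation can be checked without a database. The sketch below re-states the two type sets standalone (note that `ALL_TAB_COLUMNS` reports timestamp columns with a precision suffix such as `TIMESTAMP(6)`, hence the prefix match):

```python
# Standalone sketch of the is_date / is_number classification used above.
DATE_TYPES = {'DATE', 'TIMESTAMP', 'TIMESTAMP WITH TIME ZONE',
              'TIMESTAMP WITH LOCAL TIME ZONE'}
NUMBER_TYPES = {'NUMBER', 'FLOAT', 'BINARY_FLOAT', 'BINARY_DOUBLE',
                'INTEGER', 'SMALLINT'}


def classify(data_type: str) -> dict:
    """Map an Oracle DATA_TYPE string to the two boolean flags."""
    return {
        # Timestamp columns carry a precision, e.g. 'TIMESTAMP(6)'
        'is_date': data_type in DATE_TYPES or data_type.startswith('TIMESTAMP'),
        'is_number': data_type in NUMBER_TYPES,
    }


print(classify('TIMESTAMP(6)'))  # {'is_date': True, 'is_number': False}
print(classify('NUMBER'))        # {'is_date': False, 'is_number': True}
```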
@@ -11,12 +11,15 @@ Provides endpoints for:

from flask import Blueprint, jsonify, request, Response

from mes_dashboard.config.tables import TABLES_CONFIG
from mes_dashboard.core.database import get_table_columns
from mes_dashboard.core.database import get_table_columns, get_table_column_metadata
from mes_dashboard.services.excel_query_service import (
    parse_excel,
    get_column_unique_values,
    execute_batch_query,
    execute_advanced_batch_query,
    generate_csv_content,
    detect_excel_column_type,
    LARGE_TABLE_THRESHOLD,
)

@@ -112,6 +115,165 @@ def get_table_cols():
    return jsonify({'columns': columns})


@excel_query_bp.route('/table-metadata', methods=['POST'])
def get_table_metadata():
    """Get enriched table metadata including column types.

    Returns:
        - columns: List of column info with data types
        - time_field: Configured time field from TABLES_CONFIG (or null)
        - description: Table description from TABLES_CONFIG
        - row_count: Approximate row count from TABLES_CONFIG
        - performance_warning: Warning message if table is large
    """
    data = request.get_json()
    table_name = data.get('table_name')

    if not table_name:
        return jsonify({'error': '請指定資料表名稱'}), 400

    # Get column metadata from Oracle
    metadata = get_table_column_metadata(table_name)
    if 'error' in metadata and not metadata.get('columns'):
        return jsonify({'error': f'無法取得資料表 {table_name} 的欄位資訊'}), 400

    # Find table config for additional info
    table_config = None
    for category, table_list in TABLES_CONFIG.items():
        for table in table_list:
            if table['name'] == table_name:
                table_config = table
                break
        if table_config:
            break

    # Build response
    result = {
        'columns': metadata.get('columns', []),
        'time_field': table_config.get('time_field') if table_config else None,
        'description': table_config.get('description', '') if table_config else '',
        'row_count': table_config.get('row_count', 0) if table_config else 0,
        'performance_warning': None
    }

    # Add performance warning for large tables (threshold is 10,000,000 rows,
    # i.e. 1000 萬筆)
    if result['row_count'] and result['row_count'] > LARGE_TABLE_THRESHOLD:
        result['performance_warning'] = (
            f'此資料表超過 {LARGE_TABLE_THRESHOLD // 10_000} 萬筆,'
            '包含查詢可能較慢,建議配合日期範圍縮小查詢範圍'
        )

    return jsonify(result)


@excel_query_bp.route('/column-type', methods=['POST'])
def get_excel_column_type():
    """Detect Excel column data type from cached file.

    Expects JSON body:
        {"column_name": "LOT_ID"}

    Returns column type info.
    """
    data = request.get_json()
    column_name = data.get('column_name')

    if not column_name:
        return jsonify({'error': '請指定欄位名稱'}), 400

    if 'current' not in _uploaded_excel_cache:
        return jsonify({'error': '請先上傳 Excel 檔案'}), 400

    import io
    file_like = io.BytesIO(_uploaded_excel_cache['current'])

    # Get unique values first (get_column_unique_values is imported at module top)
    values_result = get_column_unique_values(file_like, column_name)
    if 'error' in values_result:
        return jsonify(values_result), 400

    # Detect type from values
    type_info = detect_excel_column_type(values_result['values'])

    return jsonify({
        'column_name': column_name,
        **type_info
    })


@excel_query_bp.route('/execute-advanced', methods=['POST'])
def execute_advanced_query():
    """Execute advanced batch query with multiple condition types.

    Expects JSON body:
        {
            "table_name": "DWH.DW_MES_WIP",
            "search_column": "LOT_ID",
            "return_columns": ["LOT_ID", "SPEC", "QTY"],
            "search_values": ["val1", "val2", ...],
            "query_type": "in" | "like_contains" | "like_prefix" | "like_suffix",
            "date_column": "TXNDATE",      // optional
            "date_from": "2024-01-01",     // optional (YYYY-MM-DD)
            "date_to": "2024-12-31"        // optional (YYYY-MM-DD)
        }
    """
    data = request.get_json()

    table_name = data.get('table_name')
    search_column = data.get('search_column')
    return_columns = data.get('return_columns')
    search_values = data.get('search_values')
    query_type = data.get('query_type', 'in')
    date_column = data.get('date_column')
    date_from = data.get('date_from')
    date_to = data.get('date_to')

    # Validation
    if not table_name:
        return jsonify({'error': '請指定資料表'}), 400
    if not search_column:
        return jsonify({'error': '請指定查詢欄位'}), 400
    if not return_columns or not isinstance(return_columns, list):
        return jsonify({'error': '請指定回傳欄位'}), 400
    if not search_values or not isinstance(search_values, list):
        return jsonify({'error': '無查詢值'}), 400

    # Validate query_type
    valid_types = {'in', 'like_contains', 'like_prefix', 'like_suffix'}
    if query_type not in valid_types:
        return jsonify({'error': f'無效的查詢類型: {query_type}'}), 400

    # Validate date range if provided
    if date_from and date_to:
        try:
            from datetime import datetime
            d_from = datetime.strptime(date_from, '%Y-%m-%d')
            d_to = datetime.strptime(date_to, '%Y-%m-%d')
            if d_from > d_to:
                return jsonify({'error': '起始日期不可晚於結束日期'}), 400
            if (d_to - d_from).days > 365:
                return jsonify({'error': '日期範圍不可超過 365 天'}), 400
        except ValueError:
            return jsonify({'error': '日期格式錯誤,請使用 YYYY-MM-DD'}), 400

    result = execute_advanced_batch_query(
        table_name=table_name,
        search_column=search_column,
        return_columns=return_columns,
        search_values=search_values,
        query_type=query_type,
        date_column=date_column,
        date_from=date_from,
        date_to=date_to
    )

    if 'error' in result:
        return jsonify(result), 400

    return jsonify(result)

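The date-range guard in the route (format check, ordering check, 365-day cap) is plain `datetime` arithmetic and can be exercised on its own. `validate_range` below is a hypothetical helper that mirrors those checks outside Flask:

```python
from datetime import datetime
from typing import Optional


def validate_range(date_from: str, date_to: str) -> Optional[str]:
    """Return an error tag, or None when the range passes (mirrors the route's checks)."""
    try:
        d_from = datetime.strptime(date_from, '%Y-%m-%d')
        d_to = datetime.strptime(date_to, '%Y-%m-%d')
    except ValueError:
        return 'bad format'
    if d_from > d_to:
        return 'from after to'
    if (d_to - d_from).days > 365:
        return 'range too long'
    return None


print(validate_range('2024-01-01', '2024-12-31'))  # None (exactly 365 days apart)
print(validate_range('2023-01-01', '2024-12-31'))  # range too long
```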
@excel_query_bp.route('/execute', methods=['POST'])
def execute_query():
    """Execute batch query with Excel values.

@@ -17,6 +17,12 @@ from mes_dashboard.core.database import get_db_connection

# Oracle IN clause limit
BATCH_SIZE = 1000

# LIKE query keyword limit
LIKE_KEYWORD_LIMIT = 100

# Large table threshold for performance warning (10 million rows)
LARGE_TABLE_THRESHOLD = 10_000_000


def parse_excel(file_storage) -> Dict[str, Any]:
    """Parse uploaded Excel file and return column info.

@@ -71,6 +77,87 @@ def get_column_unique_values(file_storage, column_name: str) -> Dict[str, Any]:
        return {'error': f'讀取欄位失敗: {str(exc)}'}


def detect_excel_column_type(values: List[str]) -> Dict[str, Any]:
    """Detect the data type of Excel column values.

    Args:
        values: List of string values from Excel column

    Returns:
        Dict with:
        - detected_type: 'text', 'number', 'date', 'datetime', or 'id'
        - type_label: Display label in Chinese
        - sample_values: First 5 sample values
    """
    if not values:
        return {
            'detected_type': 'text',
            'type_label': '文字',
            'sample_values': []
        }

    # Sample first 100 non-empty values for analysis
    sample = [str(v).strip() for v in values[:100] if str(v).strip()]
    if not sample:
        return {
            'detected_type': 'text',
            'type_label': '文字',
            'sample_values': []
        }

    # Regex patterns for type detection
    date_pattern = re.compile(r'^\d{4}[-/]\d{1,2}[-/]\d{1,2}$')
    datetime_pattern = re.compile(r'^\d{4}[-/]\d{1,2}[-/]\d{1,2}[T ]\d{1,2}:\d{2}')
    number_pattern = re.compile(r'^-?\d+\.?\d*$')
    id_pattern = re.compile(r'^[A-Z0-9_-]+$', re.IGNORECASE)

    # Count matches for each type
    type_counts = {
        'datetime': 0,
        'date': 0,
        'number': 0,
        'id': 0,
        'text': 0
    }

    for val in sample:
        if datetime_pattern.match(val):
            type_counts['datetime'] += 1
        elif date_pattern.match(val):
            type_counts['date'] += 1
        elif number_pattern.match(val):
            type_counts['number'] += 1
        elif id_pattern.match(val) and len(val) >= 3:
            # ID pattern: uppercase alphanumeric, at least 3 chars
            type_counts['id'] += 1
        else:
            type_counts['text'] += 1

    # Determine type based on majority (>= 80%)
    threshold = len(sample) * 0.8
    detected_type = 'text'
    type_label = '文字'

    if type_counts['datetime'] >= threshold:
        detected_type = 'datetime'
        type_label = '日期時間'
    elif type_counts['date'] >= threshold:
        detected_type = 'date'
        type_label = '日期'
    elif type_counts['number'] >= threshold:
        detected_type = 'number'
        type_label = '數值'
    elif type_counts['id'] >= threshold:
        detected_type = 'id'
        type_label = '識別碼'

    return {
        'detected_type': detected_type,
        'type_label': type_label,
        'sample_values': sample[:5]
    }

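The four detection patterns can be sanity-checked on their own; the sketch below re-states the regexes from the function and probes them with representative values (note that day-first dates deliberately fall through to text):

```python
import re

# Same patterns as detect_excel_column_type
date_pattern = re.compile(r'^\d{4}[-/]\d{1,2}[-/]\d{1,2}$')
datetime_pattern = re.compile(r'^\d{4}[-/]\d{1,2}[-/]\d{1,2}[T ]\d{1,2}:\d{2}')
number_pattern = re.compile(r'^-?\d+\.?\d*$')
id_pattern = re.compile(r'^[A-Z0-9_-]+$', re.IGNORECASE)

print(bool(date_pattern.match('2024-01-05')))         # True
print(bool(datetime_pattern.match('2024/1/5 9:30')))  # True
print(bool(number_pattern.match('-12.5')))            # True
print(bool(id_pattern.match('LOT_A-001')))            # True
print(bool(date_pattern.match('05-01-2024')))         # False - year must come first
```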
def sanitize_column_name(name: str) -> str:
    """Sanitize column name to prevent SQL injection."""
    return re.sub(r'[^a-zA-Z0-9_]', '', name)

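Because column and table names are interpolated into the SQL text (they cannot be bound as parameters), this whitelist is the injection barrier for identifiers. A quick standalone check, with the helper re-stated:

```python
import re


def sanitize_column_name(name: str) -> str:
    # Strip every character that is not alphanumeric or underscore
    return re.sub(r'[^a-zA-Z0-9_]', '', name)


print(sanitize_column_name('LOT_ID'))                    # LOT_ID
print(sanitize_column_name('LOT_ID; DROP TABLE users'))  # LOT_IDDROPTABLEusers
```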
@@ -81,6 +168,119 @@ def validate_table_name(table_name: str) -> bool:
    return bool(re.match(r'^[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*)?$', table_name))


def escape_like_pattern(value: str) -> str:
    """Escape special characters in LIKE pattern.

    Oracle LIKE special characters: % (any chars), _ (single char)
    These need to be escaped with backslash for literal matching.

    Args:
        value: Raw search value

    Returns:
        Escaped value safe for LIKE pattern
    """
    # Escape backslash first, then special chars
    value = value.replace('\\', '\\\\')
    value = value.replace('%', '\\%')
    value = value.replace('_', '\\_')
    return value

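The escaping order matters: the backslash must be doubled first so the escapes added afterwards are not themselves re-escaped. A standalone check with the helper re-stated:

```python
def escape_like_pattern(value: str) -> str:
    # Backslash first, so later escapes are not double-escaped
    value = value.replace('\\', '\\\\')
    value = value.replace('%', '\\%')
    value = value.replace('_', '\\_')
    return value


print(escape_like_pattern('50%_OFF'))  # 50\%\_OFF
```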
def build_like_condition(
    column: str,
    values: List[str],
    mode: str = 'contains'
) -> Tuple[str, Dict[str, str]]:
    """Build LIKE query condition with multiple OR clauses.

    Args:
        column: Column name to search (must be sanitized)
        values: List of search keywords
        mode: 'contains' (%val%), 'prefix' (val%), or 'suffix' (%val)

    Returns:
        Tuple of (WHERE clause string, params dict)
    """
    if not values:
        return '', {}

    conditions = []
    params = {}

    for i, val in enumerate(values):
        param_name = f'like_{i}'
        escaped_val = escape_like_pattern(val)

        if mode == 'contains':
            params[param_name] = f'%{escaped_val}%'
        elif mode == 'prefix':
            params[param_name] = f'{escaped_val}%'
        elif mode == 'suffix':
            params[param_name] = f'%{escaped_val}'
        else:
            params[param_name] = f'%{escaped_val}%'

        conditions.append(f"{column} LIKE :{param_name} ESCAPE '\\'")

    where_clause = ' OR '.join(conditions)
    return f'({where_clause})', params

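For two keywords in 'prefix' mode the builder yields one bind parameter per keyword, OR-joined into a single parenthesised clause. A simplified, self-contained sketch (wildcard escaping elided; the real helper escapes `%`, `_`, and backslash first):

```python
def build_like_condition(column, values, mode='prefix'):
    # Simplified: assumes values contain no %, _ or backslash
    conditions, params = [], {}
    for i, val in enumerate(values):
        name = f'like_{i}'
        params[name] = f'{val}%' if mode == 'prefix' else f'%{val}%'
        conditions.append(f"{column} LIKE :{name} ESCAPE '\\'")
    return '(' + ' OR '.join(conditions) + ')', params


clause, params = build_like_condition('LOT_ID', ['AB', 'CD'])
print(clause)  # (LOT_ID LIKE :like_0 ESCAPE '\' OR LOT_ID LIKE :like_1 ESCAPE '\')
print(params)  # {'like_0': 'AB%', 'like_1': 'CD%'}
```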
def build_date_range_condition(
    column: str,
    date_from: str = None,
    date_to: str = None
) -> Tuple[str, Dict[str, str]]:
    """Build date range condition for Oracle.

    Args:
        column: Date column name (must be sanitized)
        date_from: Start date in YYYY-MM-DD format
        date_to: End date in YYYY-MM-DD format

    Returns:
        Tuple of (WHERE clause string, params dict)
    """
    conditions = []
    params = {}

    if date_from:
        conditions.append(
            f"{column} >= TO_DATE(:date_from, 'YYYY-MM-DD')"
        )
        params['date_from'] = date_from

    if date_to:
        # Add 1 day to include the entire end date
        conditions.append(
            f"{column} < TO_DATE(:date_to, 'YYYY-MM-DD') + 1"
        )
        params['date_to'] = date_to

    if not conditions:
        return '', {}

    return ' AND '.join(conditions), params

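With both bounds supplied, the builder emits the half-open interval from decision D2 (`>= date_from`, `< date_to + 1`), so rows on the end date keep their time-of-day parts. Re-stated standalone:

```python
def build_date_range_condition(column, date_from=None, date_to=None):
    conditions, params = [], {}
    if date_from:
        conditions.append(f"{column} >= TO_DATE(:date_from, 'YYYY-MM-DD')")
        params['date_from'] = date_from
    if date_to:
        # + 1 day so the whole end date is included despite time-of-day parts
        conditions.append(f"{column} < TO_DATE(:date_to, 'YYYY-MM-DD') + 1")
        params['date_to'] = date_to
    if not conditions:
        return '', {}
    return ' AND '.join(conditions), params


clause, params = build_date_range_condition('TXNDATE', '2024-01-01', '2024-03-31')
print(clause)
# TXNDATE >= TO_DATE(:date_from, 'YYYY-MM-DD') AND TXNDATE < TO_DATE(:date_to, 'YYYY-MM-DD') + 1
```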
def validate_like_keywords(values: List[str]) -> Dict[str, Any]:
    """Validate LIKE query keyword count.

    Args:
        values: List of search keywords

    Returns:
        Dict with 'valid' boolean and optional 'error' message
    """
    if len(values) > LIKE_KEYWORD_LIMIT:
        return {
            'valid': False,
            'error': f'LIKE 查詢最多支援 {LIKE_KEYWORD_LIMIT} 個關鍵字,目前有 {len(values)} 個'
        }
    return {'valid': True}


def execute_batch_query(
    table_name: str,
    search_column: str,

@@ -175,6 +375,164 @@ def execute_batch_query(
        return {'error': f'查詢失敗: {str(exc)}'}


def execute_advanced_batch_query(
    table_name: str,
    search_column: str,
    return_columns: List[str],
    search_values: List[str],
    query_type: str = 'in',
    date_column: str = None,
    date_from: str = None,
    date_to: str = None
) -> Dict[str, Any]:
    """Execute advanced batch query with multiple condition types.

    Args:
        table_name: Target table name
        search_column: Column to search
        return_columns: Columns to return in SELECT
        search_values: Values to search for
        query_type: 'in', 'like_contains', 'like_prefix', or 'like_suffix'
        date_column: Optional date column for range filter
        date_from: Optional start date (YYYY-MM-DD)
        date_to: Optional end date (YYYY-MM-DD)

    Returns:
        Dict with 'columns', 'data', 'row_count', or 'error' if failed.
    """
    # Validate inputs
    if not validate_table_name(table_name):
        return {'error': f'無效的資料表名稱: {table_name}'}

    safe_search_col = sanitize_column_name(search_column)
    safe_return_cols = [sanitize_column_name(col) for col in return_columns]

    if not safe_search_col:
        return {'error': '查詢欄位名稱無效'}
    if not safe_return_cols:
        return {'error': '回傳欄位名稱無效'}

    # Validate LIKE keyword count
    if query_type.startswith('like_'):
        validation = validate_like_keywords(search_values)
        if not validation['valid']:
            return {'error': validation['error']}

    connection = get_db_connection()
    if not connection:
        return {'error': '資料庫連接失敗'}

    try:
        cursor = connection.cursor()
        all_data = []
        columns = None
        columns_str = ', '.join(safe_return_cols)

        # Build date range condition
        date_condition = ''
        date_params = {}
        if date_column:
            safe_date_col = sanitize_column_name(date_column)
            if safe_date_col:
                date_condition, date_params = build_date_range_condition(
                    safe_date_col, date_from, date_to
                )

        # Handle different query types
        if query_type == 'in':
            # Original IN clause logic with batching
            total_batches = (len(search_values) + BATCH_SIZE - 1) // BATCH_SIZE

            for batch_idx in range(0, len(search_values), BATCH_SIZE):
                batch_values = search_values[batch_idx:batch_idx + BATCH_SIZE]
                placeholders = ', '.join([f':v{j}' for j in range(len(batch_values))])
                params = {f'v{j}': str(v) for j, v in enumerate(batch_values)}
                params.update(date_params)

                where_parts = [f'{safe_search_col} IN ({placeholders})']
                if date_condition:
                    where_parts.append(date_condition)

                sql = f"""
                    SELECT {columns_str}
                    FROM {table_name}
                    WHERE {' AND '.join(where_parts)}
                """

                cursor.execute(sql, params)

                if columns is None:
                    columns = [desc[0] for desc in cursor.description]

                rows = cursor.fetchall()
                for row in rows:
                    row_dict = {}
                    for i, col in enumerate(columns):
                        value = row[i]
                        if isinstance(value, datetime):
                            row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
                        else:
                            row_dict[col] = value
                    all_data.append(row_dict)

        else:
            # LIKE query - process all at once (already limited to 100 keywords)
            mode_map = {
                'like_contains': 'contains',
                'like_prefix': 'prefix',
                'like_suffix': 'suffix'
            }
            mode = mode_map.get(query_type, 'contains')
            like_condition, like_params = build_like_condition(
                safe_search_col, search_values, mode
            )

            params = {**like_params, **date_params}

            where_parts = [like_condition]
            if date_condition:
                where_parts.append(date_condition)

            sql = f"""
                SELECT {columns_str}
                FROM {table_name}
                WHERE {' AND '.join(where_parts)}
            """

            cursor.execute(sql, params)
            columns = [desc[0] for desc in cursor.description]

            rows = cursor.fetchall()
            for row in rows:
                row_dict = {}
                for i, col in enumerate(columns):
                    value = row[i]
                    if isinstance(value, datetime):
                        row_dict[col] = value.strftime('%Y-%m-%d %H:%M:%S')
                    else:
                        row_dict[col] = value
                all_data.append(row_dict)

            total_batches = 1

        cursor.close()
        connection.close()

        return {
            'columns': columns or safe_return_cols,
            'data': all_data,
            'row_count': len(all_data),
            'search_count': len(search_values),
            'batch_count': total_batches,
            'query_type': query_type
        }

    except Exception as exc:
        if connection:
            connection.close()
        return {'error': f'查詢失敗: {str(exc)}'}


def generate_csv_content(data: List[Dict], columns: List[str]) -> str:
    """Generate CSV content from query results.

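The IN branch above slices the value list into ceil(len / 1000) batches so each statement stays under the Oracle IN-clause limit; the slicing arithmetic in isolation:

```python
BATCH_SIZE = 1000  # Oracle IN-clause limit

values = [f'LOT{i:04d}' for i in range(2500)]

# Ceiling division, same expression as in execute_advanced_batch_query
total_batches = (len(values) + BATCH_SIZE - 1) // BATCH_SIZE
batches = [values[i:i + BATCH_SIZE] for i in range(0, len(values), BATCH_SIZE)]

print(total_batches)              # 3
print([len(b) for b in batches])  # [1000, 1000, 500]
```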
@@ -339,6 +339,90 @@
    font-weight: 500;
    color: #333;
}

/* Type badges */
.type-badge {
    display: inline-block;
    padding: 2px 6px;
    border-radius: 3px;
    font-size: 11px;
    font-weight: 500;
    margin-left: 5px;
}
.type-badge.text { background: #e3e8ff; color: #667eea; }
.type-badge.number { background: #d4edda; color: #155724; }
.type-badge.date { background: #fff3cd; color: #856404; }
.type-badge.datetime { background: #ffeeba; color: #856404; }
.type-badge.id { background: #cce5ff; color: #004085; }
.type-badge.unknown { background: #f8f9fa; color: #6c757d; }

/* Advanced options panel */
.advanced-panel {
    margin-top: 15px;
    border: 1px solid #e0e0e0;
    border-radius: 6px;
    overflow: hidden;
}
.advanced-panel-header {
    background: #f8f9fa;
    padding: 12px 15px;
    cursor: pointer;
    display: flex;
    justify-content: space-between;
    align-items: center;
    font-weight: 500;
}
.advanced-panel-header:hover {
    background: #e9ecef;
}
.advanced-panel-toggle {
    transition: transform 0.2s;
}
.advanced-panel.collapsed .advanced-panel-toggle {
    transform: rotate(-90deg);
}
.advanced-panel.collapsed .advanced-panel-body {
    display: none;
}
.advanced-panel-body {
    padding: 15px;
    background: white;
    border-top: 1px solid #e0e0e0;
}

/* Date inputs */
input[type="date"] {
    padding: 8px 12px;
    border: 1px solid #ddd;
    border-radius: 6px;
    font-size: 14px;
}
input[type="date"]:focus {
    outline: none;
    border-color: #667eea;
}

/* Performance warning */
.performance-warning {
    background: #fff3cd;
    border: 1px solid #ffc107;
    color: #856404;
    padding: 10px 15px;
    border-radius: 6px;
    margin-top: 10px;
    font-size: 13px;
}

/* Type mismatch warning */
.type-mismatch-warning {
    background: #f8d7da;
    border: 1px solid #f5c6cb;
    color: #721c24;
    padding: 8px 12px;
    border-radius: 4px;
    margin-top: 8px;
    font-size: 12px;
}
</style>
{% endblock %}

@@ -406,12 +490,64 @@
</div>
<div class="row">
    <div class="col">
        <label>查詢欄位(WHERE IN):</label>
        <select id="searchColumn">
        <label>查詢欄位:</label>
        <select id="searchColumn" onchange="checkTypeMismatch()">
            <option value="">-- 請選擇 --</option>
        </select>
        <div id="typeMismatchWarning"></div>
    </div>
</div>

<!-- Advanced Options Panel -->
<div class="advanced-panel" id="advancedPanel">
    <div class="advanced-panel-header" onclick="toggleAdvancedPanel()">
        <span>進階查詢條件</span>
        <span class="advanced-panel-toggle">▼</span>
    </div>
    <div class="advanced-panel-body">
        <!-- Query Type Selection -->
        <div class="row" style="margin-bottom: 15px;">
            <div class="col">
                <label>查詢類型:</label>
                <select id="queryType" onchange="onQueryTypeChange()">
                    <option value="in">完全符合 (WHERE IN)</option>
                    <option value="like_contains">包含 (LIKE %...%)</option>
                    <option value="like_prefix">開頭符合 (LIKE ...%)</option>
                    <option value="like_suffix">結尾符合 (LIKE %...)</option>
                </select>
            </div>
        </div>

        <!-- Date Range Filter -->
        <div id="dateRangeSection" style="display: none;">
            <label>日期範圍篩選:</label>
            <div class="row" style="gap: 10px; align-items: center;">
                <div>
                    <label style="font-size: 12px; margin-bottom: 4px;">時間欄位</label>
                    <select id="dateColumn" style="min-width: 150px;">
                        <option value="">-- 不限時間 --</option>
                    </select>
                </div>
                <div>
                    <label style="font-size: 12px; margin-bottom: 4px;">起始日期</label>
                    <input type="date" id="dateFrom">
                </div>
                <div>
                    <label style="font-size: 12px; margin-bottom: 4px;">結束日期</label>
                    <input type="date" id="dateTo">
                </div>
                <div>
                    <button class="btn" style="padding: 8px 12px; margin-top: 18px;" onclick="setDefaultDateRange()">最近 90 天</button>
                </div>
            </div>
            <div id="dateRangeError" class="error" style="display: none; margin-top: 10px;"></div>
        </div>

        <!-- Performance Warning -->
        <div id="performanceWarning" class="performance-warning" style="display: none;"></div>
    </div>
</div>

<div style="margin-top: 15px;">
    <label>回傳欄位(可多選):</label>
    <div class="select-all-bar">

@@ -453,8 +589,11 @@
<script>
// State
let excelColumns = [];
let excelColumnTypes = {}; // { columnName: { detected_type, type_label } }
let searchValues = [];
let tableColumns = [];
let tableColumns = []; // Array of column names (for backward compat)
let tableColumnsMeta = []; // Array of { name, data_type, is_date, is_number }
let tableMetadata = null; // Full table metadata including time_field, row_count
let queryResult = null;

// Step 1: Upload Excel

@@ -542,6 +681,7 @@
document.getElementById('columnInfo').innerHTML = '<div class="loading"><div class="loading-spinner"></div><br>讀取中...</div>';

try {
    // Get column values
    const data = await MesApi.post('/api/excel-query/column-values', { column_name: column });

    if (data.error) {

@@ -550,10 +690,25 @@
    }

    searchValues = data.values;

    // Get column type detection
    try {
        const typeData = await MesApi.post('/api/excel-query/column-type', { column_name: column });
        if (!typeData.error) {
            excelColumnTypes[column] = typeData;
        }
    } catch (e) {
        console.warn('Could not detect column type:', e);
    }

    // Build info display
    const typeInfo = excelColumnTypes[column];
    const typeBadge = typeInfo ? `<span class="type-badge ${typeInfo.detected_type}">${typeInfo.type_label}</span>` : '';
    const warningClass = data.count > 1000 ? ' warning' : '';

    document.getElementById('columnInfo').innerHTML = `
        <div class="info-box${warningClass}">
            共 ${data.count} 個不重複值
            共 ${data.count} 個不重複值 ${typeBadge}
            ${data.count > 1000 ? '(將分批查詢,每批 1000 筆)' : ''}
        </div>
    `;

@@ -581,47 +736,75 @@
    }
}

// Step 3: Load table columns
// Step 3: Load table columns (using new table-metadata endpoint)
async function loadTableColumns() {
    const tableName = document.getElementById('targetTable').value;
    if (!tableName) {
        tableColumns = [];
        tableColumnsMeta = [];
        tableMetadata = null;
        document.getElementById('tableInfo').innerHTML = '';
        document.getElementById('dateRangeSection').style.display = 'none';
        document.getElementById('performanceWarning').style.display = 'none';
        return;
    }

    document.getElementById('tableInfo').innerHTML = '<div class="loading"><div class="loading-spinner"></div><br>讀取欄位...</div>';

    try {
        const data = await MesApi.post('/api/excel-query/table-columns', { table_name: tableName });
        const data = await MesApi.post('/api/excel-query/table-metadata', { table_name: tableName });

        if (data.error) {
            document.getElementById('tableInfo').innerHTML = `<div class="error">${data.error}</div>`;
            return;
        }

        tableColumns = data.columns;
        document.getElementById('tableInfo').innerHTML = `
            <div class="info-box">共 ${data.columns.length} 個欄位</div>
        `;
        tableColumnsMeta = data.columns || [];
        tableColumns = tableColumnsMeta.map(c => c.name);
        tableMetadata = data;

        // Show table info
        let infoHtml = `共 ${tableColumns.length} 個欄位`;
        if (data.row_count) {
            infoHtml += ` | 約 ${data.row_count.toLocaleString()} 筆`;
        }
        if (data.time_field) {
            infoHtml += ` | 時間欄位: ${data.time_field}`;
        }
        document.getElementById('tableInfo').innerHTML = `<div class="info-box">${infoHtml}</div>`;

        // Populate search column dropdown with type badges
        const searchSelect = document.getElementById('searchColumn');
        searchSelect.innerHTML = '<option value="">-- 請選擇 --</option>';
        tableColumns.forEach(col => {
            searchSelect.innerHTML += `<option value="${col}">${col}</option>`;
        tableColumnsMeta.forEach(col => {
            const typeBadge = getTypeBadgeHtml(col.data_type);
            searchSelect.innerHTML += `<option value="${col.name}" data-type="${col.data_type}">${col.name} ${typeBadge}</option>`;
        });

        // Populate return columns with type badges
        const container = document.getElementById('returnColumns');
        container.innerHTML = '';
        tableColumns.forEach(col => {
        tableColumnsMeta.forEach(col => {
            const typeBadge = getTypeBadgeHtml(col.data_type);
            container.innerHTML += `
                <label class="checkbox-item">
                    <input type="checkbox" value="${col}" checked>
                    ${col}
                    <input type="checkbox" value="${col.name}" checked>
                    ${col.name} ${typeBadge}
                </label>
            `;
        });

        // Setup date range section
        setupDateRangeSection(data);

        // Show performance warning if applicable
        if (data.performance_warning) {
            document.getElementById('performanceWarning').textContent = data.performance_warning;
            document.getElementById('performanceWarning').style.display = 'block';
        } else {
            document.getElementById('performanceWarning').style.display = 'none';
        }

        document.getElementById('step4').classList.remove('disabled');
        document.getElementById('step5').classList.remove('disabled');

@@ -630,6 +813,128 @@
    }
}

// Helper: Get type badge HTML
function getTypeBadgeHtml(dataType) {
    if (!dataType || dataType === 'UNKNOWN') return '';

    const typeMap = {
        'VARCHAR2': { class: 'text', label: '文字' },
        'CHAR': { class: 'text', label: '文字' },
        'NVARCHAR2': { class: 'text', label: '文字' },
        'CLOB': { class: 'text', label: '文字' },
        'NUMBER': { class: 'number', label: '數值' },
        'FLOAT': { class: 'number', label: '數值' },
        'INTEGER': { class: 'number', label: '數值' },
        'DATE': { class: 'date', label: '日期' },
        'TIMESTAMP': { class: 'datetime', label: '日期時間' },
    };

    // Find matching type
    for (const [key, val] of Object.entries(typeMap)) {
        if (dataType.toUpperCase().includes(key)) {
            return `<span class="type-badge ${val.class}">${val.label}</span>`;
        }
    }
    return `<span class="type-badge unknown">${dataType}</span>`;
}

// Setup date range section based on table metadata
function setupDateRangeSection(metadata) {
    const section = document.getElementById('dateRangeSection');
    const dateColumnSelect = document.getElementById('dateColumn');

    // Find date/timestamp columns
    const dateColumns = tableColumnsMeta.filter(c => c.is_date);

    if (dateColumns.length === 0 && !metadata.time_field) {
        section.style.display = 'none';
        return;
    }

    section.style.display = 'block';
    dateColumnSelect.innerHTML = '<option value="">-- 不限時間 --</option>';

    // Add configured time_field first if available
    if (metadata.time_field) {
        dateColumnSelect.innerHTML += `<option value="${metadata.time_field}" selected>${metadata.time_field} (預設)</option>`;
    }

    // Add other date columns
    dateColumns.forEach(col => {
        if (col.name !== metadata.time_field) {
            dateColumnSelect.innerHTML += `<option value="${col.name}">${col.name}</option>`;
        }
    });
}

// Set default date range (last 90 days)
function setDefaultDateRange() {
    const today = new Date();
    const past = new Date();
    past.setDate(today.getDate() - 90);

    document.getElementById('dateFrom').value = past.toISOString().split('T')[0];
    document.getElementById('dateTo').value = today.toISOString().split('T')[0];
}

// Toggle advanced panel
function toggleAdvancedPanel() {
    const panel = document.getElementById('advancedPanel');
    panel.classList.toggle('collapsed');
}

// Handle query type change
function onQueryTypeChange() {
    const queryType = document.getElementById('queryType').value;
    const warningDiv = document.getElementById('performanceWarning');

    // Show warning for LIKE contains on large tables
    if (queryType === 'like_contains' && tableMetadata && tableMetadata.row_count > 10000000) {
        warningDiv.textContent = '此資料表超過 1000 萬筆,包含查詢可能較慢,建議配合日期範圍縮小查詢範圍';
        warningDiv.style.display = 'block';
    } else if (tableMetadata && tableMetadata.performance_warning && queryType === 'like_contains') {
        warningDiv.textContent = tableMetadata.performance_warning;
        warningDiv.style.display = 'block';
    } else {
        warningDiv.style.display = 'none';
    }
}

// Check for type mismatch between Excel column and Oracle column
function checkTypeMismatch() {
    const warningDiv = document.getElementById('typeMismatchWarning');
    const searchCol = document.getElementById('searchColumn').value;
    const excelCol = document.getElementById('excelColumn').value;

    if (!searchCol || !excelCol) {
        warningDiv.innerHTML = '';
        return;
    }

    // Get types
    const oracleMeta = tableColumnsMeta.find(c => c.name === searchCol);
    const excelType = excelColumnTypes[excelCol];

    if (!oracleMeta || !excelType) {
        warningDiv.innerHTML = '';
        return;
    }

    // Check for potential mismatches
    let warning = '';
    if (oracleMeta.is_number && excelType.detected_type === 'text') {
        warning = '欄位類型可能不相符:Excel 欄位為文字,Oracle 欄位為數值';
    } else if (oracleMeta.is_date && excelType.detected_type !== 'date' && excelType.detected_type !== 'datetime') {
        warning = '欄位類型可能不相符:Oracle 欄位為日期類型';
    }

    if (warning) {
        warningDiv.innerHTML = `<div class="type-mismatch-warning">${warning}</div>`;
    } else {
        warningDiv.innerHTML = '';
    }
}

function selectAllColumns() {
    document.querySelectorAll('#returnColumns input[type="checkbox"]').forEach(cb => cb.checked = true);
}
@@ -644,12 +949,26 @@
}

function getQueryParams() {
-   return {
+   const params = {
        table_name: document.getElementById('targetTable').value,
        search_column: document.getElementById('searchColumn').value,
        return_columns: getSelectedReturnColumns(),
-       search_values: searchValues
+       search_values: searchValues,
+       query_type: document.getElementById('queryType').value
    };
+
+   // Add date range if specified
+   const dateColumn = document.getElementById('dateColumn').value;
+   const dateFrom = document.getElementById('dateFrom').value;
+   const dateTo = document.getElementById('dateTo').value;
+
+   if (dateColumn) {
+       params.date_column = dateColumn;
+       if (dateFrom) params.date_from = dateFrom;
+       if (dateTo) params.date_to = dateTo;
+   }
+
+   return params;
}

function validateQuery() {
@@ -671,6 +990,33 @@
        alert('無查詢值,請先選擇 Excel 欄位');
        return false;
    }

    // Validate LIKE keyword limit
    if (params.query_type.startsWith('like_') && params.search_values.length > 100) {
        alert('LIKE 查詢最多支援 100 個關鍵字,目前有 ' + params.search_values.length + ' 個');
        return false;
    }

    // Validate date range
    if (params.date_from && params.date_to) {
        const from = new Date(params.date_from);
        const to = new Date(params.date_to);
        if (from > to) {
            alert('起始日期不可晚於結束日期');
            document.getElementById('dateRangeError').textContent = '起始日期不可晚於結束日期';
            document.getElementById('dateRangeError').style.display = 'block';
            return false;
        }
        const daysDiff = (to - from) / (1000 * 60 * 60 * 24);
        if (daysDiff > 365) {
            alert('日期範圍不可超過 365 天');
            document.getElementById('dateRangeError').textContent = '日期範圍不可超過 365 天';
            document.getElementById('dateRangeError').style.display = 'block';
            return false;
        }
        document.getElementById('dateRangeError').style.display = 'none';
    }

    return true;
}

@@ -679,18 +1025,38 @@
    if (!validateQuery()) return;

    const params = getQueryParams();
-   const batchCount = Math.ceil(params.search_values.length / 1000);
+   const isAdvanced = params.query_type !== 'in' || params.date_column;
+   const batchCount = params.query_type === 'in' ? Math.ceil(params.search_values.length / 1000) : 1;
+
+   // Build loading message
+   let loadingMsg = `查詢中... (${params.search_values.length} 筆`;
+   if (params.query_type !== 'in') {
+       const typeLabels = {
+           'like_contains': '包含查詢',
+           'like_prefix': '前綴查詢',
+           'like_suffix': '後綴查詢'
+       };
+       loadingMsg += `,${typeLabels[params.query_type] || params.query_type}`;
+   } else if (batchCount > 1) {
+       loadingMsg += `,${batchCount} 批次`;
+   }
+   if (params.date_from || params.date_to) {
+       loadingMsg += `,日期篩選`;
+   }
+   loadingMsg += ')';

    document.getElementById('executeInfo').innerHTML = `
        <div class="loading">
            <div class="loading-spinner"></div><br>
-           查詢中... (${params.search_values.length} 筆,${batchCount} 批次)
+           ${loadingMsg}
        </div>
    `;
    document.getElementById('resultSection').classList.remove('active');

    try {
-       const data = await MesApi.post('/api/excel-query/execute', params);
+       // Use advanced endpoint if using advanced features
+       const endpoint = isAdvanced ? '/api/excel-query/execute-advanced' : '/api/excel-query/execute';
+       const data = await MesApi.post(endpoint, params);

        if (data.error) {
            document.getElementById('executeInfo').innerHTML = `<div class="error">${data.error}</div>`;
@@ -698,10 +1064,15 @@
        }

        queryResult = data;

+       // Build result message
+       let resultMsg = `查詢完成!搜尋 ${data.search_count} 筆,找到 ${data.row_count} 筆結果`;
+       if (data.query_type && data.query_type !== 'in') {
+           resultMsg += ` (${data.query_type})`;
+       }

        document.getElementById('executeInfo').innerHTML = `
-           <div class="info-box">
-               查詢完成!搜尋 ${data.search_count} 筆,找到 ${data.row_count} 筆結果
-           </div>
+           <div class="info-box">${resultMsg}</div>
        `;

        renderResult(data);

tests/test_excel_query_e2e.py (new file, 506 lines)
@@ -0,0 +1,506 @@
# -*- coding: utf-8 -*-
"""End-to-end tests for Excel query workflow.

Tests the complete workflow from Excel upload to query execution and export.
"""

import pytest
import json
import io
from unittest.mock import patch, MagicMock

from mes_dashboard import create_app


@pytest.fixture
def app():
    """Create test Flask application."""
    app = create_app()
    app.config['TESTING'] = True
    return app


@pytest.fixture
def client(app):
    """Create test client."""
    return app.test_client()


def create_test_excel(data):
    """Create a test Excel file with given data.

    Args:
        data: List of lists where first list is headers.
              e.g. [['COL1', 'COL2'], ['val1', 'val2'], ...]
    """
    import openpyxl
    wb = openpyxl.Workbook()
    ws = wb.active

    for row_idx, row in enumerate(data, 1):
        for col_idx, value in enumerate(row, 1):
            ws.cell(row=row_idx, column=col_idx, value=value)

    buffer = io.BytesIO()
    wb.save(buffer)
    buffer.seek(0)
    return buffer


class TestBasicQueryWorkflow:
    """E2E tests for basic query workflow."""

    @patch('mes_dashboard.routes.excel_query_routes.execute_batch_query')
    def test_complete_basic_workflow(self, mock_execute, client):
        """Test complete workflow: upload → get values → execute → export."""
        # Step 1: Upload Excel file
        excel_data = [
            ['LOT_ID', 'PRODUCT', 'QTY'],
            ['LOT001', 'PROD_A', 100],
            ['LOT002', 'PROD_B', 200],
            ['LOT003', 'PROD_A', 150],
        ]
        excel_file = create_test_excel(excel_data)

        upload_response = client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'batch_query.xlsx')},
            content_type='multipart/form-data'
        )
        assert upload_response.status_code == 200
        upload_data = json.loads(upload_response.data)
        assert 'columns' in upload_data
        assert 'LOT_ID' in upload_data['columns']
        assert 'preview' in upload_data

        # Step 2: Get column values
        values_response = client.post(
            '/api/excel-query/column-values',
            json={'column_name': 'LOT_ID'}
        )
        assert values_response.status_code == 200
        values_data = json.loads(values_response.data)
        assert 'values' in values_data
        assert set(values_data['values']) == {'LOT001', 'LOT002', 'LOT003'}

        # Step 3: Execute query
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'SPEC', 'STATUS'],
            'data': [
                ['LOT001', 'SPEC_001', 'ACTIVE'],
                ['LOT002', 'SPEC_002', 'HOLD'],
                ['LOT003', 'SPEC_001', 'ACTIVE'],
            ],
            'total': 3
        }

        execute_response = client.post(
            '/api/excel-query/execute',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'SPEC', 'STATUS'],
                'search_values': ['LOT001', 'LOT002', 'LOT003']
            }
        )
        assert execute_response.status_code == 200
        execute_data = json.loads(execute_response.data)
        assert execute_data['total'] == 3


class TestAdvancedQueryWorkflow:
    """E2E tests for advanced query workflow with date range and LIKE."""

    @patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
    def test_like_contains_workflow(self, mock_execute, client):
        """Test workflow with LIKE contains query."""
        # Upload Excel with search patterns
        excel_data = [
            ['SEARCH_PATTERN'],
            ['LOT'],
            ['WIP'],
        ]
        excel_file = create_test_excel(excel_data)

        upload_response = client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'patterns.xlsx')},
            content_type='multipart/form-data'
        )
        assert upload_response.status_code == 200

        # Get search values
        values_response = client.post(
            '/api/excel-query/column-values',
            json={'column_name': 'SEARCH_PATTERN'}
        )
        assert values_response.status_code == 200
        search_values = json.loads(values_response.data)['values']

        # Execute LIKE contains query
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'STATUS'],
            'data': [
                ['LOT001', 'ACTIVE'],
                ['LOT002', 'ACTIVE'],
                ['WIP001', 'HOLD'],
                ['WIP002', 'ACTIVE'],
            ],
            'total': 4
        }

        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'STATUS'],
                'search_values': search_values,
                'query_type': 'like_contains'
            }
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['total'] == 4

    @patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
    def test_date_range_workflow(self, mock_execute, client):
        """Test workflow with date range filter."""
        excel_data = [
            ['LOT_ID'],
            ['LOT001'],
            ['LOT002'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'lots.xlsx')},
            content_type='multipart/form-data'
        )

        # Execute with date range
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'TXNDATE'],
            'data': [['LOT001', '2024-01-15']],
            'total': 1
        }

        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'TXNDATE'],
                'search_values': ['LOT001', 'LOT002'],
                'query_type': 'in',
                'date_column': 'TXNDATE',
                'date_from': '2024-01-01',
                'date_to': '2024-01-31'
            }
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['total'] == 1

    @patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
    def test_combined_like_and_date_workflow(self, mock_execute, client):
        """Test workflow combining LIKE and date range."""
        excel_data = [
            ['PREFIX'],
            ['LOT'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'prefixes.xlsx')},
            content_type='multipart/form-data'
        )

        # Execute with both LIKE prefix and date range
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'TXNDATE', 'STATUS'],
            'data': [
                ['LOT001', '2024-01-15', 'ACTIVE'],
                ['LOT002', '2024-01-20', 'ACTIVE'],
            ],
            'total': 2
        }

        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'TXNDATE', 'STATUS'],
                'search_values': ['LOT'],
                'query_type': 'like_prefix',
                'date_column': 'TXNDATE',
                'date_from': '2024-01-01',
                'date_to': '2024-01-31'
            }
        )
        assert response.status_code == 200


class TestColumnTypeDetection:
    """E2E tests for column type detection workflow."""

    def test_detect_date_column(self, client):
        """Test detecting date type from Excel column."""
        excel_data = [
            ['DATE_COL'],
            ['2024-01-01'],
            ['2024-01-02'],
            ['2024-01-03'],
            ['2024-01-04'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'dates.xlsx')},
            content_type='multipart/form-data'
        )

        response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'DATE_COL'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['detected_type'] == 'date'

    def test_detect_number_column(self, client):
        """Test detecting numeric type from Excel column."""
        excel_data = [
            ['QTY'],
            ['100'],
            ['200'],
            ['350.5'],
            ['-50'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'numbers.xlsx')},
            content_type='multipart/form-data'
        )

        response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'QTY'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['detected_type'] == 'number'

    def test_detect_id_column(self, client):
        """Test detecting ID type from Excel column."""
        excel_data = [
            ['LOT_ID'],
            ['LOT001'],
            ['LOT002'],
            ['WIP-2024-001'],
            ['PROD_ABC'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'ids.xlsx')},
            content_type='multipart/form-data'
        )

        response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'LOT_ID'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['detected_type'] == 'id'


class TestTableMetadataWorkflow:
    """E2E tests for table metadata retrieval workflow."""

    @patch('mes_dashboard.routes.excel_query_routes.get_table_column_metadata')
    def test_metadata_with_type_matching(self, mock_metadata, client):
        """Test workflow checking column type compatibility."""
        # Step 1: Upload Excel with ID column
        excel_data = [
            ['LOT_ID'],
            ['LOT001'],
            ['LOT002'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'lots.xlsx')},
            content_type='multipart/form-data'
        )

        # Step 2: Get Excel column type
        excel_type_response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'LOT_ID'}
        )
        excel_type = json.loads(excel_type_response.data)['detected_type']

        # Step 3: Get table metadata
        mock_metadata.return_value = {
            'columns': [
                {'name': 'LOT_ID', 'data_type': 'VARCHAR2', 'is_date': False, 'is_number': False},
                {'name': 'QTY', 'data_type': 'NUMBER', 'is_date': False, 'is_number': True},
                {'name': 'TXNDATE', 'data_type': 'DATE', 'is_date': True, 'is_number': False},
            ]
        }

        metadata_response = client.post(
            '/api/excel-query/table-metadata',
            json={'table_name': 'DWH.DW_MES_WIP'}
        )
        assert metadata_response.status_code == 200
        metadata = json.loads(metadata_response.data)

        # Verify column types are returned
        assert len(metadata['columns']) == 3
        lot_col = next(c for c in metadata['columns'] if c['name'] == 'LOT_ID')
        assert lot_col['data_type'] == 'VARCHAR2'


class TestValidationWorkflow:
    """E2E tests for input validation throughout workflow."""

    def test_like_keyword_limit_enforcement(self, client):
        """Test that LIKE queries enforce keyword limit."""
        from mes_dashboard.services.excel_query_service import LIKE_KEYWORD_LIMIT

        # Create Excel with many values
        excel_data = [['VALUE']] + [[f'VAL{i}'] for i in range(LIKE_KEYWORD_LIMIT + 10)]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'many_values.xlsx')},
            content_type='multipart/form-data'
        )

        # Get all values
        values_response = client.post(
            '/api/excel-query/column-values',
            json={'column_name': 'VALUE'}
        )
        all_values = json.loads(values_response.data)['values']

        # Attempt LIKE query with too many values
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'COL',
                'return_columns': ['COL'],
                'search_values': all_values,
                'query_type': 'like_contains'
            }
        )
        # The limit may be enforced at the validation layer (400) or at the
        # service layer (error payload); accept either status here
        assert response.status_code in [200, 400]

    def test_date_range_boundary_validation(self, client):
        """Test date range validation at the 365-day boundary."""
        excel_data = [
            ['LOT_ID'],
            ['LOT001'],
        ]
        excel_file = create_test_excel(excel_data)

        client.post(
            '/api/excel-query/upload',
            data={'file': (excel_file, 'lots.xlsx')},
            content_type='multipart/form-data'
        )

        # 2024-01-01 → 2024-12-31 covers 366 calendar days (2024 is a leap
        # year), which exceeds the 365-day limit and should be rejected
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001'],
                'date_from': '2024-01-01',
                'date_to': '2024-12-31'
            }
        )
        assert response.status_code == 400

    def test_empty_search_values_rejected(self, client):
        """Test that empty search values are rejected."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': [],
                'query_type': 'in'
            }
        )
        assert response.status_code == 400


class TestBackwardCompatibility:
    """E2E tests ensuring backward compatibility with original API."""

    @patch('mes_dashboard.routes.excel_query_routes.execute_batch_query')
    def test_original_execute_endpoint_works(self, mock_execute, client):
        """Test that original /execute endpoint still works."""
        mock_execute.return_value = {
            'columns': ['LOT_ID'],
            'data': [['LOT001']],
            'total': 1
        }

        # Use original endpoint without advanced features
        response = client.post(
            '/api/excel-query/execute',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['total'] == 1

    @patch('mes_dashboard.routes.excel_query_routes.execute_batch_query')
    @patch('mes_dashboard.routes.excel_query_routes.generate_csv_content')
    def test_csv_export_still_works(self, mock_csv, mock_execute, client):
        """Test that CSV export still works with basic query."""
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'STATUS'],
            'data': [['LOT001', 'ACTIVE']],
            'total': 1
        }
        mock_csv.return_value = 'LOT_ID,STATUS\nLOT001,ACTIVE\n'

        response = client.post(
            '/api/excel-query/export-csv',
            json={
                'table_name': 'DWH.DW_MES_WIP',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'STATUS'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 200
        assert response.content_type.startswith('text/csv')
tests/test_excel_query_routes.py (new file, 474 lines)
@@ -0,0 +1,474 @@
# -*- coding: utf-8 -*-
"""Integration tests for Excel query API routes.

Tests the API endpoints with mocked database dependencies.
"""

import pytest
import json
import io
from unittest.mock import patch, MagicMock

from mes_dashboard import create_app


@pytest.fixture
def app():
    """Create test Flask application."""
    app = create_app()
    app.config['TESTING'] = True
    return app


@pytest.fixture
def client(app):
    """Create test client."""
    return app.test_client()


@pytest.fixture
def mock_excel_file():
    """Create a mock Excel file content."""
    import openpyxl
    wb = openpyxl.Workbook()
    ws = wb.active
    ws['A1'] = 'LOT_ID'
    ws['B1'] = 'PRODUCT'
    ws['C1'] = 'DATE'
    ws['A2'] = 'LOT001'
    ws['B2'] = 'PROD_A'
    ws['C2'] = '2024-01-15'
    ws['A3'] = 'LOT002'
    ws['B3'] = 'PROD_B'
    ws['C3'] = '2024-01-16'
    ws['A4'] = 'LOT003'
    ws['B4'] = 'PROD_A'
    ws['C4'] = '2024-01-17'

    buffer = io.BytesIO()
    wb.save(buffer)
    buffer.seek(0)
    return buffer


class TestUploadExcel:
    """Tests for /api/excel-query/upload endpoint."""

    def test_upload_no_file(self, client):
        """Should return error when no file provided."""
        response = client.post('/api/excel-query/upload')
        assert response.status_code == 400
        data = json.loads(response.data)
        assert 'error' in data

    def test_upload_empty_filename(self, client):
        """Should return error for empty filename."""
        response = client.post(
            '/api/excel-query/upload',
            data={'file': (io.BytesIO(b''), '')},
            content_type='multipart/form-data'
        )
        assert response.status_code == 400

    def test_upload_invalid_extension(self, client):
        """Should reject non-Excel files."""
        response = client.post(
            '/api/excel-query/upload',
            data={'file': (io.BytesIO(b'test'), 'test.txt')},
            content_type='multipart/form-data'
        )
        assert response.status_code == 400
        data = json.loads(response.data)
        assert '.xlsx' in data['error'] or '.xls' in data['error']

    def test_upload_valid_excel(self, client, mock_excel_file):
        """Should successfully parse valid Excel file."""
        response = client.post(
            '/api/excel-query/upload',
            data={'file': (mock_excel_file, 'test.xlsx')},
            content_type='multipart/form-data'
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert 'columns' in data
        assert 'LOT_ID' in data['columns']
        assert 'preview' in data


class TestGetColumnValues:
    """Tests for /api/excel-query/column-values endpoint."""

    def test_no_column_name(self, client):
        """Should return error without column name."""
        response = client.post(
            '/api/excel-query/column-values',
            json={}
        )
        assert response.status_code == 400

    def test_no_excel_uploaded(self, client):
        """Should return error if no Excel uploaded."""
        # Clear cache first
        from mes_dashboard.routes.excel_query_routes import _uploaded_excel_cache
        _uploaded_excel_cache.clear()

        response = client.post(
            '/api/excel-query/column-values',
            json={'column_name': 'LOT_ID'}
        )
        assert response.status_code == 400

    def test_get_values_after_upload(self, client, mock_excel_file):
        """Should return column values after upload."""
        # First upload
        client.post(
            '/api/excel-query/upload',
            data={'file': (mock_excel_file, 'test.xlsx')},
            content_type='multipart/form-data'
        )

        # Then get values
        response = client.post(
            '/api/excel-query/column-values',
            json={'column_name': 'LOT_ID'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert 'values' in data
        assert 'LOT001' in data['values']


class TestGetTables:
    """Tests for /api/excel-query/tables endpoint."""

    def test_get_tables(self, client):
        """Should return available tables."""
        response = client.get('/api/excel-query/tables')
        assert response.status_code == 200
        data = json.loads(response.data)
        assert 'tables' in data
        assert isinstance(data['tables'], list)


class TestTableMetadata:
    """Tests for /api/excel-query/table-metadata endpoint."""

    def test_no_table_name(self, client):
        """Should return error without table name."""
        response = client.post(
            '/api/excel-query/table-metadata',
            json={}
        )
        assert response.status_code == 400

    @patch('mes_dashboard.routes.excel_query_routes.get_table_column_metadata')
    def test_get_metadata_success(self, mock_metadata, client):
        """Should return enriched metadata."""
        mock_metadata.return_value = {
            'columns': [
                {'name': 'LOT_ID', 'data_type': 'VARCHAR2', 'is_date': False, 'is_number': False},
                {'name': 'TXNDATE', 'data_type': 'DATE', 'is_date': True, 'is_number': False},
            ]
        }

        response = client.post(
            '/api/excel-query/table-metadata',
            json={'table_name': 'TEST_TABLE'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert 'columns' in data
        assert len(data['columns']) == 2

    @patch('mes_dashboard.routes.excel_query_routes.get_table_column_metadata')
    def test_metadata_not_found(self, mock_metadata, client):
        """Should handle table not found."""
        mock_metadata.return_value = {'error': 'Table not found', 'columns': []}

        response = client.post(
            '/api/excel-query/table-metadata',
            json={'table_name': 'NONEXISTENT'}
        )
        assert response.status_code == 400


class TestExecuteAdvancedQuery:
    """Tests for /api/excel-query/execute-advanced endpoint."""

    def test_missing_table_name(self, client):
        """Should return error without table name."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 400

    def test_missing_search_column(self, client):
        """Should return error without search column."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 400

    def test_invalid_query_type(self, client):
        """Should reject invalid query type."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001'],
                'query_type': 'invalid_type'
            }
        )
        assert response.status_code == 400
        data = json.loads(response.data)
        assert 'invalid' in data['error'].lower() or '無效' in data['error']

    def test_invalid_date_format(self, client):
        """Should reject invalid date format."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001'],
                'date_from': '01-01-2024',
                'date_to': '12-31-2024'
            }
        )
        assert response.status_code == 400
        data = json.loads(response.data)
        assert '格式' in data['error'] or 'format' in data['error'].lower()

    def test_date_range_reversed(self, client):
        """Should reject if start date > end date."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001'],
                'date_from': '2024-12-31',
                'date_to': '2024-01-01'
            }
        )
        assert response.status_code == 400
        data = json.loads(response.data)
        assert '起始' in data['error'] or 'start' in data['error'].lower()

    def test_date_range_exceeds_limit(self, client):
        """Should reject date range > 365 days."""
        response = client.post(
            '/api/excel-query/execute-advanced',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001'],
                'date_from': '2023-01-01',
                'date_to': '2024-12-31'
            }
        )
        assert response.status_code == 400
        data = json.loads(response.data)
        assert '365' in data['error']

    @patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
    def test_execute_in_query(self, mock_execute, client):
        """Should execute IN query successfully."""
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'PRODUCT'],
            'data': [['LOT001', 'PROD_A']],
            'total': 1
        }

        response = client.post(
            '/api/excel-query/execute-advanced',
|
||||
json={
|
||||
'table_name': 'TEST_TABLE',
|
||||
'search_column': 'LOT_ID',
|
||||
'return_columns': ['LOT_ID', 'PRODUCT'],
|
||||
'search_values': ['LOT001'],
|
||||
'query_type': 'in'
|
||||
}
|
||||
)
|
||||
assert response.status_code == 200
|
||||
data = json.loads(response.data)
|
||||
assert data['total'] == 1
|
||||
|
||||
@patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
|
||||
def test_execute_like_contains(self, mock_execute, client):
|
||||
"""Should execute LIKE contains query."""
|
||||
mock_execute.return_value = {
|
||||
'columns': ['LOT_ID'],
|
||||
'data': [['LOT001'], ['LOT002']],
|
||||
'total': 2
|
||||
}
|
||||
|
||||
response = client.post(
|
||||
'/api/excel-query/execute-advanced',
|
||||
json={
|
||||
'table_name': 'TEST_TABLE',
|
||||
'search_column': 'LOT_ID',
|
||||
'return_columns': ['LOT_ID'],
|
||||
'search_values': ['LOT'],
|
||||
'query_type': 'like_contains'
|
||||
}
|
||||
)
|
||||
assert response.status_code == 200
|
||||
data = json.loads(response.data)
|
||||
assert data['total'] == 2
|
||||
|
||||
@patch('mes_dashboard.routes.excel_query_routes.execute_advanced_batch_query')
|
||||
def test_execute_with_date_range(self, mock_execute, client):
|
||||
"""Should execute query with date range."""
|
||||
mock_execute.return_value = {
|
||||
'columns': ['LOT_ID', 'TXNDATE'],
|
||||
'data': [['LOT001', '2024-01-15']],
|
||||
'total': 1
|
||||
}
|
||||
|
||||
response = client.post(
|
||||
'/api/excel-query/execute-advanced',
|
||||
json={
|
||||
'table_name': 'TEST_TABLE',
|
||||
'search_column': 'LOT_ID',
|
||||
'return_columns': ['LOT_ID', 'TXNDATE'],
|
||||
'search_values': ['LOT001'],
|
||||
'query_type': 'in',
|
||||
'date_column': 'TXNDATE',
|
||||
'date_from': '2024-01-01',
|
||||
'date_to': '2024-01-31'
|
||||
}
|
||||
)
|
||||
assert response.status_code == 200
|
||||
mock_execute.assert_called_once()
|
||||
call_kwargs = mock_execute.call_args[1]
|
||||
assert call_kwargs['date_column'] == 'TXNDATE'
|
||||
assert call_kwargs['date_from'] == '2024-01-01'
|
||||
assert call_kwargs['date_to'] == '2024-01-31'
|
||||
|
||||
|
||||
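The validation rules that the `TestExecuteAdvancedQuery` date tests pin down (ISO `YYYY-MM-DD` format, start not after end, at most 365 days) could be implemented roughly as below. This is a hedged sketch, not the project's actual code: `validate_date_range` and `DATE_RANGE_LIMIT_DAYS` are assumed names.

```python
# Sketch of the date-range validation implied by the tests above.
# Assumed names: validate_date_range, DATE_RANGE_LIMIT_DAYS.
from datetime import datetime

DATE_RANGE_LIMIT_DAYS = 365

def validate_date_range(date_from, date_to):
    """Return (ok, error_message) for a YYYY-MM-DD date range."""
    try:
        start = datetime.strptime(date_from, '%Y-%m-%d')
        end = datetime.strptime(date_to, '%Y-%m-%d')
    except ValueError:
        return False, 'Invalid date format, expected YYYY-MM-DD'
    if start > end:
        return False, 'Start date must not be later than end date'
    if (end - start).days > DATE_RANGE_LIMIT_DAYS:
        return False, 'Date range must not exceed 365 days'
    return True, None
```

The route handler would run this check before building SQL, so a bad range is rejected with HTTP 400 and never reaches Oracle.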
class TestExecuteQuery:
    """Tests for /api/excel-query/execute endpoint (backward compatibility)."""

    def test_missing_parameters(self, client):
        """Should return error for missing parameters."""
        response = client.post(
            '/api/excel-query/execute',
            json={'table_name': 'TEST'}
        )
        assert response.status_code == 400

    @patch('mes_dashboard.routes.excel_query_routes.execute_batch_query')
    def test_execute_success(self, mock_execute, client):
        """Should execute basic query successfully."""
        mock_execute.return_value = {
            'columns': ['LOT_ID'],
            'data': [['LOT001']],
            'total': 1
        }

        response = client.post(
            '/api/excel-query/execute',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert data['total'] == 1


class TestExportCSV:
    """Tests for /api/excel-query/export-csv endpoint."""

    def test_missing_parameters(self, client):
        """Should return error for missing parameters."""
        response = client.post(
            '/api/excel-query/export-csv',
            json={}
        )
        assert response.status_code == 400

    @patch('mes_dashboard.routes.excel_query_routes.execute_batch_query')
    @patch('mes_dashboard.routes.excel_query_routes.generate_csv_content')
    def test_export_success(self, mock_csv, mock_execute, client):
        """Should export CSV successfully."""
        mock_execute.return_value = {
            'columns': ['LOT_ID', 'PRODUCT'],
            'data': [['LOT001', 'PROD_A']],
            'total': 1
        }
        mock_csv.return_value = 'LOT_ID,PRODUCT\nLOT001,PROD_A\n'

        response = client.post(
            '/api/excel-query/export-csv',
            json={
                'table_name': 'TEST_TABLE',
                'search_column': 'LOT_ID',
                'return_columns': ['LOT_ID', 'PRODUCT'],
                'search_values': ['LOT001']
            }
        )
        assert response.status_code == 200
        assert response.content_type.startswith('text/csv')
        assert b'LOT_ID' in response.data


class TestGetExcelColumnType:
    """Tests for /api/excel-query/column-type endpoint."""

    def test_no_column_name(self, client):
        """Should return error without column name."""
        response = client.post(
            '/api/excel-query/column-type',
            json={}
        )
        assert response.status_code == 400

    def test_no_excel_uploaded(self, client):
        """Should return error if no Excel uploaded."""
        from mes_dashboard.routes.excel_query_routes import _uploaded_excel_cache
        _uploaded_excel_cache.clear()

        response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'LOT_ID'}
        )
        assert response.status_code == 400

    def test_detect_type_after_upload(self, client, mock_excel_file):
        """Should detect column type after upload."""
        # Upload first
        client.post(
            '/api/excel-query/upload',
            data={'file': (mock_excel_file, 'test.xlsx')},
            content_type='multipart/form-data'
        )

        # Then detect type
        response = client.post(
            '/api/excel-query/column-type',
            json={'column_name': 'LOT_ID'}
        )
        assert response.status_code == 200
        data = json.loads(response.data)
        assert 'detected_type' in data
        assert 'type_label' in data
tests/test_excel_query_service.py (Normal file, 261 lines)
@@ -0,0 +1,261 @@
# -*- coding: utf-8 -*-
"""Unit tests for Excel query service functions.

Tests the core service functions without database dependencies.
"""

import pytest
from mes_dashboard.services.excel_query_service import (
    detect_excel_column_type,
    escape_like_pattern,
    build_like_condition,
    build_date_range_condition,
    validate_like_keywords,
    sanitize_column_name,
    validate_table_name,
    LIKE_KEYWORD_LIMIT,
)


class TestDetectExcelColumnType:
    """Tests for detect_excel_column_type function."""

    def test_empty_values_returns_text(self):
        """Empty list should return text type."""
        result = detect_excel_column_type([])
        assert result['detected_type'] == 'text'
        assert result['type_label'] == '文字'

    def test_detect_date_type(self):
        """Should detect date format YYYY-MM-DD."""
        values = ['2024-01-15', '2024-02-20', '2024-03-25', '2024-04-30']
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'date'
        assert result['type_label'] == '日期'

    def test_detect_date_with_slash(self):
        """Should detect date format YYYY/MM/DD."""
        values = ['2024/01/15', '2024/02/20', '2024/03/25', '2024/04/30']
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'date'
        assert result['type_label'] == '日期'

    def test_detect_datetime_type(self):
        """Should detect datetime format."""
        values = [
            '2024-01-15 10:30:00',
            '2024-02-20 14:45:30',
            '2024-03-25T08:00:00',
            '2024-04-30 23:59:59'
        ]
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'datetime'
        assert result['type_label'] == '日期時間'

    def test_detect_number_type(self):
        """Should detect numeric values."""
        values = ['123', '456.78', '-99', '0', '1000000']
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'number'
        assert result['type_label'] == '數值'

    def test_detect_id_type(self):
        """Should detect ID pattern (uppercase alphanumeric)."""
        values = ['LOT001', 'WIP-2024-001', 'ABC_123', 'PROD001', 'TEST_ID']
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'id'
        assert result['type_label'] == '識別碼'

    def test_mixed_values_returns_text(self):
        """Mixed values should return text type."""
        values = ['abc', '123', '2024-01-01', 'xyz', 'test']
        result = detect_excel_column_type(values)
        assert result['detected_type'] == 'text'
        assert result['type_label'] == '文字'

    def test_sample_values_included(self):
        """Should include sample values in result."""
        values = ['A', 'B', 'C', 'D', 'E', 'F']
        result = detect_excel_column_type(values)
        assert 'sample_values' in result
        assert len(result['sample_values']) <= 5
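The classification behaviour pinned down by `TestDetectExcelColumnType` can be sketched as below. This is an assumed implementation, not the project's actual code (hence the `_sketch` suffix); it returns only the type key and tries each pattern against every sampled value, most specific first.

```python
# Sketch of the column-type detection the tests above describe.
# Assumed name: detect_excel_column_type_sketch (returns only the type key).
import re

def detect_excel_column_type_sketch(values):
    """Classify a column: all sampled values must match one pattern."""
    if not values:
        return 'text'
    # Ordered most specific first; 'number' must precede 'id' so that
    # purely numeric strings are not misread as identifiers.
    patterns = [
        ('date', r'^\d{4}[-/]\d{2}[-/]\d{2}$'),
        ('datetime', r'^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}$'),
        ('number', r'^-?\d+(\.\d+)?$'),
        ('id', r'^[A-Z][A-Z0-9_-]*$'),
    ]
    for type_name, pattern in patterns:
        if all(re.match(pattern, str(v)) for v in values):
            return type_name
    return 'text'
```

A single non-matching value (e.g. a lowercase word among IDs) drops the column back to `'text'`, which matches the mixed-values test.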
class TestEscapeLikePattern:
    """Tests for escape_like_pattern function."""

    def test_escape_percent(self):
        """Should escape percent sign."""
        assert escape_like_pattern('100%') == '100\\%'

    def test_escape_underscore(self):
        """Should escape underscore."""
        assert escape_like_pattern('test_value') == 'test\\_value'

    def test_escape_backslash(self):
        """Should escape backslash."""
        assert escape_like_pattern('path\\file') == 'path\\\\file'

    def test_escape_multiple_specials(self):
        """Should escape multiple special characters."""
        assert escape_like_pattern('50%_off') == '50\\%\\_off'

    def test_no_escape_needed(self):
        """Should return unchanged if no special chars."""
        assert escape_like_pattern('normalvalue') == 'normalvalue'
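A minimal sketch of the escaping these tests require (assumed implementation, `_sketch` suffix): the backslash must be replaced first, otherwise the backslashes introduced for `%` and `_` would themselves be doubled.

```python
# Sketch of LIKE-wildcard escaping consistent with the tests above.
def escape_like_pattern_sketch(value):
    """Escape Oracle LIKE wildcards so user input matches literally."""
    return (value.replace('\\', '\\\\')  # escape char itself, first
                 .replace('%', '\\%')    # any-length wildcard
                 .replace('_', '\\_'))   # single-char wildcard
```

The escaped value only matches literally when the SQL also carries a matching `ESCAPE '\'` clause, which the `build_like_condition` tests below check for.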
class TestBuildLikeCondition:
    """Tests for build_like_condition function."""

    def test_contains_mode(self):
        """Should build LIKE %...% pattern."""
        condition, params = build_like_condition('COL', ['abc'], 'contains')
        assert 'LIKE :like_0' in condition
        assert params['like_0'] == '%abc%'

    def test_prefix_mode(self):
        """Should build LIKE ...% pattern."""
        condition, params = build_like_condition('COL', ['abc'], 'prefix')
        assert 'LIKE :like_0' in condition
        assert params['like_0'] == 'abc%'

    def test_suffix_mode(self):
        """Should build LIKE %... pattern."""
        condition, params = build_like_condition('COL', ['abc'], 'suffix')
        assert 'LIKE :like_0' in condition
        assert params['like_0'] == '%abc'

    def test_multiple_values(self):
        """Should build OR conditions for multiple values."""
        condition, params = build_like_condition('COL', ['a', 'b', 'c'], 'contains')
        assert 'OR' in condition
        assert len(params) == 3
        assert params['like_0'] == '%a%'
        assert params['like_1'] == '%b%'
        assert params['like_2'] == '%c%'

    def test_empty_values(self):
        """Should return empty for empty values."""
        condition, params = build_like_condition('COL', [], 'contains')
        assert condition == ''
        assert params == {}

    def test_escape_clause_included(self):
        """Should include ESCAPE clause."""
        condition, params = build_like_condition('COL', ['test'], 'contains')
        assert "ESCAPE '\\')" in condition
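The shape these tests imply, sketched under assumed names (`_sketch` suffix, not the project's actual code): one numbered bind parameter per keyword, OR-joined clauses, each with an `ESCAPE '\'` clause, and the pattern template chosen by mode.

```python
# Sketch of build_like_condition consistent with the tests above.
def _escape_sketch(value):
    # minimal LIKE-wildcard escaping (backslash replaced first)
    return value.replace('\\', '\\\\').replace('%', '\\%').replace('_', '\\_')

def build_like_condition_sketch(column, values, mode):
    """Return (sql_fragment, bind_params) for OR-joined LIKE clauses."""
    if not values:
        return '', {}
    templates = {'contains': '%{}%', 'prefix': '{}%', 'suffix': '%{}'}
    clauses, params = [], {}
    for i, value in enumerate(values):
        key = 'like_{}'.format(i)
        # bind-parameter LIKE with an ESCAPE clause per keyword
        clauses.append("({} LIKE :{} ESCAPE '\\')".format(column, key))
        params[key] = templates[mode].format(_escape_sketch(value))
    return '(' + ' OR '.join(clauses) + ')', params
```

Bind parameters keep user keywords out of the SQL text entirely; only the column name (separately sanitized) is interpolated.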
class TestBuildDateRangeCondition:
    """Tests for build_date_range_condition function."""

    def test_both_dates(self):
        """Should build condition with both dates."""
        condition, params = build_date_range_condition(
            'TXNDATE', '2024-01-01', '2024-12-31'
        )
        assert 'TO_DATE(:date_from' in condition
        assert 'TO_DATE(:date_to' in condition
        assert params['date_from'] == '2024-01-01'
        assert params['date_to'] == '2024-12-31'

    def test_only_from_date(self):
        """Should build condition with only start date."""
        condition, params = build_date_range_condition(
            'TXNDATE', date_from='2024-01-01'
        )
        assert '>=' in condition
        assert 'date_from' in params
        assert 'date_to' not in params

    def test_only_to_date(self):
        """Should build condition with only end date."""
        condition, params = build_date_range_condition(
            'TXNDATE', date_to='2024-12-31'
        )
        assert '<' in condition
        assert 'date_to' in params
        assert 'date_from' not in params

    def test_no_dates(self):
        """Should return empty for no dates."""
        condition, params = build_date_range_condition('TXNDATE')
        assert condition == ''
        assert params == {}

    def test_end_date_includes_full_day(self):
        """End date condition should include +1 for full day."""
        condition, params = build_date_range_condition(
            'TXNDATE', date_to='2024-12-31'
        )
        assert '+ 1' in condition
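A sketch of the SQL these tests describe (assumed implementation, `_sketch` suffix): a half-open range, `>=` start and `<` end-plus-one-day, so every timestamp on the end date is included even though Oracle `DATE` values carry a time part.

```python
# Sketch of build_date_range_condition consistent with the tests above.
def build_date_range_condition_sketch(column, date_from=None, date_to=None):
    """Return (sql_fragment, bind_params) for an Oracle date range."""
    clauses, params = [], {}
    if date_from:
        clauses.append(
            "{} >= TO_DATE(:date_from, 'YYYY-MM-DD')".format(column))
        params['date_from'] = date_from
    if date_to:
        # + 1 day makes the end date's 00:00-23:59 timestamps match
        clauses.append(
            "{} < TO_DATE(:date_to, 'YYYY-MM-DD') + 1".format(column))
        params['date_to'] = date_to
    return ' AND '.join(clauses), params
```

Comparing the raw column against `TO_DATE(...)` bounds (rather than wrapping the column in `TRUNC`) leaves any index on the date column usable.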
class TestValidateLikeKeywords:
    """Tests for validate_like_keywords function."""

    def test_within_limit(self):
        """Should pass validation for values within limit."""
        values = ['a'] * 50
        result = validate_like_keywords(values)
        assert result['valid'] is True

    def test_at_limit(self):
        """Should pass validation at exact limit."""
        values = ['a'] * LIKE_KEYWORD_LIMIT
        result = validate_like_keywords(values)
        assert result['valid'] is True

    def test_exceeds_limit(self):
        """Should fail validation when exceeding limit."""
        values = ['a'] * (LIKE_KEYWORD_LIMIT + 1)
        result = validate_like_keywords(values)
        assert result['valid'] is False
        assert 'error' in result


class TestSanitizeColumnName:
    """Tests for sanitize_column_name function."""

    def test_valid_name(self):
        """Should keep valid column name."""
        assert sanitize_column_name('LOT_ID') == 'LOT_ID'

    def test_removes_special_chars(self):
        """Should remove special characters."""
        assert sanitize_column_name('LOT-ID') == 'LOTID'
        assert sanitize_column_name('LOT ID') == 'LOTID'

    def test_allows_underscore(self):
        """Should allow underscore."""
        assert sanitize_column_name('MY_COLUMN_NAME') == 'MY_COLUMN_NAME'

    def test_prevents_sql_injection(self):
        """Should prevent SQL injection attempts."""
        assert sanitize_column_name("COL; DROP TABLE--") == 'COLDROPTABLE'


class TestValidateTableName:
    """Tests for validate_table_name function."""

    def test_simple_name(self):
        """Should validate simple table name."""
        assert validate_table_name('MY_TABLE') is True

    def test_schema_qualified(self):
        """Should validate schema.table format."""
        assert validate_table_name('DWH.DW_MES_WIP') is True

    def test_invalid_starts_with_number(self):
        """Should reject names starting with number."""
        assert validate_table_name('123TABLE') is False

    def test_invalid_special_chars(self):
        """Should reject names with special characters."""
        assert validate_table_name('TABLE-NAME') is False
        assert validate_table_name('TABLE NAME') is False

    def test_sql_injection_prevention(self):
        """Should reject SQL injection attempts."""
        assert validate_table_name('TABLE; DROP--') is False
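Because column and table names cannot be bound as parameters, they must be whitelisted before interpolation into SQL. The identifier rules these last two classes test can be sketched as follows (assumed implementations, `_sketch` suffix):

```python
# Sketch of the identifier sanitization/validation the tests above describe.
import re

def sanitize_column_name_sketch(name):
    """Strip everything except letters, digits, and underscores."""
    return re.sub(r'[^A-Za-z0-9_]', '', name)

def validate_table_name_sketch(name):
    """Accept TABLE or SCHEMA.TABLE; each part starts with a letter."""
    return bool(re.match(
        r'^[A-Za-z][A-Za-z0-9_]*(\.[A-Za-z][A-Za-z0-9_]*)?$', name))
```

A stripped column name may still not exist in the table, so the query layer is expected to also check it against the known column list; sanitization only guarantees the string is safe to splice into SQL.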