Files
hr-position-system/test_ollama_final.py
DonaldFang 方士碩 12ceccc3d3 refactor: add ui.js and main.js modules, enable ES6 Modules
New files:
- js/ui.js - UI operations, module switching, preview updates, form data collection
- js/main.js - main program initialization, event listener setup, keyboard shortcuts

Updated files:
- index.html - loads ES6 modules (type="module")

Features:
- Module switching
- Tab switching
- Form field listeners
- JSON preview updates
- Keyboard shortcut support (Ctrl+S, Ctrl+N)
- User info loading
- Logout

Notes:
- Most of the JavaScript code is still in the HTML (about 2,400 lines)
- The core module architecture is in place, enabling gradual migration later
- Uses ES6 Modules, so the app must be served over an HTTP server

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-12-05 17:18:28 +08:00
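Because the page now loads `type="module"` scripts, opening index.html directly from disk will fail (browsers block module imports from `file://` URLs). A minimal sketch of serving the project over HTTP with Python's standard library — the directory and port here are assumptions, not taken from the repo:

```python
# Serve the project root over HTTP so ES6 module scripts can load.
# Directory "." and port 8000 are illustrative defaults.
import http.server
import threading

def serve(directory=".", port=8000):
    """Start a background HTTP server rooted at `directory`."""
    def handler(*args, **kwargs):
        return http.server.SimpleHTTPRequestHandler(
            *args, directory=directory, **kwargs)
    httpd = http.server.ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd

if __name__ == "__main__":
    httpd = serve()
    print(f"Serving on http://127.0.0.1:{httpd.server_address[1]}/")
```

This is equivalent to running `python -m http.server 8000` from the project root.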


"""
Final Ollama API Integration Test
Tests the integration with the Flask app
"""
import requests
import json
import sys
# Set UTF-8 encoding for output
if sys.platform == 'win32':
    import codecs
    sys.stdout = codecs.getwriter('utf-8')(sys.stdout.buffer, 'strict')
print("=" * 60)
print("Ollama API Integration Test (via Flask App)")
print("=" * 60)
print()
# Test 1: Test Ollama connection status
print("Test 1: Checking Ollama API configuration...")
try:
    response = requests.get("http://localhost:5000/api/llm/config", timeout=10)
    if response.status_code == 200:
        config = response.json()
        ollama_config = config.get('ollama', {})
        print(f" Name: {ollama_config.get('name', 'N/A')}")
        print(f" Enabled: {ollama_config.get('enabled', False)}")
        print(f" Endpoint: {ollama_config.get('endpoint', 'N/A')}")
        print(" Status: ✓ Configuration OK")
    else:
        print(f" Status: ✗ Error {response.status_code}")
except Exception as e:
    print(f" Status: ✗ Error: {str(e)}")
print()
# Test 2: Generate text using Ollama
print("Test 2: Testing text generation with Ollama...")
try:
    payload = {
        "api": "ollama",
        "prompt": "請用中文回答:你好嗎?",
        "max_tokens": 100
    }
    response = requests.post(
        "http://localhost:5000/api/llm/generate",
        json=payload,
        headers={'Content-Type': 'application/json'},
        timeout=60
    )
    print(f" Status Code: {response.status_code}")
    result = response.json()
    if result.get('success'):
        text = result.get('text', '')
        print(" Status: ✓ Generation successful")
        print(f" Response length: {len(text)} characters")
        print(f" Response preview: {text[:100]}...")
        # Save full response to file
        with open('ollama_response.txt', 'w', encoding='utf-8') as f:
            f.write(text)
        print(" Full response saved to: ollama_response.txt")
    else:
        error = result.get('error', 'Unknown error')
        print(" Status: ✗ Generation failed")
        print(f" Error: {error}")
except Exception as e:
    print(f" Status: ✗ Error: {str(e)}")
print()
# Test 3: Test with English prompt
print("Test 3: Testing with English prompt...")
try:
    payload = {
        "api": "ollama",
        "prompt": "Write a haiku about coding.",
        "max_tokens": 100
    }
    response = requests.post(
        "http://localhost:5000/api/llm/generate",
        json=payload,
        headers={'Content-Type': 'application/json'},
        timeout=60
    )
    result = response.json()
    if result.get('success'):
        text = result.get('text', '')
        print(" Status: ✓ Generation successful")
        print(f" Response:\n{text}")
    else:
        error = result.get('error', 'Unknown error')
        print(" Status: ✗ Generation failed")
        print(f" Error: {error}")
except Exception as e:
    print(f" Status: ✗ Error: {str(e)}")
print()
print("=" * 60)
print("Integration test completed!")
print("=" * 60)