feat: Migrate to MySQL and add unified environment configuration

## Database Migration (SQLite → MySQL)
- Add Alembic migration framework (upgrade sketch below)
- Add 'tr_' prefix to all tables to avoid conflicts in shared database
- Remove SQLite support, use MySQL exclusively
- Add pymysql driver dependency
- Change ad_token column to Text type for long JWT tokens
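Tables are no longer created at application startup; the schema is applied with Alembic (the `main.py` hunk further down points to `alembic upgrade head`). A minimal sketch of the equivalent Python API call, assuming the repo's `alembic.ini` is in the working directory:

```python
# Apply all pending migrations, same effect as `alembic upgrade head`.
# Assumes DATABASE_URL is set; alembic/env.py injects it into sqlalchemy.url.
from alembic import command
from alembic.config import Config

alembic_cfg = Config("alembic.ini")
command.upgrade(alembic_cfg, "head")
```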

## Unified Environment Configuration
- Centralize all hardcoded settings to environment variables
- Backend: Extend Settings class in app/core/config.py (usage sketch after this list)
- Frontend: Use Vite environment variables (import.meta.env)
- Docker: Move credentials to environment variables
- Update .env.example files with comprehensive documentation
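On the backend, call sites read these values through the `Settings` helpers instead of hardcoded constants. A short usage sketch based on the accessors added to `app/core/config.py` in this commit:

```python
from app.core.config import get_settings

settings = get_settings()                          # cached Settings instance
origins = settings.get_cors_origins()              # e.g. ["http://localhost:3000"]
image_limit = settings.get_image_max_size_bytes()  # IMAGE_MAX_SIZE_MB * 1024 * 1024
settings.configure_logging()                       # applies LOG_LEVEL to the root logger
```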

## Test Organization
- Move root-level test files to tests/ directory:
  - test_chat_room.py → tests/test_chat_room.py
  - test_websocket.py → tests/test_websocket.py
  - test_realtime_implementation.py → tests/test_realtime_implementation.py
- Fix path references in test_realtime_implementation.py

Breaking Changes:
- CORS now requires explicit origins (no more wildcard)
- All database tables renamed with 'tr_' prefix (verification sketch below)
- SQLite no longer supported
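Because the migration creates fresh `tr_`-prefixed tables rather than renaming existing ones, it is worth confirming the schema after `alembic upgrade head`. A hedged verification sketch using SQLAlchemy's inspector (the connection string is a placeholder):

```python
from sqlalchemy import create_engine, inspect

# Placeholder URL - substitute your real DATABASE_URL
engine = create_engine("mysql+pymysql://user:password@localhost:3306/task_reporter?charset=utf8mb4")
expected = {
    "tr_users", "tr_user_sessions", "tr_incident_rooms", "tr_room_members",
    "tr_room_templates", "tr_messages", "tr_message_reactions",
    "tr_message_edit_history", "tr_room_files", "tr_generated_reports",
    "tr_alembic_version",
}
missing = expected - set(inspect(engine).get_table_names())
print("missing tables:", sorted(missing) or "none")
```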

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in: egg
Committed: 2025-12-07 14:15:11 +08:00
Commit: 92834dbe0e (parent: 1d5d4d447d)
39 changed files with 1558 additions and 136 deletions

.env.docker.example (new file)

@@ -0,0 +1,23 @@
# =============================================================================
# Task Reporter - Docker Environment Configuration
# =============================================================================
# Copy this file to .env.docker and customize for your deployment.
# Use with: docker-compose -f docker-compose.minio.yml --env-file .env.docker up -d
# =============================================================================
# -----------------------------------------------------------------------------
# MinIO Configuration
# -----------------------------------------------------------------------------
# MinIO admin username
# IMPORTANT: Change this in production!
MINIO_ROOT_USER=minioadmin
# MinIO admin password
# IMPORTANT: Use a strong password in production!
MINIO_ROOT_PASSWORD=minioadmin
# MinIO S3 API port (default: 9000)
MINIO_API_PORT=9000
# MinIO Web Console port (default: 9001)
MINIO_CONSOLE_PORT=9001


@@ -1,33 +1,135 @@
# =============================================================================
# Task Reporter - Backend Environment Configuration
# =============================================================================
# Copy this file to .env and fill in the required values.
# Required fields are marked with (Required), optional fields have defaults.
# =============================================================================
# -----------------------------------------------------------------------------
# Database Configuration
DATABASE_URL=postgresql://dev:dev123@localhost:5432/task_reporter
# For development with SQLite (comment out DATABASE_URL above and use this):
# DATABASE_URL=sqlite:///./task_reporter.db
# -----------------------------------------------------------------------------
# (Required) MySQL database connection string
# Format: mysql+pymysql://user:password@host:port/database?charset=utf8mb4
# Note: All tables use 'tr_' prefix to avoid conflicts in shared database
DATABASE_URL=mysql+pymysql://user:password@localhost:3306/task_reporter?charset=utf8mb4
# Security
FERNET_KEY= # Generate with: python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
# -----------------------------------------------------------------------------
# Security Configuration
# -----------------------------------------------------------------------------
# (Required) Fernet encryption key for session token encryption
# Generate with: python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
FERNET_KEY=
# AD API
# -----------------------------------------------------------------------------
# Server Configuration
# -----------------------------------------------------------------------------
# Server bind address (default: 0.0.0.0)
HOST=0.0.0.0
# Server port (default: 8000)
PORT=8000
# Debug mode - set to False in production (default: False)
DEBUG=True
# Log level: DEBUG, INFO, WARNING, ERROR, CRITICAL (default: INFO)
LOG_LEVEL=INFO
# -----------------------------------------------------------------------------
# CORS Configuration
# -----------------------------------------------------------------------------
# (Required for production) Comma-separated list of allowed CORS origins
# Example: http://localhost:3000,https://your-domain.com
# WARNING: Never use "*" in production - always specify allowed origins
CORS_ORIGINS=http://localhost:3000
# -----------------------------------------------------------------------------
# System Administration
# -----------------------------------------------------------------------------
# System administrator email with special permissions (bypass room membership checks)
# Leave empty if no system admin is needed
SYSTEM_ADMIN_EMAIL=
# -----------------------------------------------------------------------------
# AD Authentication API
# -----------------------------------------------------------------------------
# (Required) Active Directory authentication API URL
AD_API_URL=https://pj-auth-api.vercel.app/api/auth/login
# AD API request timeout in seconds (default: 10)
AD_API_TIMEOUT_SECONDS=10
# -----------------------------------------------------------------------------
# Session Settings
# -----------------------------------------------------------------------------
# Session inactivity timeout in days (default: 3)
SESSION_INACTIVITY_DAYS=3
# Token refresh threshold in minutes (default: 5)
TOKEN_REFRESH_THRESHOLD_MINUTES=5
# Maximum token refresh attempts (default: 3)
MAX_REFRESH_ATTEMPTS=3
# -----------------------------------------------------------------------------
# Realtime Messaging Settings
# -----------------------------------------------------------------------------
# Message edit time limit in minutes - users can edit messages within this window (default: 15)
MESSAGE_EDIT_TIME_LIMIT_MINUTES=15
# Typing indicator timeout in seconds (default: 3)
TYPING_TIMEOUT_SECONDS=3
# -----------------------------------------------------------------------------
# File Upload Limits
# -----------------------------------------------------------------------------
# Maximum image file size in MB (default: 10)
IMAGE_MAX_SIZE_MB=10
# Maximum document file size in MB (default: 20)
DOCUMENT_MAX_SIZE_MB=20
# Maximum log file size in MB (default: 5)
LOG_MAX_SIZE_MB=5
# -----------------------------------------------------------------------------
# MinIO Object Storage Configuration
# For local development, use docker-compose.minio.yml to start MinIO
# -----------------------------------------------------------------------------
# MinIO server endpoint (default: localhost:9000)
MINIO_ENDPOINT=localhost:9000
# MinIO access key (default: minioadmin)
# IMPORTANT: Change this in production!
MINIO_ACCESS_KEY=minioadmin
# MinIO secret key (default: minioadmin)
# IMPORTANT: Change this in production!
MINIO_SECRET_KEY=minioadmin
# MinIO bucket name (default: task-reporter-files)
MINIO_BUCKET=task-reporter-files
MINIO_SECURE=false # Set to true for HTTPS in production
# Use HTTPS for MinIO connection (default: false)
# Set to true in production with proper TLS configuration
MINIO_SECURE=false
# -----------------------------------------------------------------------------
# DIFY AI Service Configuration
# Used for AI-powered incident report generation
# -----------------------------------------------------------------------------
# DIFY API base URL for AI-powered report generation
DIFY_BASE_URL=https://dify.theaken.com/v1
DIFY_API_KEY= # Required: Get from DIFY console
DIFY_TIMEOUT_SECONDS=120 # Timeout for AI generation requests
# (Required for AI reports) DIFY API key - get from DIFY console
DIFY_API_KEY=
# DIFY API request timeout in seconds - AI generation can be slow (default: 120)
DIFY_TIMEOUT_SECONDS=120
# -----------------------------------------------------------------------------
# Report Generation Settings
REPORT_MAX_MESSAGES=200 # Summarize older messages if room exceeds this count
REPORT_STORAGE_PATH=reports # MinIO path prefix for generated reports
# -----------------------------------------------------------------------------
# Maximum messages to include in report before summarization (default: 200)
REPORT_MAX_MESSAGES=200
# MinIO path prefix for generated reports (default: reports)
REPORT_STORAGE_PATH=reports
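The FERNET_KEY comment above compresses key generation into a one-liner; expanded, it is just the Fernet API from `cryptography`, and the generated key must be kept secret since it protects encrypted session data:

```python
# Generate a FERNET_KEY and sanity-check an encrypt/decrypt round trip.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                       # 32-byte urlsafe base64 key
token = Fernet(key).encrypt(b"example-password")  # illustrative payload only
assert Fernet(key).decrypt(token) == b"example-password"
print(key.decode())                               # paste this value into FERNET_KEY
```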

alembic.ini (new file)

@@ -0,0 +1,116 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library.
# Any required deps can be installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions
# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

alembic/README (new file)

@@ -0,0 +1 @@
Generic single-database configuration.

alembic/env.py (new file)

@@ -0,0 +1,122 @@
"""Alembic migrations environment configuration
This configures Alembic to use the application's database settings
and SQLAlchemy models for migration autogeneration.
All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from logging.config import fileConfig
import os
import sys
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
# Add parent directory to path so we can import app modules
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
# Import settings and models
from app.core.config import get_settings
from app.core.database import Base
# Import all models to register them with Base.metadata
from app.modules.auth.models import UserSession, User
from app.modules.chat_room.models import IncidentRoom, RoomMember, RoomTemplate
from app.modules.realtime.models import Message, MessageReaction, MessageEditHistory
from app.modules.file_storage.models import RoomFile
from app.modules.report_generation.models import GeneratedReport
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Load database URL from settings
settings = get_settings()
config.set_main_option("sqlalchemy.url", settings.DATABASE_URL)
# Custom version table name with tr_ prefix to avoid conflicts
VERSION_TABLE = "tr_alembic_version"
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = Base.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def include_object(object, name, type_, reflected, compare_to):
"""Filter to only include tables with 'tr_' prefix
This ensures migrations only affect Task Reporter tables
in the shared database.
"""
if type_ == "table":
return name.startswith("tr_")
return True
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
include_object=include_object,
version_table=VERSION_TABLE,
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(
connection=connection,
target_metadata=target_metadata,
include_object=include_object,
version_table=VERSION_TABLE,
)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
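With the `include_object` filter and the dedicated `tr_alembic_version` version table, autogenerate only compares this application's `tr_` tables even when the database is shared. A sketch of cutting a new revision through the Python API (CLI equivalent: `alembic revision --autogenerate -m "..."`):

```python
# Generate a new migration; only tr_-prefixed tables are diffed
# because env.py passes include_object to context.configure().
from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")
command.revision(cfg, message="describe your schema change", autogenerate=True)
```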

alembic/script.py.mako (new file)

@@ -0,0 +1,26 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}


@@ -0,0 +1,200 @@
"""Initial migration - create tr_ prefixed tables
Revision ID: d80670b4abcb
Revises:
Create Date: 2025-12-07 13:51:52.658701
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = 'd80670b4abcb'
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('tr_incident_rooms',
sa.Column('room_id', sa.String(length=36), nullable=False),
sa.Column('title', sa.String(length=255), nullable=False),
sa.Column('incident_type', sa.Enum('EQUIPMENT_FAILURE', 'MATERIAL_SHORTAGE', 'QUALITY_ISSUE', 'OTHER', name='incidenttype'), nullable=False),
sa.Column('severity', sa.Enum('LOW', 'MEDIUM', 'HIGH', 'CRITICAL', name='severitylevel'), nullable=False),
sa.Column('status', sa.Enum('ACTIVE', 'RESOLVED', 'ARCHIVED', name='roomstatus'), nullable=False),
sa.Column('location', sa.String(length=255), nullable=True),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('resolution_notes', sa.Text(), nullable=True),
sa.Column('created_by', sa.String(length=255), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('resolved_at', sa.DateTime(), nullable=True),
sa.Column('archived_at', sa.DateTime(), nullable=True),
sa.Column('last_activity_at', sa.DateTime(), nullable=False),
sa.Column('last_updated_at', sa.DateTime(), nullable=False),
sa.Column('ownership_transferred_at', sa.DateTime(), nullable=True),
sa.Column('ownership_transferred_by', sa.String(length=255), nullable=True),
sa.Column('member_count', sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint('room_id')
)
op.create_index('ix_tr_incident_rooms_created_by', 'tr_incident_rooms', ['created_by'], unique=False)
op.create_index('ix_tr_incident_rooms_status_created', 'tr_incident_rooms', ['status', 'created_at'], unique=False)
op.create_table('tr_room_templates',
sa.Column('template_id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('name', sa.String(length=100), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('incident_type', sa.Enum('EQUIPMENT_FAILURE', 'MATERIAL_SHORTAGE', 'QUALITY_ISSUE', 'OTHER', name='incidenttype'), nullable=False),
sa.Column('default_severity', sa.Enum('LOW', 'MEDIUM', 'HIGH', 'CRITICAL', name='severitylevel'), nullable=False),
sa.Column('default_members', sa.Text(), nullable=True),
sa.Column('metadata_fields', sa.Text(), nullable=True),
sa.PrimaryKeyConstraint('template_id'),
sa.UniqueConstraint('name')
)
op.create_table('tr_user_sessions',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('username', sa.String(length=255), nullable=False, comment='User email from AD'),
sa.Column('display_name', sa.String(length=255), nullable=False, comment='Display name for chat'),
sa.Column('internal_token', sa.String(length=255), nullable=False, comment='Internal session token (UUID)'),
sa.Column('ad_token', sa.String(length=500), nullable=False, comment='AD API token'),
sa.Column('encrypted_password', sa.String(length=500), nullable=False, comment='AES-256 encrypted password'),
sa.Column('ad_token_expires_at', sa.DateTime(), nullable=False, comment='AD token expiry time'),
sa.Column('refresh_attempt_count', sa.Integer(), nullable=False, comment='Failed refresh attempts counter'),
sa.Column('last_activity', sa.DateTime(), nullable=False, comment='Last API request time'),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_tr_user_sessions_id'), 'tr_user_sessions', ['id'], unique=False)
op.create_index(op.f('ix_tr_user_sessions_internal_token'), 'tr_user_sessions', ['internal_token'], unique=True)
op.create_table('tr_users',
sa.Column('user_id', sa.String(length=255), nullable=False, comment='User email address (e.g., ymirliu@panjit.com.tw)'),
sa.Column('display_name', sa.String(length=255), nullable=False, comment="Display name from AD (e.g., 'ymirliu 劉念蓉')"),
sa.Column('office_location', sa.String(length=100), nullable=True, comment="Office location from AD (e.g., '高雄')"),
sa.Column('job_title', sa.String(length=100), nullable=True, comment='Job title from AD'),
sa.Column('last_login_at', sa.DateTime(), nullable=True, comment='Last login timestamp'),
sa.Column('created_at', sa.DateTime(), nullable=False, comment='First login timestamp'),
sa.PrimaryKeyConstraint('user_id')
)
op.create_index('ix_tr_users_display_name', 'tr_users', ['display_name'], unique=False)
op.create_table('tr_generated_reports',
sa.Column('report_id', sa.String(length=36), nullable=False, comment='Unique report identifier (UUID)'),
sa.Column('room_id', sa.String(length=36), nullable=False, comment='Reference to incident room'),
sa.Column('generated_by', sa.String(length=255), nullable=False, comment='User email who triggered report generation'),
sa.Column('generated_at', sa.DateTime(), nullable=False, comment='Report generation timestamp'),
sa.Column('status', sa.String(length=30), nullable=False, comment='Current generation status'),
sa.Column('error_message', sa.Text(), nullable=True, comment='User-friendly error message if generation failed'),
sa.Column('dify_message_id', sa.String(length=100), nullable=True, comment='DIFY API message ID for tracking'),
sa.Column('dify_conversation_id', sa.String(length=100), nullable=True, comment='DIFY conversation ID'),
sa.Column('prompt_tokens', sa.Integer(), nullable=True, comment='Number of prompt tokens used'),
sa.Column('completion_tokens', sa.Integer(), nullable=True, comment='Number of completion tokens used'),
sa.Column('report_title', sa.String(length=255), nullable=True, comment='Generated report title'),
sa.Column('report_json', sa.JSON(), nullable=True, comment='Parsed AI output as JSON'),
sa.Column('docx_storage_path', sa.String(length=500), nullable=True, comment='Path to generated .docx file in MinIO or local storage'),
sa.ForeignKeyConstraint(['room_id'], ['tr_incident_rooms.room_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('report_id')
)
op.create_index('ix_tr_generated_reports_room_date', 'tr_generated_reports', ['room_id', 'generated_at'], unique=False)
op.create_index('ix_tr_generated_reports_status', 'tr_generated_reports', ['status'], unique=False)
op.create_table('tr_messages',
sa.Column('message_id', sa.String(length=36), nullable=False),
sa.Column('room_id', sa.String(length=36), nullable=False),
sa.Column('sender_id', sa.String(length=255), nullable=False),
sa.Column('content', sa.Text(), nullable=False),
sa.Column('message_type', sa.Enum('TEXT', 'IMAGE_REF', 'FILE_REF', 'SYSTEM', 'INCIDENT_DATA', name='messagetype'), nullable=False),
sa.Column('message_metadata', sa.JSON(), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('edited_at', sa.DateTime(), nullable=True),
sa.Column('deleted_at', sa.DateTime(), nullable=True),
sa.Column('sequence_number', sa.BigInteger(), nullable=False),
sa.ForeignKeyConstraint(['room_id'], ['tr_incident_rooms.room_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('message_id')
)
op.create_index('ix_tr_messages_room_created', 'tr_messages', ['room_id', 'created_at'], unique=False)
op.create_index('ix_tr_messages_room_sequence', 'tr_messages', ['room_id', 'sequence_number'], unique=False)
op.create_index('ix_tr_messages_sender', 'tr_messages', ['sender_id'], unique=False)
op.create_table('tr_room_files',
sa.Column('file_id', sa.String(length=36), nullable=False),
sa.Column('room_id', sa.String(length=36), nullable=False),
sa.Column('uploader_id', sa.String(length=255), nullable=False),
sa.Column('filename', sa.String(length=255), nullable=False),
sa.Column('file_type', sa.String(length=20), nullable=False),
sa.Column('mime_type', sa.String(length=100), nullable=False),
sa.Column('file_size', sa.BigInteger(), nullable=False),
sa.Column('minio_bucket', sa.String(length=100), nullable=False),
sa.Column('minio_object_path', sa.String(length=500), nullable=False),
sa.Column('uploaded_at', sa.DateTime(), nullable=False),
sa.Column('deleted_at', sa.DateTime(), nullable=True),
sa.ForeignKeyConstraint(['room_id'], ['tr_incident_rooms.room_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('file_id')
)
op.create_index('ix_tr_room_files_room_uploaded', 'tr_room_files', ['room_id', 'uploaded_at'], unique=False)
op.create_index('ix_tr_room_files_uploader', 'tr_room_files', ['uploader_id'], unique=False)
op.create_table('tr_room_members',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('room_id', sa.String(length=36), nullable=False),
sa.Column('user_id', sa.String(length=255), nullable=False),
sa.Column('role', sa.Enum('OWNER', 'EDITOR', 'VIEWER', name='memberrole'), nullable=False),
sa.Column('added_by', sa.String(length=255), nullable=False),
sa.Column('added_at', sa.DateTime(), nullable=False),
sa.Column('removed_at', sa.DateTime(), nullable=True),
sa.ForeignKeyConstraint(['room_id'], ['tr_incident_rooms.room_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('room_id', 'user_id', 'removed_at', name='uq_tr_room_member_active')
)
op.create_index('ix_tr_room_members_room_user', 'tr_room_members', ['room_id', 'user_id'], unique=False)
op.create_index('ix_tr_room_members_user', 'tr_room_members', ['user_id'], unique=False)
op.create_table('tr_message_edit_history',
sa.Column('edit_id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('message_id', sa.String(length=36), nullable=False),
sa.Column('original_content', sa.Text(), nullable=False),
sa.Column('edited_by', sa.String(length=255), nullable=False),
sa.Column('edited_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['message_id'], ['tr_messages.message_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('edit_id')
)
op.create_index('ix_tr_message_edit_history_message', 'tr_message_edit_history', ['message_id', 'edited_at'], unique=False)
op.create_table('tr_message_reactions',
sa.Column('reaction_id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('message_id', sa.String(length=36), nullable=False),
sa.Column('user_id', sa.String(length=255), nullable=False),
sa.Column('emoji', sa.String(length=10), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['message_id'], ['tr_messages.message_id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('reaction_id'),
sa.UniqueConstraint('message_id', 'user_id', 'emoji', name='uq_tr_message_reaction')
)
op.create_index('ix_tr_message_reactions_message', 'tr_message_reactions', ['message_id'], unique=False)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('ix_tr_message_reactions_message', table_name='tr_message_reactions')
op.drop_table('tr_message_reactions')
op.drop_index('ix_tr_message_edit_history_message', table_name='tr_message_edit_history')
op.drop_table('tr_message_edit_history')
op.drop_index('ix_tr_room_members_user', table_name='tr_room_members')
op.drop_index('ix_tr_room_members_room_user', table_name='tr_room_members')
op.drop_table('tr_room_members')
op.drop_index('ix_tr_room_files_uploader', table_name='tr_room_files')
op.drop_index('ix_tr_room_files_room_uploaded', table_name='tr_room_files')
op.drop_table('tr_room_files')
op.drop_index('ix_tr_messages_sender', table_name='tr_messages')
op.drop_index('ix_tr_messages_room_sequence', table_name='tr_messages')
op.drop_index('ix_tr_messages_room_created', table_name='tr_messages')
op.drop_table('tr_messages')
op.drop_index('ix_tr_generated_reports_status', table_name='tr_generated_reports')
op.drop_index('ix_tr_generated_reports_room_date', table_name='tr_generated_reports')
op.drop_table('tr_generated_reports')
op.drop_index('ix_tr_users_display_name', table_name='tr_users')
op.drop_table('tr_users')
op.drop_index(op.f('ix_tr_user_sessions_internal_token'), table_name='tr_user_sessions')
op.drop_index(op.f('ix_tr_user_sessions_id'), table_name='tr_user_sessions')
op.drop_table('tr_user_sessions')
op.drop_table('tr_room_templates')
op.drop_index('ix_tr_incident_rooms_status_created', table_name='tr_incident_rooms')
op.drop_index('ix_tr_incident_rooms_created_by', table_name='tr_incident_rooms')
op.drop_table('tr_incident_rooms')
# ### end Alembic commands ###


@@ -0,0 +1,40 @@
"""change ad_token to text type
Revision ID: ea3798f776f4
Revises: d80670b4abcb
Create Date: 2025-12-07 14:13:47.469856
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision: str = 'ea3798f776f4'
down_revision: Union[str, None] = 'd80670b4abcb'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('tr_user_sessions', 'ad_token',
existing_type=mysql.VARCHAR(length=500),
type_=sa.Text(),
comment='AD API token (JWT)',
existing_comment='AD API token',
existing_nullable=False)
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('tr_user_sessions', 'ad_token',
existing_type=sa.Text(),
type_=mysql.VARCHAR(length=500),
comment='AD API token',
existing_comment='AD API token (JWT)',
existing_nullable=False)
# ### end Alembic commands ###


@@ -1,6 +1,9 @@
"""Application configuration loaded from environment variables"""
from pydantic_settings import BaseSettings
from pydantic import field_validator
from functools import lru_cache
from typing import List
import logging
class Settings(BaseSettings):
@@ -14,6 +17,7 @@ class Settings(BaseSettings):
# AD API
AD_API_URL: str
AD_API_TIMEOUT_SECONDS: int = 10 # AD API request timeout
# Session Settings
SESSION_INACTIVITY_DAYS: int = 3
@@ -23,7 +27,23 @@ class Settings(BaseSettings):
# Server
HOST: str = "0.0.0.0"
PORT: int = 8000
DEBUG: bool = True
DEBUG: bool = False # Default to False for security
LOG_LEVEL: str = "INFO" # DEBUG, INFO, WARNING, ERROR
# CORS Configuration
CORS_ORIGINS: str = "http://localhost:3000" # Comma-separated list of allowed origins
# System Admin
SYSTEM_ADMIN_EMAIL: str = "" # System administrator email with special permissions
# Realtime Messaging Settings
MESSAGE_EDIT_TIME_LIMIT_MINUTES: int = 15 # Time limit for editing messages
TYPING_TIMEOUT_SECONDS: int = 3 # Typing indicator timeout
# File Upload Size Limits (in MB)
IMAGE_MAX_SIZE_MB: int = 10
DOCUMENT_MAX_SIZE_MB: int = 20
LOG_MAX_SIZE_MB: int = 5
# MinIO Object Storage
MINIO_ENDPOINT: str = "localhost:9000"
@@ -41,6 +61,41 @@ class Settings(BaseSettings):
REPORT_MAX_MESSAGES: int = 200 # Summarize if exceeded
REPORT_STORAGE_PATH: str = "reports" # MinIO path prefix for reports
@field_validator("LOG_LEVEL")
@classmethod
def validate_log_level(cls, v: str) -> str:
"""Validate log level"""
valid_levels = ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]
v_upper = v.upper()
if v_upper not in valid_levels:
raise ValueError(f"LOG_LEVEL must be one of {valid_levels}")
return v_upper
def get_cors_origins(self) -> List[str]:
"""Parse CORS_ORIGINS into a list"""
if not self.CORS_ORIGINS:
return []
return [origin.strip() for origin in self.CORS_ORIGINS.split(",") if origin.strip()]
def get_image_max_size_bytes(self) -> int:
"""Get image max size in bytes"""
return self.IMAGE_MAX_SIZE_MB * 1024 * 1024
def get_document_max_size_bytes(self) -> int:
"""Get document max size in bytes"""
return self.DOCUMENT_MAX_SIZE_MB * 1024 * 1024
def get_log_max_size_bytes(self) -> int:
"""Get log file max size in bytes"""
return self.LOG_MAX_SIZE_MB * 1024 * 1024
def configure_logging(self) -> None:
"""Configure application logging based on LOG_LEVEL"""
logging.basicConfig(
level=getattr(logging, self.LOG_LEVEL),
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
class Config:
env_file = ".env"
case_sensitive = True


@@ -1,4 +1,8 @@
"""Database connection and session management"""
"""Database connection and session management
Supports MySQL database with connection pooling.
All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
@@ -6,10 +10,13 @@ from app.core.config import get_settings
settings = get_settings()
# Create engine
# Create engine with MySQL connection pooling
engine = create_engine(
settings.DATABASE_URL,
connect_args={"check_same_thread": False} if "sqlite" in settings.DATABASE_URL else {},
pool_size=5,
max_overflow=10,
pool_pre_ping=True, # Verify connection before using
pool_recycle=3600, # Recycle connections after 1 hour
echo=settings.DEBUG,
)

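The hunk ends at the engine definition; the session factory and the `get_db` dependency that routers import (e.g. `from app.core.database import get_db` in the realtime router) are not shown in this diff. Their conventional shape under the standard FastAPI/SQLAlchemy pattern would be roughly:

```python
# Assumed continuation of app/core/database.py (not part of this diff).
from sqlalchemy.orm import sessionmaker

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

def get_db():
    """Yield a session from the pool and always close it afterwards."""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
```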

@@ -9,7 +9,6 @@ from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import FileResponse
from app.core.config import get_settings
from app.core.database import engine, Base
from app.modules.auth import router as auth_router
from app.modules.auth.users_router import router as users_router
from app.modules.auth.middleware import auth_middleware
@@ -24,8 +23,8 @@ FRONTEND_DIR = Path(__file__).parent.parent / "frontend" / "dist"
settings = get_settings()
# Create database tables
Base.metadata.create_all(bind=engine)
# Database tables are managed by Alembic migrations
# Run: alembic upgrade head
# Initialize FastAPI app
app = FastAPI(
@@ -35,10 +34,10 @@ app = FastAPI(
debug=settings.DEBUG,
)
# CORS middleware (adjust for production)
# CORS middleware - origins configured via CORS_ORIGINS environment variable
app.add_middleware(
CORSMiddleware,
allow_origins=["*"], # TODO: Restrict in production
allow_origins=settings.get_cors_origins(),
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],


@@ -1,10 +1,12 @@
"""SQLAlchemy models for authentication
Table structure:
- user_sessions: stores user session data, including the encrypted password used for auto-refresh
- users: permanently stores user information (for name resolution during report generation)
- tr_user_sessions: stores user session data, including the encrypted password used for auto-refresh
- tr_users: permanently stores user information (for name resolution during report generation)
Note: All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from sqlalchemy import Column, Integer, String, DateTime, Index
from sqlalchemy import Column, Integer, String, DateTime, Index, Text
from datetime import datetime
from app.core.database import Base
@@ -12,7 +14,7 @@ from app.core.database import Base
class UserSession(Base):
"""User session model with encrypted password for auto-refresh"""
__tablename__ = "user_sessions"
__tablename__ = "tr_user_sessions"
id = Column(Integer, primary_key=True, index=True)
username = Column(String(255), nullable=False, comment="User email from AD")
@@ -20,7 +22,7 @@ class UserSession(Base):
internal_token = Column(
String(255), unique=True, nullable=False, index=True, comment="Internal session token (UUID)"
)
ad_token = Column(String(500), nullable=False, comment="AD API token")
ad_token = Column(Text, nullable=False, comment="AD API token (JWT)")
encrypted_password = Column(String(500), nullable=False, comment="AES-256 encrypted password")
ad_token_expires_at = Column(DateTime, nullable=False, comment="AD token expiry time")
refresh_attempt_count = Column(
@@ -41,7 +43,7 @@ class User(Base):
- Tracking user metadata (office location, job title)
"""
__tablename__ = "users"
__tablename__ = "tr_users"
user_id = Column(
String(255), primary_key=True, comment="User email address (e.g., ymirliu@panjit.com.tw)"
@@ -64,5 +66,5 @@ class User(Base):
# Indexes
__table_args__ = (
Index("ix_users_display_name", "display_name"),
Index("ix_tr_users_display_name", "display_name"),
)


@@ -18,7 +18,7 @@ class ADAuthService:
def __init__(self):
self.ad_api_url = settings.AD_API_URL
self._client = httpx.AsyncClient(timeout=10.0)
self._client = httpx.AsyncClient(timeout=float(settings.AD_API_TIMEOUT_SECONDS))
async def authenticate(self, username: str, password: str) -> Dict[str, any]:
"""Authenticate user with AD API


@@ -1,9 +1,11 @@
"""SQLAlchemy models for chat room management
Tables:
- incident_rooms: Stores room metadata and configuration
- room_members: User-room associations with roles
- room_templates: Predefined templates for common incident types
- tr_incident_rooms: Stores room metadata and configuration
- tr_room_members: User-room associations with roles
- tr_room_templates: Predefined templates for common incident types
Note: All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from sqlalchemy import Column, Integer, String, Text, DateTime, Enum, ForeignKey, UniqueConstraint, Index
from sqlalchemy.orm import relationship
@@ -46,7 +48,7 @@ class MemberRole(str, enum.Enum):
class IncidentRoom(Base):
"""Incident room model for production incidents"""
__tablename__ = "incident_rooms"
__tablename__ = "tr_incident_rooms"
room_id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
title = Column(String(255), nullable=False)
@@ -80,18 +82,18 @@ class IncidentRoom(Base):
# Indexes for common queries
__table_args__ = (
Index("ix_incident_rooms_status_created", "status", "created_at"),
Index("ix_incident_rooms_created_by", "created_by"),
Index("ix_tr_incident_rooms_status_created", "status", "created_at"),
Index("ix_tr_incident_rooms_created_by", "created_by"),
)
class RoomMember(Base):
"""Room membership model"""
__tablename__ = "room_members"
__tablename__ = "tr_room_members"
id = Column(Integer, primary_key=True, autoincrement=True)
room_id = Column(String(36), ForeignKey("incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
room_id = Column(String(36), ForeignKey("tr_incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
user_id = Column(String(255), nullable=False) # User email/ID
role = Column(Enum(MemberRole), nullable=False)
@@ -106,16 +108,16 @@ class RoomMember(Base):
# Constraints and indexes
__table_args__ = (
# Ensure unique active membership (where removed_at IS NULL)
UniqueConstraint("room_id", "user_id", "removed_at", name="uq_room_member_active"),
Index("ix_room_members_room_user", "room_id", "user_id"),
Index("ix_room_members_user", "user_id"),
UniqueConstraint("room_id", "user_id", "removed_at", name="uq_tr_room_member_active"),
Index("ix_tr_room_members_room_user", "room_id", "user_id"),
Index("ix_tr_room_members_user", "user_id"),
)
class RoomTemplate(Base):
"""Predefined templates for common incident types"""
__tablename__ = "room_templates"
__tablename__ = "tr_room_templates"
template_id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(String(100), unique=True, nullable=False)


@@ -8,13 +8,13 @@ from app.core.database import Base
class RoomFile(Base):
"""File uploaded to an incident room"""
__tablename__ = "room_files"
__tablename__ = "tr_room_files"
# Primary key
file_id = Column(String(36), primary_key=True)
# Foreign key to incident room (CASCADE delete when room is permanently deleted)
room_id = Column(String(36), ForeignKey("incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
room_id = Column(String(36), ForeignKey("tr_incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
# File metadata
uploader_id = Column(String(255), nullable=False)
@@ -36,8 +36,8 @@ class RoomFile(Base):
# Indexes
__table_args__ = (
Index("ix_room_files", "room_id", "uploaded_at"),
Index("ix_file_uploader", "uploader_id"),
Index("ix_tr_room_files_room_uploaded", "room_id", "uploaded_at"),
Index("ix_tr_room_files_uploader", "uploader_id"),
)
def __repr__(self):


@@ -4,7 +4,10 @@ from fastapi import UploadFile, HTTPException
from typing import Set
import logging
from app.core.config import get_settings
logger = logging.getLogger(__name__)
settings = get_settings()
# MIME type whitelists
IMAGE_TYPES: Set[str] = {
@@ -22,11 +25,6 @@ LOG_TYPES: Set[str] = {
"text/csv"
}
# File size limits (bytes)
IMAGE_MAX_SIZE = 10 * 1024 * 1024 # 10MB
DOCUMENT_MAX_SIZE = 20 * 1024 * 1024 # 20MB
LOG_MAX_SIZE = 5 * 1024 * 1024 # 5MB
def detect_mime_type(file_data: bytes) -> str:
"""
@@ -118,11 +116,11 @@ def get_file_type_and_limits(mime_type: str) -> tuple[str, int]:
HTTPException if MIME type not recognized
"""
if mime_type in IMAGE_TYPES:
return ("image", IMAGE_MAX_SIZE)
return ("image", settings.get_image_max_size_bytes())
elif mime_type in DOCUMENT_TYPES:
return ("document", DOCUMENT_MAX_SIZE)
return ("document", settings.get_document_max_size_bytes())
elif mime_type in LOG_TYPES:
return ("log", LOG_MAX_SIZE)
return ("log", settings.get_log_max_size_bytes())
else:
raise HTTPException(
status_code=400,


@@ -1,9 +1,11 @@
"""SQLAlchemy models for realtime messaging
Tables:
- messages: Stores all messages sent in incident rooms
- message_reactions: User reactions to messages (emoji)
- message_edit_history: Audit trail for message edits
- tr_messages: Stores all messages sent in incident rooms
- tr_message_reactions: User reactions to messages (emoji)
- tr_message_edit_history: Audit trail for message edits
Note: All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from sqlalchemy import Column, Integer, String, Text, DateTime, Enum, ForeignKey, UniqueConstraint, Index, BigInteger, JSON
from sqlalchemy.orm import relationship
@@ -25,10 +27,10 @@ class MessageType(str, enum.Enum):
class Message(Base):
"""Message model for incident room communications"""
__tablename__ = "messages"
__tablename__ = "tr_messages"
message_id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
room_id = Column(String(36), ForeignKey("incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
room_id = Column(String(36), ForeignKey("tr_incident_rooms.room_id", ondelete="CASCADE"), nullable=False)
sender_id = Column(String(255), nullable=False) # User email/ID
content = Column(Text, nullable=False)
message_type = Column(Enum(MessageType), default=MessageType.TEXT, nullable=False)
@@ -42,7 +44,6 @@ class Message(Base):
deleted_at = Column(DateTime) # Soft delete timestamp
# Sequence number for FIFO ordering within a room
# Note: Autoincrement doesn't work for non-PK in SQLite, will be set in service layer
sequence_number = Column(BigInteger, nullable=False)
# Relationships
@@ -51,22 +52,19 @@ class Message(Base):
# Indexes for common queries
__table_args__ = (
Index("ix_messages_room_created", "room_id", "created_at"),
Index("ix_messages_room_sequence", "room_id", "sequence_number"),
Index("ix_messages_sender", "sender_id"),
# PostgreSQL full-text search index on content (commented for SQLite compatibility)
# Note: Uncomment when using PostgreSQL with pg_trgm extension enabled
# Index("ix_messages_content_search", "content", postgresql_using='gin', postgresql_ops={'content': 'gin_trgm_ops'}),
Index("ix_tr_messages_room_created", "room_id", "created_at"),
Index("ix_tr_messages_room_sequence", "room_id", "sequence_number"),
Index("ix_tr_messages_sender", "sender_id"),
)
class MessageReaction(Base):
"""Message reaction model for emoji reactions"""
__tablename__ = "message_reactions"
__tablename__ = "tr_message_reactions"
reaction_id = Column(Integer, primary_key=True, autoincrement=True)
message_id = Column(String(36), ForeignKey("messages.message_id", ondelete="CASCADE"), nullable=False)
message_id = Column(String(36), ForeignKey("tr_messages.message_id", ondelete="CASCADE"), nullable=False)
user_id = Column(String(255), nullable=False) # User email/ID who reacted
emoji = Column(String(10), nullable=False) # Emoji character or code
@@ -79,18 +77,18 @@ class MessageReaction(Base):
# Constraints and indexes
__table_args__ = (
# Ensure unique reaction per user per message
UniqueConstraint("message_id", "user_id", "emoji", name="uq_message_reaction"),
Index("ix_message_reactions_message", "message_id"),
UniqueConstraint("message_id", "user_id", "emoji", name="uq_tr_message_reaction"),
Index("ix_tr_message_reactions_message", "message_id"),
)
class MessageEditHistory(Base):
"""Message edit history model for audit trail"""
__tablename__ = "message_edit_history"
__tablename__ = "tr_message_edit_history"
edit_id = Column(Integer, primary_key=True, autoincrement=True)
message_id = Column(String(36), ForeignKey("messages.message_id", ondelete="CASCADE"), nullable=False)
message_id = Column(String(36), ForeignKey("tr_messages.message_id", ondelete="CASCADE"), nullable=False)
original_content = Column(Text, nullable=False) # Content before edit
edited_by = Column(String(255), nullable=False) # User who made the edit
@@ -102,5 +100,5 @@ class MessageEditHistory(Base):
# Indexes
__table_args__ = (
Index("ix_message_edit_history_message", "message_id", "edited_at"),
Index("ix_tr_message_edit_history_message", "message_id", "edited_at"),
)


@@ -7,6 +7,7 @@ from datetime import datetime
import json
from app.core.database import get_db
from app.core.config import get_settings
from app.modules.auth.dependencies import get_current_user
from app.modules.auth.services.session_service import session_service
from app.modules.chat_room.models import RoomMember, MemberRole
@@ -32,7 +33,7 @@ from sqlalchemy import and_
router = APIRouter(prefix="/api", tags=["realtime"])
SYSTEM_ADMIN_EMAIL = "ymirliu@panjit.com.tw"
settings = get_settings()
async def ws_send_json(websocket: WebSocket, data: dict):
@@ -51,9 +52,14 @@ def get_user_room_membership(db: Session, room_id: str, user_id: str) -> Optiona
).first()
def is_system_admin(user_id: str) -> bool:
"""Check if user is the system administrator"""
return bool(settings.SYSTEM_ADMIN_EMAIL and user_id == settings.SYSTEM_ADMIN_EMAIL)
def can_write_message(membership: Optional[RoomMember], user_id: str) -> bool:
"""Check if user has write permission (OWNER or EDITOR)"""
if user_id == SYSTEM_ADMIN_EMAIL:
if is_system_admin(user_id):
return True
if not membership:
@@ -99,7 +105,7 @@ async def websocket_endpoint(
# Check room membership
membership = get_user_room_membership(db, room_id, user_id)
if not membership and user_id != SYSTEM_ADMIN_EMAIL:
if not membership and not is_system_admin(user_id):
await websocket.close(code=4001, reason="Not a member of this room")
return
@@ -225,12 +231,11 @@ async def websocket_endpoint(
continue
# Delete message
is_admin = user_id == SYSTEM_ADMIN_EMAIL
deleted_message = MessageService.delete_message(
db=db,
message_id=ws_message.message_id,
user_id=user_id,
is_admin=is_admin
is_admin=is_system_admin(user_id)
)
if not deleted_message:
@@ -345,7 +350,7 @@ async def get_messages(
# Check room membership
membership = get_user_room_membership(db, room_id, user_id)
if not membership and user_id != SYSTEM_ADMIN_EMAIL:
if not membership and not is_system_admin(user_id):
raise HTTPException(status_code=403, detail="Not a member of this room")
return MessageService.get_messages(
@@ -414,7 +419,7 @@ async def search_messages(
# Check room membership
membership = get_user_room_membership(db, room_id, user_id)
if not membership and user_id != SYSTEM_ADMIN_EMAIL:
if not membership and not is_system_admin(user_id):
raise HTTPException(status_code=403, detail="Not a member of this room")
return MessageService.search_messages(
@@ -437,7 +442,7 @@ async def get_online_users(
# Check room membership
membership = get_user_room_membership(db, room_id, user_id)
if not membership and user_id != SYSTEM_ADMIN_EMAIL:
if not membership and not is_system_admin(user_id):
raise HTTPException(status_code=403, detail="Not a member of this room")
online_users = manager.get_online_users(room_id)
@@ -455,7 +460,7 @@ async def get_typing_users(
# Check room membership
membership = get_user_room_membership(db, room_id, user_id)
if not membership and user_id != SYSTEM_ADMIN_EMAIL:
if not membership and not is_system_admin(user_id):
raise HTTPException(status_code=403, detail="Not a member of this room")
typing_users = manager.get_typing_users(room_id)


@@ -5,6 +5,7 @@ from typing import List, Optional, Dict, Any
from datetime import datetime, timedelta
import uuid
from app.core.config import get_settings
from app.modules.realtime.models import Message, MessageType, MessageReaction, MessageEditHistory
from app.modules.realtime.schemas import (
MessageCreate,
@@ -13,6 +14,8 @@ from app.modules.realtime.schemas import (
ReactionSummary
)
settings = get_settings()
class MessageService:
"""Service for message operations"""
@@ -161,9 +164,9 @@ class MessageService:
if message.sender_id != user_id:
return None
# Check time limit (15 minutes)
# Check time limit (configurable via MESSAGE_EDIT_TIME_LIMIT_MINUTES)
time_diff = datetime.utcnow() - message.created_at
if time_diff > timedelta(minutes=15):
if time_diff > timedelta(minutes=settings.MESSAGE_EDIT_TIME_LIMIT_MINUTES):
return None
# Store original content in edit history


@@ -6,6 +6,10 @@ import asyncio
import json
from collections import defaultdict
from app.core.config import get_settings
settings = get_settings()
def json_serializer(obj: Any) -> str:
"""Custom JSON serializer for objects not serializable by default json code"""
@@ -193,9 +197,11 @@ class WebSocketManager:
if user_id in self._typing_tasks:
self._typing_tasks[user_id].cancel()
# Set new timeout (3 seconds)
# Set new timeout (configurable via TYPING_TIMEOUT_SECONDS)
typing_timeout = settings.TYPING_TIMEOUT_SECONDS
async def clear_typing():
await asyncio.sleep(3)
await asyncio.sleep(typing_timeout)
self._typing_users[room_id].discard(user_id)
if user_id in self._typing_tasks:
del self._typing_tasks[user_id]


@@ -1,7 +1,9 @@
"""SQLAlchemy models for report generation
Tables:
- generated_reports: Stores report metadata and generation status
- tr_generated_reports: Stores report metadata and generation status
Note: All tables use 'tr_' prefix to avoid conflicts in shared database.
"""
from sqlalchemy import Column, String, Text, DateTime, Integer, ForeignKey, Index, JSON
from sqlalchemy.orm import relationship
@@ -24,14 +26,14 @@ class ReportStatus(str, enum.Enum):
class GeneratedReport(Base):
"""Generated report model for incident reports"""
__tablename__ = "generated_reports"
__tablename__ = "tr_generated_reports"
report_id = Column(
String(36), primary_key=True, default=lambda: str(uuid.uuid4()),
comment="Unique report identifier (UUID)"
)
room_id = Column(
String(36), ForeignKey("incident_rooms.room_id", ondelete="CASCADE"),
String(36), ForeignKey("tr_incident_rooms.room_id", ondelete="CASCADE"),
nullable=False, comment="Reference to incident room"
)
@@ -92,8 +94,8 @@ class GeneratedReport(Base):
# Indexes
__table_args__ = (
Index("ix_generated_reports_room_date", "room_id", "generated_at"),
Index("ix_generated_reports_status", "status"),
Index("ix_tr_generated_reports_room_date", "room_id", "generated_at"),
Index("ix_tr_generated_reports_status", "status"),
)
def __repr__(self):


@@ -1,9 +1,10 @@
# MinIO Object Storage for Task Reporter
# Usage: docker-compose -f docker-compose.minio.yml up -d
# docker-compose -f docker-compose.minio.yml --env-file .env.docker up -d
#
# This configuration starts MinIO for local development.
# Access MinIO Console at: http://localhost:9001
# S3 API endpoint at: http://localhost:9000
# Access MinIO Console at: http://localhost:${MINIO_CONSOLE_PORT:-9001}
# S3 API endpoint at: http://localhost:${MINIO_API_PORT:-9000}
version: '3.8'
@@ -12,11 +13,11 @@ services:
image: minio/minio:latest
container_name: task-reporter-minio
ports:
- "9000:9000" # S3 API
- "9001:9001" # MinIO Console
- "${MINIO_API_PORT:-9000}:9000" # S3 API
- "${MINIO_CONSOLE_PORT:-9001}:9001" # MinIO Console
environment:
MINIO_ROOT_USER: minioadmin
MINIO_ROOT_PASSWORD: minioadmin
MINIO_ROOT_USER: ${MINIO_ROOT_USER:-minioadmin}
MINIO_ROOT_PASSWORD: ${MINIO_ROOT_PASSWORD:-minioadmin}
command: server /data --console-address ":9001"
volumes:
- minio_data:/data
@@ -35,28 +36,40 @@ volumes:
# Quick Start Guide
# ============================================================================
#
# 1. Start MinIO:
# 1. Start MinIO (with default settings):
# docker-compose -f docker-compose.minio.yml up -d
#
# 2. Access MinIO Console:
# Open http://localhost:9001 in your browser
# Login: minioadmin / minioadmin
# 2. Start MinIO (with custom settings from .env.docker):
# docker-compose -f docker-compose.minio.yml --env-file .env.docker up -d
#
# 3. The application will automatically create the bucket on startup
# 3. Access MinIO Console:
# Open http://localhost:9001 in your browser
# Login with MINIO_ROOT_USER / MINIO_ROOT_PASSWORD
#
# 4. The application will automatically create the bucket on startup
# (configured as 'task-reporter-files' in .env)
#
# 4. Stop MinIO:
# 5. Stop MinIO:
# docker-compose -f docker-compose.minio.yml down
#
# 5. Remove all data:
# 6. Remove all data:
# docker-compose -f docker-compose.minio.yml down -v
#
# ============================================================================
# Environment Variables
# ============================================================================
#
# MINIO_ROOT_USER - MinIO admin username (default: minioadmin)
# MINIO_ROOT_PASSWORD - MinIO admin password (default: minioadmin)
# MINIO_API_PORT - S3 API port (default: 9000)
# MINIO_CONSOLE_PORT - Web console port (default: 9001)
#
# ============================================================================
# Production Notes
# ============================================================================
#
# For production deployment:
# - Change MINIO_ROOT_USER and MINIO_ROOT_PASSWORD to secure values
# - Set secure MINIO_ROOT_USER and MINIO_ROOT_PASSWORD in .env.docker
# - Use external volume or persistent storage
# - Configure TLS/HTTPS
# - Set up proper backup policies


@@ -1,11 +1,56 @@
# Frontend Environment Variables
# =============================================================================
# Task Reporter - Frontend Environment Configuration
# =============================================================================
# Copy this file to .env and customize as needed.
# All variables are optional and have sensible defaults for development.
# =============================================================================
# -----------------------------------------------------------------------------
# API Configuration
# -----------------------------------------------------------------------------
# API Base URL (optional)
# - For local development: leave empty or don't set (will use /api)
# - For local development: leave empty or don't set (will use /api with proxy)
# - For production with separate backend: set to full URL
# Example: https://api.yourdomain.com/api
VITE_API_BASE_URL=
# Note: When set, this URL is also used for WebSocket connections
# http:// will be converted to ws://
# https:// will be converted to wss://
# API request timeout in milliseconds (default: 30000 = 30 seconds)
VITE_API_TIMEOUT_MS=30000
# -----------------------------------------------------------------------------
# Development Server Configuration
# -----------------------------------------------------------------------------
# Frontend development server port (default: 3000)
VITE_PORT=3000
# Backend API URL for development proxy (default: http://localhost:8000)
# This is used by Vite's proxy configuration during development
VITE_BACKEND_URL=http://localhost:8000
# -----------------------------------------------------------------------------
# WebSocket Configuration
# -----------------------------------------------------------------------------
# Maximum WebSocket reconnection delay in milliseconds (default: 30000 = 30 seconds)
# The reconnection uses exponential backoff, capped at this value
VITE_MAX_RECONNECT_DELAY_MS=30000
# -----------------------------------------------------------------------------
# Query/Cache Configuration
# -----------------------------------------------------------------------------
# Messages refetch interval in milliseconds (default: 30000 = 30 seconds)
# Used for polling online users status
VITE_MESSAGES_REFETCH_INTERVAL_MS=30000
# Reports stale time in milliseconds (default: 30000 = 30 seconds)
# Time before cached report data is considered stale
VITE_REPORTS_STALE_TIME_MS=30000
# =============================================================================
# Notes
# =============================================================================
# - All VITE_ prefixed variables are exposed to the browser
# - Never put sensitive data (API keys, secrets) in frontend environment variables
# - When VITE_API_BASE_URL is set:
# - http:// URLs will be converted to ws:// for WebSocket connections
# - https:// URLs will be converted to wss:// for WebSocket connections
# =============================================================================

View File

@@ -61,12 +61,18 @@ export function useCreateMessage(roomId: string) {
})
}
// Configurable refetch interval for online users (default 30 seconds)
const MESSAGES_REFETCH_INTERVAL = parseInt(
import.meta.env.VITE_MESSAGES_REFETCH_INTERVAL_MS || '30000',
10
)
export function useOnlineUsers(roomId: string) {
return useQuery({
queryKey: messageKeys.online(roomId),
queryFn: () => messagesService.getOnlineUsers(roomId),
enabled: !!roomId,
refetchInterval: 30000, // Refresh every 30 seconds
refetchInterval: MESSAGES_REFETCH_INTERVAL,
})
}

View File

@@ -14,6 +14,12 @@ const reportKeys = {
[...reportKeys.all, 'detail', roomId, reportId] as const,
}
// Configurable stale time for reports (default 30 seconds)
const REPORTS_STALE_TIME = parseInt(
import.meta.env.VITE_REPORTS_STALE_TIME_MS || '30000',
10
)
/**
* Hook to list reports for a room
*/
@@ -22,7 +28,7 @@ export function useReports(roomId: string) {
queryKey: reportKeys.list(roomId),
queryFn: () => reportsService.listReports(roomId),
enabled: !!roomId,
staleTime: 30000, // 30 seconds
staleTime: REPORTS_STALE_TIME,
})
}

View File

@@ -12,7 +12,10 @@ import type {
} from '../types'
const RECONNECT_DELAY = 1000
const MAX_RECONNECT_DELAY = 30000
const MAX_RECONNECT_DELAY = parseInt(
import.meta.env.VITE_MAX_RECONNECT_DELAY_MS || '30000',
10
)
const RECONNECT_MULTIPLIER = 2
interface UseWebSocketOptions {

View File

@@ -3,10 +3,13 @@ import axios, { type AxiosError, type InternalAxiosRequestConfig } from 'axios'
// API Base URL: use environment variable or default to relative path
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || '/api'
// API Timeout: configurable via environment variable (default 30 seconds)
const API_TIMEOUT = parseInt(import.meta.env.VITE_API_TIMEOUT_MS || '30000', 10)
// Create axios instance
const api = axios.create({
baseURL: API_BASE_URL,
timeout: 30000,
timeout: API_TIMEOUT,
headers: {
'Content-Type': 'application/json',
},

View File

@@ -1,28 +1,36 @@
/// <reference types="vitest/config" />
import { defineConfig } from 'vite'
import { defineConfig, loadEnv } from 'vite'
import react from '@vitejs/plugin-react'
import tailwindcss from '@tailwindcss/vite'
// https://vite.dev/config/
export default defineConfig({
plugins: [
tailwindcss(),
react(),
],
server: {
port: 3000,
proxy: {
'/api': {
target: 'http://localhost:8000',
changeOrigin: true,
ws: true, // Enable WebSocket proxying
export default defineConfig(({ mode }) => {
const env = loadEnv(mode, process.cwd(), '')
// Configuration from environment variables with defaults
const PORT = parseInt(env.VITE_PORT || '3000', 10)
const BACKEND_URL = env.VITE_BACKEND_URL || 'http://localhost:8000'
return {
plugins: [
tailwindcss(),
react(),
],
server: {
port: PORT,
proxy: {
'/api': {
target: BACKEND_URL,
changeOrigin: true,
ws: true, // Enable WebSocket proxying
},
},
},
},
test: {
globals: true,
environment: 'jsdom',
setupFiles: ['./src/test/setup.ts'],
include: ['src/**/*.{test,spec}.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'],
},
test: {
globals: true,
environment: 'jsdom',
setupFiles: ['./src/test/setup.ts'],
include: ['src/**/*.{test,spec}.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'],
},
}
})

View File

@@ -0,0 +1,63 @@
# Change: Unified Environment Configuration Management
## Why
The project currently hardcodes settings (ports, URLs, timeouts, file size limits, and so on) across backend Python code, frontend TypeScript code, Docker configuration files, and development scripts. This causes:
1. **Difficult deployments**: every deployment to a different environment requires editing multiple files
2. **Security risks**: CORS uses the wildcard `["*"]` and MinIO runs with default credentials
3. **Maintenance cost**: settings are spread across 20+ files and are hard to track and update
4. **Environment drift**: it is hard to keep development, test, and production configurations in sync
## What Changes
### Backend Configuration (Python)
- Move all hardcoded settings into the Settings class in `app/core/config.py` (a sketch of the extended class follows this list)
- Extend the `.env` file to cover every configurable item
- Add the following environment variables:
  - `HOST`, `PORT`, `DEBUG` - server settings
  - `CORS_ORIGINS` - CORS origin whitelist (**BREAKING**: the wildcard is removed)
  - `SYSTEM_ADMIN_EMAIL` - system administrator email
  - `AD_API_TIMEOUT_SECONDS` - AD API timeout
  - `MESSAGE_EDIT_TIME_LIMIT_MINUTES` - message edit time limit
  - `TYPING_TIMEOUT_SECONDS` - typing indicator timeout
  - `IMAGE_MAX_SIZE_MB`, `DOCUMENT_MAX_SIZE_MB`, `LOG_MAX_SIZE_MB` - file size limits
  - `LOG_LEVEL` - log level
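A minimal sketch of what the extended Settings class might look like (field names follow the variables listed above; the use of pydantic-settings and the exact defaults are assumptions, not a verbatim copy of `app/core/config.py`):

```python
# Sketch only: assumes pydantic-settings; the real app/core/config.py may differ in detail.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Required - startup fails if these are missing
    DATABASE_URL: str
    FERNET_KEY: str
    CORS_ORIGINS: str  # comma-separated whitelist; the wildcard is gone
    SYSTEM_ADMIN_EMAIL: str

    # Optional - defaults suit local development
    HOST: str = "0.0.0.0"
    PORT: int = 8000
    DEBUG: bool = False
    LOG_LEVEL: str = "INFO"
    AD_API_TIMEOUT_SECONDS: int = 10
    MESSAGE_EDIT_TIME_LIMIT_MINUTES: int = 15
    TYPING_TIMEOUT_SECONDS: int = 3
    IMAGE_MAX_SIZE_MB: int = 10
    DOCUMENT_MAX_SIZE_MB: int = 20
    LOG_MAX_SIZE_MB: int = 5


settings = Settings()
```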
### Frontend Configuration (TypeScript/Vite)
- Use Vite's environment variable mechanism (`import.meta.env`)
- Add the following environment variables:
  - `VITE_API_TIMEOUT_MS` - API request timeout
  - `VITE_MESSAGES_REFETCH_INTERVAL_MS` - messages refetch interval
  - `VITE_MAX_RECONNECT_DELAY_MS` - WebSocket reconnection delay
  - `VITE_REPORTS_STALE_TIME_MS` - report cache stale time
  - `VITE_PORT` - development server port
  - `VITE_BACKEND_URL` - backend API URL
### Docker Configuration
- Replace the hardcoded credentials in `docker-compose.minio.yml` with environment variables
- Add a `.env.docker` example file
### Documentation
- Update `.env.example` to cover all environment variables with explanations
- Update `frontend/.env.example` to cover all frontend environment variables
## Impact
- **Affected specs**: new `env-config` spec
- **Affected code**:
  - `app/core/config.py` - extended Settings class
  - `app/main.py` - CORS settings now read from environment variables
  - `app/modules/realtime/router.py` - SYSTEM_ADMIN_EMAIL
  - `app/modules/auth/services/ad_client.py` - AD API timeout
  - `app/modules/realtime/services/message_service.py` - message edit limit
  - `app/modules/realtime/websocket_manager.py` - typing timeout
  - `app/modules/file_storage/validators.py` - file size limits
  - `frontend/vite.config.ts` - port and backend URL
  - `frontend/src/services/api.ts` - API timeout
  - `frontend/src/hooks/*.ts` - assorted timeout and interval settings
  - `docker-compose.minio.yml` - MinIO credentials
  - `.env.example`, `frontend/.env.example` - documentation updates
- **Breaking changes**:
  - CORS configuration changes from `["*"]` to explicitly listed origins
  - New required environment variables may force existing deployments to update their `.env` files

View File

@@ -0,0 +1,158 @@
## ADDED Requirements
### Requirement: Centralized Backend Configuration
The system SHALL centralize all backend configuration in the Settings class in `app/core/config.py` and read the values from environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `DATABASE_URL` | str | (required) | Database connection string |
| `FERNET_KEY` | str | (required) | Encryption key |
| `AD_API_URL` | str | (required) | AD authentication API URL |
| `HOST` | str | `0.0.0.0` | Server bind address |
| `PORT` | int | `8000` | Server port |
| `DEBUG` | bool | `False` | Debug mode |
| `LOG_LEVEL` | str | `INFO` | Log level (DEBUG/INFO/WARNING/ERROR) |
| `CORS_ORIGINS` | str | (required) | Allowed CORS origins, comma-separated |
| `SYSTEM_ADMIN_EMAIL` | str | (required) | System administrator email |
| `AD_API_TIMEOUT_SECONDS` | int | `10` | AD API request timeout (seconds) |
| `SESSION_INACTIVITY_DAYS` | int | `3` | Session inactivity expiry (days) |
| `TOKEN_REFRESH_THRESHOLD_MINUTES` | int | `5` | Token refresh threshold (minutes) |
| `MAX_REFRESH_ATTEMPTS` | int | `3` | Maximum token refresh attempts |
| `MESSAGE_EDIT_TIME_LIMIT_MINUTES` | int | `15` | Message edit time limit (minutes) |
| `TYPING_TIMEOUT_SECONDS` | int | `3` | Typing indicator timeout (seconds) |
#### Scenario: Startup fails when a required variable is missing
- **WHEN** the application starts without a required environment variable (e.g. `DATABASE_URL`)
- **THEN** the application SHALL display a clear error message and refuse to start
#### Scenario: Startup with default values
- **WHEN** all required environment variables are provided but optional ones are left unset
- **THEN** the system SHALL start normally using the default values
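A hedged sketch of the fail-fast behaviour described above (assuming the pydantic-based Settings class; the project's actual error handling may differ):

```python
# Sketch: a missing required variable should abort startup with a readable message.
from pydantic import ValidationError

from app.core.config import Settings  # Settings class as specified above

try:
    settings = Settings()
except ValidationError as exc:
    # e.g. reports "DATABASE_URL: Field required" and refuses to start
    raise SystemExit(f"Invalid configuration:\n{exc}") from exc
```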
---
### Requirement: File Storage Configuration
The system SHALL support configuring file storage through environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `MINIO_ENDPOINT` | str | `localhost:9000` | MinIO server address |
| `MINIO_ACCESS_KEY` | str | `minioadmin` | MinIO access key |
| `MINIO_SECRET_KEY` | str | `minioadmin` | MinIO secret key |
| `MINIO_BUCKET` | str | `task-reporter-files` | Default bucket name |
| `MINIO_SECURE` | bool | `False` | Whether to use HTTPS |
| `IMAGE_MAX_SIZE_MB` | int | `10` | Maximum image upload size (MB) |
| `DOCUMENT_MAX_SIZE_MB` | int | `20` | Maximum document upload size (MB) |
| `LOG_MAX_SIZE_MB` | int | `5` | Maximum log file upload size (MB) |
#### Scenario: Custom file size limit
- **WHEN** `IMAGE_MAX_SIZE_MB=20` is set
- **THEN** the system SHALL accept image uploads up to 20 MB
#### Scenario: MinIO connection configuration
- **WHEN** a custom MinIO endpoint and credentials are configured
- **THEN** the system SHALL use that configuration to connect to the MinIO service
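For illustration, a size check driven by these variables could look like the following (the function name and error type are hypothetical, not taken from `app/modules/file_storage/validators.py`):

```python
# Hypothetical validator sketch - converts the MB limit into bytes before comparing.
from app.core.config import settings


def validate_image_size(size_bytes: int) -> None:
    max_bytes = settings.IMAGE_MAX_SIZE_MB * 1024 * 1024
    if size_bytes > max_bytes:
        raise ValueError(
            f"Image exceeds the configured limit of {settings.IMAGE_MAX_SIZE_MB} MB"
        )
```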
---
### Requirement: AI Service Configuration
The system SHALL support configuring the AI report generation service through environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `DIFY_BASE_URL` | str | `https://dify.theaken.com/v1` | DIFY API base URL |
| `DIFY_API_KEY` | str | `""` | DIFY API key |
| `DIFY_TIMEOUT_SECONDS` | int | `120` | DIFY API request timeout (seconds) |
| `REPORT_MAX_MESSAGES` | int | `200` | Maximum number of messages included in a report |
| `REPORT_STORAGE_PATH` | str | `reports` | Report storage path |
#### Scenario: DIFY API configuration
- **WHEN** a valid `DIFY_API_KEY` is set
- **THEN** the system SHALL be able to call the DIFY API to generate reports
#### Scenario: Missing DIFY API key
- **WHEN** `DIFY_API_KEY` is unset or set to an empty string
- **THEN** report generation SHALL return an appropriate error message
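A small sketch of the missing-key guard implied by the second scenario (the helper name is hypothetical):

```python
# Hypothetical guard - report generation should fail clearly when DIFY_API_KEY is empty.
from app.core.config import settings


def ensure_dify_configured() -> None:
    if not settings.DIFY_API_KEY:
        raise RuntimeError("DIFY_API_KEY is not set; report generation is unavailable")
```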
---
### Requirement: Frontend Environment Configuration
The frontend application SHALL configure its runtime settings through Vite's environment variable mechanism.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `VITE_API_BASE_URL` | str | `""` | API base URL (empty string means a relative path) |
| `VITE_PORT` | int | `3000` | Development server port |
| `VITE_BACKEND_URL` | str | `http://localhost:8000` | Backend API URL (development proxy) |
| `VITE_API_TIMEOUT_MS` | int | `30000` | API request timeout (ms) |
| `VITE_MESSAGES_REFETCH_INTERVAL_MS` | int | `30000` | Messages refetch interval (ms) |
| `VITE_MAX_RECONNECT_DELAY_MS` | int | `30000` | Maximum WebSocket reconnection delay (ms) |
| `VITE_REPORTS_STALE_TIME_MS` | int | `30000` | Report cache stale time (ms) |
#### Scenario: Development configuration
- **WHEN** `npm run dev` is run in a development environment
- **THEN** the frontend SHALL use `VITE_BACKEND_URL` as the proxy target for API requests
#### Scenario: Production configuration
- **WHEN** a production build is created with `VITE_API_BASE_URL` set
- **THEN** the frontend SHALL use that URL as the base path for API requests
---
### Requirement: Docker Environment Configuration
Docker deployments SHALL support configuring all services through an environment variable file.
**Environment variables supported by Docker Compose:**
| Variable | Type | Description |
|---------|------|------|
| `MINIO_ROOT_USER` | str | MinIO admin username |
| `MINIO_ROOT_PASSWORD` | str | MinIO admin password |
| `MINIO_API_PORT` | int | MinIO S3 API port (default 9000) |
| `MINIO_CONSOLE_PORT` | int | MinIO Console port (default 9001) |
#### Scenario: Loading Docker environment variables
- **WHEN** `docker-compose --env-file .env.docker up` is run
- **THEN** the Docker services SHALL use the configuration from that environment file
#### Scenario: Secure credential configuration
- **WHEN** non-default MinIO credentials are configured
- **THEN** the MinIO service SHALL use the custom credentials
---
### Requirement: CORS Security Configuration
The system SHALL require CORS allowed origins to be configured explicitly; wildcards are not permitted.
#### Scenario: CORS origin configuration
- **WHEN** `CORS_ORIGINS=http://localhost:3000,https://example.com` is set
- **THEN** the system SHALL accept cross-origin requests only from those origins
#### Scenario: CORS not configured
- **WHEN** the `CORS_ORIGINS` environment variable is not set
- **THEN** the system SHALL refuse to start and display an error message
#### Scenario: Development CORS
- **WHEN** `DEBUG=True` and `CORS_ORIGINS` includes `http://localhost:3000`
- **THEN** the development frontend SHALL be able to access the API normally
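A sketch of how `app/main.py` might apply this requirement (assuming a FastAPI application and its standard CORSMiddleware; the project's actual wiring may differ):

```python
# Sketch: parse the comma-separated whitelist and refuse to start without it.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.core.config import settings

origins = [o.strip() for o in settings.CORS_ORIGINS.split(",") if o.strip()]
if not origins:
    raise RuntimeError("CORS_ORIGINS must list at least one explicit origin")

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,   # explicit origins only - no "*"
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```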
---
### Requirement: Environment Example Files
The project SHALL provide complete environment variable example files covering every configurable item, with explanations.
**Required example files:**
- `.env.example` - backend environment variable examples
- `frontend/.env.example` - frontend environment variable examples
- `.env.docker.example` - Docker deployment environment variable examples
#### Scenario: New developer setup
- **WHEN** a new developer copies `.env.example` to `.env`
- **THEN** filling in only the required API keys SHALL be enough to start the development environment
#### Scenario: Production deployment setup
- **WHEN** operations staff configure the production environment based on `.env.example`
- **THEN** every environment variable SHALL have a clear description and example value

View File

@@ -0,0 +1,36 @@
# Tasks: Unified Environment Configuration
## 1. Backend Configuration Enhancement
- [x] 1.1 Extend the Settings class in `app/core/config.py` with all new environment variables
- [x] 1.2 Update `app/main.py` so that CORS origins are read from the `CORS_ORIGINS` environment variable
- [x] 1.3 Update `app/modules/realtime/router.py` to replace the hardcoded `SYSTEM_ADMIN_EMAIL` with an environment variable
- [x] 1.4 Update `app/modules/auth/services/ad_client.py` to make the timeout configurable
- [x] 1.5 Update `app/modules/realtime/services/message_service.py` to make the message edit time limit configurable
- [x] 1.6 Update `app/modules/realtime/websocket_manager.py` to make the typing timeout configurable
- [x] 1.7 Update `app/modules/file_storage/validators.py` to make the file size limits configurable
## 2. Frontend Configuration Enhancement
- [x] 2.1 Update `frontend/vite.config.ts` to read the port and backend URL from environment variables
- [x] 2.2 Update `frontend/src/services/api.ts` to read the API timeout from an environment variable
- [x] 2.3 Update `frontend/src/hooks/useMessages.ts` to read the refetch interval from an environment variable
- [x] 2.4 Update `frontend/src/hooks/useWebSocket.ts` to read the reconnection delay from an environment variable
- [x] 2.5 Update `frontend/src/hooks/useReports.ts` to read the cache stale time from an environment variable
## 3. Docker Configuration
- [x] 3.1 Update `docker-compose.minio.yml` to replace hardcoded credentials with environment variables
- [x] 3.2 Create the `.env.docker.example` example file
## 4. Documentation Updates
- [x] 4.1 Update the root `.env.example` with all backend environment variables and Chinese/English descriptions
- [x] 4.2 Update `frontend/.env.example` with all frontend environment variables and descriptions
- [x] 4.3 Update the existing `.env` file with all new environment variables
## 5. Testing & Validation
- [x] 5.1 Verify that every environment variable has a sensible default (the development environment runs as-is)
- [x] 5.2 Ensure the frontend builds successfully
- [x] 5.3 Verify the backend configuration loads correctly

View File

@@ -0,0 +1,66 @@
# Change: Migrate SQLite to MySQL with Table Prefix
## Why
The project currently uses SQLite as its development database and needs to migrate to a cloud-hosted MySQL database for production deployment. Because the MySQL database `db_A060` is shared with other projects, every table needs a `tr_` prefix to avoid naming conflicts.
**Migration goals:**
- Remove SQLite support entirely and use MySQL exclusively
- Add the `tr_` prefix to all tables (e.g. `users` → `tr_users`)
- Use Alembic for database version control and migration management
- Ensure migration scripts only touch tables with the `tr_` prefix
## What Changes
### 1. Database Configuration
- Update the `DATABASE_URL` environment variable format to support MySQL
- Remove the SQLite-specific handling in `app/core/database.py`
- Add a MySQL driver dependency (`pymysql` or `mysqlclient`)
### 2. Model Table Prefix (**BREAKING**)
All 10 tables will be renamed as follows (a model sketch follows the table):
| Old name | New name |
|--------|--------|
| `users` | `tr_users` |
| `user_sessions` | `tr_user_sessions` |
| `incident_rooms` | `tr_incident_rooms` |
| `room_members` | `tr_room_members` |
| `room_templates` | `tr_room_templates` |
| `messages` | `tr_messages` |
| `message_reactions` | `tr_message_reactions` |
| `message_edit_history` | `tr_message_edit_history` |
| `generated_reports` | `tr_generated_reports` |
| `room_files` | `tr_room_files` |
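As a sketch of the renaming pattern (column definitions here are hypothetical and `Base` is assumed to be exported from `app.core.database`; only the `tr_` naming convention is the point):

```python
# Illustrative SQLAlchemy model - table and index names carry the tr_ prefix.
from sqlalchemy import Column, Integer, String, Index

from app.core.database import Base


class User(Base):
    __tablename__ = "tr_users"          # previously "users"

    id = Column(Integer, primary_key=True)
    email = Column(String(255), nullable=False)   # hypothetical column

    __table_args__ = (
        Index("ix_tr_users_email", "email"),      # index name prefixed as well
    )
```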
### 3. Alembic Integration
- Initialize the Alembic migration framework
- Create the initial migration script (creates all `tr_`-prefixed tables)
- Remove the `Base.metadata.create_all()` auto table creation from `app/main.py`
### 4. Index and Constraint Naming
- Add the `tr_` prefix to all index names to avoid conflicts
- Update unique constraint names accordingly
### 5. MySQL Compatibility
- Ensure JSON columns work correctly on MySQL
- Ensure Enum types work correctly on MySQL
- Handle MySQL string length limits (VARCHAR vs TEXT)
## Impact
- **Affected specs**: new `database` spec
- **Affected code**:
  - `app/core/database.py` - remove SQLite support
  - `app/core/config.py` - possibly add a table prefix setting
  - `app/modules/*/models.py` - update `__tablename__` in all 5 models.py files
  - `app/main.py` - remove auto table creation, use Alembic instead
  - `requirements.txt` - add `alembic`, `pymysql`
  - `.env`, `.env.example` - update the DATABASE_URL format
- **Breaking changes**:
  - All tables are renamed (the database must be recreated or the migration run)
  - SQLite is no longer supported
  - Existing SQLite data is not migrated automatically (manual handling required)
- **New files**:
  - `alembic.ini` - Alembic configuration file
  - `alembic/` - migration script directory

View File

@@ -0,0 +1,97 @@
## ADDED Requirements
### Requirement: MySQL Database Support
The system SHALL use MySQL as the only supported database backend; SQLite is no longer supported.
**MySQL connection configuration:**
| Environment variable | Format | Description |
|---------|------|------|
| `DATABASE_URL` | `mysql+pymysql://user:pass@host:port/database` | MySQL connection string |
#### Scenario: Successful MySQL connection
- **WHEN** a valid MySQL connection string is provided
- **THEN** the system SHALL connect to the MySQL database successfully
#### Scenario: MySQL connection failure
- **WHEN** the MySQL server cannot be reached
- **THEN** the system SHALL display a clear connection error message and refuse to start
---
### Requirement: Table Prefix Convention
All tables SHALL use the `tr_` prefix to avoid naming conflicts with other projects in the shared database.
**Table naming rules:**
- All table names start with `tr_`
- All index names start with `ix_tr_`
- All unique constraint names start with `uq_tr_`
**Complete table list:**
| Module | Tables |
|------|-----------|
| Auth | `tr_users`, `tr_user_sessions` |
| Chat Room | `tr_incident_rooms`, `tr_room_members`, `tr_room_templates` |
| Realtime | `tr_messages`, `tr_message_reactions`, `tr_message_edit_history` |
| Report | `tr_generated_reports` |
| File Storage | `tr_room_files` |
#### Scenario: Table prefix verification
- **WHEN** the tables created by this system are inspected in the database
- **THEN** every table name SHALL start with `tr_`
#### Scenario: Index prefix verification
- **WHEN** the indexes created by this system are inspected in the database
- **THEN** every index name SHALL start with `ix_tr_`
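A quick verification sketch along the lines of the scenarios above, using SQLAlchemy's inspector (as the repository's test script also imports):

```python
# Sketch: list this project's tables and confirm the prefix convention.
from sqlalchemy import inspect

from app.core.database import engine

inspector = inspect(engine)
project_tables = [t for t in inspector.get_table_names() if t.startswith("tr_")]
print(sorted(project_tables))   # expect only tr_-prefixed names for this project
```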
---
### Requirement: Alembic Database Migration
The system SHALL use Alembic for database schema version control and migration management.
**Alembic configuration requirements:**
- Read the database connection from the `DATABASE_URL` environment variable
- Store migration scripts in the `alembic/versions/` directory
- Support the `alembic upgrade head` and `alembic downgrade` commands
#### Scenario: Running the database migration
- **WHEN** `alembic upgrade head` is run
- **THEN** the system SHALL create all `tr_`-prefixed tables
#### Scenario: Autogenerating a migration script
- **WHEN** `alembic revision --autogenerate` is run
- **THEN** Alembic SHALL compare the models against the database schema and generate a migration script
#### Scenario: Migration isolation
- **WHEN** any Alembic migration operation is run
- **THEN** only `tr_`-prefixed tables SHALL be affected; other projects' tables SHALL remain untouched
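The relevant parts of `alembic/env.py` might look roughly like this (a sketch, not the generated file, and it runs under the alembic CLI rather than standalone; `include_object` and the `tr_alembic_version` version table keep migrations scoped to this project's tables):

```python
# Sketch of alembic/env.py fragments.
import os

from alembic import context

from app.core.database import Base

config = context.config
# read the connection string from the environment instead of alembic.ini
config.set_main_option("sqlalchemy.url", os.environ["DATABASE_URL"])

target_metadata = Base.metadata


def include_object(obj, name, type_, reflected, compare_to):
    # ignore other projects' tables living in the shared database
    if type_ == "table":
        return name.startswith("tr_")
    return True

# later, inside run_migrations_online()/offline():
# context.configure(..., target_metadata=target_metadata,
#                   include_object=include_object,
#                   version_table="tr_alembic_version")
```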
---
### Requirement: MySQL Connection Pooling
The system SHALL manage MySQL connections through a connection pool to improve performance and stability.
**Connection pool configuration:**
| Parameter | Default | Description |
|------|--------|------|
| `pool_size` | 5 | Connection pool size |
| `max_overflow` | 10 | Maximum number of additional connections |
| `pool_recycle` | 3600 | Connection recycle time (seconds) |
#### Scenario: Connection pool in use
- **WHEN** multiple API requests access the database concurrently
- **THEN** the system SHALL obtain connections from the pool rather than opening a new connection each time
#### Scenario: Connection recycling
- **WHEN** a connection has been idle longer than `pool_recycle`
- **THEN** the system SHALL recycle it and open a new connection to avoid MySQL's wait_timeout problem
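A sketch of the engine setup these parameters imply (`pool_pre_ping` is an extra assumption, not listed in the table above):

```python
# Sketch: MySQL engine with the pool settings from the table above.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from app.core.config import settings

engine = create_engine(
    settings.DATABASE_URL,   # mysql+pymysql://user:pass@host:3306/db?charset=utf8mb4
    pool_size=5,             # connections kept open in the pool
    max_overflow=10,         # extra connections allowed under load
    pool_recycle=3600,       # recycle before MySQL's wait_timeout drops the socket
    pool_pre_ping=True,      # assumption: verify connections before handing them out
)
SessionLocal = sessionmaker(bind=engine, autocommit=False, autoflush=False)
```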
---
## REMOVED Requirements
### Requirement: SQLite Support
**Reason**: The project has fully migrated to MySQL and no longer needs SQLite support
**Migration**:
- Remove the SQLite-specific handling (`check_same_thread`) from `app/core/database.py`
- Update `.env.example` to remove the SQLite connection example
- Existing SQLite data must be migrated manually or recreated

View File

@@ -0,0 +1,49 @@
# Tasks: Migrate SQLite to MySQL
## 1. Dependencies Setup
- [x] 1.1 Add `pymysql` to requirements.txt
- [x] 1.2 Add `alembic` to requirements.txt
- [x] 1.3 Install the new dependencies
## 2. Model Updates (Add tr_ Prefix)
- [x] 2.1 Update `app/modules/auth/models.py` - `tr_users`, `tr_user_sessions`
- [x] 2.2 Update `app/modules/chat_room/models.py` - `tr_incident_rooms`, `tr_room_members`, `tr_room_templates`
- [x] 2.3 Update `app/modules/realtime/models.py` - `tr_messages`, `tr_message_reactions`, `tr_message_edit_history`
- [x] 2.4 Update `app/modules/report_generation/models.py` - `tr_generated_reports`
- [x] 2.5 Update `app/modules/file_storage/models.py` - `tr_room_files`
- [x] 2.6 Add the `tr_` prefix to all index and constraint names
## 3. Database Core Updates
- [x] 3.1 Update `app/core/database.py` to remove the SQLite-specific handling and add MySQL connection pool settings
- [x] 3.2 Update `app/main.py` to remove the `Base.metadata.create_all()` auto table creation
## 4. Alembic Setup
- [x] 4.1 Run `alembic init alembic` to initialize Alembic
- [x] 4.2 Configure `alembic/env.py` to read DATABASE_URL from the environment
- [x] 4.3 Update `alembic/env.py` to set target_metadata and the `tr_alembic_version` version table
- [x] 4.4 Create the initial migration script: `alembic revision --autogenerate -m "Initial migration - create tr_ prefixed tables"`
## 5. Environment Configuration
- [x] 5.1 Update `.env` to use a MySQL connection string
- [x] 5.2 Update `.env.example` with a MySQL connection example
- [x] 5.3 Remove SQLite-related comments and examples
## 6. Database Migration
- [x] 6.1 Run `alembic upgrade head` to create the tables
- [x] 6.2 Verify that all tables are created correctly in MySQL (11 tr_-prefixed tables)
## 7. Cleanup
- [x] 7.1 Delete the local SQLite database file `task_reporter.db`
- [x] 7.2 Confirm that `.gitignore` includes a `*.db` rule
## 8. Testing
- [x] 8.1 Verify that the backend application starts and connects to MySQL
- [x] 8.2 Verify that database CRUD operations work (tr_room_templates query succeeded)

View File

@@ -0,0 +1,162 @@
# env-config Specification
## Purpose
TBD - created by archiving change add-unified-env-config. Update Purpose after archive.
## Requirements
### Requirement: Centralized Backend Configuration
The system SHALL centralize all backend configuration in the Settings class in `app/core/config.py` and read the values from environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `DATABASE_URL` | str | (required) | Database connection string |
| `FERNET_KEY` | str | (required) | Encryption key |
| `AD_API_URL` | str | (required) | AD authentication API URL |
| `HOST` | str | `0.0.0.0` | Server bind address |
| `PORT` | int | `8000` | Server port |
| `DEBUG` | bool | `False` | Debug mode |
| `LOG_LEVEL` | str | `INFO` | Log level (DEBUG/INFO/WARNING/ERROR) |
| `CORS_ORIGINS` | str | (required) | Allowed CORS origins, comma-separated |
| `SYSTEM_ADMIN_EMAIL` | str | (required) | System administrator email |
| `AD_API_TIMEOUT_SECONDS` | int | `10` | AD API request timeout (seconds) |
| `SESSION_INACTIVITY_DAYS` | int | `3` | Session inactivity expiry (days) |
| `TOKEN_REFRESH_THRESHOLD_MINUTES` | int | `5` | Token refresh threshold (minutes) |
| `MAX_REFRESH_ATTEMPTS` | int | `3` | Maximum token refresh attempts |
| `MESSAGE_EDIT_TIME_LIMIT_MINUTES` | int | `15` | Message edit time limit (minutes) |
| `TYPING_TIMEOUT_SECONDS` | int | `3` | Typing indicator timeout (seconds) |
#### Scenario: Startup fails when a required variable is missing
- **WHEN** the application starts without a required environment variable (e.g. `DATABASE_URL`)
- **THEN** the application SHALL display a clear error message and refuse to start
#### Scenario: Startup with default values
- **WHEN** all required environment variables are provided but optional ones are left unset
- **THEN** the system SHALL start normally using the default values
---
### Requirement: File Storage Configuration
The system SHALL support configuring file storage through environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `MINIO_ENDPOINT` | str | `localhost:9000` | MinIO server address |
| `MINIO_ACCESS_KEY` | str | `minioadmin` | MinIO access key |
| `MINIO_SECRET_KEY` | str | `minioadmin` | MinIO secret key |
| `MINIO_BUCKET` | str | `task-reporter-files` | Default bucket name |
| `MINIO_SECURE` | bool | `False` | Whether to use HTTPS |
| `IMAGE_MAX_SIZE_MB` | int | `10` | Maximum image upload size (MB) |
| `DOCUMENT_MAX_SIZE_MB` | int | `20` | Maximum document upload size (MB) |
| `LOG_MAX_SIZE_MB` | int | `5` | Maximum log file upload size (MB) |
#### Scenario: Custom file size limit
- **WHEN** `IMAGE_MAX_SIZE_MB=20` is set
- **THEN** the system SHALL accept image uploads up to 20 MB
#### Scenario: MinIO connection configuration
- **WHEN** a custom MinIO endpoint and credentials are configured
- **THEN** the system SHALL use that configuration to connect to the MinIO service
---
### Requirement: AI Service Configuration
The system SHALL support configuring the AI report generation service through environment variables.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `DIFY_BASE_URL` | str | `https://dify.theaken.com/v1` | DIFY API base URL |
| `DIFY_API_KEY` | str | `""` | DIFY API key |
| `DIFY_TIMEOUT_SECONDS` | int | `120` | DIFY API request timeout (seconds) |
| `REPORT_MAX_MESSAGES` | int | `200` | Maximum number of messages included in a report |
| `REPORT_STORAGE_PATH` | str | `reports` | Report storage path |
#### Scenario: DIFY API configuration
- **WHEN** a valid `DIFY_API_KEY` is set
- **THEN** the system SHALL be able to call the DIFY API to generate reports
#### Scenario: Missing DIFY API key
- **WHEN** `DIFY_API_KEY` is unset or set to an empty string
- **THEN** report generation SHALL return an appropriate error message
---
### Requirement: Frontend Environment Configuration
The frontend application SHALL configure its runtime settings through Vite's environment variable mechanism.
**Environment variables:**
| Variable | Type | Default | Description |
|---------|------|--------|------|
| `VITE_API_BASE_URL` | str | `""` | API base URL (empty string means a relative path) |
| `VITE_PORT` | int | `3000` | Development server port |
| `VITE_BACKEND_URL` | str | `http://localhost:8000` | Backend API URL (development proxy) |
| `VITE_API_TIMEOUT_MS` | int | `30000` | API request timeout (ms) |
| `VITE_MESSAGES_REFETCH_INTERVAL_MS` | int | `30000` | Messages refetch interval (ms) |
| `VITE_MAX_RECONNECT_DELAY_MS` | int | `30000` | Maximum WebSocket reconnection delay (ms) |
| `VITE_REPORTS_STALE_TIME_MS` | int | `30000` | Report cache stale time (ms) |
#### Scenario: Development configuration
- **WHEN** `npm run dev` is run in a development environment
- **THEN** the frontend SHALL use `VITE_BACKEND_URL` as the proxy target for API requests
#### Scenario: Production configuration
- **WHEN** a production build is created with `VITE_API_BASE_URL` set
- **THEN** the frontend SHALL use that URL as the base path for API requests
---
### Requirement: Docker Environment Configuration
Docker deployments SHALL support configuring all services through an environment variable file.
**Environment variables supported by Docker Compose:**
| Variable | Type | Description |
|---------|------|------|
| `MINIO_ROOT_USER` | str | MinIO admin username |
| `MINIO_ROOT_PASSWORD` | str | MinIO admin password |
| `MINIO_API_PORT` | int | MinIO S3 API port (default 9000) |
| `MINIO_CONSOLE_PORT` | int | MinIO Console port (default 9001) |
#### Scenario: Loading Docker environment variables
- **WHEN** `docker-compose --env-file .env.docker up` is run
- **THEN** the Docker services SHALL use the configuration from that environment file
#### Scenario: Secure credential configuration
- **WHEN** non-default MinIO credentials are configured
- **THEN** the MinIO service SHALL use the custom credentials
---
### Requirement: CORS Security Configuration
The system SHALL require CORS allowed origins to be configured explicitly; wildcards are not permitted.
#### Scenario: CORS origin configuration
- **WHEN** `CORS_ORIGINS=http://localhost:3000,https://example.com` is set
- **THEN** the system SHALL accept cross-origin requests only from those origins
#### Scenario: CORS not configured
- **WHEN** the `CORS_ORIGINS` environment variable is not set
- **THEN** the system SHALL refuse to start and display an error message
#### Scenario: Development CORS
- **WHEN** `DEBUG=True` and `CORS_ORIGINS` includes `http://localhost:3000`
- **THEN** the development frontend SHALL be able to access the API normally
---
### Requirement: Environment Example Files
The project SHALL provide complete environment variable example files covering every configurable item, with explanations.
**Required example files:**
- `.env.example` - backend environment variable examples
- `frontend/.env.example` - frontend environment variable examples
- `.env.docker.example` - Docker deployment environment variable examples
#### Scenario: New developer setup
- **WHEN** a new developer copies `.env.example` to `.env`
- **THEN** filling in only the required API keys SHALL be enough to start the development environment
#### Scenario: Production deployment setup
- **WHEN** operations staff configure the production environment based on `.env.example`
- **THEN** every environment variable SHALL have a clear description and example value

View File

@@ -5,7 +5,7 @@ python-multipart==0.0.6
# Database
sqlalchemy==2.0.25
psycopg2-binary==2.9.9
pymysql==1.1.0
alembic==1.13.1
# Object Storage

View File

@@ -7,7 +7,7 @@ import sys
import os
# Add project root to path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from sqlalchemy import inspect
from app.core.database import engine, SessionLocal