Initial commit: Llama API Client with full documentation

- Added complete Python client for Llama AI models
- Support for internal network endpoints (tested and working)
- Support for external network endpoints (configured)
- Interactive chat interface with multiple models
- Automatic endpoint testing and failover
- Response cleaning for special markers
- Full documentation in English and Chinese
- Complete test suite and examples
- MIT License and contribution guidelines
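The client code itself is not shown on this page, so the following is only a minimal sketch of how the automatic endpoint testing and failover listed above could work, assuming the endpoints expose an OpenAI-compatible /models probe route; the endpoint list, key placeholder, and function name are illustrative, not the repository's actual code.

# Minimal failover sketch (assumed behaviour, not the repository's actual code).
# Endpoint URLs come from 連線參數.txt; the /models probe route is an assumption.
import requests

ENDPOINTS = [
    "https://llama.theaken.com/v1/gpt-oss-120b",
    "https://llama.theaken.com/v1/deepseek-r1-671b",
]
API_KEY = "YOUR_API_KEY"  # placeholder; the real key is in 連線參數.txt

def first_reachable_endpoint(endpoints, timeout=5):
    """Return the first endpoint that answers a simple probe, or None."""
    for base in endpoints:
        try:
            r = requests.get(
                f"{base}/models",
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=timeout,
            )
            if r.ok:
                return base
        except requests.RequestException:
            continue  # endpoint unreachable, try the next one
    return None

A caller would then send chat requests to whichever base URL the probe returned, falling back to the next endpoint on failure.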
commit c6cc91da7f
2025-09-19 21:38:15 +08:00
18 changed files with 2072 additions and 0 deletions

連線參數.txt (new file, 14 lines)

@@ -0,0 +1,14 @@
You can connect to the Llama AI models for chat.
The connection details are as follows:
External network endpoints:
https://llama.theaken.com/v1/gpt-oss-120b/
https://llama.theaken.com/v1/deepseek-r1-671b/
https://llama.theaken.com/v1/gpt-oss-120b/
External model paths:
1. /gpt-oss-120b/
2. /deepseek-r1-671b/
3. /qwen3-embedding-8b/
Key: paVrIT+XU1NhwCAOb0X4aYi75QKogK5YNMGvQF1dCyo=
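
For reference, a minimal request against one of the endpoints above might look like the sketch below, assuming the server is OpenAI-compatible (the /v1/... paths suggest this); the /chat/completions route, the model field value, and the payload shape are assumptions, not documented behaviour.

# Minimal chat request sketch; OpenAI-compatible /chat/completions route and
# payload shape are assumptions based on the /v1/... paths listed above.
import requests

BASE_URL = "https://llama.theaken.com/v1/gpt-oss-120b"
API_KEY = "paVrIT+XU1NhwCAOb0X4aYi75QKogK5YNMGvQF1dCyo="  # key from 連線參數.txt

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-oss-120b",  # assumed model identifier
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

Swapping BASE_URL and the model name for /deepseek-r1-671b/ would target the DeepSeek model; the /qwen3-embedding-8b/ path would presumably use an embeddings route instead.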