Initial commit: Llama API Client with full documentation
- Added complete Python client for Llama AI models
- Support for internal network endpoints (tested and working)
- Support for external network endpoints (configured)
- Interactive chat interface with multiple models
- Automatic endpoint testing and failover
- Response cleaning for special markers
- Full documentation in English and Chinese
- Complete test suite and examples
- MIT License and contribution guidelines
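Two of the bullets above — automatic endpoint failover and response cleaning for special markers — can be sketched as below. This is an illustrative sketch, not the committed client: the `/models` probe route and the `<think>` marker format (used by DeepSeek-R1-style models) are assumptions.

```python
import re
import urllib.request
import urllib.error


def pick_endpoint(candidates, timeout=5.0):
    """Hypothetical failover: return the first base URL whose /models route answers."""
    for base in candidates:
        try:
            with urllib.request.urlopen(f"{base}/models", timeout=timeout) as resp:
                if resp.status == 200:
                    return base
        except (urllib.error.URLError, OSError):
            continue  # endpoint unreachable; try the next one
    return None


# Reasoning markers that some models (e.g. DeepSeek-R1) emit before the answer.
THINK_MARKERS = re.compile(r"<think>.*?</think>", re.DOTALL)


def clean_response(text):
    """Strip assumed special markers from a model reply and trim whitespace."""
    return THINK_MARKERS.sub("", text).strip()
```

A caller would probe the internal endpoint first and fall back to the external one, then pass every reply through `clean_response` before display.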
This commit is contained in:
連線參數.txt — new file, 14 additions
@@ -0,0 +1,14 @@
You can connect to the llama models for AI conversation.

The connection details are as follows:

External network connections:
https://llama.theaken.com/v1/gpt-oss-120b/
https://llama.theaken.com/v1/deepseek-r1-671b/

External model paths:
1. /gpt-oss-120b/
2. /deepseek-r1-671b/
3. /qwen3-embedding-8b/

API key: paVrIT+XU1NhwCAOb0X4aYi75QKogK5YNMGvQF1dCyo=
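The connection parameters above are enough to build a request. The sketch below assumes the endpoints speak an OpenAI-compatible chat-completions protocol (`POST {base}/chat/completions` with a bearer token); the exact routing is an assumption, since the notes embed model paths directly in the URLs.

```python
BASE_URL = "https://llama.theaken.com/v1"  # external endpoint from the notes above
API_KEY = "paVrIT+XU1NhwCAOb0X4aYi75QKogK5YNMGvQF1dCyo="  # key from this file


def build_chat_request(model, prompt):
    """Assemble an OpenAI-style chat-completions request (assumed wire format)."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model.strip("/"),  # paths like /gpt-oss-120b/ map to model names
            "messages": [{"role": "user", "content": prompt}],
        },
    }


def chat(model, prompt, timeout=30.0):
    """Send the request; needs the third-party `requests` package installed."""
    import requests  # imported lazily so build_chat_request stays dependency-free

    req = build_chat_request(model, prompt)
    resp = requests.post(req["url"], headers=req["headers"], json=req["json"], timeout=timeout)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

For example, `chat("/gpt-oss-120b/", "Hello")` would target the first endpoint listed above, under the stated assumptions.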