- Added complete Python client for Llama AI models
- Support for internal network endpoints (tested and working)
- Support for external network endpoints (configured)
- Interactive chat interface with multiple models
- Automatic endpoint testing and failover (see the sketch after this list)
- Response cleaning for special markers
- Full documentation in English and Chinese
- Complete test suite and examples
- MIT License and contribution guidelines
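The client code itself is not included in this excerpt. As a rough illustration of the "automatic endpoint testing and failover" and "response cleaning" behavior described above, here is a minimal sketch. The endpoint URLs, the `/api/chat` path, and the `clean_response` / `chat_with_failover` names are assumptions for illustration, not the repository's actual API.

```python
import re
import requests

# Hypothetical endpoint list: internal network first, external as fallback.
ENDPOINTS = [
    "http://llama.internal.example:8080",   # internal endpoint (assumed URL)
    "https://llama.external.example:8443",  # external endpoint (assumed URL)
]

def clean_response(text: str) -> str:
    """Strip special markers (e.g. <|...|>-style tags) from model output."""
    return re.sub(r"<\|[^|]*\|>", "", text).strip()

def chat_with_failover(prompt: str, model: str = "llama3") -> str:
    """Send a chat request, falling back to the next endpoint on failure."""
    last_error = None
    for base_url in ENDPOINTS:
        try:
            resp = requests.post(
                f"{base_url}/api/chat",  # assumed request path
                json={"model": model, "prompt": prompt},
                timeout=10,
            )
            resp.raise_for_status()
            return clean_response(resp.json().get("response", ""))
        except requests.RequestException as err:
            last_error = err  # endpoint unreachable or errored; try the next one
    raise RuntimeError(f"All endpoints failed: {last_error}")

if __name__ == "__main__":
    print(chat_with_failover("Hello!"))
```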
JSON · 14 lines · 247 B
{
  "permissions": {
    "allow": [
      "Bash(pip install:*)",
      "Bash(python:*)",
      "Bash(ping:*)",
      "Bash(curl:*)",
      "Bash(dir)",
      "Bash(git init:*)",
      "Bash(git add:*)"
    ],
    "defaultMode": "acceptEdits"
  }
}
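This appears to be a Claude Code local settings file (e.g. `.claude/settings.local.json`): the `permissions.allow` list whitelists the specific shell commands used while building the project (pip install, python, ping, curl, dir, git init, git add), and `"defaultMode": "acceptEdits"` lets file edits be applied without per-edit confirmation.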