LocalAI changes:
- Rewrite localaictl to use Docker/Podman instead of a standalone binary
- Use the localai/localai:v2.25.0-ffmpeg image, which bundles all backends
- Fix the llama-cpp backend-not-found issue
- Auto-detect podman or docker runtime
- Update the UCI config with Docker settings

New Ollama package:
- Add secubox-app-ollama as a lighter alternative to LocalAI
- Native ARM64 support with backends included
- Simple CLI: ollamactl pull/run/list
- Docker image ~1GB vs 2-4GB for LocalAI

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
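The runtime auto-detection mentioned above could be sketched roughly like this. This is a hypothetical illustration, not the actual localaictl code: the function name `detect_runtime` is an assumption, and the only behavior taken from the commit is "prefer podman or docker, whichever is installed".

```shell
#!/bin/sh
# Hypothetical sketch of container-runtime auto-detection (not the real
# localaictl implementation): prefer podman, fall back to docker.
detect_runtime() {
    if command -v podman >/dev/null 2>&1; then
        echo podman
    elif command -v docker >/dev/null 2>&1; then
        echo docker
    else
        echo "error: neither podman nor docker found" >&2
        return 1
    fi
}

# Resolve the runtime once; "none" if no runtime is installed.
RUNTIME="$(detect_runtime || echo none)"
echo "Using runtime: $RUNTIME"
```

A wrapper like localaictl would then invoke `"$RUNTIME" run ...` with the chosen binary, so the same script works unchanged on hosts with either runtime.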