ollama-local
Manage and use local Ollama models for inference, embeddings, and tool use.
v1.1.0
By cmdop · AI & Agents · Tags: chat, completions, embeddings, llm, local, management, model, tool use · 3/2/2026
Overview: This skill provides a comprehensive set of tools for working with local Ollama models: model management, chat, completions, embeddings, and tool use. Key Features:
- Model management (list, pull, remove)
- Chat and completions with local models
- Embeddings and tool use with local LLMs
- OpenClaw sub-agent integration and model selection guidance
- Direct API access for custom integrations
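The chat and model-management features above map onto Ollama's standard local REST API. As a minimal sketch (assuming a default Ollama server on `localhost:11434`; `chat_payload`, `list_models`, and `chat` are illustrative helper names, not part of the skill's scripts):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address

def chat_payload(model, prompt):
    """Build the JSON body for a non-streaming POST /api/chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def list_models(base_url=OLLAMA_URL):
    """Return installed model names via GET /api/tags."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def chat(model, prompt, base_url=OLLAMA_URL):
    """Send a single-turn chat request and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Models are pulled or removed with the `ollama pull <model>` and `ollama rm <model>` CLI commands before use; the HTTP helpers above then work against whatever is installed locally.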
How It Works: The skill combines command-line tools with Python scripts that talk to the local Ollama server. The ollama.py script is a quick reference for common tasks (listing, pulling, and querying models), while ollama_tools.py handles more complex tool-use workflows. A model selection guide and OpenClaw sub-agent integration are also included. Use Cases:
- Use local models for inference and embeddings
- Integrate local LLMs with OpenClaw sub-agents
- Use the direct API for custom integrations
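For custom integrations, embeddings and tool use go through the same local API. A sketch of the request bodies involved (the `get_weather` tool is a hypothetical example; Ollama's `/api/embeddings` endpoint takes `model` and `prompt`, and `/api/chat` accepts a `tools` list for models that support function calling):

```python
def embeddings_payload(model, text):
    """Body for POST /api/embeddings; the response carries an 'embedding' vector."""
    return {"model": model, "prompt": text}

def tool_chat_payload(model, prompt, tools):
    """Body for POST /api/chat with tool definitions; the model may answer
    with message.tool_calls instead of plain text."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "tools": tools,
        "stream": False,
    }

# Hypothetical tool definition in the OpenAI-style schema Ollama expects.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
```

When the model emits a `tool_calls` entry, the caller runs the named function locally and sends the result back as a `role: "tool"` message to continue the conversation.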