
Ollama

Ollama connections enable integration with locally hosted Ollama instances for AI-powered playbook actions without external API dependencies.

Used By

  • AI Action - Use self-hosted AI for log analysis and automation
| Field  | Description             | Scheme |
| ------ | ----------------------- | ------ |
| model* | Ollama model to use     | string |
| url*   | Ollama API endpoint URL | EnvVar |
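
The url field uses the EnvVar scheme, so it can hold a literal value (as in the example below) or resolve the endpoint from a Kubernetes ConfigMap or Secret. A minimal sketch, assuming the standard Flanksource EnvVar valueFrom form; the secret name ollama-credentials and key endpoint are hypothetical:

url:
  valueFrom:
    secretKeyRef:
      name: ollama-credentials   # hypothetical Secret holding the Ollama endpoint
      key: endpoint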

Example

ollama-connection.yaml
apiVersion: mission-control.flanksource.com/v1
kind: Connection
metadata:
  name: ollama-local
spec:
  ollama:
    model: llama3.1:8b
    url:
      value: http://ollama.ollama-system.svc:11434
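
Once the connection is applied, a playbook's AI action can reference it by name. The sketch below is illustrative only: the ai action fields (connection, prompt) and the connection://ollama-local reference format are assumptions based on how other Mission Control actions reference connections, not taken from this page.

analyze-logs-playbook.yaml
apiVersion: mission-control.flanksource.com/v1
kind: Playbook
metadata:
  name: analyze-logs
spec:
  actions:
    - name: analyze                              # hypothetical action name
      ai:
        connection: connection://ollama-local    # assumed reference format
        prompt: Summarize the recent error logs for this component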