# Ollama

Detailed information on the Ollama conversation component

## Component format

A Dapr conversation.yaml component file has the following structure:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama
spec:
  type: conversation.ollama
  metadata:
  - name: model
    value: llama3.2:latest
  - name: responseCacheTTL
    value: 10m
```
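Once the component is registered, an application talks to it through the Dapr conversation API (alpha) on the sidecar. The sketch below builds the request an app would send; the default sidecar port (3500) and the `v1.0-alpha1` `converse` payload shape are assumptions based on the alpha API, and no request is actually sent:

```python
import json

# Hypothetical helper: builds the HTTP call an app would make to the Dapr
# sidecar's conversation API (alpha) for the "ollama" component above.
def build_converse_request(component: str, prompt: str, dapr_port: int = 3500):
    url = f"http://localhost:{dapr_port}/v1.0-alpha1/conversation/{component}/converse"
    body = json.dumps({"inputs": [{"content": prompt}]})
    return url, body

url, body = build_converse_request("ollama", "What is Dapr?")
print(url)   # POST target on the sidecar
print(body)  # JSON payload carrying the prompt
```

The sidecar forwards the prompt to the local Ollama server and returns the model's reply in the response body.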

## Spec metadata fields

| Field | Required | Details | Example |
|-------|----------|---------|---------|
| `model` | N | The Ollama LLM to use. Defaults to `llama3.2:latest`. | `phi4:latest` |
| `responseCacheTTL` | N | Time-to-live for the in-memory response cache. When set, identical requests are served from the cache until the entry expires. | `10m` |
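The `responseCacheTTL` behavior can be pictured as a small time-bounded cache keyed by the request. This is a minimal Python sketch of the idea, not Dapr's actual implementation:

```python
import time

# Sketch of a TTL response cache: an identical prompt is answered from
# memory until its entry is older than the TTL (e.g. "10m" -> 600 seconds).
class ResponseCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._entries = {}  # prompt -> (response, stored_at)

    def get(self, prompt):
        hit = self._entries.get(prompt)
        if hit is None:
            return None
        response, stored_at = hit
        if time.monotonic() - stored_at > self.ttl:
            del self._entries[prompt]  # entry expired; fall through to the LLM
            return None
        return response

    def put(self, prompt, response):
        self._entries[prompt] = (response, time.monotonic())

cache = ResponseCache(ttl_seconds=600)  # responseCacheTTL: 10m
cache.put("What is Dapr?", "Dapr is a portable runtime ...")
print(cache.get("What is Dapr?"))  # served from the cache, no model call
```

Repeated prompts within the TTL skip the model entirely, which is the point of the setting: cheap, fast answers for hot, identical requests.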

## OpenAI Compatibility

Ollama exposes an OpenAI-compatible API, so you can use the OpenAI component with Ollama models by making the following changes:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama-openai
spec:
  type: conversation.openai # use the openai component type
  metadata:
  - name: key
    value: 'ollama' # any non-empty string; Ollama does not check the key
  - name: model
    value: gpt-oss:20b  # any Ollama model (https://ollama.com/search); here, OpenAI's open-weight gpt-oss model. See https://ollama.com/library/gpt-oss
  - name: endpoint
    value: 'http://localhost:11434/v1' # ollama's OpenAI-compatible endpoint
```
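To see what "OpenAI-compatible" means on the wire, the sketch below builds the OpenAI-style `chat/completions` request that ends up reaching Ollama with the configuration above. The endpoint, model, and dummy key mirror the YAML; the request object is only constructed, not sent, so no server is needed:

```python
import json
import urllib.request

# Build (but do not send) the OpenAI-format request that the component
# above directs at the local Ollama server.
endpoint = "http://localhost:11434/v1"
req = urllib.request.Request(
    url=f"{endpoint}/chat/completions",
    data=json.dumps({
        "model": "gpt-oss:20b",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer ollama",  # the non-empty dummy key
    },
    method="POST",
)
print(req.full_url)
```

Because the request shape is plain OpenAI, any OpenAI client pointed at `http://localhost:11434/v1` works the same way; the component simply supplies these values for you.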