Ollama

Get up and running with large language models.

Run Llama 3.3, DeepSeek-R1, Phi-4, Mistral, Gemma 3, and other models, locally.
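Once the `ollama` CLI is installed, a typical session looks like the sketch below; the model names are examples, and exact tags depend on what is published in the model library:

```shell
# Download a model and start an interactive chat with it
ollama run llama3.3

# Or just fetch a model without opening a chat session
ollama pull gemma3

# List the models available locally
ollama list
```

`ollama run` pulls the model automatically on first use, so a separate `pull` is only needed when you want to pre-download weights.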

Available for macOS, Linux, and Windows