Co-Founder & CEO | Ollama
Serial founder who built Kitematic (acquired by Docker), Infra, and now Ollama — the leading open-source platform for running LLMs locally with 165k+ GitHub stars. University of Waterloo BSE. Previously senior engineer at Docker, intern at Google and Twitter. Ollama won GitHub's inaugural Awesome AI Award (2024) and was the fastest-growing open-source AI project by contributor count.
Biography
Jeffrey Morgan is the Co-Founder and CEO of Ollama, the leading open-source platform for running large language models locally. Ollama's GitHub repository has amassed over 165,000 stars, making it one of the fastest-growing open-source projects in history and the winner of GitHub's inaugural Awesome AI Award in 2024. Before Ollama, Morgan co-founded Kitematic, a Mac tool for simplifying Docker container management that was acquired by Docker in March 2015. He then served as a Senior Engineer and Product Manager at Docker from 2014 to 2019. He subsequently co-founded Infra (2021-2023), an infrastructure access management startup. Morgan holds a Bachelor of Software Engineering from the University of Waterloo (2008-2013) and had early career stints at Google and Twitter. He co-founded Ollama in 2023 with Michael Chiang; the company graduated from Y Combinator's W21 batch and is headquartered in Palo Alto.
Ollama: Open-source platform for running LLMs locally with a simple CLI (ollama run). 165,600+ GitHub stars; winner of GitHub's inaugural Awesome AI Award (2024). Supports dozens of model families, including Llama, DeepSeek, Qwen, Gemma, and Mistral, across macOS, Linux, and Windows.
ollama-python: Official Python client for the Ollama API, enabling programmatic access to local LLM inference. 9,600+ GitHub stars.
ollama-js: Official JavaScript/TypeScript client for Ollama, powering web and Node.js integrations with local models. 4,070+ GitHub stars.
Kitematic: Mac GUI for managing Docker containers that cut Docker setup time from 30-60 minutes to a few minutes. Co-founded with Michael Chiang and Sean Li while at the University of Waterloo; acquired by Docker in March 2015.
Infra: Infrastructure access management platform for safe, consistent, and fully automated provisioning. Co-founded in 2021.
Drop-in API compatibility with OpenAI's chat completions format, allowing existing OpenAI-based applications to switch to local models with minimal code changes.
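To illustrate what "drop-in compatibility" means in practice, here is a minimal sketch of a chat-completions request body in OpenAI's format. With a local Ollama server running, the same JSON can be sent to its OpenAI-compatible endpoint; the model name "llama3.2" and the default endpoint URL shown in the comment are illustrative assumptions.

```python
import json

# Sketch: an OpenAI-style chat completions request body. With a local
# Ollama server running, this JSON could be POSTed to
# http://localhost:11434/v1/chat/completions ("llama3.2" is an example
# model name, not a requirement).
payload = {
    "model": "llama3.2",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

Because the request and response shapes match, existing applications built on an OpenAI SDK can typically switch to a local model by changing only the base URL (and supplying a placeholder API key), which is what makes the migration "drop-in".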
Research-driven approach combining local and cloud LLMs for privacy-preserving yet capable AI workflows, including the 'Secure Minions' extension.
Research generated March 19, 2026