Creator & Independent Developer |Datasette
Co-creator of Django, creator of Datasette and LLM CLI. Coined 'prompt injection'. Builds open-source tools for data journalism and AI-assisted development.
Biography
Simon Willison is a British software developer, independent open-source contributor, and the co-creator of the Django web framework. Born in January 1981, he studied at the University of Bath and began his career in 2000 as a webmaster at Gameplay. In 2003 he joined the Lawrence Journal-World in Kansas on a year-long internship, where he and Adrian Holovaty created Django to streamline newsroom web development. After graduating in 2005 he joined Yahoo's Technology Development team, then in 2008 became a software architect at The Guardian. In 2010 he co-founded Lanyrd, a social conference directory backed by Y Combinator, which was acquired by Eventbrite in 2013 where he rose to Engineering Director. Since leaving Eventbrite he has worked full-time as an independent open-source developer building tools for data journalism. He created Datasette (2017), an open-source tool for exploring and publishing SQLite data; sqlite-utils, a CLI and Python library for manipulating SQLite databases; and LLM (2023), a command-line tool and Python library for accessing large language models. He coined the term 'prompt injection' in 2022 to describe a critical class of LLM security vulnerabilities. Willison is a Python Software Foundation board member (since 2022), a GitHub Star, and a participant in the GitHub Accelerator program (2023). He is based in Half Moon Bay, California.
Co-created the Django web framework in 2003 at the Lawrence Journal-World. Django became one of the most popular Python web frameworks, powering Instagram, Pinterest, Mozilla, and thousands of other sites.
Open-source multi-tool for exploring and publishing data from SQLite databases as interactive web applications. 10,800+ GitHub stars, with a plugin ecosystem of 100+ extensions.
Command-line tool and Python library for accessing large language models (OpenAI, Anthropic, Gemini, Ollama) via a plugin architecture. 11,300+ GitHub stars.
Python CLI utility and library for manipulating SQLite databases with a clean, composable API. 2,000+ GitHub stars.
CLI tool to concatenate a directory of files into a single prompt for use with LLMs. 2,600+ GitHub stars.
Command-line utility for taking automated screenshots of websites, powered by Playwright. 2,300+ GitHub stars.
Coined the term 'prompt injection' in September 2022 (named after SQL injection). Became a leading voice on LLM security, developing concepts like the 'lethal trifecta' for AI agent vulnerabilities.
Public collection of 575+ short-form learning notes, published as a Datasette-powered site. 1,400+ GitHub stars.
It's not about getting work done faster, it's about being able to ship projects that I wouldn't have been able to justify spending time on at all.
Think of them as an over-confident pair programming assistant who's lightning fast at looking things up.
If someone tells you that coding with LLMs is easy they are (probably unintentionally) misleading you.
Most of the craft of getting good results out of an LLM comes down to managing its context.
I think it's really useful to have a model hallucinate at you early because it helps you get that better mental model of what it can do.
The one thing you absolutely cannot outsource to the machine is testing that the code actually works.
Research generated March 19, 2026