Co-founder & CTO | Hugging Face
Co-founded Hugging Face and led development of the Transformers library and Hub, building the central platform for open-source AI with 1M+ hosted models.
Biography
Julien Chaumond is the co-founder and Chief Technology Officer (CTO) of Hugging Face, the leading open-source AI platform often called 'the GitHub of machine learning.' He co-founded the company in 2016 with Clément Delangue (CEO) and Thomas Wolf (CSO); it began as a chatbot app for teenagers before pivoting in 2018 to building open-source ML infrastructure. Under his technical leadership, Hugging Face created the Transformers library (158k+ GitHub stars) and the Hugging Face Hub (hosting 1M+ models, 300k+ datasets, and 300k+ Spaces), and helped launch the BigScience initiative that produced BLOOM, a 176B-parameter open-access multilingual language model.
Chaumond studied Applied Mathematics at École Polytechnique (2003-2006) and Computer Science at Télécom Paris (2006), and earned an M.S. in Electrical Engineering and Computer Science from Stanford University (2006-2007). Before Hugging Face, he co-founded Glose (a social reading app), served as an advisor to the French Deputy Minister for Digital Affairs, and worked at Stupeflix and in the Corps des Télécommunications.
Hugging Face raised $235M in its Series D at a $4.5B valuation in August 2023, with investors including Google, Amazon, Nvidia, Salesforce, Intel, AMD, and Qualcomm, and the company reached $130M+ in annual revenue by late 2024. Based in Brooklyn, NY, Chaumond continues to champion open-source AI, on-device inference, and community-driven ML development.
Co-created Transformers, the most popular open-source ML library (158k+ GitHub stars), providing a unified API for thousands of pretrained models across NLP, vision, audio, and multimodal tasks.
Architected the Hugging Face Hub, the git-based platform hosting 1M+ models, 300k+ datasets, and 300k+ Spaces that has become the central distribution layer for open-source AI.
Co-launched the BigScience workshop that brought 1,200+ researchers across 39 countries together to train BLOOM, a 176B-parameter open-access multilingual language model.
Pioneered on-device transformer inference on Apple platforms by building Swift Core ML implementations of GPT-2, BERT, and DistilBERT.
Launched HuggingChat, the first major open-source alternative to ChatGPT, powered by community-built open models.
Led development of production inference infrastructure, including ONNX optimization, serverless GPU endpoints, and millisecond-scale latency for BERT-scale models.
Published a minimalist 50-line JavaScript agent implementation demonstrating Model Context Protocol (MCP) integration with Hugging Face models.
Some people said that closed APIs were winning... but we will never give up the fight for open source AI.
The intersection of open source, open science, and machine learning is going to be incredibly successful and it's going to bring life to a new generation of open-source-type companies.
Language is the API to humans, right? Between humans. It's what you use to communicate.
If you manage to achieve a cost-effective way of reaching a few milliseconds inference times at scale on BERT size models, there's pretty much no reason not to put them into production.
Machine learning is going to be one way to make software engineering more accessible. So anyone can start building stuff no matter where they are.
Democratization of AI will be one of the biggest achievements for society and no single company, not even a Big Tech business, can do it alone.
Research generated March 19, 2026