Gemma Crosses 150 Million Downloads
Google’s open AI model family, Gemma, has passed a significant milestone: more than 150 million downloads since its launch in February 2024. The achievement highlights the growing appeal of openly available AI models and the developer community’s enthusiasm for accessible tools.
The news was shared by Omar Sanseviero, a developer relations engineer at Google DeepMind, via a post on X (formerly Twitter), noting that over 70,000 custom versions of Gemma have been created and shared on the Hugging Face platform.
🌍 A Truly Global Open AI Model
Gemma models are designed to be multilingual, supporting over 100 languages, and have been fine-tuned for applications ranging from natural language processing to drug discovery and scientific research. Google has positioned Gemma as a lighter-weight, open alternative to proprietary LLMs such as GPT-4 and Claude.
Learn more about the models at Google’s official site:
🔗 https://ai.google.dev/gemma
📊 Open-Source AI Race: Meta vs Google
Despite Gemma’s rapid growth, it still trails Meta’s LLaMA models, which recently surpassed 1.2 billion downloads, according to TechCrunch.
Both Google and Meta have been criticized for using non-standard open-source licenses that may limit commercial use, a concern shared by many in the AI startup ecosystem.
🔧 Developer Momentum & Community Adoption
The explosion of custom Gemma variants on Hugging Face reflects a vibrant open-source community. Many developers cite Gemma’s model efficiency and modularity as key advantages over heavier, cloud-reliant solutions.
Google continues to push updates and community tools to support Gemma’s evolution as a leading open model in the AI landscape.
