AI startup Sarvam launches two made-in-India large language models
Sarvam launches 30B and 105B parameter indigenous LLMs trained on Indian languages, positioning India closer to a sovereign, voice-first AI ecosystem
Indian artificial intelligence (AI) startup Sarvam on Wednesday launched two indigenous large language models (LLMs) trained specifically on Indian languages.
The announcement covers a 30-billion-parameter and a 105-billion-parameter model. The smaller model handles real-time conversations with a 32,000-token context window, which keeps inference costs low, while the larger model offers a 128,000-token context window for more complex tasks.
Benchmark results shared at the launch show the model performing competitively against international models including Gemma 27B, Mistral-32-24B, Nemotron-30B, Qwen-30B, and GPT-OSS-20B across tasks measuring mathematical reasoning, coding accuracy, and general problem-solving.
“It is on par with most other open and closed frontier models of its class, and designed to do complex reasoning tasks very well,” Sarvam cofounder Pratyush Kumar said at the launch.
“This 105 billion parameter model can meet most benchmarks, be it a DeepSeek R1 model that was released a year ago on 600 billion parameters. This was a model trained from scratch, one-sixth the size of that model, and today is providing intelligence which is competitive to what DeepSeek was earlier. It is also cheaper than Google’s Gemini Flash but outperforms it on many benchmarks,” he added.
At the launch, Vikram — Sarvam’s AI chatbot — spoke in several Indian languages, including Punjabi and Hindi. It also explained that the name “Vikram” was kept in honour of Vikram Sarabhai, the Indian physicist and astronomer.
Sarvam cofounder Vivek Raghavan had told Business Standard in November that the launch was supposed to happen at the India AI Impact Summit.
Sarvam AI was selected by the India AI Mission this year to build the country’s first sovereign LLM ecosystem, developing an open-source 120-billion-parameter AI model to enhance governance and public service access through use cases such as 2047: Citizen Connect and AI4Pragati.
Besides Sarvam, Soket will also develop an open-source 120-billion-parameter foundation model optimised for India’s linguistic diversity, targeting sectors such as defence, healthcare, and education. Gnani launched its own model earlier this week, while Gan AI will create a 70-billion-parameter multilingual foundation model targeting text-to-speech capabilities.
The models support all 22 Indian languages and are optimised for voice-first interaction, an important step given that voice AI is expected to bring the benefits of AI closer to the masses.
The launch comes against the backdrop of OpenAI introducing IndQA in November, a benchmark designed to evaluate how well AI models understand and reason about questions in Indian languages across a wide range of cultural domains, as the company deepens its focus on India, its second-largest user base after the United States (US).
Similarly, Anthropic has added support for 10 Indic languages in Claude, including Hindi, Bengali, Marathi and Urdu.
Sarvam has raised about $50 million in funding from investors including Khosla Ventures, Lightspeed Venture Partners and Peak XV.
First Published: Feb 18 2026 | 8:47 PM IST