Sridhar Vembu Champions Low‑Compute AI as Sustainable Alternative
Bengaluru‑based Sarvam AI is gaining attention in India’s push to evolve from an AI consumer into a builder of indigenous models, particularly those tailored to Indian languages and governance workflows. Sridhar Vembu, founder and former CEO of Zoho, praised Sarvam AI’s approach on X, asserting that advanced AI does not require the massive computing and energy footprints of today’s dominant systems. He argued that “world‑class AI can be done much more affordably and sustainably,” stressing that future AI, especially code‑generation tools, must prioritise efficiency because “the earth cannot afford today’s AI energy footprint.”
Sarvam AI originated from the AI4Bharat research ecosystem and has drawn early‑stage funding to scale its research and deployment. Rather than chasing ever‑larger general‑purpose models, the startup focuses on domain‑specific systems — multilingual document understanding, OCR for complex regional records, and speech and text models for multiple Indian languages — optimised for lower compute requirements. This localisation aligns with national ambitions to build sovereign digital infrastructure and reduce dependence on foreign AI platforms.
While proponents of lean, use‑case‑focused AI highlight its potential to cut computational overhead, broader trends show AI expansion driving a surge in data‑centre energy use globally. Analysts project that data‑centre electricity consumption could nearly double by 2030 as demand for generative AI, cloud services, and high‑performance training infrastructure grows. Critics warn that without efficiency‑first design and cleaner energy integration, the rapid scale‑up of AI could significantly increase electricity demand, carbon emissions, and strain on utility systems.