India's Edge AI Strategy: Experts Outline Path to Compete in Global AI Race
In the midst of a global competition to develop large, energy-intensive AI models, India is charting a more pragmatic and decentralised course, with edge AI emerging as a central focus. This approach aims to leverage smaller, localised data centres to reduce latency and enhance efficiency, particularly for critical applications like autonomous vehicles and healthcare systems.
Government Emphasis on Edge AI and Small Language Models
S Krishnan, Secretary of the Ministry of Electronics and Information Technology (MeitY), emphasised the importance of edge AI data centres at a pre-Summit event titled 'Democratising AI Access Through Distributed Compute', organised by the AI Knowledge Consortium in collaboration with think tanks Esya Centre and Deep Strat on January 30 in New Delhi. The event is part of preparations for the India-AI Impact Summit 2026, scheduled for February 16 to 20. Krishnan highlighted that edge computing minimises latency by processing data close to where it is generated, making systems more responsive and efficient for safety-critical uses.
He further advocated for India to concentrate on developing small language models (SLMs) tailored to specific sectors such as healthcare, rather than focusing solely on generative AI. "Why are we obsessed with generative AI? Why are we not looking at other aspects of AI and other ways in which AI can be used?" Krishnan asked, suggesting that fine-tuning earlier generations of AI models could yield better results. The government's strategy under the India AI mission involves subsidising access to compute rather than establishing large facilities directly, aligning with the Economic Survey 2026-27's recommendation for a "bottom-up" approach that avoids high capital expenditure and energy intensity.
Panel Discussion: Key Themes and Insights
A panel of experts, including Abhishek Kankani from Cloudflare India, Rajiv Aggarwal from Samsung India, Sidharth Choudhary from Qualcomm, and Sundeep Narwani from Narrative Research Lab, moderated by Meghna Bal of Esya Centre, addressed critical questions about India's hybrid AI ecosystem.
AI in Your Pocket: Smartphones as National Infrastructure
Sundeep Narwani argued for building local AI models that can run on smartphones with 12 GB of RAM, citing advantages such as eliminating ongoing transaction costs and enabling one-time deployment. "We need to start building more local AI models because they offer three key advantages," he stated, noting that although phones may heat up under the load, as much as 80% of image processing can happen on-device. Rajiv Aggarwal supported this view, asserting that smartphones should be considered part of national AI infrastructure, as they are the primary medium through which consumers access AI-enabled appliances and services.
Privacy and Data Sovereignty Concerns
Narwani highlighted the privacy benefits of locally run AI models, which rely on quantisation to compress larger models so they can fit on edge hardware. This allows entrepreneurs to develop tailored applications, such as tools that let journalists transcribe interviews locally and avoid the privacy risks of sending recordings to the cloud. However, Abhishek Kankani pointed to data sovereignty as a barrier, cautioning against localising everything without leveraging existing resources. "Localising everything seems good on paper but it will lead to re-investing in heavy compute again and again," he warned, emphasising the need for programmable policies and security layers.
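The quantisation Narwani refers to trades numerical precision for size: model weights stored as 32-bit floats are mapped to 8-bit integers plus a scale factor, shrinking the model roughly fourfold so it can fit in a phone's memory. A minimal sketch of symmetric int8 weight quantisation, written with NumPy for illustration (the function names are hypothetical, not taken from any mobile AI runtime):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a per-tensor scale factor."""
    # Largest absolute weight maps to the int8 limit (127).
    scale = max(float(np.abs(weights).max()) / 127.0, 1e-12)
    quantized = np.round(weights / scale).astype(np.int8)
    return quantized, scale

def dequantize_int8(quantized: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights at inference time."""
    return quantized.astype(np.float32) * scale

# Example: a small random weight matrix loses at most scale/2 per entry.
w = np.random.default_rng(0).normal(size=(64, 64)).astype(np.float32)
q, s = quantize_int8(w)
w_approx = dequantize_int8(q, s)
```

Production toolchains add refinements such as per-channel scales and calibration on real data, but the core idea is the same: a 4x smaller model with a bounded, usually tolerable, loss of accuracy.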
Policy Blind Spots and Regulatory Needs
Kankani identified connectivity and capital expenditure as major policy blind spots for hybrid AI, noting that most data centres are concentrated in cities like Bengaluru and Mumbai, which limits access in tier-3 areas. Edge AI can mitigate this by providing local compute, reducing reliance on continuous internet connectivity. Sidharth Choudhary of Qualcomm stressed the importance of making edge AI a pillar of the AI mission, given its natural fit with mobile devices and its benefits in power consumption and latency. He called for educating policymakers and demonstrating real use cases to drive adoption.
Kankani added that interoperability is crucial, advocating for open standards and protocols to ensure seamless integration across tech stacks. "The safest way to leverage latent compute is to ensure interoperability between all these different tech stacks," he concluded, underscoring the need for regulatory interventions to support edge AI adoption in India.