Policy models will need to evolve as OpenAI, others make their India play
OpenAI Chief Executive Officer Sam Altman has argued that this is, in some sense, a test launch into its second-largest market, and it will roll out similar plans worldwide
Policy needs to be developed to ensure AI companies offer transparency regarding the collection, storage, processing, and monetisation of data. | Image: Bloomberg
3 min read Last Updated : Aug 24 2025 | 9:07 PM IST
India is high on OpenAI’s priority list. The company is establishing a physical presence in the country and is in the process of opening an office. It also recently launched a low-cost subscription plan, ChatGPT (generative pre-trained transformer) Go, specifically for Indian users, which is likely to alter the dynamics of India’s artificial intelligence (AI) ecosystem. The new plan, at ₹399 per month, is much cheaper than the top-end Pro plan at ₹19,999, and offers enhanced message limits, image generation, file uploads, chat memory, and data analysis. These features are supported by the latest model, GPT-5, and the plan will offer greater support for local languages. OpenAI Chief Executive Officer Sam Altman has argued that this is, in some sense, a test launch in its second-largest market, and that similar plans will be rolled out worldwide after feedback and responses have been absorbed. From OpenAI’s perspective, the plan may help monetise usage of an already popular tool. It is also a response to competition from Airtel’s bundled offer of a year’s free subscription to ChatGPT’s rival Perplexity Pro (worth roughly ₹17,000 a year) for post-paid subscribers.
For many Indians, access to affordable AI tools could be a game-changer. It will enable small businesses and individuals (especially students) to use such tools more effectively, and to discover new use cases and solutions as they experiment with different possibilities. Users may find ChatGPT becoming a force multiplier as they learn to organise their work and create custom GPTs tailored to their specific needs. OpenAI may hope that, as AI penetration in the general population increases and user sophistication improves with exposure, ChatGPT (along with Perplexity, Claude, Gemini, Grok, and other AI tools) will become indispensable, making calibrated price increases possible. The number of ChatGPT users in India trebled during 2024, and OpenAI says it is committed to storing data from Indian users of ChatGPT Enterprise, ChatGPT Edu, and the OpenAI API platform locally. This complies with India’s current policy on data localisation and privacy. Of course, OpenAI will develop insights by working with that data, and given how dependent AI is on the creation and understanding of large data sets, this could in itself give OpenAI a big lead over the competition.
However, while the move to localise should be welcomed, India needs to review its privacy safeguards. Free web search is subsidised by ads that leverage personalised user information, with intent identified from search queries, cache, and cookies. AI models could find ways to monetise data in different but analogous ways. Usage has clearly shifted, with users increasingly moving from “vanilla” Google-style search to AI-driven search. Offers like ChatGPT Go and bundled Perplexity Pro herald an acceleration in AI penetration. India’s regulatory framework for managing AI-related privacy and surveillance risks needs to evolve, as critically important issues of algorithmic transparency, bias, and automated decision-making arise. Policy needs to be developed to ensure AI companies offer transparency regarding the collection, storage, processing, and monetisation of data. This should go beyond current legal mandates on what data is collected and stored and why, to require more detail about monetisation: how platforms intend to profit from behavioural data or profiling, and what information is shared with third parties.