With consent placed at the core of the Digital Personal Data Protection (DPDP) Rules, companies will need to rethink how they use customer data to train internal artificial intelligence (AI) models, including removing any training data where consent is not granted.
Retraining AI models may require significant cost and effort.
However, for already trained models, the underlying data can be deleted from repositories when consent is lacking, without affecting the model’s performance, industry experts said.
This comes as companies processing user data — known as data fiduciaries — must clearly explain to users, or data principals, how their personal data will be used.