The Centre has directed YouTube to put disclaimers on channels likely to be spreading ‘fake’ news, according to sources. This comes months after the government’s plan to appoint a fact-checking unit met with controversy over possible censorship.
In an attempt to crack down on online “misinformation” ahead of the upcoming Assembly and Lok Sabha elections, the Ministry of Electronics and Information Technology (MeitY) has sent a memorandum to several digital platforms.
The letter, seen by Business Standard, asks all significant social media intermediaries (SSMIs) to submit an action taken note on dispelling ‘fake’ news and unlawful content within 10 days.
However, the notice sent to intermediaries, including Meta, X (formerly Twitter), YouTube, Dailyhunt, ShareChat, and Telegram, does not provide a definition of misinformation or ‘fake’ news. Nor does it explain how platforms are to identify 'fake' news amid vast volumes of user-generated content.
“YouTube should have an appropriate policy to take legal action on ‘fake’ news channels and it is advisable to put a disclaimer/ticker ‘news not verified’,” said the memorandum.
It also mentions that these recommendations emerged from a meeting last week between the Parliamentary Standing Committee on Communications and IT, officials of MeitY and the Competition Commission of India (CCI), and representatives of social media companies.
As reported earlier, the standing committee is considering a set of recommendations for “a stronger law” to deal with misinformation, and defamatory and obscene content on social media platforms. The measures could be presented in the Winter Session of Parliament, a senior government official told this paper after the October 9 meeting.

YouTube, Meta, X, ShareChat and Dailyhunt had not responded to Business Standard’s queries, sent on Friday, by the time of going to press.
The new directive is likely to significantly raise the compliance burden for YouTube, which has around 467 million active users in India — its largest user base in any country, according to the latest data available on the analytics website Global Media Insight.
According to the global advisory firm Oxford Economics, YouTube’s creative ecosystem supported more than 750,000 full-time equivalent jobs in India as of April 2023. News forms a significant part of the content on the video streaming platform.
More than one in three YouTube users search for news on the platform, and 73 per cent of these agree that they can find news from credible, trusted sources on YouTube, the study said.

Fact-checking of online content and government directives on the removal of misinformation have long been a matter of contention among the government, intermediaries, and free-speech activists.
The varied nature of online content — ranging from satire and parody to newly emerging AI-generated imagery — further complicates the task of fact-checking, according to some platforms. YouTube says it removed more than 92,000 channels and 78,000 videos globally in the second quarter of 2023 for violating its anti-misinformation policies, using a combination of machine learning and human reviewers for enforcement.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, require intermediaries to make reasonable efforts to prevent their users from publishing or sharing any information that is patently false and untrue or misleading.
An amendment to the IT Rules in April 2023 extended the rule, adding that intermediaries must remove any central government-related content flagged as fake by a government-appointed fact check unit. A division Bench of the Bombay High Court has been hearing a batch of petitions against the amendment.
Media reports in the same month had suggested that an industry consortium, the Misinformation Combat Alliance (MCA), was about to set up a self-regulatory organisation (SRO) to fact-check non-government-related content. However, there has been no development since then.
The ministry, in its latest directive, has also asked online platforms to submit their community standards/policies and their strategies for the safety of internet users.
They may also need to regularly submit a note on their awareness-creation campaigns. This is intended to strengthen the cyber safety of users, especially children, through necessary safeguards, including mechanisms for reporting unlawful acts and parental supervision.
Furthermore, the intermediaries have been asked to submit a note on their revenue, along with details of yearly awareness activities funded under corporate social responsibility for the last five years.