The Allahabad High Court has directed that in cases of sexual offences against children, police must ensure that a medical report determining the victim's age is drawn up at the outset and is submitted to the court without delay. The court said discrepancies in the victim's age in POCSO (Protection of Children from Sexual Offences) cases can substantially affect the rights and liberties of the accused. It granted bail to Ghaziabad resident Aman, alias Vansh, who had been lodged in jail since December 5, 2023, in a POCSO case. "False depiction of a victim as a minor in POCSO Act cases is an abuse of the process of court," Justice Ajay Bhanot said on Tuesday. He observed that in a large number of cases, the victim's age as mentioned in the prosecution case is found to be at variance with the age determined by expert medical boards. "At times there are multiple contradictions in age-related documents available with the prosecution. Numerous cases of false implica ...
Senator Amy Klobuchar on Wednesday questioned what she said was inaction in the tech industry, contrasting it with the swift response shown when a panel blew out of a Boeing plane earlier this month
Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built. Those same images have made it easier for AI systems to produce realistic and explicit imagery of fake children as well as transform social media photos of fully clothed real teens into nudes, much to the alarm of schools and law enforcement around the world. Until recently, anti-abuse researchers thought the only way that some unchecked AI tools produced abusive imagery of children was by essentially combining what they've learned from two separate buckets of online images - adult pornography and benign photos of kids. But the Stanford Internet Observatory found more than 3,200 images of suspected child sexual abuse in the giant AI database LAION, an index of online images and captions that's been used to train leading AI image-makers such ...
The government has issued notice to social media platforms including Telegram, YouTube and X to remove all child sexual abuse material and groups circulating such content, Parliament was informed on Friday. Minister of State for Electronics and IT Rajeev Chandrasekhar, in a written reply to the Rajya Sabha, said that the social media platforms have been informed that existing rules cast obligations on intermediaries, including social media firms, to observe due diligence, and that if they fail to observe such due diligence, they shall no longer be exempt from their liability under law for third-party information hosted by them. "The Government has issued notice to various social media intermediaries including Telegram, YouTube and X, to remove or disable access to all such CSAM and groups circulating such material which is violative of rule 3(1)(d) and rule 4(4) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021)," Chandrasekhar ...
The company has submitted its formal response to the government on the issue
The Minister of State called out global tech giants for 'distorting' fair competition and free consumer choice in the internet world and said that it is an area of concern for the government
As reported earlier, law enforcement agencies have recorded an uptick in cybercrime against children with an increasing use of artificial intelligence
The government has issued notices to social media platforms X (formerly Twitter), YouTube and Telegram to remove child sexual abuse material from their platforms in India, an official statement said on Friday. Minister of State for Electronics and IT Rajeev Chandrasekhar said if social media intermediaries do not act swiftly, their 'safe harbour' under section 79 of the IT Act would be withdrawn, implying that the platforms can be directly prosecuted under the applicable laws and rules even though the content may not have been uploaded by them. "Ministry of Electronics and IT has issued notices to social media intermediaries X, YouTube and Telegram, warning them to remove Child Sexual Abuse Material (CSAM) from their platforms on the Indian internet. The notices served to these platforms emphasize the importance of prompt and permanent removal or disabling of access to any CSAM on their platforms," the statement said. The notices also call for the implementation of proactive measure ...
A simple search for sexually explicit keywords specifically referencing children leads to accounts that use these terms to advertise content showing sexual abuse of minors
"The Meta unit's systems for fostering communities have guided users to child-sex content" while the social networking platform has claimed it is "improving internal controls"
Seers in Ayodhya are now demanding an amendment to the Protection of Children from Sexual Offences (POCSO) Act on the grounds that it is being widely misused
Apple and Microsoft don't proactively detect child abuse material stored on their iCloud and OneDrive services, despite the wide availability of identifying technology, the report found
The report titled "What works to prevent online violence against children," showcases strategies and best practices to better protect children
Elon Musk on Thursday expressed grave concerns over reports about the presence of tweets soliciting child pornography on Twitter, as India takes on the micro-blogging platform over similar concerns
The CBI on Saturday carried out searches at 59 locations across 21 states and a Union Territory in an internationally coordinated law-enforcement crackdown on the online circulation of Child Sexual Abuse Material (CSAM), with officials saying more than 50 suspects in the country are under the scanner. The raids, part of Operation 'Megha Chakra', were launched after the agency registered two cases under relevant provisions of the Information Technology Act based on inputs from the Crime Against Children (CAC) unit of Interpol, based in Singapore, which had received the information from the New Zealand Police, the officials said. The agency is questioning suspects about illicit material found on their electronic devices so as to identify the victims and the abusers, a CBI spokesperson said. "It was alleged that a number of Indian citizens were involved in circulation/downloading/transmission of Child Sexual Abuse Material (CSAM) using cloud-based storage," the CBI spokesperson said after the ...
India on Friday joined Interpol's International Child Sexual Exploitation (ICSE) database, which will allow it to draw links between victims, abusers and crime scenes using audio-visual data. The CBI, which is India's nodal agency for Interpol matters, joined the database, making India the 68th country to connect to it, according to a statement from Interpol. "The ICSE database uses video & image comparison to analyse child sexual abuse material and make connections between victims, abusers and places," it said. An intelligence and investigative tool, the database allows specialized investigators to share information on cases of child sexual abuse. Through the image and video comparison software, investigators can nail down criminals by identifying victims and places of crime. "The database avoids duplication of effort and saves precious time by letting investigators know whether a series of images has already been discovered or identified in another country, or ...
The legislation would require tech companies to report child sexual abuse material to authorities
The data showed that the highest number of rape victims was in the age group of 12 to 18 years
The figures had prompted the CBI to start a massive operation against the alleged peddlers of online child sexual abuse material (CSAM) in India