Business Standard

Like Facebook, YouTube to set policies to curb extremist content online

YouTube has also introduced four steps to confront extremist activity

IANS  |  New York 

YouTube | Photo Courtesy: Wikimedia Commons

Three days after Facebook outlined artificial intelligence (AI) powered measures to combat terrorism, YouTube has now also introduced four steps to confront extremist activity.

In an op-ed in the Financial Times, Kent Walker, Senior Vice-President and General Counsel of Google, wrote that YouTube has been working with various governments to identify and remove this content, and has invested in systems that help with that task.

He acknowledged that more has to be done in the industry.

The first of the four steps is expanding the use of its automated systems to better identify terror-related videos, using machine learning to "train new content classifiers to help us more quickly identify and remove such content".
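In broad terms, a "content classifier" of this kind is a model trained on labelled examples to score new text or video metadata for review. The sketch below is purely illustrative and is not YouTube's system: the training examples, labels, and the TF-IDF-plus-logistic-regression approach are all assumptions chosen to show the general idea on a toy scale.

```python
# Toy sketch of a text content classifier: TF-IDF features fed into
# logistic regression. All training data here is made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = flag for human review, 0 = benign.
texts = [
    "join our violent cause and attack now",
    "recruitment video calling for attacks",
    "cute cats playing piano compilation",
    "how to bake sourdough bread at home",
]
labels = [1, 1, 0, 0]

# Pipeline: vectorise the text, then fit a linear classifier.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

def needs_review(description: str) -> bool:
    """Return True if the toy model flags this text for human review."""
    return bool(classifier.predict([description])[0])
```

A production system would of course train on vastly more data, use video and audio signals rather than short strings, and route flagged items to human reviewers rather than acting automatically.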

The company is also expanding its pool of "Trusted Flagger" users: a group of experts with special privileges to review flagged content that violates the site's community guidelines.

According to Walker, the company will almost double the size of the programme "by adding 50 expert NGOs to the 63 organisations who are already part of the programme".

This expansion would allow the company to draw on specialty groups to target specific types of videos, such as those promoting self-harm and terrorism.

The third step is placing videos that contain inflammatory religious or supremacist content, but do not violate community standards, behind a warning.

The company will also do more on counter-radicalisation, building on its "Creators for Change" programme, which will redirect users targeted by extremist groups such as Islamic State (IS) to counter-extremist content.

Facebook had earlier said that it uses artificial intelligence to remove terror-related content.

The social networking giant is currently focusing its techniques on combating terrorist content related to Islamic State (IS), Al-Qaeda and their affiliates.
