YouTube parent company Google is taking a tough stance on extremist videos and content.
YouTube plans to use machine learning to train new "content classifiers" that will help it more quickly identify and remove such content.
Google will also add 50 new expert NGOs to the 63 organizations that already participate in its Trusted Flagger program. It said videos that "contain inflammatory religious or supremacist content" will appear with a warning.
Google is also going to apply its advertising expertise to the problem, redirecting users deemed "potential ISIS recruits", based on the content they seek online, "towards anti-terrorist videos that can change their minds about joining".
We expect that Google will be rolling out these initiatives on YouTube over the coming months.
The Google column in the FT came days after Facebook published a blog post detailing the various efforts it was making to tackle terrorism.
The investigation, which led several major brands to pull their ad spend from YouTube until the issue was resolved, prompted Google to revamp its ad policy, giving brands greater control over where their ads appear and policing "hateful, offensive and derogatory content" more aggressively.
In response, the U.K.'s Home Office called on companies to work toward implementing technology to identify, remove and even prevent extremist content from being widely distributed on their sites.
Walker also acknowledged the need for more work in this area of the industry, and said that any developments need to be made as quickly as possible.
Mr Walker said: "Collectively, these changes will make a difference".
Facebook has similarly ramped up its use of artificial intelligence, employing techniques such as language matching to identify and remove such content quickly.
Whether it manages to get this right without penalising news websites and YouTube creators that focus on current events remains to be seen.
Throughout the post, Walker emphasized Google's intention to balance preserving free and open societies with preventing online terrorism, which seeks to undermine those same values. Tech companies, he argued, can help build lasting solutions to this complex challenge.
Many brands objected to the placement of their ads on videos from controversial extremists such as David Duke (a former Ku Klux Klan leader) and Steven Anderson (an anti-gay preacher who praised the terrorist attack on a gay nightclub in Orlando). "We are committed to playing our part", Walker concluded.