Who is Responsible for Online Content Policing?
YouTube is the world’s most popular video-sharing site. Billions of users watch and upload videos on the website, which makes it a natural fit for advertisers.
But the company has come under fire after a UK newspaper found adverts running alongside videos of children that had attracted inappropriate comments.
The Times investigation found many videos of pre-teen girls that had been liked and commented on by hundreds of paedophiles.
One such clip of a young girl drew 6.5 million views. Several companies, including the chocolate maker Mars and Deutsche Bank, have pulled their ads from YouTube in response.
In a statement, Mars said: “We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
The newspaper also reported that sexualised imagery of children was easily searchable on the site, and criticised the company for failing to monitor its content.
A Google spokesman responded: “There shouldn’t be any ads running on this content and we are working urgently to fix this.”
So, will YouTube do more to monitor and remove such content?