Web Firms Urged to Protect Children From ‘Dark Side’

August 4, 2008

Online companies such as YouTube are being urged by MPs today to do more to protect children from the “dark side” of the internet.

The House of Commons Culture, Media and Sport Committee said it was “unimpressed” that the video-sharing website – owned by internet giant Google – made no attempt to vet clips posted by users, which in one case appeared to show a gang rape.

And the committee said there was a “lax approach” by some sites to removing illegal material. It was “shocking” that the industry standard for removing material containing child abuse was as long as 24 hours.

In a report published today, the committee warned of a “dark side” to the internet, where hardcore pornography and videos of fights, bullying or alleged rape can be found, as well as websites promoting extreme diets, self-harm and even suicide.

It cited research suggesting that 16 per cent of eight- to 15-year-olds in the UK have come across “nasty, worrying or frightening” content online, and said there was “consistent” evidence that up to 20 per cent of children have suffered cyber-bullying.

The report recommended the creation of an industry self-regulation body to agree and police minimum standards for protection of internet users from potentially harmful content. But it stopped short of calling for statutory regulation, arguing that its effectiveness would be limited as so many sites are based overseas.

It should be “standard practice” for sites hosting user-generated content such as video clips and photos to review material in a proactive way, said the report.

The committee acknowledged that, with 10 hours of content uploaded every minute to YouTube alone, it was unrealistic to expect companies to watch every video before it goes online. But it rejected industry arguments that the volume of traffic makes any screening of content impractical.

Sites can use filtering software to detect potentially inappropriate material before it goes live, and should employ staff on a routine basis to watch posted videos and remove those which raise concerns.
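The workflow the committee describes (automated filtering plus routine human review) can be sketched in a few lines of Python. The fragment below is purely illustrative: the thresholds, the score_clip stand-in and the review queue are assumptions for the sketch, not any real site's system.

    from dataclasses import dataclass, field
    from typing import List

    REVIEW_THRESHOLD = 0.5   # flag the clip for a human moderator
    BLOCK_THRESHOLD = 0.9    # withhold from publication pending review

    @dataclass
    class Clip:
        clip_id: str
        status: str = "pending"

    @dataclass
    class ModerationPipeline:
        review_queue: List[Clip] = field(default_factory=list)

        def score_clip(self, clip: Clip) -> float:
            # Stand-in for filtering software; a real system would run a
            # classifier here and return a risk score between 0 and 1.
            return 0.0

        def handle_upload(self, clip: Clip) -> str:
            risk = self.score_clip(clip)
            if risk >= BLOCK_THRESHOLD:
                clip.status = "withheld"           # held back until a moderator decides
                self.review_queue.append(clip)
            elif risk >= REVIEW_THRESHOLD:
                clip.status = "published_flagged"  # live, but queued for routine review
                self.review_queue.append(clip)
            else:
                clip.status = "published"
            return clip.status

The design point matches the report's argument: filtering does not replace human moderators, it prioritises their attention, so not every clip needs watching before it goes online.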

YouTube’s practice of vetting clips only after they have been flagged up as inappropriate by users did not go far enough, argued the committee, which said that leaving individual companies to regulate themselves had resulted in an “unsatisfactory piecemeal approach which lacks consistency and transparency”.

Video-sharing sites should have “one-click” facilities for users to report clips which may include abuse directly to law enforcement agencies, said the report.
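As a hypothetical sketch of what such a “one-click” report might carry, the fragment below builds the entire payload automatically so the user supplies nothing beyond the click itself; the hotline address and field names are invented for illustration.

    import json
    from datetime import datetime, timezone

    HOTLINE_ENDPOINT = "https://hotline.example/report"  # placeholder address

    def one_click_report(clip_id: str, reporter_id: str) -> str:
        # Everything beyond the click is filled in automatically, so the
        # report can reach the law enforcement contact point without a
        # form for the user to complete.
        payload = {
            "clip_id": clip_id,
            "reporter_id": reporter_id,
            "reported_at": datetime.now(timezone.utc).isoformat(),
            "destination": HOTLINE_ENDPOINT,
        }
        return json.dumps(payload)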

The report also recommended the application of the British Board of Film Classification’s ratings system to computer games as well as films.

(c) 2008 Yorkshire Post. Provided by ProQuest Information and Learning. All rights reserved.