
In response to the Department for Children, Schools and Families (DCSF) launching a new UK watchdog in an attempt to protect children online, Rob Marcus, director of Chat Moderators, is in full support of plans for a voluntary code of practice for user-generated content (UGC) sites.

The Government internet watchdog, The UK Council for Child Internet Safety, will aim to bring in many of the points Dr. Tanya Byron highlights in her report ‘Safer Children in a Digital World’, including cyber bullying, inappropriate content online and advertising. The council aims to teach parents and children about possible dangers, to target illegal sites that contain harmful content, and to establish a code of conduct for sites that allow people to post their own video clips or messages.

Chat Moderators is no stranger to managing and moderating user-generated content to minimise the risk of potentially damaging material; indeed, the DCSF itself has been a client since 2004.

Rob Marcus, director of Chat Moderators, comments, “I am pleased to see that some action is now being taken, as I have believed for a long time that more could be done to protect children online. I back the council’s ambitions for a voluntary code of practice among user-generated content sites; however, I do not believe that enacting laws for the industry is the way forward. It is up to every UGC site to act responsibly, and self-regulation is the way forward. Legislation is rarely a quick-fix solution; it could take several years to begin seeing results. Internet applications tend to advance quickly, so maintaining any legal standards may become difficult.”

In order for UGC sites to make their chat rooms, forums and galleries safer for children, Rob Marcus recommends the following points:

• Registration: A robust registration process can deter potential paedophiles from using a site and, as a last resort, act as a means of catching them after the event. The registration procedure can also involve charging a nominal fee paid by credit card, so that each person is identifiable.

• Users should also have to provide a valid email address by responding to a registration email. Should a user be banned for any reason, the email address and user name will be flagged as invalid.

• Standards set by companies should be supported by The UK Council for Child Internet Safety, so that companies are not tempted to water them down under commercial pressure.

• All providers should offer both child-orientated and non-child-orientated chat rooms with visibly different designs, so that parents can clearly see whether they are in a child-friendly room.

• Don’t allow private chat functions in child-friendly chat rooms – this is where it is easiest for paedophiles to operate without being seen by other users.

• Human moderation should be compulsory to an extent, with minimum requirements based on an average number of users. Levels of moderation should be clearly displayed for parents. It may be best to outsource to a moderation company, as many providers underestimate the time, effort and resources needed to moderate effectively.

Marcus summarises, “It is important for all chat rooms to act responsibly and take adequate measures to ensure that children are not at risk while visiting their website. I also feel that The UK Council for Child Internet Safety will play an important role in educating parents about the dangers their children face on the Internet. In many instances the best way to arrive at a solution is to involve people with specialist knowledge, who can discuss the best way forward and ensure that certain measures are put in place by the online industry.”

- ENDS -

About Chat Moderators:

Empowering an audience to create their own digital content can bring many benefits to a business’ brand and reputation, provided it is managed responsibly. Chat Moderators takes the management burden and risk out of publishing user-generated content (UGC) by vetting contributions to discussion areas, forums, chat rooms, comment areas, blogs and social network profiles. Other services provided by Chat Moderators include digital community consultancy, community management and insight reporting.

Chat Moderators’ client list includes Amnesty, BBC, Bauer, Blue Cross, EMI, Friends Reunited, Glaxo, HM Government, Iris, MTV, National Magazines, Orange, Panasonic, Reed Elsevier, Sony, Transport for London, Vodafone and Waitrose.

For more information visit www.chatmoderators.com or www.targetedmoderation.com, or call Ascent PR on 0118 988 0501.

FOR FURTHER INFORMATION CONTACT:

Louise Mapp/Danielle Mumford
Ascent PR
T: 0118 988 0501
E: Louise.mapp@ascentpr.co.uk

This press release was distributed by ResponseSource Press Release Wire on behalf of Ascent PR in the following categories: Consumer Technology, Computing & Telecoms, for more information visit https://pressreleasewire.responsesource.com/about.