
Companies operating virtual worlds for children and young adults should use moderators not just to devise and enforce safety guidelines but also to enhance game play, according to a new paper from eModeration. The report, How to Encourage Participation and Loyalty in Virtual Worlds, is released to coincide with the US-based ‘Engage’ Expo, where there has been much discussion about how to engage consumers in online communities and MMOGs (massively multi-player online games).

How to Encourage Participation and Loyalty in Virtual Worlds outlines how the role of the moderator has evolved from monitoring online communities to becoming an active in-game, or even character-based, host who enhances the experience for players.

Recommendations included in the report detail how companies should: use moderators to set and enforce user policies, and adopt in-game roles to help children engage with the various activities within the game; use humour and quirkiness to engage more effectively with children; and get parents involved so that they trust the site and encourage their children to visit.

The paper has been drafted by Tamara Littleton, a respected pioneer and authority on virtual world moderation, who was a member of the Home Office Sub-Committee that advised the UK government on moderation of communities to help safeguard children. eModeration is one of the few moderation agencies providing moderation and consultancy services for virtual worlds aimed at children. The full report can be accessed at:

Below is a summary of the recommendations covered in the report:

1. Setting and enforcing user policies – a child and parent who know a virtual world is as safe as possible are far more likely to return to the site. One of the best ways for a virtual world to prove this is to draft clear user guidelines and make it very easy to report inappropriate behaviour – backed up with moderators who are ready to intervene if necessary.

The maxim ‘children need boundaries’ applies just as much in digital environments as it does in the real world. eModeration has found that not only do children respond positively to boundaries being enforced, they’re often very happy to help enforce guidelines and remind other children when they’re breaking the rules.

2. Moderators as in-game characters/hosts – moderators now have a wider role than just monitoring digital worlds and ensuring these environments are safe play areas for children. Today, moderators also act as hosts, becoming interactive characters within the game itself and enhancing the experience for players in virtual worlds. Using moderators in this way can significantly deepen children’s participation in the game and build greater attachment and loyalty to the site.

3. Use humour and quirkiness – making children laugh will ensure they enjoy themselves and engage with the site, so it’s important to add as many humorous and quirky elements as possible. For example, make a moderator’s character something outside of a child’s normal experience, such as an animal that talks in rhyme; and make the character change shape or colour after a certain period of time to sustain interest.

4. Engage with parents – essentially, if parents believe the site is safe, they’ll have no qualms about encouraging their children to play the game. To help get parents on board, make sure there is a ‘guidelines for parents’ page clearly visible on the site.

Tamara Littleton, CEO, eModeration, comments: “Traditionally, moderators were inconspicuous and remained in the background, deleting offensive material, defusing confrontation or reporting abusive behaviour. Today, the in-game moderator is becoming increasingly popular, as they do much more than monitor digital communities – their active participation not only keeps children safe, but also significantly adds to the game play and encourages players to return to the site. Setting the tone and establishing a positive culture within the community from the start will pay dividends in the future.”

For more information, visit

About eModeration
Founded in 2002, eModeration Limited is an international, specialist user-generated content moderation company. It provides 24-hour community management and content moderation to clients in the entertainment and digital publishing industry and major corporate clients hosting online communities and consumer-driven projects.
eModeration's CEO and founder, Tamara Littleton, has an established background in editorial quality control, fault escalation and process management gained from previous work as the Product Delivery Director for Chello Broadband and Online Operations Manager for BBC Online, where she managed the world's first ISO 9000-accredited team for digital publishing management and monitored over 400 BBC websites. Tamara Littleton is a member of the Home Office Internet Taskforce for Child Protection on the Internet, which brings together government, law enforcement, children’s agencies and the internet industry, all working to ensure that children can use the internet in safety. She was also the Chair of e-mint, the online community for community professionals, from 2006 to 2007.
eModeration's team of moderators and staff are the key to its success and excellent client list. eModeration draws on the expertise of carefully recruited and trained moderators, located mainly in the US and Europe, with specialist editorial and community moderation skills matched to each client. The company can moderate 24/7 in more than 30 languages. All its moderators are managed online from eModeration's headquarters in London, United Kingdom.

Further press information:
Kate Hartley
Carrot Communications
Tel: +44 (0)771 406 5233

This press release was distributed by ResponseSource Press Release Wire on behalf of Carrot Communications in the following categories: Media & Marketing, for more information visit