CyberBullying Tragedies and Internet Safety: eModeration, Part 3

Dec. 3, 2009: Just visited the excellent ConnectSafely forum to pose the industry question: is there ANYthing that digital moderators or online advocates could’ve done in the child safety realm to prevent the loss of 13-year-old Hope Witsell in the suicide/sexting case? Obviously, Hope’s case (visual at left/MSNBC) played out on mobile phones, but cyberbullying/bodysnarking and photo posting are common in online communities as well. So as we continue Part 3 of our ‘behind the scenes’ chat with industry pros from eModeration, we’ll find out the nitty gritty:

Who’s monitoring what behavior online? How do reputable sites handle ‘worst-case scenarios’ (like predator panic, cyberbullying extremes, and other instances that surface and splash all over mainstream media news and talk shows)? How can sites better recognize and track bullying and predatory behavior? When do they intervene? And how?

Today we’re asking moderation pros the tough stuff beyond an internet safety guide overview…Specifically, what are the fire drills and specialized training in place to spot grooming, bullying, age violations, red alerts, and legal vulnerabilities that reputable sites spend countless hours and dollars monitoring?

As you can see by the CSM Internet Safety Guide visual below, it’s not such a simple task when it comes to the ‘moderation’ of new media forms; there are many of them, and it’s hard to keep eyeballs in all places at once. It’s one of the huge reasons I get REALLY tired of parents being plopped in the driver’s seat with dismissive “kids are your responsibility” mode.

I’m thrilled to have the pros at eModeration lift the curtain for us like the Wizard of Oz for a peek behind the scenes of online community moderation (defined here as virtual worlds, social networks, MMORPGs, kids’ texting ‘in-game,’ chat rooms, etc.) because frankly, even for a ‘media maven’ it’s a daunting prospect to ‘keep up.’

(visual: CSM Internet Safety Guide)

What ages and stages need strict, airtight enforcement of regulatory and legal vulnerabilities?

How often is this really being done and what can happen when kids ‘slip under the curtain’ and enter social media platforms without the emotional wherewithal to navigate their own internal compass?

We can all relate on some level to Hope’s case, in terms of parental guilt on basic discipline (grounding/denial of media privileges, a crash course on dealing with texting nightmares) as well as the youthful mindset of invincible bravado…Thinking she could ‘handle it,’ bravely confronting her mistake and taking an insurmountable amount of guff to ‘own up’ without sharing/revealing the severity of the daily brutality endured.


This is where we absolutely must engage in a dialog across the generations…

With new roles and rules for navigating new media, it’s imperative to bring forth a ‘Meeting of the Minds,’ as Harvard’s GoodPlay Project, CSM, and Global Kids have tackled in their newly released Focus Dialogues about how youth and adults relate to life online through a series of cross-generational online dialogues. (I’ll post on the Focus Dialogues separately…meanwhile…onward!)

For those that missed the rest of the eModeration interview: Part One is about the need for 21st century media literacies (Hastac/Howard Rheingold roundup of “musts”); Part Two discusses the challenges of ‘safe chat’ and the vigilance required from online communities to do their job right.

And though I want to give full focus to the serious instances and concerns in Part Three, I’d also like to focus on solutions-based approaches in advance, so that parents AND kids know ‘what to look for,’ how to screen sites and situations, and how to navigate conundrums with common sense so these tragedies won’t occur…

Before we get to making sense of ‘automated vs. human moderators’ and how reliance on internet filters and a false sense of security can put kids in peril, I’d like to caution parents NOT to over-react, and instead offer them my favorite outreach vehicle: surf through ConnectSafely’s forum and SEE FOR YOURSELF what’s rearing its head…

It’s a great way to get a solid touchpoint on what kids are facing today, while giving a poignant peek at the reality behind the sensationalism.

Yes, there are raw emotional cries for ‘help’ on serious issues ranging from imposter profiles to removing content that’s seeped onto the net…But there are also firsthand peer to peer youth stories and tips, adults skilled in everything from cybersafety to forensics and law enforcement giving their two cents and sharing knowledge, and ALL are intent on helping kids navigate as global citizens.

Example? “Social Networking Abuse & Peer Tech Support” has an ongoing thread of about 1400 voices in dialog; “Sexting, Cyberbullying & other online risks” offers about 300 comments, which is where I posted my Hope Witsell query.

Site co-founders Anne Collier of NetFamilyNews and Larry Magid of SafeKids.com both take a very level-headed approach to ‘balancing safety and fun’ and give youth and parents the tools to triumph in the cybersafety realm, while pointing them in the proper direction to troubleshoot issues that are site-specific or outside their area of expertise.

So with that caveat, here’s Part Three with eModeration…

Amy Jussel, Shaping Youth: How does eModeration handle a serious breach of safety, such as grooming, predatory behavior, stalking, and that kind of thing?

Do you have tracking tools that are automated as well as human? Do you rely on human relationships with forensics/law enforcement people who can track back and hunt down the internet service provider as a source, or what?

eModeration: We have a serious incident escalation procedure for each project, which is drawn up with the client at the start of a contract.  We have to be able to reach clients 24/7 in the event of a time-crucial incident such as a bomb or suicide threat – something where we need to be able to report an incident to the police with an IP address as quickly as possible.

All suicide or bomb threats are taken seriously; they have to be, and our moderators are trained as to what to do in terms of taking threads down, reporting to clients and management, sending evidence through to reporting bodies and following up. Not all serious incidents are time-crucial: for example, uploading child abuse images, whilst extremely serious, isn’t time-crucial in the same way.

We do what is necessary on the site in terms of takedown, logging and reporting, then follow it up with the clients and the relevant authorities. In the UK this means the Child Exploitation and Online Protection Centre (CEOP) (Amy’s note: also check out CEOP’s ThinkUKnow microsite, which gives a helpful age/stage media literacy snapshot of basic ‘need to knows’ for teaching safety w/hands-on sources) and the Internet Watch Foundation (who work internationally as well); otherwise the Virtual Global Taskforce or, in the US, the CyberTipline.

Obviously, because eModeration is a specialist firm, with workflows and protocol in place, we may be able to provide a greater degree of efficiency than an in-house team, including counseling for any moderators who might feel they need it post-incident…Many of our larger clients are also geared up for this type of escalation and have well-oiled systems too.
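(Amy’s note: for the technically curious, here’s a rough sketch of how a triage workflow like the one just described might be structured in software. The category names, severity tiers and action labels are my own hypothetical placeholders for illustration only, not eModeration’s actual system or terminology:)

```python
# Illustrative sketch only: a toy incident-escalation workflow loosely modeled
# on the procedure described above. Categories, severities, and action names
# are hypothetical placeholders.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    TIME_CRUCIAL = 1   # e.g. bomb or suicide threat: police + client, immediately
    SERIOUS = 2        # e.g. child abuse imagery: take down, log, report onward
    ROUTINE = 3        # ordinary terms-of-service breach: moderate and log

@dataclass
class Incident:
    category: str
    user_ip: str
    evidence: str

# Hypothetical mapping from incident type to severity tier
SEVERITY_MAP = {
    "suicide_threat": Severity.TIME_CRUCIAL,
    "bomb_threat": Severity.TIME_CRUCIAL,
    "abuse_imagery": Severity.SERIOUS,
    "profanity": Severity.ROUTINE,
}

def escalate(incident: Incident) -> list[str]:
    """Return the ordered action list for an incident."""
    severity = SEVERITY_MAP.get(incident.category, Severity.ROUTINE)
    # Every incident gets taken down and logged first
    actions = ["take_down_content", "log_evidence"]
    if severity is Severity.TIME_CRUCIAL:
        # 24/7 client contact and a police report with the IP address
        actions += ["phone_client_24_7", f"report_police(ip={incident.user_ip})"]
    elif severity is Severity.SERIOUS:
        # Not time-crucial, but reported onward (e.g. CEOP/IWF/CyberTipline)
        actions += ["notify_client", "report_to_authority"]
    else:
        actions += ["report_to_client_on_agreed_basis"]
    return actions
```

The key design point mirrored from the interview: every incident is taken down and logged, but only time-crucial categories trigger the immediate 24/7 client and police path.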

Amy Jussel, Shaping Youth: What about the more common ‘imposter profiles’ and security breaches like hacking or cyberbullying with inappropriate content?

eModeration: Breaches of security such as suspected hacking profiles fall into another category, and would be reported through to the client.

We log all breaches of terms which result in moderation actions and report through to clients on an agreed basis.

If a child was in immediate danger we would deal directly with the police to intervene immediately.

Also, we’re a member of the IWF, who are the ones that would deal with UK ISPs in relation to any hosting of child abuse material…so we support CEOP and work with them to further their aims in every way.

Amy Jussel, Shaping Youth: What about ‘language barriers?’ If 21st century connectivity is about global reach and digital multi-culturalism, how can we keep kids safe yet let them explore multi-cultured learning as global citizens?

How do you even begin to ‘moderate’ that? (On Shaping Youth I get a lot of incoming comments and links I can’t moderate on a global scale, so I’ve ratcheted up my spam filters and taken a stern ‘when in doubt, delete’ approach.) How do you moderate multilingual communities specifically?

eModeration: We provide moderation in over 30 languages, but ‘foreign language’ criteria are different for every client.

Some projects are set up to be run in just one language – and hopefully the terms would state this – and we are instructed to delete any UGC not in that language. For others, there is a flexible approach to other languages, and we moderate them on an ad hoc basis.

For a lot of recent clients, though, we are set up to moderate in several different languages right from the start, providing the same service level for each. All our moderators are native or fluent in English, and we have a fantastic team of bilingual moderators, some fluent in several languages.

It’s not all about language, though…it’s also about culture and nuance. For example, with some projects we are using UK native moderators only, because they need to understand deeply the social context of the young people posting. For other projects it’s vital we have moderators who are native speakers of Chinese, for example, rather than just bilingual, so they can pick up on cultural issues as well.

Amy Jussel, Shaping Youth: So what exactly does a moderator ‘do?’ Can you explain the job of a ‘host’ moderator or highly visible moderator in a virtual world for kids?

eModeration: Well, here I can quote directly from our white paper on How to Encourage Participation and Player Loyalty in Virtual Worlds:

“Today, there are two types of moderators. The first and more traditional type is the silent moderator, who stays in the background blocking offensive material from participants, warning users, defusing confrontation and reacting to abusive or illegal behavior… The second and increasingly popular type is the in-game moderator, who actively participates as a character or avatar on the site, helping other players engage with the various activities within the game.

This type of moderator may also act as an in-game host – i.e. visible to the children – and can be compared to the host of a children’s party: the role is about encouraging children to explore, try new things and have as positive an experience as possible, but stay safe and secure while doing so.”

Amy Jussel, Shaping Youth: So what happens if this ‘host’ devolves into a ‘peer’? (I’ve seen this on some sites where there are perceived ‘favorites’ and game play is impacted.)

Is it better to have moderators visible or invisible…and why?

eModeration: It’s very important for moderators to keep a certain level of detachment from the children and not become their friends, ensuring they remain impartial and act consistently. To this end, moderators should be clearly identifiable as such within the game so that a child can never confuse them with another player…often the moderator becomes an active character or “host” in the game.

Moderators can blend right into the game itself, letting children know they are there without becoming overbearing. This also deters children from wanting to chat to the moderator, which could distract them from the game itself.

However, as Izzy Neis has observed:

“[Young people’s moderation teams] have a tightrope to walk…keep the audience engaged/happy/online, while also maintaining community, individual safety and the feeling of fantastical freedom almost required in virtual sandboxes…”

…“Youth want you there when they need you; otherwise, they don’t even want to see you – [you’re the] elephant in the corner. A child’s behavior changes when an adult is noticeably present – no matter how “good” the child is. Adults become role models, scapegoats, wardens, security cameras, mayors, etc. – adults become “the man,” and that issues a shift in social control.”

So, in-game moderation isn’t all win, by any means.

(Amy’s note: The avatar/viking visual is one of safety guru Izzy Neis’ many personas; you can see a whole bunch of ’em on her site to give you a feel for the range and tonality within kids’ worlds in order to ‘blend’…)

Amy Jussel, Shaping Youth: What about automated filters as moderators…how does that all work?  Can you explain the science behind “content analysis?”

How sophisticated are these ‘engines’ within virtual communities? Are they sort of a ‘first tier strike’ safety measure to weed out crud like a spam filter, or are they more robust?

eModeration: Rather than simply using a blacklist or whitelist to restrict chat (safe chat dictionaries, etc.), intelligent content analysis engines such as Crisp’s NetModerator™ not only detect inappropriate content but also the first warning signs of cyberbullying and predatory behavior.

For example, it can reveal when one correspondent is trying to make direct contact with another, or when someone is revealing personal information which may compromise their future safety…


We all know sexual predatory behavior is purposefully subtle and long-term in nature. So the engine analyzes content and relationships over the long term, looking at speech which in isolation contains nothing untoward (and so would not be picked up by a blacklist), but whose patterns correspond to recognized grooming behaviour.

The NetModerator™ engine then prioritizes these alerts, and can handle low-level code of conduct breaches automatically with the ABM (gagging/silencing, blocking/banning, etc. according to client-defined workflows), alerting the moderators to the more serious threats.

This helps us a lot, because it leaves the moderation staff freer to focus their energies on more potentially serious offenders. It also means that clients do not need to scale up their moderation resources at the same rate that their membership base grows…
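(Amy’s note: to make the ‘patterns over time vs. one-off blacklist’ idea concrete for the technically inclined, here’s a toy sketch of per-user signal tracking. The signal phrases, threshold and action names are my own hypothetical illustrations; NetModerator’s real engine is proprietary and far more sophisticated:)

```python
# Illustrative sketch only: accumulate harmless-in-isolation signals per user
# over a conversation's history, escalating to a human moderator only when a
# pattern emerges. All patterns and thresholds are hypothetical.
import re
from collections import defaultdict

# Each signal alone would slip past a simple blacklist; the pattern matters.
SIGNALS = {
    "contact_request": re.compile(r"\b(what's your (number|address)|meet (up|me))\b", re.I),
    "personal_info": re.compile(r"\b(i live (at|in)|my school is|home alone)\b", re.I),
    "secrecy": re.compile(r"\b(don't tell|our secret|delete this)\b", re.I),
}

ALERT_THRESHOLD = 3  # distinct signal types before alerting a human moderator

class ConversationAnalyzer:
    def __init__(self):
        # user -> set of distinct signal types seen so far across all messages
        self.history = defaultdict(set)

    def observe(self, user: str, message: str) -> str:
        """Record any signals in this message and return the resulting action."""
        for name, pattern in SIGNALS.items():
            if pattern.search(message):
                self.history[user].add(name)
        if len(self.history[user]) >= ALERT_THRESHOLD:
            return "alert_human_moderator"  # prioritized for human review
        elif self.history[user]:
            return "watch"                  # logged, no action yet
        return "ok"
```

The point of the sketch: no single message trips the alert, but the accumulation of distinct signal types across a relationship does, which is roughly the distinction Tamara draws between blacklists and long-term content analysis.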

Amy Jussel, Shaping Youth: Thanks for this; there are some amazing new resources for keeping kids safe…Appreciate your taking the time to explain the ‘back end’ behind the curtain, Tamara…

Though as I’ve found doing my own series on ‘ethics’ in virtual worlds and online gaming (parts one, two and three here), it’s complicated and ever-changing.

I’d like to encourage all readers to leave questions/comments on topics that weren’t covered or you want to hear more about, and I’ll send them into the ‘Twitterstream’ for various moderation pros to have at it.

As far as Hope’s tragedy and MSM reporting, here’s Wired magazine’s blurb yesterday about the latest MTV/AP survey on sexting (again, a grain of salt required thinking ‘chicken or the egg’ there, but hat tip to public health pro Andre Blackman).

Keep your media literacy hat on (and head level) on that one…as most of the teens I talked to at our ‘sex ed’ high school discussion last night (and other peers) ‘know better’ and gave me the wince and ‘doh, we KNOW that’ routine…so I need to dive into the research methodology, regional samplings, ages and context next.

I’m THRILLED that MTV is addressing digital citizenship/peer-personal privacy issues AND partnering with public health pros to give it mindshare for prevention. They even have a contest and PSAs upcoming with their “Redraw the Line Challenge” to develop projects to address digital abuse via web-based tools/games for education and media literacy, woohoo!

Here’s a mini-resource roundup of other pertinent pieces (I particularly love the UK’s CyberMentoring program, as I’m a huge advocate of peer-to-peer knowledge sharing!)

And…with that in mind, here’s a step-by-step video “by kids for kids” on privacy settings called “Hailey Hacks.” Hailey uses screenshots to set her Facebook profile settings to minimize leakage and seepage, offering tips for teens on the ‘frenemy’ front too. (I know some adults that could use this advice!)

And here’s Facebook’s own recent post by UK policy pro Richard Allan, with tips called “How to Bullyproof Yourself on Facebook.” (I just found/’fanned’ their ‘BBC Bullyproof’ Facebook page, too!)

Too much for one post, crossing between moderation, bullying/sexting, digital conduct and beyond so ‘to be continued…’

…Stay tuned for more on the importance of open conversation between generations as the Harvard GoodPlay/CSM/Global Kids’ Meeting of the Minds study shares insights from over 250 participants and 2500 posts highlighting similarities and differences in mediating ‘life online’ in the digital sphere!

Handful of Related Articles: Kids’ Safety/Convos/Digital Dialog

MTV’s A Thin Line Campaign (research on sexting, etc.)

Wired Magazine’s ‘Threat Level’ blurb on sexting survey 12-3-09

Global Kids Digital Initiative: Overview of Report

Meeting of Minds Report: Oct. ’09, 20pp PDF study (Global Kids/CSM/GoodPlay Project)

ConnectSafely Tips & Advice

CSM Internet Safety Tips

Ethics & Virtual Worlds conversation (initiated by Sam Gilbert of the GoodPlay Project)

Digital Media: The GoodPlay Project (funded by the MacArthur Foundation)

Virtual Research Ethics (Guardian/U.K.)

Related Posts On Shaping Youth

NetFamilyNews Posts Cyberbullying Statistics; ConnectSafely Forum Helpful

Harris Interactive Research: How Cyberbullying is Shaping Youth Savvy

Precedent Setting Cyberbullying Indictment for Missouri Mom

SXSW: Teen Docu-Drama & Digital Doings+New Cyberbullying Study

Shaping Youth Part One: Are Game Cheats a Misnomer?

Shaping Youth Part TWO: Kids, Gaming Ethics & Immersive Virtual Worlds

Shaping Youth Part Three: Community Solidarity Online

Related Research/Resources

CAMRI (Comm.& Media Research Inst; Global Ctr for media/social change)

Children in Virtual Worlds (20 pp Case Study/Adventure Rock, May ’08)


Digital Youth Research: Kids’ Informal Learning With Digital Media

World of Warcraft Forum: Nazsh’s Guide to Dealing With Griefers

Ethics & Virtual Worlds Thread on

World of Warcraft Anthology Sent to MIT Press (author’s blog)

Digital Culture, Play and Identity (May ‘08)

Avatar Rights: Civil Liberties/Philanthropy in Virtual Worlds (Global Kids)

Parenting Tips from the Hope Witsell tragedy via WiredSafety, Dr. Parry Aftab’s widely acclaimed internet safety education resource (full St. Petersburg Times article)

1. Take an inventory. Ask your children to show you all of the gadgets in the house that can take or store photos or videos. These can include cell phones, Webcams, video game consoles and iPods.

2. Ask them to show you images they have stored. Promise you won’t hit the roof if you find something bad — then keep your word.

3. Have a talk. This should be a conversation, not a lecture. Be sure to mention a range of unintended consequences, which could include criminal charges that would jeopardize admission to college.

4. Watch what you buy. Think twice before purchasing devices that can take or send images. Drop the image-sending capability from your child’s cell phone service.



  1. Down with Cyber Bullying!

  2. Wanted to share Anne Collier’s excellent response to my query on the ConnectSafely forum asking, “are there any ‘watch for it’ tips that moderators can give to prevent other Hope Witsell-type cases?” Specifically I asked:

    AMY JUSSEL: “Anne et al, I’m finishing part 3 of the cybersafety ‘behind the scenes’ interview w/ eModeration and was wondering if you have any comment re: the Hope Witsell case (e.g. was there any site moderation that might’ve prevented her suicide, or a way we can give kids a primer for ‘how to handle’ it once it’s been launched into ‘the stream’?) Obviously prevention/media literacy is the best first-strike offense, but what about kids who’ve made a mistake and are in ‘defense’ mode…Is there a ‘better way’ to handle the bullies in terms of shrugging it off/shutting ’em down (e.g. the ‘don’t react’ bit…kinda hard for a traumatized teen, esp. w/drama at a fevered pitch)?”

    And Anne replied with a well thought out response, to give you an idea of how valuable this forum and info sharing is for us all…inside AND outside of the media industry!!! (parents, kids, teachers, etc.)

    ANNE COLLIER: “Thanks for asking, ShapingYouth – you are such an important presence in the public discussion about youth, media and technology. Even though there have been many news reports about Hope’s tragic case, I’m not sure anyone could ever know enough to say what could’ve prevented her suicide. An online-safety advocate who appeared on the Today Show blamed her school. The local St. Petersburg Times, which reported her story in detail, said that – the day before Hope committed suicide – the school counselor, noting cuts on her legs, asked Hope to sign a “no harm” contract by which she “agreed to tell an adult if she felt inclined to hurt herself.” Which sounds like the school may’ve felt its role in helping Hope was in process. My point is that it’s not helpful to point to any single factor for either blame or solution, because human lives are complex and responses to harm are unpredictable. SAMHSA, the part of the US government that does suicide prevention education and supports the National Suicide Prevention Lifeline, has media guidelines for reports that, among other things, advise them not to oversimplify what happened.

    As for solutions, there has to be a diverse spectrum of help – probably in every space young people spend time in, including online. So, yes, site and virtual-world moderators need to play a role, but so do peers, parents, schools, and government (SAMHSA is working hard to “be there” for youth on the fixed and mobile social Web). On moderators, specifically, there’s probably little they can do unless the person is actually voicing suicidal intentions and they happen upon that communication among the thousands or millions of posts or chats they may be watching in a day. A much more likely support in social sites or virtual worlds comes from friends who would see such communication more immediately; hopefully they’d know to go right to customer service staff for help in getting help to the person at risk.

    As for sexting, gosh it’s important to help our children think through the implications, from extreme embarrassment to bullying to prosecution (we hope our tips to help stop sexting can help with that). But even loving, nonconfrontational conversations (that aren’t so frequent that kids’ eyes glaze over) can’t always eclipse adolescent development, which includes risk assessment, social influencing, and impulsive behavior. That’s why youth online wellbeing so takes a village. Their peers are a vital part of that village, too – the National Suicide Prevention Lifeline says peers are the best source of referrals to the Lifeline, usually via social network sites, not a toll-free phone number (but that number is 1-800-273-TALK). The Lifeline coordinates the work of more than 100 toll-free help centers around the US, getting calls and cases to the center nearest the person needing help, and help not just for suicidal crisis, but depression, domestic violence, and all sorts of needs (more people need to know about that). The Lifeline also works with a number of social network sites.

    That doesn’t fully answer your question about how to help people after the fact, but there are a lot of answers to that question because each case is so individual. The most important thing is to help young people share whatever is troubling them so they can get help or get the response and healing process started.”


  3. Want to know how important moderation is in real time? Check out this incident that took place on New Moon Girl Media and their rapid response to same…THIS is why excellent moderation is pivotal to a positive experience…it’s about ‘trust agents’ and who parents feel comfortable ‘leaving their kids with’ behind the scenes, online (even if hovering in nearby rooms 😉)

    Kudos to Nancy Gruver and her staff at New Moon for lending a hand on this one:

    And of course, a great reminder for why we need to SUPPORT communities like this that DO keep such high standards. Need a holiday gift idea? Help Save New Moon and keep them sustainable to pay for their quality staff, as I wrote about here:

