<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>online moderator &#8211; Online Moderation</title>
	<atom:link href="https://www.onlinemoderation.com/tag/online-moderator/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.onlinemoderation.com</link>
	<description>Social Media Management Services &#38; Content Moderation That Flex With Your Needs</description>
	<lastBuildDate>Wed, 19 Jun 2019 18:28:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
	<item>
		<title>Facebook Doubles Down on Content Moderators</title>
		<link>https://www.onlinemoderation.com/facebook-doubles-content-moderators/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=facebook-doubles-content-moderators</link>
		
		<dc:creator><![CDATA[Mike Merriman]]></dc:creator>
		<pubDate>Fri, 05 May 2017 13:37:14 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[online moderator]]></category>
		<category><![CDATA[social media]]></category>
		<guid isPermaLink="false">https://www.onlinemoderation.com/?p=1608</guid>

					<description><![CDATA[<p>Unless you’ve been hiding under a rock, you’ve heard the stories of murders and suicides posted on Facebook.  In order to keep that type of content away from the public, Facebook has just announced that they will hire an additional 3000 content moderators.  This is on top of approximately 4500 current moderators.   In a recent post, [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-doubles-content-moderators/">Facebook Doubles Down on Content Moderators</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" class="alignleft size-medium wp-image-1587" src="https://www.onlinemoderation.com/wp-content/uploads/FB-f-Logo__blue_1024-300x300.png" alt="" width="300" height="300" />Unless you’ve been hiding under a rock, you’ve heard the stories of murders and suicides posted on Facebook.  In order to keep that type of content away from the public, Facebook has just announced that it will <a href="https://arstechnica.com/tech-policy/2017/05/facebook-promises-to-hire-3000-people-to-moderate-content/" target="_blank" rel="noopener noreferrer">hire an additional 3000 content moderators</a>.  This is on top of approximately 4500 current moderators.  In <a href="https://www.facebook.com/zuck/posts/10103695315624661" target="_blank" rel="noopener noreferrer">a recent post</a>, Mark Zuckerberg said the content moderators will &#8220;help us get better at removing things we don&#8217;t allow on Facebook like hate speech and child exploitation.&#8221; Additionally, he said they’ll work with law enforcement to help users &#8220;because they&#8217;re about to harm themselves, or because they&#8217;re in danger from someone else.&#8221;</p>
<p>It’s unclear whether Facebook will hire these moderators directly or outsource the work, and there is no indication where in the world these additional resources will reside.  One thing, however, is certain: with over one billion active Facebook users and over five billion pieces of content shared daily, this is a huge and necessary job.  We at Mzinga strongly believe in protecting the public, and children specifically, and commend Facebook for the attention it is giving this issue.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-doubles-content-moderators/">Facebook Doubles Down on Content Moderators</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Online Moderators Keep it Civil, But What About Where They Work?</title>
		<link>https://www.onlinemoderation.com/online-moderators-keep-civil-work/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=online-moderators-keep-civil-work</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 13 Mar 2017 14:21:22 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[online moderator]]></category>
		<category><![CDATA[social media]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1172</guid>

					<description><![CDATA[<p>Online Moderators Keep it Civil, But What About Where They Work? Mzinga moderators spend much of their shifts putting an end to flame wars, banning trolls, handling customer complaints, and keeping the peace.  As Mzinga’s Director of Moderation Services, I ensure that the team works in an environment that encourages and practices civil interaction as [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/online-moderators-keep-civil-work/">Online Moderators Keep it Civil, But What About Where They Work?</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Online Moderators Keep it Civil, But What About Where They Work?</p>
<p>Mzinga moderators spend much of their shifts putting an end to flame wars, banning trolls, handling customer complaints, and keeping the peace.  As Mzinga’s Director of Moderation Services, I ensure that the team works in an environment that encourages and practices civil interaction as well.</p>
<p>To produce harmonious workplace conditions, the consulting firm <a href="http://civilitypartners.com/" target="_blank" rel="noopener noreferrer">Civility Partners</a> has established the following guidelines for teams, whether they work together in an office or virtually.  Teams should especially avoid:</p>
<p>&#8212; Aggressive Communication (includes insults or offensive remarks, angry outbursts, avoidance, offensive written communications, and blaming someone for issues that are not their fault or are outside their control)</p>
<p>&#8212; Humiliation (includes ridiculing or teasing, spreading gossip, taunting (in person or in writing), publicly pointing out mistakes, including ones that have already been corrected, and snubbing someone for having a different interpretation of a company policy or management style)</p>
<p>&#8212; Manipulation of Work (includes subverting tasks associated with a person&#8217;s job responsibilities, unmanageable workloads and impossible deadlines, making general statements about poor performance without offering assistance to correct it, and leaving a person out of the correspondence and meeting loop)</p>
<p>Behaviors that contribute to workplace civility are respect, support, encouragement, politeness, openness, appreciation, trust, sensitivity, sincerity, having a positive attitude, taking pride in what you do, and being a good example.</p>
<p>Each company should have an established Company and Management Commitment to Civility that ensures workers are free from negative, aggressive, and inappropriate behaviors and that the workplace provides an atmosphere of respect, collaboration, openness, safety, and equality, where complaints about negative workplace behaviors are taken seriously and followed through to resolution.  Every employee, from the CEO to the intern, is given a copy, and a signed copy becomes part of their employee file.  Larger companies will have a training module as part of new employee orientation.</p>
<p>Online moderators keep their clients&#8217; sites free from risks ranging from bad publicity to legal liability.  As a result, the burnout rate is high (see my blog entry from a couple of weeks ago about two Microsoft moderators who say they are permanently disabled from moderating disturbing images).  At Mzinga, the moderation team is able to promote civil interaction because it is practiced in their workplace.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/online-moderators-keep-civil-work/">Online Moderators Keep it Civil, But What About Where They Work?</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Microsoft Online Moderators Allege Viewing Explicit Images Gave Them PTSD</title>
		<link>https://www.onlinemoderation.com/microsoft-moderators-allege-viewing-explicit-images-gave-ptsd/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=microsoft-moderators-allege-viewing-explicit-images-gave-ptsd</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 30 Jan 2017 15:00:38 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[online moderator]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1149</guid>

					<description><![CDATA[<p>Two Microsoft online moderators have filed suit against the company, saying they were forced to watch child porn and other offensive and disturbing images and videos to the extent that they began to exhibit symptoms of Post-Traumatic Stress Disorder (PTSD).  When they asked for help, they say in a McClatchy DC report, they received negative [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/microsoft-moderators-allege-viewing-explicit-images-gave-ptsd/">Microsoft Online Moderators Allege Viewing Explicit Images Gave Them PTSD</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Two Microsoft online moderators have filed suit against the company, saying they were forced to watch child porn and other offensive and disturbing images and videos to the extent that they began to exhibit symptoms of Post-Traumatic Stress Disorder (PTSD).  When they asked for help, they say in a <a href="http://www.mcclatchydc.com/news/nation-world/national/article125953194.html">McClatchy DC report</a>, they received negative reviews of their performance.</p>
<p>The two, members of the Online Safety Team, allege that they could not transfer to another division for 18 months, that they were not adequately trained, that they were not provided with the level of counseling they needed, and that they were denied workers’ compensation for medical leave they took to reduce their PTSD-related symptoms.  The compensation requests were denied because OSHA said their conditions were not an occupational disease.</p>
<p>When contacted by McClatchy, Microsoft responded to the allegations by saying, in part, “The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan.”  The company also pointed to programs such as the Compassion Fatigue Referral Project and its mandatory support sessions for members of the Online Safety Team and the Digital Crimes Unit.</p>
<p>As Microsoft says in its response to the charges, “This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority…”  Mzinga moderators know they may encounter offensive content at any time, and they are trained in the proper responses.  Here are a few of our best practices:</p>
<p>&#8212; Our moderators are warned about the content they may encounter and are prepared to handle it</p>
<p>&#8212; Our moderators are trained to rapidly escalate offensive content to the proper law enforcement authorities</p>
<p>&#8212; As soon as a text, picture, or video tips the scales, it is removed immediately, with no need to view it any further</p>
<p>&#8212; If they feel overwhelmed, moderators can trade shifts with another moderator, consult with the lead moderator or the director about how to handle their feelings, or work on a different project.</p>
<p>Removing offensive content is part of keeping the Internet safe.  It is Mzinga’s goal to keep our moderators safe as well.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/microsoft-moderators-allege-viewing-explicit-images-gave-ptsd/">Microsoft Online Moderators Allege Viewing Explicit Images Gave Them PTSD</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Online Moderation is the Elf on the Shelf</title>
		<link>https://www.onlinemoderation.com/online-moderation-elf-shelf/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=online-moderation-elf-shelf</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 19 Dec 2016 13:59:08 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[online moderator]]></category>
		<category><![CDATA[Trolling]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1132</guid>

					<description><![CDATA[<p>Online Moderation is the Elf on the Shelf For many families with small children, the Elf on the Shelf is a Christmas tradition.  Based on the 1994 book and toy by Carol Aebersold and her daughter Chanda Bell, during the Christmas season, the Elf is sent by Santa to watch over children during the day.  [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/online-moderation-elf-shelf/">Online Moderation is the Elf on the Shelf</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Online Moderation is the Elf on the Shelf</p>
<p>For many families with small children, the Elf on the Shelf is a Christmas tradition.  Based on the 2005 book and toy by Carol Aebersold and her daughter Chanda Bell, the Elf is sent by Santa during the Christmas season to watch over children during the day.  When they go to bed, he returns to the North Pole and tells Santa whether the children have been naughty or nice.  Each morning, the Elf appears in a different place (where parents put him the night before), sending children on a hunt to find him when they awaken.</p>
<p>Moderators are the online version of the Elf on the Shelf, watching over interactive areas and reporting to clients which users have been trolling and which deserve special kudos for helping others.  Unlike the elf, online moderators are in the same place every morning, work overnight, and are on duty year-round, not just during the Christmas season.  Do you need a watchful presence in your community spaces?  Give us a call and we can discuss the ways we can help you turn naughty into nice.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/online-moderation-elf-shelf/">Online Moderation is the Elf on the Shelf</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>More Misguided Online Moderation: YouTube Heroes</title>
		<link>https://www.onlinemoderation.com/misguided-moderation-youtube-heroes/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=misguided-moderation-youtube-heroes</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 03 Oct 2016 13:01:35 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[online moderator]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1042</guid>

					<description><![CDATA[<p>More Misguided Online Moderation: YouTube Heroes Two weeks ago, YouTube introduced a new program that enables volunteers to moderate site content.  The program, called “YouTube Heroes,” allows content creators to earn points when they caption, flag, or report videos that violate the YouTube Terms of Service.  The program, however, is just another attempt to get [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/misguided-moderation-youtube-heroes/">More Misguided Online Moderation: YouTube Heroes</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>More Misguided Online Moderation: YouTube Heroes</p>
<p>Two weeks ago, YouTube introduced a new program that enables volunteers to moderate site content.  The program, called “<a href="https://www.youtube.com/watch?v=Wh_1966vaIA">YouTube Heroes</a>,” allows content creators to earn points when they caption, flag, or report videos that violate the YouTube Terms of Service.  The program, however, is just another attempt to get users to perform online moderation tasks for free.</p>
<p>Disregarding the negative reactions from those who have valid issues with biased mob rule, the site caving to advertiser pressure, rewarding trolls, the ethics of gamification, and YouTube’s push to monetize the site (as brought up in <a href="https://www.youtube.com/watch?v=CK4QtiPhTRQ">this video response</a>), the main issue I have is that the program unnecessarily splits users into privileged and unprivileged classes.</p>
<p>Allowing unpaid users to flag content is nothing new.  What’s different here is that a few “trusted” users are “rewarded” for doing something that all users should be allowed to do.  Giving points to those who report a higher number of valid violations (spam, profanity, personal attacks, etc.) is fine – it creates engagement and civil social accountability.  Creating a separate program with five different levels, however, contributes to the “us-versus-them” mentality that has been plaguing the comment sections of YouTube’s commentary channels for years.</p>
<p>One thing I am grateful for is that flagged content submitted by any user is ultimately reviewed by a paid, trained, and unbiased moderation team that has received additional training in dealing with the increased hostility the YouTube Heroes program has created.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/misguided-moderation-youtube-heroes/">More Misguided Online Moderation: YouTube Heroes</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
