<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Terms of Service &#8211; Online Moderation</title>
	<atom:link href="https://www.onlinemoderation.com/tag/terms-of-service/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.onlinemoderation.com</link>
	<description>Social Media Management Services &#38; Content Moderation That Flex With Your Needs</description>
	<lastBuildDate>Wed, 19 Jun 2019 18:14:26 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</title>
		<link>https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=githubs-contributor-covenant-makes-waves-curbs-online-abuse</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Tue, 17 Jan 2017 14:16:48 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Terms of Service]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1141</guid>

					<description><![CDATA[<p>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse Over a year ago, I wrote about GitHub’s issues with bullying and discrimination, which came to a head when a female developer quit the collaborative coding hub – a victim of gender-based harassment by white male managers and co-workers.  The highly-publicized move eventually led to the resignation [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/">GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</p>
<p>Over a year ago, I wrote about GitHub’s <a href="http://money.cnn.com/2014/03/17/technology/github-sexual-harassment/">issues with bullying and discrimination</a>, which came to a head when a female developer quit the collaborative coding hub – a victim of gender-based harassment by white male managers and co-workers.  The highly-publicized move eventually led to the resignation of GitHub’s CEO.</p>
<p>As a result, GitHub has made <a href="http://fusion.net/story/369325/how-to-stop-online-harassment/">several major changes</a>.  First, they hired Nicole Sanchez as the company’s VP of Social Impact.  Sanchez formalized GitHub’s organization (previously there had been no designated managers or job titles), made it easier for employee issues to be addressed, and announced that workplace diversity would be acknowledged and celebrated.</p>
<p>Sanchez also hired two transgender community managers: February Keeney as head of the Community and Safety team tasked with eliminating workplace harassment, and Coraline Ada Ehmke, a senior engineer and creator of the “Contributor Covenant,” a code of conduct that had been loosely adopted by several project teams.</p>
<p>Sanchez, Keeney, and Ehmke found, however, that their institutional changes weren’t welcome at all levels.  Several groups, positioning themselves as advocates of free speech, resisted the wider application of the Contributor Covenant’s terms.  They retaliated by using the software’s tagging feature to connect Ehmke with fake projects that had racist names, a tactic they had also used on the developer who had initially exposed the harassment in 2014.</p>
<p>One of the first tasks of the Community and Safety Team was to build “consent and intent” into the software.  Now, you cannot tag a coder on a project without their approval.  And late last year, they updated the Contributor Covenant with guidelines prohibiting doxxing, bullying, and discrimination, and added a wider range of moderation tools.</p>
<p>While there is still resistance from a few groups, Sanchez and her team have moved the company in the right direction: making the platform less prone to abuse, developing a policy that is explicit and embraces civility, diversity, and inclusion, and involving community members in its enforcement.  As she said at a <a href="https://recompilermag.com/2016/08/26/open-source-feelings-real-world-examples-real-world-impact/">recent conference</a>, “Diversity is coming to your party despite my bad experiences at other parties. Inclusion is being glad I came.”</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/">GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Riot Games’ Tribunal System Reduces Abuse on Its Platform</title>
		<link>https://www.onlinemoderation.com/riot-games-tribunal-system-reduces-abuse-on-its-platform/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=riot-games-tribunal-system-reduces-abuse-on-its-platform</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 09 Jan 2017 15:33:11 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Terms of Service]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1137</guid>

					<description><![CDATA[<p>Riot Games’ Tribunal System Reduces Abuse on Its Platform Riot Games, producer of the PC-based multiplayer League of Legends (LoL), has been selected as Inc. magazine’s Company of the Year.  One of the reasons they are the most popular PC game in North America and Europe is due to their success at handling abuse, through [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/riot-games-tribunal-system-reduces-abuse-on-its-platform/">Riot Games’ Tribunal System Reduces Abuse on Its Platform</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Riot Games’ Tribunal System Reduces Abuse on Its Platform</p>
<p>Riot Games, producer of the PC-based multiplayer game League of Legends (LoL), has been selected as Inc. magazine’s <a href="http://www.inc.com/magazine/201612/burt-helm-lindsay-blakely/company-of-the-year-riot-games.html">Company of the Year</a>.  One reason LoL is the most popular PC game in North America and Europe is Riot’s success at handling abuse through its <a href="http://na.leagueoflegends.com/legal/tribunal">Tribunal</a> system of rule enforcement, which not only penalizes toxic players but also rewards those with a pattern of positive behavior.</p>
<p>The Riot Games Tribunal is made up of players with exemplary conduct who review reported violations of the “<a href="http://gameinfo.na.leagueoflegends.com/en/game-info/get-started/summoners-code/">Summoner’s Code</a>” of conduct.  If the conduct warrants it, a case is opened.  Tribunal members then vote on the appropriate action against the user: to “Punish” them, to “Pardon” them, or to “Skip” the case.  When twenty votes have been received, the case is closed and the member receives a detailed report on the decision.  Punishment ranges from a warning, to a one-day ban, to permanent expulsion.</p>
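<p>The case workflow above (three vote types and a twenty-vote closing threshold) can be sketched in a few lines. This is an illustrative model only, not Riot’s actual implementation; the class and method names are invented, and the simple-majority decision rule with “Skip” votes abstaining is an assumption.</p>

```python
# Illustrative sketch of a Tribunal-style case, NOT Riot's real system.
# Assumptions: a case closes at twenty votes; "Skip" votes abstain;
# ties between Punish and Pardon resolve to Pardon.
from collections import Counter

VOTES_TO_CLOSE = 20  # a case closes once twenty votes are received


class TribunalCase:
    def __init__(self, reported_player: str):
        self.reported_player = reported_player
        self.votes = Counter()

    def cast_vote(self, vote: str) -> None:
        """Record one member's vote while the case is still open."""
        if vote not in ("Punish", "Pardon", "Skip"):
            raise ValueError(f"unknown vote: {vote}")
        if self.is_closed():
            raise RuntimeError("case is already closed")
        self.votes[vote] += 1

    def is_closed(self) -> bool:
        return sum(self.votes.values()) >= VOTES_TO_CLOSE

    def decision(self):
        """Majority decision once closed; None while the case is open."""
        if not self.is_closed():
            return None
        return "Punish" if self.votes["Punish"] > self.votes["Pardon"] else "Pardon"
```

<p>For example, a case that receives twelve “Punish,” five “Pardon,” and three “Skip” votes closes at the twentieth vote with a “Punish” decision.</p>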
<p>The most positive aspects of the Tribunal system are that a player is notified within hours of its decision, and in the report, they are told what infraction triggered the opening of a case.  As reported by Christine Porath in her book “<a href="https://www.amazon.com/Mastering-Civility-Manifesto-Christine-Porath/dp/1455568988">Mastering Civility</a>,” after 100 million votes, verbal abuse is 40% lower and 91.6% of reported members never receive another violation report.</p>
<p>The takeaway for community managers and moderators is to empower your community members: allow members to report the abuse they witness, allow an empowered team of members (the Tribunal) to quickly vote on the violation report, let the member know the outcome, and tell the member what, if anything, they did wrong.</p>
<p>Of course, this system only works on platforms with an extremely large member base.  Smaller ones still need a team of experienced moderators to vet abuse reports and take action against toxic community members.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/riot-games-tribunal-system-reduces-abuse-on-its-platform/">Riot Games’ Tribunal System Reduces Abuse on Its Platform</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>University Athletes Suspended For Posting Offensive and Racist Texts</title>
		<link>https://www.onlinemoderation.com/university-athletes-suspended-posting-offensive-racist-texts/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=university-athletes-suspended-posting-offensive-racist-texts</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 21 Nov 2016 15:51:32 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Terms of Service]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1066</guid>

					<description><![CDATA[<p>University Athletes Suspended For Posting Offensive and Racist Texts Two recent cases highlight the need for increased monitoring of messages posted to social media sites by university students. The most recent took place at Columbia University, where members of the school’s wrestling team, over a two year period, posted lewd and racist messages to the [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/university-athletes-suspended-posting-offensive-racist-texts/">University Athletes Suspended For Posting Offensive and Racist Texts</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>University Athletes Suspended For Posting Offensive and Racist Texts</p>
<p>Two recent cases highlight the need for increased monitoring of messages posted to social media sites by university students.</p>
<p>The most recent took place at Columbia University, where members of the school’s wrestling team, over a two-year period, posted lewd and racist messages to the team’s GroupMe page and were recently exposed by student-run <a href="http://bwog.com/2016/11/10/messages-from-wrestling-team-groupme-reveal-culture-of-intolerance/">BWOG.com</a>.  The school suspended those involved and issued a statement which read in part: “Columbia University has zero tolerance in its athletics programs for the group messaging and texts sent by several members of the men’s varsity wrestling team. They are appalling, at odds with the core values of the University, violate team guidelines, and have no place in our community.”</p>
<p>The Columbia suspensions came on the heels of the cancellation of Harvard’s men’s soccer team’s remaining 2016 games, after it was <a href="http://www.thecrimson.com/article/2016/10/25/harvard-mens-soccer-2012-report/">discovered</a> that team members had a tradition, going back at least four years, of describing and ranking the looks of female soccer players with lewd and offensive comments.  The “Scouting Report” was disseminated over a group email distribution list.</p>
<p>Harvard officials also issued a statement which said, &#8220;We strongly believe that this immediate and significant action is absolutely necessary if we are to create an environment of mutual support, respect, and trust among our students and our teams.” &#8220;The decision to cancel a season is serious and consequential, and reflects Harvard&#8217;s view that both the team&#8217;s behavior and the failure to be forthcoming when initially questioned are completely unacceptable, have no place at Harvard, and run counter to the mutual respect that is a core value of our community,&#8221; they added.</p>
<p>Both Columbia and Harvard could have easily avoided these embarrassing situations by letting all members know what conduct isn’t allowed and assigning a person or group to monitor content.  Mzinga recommends that it be a third party, so there is no collusion.  That way, violators can be rapidly identified and their posts removed, so they don’t appear in more-widely-distributed student (and national) publications.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/university-athletes-suspended-posting-offensive-racist-texts/">University Athletes Suspended For Posting Offensive and Racist Texts</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Cyberstalking Laws and Your Site&#8217;s Terms of Service</title>
		<link>https://www.onlinemoderation.com/cyberstalking-laws-sites-terms-service/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=cyberstalking-laws-sites-terms-service</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 01 Aug 2016 13:56:25 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Terms of Service]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1024</guid>

					<description><![CDATA[<p>Cyberstalking Laws and Your Site&#8217;s Terms of Service Laws to prevent cyberstalking and cyberbullying, defined as the use of electronic technology, including the Internet, to harass, stalk, intimidate, or harm a person or persons, are in their infancy. However, high-profile cases involving suicide, school shootings, and other tragedies have put pressure on legislators to pass legislation [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/cyberstalking-laws-sites-terms-service/">Cyberstalking Laws and Your Site&#8217;s Terms of Service</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Cyberstalking Laws and Your Site&#8217;s Terms of Service</p>
<p>Laws to prevent cyberstalking and cyberbullying, defined as the use of electronic technology, including the Internet, to harass, stalk, intimidate, or harm a person or persons, are in their infancy. However, high-profile cases involving suicide, school shootings, and other tragedies have put pressure on legislators to pass legislation and adopt policies to keep users safe.</p>
<p>So far, most laws have been written at the state level, treated as a civil matter rather than a criminal one, and primarily aimed at cyberbullying in schools, with enforcement placed in the hands of school officials. In non-school-related incidents, lawmakers and law enforcement officials are relying on existing harassment and bullying laws, adding penalties when the offense is committed via electronic technology and devices. Depending upon the severity of the cyberbullying act, criminal charges can be filed. Some charges associated with cyberbullying include hate crimes, impersonation, and harassment under the <a href="https://www.justice.gov/sites/default/files/criminal-ccips/legacy/2015/01/14/ccmanual.pdf">Computer Fraud and Abuse Act</a>.</p>
<p>If you have a website with areas where users and members interact with you or each other, your Terms of Service must have a section that deals with cyberbullying. You need to define it, provide tips on how to avoid it, and outline the consequences of this type of Terms of Service violation.  On the site itself, users should have a way of reporting cyberbullying, or any other violation of the Terms of Service, to site admins and moderators.  One of the best examples of a cyberbullying policy is that recently posted on <a href="https://support.google.com/youtube/answer/2802268">YouTube</a>.</p>
<p>No matter where cyberbullying takes place, your site needs a team of monitors ready to take swift action.  Mzinga moderators are trained and have many years of experience in handling the complicated issues associated with cyberbullying.  Do you want to keep your users safe from cyberbullies?  Your first action is to give us a call.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/cyberstalking-laws-sites-terms-service/">Cyberstalking Laws and Your Site&#8217;s Terms of Service</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
