<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Bullying &#8211; Online Moderation</title>
	<atom:link href="https://www.onlinemoderation.com/tag/bullying/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.onlinemoderation.com</link>
	<description>Social Media Management Services &#38; Content Moderation That Flex With Your Needs</description>
	<lastBuildDate>Wed, 19 Jun 2019 18:31:06 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>Texas Anti-Harassment Legislation Threatens Lawful Online Interaction</title>
		<link>https://www.onlinemoderation.com/texas-anti-harassment-legislation-threatens-lawful-online-interaction/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=texas-anti-harassment-legislation-threatens-lawful-online-interaction</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Thu, 16 Feb 2017 14:56:24 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1158</guid>

					<description><![CDATA[<p>Texas Anti-Harassment Legislation Threatens Lawful Online Interaction A new bill introduced in the Texas legislature seeks to criminalize cyber-bullying of children in educational settings.  The bill, called “David’s Law” (named after a 16-year-old victim of cyber-bullying who killed himself – there were no charges filed against those accused) would give school district officials more power [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/texas-anti-harassment-legislation-threatens-lawful-online-interaction/">Texas Anti-Harassment Legislation Threatens Lawful Online Interaction</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Texas Anti-Harassment Legislation Threatens Lawful Online Interaction</p>
<p>A new bill introduced in the Texas legislature seeks to criminalize cyber-bullying of children in educational settings.  <a href="http://www.capitol.state.tx.us/tlodocs/85R/billtext/pdf/SB00179I.pdf#navpanes=0">The bill</a>, called “David’s Law” (named after a 16-year-old victim of cyber-bullying who killed himself – there were no charges filed against those accused) would give school district officials more power to discipline, expel, and expose the identities of online harassment suspects.</p>
<p>The bill aims to protect students from communications that infringe upon their rights, but it does not define those rights or how they might be violated.   If a single email “infringes on the rights of the victim at school,” the sender could be disciplined.  If that email results in the recipient’s suicide, the sender could be expelled.</p>
<p>The worst provision, however, is the unmasking of the sender if they are accused of harassment.  The bill authorizes subpoenas to investigate injury claims before a lawsuit is filed.  As a result, if it is determined that no injury took place, the sender is still stamped with the stigma of a harasser or cyber-bully.</p>
<p>Bullying on social media is on the rise and a cause for concern, but anti-harassment policies must be very limited in scope (and rights and remedies narrowly defined) so they do not jeopardize the First Amendment rights of those who engage in lawful interaction.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/texas-anti-harassment-legislation-threatens-lawful-online-interaction/">Texas Anti-Harassment Legislation Threatens Lawful Online Interaction</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Lego Life App Combats Trolls and Bullies</title>
		<link>https://www.onlinemoderation.com/lego-life-app-combats-trolls-bullies/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=lego-life-app-combats-trolls-bullies</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Tue, 14 Feb 2017 14:47:19 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Trolling]]></category>
		<category><![CDATA[trolls]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1155</guid>

					<description><![CDATA[<p>A new app launched by Lego contains many features that minimize the ability of users to be harassed and bullied.  Called Lego Life, the app for iOS and Android (available in App Store and Google Play) allows kids under 13 to create profiles, watch videos, participate in challenges, upload photos of their projects, search and [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/lego-life-app-combats-trolls-bullies/">Lego Life App Combats Trolls and Bullies</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>A new app launched by Lego contains many features that minimize the ability of users to be harassed and bullied.  Called <a href="https://www.lego.com/en-us/life">Lego Life</a>, the app for iOS and Android (available in App Store and Google Play) allows kids under 13 to create profiles, watch videos, participate in challenges, upload photos of their projects, search and follow their favorites, and post in message boards.</p>
<p>Lego Life’s concern for the safety of its users is evident in many ways: users under 13 must have a parent grant permission by email, profiles are avatars that users assemble from a list of traits, usernames are randomly generated three-word sequences (e.g., ChairmanWilyDolphin), all user-generated content is premoderated, no photos containing human faces are allowed, and most responses are limited to emojis from a special keyboard or phrases selected from a list.  Users are allowed to use their own words when responding to official Lego content.  A version for the web is in development.</p>
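<p>The three-word username scheme is easy to picture. Here is a minimal Python sketch with invented word lists — Lego’s actual vocabulary and algorithm are not public:</p>
```python
import secrets

# Invented word lists -- Lego's real vocabulary is not public.
TITLES = ["Chairman", "Captain", "Professor"]
ADJECTIVES = ["Wily", "Brave", "Clever"]
ANIMALS = ["Dolphin", "Falcon", "Badger"]

def generate_username() -> str:
    """Build a username from a random three-word sequence so that
    no real names or personal details ever appear on a profile."""
    return (secrets.choice(TITLES)
            + secrets.choice(ADJECTIVES)
            + secrets.choice(ANIMALS))
```
<p>A name like ChairmanWilyDolphin falls out of one draw from each list; with a few hundred words per list, collisions stay rare while exposing nothing about the child behind the account.</p>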
<p>Lego Life is a win-win for both the company and its customers.  The app increases brand loyalty and keeps kids safe. When they aren’t using it, they are presumably building new Lego creations.  That’s just fine with Lego Group’s senior director Rob Lowe, who says of kids who use the app, &#8220;One of its core purposes is to put their iPhone down and go do something else.”</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/lego-life-app-combats-trolls-bullies/">Lego Life App Combats Trolls and Bullies</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</title>
		<link>https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=githubs-contributor-covenant-makes-waves-curbs-online-abuse</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Tue, 17 Jan 2017 14:16:48 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Terms of Service]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1141</guid>

					<description><![CDATA[<p>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse Over a year ago, I wrote about GitHub’s issues with bullying and discrimination, which came to a head when a female developer quit the collaborative coding hub – a victim of gender-based harassment by white male managers and co-workers.  The highly-publicized move eventually led to the resignation [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/">GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</p>
<p>Over a year ago, I wrote about GitHub’s <a href="http://money.cnn.com/2014/03/17/technology/github-sexual-harassment/">issues with bullying and discrimination</a>, which came to a head when a female developer quit the collaborative coding hub – a victim of gender-based harassment by white male managers and co-workers.  The highly-publicized move eventually led to the resignation of GitHub’s CEO.</p>
<p>As a result, GitHub has made <a href="http://fusion.net/story/369325/how-to-stop-online-harassment/">several major changes</a>.  First, they hired Nicole Sanchez as the company’s VP of Social Impact.  Sanchez formalized GitHub’s organization (previously there had been no designated managers or job titles), made it easier for employee issues to be addressed, and announced that workplace diversity would be acknowledged and celebrated.</p>
<p>Sanchez also hired two transgender community managers: February Keeney as head of the Community and Safety team tasked with eliminating workplace harassment, and Coraline Ada Ehmke, a senior engineer and creator of the “Contributor Covenant,” a code of conduct that had been loosely adopted by several project teams.</p>
<p>Sanchez, Keeney, and Ehmke found, however, that their institutional changes weren’t welcome at all levels.  Several groups, styling themselves as free-speech advocates, resisted the wider application of the Contributor Covenant’s terms.  They retaliated by using the software’s tagging feature to connect Ehmke with fake projects that had racist names, a tactic they had also used on the developer who had initially exposed the harassment in 2014.</p>
<p>One of the first tasks of the Community and Safety Team was to build “consent and intent” into the software.  Now, you cannot tag a coder on a project without their approval.  And late last year, they updated the Contributor Covenant to include guidelines for conduct that prohibited doxxing, bullying, and discrimination, as well as a wider range of moderation tools.</p>
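<p>The “consent and intent” change can be sketched as a two-step tagging flow, in which a tag is only a pending request until the coder approves it. This is a hypothetical illustration, not GitHub’s actual code:</p>
```python
# Hypothetical sketch of a consent gate on project tagging --
# not GitHub's implementation; class and method names are invented.
class TagRequest:
    def __init__(self, project: str, coder: str):
        self.project = project
        self.coder = coder
        self.approved = False

class TaggingService:
    def __init__(self):
        self.pending: list[TagRequest] = []
        self.tags: set[tuple[str, str]] = set()

    def request_tag(self, project: str, coder: str) -> TagRequest:
        """Tagging only creates a pending request; nothing becomes
        visible until the tagged coder approves it."""
        req = TagRequest(project, coder)
        self.pending.append(req)
        return req

    def approve(self, req: TagRequest) -> None:
        """Only the coder's explicit approval makes the tag real."""
        req.approved = True
        self.pending.remove(req)
        self.tags.add((req.project, req.coder))
```
<p>Under this design, connecting someone to a fake project with a racist name accomplishes nothing: the target simply never approves the request, and no public association is ever created.</p>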
<p>While there is still resistance from a few groups, Sanchez and her team have moved the company in the right direction: making the platform less prone to abuse, developing a policy that is explicit and embraces civility, diversity, and inclusion, and involving community members in its enforcement.  As she said at a <a href="https://recompilermag.com/2016/08/26/open-source-feelings-real-world-examples-real-world-impact/">recent conference</a>, “Diversity is coming to your party despite my bad experiences at other parties. Inclusion is being glad I came.”</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/githubs-contributor-covenant-makes-waves-curbs-online-abuse/">GitHub’s “Contributor Covenant” Makes Waves; Curbs Online Abuse</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Steve Brock discusses Online Safety on the Community Signal Podcast</title>
		<link>https://www.onlinemoderation.com/1129-2/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=1129-2</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Wed, 14 Dec 2016 15:57:15 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Trolling]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1129</guid>

					<description><![CDATA[<p>Recently I was the featured guest on Patrick O’Keefe’s “Community Signal” weekly podcast, where we discussed my career as a community manager for over 25 years (17 of them with many of the same clients and members of the Mzinga moderation team), how to use law enforcement agencies to keep clients and their customers safe, [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/1129-2/">Steve Brock discusses Online Safety on the Community Signal Podcast</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Recently I was the featured guest on Patrick O’Keefe’s “Community Signal” weekly <a href="http://www.communitysignal.com/working-with-law-enforcement/" target="_blank" rel="noopener noreferrer">podcast</a>, where we discussed my career as a community manager for over 25 years (17 of them with many of the same clients and members of the Mzinga moderation team), how to use law enforcement agencies to keep clients and their customers safe, and attempts to “patent” bundles of community management tools and management models.</p>
<p>At the heart of community management is keeping community members safe.  If a member threatens another member with harm (e.g., “I know where you live and I’m coming over right now”) or threatens self-harm (“I’m going to take every pill in my medicine cabinet”), community managers need to act fast.  They need to know whom to contact (the client’s security division, the member’s ISP, and law enforcement officials) and what information to provide (at the very least, a copy of the message, the poster’s IP address, and a log entry that identifies the exact time of the post).  Usually, security, ISP, and law enforcement officials take it from there.</p>
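<p>That minimum set of facts — the message, the IP address, and the exact logged time — can be captured in a simple escalation record. The field names below are illustrative, not any particular platform’s schema:</p>
```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ThreatEscalation:
    """The minimum a moderator hands to security, the ISP, and law
    enforcement: the message itself, the poster's IP address, and
    the exact time the post was logged."""
    message_text: str
    poster_ip: str
    posted_at: datetime  # exact timestamp from the community log

    def report_line(self) -> str:
        # One line suitable for an escalation email or incident log.
        return (f"[{self.posted_at.isoformat()}] "
                f"{self.poster_ip}: {self.message_text}")
```
<p>Filling in a record like this at the moment of escalation means nothing has to be reconstructed later, when security or law enforcement asks exactly when and from where the threat was posted.</p>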
<p>It’s important for community managers and moderators to know the difference between a credible personal threat and “trash-talk” that takes place in the heat of a debate or argument.  As a best practice, my moderation team and I err on the side of caution: if I think a threat may be credible, I will escalate it to the next level.  If law enforcement is contacted, they track down the poster, and that person may be charged with a crime even if they were only “trolling.”</p>
<p>Patrick and I also discussed Facebook’s intention to use machine learning to automate the moderation of its interactive areas.  We agreed that Facebook’s moderation tools are severely lacking and that patenting them isn’t something worthy of a news release.</p>
<p>For more information and to hear the podcast, stop by the <a href="http://www.communitysignal.com/" target="_blank" rel="noopener noreferrer">Community Signal website</a>.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/1129-2/">Steve Brock discusses Online Safety on the Community Signal Podcast</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Use of Encrypted Software Thwarts Anti-Bullying Efforts</title>
		<link>https://www.onlinemoderation.com/use-encrypted-software-thwarts-anti-bullying-efforts/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=use-encrypted-software-thwarts-anti-bullying-efforts</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 05 Dec 2016 14:27:36 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1113</guid>

					<description><![CDATA[<p>Use of Encrypted Software Thwarts Anti-Bullying Efforts What can a cyberbullying victim do when they are told by both police and school officials that there is nothing they can do to help them because the bully’s account is untraceable?  A CNN article says that is what 18-year-old Brandy Vela of Texas City, Texas was told [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/use-encrypted-software-thwarts-anti-bullying-efforts/">Use of Encrypted Software Thwarts Anti-Bullying Efforts</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Use of Encrypted Software Thwarts Anti-Bullying Efforts</p>
<p>What can a cyberbullying victim do when both police and school officials say there is nothing they can do to help because the bully’s account is untraceable?  <a href="http://www.cnn.com/2016/12/01/health/teen-suicide-cyberbullying-trnd/index.html">A CNN article</a> says that is what 18-year-old Brandy Vela of Texas City, Texas, was told last month.  With no solutions and no help beyond the suggestion to change her phone number, last week she found a gun and committed suicide in her family home.</p>
<p>Brandy had been a victim of bullying for years, taunted about her weight by anonymous Facebook users.  Recently, someone posted her picture and phone number on a dating site, suggesting that she was available for no-strings sexual encounters.  She and her parents reported the incidents to the police, as well as to school officials just before the Thanksgiving break, but they were told that the software used was encrypted and could not be traced.</p>
<p>The use of this type of software is the latest blow to anti-bullying efforts.  If suspects are able to stay anonymous, victims have no hope that the harassment will stop.  There are legitimate reasons for using encryption (such as keeping a client’s personal information private), but the software makers need to be able to hand over the encryption key information to law enforcement officials when presented with a court order, or risk being sued when another bullying victim runs out of options.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/use-encrypted-software-thwarts-anti-bullying-efforts/">Use of Encrypted Software Thwarts Anti-Bullying Efforts</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Is the Prevalence of Internet Shaming Causing it to Lose Potency?</title>
		<link>https://www.onlinemoderation.com/prevalence-internet-shaming-causing-lose-potency/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=prevalence-internet-shaming-causing-lose-potency</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 28 Nov 2016 14:34:40 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1100</guid>

					<description><![CDATA[<p>Is the Prevalence of Internet Shaming Causing it to Lose Potency? The author of a recent Scientific American article speculates that Internet shaming (personal attacks that go viral) has reached a crescendo and because there is so much of it, the impacts are becoming less severe. To illustrate, David Pogue cites the case of Adam [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/prevalence-internet-shaming-causing-lose-potency/">Is the Prevalence of Internet Shaming Causing it to Lose Potency?</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Is the Prevalence of Internet Shaming Causing it to Lose Potency?</p>
<p>The author of a recent Scientific American <a href="https://www.scientificamerican.com/article/the-bright-side-of-internet-shaming/">article</a> speculates that Internet shaming (personal attacks that go viral) has reached a crescendo, and that because there is so much of it, the impacts are becoming less severe.</p>
<p>To illustrate, David Pogue cites the case of Adam Mark Smith, who videotaped a confrontation with a Chick-fil-A counterperson in 2014 over the company’s affiliation with hate groups.  Smith posted the video to YouTube, it went viral, and the backlash fell on Smith himself: not only was he fired, but he received death threats, his private information was disseminated worldwide, and his children were harassed.</p>
<p>Pogue says that public shaming isn’t new: think stonings and putting miscreants in stocks in the public square.  But on the Internet, a small error can lead to permanent and tragic life changes.  And those who try to help also become targets for abuse.</p>
<p>Things move fast on the Internet and our collective memory is short.  The next error in judgment will have the trolls attacking a different target.  As a result, Pogue says, shaming’s prevalence might be its undoing, relegating it to just a sport (or online reality show) that none of us should take so personally.</p>
<p>It’s human nature to want to join crowd activities and to follow the crowd when its focus changes, but unfortunately, I do not see the tragic results of public shaming waning.  Instead, I see them increasing as more and more attention is given to each occurrence.</p>
<p>Public shaming is not collapsing under its own weight; ending it will take a widespread, collective effort by those who moderate the sites where it occurs, rapidly removing trolling content as well as the trolls themselves. Those who post threats and personal attacks will only stop when they know their conduct will not be tolerated, and that will only happen when there is sufficient and proficient moderation that promotes and enforces basic civility and respect.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/prevalence-internet-shaming-causing-lose-potency/">Is the Prevalence of Internet Shaming Causing it to Lose Potency?</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>When It Comes to Bullying, Some Safety Nets Have Gaping Holes</title>
		<link>https://www.onlinemoderation.com/comes-bullying-safety-nets-gaping-holes/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=comes-bullying-safety-nets-gaping-holes</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 07 Nov 2016 14:14:46 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1059</guid>

					<description><![CDATA[<p>When It Comes to Bullying, Some Safety Nets Have Gaping Holes As much as we try to protect kids from the effects of bullying, some still fall through the safety nets we place underneath them.  The latest illustration is Bethany Thompson, an 11-year-old cancer survivor from Ohio. At the age of 3, doctors discovered that [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/comes-bullying-safety-nets-gaping-holes/">When It Comes to Bullying, Some Safety Nets Have Gaping Holes</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>When It Comes to Bullying, Some Safety Nets Have Gaping Holes</p>
<p>As much as we try to protect kids from the effects of bullying, some still fall through the safety nets we place underneath them.  The latest illustration is <a href="http://www.dispatch.com/content/stories/local/2016/10/29/family-community-struggles-for-answers-after-11-year-old-fatally-shoots-herself.html">Bethany Thompson</a>, an 11-year-old cancer survivor from Ohio.</p>
<p>At the age of 3, doctors discovered that Bethany had a brain tumor.  They removed it and treated her with radiation that left her cancer-free, but the resulting nerve damage left her with a “crooked” smile.  Because of the smile and her curly red hair, she was bullied by her school-mates.  After school two weeks ago, she told her best friend she was going to kill herself, and before the friend could get help, Bethany had found a gun and pulled the trigger.</p>
<p>Bethany had several safety nets: parents who helped her survive cancer, teachers and school administrators and counselors who she met with regularly, and close friends who she confided in. None, however, were able to come to her aid when she needed them most.</p>
<p>Her father was asleep, and there was a loaded pistol in the house.  The day before her death, after a bullying incident, school officials refused to let Bethany put up anti-bullying posters on campus.  When she told her friend what she planned to do later that afternoon, help couldn’t be summoned in time.</p>
<p>Triad Middle School Superintendent Chris Piper said in an interview that he plans to “…re-evaluate our anti-bullying educational side so that we are able to determine when things go from normal misbehavior to a pattern of bullying and to deter and stop misbehavior.”</p>
<p>But those plans weren’t in place in time to save Bethany, and even a reassessment of school policy may not be enough.  Parents, teachers, counselors, and friends need to be aware of bullying behavior and not be afraid to take action.  If you see bullying, get involved.  If you know that someone is being bullied, do what it takes to help them or get them help.</p>
<p>Don’t know what to do?  There are many anti-bullying sites on the Internet.  Read up, and be ready.  There are many Bethanys out there.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/comes-bullying-safety-nets-gaping-holes/">When It Comes to Bullying, Some Safety Nets Have Gaping Holes</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>25 Percent of Canadians Are Harassed on Social Media</title>
		<link>https://www.onlinemoderation.com/25-percent-canadians-harassed-social-media/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=25-percent-canadians-harassed-social-media</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 24 Oct 2016 13:08:12 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[social media]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1053</guid>

					<description><![CDATA[<p>25 Percent of Canadians Are Harassed on Social Media A recent survey by the Angus Reid Public Interest Research Institute found that one-fourth of Canadians are subjected to “unwelcome comments, vicious insults, threats of violence, or worse,” with the bulk of abuse coming from Facebook and Twitter.  The number rises among frequent and younger users. [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/25-percent-canadians-harassed-social-media/">25 Percent of Canadians Are Harassed on Social Media</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>25 Percent of Canadians Are Harassed on Social Media</p>
<p>A recent <a href="http://angusreid.org/social-media/">survey</a> by the <a href="http://angusreid.org/">Angus Reid Public Interest Research Institute</a> found that one-fourth of Canadians are subjected to “unwelcome comments, vicious insults, threats of violence, or worse,” with the bulk of abuse coming from Facebook and Twitter.  The number rises among frequent and younger users.</p>
<p>In response, users have become reluctant to engage in debate with those holding contrary opinions.  Instead, they hold back and self-censor when it comes to controversial topics.  The Institute also found that Canadians believe Internet companies aren’t doing enough to curb harassment and personal attacks.</p>
<p>When queried about where the companies are falling short, users said that they weren’t responsive to complaints and requests for assistance, and that they weren’t performing sufficient moderation: finding and removing offensive content in a timely manner.  When given removal policy choices, most wanted the companies to “get tough” by regularly monitoring and moderating interactive areas and actively removing content that breaks the rules of conduct, along with those who post it.</p>
<p>If a company does not have internal resources to moderate areas where users are allowed to post messages and comments, the best alternative is to hire a vendor who specializes in online moderation.</p>
<p>One such company is Mzinga, which for more than 25 years has kept companies (and their users) safe from harassment, offensive personal attacks, and other abuses.  We are also able to triage requests to customer service: providing assistance, answering questions, and routing specific requests to the proper divisions of the company.</p>
<p>Your users and customers are demanding that you take tough action to keep them safe.  Why not take the first step: give us a call and let us show you how Mzinga moderation and customer service triage can revive civility?</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/25-percent-canadians-harassed-social-media/">25 Percent of Canadians Are Harassed on Social Media</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>After School, Kids Use “After School” to Cyberbully Each Other Anonymously</title>
		<link>https://www.onlinemoderation.com/kids-use-after-school-to-cyberbully-each-other-anonymously/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kids-use-after-school-to-cyberbully-each-other-anonymously</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 17 Oct 2016 13:18:32 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Bullying]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1048</guid>

					<description><![CDATA[<p>After School, Kids Use “After School” to Cyberbully Each Other Anonymously When the last bell of the day rings, many high-school-age kids grab their smart phones and launch the After School app, which takes them to a place its developers call, “a positive and safe place to share and connect with other students in their [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/kids-use-after-school-to-cyberbully-each-other-anonymously/">After School, Kids Use “After School” to Cyberbully Each Other Anonymously</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>After School, Kids Use “After School” to Cyberbully Each Other Anonymously</p>
<p>When the last bell of the day rings, many high-school-age kids grab their smartphones and launch the After School app, which takes them to a place its developers call “a positive and safe place to share and connect with other students in their high school,” using video, pictures, and text.  Unfortunately, after the software identifies a user through their Facebook account and the school they attend, their interactions are anonymous, opening the door to cyberbullying, inappropriate pictures, the sharing of personally identifying information, and other abusive content.</p>
<p>There have already been issues with the app &#8211; the creators were forced to remove it from both the Apple App Store and Google Play Store shortly after it was launched, due to complaints about child safety.  The app was relaunched in April of last year with assurances of increased security, but instead of calming concerns, the relaunch has only fueled complaints, not only because of user anonymity but also because the site can’t be accessed by parents or teachers.</p>
<p>Earlier this year, technology website Make Use Of (MUO) listed <a href="http://www.makeuseof.com/tag/5-reasons-kids-shouldnt-use-school-app/">5 reasons</a> kids shouldn’t use After School: Bullying, Out of the Reach of Teachers, Age/School Verification, Personal Details, and Impersonation.  In the comments section of the article, the most recent comment is from a parent whose daughter used the app and received a publicly viewable lewd proposal that had collected 23 likes.</p>
<p>The After School website says “we are passionate about helping young people and are continuously improving the service we provide. Security and the safety of the millions of teens who use After School is our top priority.”  It’s obvious that security and safety aren’t a top priority – if they were, company admins would require personal identification and let parents and teachers see the content. Otherwise, it’s Reddit for kids.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/kids-use-after-school-to-cyberbully-each-other-anonymously/">After School, Kids Use “After School” to Cyberbully Each Other Anonymously</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
