<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Facebook &#8211; Online Moderation</title>
	<atom:link href="https://www.onlinemoderation.com/tag/facebook/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.onlinemoderation.com</link>
	<description>Social Media Management Services &#38; Content Moderation That Flex With Your Needs</description>
	<lastBuildDate>Wed, 19 Jun 2019 18:28:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
	<item>
		<title>Facebook Doubles Down on Content Moderators</title>
		<link>https://www.onlinemoderation.com/facebook-doubles-content-moderators/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=facebook-doubles-content-moderators</link>
		
		<dc:creator><![CDATA[Mike Merriman]]></dc:creator>
		<pubDate>Fri, 05 May 2017 13:37:14 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[online moderator]]></category>
		<category><![CDATA[social media]]></category>
		<guid isPermaLink="false">https://www.onlinemoderation.com/?p=1608</guid>

					<description><![CDATA[<p>Unless you’ve been hiding under a rock, you’ve heard the stories of murders and suicides posted on Facebook.  In order to keep that type of content away from the public, Facebook has just announced that they will hire an additional 3000 content moderators.  This is on top of approximately 4500 current moderators.   In a recent post, [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-doubles-content-moderators/">Facebook Doubles Down on Content Moderators</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" class="alignleft size-medium wp-image-1587" src="https://www.onlinemoderation.com/wp-content/uploads/FB-f-Logo__blue_1024-300x300.png" alt="" width="300" height="300" />Unless you’ve been hiding under a rock, you’ve heard the stories of murders and suicides posted on Facebook.  To keep that type of content away from the public, Facebook has just announced that it will <a href="https://arstechnica.com/tech-policy/2017/05/facebook-promises-to-hire-3000-people-to-moderate-content/" target="_blank" rel="noopener noreferrer">hire an additional 3,000 content moderators</a>, on top of approximately 4,500 current moderators.  In <a href="https://www.facebook.com/zuck/posts/10103695315624661" target="_blank" rel="noopener noreferrer">a recent post</a>, Mark Zuckerberg said the content moderators will &#8220;help us get better at removing things we don&#8217;t allow on Facebook like hate speech and child exploitation.&#8221; Additionally, he said they’ll work with law enforcement to help users &#8220;because they&#8217;re about to harm themselves, or because they&#8217;re in danger from someone else.&#8221;</p>
<p>It’s unclear whether Facebook will hire these people directly or outsource the work, and there is no indication of where in the world these additional resources will reside.  One thing, however, is certain: with over one billion active Facebook users and over 5 billion pieces of content shared daily, this is a huge and necessary job.  We at Mzinga strongly believe in protecting the public, and children specifically, and commend Facebook for the attention it is giving this issue.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-doubles-content-moderators/">Facebook Doubles Down on Content Moderators</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Facebook Adds AI to Suicide Prevention Arsenal</title>
		<link>https://www.onlinemoderation.com/facebook-adds-ai-suicide-prevention-arsenal/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=facebook-adds-ai-suicide-prevention-arsenal</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 06 Mar 2017 14:30:31 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Facebook]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Suicide Prevention]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1168</guid>

					<description><![CDATA[<p>Facebook Adds AI to Suicide Prevention Arsenal More than ten years ago, I complimented Facebook for encouraging its members to send in a report if they saw a post by a member or friend saying they were serious about harming themselves.  If a report was received, Facebook contacted the member with a message of concern, [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-adds-ai-suicide-prevention-arsenal/">Facebook Adds AI to Suicide Prevention Arsenal</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Facebook Adds AI to Suicide Prevention Arsenal</p>
<p>More than ten years ago, I complimented Facebook for encouraging its members to send in a report if they saw a post by a member or friend saying they were serious about harming themselves.  If a report was received, Facebook contacted the member with a message of concern, along with a list of resources for getting support.</p>
<p>Recently, Facebook <a href="http://newsroom.fb.com/news/2017/03/building-a-safer-community-with-new-suicide-prevention-tools/">announced</a> updated resource tools, as well as the use of artificial intelligence (AI), to offer more rapid assistance to those who may be contemplating suicide.  In addition to an improved support system that is now available on Facebook Live, text and videos are analyzed for content indicating that the member may be considering suicide.</p>
<p>If the software detects triggering words and phrases that indicate a member is at risk, the Facebook Community Operations team is notified.  The team will send a message of support and suggest ways the member can seek help if they need it.</p>
<p>Have we seen the last suicide on Facebook Live?  Probably not, but compared with using AI to combat online trolling behavior (e.g. Google’s Perspective), saving lives is a far more tangible and effective use of the technology.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-adds-ai-suicide-prevention-arsenal/">Facebook Adds AI to Suicide Prevention Arsenal</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Facebook Removes Swedish Cancer Society Breast Exam Images</title>
		<link>https://www.onlinemoderation.com/facebook-removes-swedish-cancer-society-breast-exam-images/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=facebook-removes-swedish-cancer-society-breast-exam-images</link>
		
		<dc:creator><![CDATA[Mzinga Moderators]]></dc:creator>
		<pubDate>Mon, 31 Oct 2016 12:31:49 +0000</pubDate>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[Content Moderation]]></category>
		<category><![CDATA[Facebook]]></category>
		<guid isPermaLink="false">http://onlinemoderation.com/?p=1056</guid>

					<description><![CDATA[<p>Facebook Removes Swedish Cancer Society Breast Exam Images As reported in The Guardian last week, Facebook removed a video posted by The Swedish Cancer Society which illustrated how to perform breast exams.  Facebook said the video, which used concentric pink and red circles for breasts, was offensive and violated their rule which states that an [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-removes-swedish-cancer-society-breast-exam-images/">Facebook Removes Swedish Cancer Society Breast Exam Images</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Facebook Removes Swedish Cancer Society Breast Exam Images</p>
<p>As reported in <a href="https://www.theguardian.com/technology/2016/oct/20/facebook-bans-breast-cancer-video-square-breasts">The Guardian</a> last week, Facebook removed a video posted by The Swedish Cancer Society which illustrated how to perform breast exams.  Facebook said the video, which used concentric pink and red circles for breasts, was offensive and violated their rule which states that an ad “can not market sex products or services nor adults products or services.”</p>
<p>When the Society protested, Facebook apologized, saying, “…our team processes millions of advertising images each week, and in some instances we incorrectly prohibit ads. This image does not violate our ad policies.”  They promised to repost the ads.</p>
<p>While the organization welcomed the restoration, it also wanted to make a point about how automated image processing can cause embarrassment and leave a mess for public relations teams to clean up.  Instead of reposting the breast circles that were deemed offensive, the Society altered the pictures and made the breasts square.</p>
<p>The video was followed by an open letter to Facebook, which said in part: “After trying to meet your control for several days without success, we have now come up with a solution that will hopefully make you happy: Two pink squares! This can not possibly offend you, or anyone. Now we can continue to spread our important breast school without upsetting you.”</p>
<p>Does your automated text, graphic, or video filtering leave you with public relations messes?  Augmenting your content monitoring with Mzinga’s team of human moderators catches these errors before they go public, allowing your public relations team to concentrate on the good things you are doing rather than responding to clients mocking your software.  Give us a call today to discuss the possibilities.</p>
<p>The post <a rel="nofollow" href="https://www.onlinemoderation.com/facebook-removes-swedish-cancer-society-breast-exam-images/">Facebook Removes Swedish Cancer Society Breast Exam Images</a> appeared first on <a rel="nofollow" href="https://www.onlinemoderation.com">Online Moderation</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
