{"id":6232,"date":"2019-04-01T21:29:23","date_gmt":"2019-04-02T01:29:23","guid":{"rendered":"http:\/\/ncjolt.org\/?p=6232"},"modified":"2020-06-04T20:52:26","modified_gmt":"2020-06-04T20:52:26","slug":"facebook-announces-the-use-of-ai-to-detect-and-prevent-revenge-porn","status":"publish","type":"post","link":"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/facebook-announces-the-use-of-ai-to-detect-and-prevent-revenge-porn\/","title":{"rendered":"Facebook Announces the Use of AI to Detect and Prevent \u201cRevenge Porn\u201d"},"content":{"rendered":"\n<p>Facebook recently <a href=\"https:\/\/newsroom.fb.com\/news\/2019\/03\/detecting-non-consensual-intimate-images\/\">announced<\/a>\nthe use of new technology to prevent the sharing of non-consensual intimate\nimages, most commonly referred to as \u201crevenge porn.\u201d Facebook will now use \u201cmachine\nlearning and artificial intelligence\u201d to \u201cproactively detect near nude images\nor videos that are shared without permission on Facebook or Instagram,\u201d\naccording to the company\u2019s Global Head of Safety, Antigone Davis. Facebook is undertaking this\ninitiative out of goodwill rather than legal obligation. However,\nin light of a recent amendment to federal law, it is foreseeable that initiatives\nlike this will be required in the future and will raise significant legal\nconcerns. <\/p>\n\n\n<p><strong>The Impacts of\nRevenge Porn<\/strong><\/p>\n\n\n<p>\u201c<a href=\"https:\/\/www.cybercivilrights.org\/faqs\/\">Revenge porn<\/a>\u201d is defined as the \u201cdistribution of sexually graphic images of individuals without their consent,\u201d and it includes images that were obtained consensually in the context of a relationship, as well as those obtained through hidden camera or hacking. 
<a href=\"https:\/\/datasociety.net\/pubs\/oh\/Nonconsensual_Image_Sharing_2016.pdf\">One study<\/a> found that 4 percent of internet users \u201chave either had sensitive images posted without their permission or had someone threaten to post photos of them.\u201d Victims of revenge porn suffer severe personal and psychological consequences. <a href=\"http:\/\/jaapl.org\/content\/jaapl\/44\/3\/359.full.pdf\">A study<\/a> noted that 80 to 93 percent of victims \u201csuffered significant emotional distress\u2026[including] anger, guilt, paranoia, depression, or even suicide.\u201d <\/p>\n\n\n<p><a href=\"https:\/\/www.cybercivilrights.org\/wp-content\/uploads\/2014\/12\/RPStatistics.pdf\">Another study<\/a> indicates that 51 percent of victims have had suicidal thoughts \u201cdue to being a victim;\u201d 42 percent have had to \u201cexplain the situation to professional or academic supervisors, coworkers, or colleagues;\u201d 55 percent fear that their professional reputation will be negatively impacted in the future; and 82 percent have \u201csuffered significant impairment in social, occupational, or other important areas of functioning due to being a victim.\u201d While both men and women can be victims of revenge porn, this study concluded that 90 percent of victims were women. <\/p>\n\n\n<p><strong>Facebook\u2019s Attempt\nto Combat Revenge Porn<\/strong><\/p>\n\n\n<p>In an attempt to preemptively combat this issue, Facebook is launching an <a href=\"https:\/\/newsroom.fb.com\/news\/2019\/03\/detecting-non-consensual-intimate-images\/\">AI tool<\/a> to detect revenge porn posts before anyone reports them. 
The tool itself will <a href=\"https:\/\/www.cnbc.com\/2019\/03\/15\/facebook-ai-tool-detects-revenge-porn-before-its-reported.html\">recognize<\/a> near nude content that is paired with \u201cderogatory or shaming text.\u201d It will then flag the content to be reviewed by a \u201cspecially-trained\u201d member of Facebook\u2019s staff, who will determine if the content violates the <a href=\"https:\/\/www.facebook.com\/communitystandards\/\">Community Standards<\/a>, remove it, and potentially disable the account that posted the content without permission. Facebook shared in an email to <a href=\"https:\/\/gizmodo.com\/\">Gizmodo<\/a> that the \u201cdetection technology was <a href=\"https:\/\/gizmodo.com\/facebook-needs-to-better-explain-how-its-going-to-use-a-1833323427\">trained on revenge porn<\/a> in order to better understand what these types of posts would look like.\u201d This most recent initiative is an extension of Facebook\u2019s 2017 <a href=\"https:\/\/newsroom.fb.com\/news\/h\/non-consensual-intimate-image-pilot-the-facts\/\">pilot program<\/a> that called for users to proactively submit their intimate images and videos so that Facebook could create a \u201cdigital fingerprint\u201d of each file, allowing the company to detect the content and prevent it from ever being shared on the site.<\/p>\n\n\n<blockquote class=\"wp-block-quote\"><p> Facebook will now use \u201cmachine learning and artificial intelligence\u201d to \u201cproactively detect near nude images or videos that are shared without permission on Facebook or Instagram,\u201d according to the company\u2019s Global Head of Safety, Antigone Davis. 
<\/p><\/blockquote>\n\n\n<p>The 2017 pilot program received <a href=\"https:\/\/www.forbes.com\/sites\/dbloom\/2018\/05\/24\/facebook-wants-your-nude-photos-what-could-possibly-go-wrong\/#36c01a8b4587\">extensive criticism<\/a>, including skepticism over whether Facebook should be trusted with intimate photos after it was determined that the site had shared private data with third-party providers. Similarly, the announcement of the AI tool has been met with skepticism about its chances of success. A <a href=\"https:\/\/www.rollingstone.com\/culture\/culture-news\/facebook-revenge-porn-ai-software-808867\/\">major concern<\/a> is that Facebook has yet to announce exactly how the tool will work. AI is only as good as the data it is given and reflects the biases of its programmers, which is an issue, <a href=\"https:\/\/www.rollingstone.com\/culture\/culture-news\/facebook-revenge-porn-ai-software-808867\/\">according to Rolling Stone<\/a>, because \u201ca sizeable percentage of actual human men don\u2019t actually know what consent is.\u201d Therefore, it is unlikely that a machine would be able to determine if a post was shared with permission. Without more information on how the technology is actually \u201ctrained\u201d to detect nonconsensual images, the <a href=\"https:\/\/gizmodo.com\/facebook-needs-to-better-explain-how-its-going-to-use-a-1833323427\">skepticism will continue<\/a> and legal concerns will likely arise. <\/p>\n\n\n<p><strong>The Laws Regarding\nRevenge Porn and Social Media Liability<\/strong><\/p>\n\n\n<p>Revenge porn has been recognized as a serious threat, prompting most states to enact revenge porn statutes. 
There is <a href=\"https:\/\/heinonline.org\/HOL\/P?h=hein.journals\/text50&amp;i=357\">dissonance among states<\/a> over whether revenge porn should be redressed through criminalization or civil remedies; and because revenge porn doesn\u2019t fall within any of the enumerated categories of unprotected speech, there is also an underlying First Amendment question surrounding the laws banning revenge porn. North Carolina\u2019s <a href=\"https:\/\/www.ncleg.net\/EnactedLegislation\/Statutes\/PDF\/BySection\/Chapter_14\/GS_14-190.5A.pdf\">Disclosure of Private Images Act<\/a> criminalizes the act of \u201cknowingly disclos[ing] an image of another person with the intent to\u2026coerce, harass, intimidate, demean, humiliate, or cause financial loss to the depicted person\u201d or with the intent to cause others to do any of the same. Along with criminal prosecution, the N.C. law allows victims to recover civil remedies. <\/p>\n\n\n<p>While states have begun to address victims\u2019 rights against perpetrators of revenge porn, there hasn\u2019t been as much legal progress in holding websites accountable for the explicit content shared on their platforms. Many websites are protected by the <a href=\"https:\/\/www.govinfo.gov\/content\/pkg\/USCODE-2011-title47\/pdf\/USCODE-2011-title47-chap5-subchapII-partI-sec230.pdf\">Communications Decency Act \u00a7230<\/a>, which provides a \u201c<a href=\"https:\/\/www.bna.com\/insight-communications-decency-n73014482221\/\">safe harbor to internet service providers and platforms<\/a>, exempting them from liability based on the speech and content of their users.\u201d However, in 2018, Congress passed an <a href=\"https:\/\/www.congress.gov\/bill\/115th-congress\/house-bill\/1865\/text\">amendment to Section 230<\/a> that allowed enforcement against online service providers that knowingly host content that promotes sex trafficking. 
<\/p>\n\n\n<p>Based on the recent amendment, it is foreseeable that in the near future, social media sites will have a more active duty to prevent the sharing of revenge porn on their platforms. If this does happen, Congress will have to decide if tools like the one Facebook is launching will help guard against liability or will signal that a site knowingly hosts explicit content. Another issue that will need to be addressed is whether removing the content is enough, or whether sites should have a duty to report revenge porn to authorities, particularly where minors are involved. <\/p>\n\n\n<p>As the law currently stands, there is no requirement for social media sites to prohibit the sharing of revenge porn or to report the criminal activity to authorities. But as the sharing of nonconsensual intimate images becomes more common, special attention should be paid to whether social media sites should be required to take an active role in preventing it and what that role should actually be.<\/p>\n\n\n<p>Hannah Petersen, 18 March 2019<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Facebook recently announced the use of new technology to prevent the sharing of non-consensual intimate images, most commonly referred to as \u201crevenge porn.\u201d Facebook will now use \u201cmachine learning and artificial intelligence\u201d to \u201cproactively detect near nude images or videos that are shared without permission on Facebook or Instagram,\u201d according to the company\u2019s Global Head <a href=\"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/facebook-announces-the-use-of-ai-to-detect-and-prevent-revenge-porn\/\" 
class=\"more-link\">&#8230;<\/a><\/p>\n","protected":false},"author":1,"featured_media":3626,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[51],"tags":[],"_links":{"self":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6232"}],"collection":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/comments?post=6232"}],"version-history":[{"count":1,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6232\/revisions"}],"predecessor-version":[{"id":6830,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6232\/revisions\/6830"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/media\/3626"}],"wp:attachment":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/media?parent=6232"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/categories?post=6232"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/tags?post=6232"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}