{"id":6682,"date":"2020-02-12T21:03:38","date_gmt":"2020-02-12T21:03:38","guid":{"rendered":"https:\/\/journals.law.unc.edu\/jolt\/?p=6682"},"modified":"2020-06-04T20:52:23","modified_gmt":"2020-06-04T20:52:23","slug":"one-step-closer-to-facebook-oversight-board","status":"publish","type":"post","link":"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/one-step-closer-to-facebook-oversight-board\/","title":{"rendered":"\u201cOne step closer to Facebook Oversight Board\u201d"},"content":{"rendered":"\n<p>In December 2019, Facebook\u00a0<a href=\"https:\/\/variety.com\/2019\/digital\/news\/facebook-130-million-fund-content-oversight-board-1203434228\/\">announced<\/a> an initial commitment of $130 million to launch its new Facebook Oversight Board. The Board, which has been called Facebook\u2019s Supreme Court, is designed as a way for users to appeal decisions made by Facebook about its enforcement of community standards, which prohibit \u201cactivity and content like violence and criminal behavior, pornography and other objectionable content and behavior, as well as ads on the service.\u201d The Oversight Board Trust was\u00a0<a href=\"https:\/\/about.fb.com\/news\/2019\/12\/oversight-board-update\/\">created<\/a> as a \u201cnon-charitable purpose trust\u201d under Delaware law, and the initial commitment of $130 million will serve to cover operating costs.<\/p>\n\n\n\n<p>Facebook is currently vetting candidates for up to 40 spots on the Oversight Board. 
The Board will have its own staff, which will operate independently from Facebook, and\u00a0<a href=\"https:\/\/about.fb.com\/news\/2019\/12\/oversight-board-update\/\">Facebook<\/a> anticipates that the staff will include a director, case managers, and staff members or contractors.<\/p>\n\n\n\n<p>The Board is seen as a way for Facebook to build user trust and position itself proactively against potential government\u00a0<a href=\"https:\/\/variety.com\/2019\/politics\/news\/elizabeth-warren-tech-company-breakup-1203372410\/\">regulation<\/a>. Facebook\u00a0<a href=\"https:\/\/about.fb.com\/news\/2019\/12\/oversight-board-update\/\">believes<\/a> that the Board should be focused on human rights principles, including the freedom of expression, privacy, and remedy. It worked with BSR, an independent nonprofit organization with \u201cexpertise in human rights practices and policies,\u201d to commission an impact assessment on how best to respect and promote these human rights principles. The recommendations in the impact\u00a0<a href=\"https:\/\/www.bsr.org\/en\/our-insights\/blog-view\/a-human-rights-review-of-the-facebook-oversight-board\">assessment<\/a> include \u201cdiversity of board members, remedies, user support, transparent communications and privacy-protective tools,\u201d which Facebook states have helped inform its bylaws and its charter.<\/p>\n\n\n\n<p><strong>In Volume 21, Issue 1 of the North Carolina Journal of Law and Technology, Evelyn Douek published a\u00a0<a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">piece<\/a> entitled \u201cFacebook\u2019s \u2018Oversight Board:\u2019 Move Fast with Stable Infrastructure and Humility\u201d in which she analyzed the values the Board can bring to Facebook\u2019s content moderation cases. 
<\/strong>First, Douek proposes that the Board will be able to highlight weaknesses in the policy formation process at Facebook, which would help streamline the legislative process in creating its Community Standards. Second, Douek argues that the Board could serve as an important forum for public discussion \u201cnecessary for persons in a pluralistic community to come to accept the rules that govern them, even if they disagree with the substance.\u201d<\/p>\n\n\n\n<p>Douek highlights how the Facebook Oversight Board is intended to help solve the challenge of content moderation, made nearly impossible by the sheer scale of a platform like Facebook, which has over 2 billion monthly active users and 2.5 billion pieces of content shared every day. The content is moderated to align with Facebook\u2019s Community Standards, and Facebook takes action on any content that is in breach of these rules. Facebook also re-reviews decisions that were appealed by users; in 2019, that amounted to nearly 25 million requests for appeal. Facebook has employed a two-pronged approach consisting of: (1) an \u201cindustrial\u201d decision factory that approaches decisions with consistency and aims to reduce the Community Standards to bright-line rules, and (2) the use of artificial intelligence, which took down over 95% of content violating Community Standards before it was even reported by a user.<\/p>\n\n\n\n<p>However, it is difficult to train AI to appreciate the infinite spectrum of human nuance, particularly in instances of bullying, harassment, and hate speech. AI does not give a person who has a decision made against them the feeling of being heard, nor does it offer public reasoning for its decisions. Nor can AI determine all of the values that should be encoded into the detection algorithms, because, for example, what constitutes hate speech varies around the world. 
Therefore, the benefits of outsourcing this role to an independent body like the FOB are \u201cgreater transparency and reason-giving provided by Facebook employees and policy-makers within the content moderation ecosystem.\u201d&nbsp;<\/p>\n\n\n\n<p>Since the FOB will be resolving disputes over the exercise of power by Facebook over free speech and expression, Douek argues that the disputes it resolves are more analogous to public law. Thus, the FOB\u2019s decisions will need to take into account a conception of the \u201cpublic interest,\u201d rather than just each immediate case. To achieve this balance, the FOB will adopt a \u201c<a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">Weak-Form Review\u201d<\/a> in this\u00a0<a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">\u201cJudicial-Style Check on Policy Making.\u201d<\/a> That is, it will only review individual decisions under the Community Standards and \u201cwill not decide cases where reversing Facebook\u2019s decision would violate the law.\u201d\u00a0<\/p>\n\n\n\n<p>Moreover, Douek highlights two potential limits to the legitimacy of the FOB. First, Facebook will still always have the \u201c<a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">power to overrule\u201d<\/a> the decisions made by the FOB. Second, there will be some difficult cases where the FOB will not have a \u201c<a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">right answer.\u201d<\/a> Despite these limitations, the FOB can be a great way to fill the loopholes and blind spots in Facebook\u2019s policies. Most of Facebook\u2019s policies are made haphazardly in response to high-profile controversies and scandals. 
Having a Judicial-Style Check will give the FOB a chance to account for all perspectives and to review the practical and unintended consequences of a rule\u2019s application. This will lead Facebook to have <a href=\"http:\/\/journals.law.unc.edu\/ncjolt\/wp-content\/uploads\/2019\/10\/DouekIssue1_Final_.pdf\">better policies<\/a>, and will also encourage users to take responsibility for their actions. Because users will know they are accountable and will be given reasons for the decisions made against them, they will likely avoid the conduct that led to those decisions.\u00a0<\/p>\n\n\n\n<p>Although the FOB will face challenges in giving practical public reasoning for its decisions, given the lack of global norms and justifications for online free speech, it is still an important and promising innovation: an attempt to regulate and set standards for online discourse that could become an independent source of universally accepted free speech norms.<\/p>\n\n\n\n<p>Madiha Chhotani &amp; Meredith Richards<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In December 2019, Facebook\u00a0announced an initial commitment of $130 million to launch its new Facebook Oversight Board. 
The Board, which has been called Facebook\u2019s Supreme Court, is designed as a way for users to appeal decisions made by Facebook about its enforcement of community standards, which prohibit \u201cactivity and content like violence and criminal behavior, <a href=\"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/one-step-closer-to-facebook-oversight-board\/\" class=\"more-link\">&#8230;<\/a><\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[51],"tags":[],"_links":{"self":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6682"}],"collection":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/comments?post=6682"}],"version-history":[{"count":1,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6682\/revisions"}],"predecessor-version":[{"id":6683,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/6682\/revisions\/6683"}],"wp:attachment":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/media?parent=6682"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/categories?post=6682"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/tags?post=6682"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}