
Facebook Oversight Board reiterates calls for Meta to be more transparent

Posted Dec 12, 2021 02:36:04

New York (CNN Business) The independent board that oversees Facebook's parent company, Meta, is once again urging the social media giant to be more transparent about its content moderation decisions.

The Facebook Oversight Board on Thursday released two rulings overturning Meta's decisions to remove user posts from its platforms, saying the content did not actually violate the company's policies. In both decisions, the board recommended that Meta (FB) provide more information to users about actions it takes on their content.

The Oversight Board is comprised of experts in areas such as freedom of expression and human rights who are appointed by the company but operate independently. The board, which reviews user appeals of content decisions on Meta-owned platforms, is often described as a kind of Supreme Court for Facebook.

The board has been making similar calls for transparency since its decision in May to uphold Facebook's suspension of Donald Trump. In that ruling, the board agreed that Trump had severely violated Facebook's policies but criticized the company for its initial indefinite suspension of the then-President. The board asked Facebook to clarify how its actual policies applied to the situation, saying that "In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities."

"They can't just invent new, unwritten rules when it suits them," Oversight Board co-chair Helle Thorning-Schmidt said in May about the Trump ruling. "They have to have a transparent way of doing this."

In October, the Oversight Board released its first quarterly transparency report, noting that "Facebook isn't being clear with the people who use its platforms," and calling on the company to give users more information about content decisions. The board also criticized Meta for withholding crucial details about its "Cross-Check" program to review content decisions related to high-profile users, following reporting on the program based on leaked internal documents. (In response, Meta said it had asked the board for input on Cross-Check and would "strive to be clearer in our explanations to them going forward.")

Calls for more transparency cropped up again in the latest round of Oversight Board decisions.

Oversight Board's Thursday rulings

In one of the two cases, the board overturned Meta's decision to remove an Instagram post discussing ayahuasca, a psychedelic drink that has long been used for indigenous healing and spiritual rituals, after a review by its automated system and a human moderator. Meta told the Oversight Board that the post was removed because it encouraged the use of a non-medical drug. But the Board said the post did not actually violate Instagram's Community Guidelines at the time, which only prohibited the sale and purchase of drugs. It also took issue with the company's claim that the post could harm public health, saying the post discussed the use of ayahuasca in a religious context and did not include information about how to get or use the substance.

But much of the board's criticism in the case centered on the fact that Meta did not tell the user who made the post which of its rules they broke.

"The Board is concerned that the company continues to apply Facebook's Community Standards on Instagram without transparently telling users it is doing so," the board said in a statement. "The Board does not understand why Meta cannot immediately update the language in Instagram's Community Guidelines to tell users this."

The board ordered Instagram to restore the post. And in its recommendations based on the case, the board said Meta should "explain to users precisely what rule in the content policy they have violated," when their content is acted upon. It also encouraged the company to update Instagram's Community Guidelines to be consistent with the Community Standards on Facebook within 90 days.

In response to the board's decision, Facebook said it had reinstated the post and would conduct a review of similar content. It also said that it would review the board's policy recommendations.

"We welcome the Oversight Board's decision today on this case," the company said in a statement.

The second case regarded an August 2021 Facebook post of a piece of North American Indigenous art meant to raise awareness about the discovery of unmarked graves at a former residential school for Indigenous children. In the post, the user noted the name of the artwork, "Kill the Indian/ Save the Man," and described images in the work as, "Theft of the Innocent, Evil Posing as Saviours, Residential School / Concentration Camp, Waiting for Discovery, Bring Our Children Home." Facebook's automated systems identified the post as potentially violating its hate speech policy and a human reviewer removed it the day after it was posted; when the user appealed the decision, a second human reviewer affirmed the decision to remove.

When the Oversight Board selected the case for review, Meta identified the post's removal as an "enforcement error" and restored it on August 27, according to the board's Thursday statement. However, Meta did not notify the user that their post had been restored until a month later, after the board asked about the company's messaging to the user, an issue Meta blamed on human error, the board said.

Based on the case, the board recommended that Meta, "provide users with timely and accurate notice of any company action being taken on the content their appeal relates to." It added that "in enforcement error cases like this one, the notice to the user should acknowledge that the action was a result of the Oversight Board's review process."

Meta said in a statement that no action would be needed based on the board's decision in this case because it had already reinstated the post, and said it would review the board's policy recommendations.

In its second transparency report, also released Thursday, the Oversight Board noted the two decisions and said it would be "monitoring whether and how the company lives up to its promises" as it responds to the board's recommendations. It also announced plans to release an annual report next year to assess the company's performance in implementing the board's decisions and recommendations.

"Over time, we believe that the combined impact of our recommendations will push Meta to be more transparent and benefit users," the board said in the report.
