
PACT Act Does More Harm than Good

Will Duffield

The bipartisan, process-oriented “Platform Accountability and Consumer Transparency Act” joins a recent parade of Section 230 reform proposals. Sponsored by Sens. Brian Schatz (D-HI) and John Thune (R-SD), the PACT Act proposes a collection of new requirements intended to optimize social media platforms’ governance of user speech. These government-mandated practices for handling both illegal speech and speech that merely violates platform community standards would upset delicate, platform-specific balances between free expression and safety. While more carefully constructed than competing proposals, with provisions actually tailored to the ends of accountability and transparency, the bill threatens to encumber platforms’ moderation efforts while encouraging them to remove more lawful speech.

The PACT Act would establish a process for removing illegal speech, giving platforms 24 hours to remove content that has been deemed illegal by a court. A company that fails to act would lose Section 230’s protections against liability, protections generally thought essential to these companies. Leaving decisions of legality to the courts is important: it preserves democratic accountability and prevents platforms from laundering takedown demands that would not otherwise pass legal muster. Under Germany’s NetzDG law, platforms must remove “manifestly unlawful” content within 24 hours or risk steep fines, a set of incentives that has encouraged the removal of legal speech on the margins.

The bill’s proposed process for removal would be improved by the addition of a counter-notice system, more specific requirements for identifying illegal content, and a longer takedown window to allow for either user or platform appeal. Still, it is a broadly reasonable approach to handling speech unprotected by the First Amendment.

The breadth of covered illegal content is somewhat limited: it includes only speech “determined by a Federal or State court to violate Federal criminal or civil law or State defamation law.” This would exclude, for instance, New Jersey’s constitutionally dubious prohibition on the publication of printable firearms schematics.

While the legal takedown mechanism requires a court order, the bill’s requirement that platforms investigate all reports of community standards violations is ripe for abuse. Upon receiving notice of “potentially policy-violating content,” platforms would be required to review the reported speech within 14 days. Like law enforcement, content moderators have limited resources to police the endless flow of user speech and must prioritize particularly egregious or time-sensitive policy violations. Platform-provided user reporting mechanisms are already abused in attempts to vindictively direct moderators’ focus. Requiring review (with a deadline) upon receipt of a complaint would make abusive flagging more effective by limiting moderators’ ability to ignore bad-faith reports. Compulsory review would be weaponized by political adversaries to dedicate limited platform enforcement capacity to investigating their rivals. Community standards can often be interpreted broadly; under sustained and critically directed scrutiny, even broadly compliant speakers may be found in breach of platform rules. Moderators, not the loudest or most frequent complainants, should determine platform enforcement priorities. While the bill also mandates an appeals process, this amounts to a simple re-review rather than an escalation and will at best invite an ongoing tug-of-war over contentious content.

Some of the bill’s components are constructive. Its transparency reporting requirements would bring standardization and specificity to platform enforcement reports, particularly around the use of moderation tools like demonetization and algorithmic deprioritization. This measure would formalize platforms’ hitherto voluntary enforcement reporting, allowing for better cross-platform comparisons and evaluations of disparate impact claims. Beneficent intentions and effects aside, as requirements these reporting standards would likely raise compelled speech concerns.

However, other aspects of the bill are sheer fantasy in the face of platform scale. PACT would require platforms to maintain “a live company representative to take user complaints through a toll-free telephone number” during regular business hours. If on a given day even a hundredth of a percent of Facebook’s 2.3 billion users decided to make use of such an option, they would generate some 230,000 calls. In the early days of Xbox Live, Microsoft maintained a forum to answer user moderation complaints. The forum was so inundated with unreasonable and inane questions that the project was later abandoned. While Microsoft may have incidentally provided some valuable civic education, other platforms should not be required to replicate its Sisyphean efforts.

An Xbox Live moderator explains the difference between public and private speech governance.


