
Deplatforming and Democratic Legitimacy

A chapter in Regulating Free Speech in a Digital Age (Springer, 2022)

Abstract

Who calls the shots in the content moderation that defines the boundaries of free speech online—private companies, or governments and the courts? This question came into sharp focus with the deplatforming of then-President Donald Trump by Twitter, Facebook and YouTube, and the deplatforming of Parler by Google, Apple and Amazon, following the insurrection at the U.S. Capitol on January 6, 2021. Decisions about deplatforming and the restriction of free speech should not be made by private companies, but government censorship and web filters operated by public authorities can also be deeply problematic when governments over-reach their democratic legitimacy and censorship mechanisms lack independent oversight or rights of review and appeal. Reining in abuse of the internet requires multi-stakeholder governance, with a system of checks and balances between government, business, civil society and public interest media. While deplatforming can be effective, it may have unintended consequences, and it is only a part-solution to complex social problems and processes.


Notes

  1.

    Deplatforming here means “the technical act of disabling social media profiles and rendering their content inaccessible to other users” (Fielitz & Schwarz, 2020, p. 12; Rogers, 2020, pp. 214, 226), or removing an app from an app store or denying access to web services, rather than the wider application of the term in “cancel culture” (Sect. 9.4), including withdrawing speaking invitations or denying the use of venues.

  2.

    In the following days, Twitter also removed more than 70,000 accounts that promoted the QAnon conspiracy theory (Conger, 2021; Twitter Safety, 2021). New Zealand was not immune to the clampdown: Twitter suspended the accounts of hundreds of New Zealand-based users, many of whom voiced right-wing political opinions (Walls, 2021).

  3.

    The advocacy coalition Stop Hate for Profit had launched a campaign to pressure the major platforms, including Alphabet’s YouTube, to ban Trump from their services. The organisation, which includes the Anti-Defamation League, the NAACP, the National Hispanic Media Coalition, Free Press and Color of Change, said it would call for an advertiser boycott if the platforms did not take action by January 20, 2021, the date of president-elect Joe Biden’s inauguration (Stop Hate for Profit, 2021). See further Sect. 10.2.

  4.

    Parler’s company name is French for “to speak”. The heading is from a song that was popular with Allied soldiers during World War I—“Mademoiselle from Armentières”, with its hook line, “Inky Pinky parlez-vous”.

  5.

    Since its inception, Parler’s community guidelines have not allowed “obscenity, terrorist content and ‘fighting words,’ or calls to incite violence” (Lerman, 2020).

  6.

    By August 2021, the instant messaging app Telegram had been downloaded over 1 billion times globally. Its largest market is India, followed by Russia and Indonesia (Singh, 2021).

  7.

    Volokh (2021) adds: “Facebook and Twitter, unlike the government, can’t send us to jail or tax us. But at least governmental speech restrictions are implemented in open court, with appellate review.”

  8.

    Navalny has good reason to distrust state censorship. In July 2021, it was reported that the Russian internet watchdog Roskomnadzor had blocked 49 websites connected to Navalny, including his own website (Deutsche Welle, 2021). In September 2021, Apple and Google removed an app to co-ordinate protest voting in Russia’s elections, after the Russian government threatened to prosecute local employees of the two companies (Troianovski & Satariano, 2021).

  9.

    New Zealand’s Films, Videos, and Publications Classification Act 1993 defines a publication as objectionable “if it describes, depicts, expresses, or otherwise deals with matters such as sex, horror, crime, cruelty, or violence in such a manner that the availability of the publication is likely to be injurious to the public good”.

  10.

    In 2020, DIA awarded the Digital Child Exploitation Filtering System contract to Allot Limited, a multinational provider of network intelligence and security solutions, based in Israel (N.Z. Department of Internal Affairs, 2020). The new contract includes filtering “violent extremism content” in addition to the current material (Kenny, 2020b).

  11.

    Commenting on Apple’s announcement of impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage, McKinney and Portnoy (2021), writing for the Electronic Frontier Foundation, warn of risks to privacy and security: “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to wider abuses.” In September 2021, Apple delayed plans to roll out its child sexual abuse detection technology (Whittaker, 2021). See further Bajak and Ortutay (2021) and Green and Stamos (2021); and an October 2021 report by prominent cyber-security experts Abelson et al. (2021).
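    To see why the EFF treats this constraint as unenforceable within the design itself, consider a minimal sketch of a client-side scanning loop. The sketch is purely illustrative and is not Apple’s actual design: an ordinary cryptographic hash stands in for a perceptual hash such as NeuralHash, and the hash value and function names are hypothetical. The structural point is that the matching code is indifferent to what the target list contains.

        import hashlib
        from pathlib import Path

        # Illustrative sketch of client-side scanning (not Apple's design).
        # A real system would use a perceptual hash so that near-duplicate
        # images still match; SHA-256 stands in here for simplicity.

        # The device ships with an opaque set of target hashes. Nothing in
        # the scanning logic constrains what those hashes represent, which
        # is the EFF's structural objection. The value below is a placeholder.
        TARGET_HASHES = {
            "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
        }

        def image_hash(path: Path) -> str:
            """Hash an image exactly as it sits on the device, before any
            end-to-end encryption is applied for transmission."""
            return hashlib.sha256(path.read_bytes()).hexdigest()

        def scan_before_send(path: Path) -> bool:
            """Return True if the image may be sent; False flags it for
            reporting. Because the check runs client-side, encryption in
            transit offers no protection against it."""
            return image_hash(path) not in TARGET_HASHES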

  12.

    Practical concerns were highlighted when an NZ Herald investigation found that the Digital Child Exploitation Filter had stopped working for two months in 2020, allowing access to images and videos of abuse from banned sites, and that the Filter has since failed on two further occasions, again allowing access to online child sex abuse material (Fisher, 2021).

  13.

    In considering freedom of peaceful assembly and association, the U.N. Human Rights Council has declared that the same rights that people have offline must also be protected online (Kaye, 2015; U.N. Human Rights Council, 2013, 2018; Voule, 2019).

  14.

    Barraclough et al. (2021) define “better rules” as the application of service design techniques to policy development, including skillsets from computer science and business process modelling, concept modelling, decision flow diagrams and rule statements. By “law as code” they mean machine-consumable legislation that embodies (interpretations of) the law in code, for example, writing new laws using a process that could generate a computer-implementable output.
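    As a concrete, if deliberately toy, illustration of machine-consumable legislation, the sketch below encodes one interpretation of a fictional eligibility provision as an executable rule statement. The statute, section number and threshold are invented for illustration only; Barraclough et al. (2021) do not prescribe any particular language or encoding.

        from dataclasses import dataclass

        # "Law as code" sketch: a fictional section 12 eligibility rule
        # expressed as data plus a decision function. All names and the
        # income ceiling below are hypothetical.

        @dataclass
        class Applicant:
            age: int
            is_resident: bool
            annual_income: float

        INCOME_CEILING = 30_000.00  # hypothetical statutory threshold

        def eligible_for_benefit(a: Applicant) -> bool:
            """One interpretation of the fictional section 12: a resident
            adult qualifies if their income falls below the ceiling."""
            return a.is_resident and a.age >= 18 and a.annual_income < INCOME_CEILING

        # A computer-implementable decision generated directly from the rule:
        print(eligible_for_benefit(Applicant(age=34, is_resident=True,
                                             annual_income=24_500.00)))  # True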

  15.

    Barraclough et al. (2021, p. 100, para. 423) reference, for example, legislation described in the Online Harms White Paper in the United Kingdom, the European Union’s proposed Digital Services Act, and Australian legislation criminalising the sharing of “abhorrent violent material”, among other things.

  16.

    The bill passed its third reading in October 2021 and takes effect from February 1, 2022. Voluntary internet filters, which online service providers in New Zealand can choose to sign up to, remain on the table outside of legislation.

  17.

    The Trump administration reacted negatively to digital services tax regulations because most of the online service providers falling under the new regimes are based in the United States (BBC News, 2019).

  18.

    President Biden has assembled “an aggressive anti-trust team”, including Jonathan Kanter to lead the Justice Department’s anti-trust division, Lina Khan to lead the Federal Trade Commission and Tim Wu as the special assistant to the president for technology and competition policy (Tankersley & Kang, 2021).

  19.

    For further background and analysis of the Digital Services Act and Digital Markets Act, see Kalbhenn (2021).

  20.

    On media law regulation of social networks in Germany, see Holznagel and Kalbhenn (2021).


Author information

Correspondence to David Bromell.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Bromell, D. (2022). Deplatforming and Democratic Legitimacy. In: Regulating Free Speech in a Digital Age. Springer, Cham. https://doi.org/10.1007/978-3-030-95550-2_4

