Trump, Parler, and Regulating the Infosphere as Our Commons

Editorial
Luciano Floridi 1, 2
Philosophy & Technology (Springer Netherlands)


Abstract

The facts are well known by now: after a pro-Trump mob stormed the US Capitol building, causing the death of five people, Donald Trump became digitally toxic and was deplatformed (Crichton 2021), due to the danger of his violent, incendiary messages, which often contained false or misleading statements. Facebook, Instagram, Twitter and YouTube suspended Trump’s accounts. Twitter suspended accounts linked to QAnon, the far-right movement close to Trump. Parler, the right-wing extremist platform frequented by Trump supporters, saw its app banned by Google and Apple, and Amazon suspended its web hosting. Similar initiatives were taken by other services such as Pinterest, Reddit, Shopify, TikTok and Twitch. In a way, it was a success (Rupar 2021): online political misinformation about electoral fraud fell by 73%. But the question, still echoing these days, remains: did these companies do the right thing?

It is a crucial question for the future of digital societies and their democratic organisation. Unfortunately, it is also the wrong question, because it reduces a twofold problem to a binary choice. For if we are only talking about legality and the protection of the public interest, there are good reasons to answer yes (Conger and Isaac 2021, updated 12 January 2021); but if we are also talking about democratic legitimacy and digital sovereignty, there are good reasons to answer no (Liptak 2021). Luckily, the two answers are reconcilable (West and Lakier 2021). The crucial variable is time: today, they did the right thing (finally, some people like me would add), but tomorrow, societies should not depend on companies doing the right thing if and when they wish, independently of any rules and democratic accountability. The rest of the article explains why and how this is the case.

Those approving or disapproving of the deplatforming of Trump agree that, if this can happen to the President of the USA, it can happen to anyone. The difference is that those in favour of the ban proclaim that this shows that no one is above the rules, whereas those against the ban complain that this shows that we are all subject to the arbitrary, potentially whimsical and unaccountable power of these companies. The truth is that, for some time, people had been complaining about Trump’s misuse of social media to spread populist, demagogic, misleading and incendiary messages, unacceptable both for what they stated (e.g. about the pandemic or the presidential election) and for what they omitted (e.g. in terms of rejecting or criticising white supremacists’ actions or propaganda). The violence in Washington and the pandemic, which has forced people to live increasingly connected and online, have made the public more keenly aware of the importance of good digital communication and a decent ecology of social media. What has been clear to researchers for a long time has become obvious to the educated public as well: the same companies involved in the deplatforming of Trump are also criticised for abusing their oligopolistic positions and enabling the spread of so much misinformation and fake news. So the question asked above—whether the deplatforming was acceptable—is important because it is the symptom of a more general and crucial historical problem: who is in charge in the infosphere (Floridi 2014a)?
Today, digital sovereignty (Floridi 2020a)—understood as the ability to control our lives online and, increasingly, our onlife experience tout court (Floridi 2014b)—is also largely in the hands of a few colossal companies. We have already seen this with Google and Apple in mobile telephony: through their APIs, the two companies have decided who can do what, and how, with mobile phones, even in the case of apps designed to fight COVID-19 (Morley et al. 2020; Floridi 2020b). The problem is clearly serious, but I already mentioned that the question—did they do the right thing or not?—is both simplistic and polarising.

On the one hand, those in favour of the suspension of Trump’s accounts argue that the platforms in question are private companies that offer services on their own terms, set by them and freely accepted by the users, and hence that they have the right to suspend any user as and when they want, if the terms of service are not respected (Brandom 2021). They stress that the platforms allowed Trump to communicate for so long only because, as the President of the United States (POTUS), he was considered one of those exceptional cases where, for reasons of public interest, messages were tolerated that would otherwise have led to the suspension of the service if sent by any other user. But they also conclude that things changed because, in the long run, communications like Trump’s, which deny the truth (think of Trump’s denial of the pandemic or of climate change) and incite violence, end up harming the public interest, and so must ultimately be moderated and then blocked.

On the other hand, those opposed to the suspension object that this is not only a question of the consistent application of the terms of use—because, in that case, the same platforms should have blocked Trump much earlier and intervened in many other contexts (Sri Lanka, Myanmar, India, Ethiopia; Satariano 2021, updated 17 January 2021)—but also one of economic interest, unaccountable arbitrariness and a risk of ‘censorship’ (though note that this is a loaded word that prejudges as negative whatever content moderation it is used to describe; Graham 2021). The suspension happened so late—they continue—because Trump was finally an outgoing loser, because past clashes, even personal ones, could finally find an outlet without repercussion, and because the operation could help gain some favour with the new Biden administration. Too little too late for society, too convenient for companies, too risky for democracy. The real problem was not Trump, soon out of the game, but that the decision to silence a voice—no matter how problematic—was left to corporate discretion. The reasoning continues by stressing that the companies in question are not neutral but promote an ideology that is neo-liberal, anti-conservative and exclusively focused on freedom of speech as more important than any other right (think of privacy or security), as long as such an ideology is coherently, but also conveniently, aligned with the companies’ business models and strategies. In the case of Trump, such a Californian ideology may be likable, but in other cases it could easily erode pluralism and silence dissenting voices. Because of these arguments, I observe, those who want to defend freedom of expression at all costs end up, somewhat paradoxically, on the same side as the right-wing and autocratic powers that have strongly objected to the decision to block Trump’s accounts.
Indeed, more generally, digital sovereignty in the hands of private companies scares both those who fear it as an erosion of democracy and freedom of speech (Ragozin 2021) and those who oppose it as a threat to their own authoritarian power (Chunduru 2021). Thus, the editorial immunity sanctioned by the famous Section 230 is defended both by those who want freedom of speech protected against censorship and by those who want it to ensure that their own violent and extremist contents are not removed; it is attacked both by those who, like Trump (Smith 2020), want to make sure that platforms cannot remove any content, and by those who, like Biden (Lerman 2021), want platforms to be held accountable for removing unacceptable content. The real difficulty is that it all depends on what may replace it if it is removed.

How can this problem be solved? From a public interest and legality standpoint, the companies did well to block Trump and Parler. They should have done it before, they should have done it in many other cases too, and it certainly took no courage to do it so late. However, by blocking Trump and Parler (think also of the current debate about Facebook and Australian legislation on the linking and dissemination of news), these private companies have shown that, de facto, they have a public role of crucial public interest, since they decide what may or may not happen in the infosphere and hence in the lives of billions of people (Naughton 2021). This was never a simple matter of communication channels, where providers have no responsibility for the exchanged contents. In reality, the infosphere is a shared, relational space, a commons, to use a traditional English legal term (Ostrom et al. 1999; Floridi 2013). It is the space where humanity spends more and more time and where more and more activities take place directly or indirectly, from education to work, from socialisation to entertainment, from commerce to finance, from the exercise of justice to political discussion, from research to journalism. It is the space that influences every other space, even the physical one; just think of all the issues surrounding defence and security. It is a space that should be conceptualised and governed more like a condominium—like Antarctica and the Space Station, which belong to everyone—rather than like a new frontier that can be appropriated and colonised by anybody, or like a space that belongs to no one, like the Moon.

So, those who are worried about the fact that some companies have silenced Trump and Parler (for example in Germany and France; Jennen and Nussbaum 2021) are right, because the sovereignty of this space should not be left to private enterprises, business strategies, self-regulation and market forces (Breton 2021). It is time to take seriously the fact that the infosphere is humanity’s commons, and hence to regulate its use with open and transparent rules, legally grounded on all human rights and on human dignity, to avoid arbitrariness, unaccountability, abuse and discrimination (Stoller and Miller 2021). One must remember that the companies that suspended Trump are also part of the problem, not just the solution, because they are also the ones who first empowered and then disempowered such a demagogue through their platforms. The companies did the right thing by deplatforming Trump and Parler, for reasons of self-regulation of the services provided and of public interest.
Still, it is not right that they have so much power in the first place, for reasons of accountability and misplaced digital sovereignty. The conclusion is that this time we were lucky (Goldberg 2021) and the companies in question acted correctly (if late and partially), but crossing our fingers is not a viable political strategy. We must therefore establish the right ethical and legal framework to ensure that next time these companies operate in the interest of all, not merely out of convenience or occasional good will, but for reasons of regulatory responsibility and social accountability. This may sound unrealistic, but it is enough to read the Digital Services Act to understand that the European Union is coming to the same conclusion and building the regulatory framework that will make an operation like the one against Trump not only justified but also accountable and not arbitrary (see Article 20).

And if this development seems worrisome because politics should never control free speech, two things must be remembered: that even the right to freedom of speech has its limits when aligned and harmonised with other rights (Wildman 2017), such as that of security against disinformation and incitement to violence; and that politics is not the same everywhere. It is only where those who control the controllers are themselves the controlled that one can talk of real democracy, and it is only in a real democracy that a limit to freedom of speech is not censorship but tolerant respect for civil communication, one that hurts nobody and is good for everybody, as in the European Union.

And to those who object that suspensions and deplatforming, however welcome they may sometimes be, never work, because they do not block extreme, intolerant or radicalised views and the same unacceptable forms of communication simply reappear elsewhere (Blackburn et al. 2021; Ou 2021), one may retort that separating what is edible from what is poisonous may not wipe out the poisonous, but it enables one to have a much healthier and safer diet (Bedingfield 2021). True, those who wish to do so will be able to continue to feed on falsehood, lies, demagogy, nonsense, violence and other unpalatable contents, but with greater difficulty; and those who want to avoid certain poisons will be able to do so much more easily, without running the risk of finding them mixed everywhere, indiscriminately, on open platforms accessible to billions of people. It is time to be green on our blue technologies: an ethically preferable and legally acceptable ecology of the infosphere is overdue. Maybe someday we will thank Trump for making us reach a tipping point and finally decide to reform the rules that determine who controls the infosphere and how.

Most cited references

Ostrom, E., et al. (1999). Revisiting the commons: local lessons, global challenges.

In a seminal paper, Garrett Hardin argued in 1968 that users of a commons are caught in an inevitable process that leads to the destruction of the resources on which they depend. This article discusses new insights about such problems and the conditions most likely to favor sustainable uses of common-pool resources. Some of the most difficult challenges concern the management of large-scale resources that depend on international cooperation, such as fresh water in international basins or large marine ecosystems. Institutional diversity may be as important as biological diversity for our long-term survival.
Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality.

Floridi, L. (2013). The ethics of information.

Author and article information

Contributors
luciano.floridi@oii.ox.ac.uk

Journal
Philosophy & Technology (Philos Technol), Springer Netherlands (Dordrecht)
ISSN: 2210-5433 (print); 2210-5441 (electronic)
Published: 8 March 2021, pp. 1-5

Affiliations
[1] Oxford Internet Institute, University of Oxford, 1 St Giles, Oxford, OX1 3JS, UK
[2] The Alan Turing Institute, 96 Euston Road, London, NW1 2DB, UK

Article
DOI: 10.1007/s13347-021-00446-7
PMCID: PMC7937439

© The Author(s), under exclusive licence to Springer Nature B.V. 2021

This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.

History
Received: 24 February 2021; Accepted: 24 February 2021

Categories
Editor Letter
Philosophy of science
