Richard R. Weiner, “Network Knowledge Governance: Algorithms and Platform Politics,” The European Legacy 23, no. 3 (2018): 306–310. DOI: 10.1080/10848770.2017.1396101. Published online: 08 Nov 2017.

The European Legacy, 2018 VOL. 23, NO. 3, 306–310 https://doi.org/10.1080/10848770.2017.1396101

REVIEW

Network Knowledge Governance: Algorithms and Platform Politics

The Knowledge Corrupters: Hidden Consequences of the Financial Takeover of Public Life, by Colin Crouch, Cambridge, Polity Press, 2016, x + 182 pp., $64.95/£50 (cloth), $19.95/£15.99 (paper)

The Lure of Technocracy, by Jürgen Habermas, Cambridge, Polity Press, 2015, xii + 176 pp., $64.95/£50 (cloth), $22.95/£15.99 (paper)

Digital Sociology, by Deborah Lupton, London, Routledge, 2015, vi + 230 pp., $168/£99 (cloth), $51.95/£28.99 (paper)

State Transformations in OECD Countries: Dimensions, Driving Forces and Trajectories, edited by Heinz Rothgang and Steffen Schneider, Basingstoke, Palgrave Macmillan, 2015, xiv + 317 pp., $115/£76 (cloth)

Richard R. Weiner
Department of Political Science, Rhode Island College, Providence, RI 02908, USA

The vision of Henri Bergson, Romain Rolland, and Jean Monnet was one of a peaceable Europe that had escaped the ghosts of the past, one that could steer the continent to economic growth and fully realize the script’s cosmopolitan ending of democracy and human rights. This vision of a better Europe has fallen into crisis: it seems no longer realistic or realizable in the face of new risks and challenges. With the creation of the Organisation for Economic Co-operation and Development (OECD) in 1960, Monnet sought to help administer the Marshall Plan (the European Recovery Program) by using shared knowledge and peer pressure to improve policy and to implement the soft law of standard setting so as to attain sustainable economic growth, employment and financial stability. The OECD was to have a mediating function: building learning networks that map out standards of appropriate behavior and appropriate technical solutions. It was to provide an ecology of collaborative governance and shared knowledge.

Can the idea of Europe today as a European Union capacitate solidarity and reciprocity among its parts? Can European nation-states and both inter-urban and inter-regional communities within Europe become embedded in knowledge-sharing so as to measure and steer the knowledge economy of a globalized society of networks?

On June 24, 2016, one day after the British referendum vote to leave the European Union, stocks lost 140 billion dollars of value, the pound dropped to a 30-year low, and Britain’s credit rating plummeted. The aftershocks were felt throughout Europe and the world. The Spanish stock exchange (IBEX) in Madrid crashed. The Spanish prime minister, Mariano Rajoy Brey, with a general election only two days away, condemned the populist new-left political alliance of Unidos Podemos. Populism was dangerous. Now was not a time to take chances. Concurrently, the politics of resentment against European elites (Eurocrats) and their technocratic quants exploded, along with a growing intolerance for Otherness under the Schengen Agreement’s policy of free movement across borders.

With these four books on my desk I realized I was no longer writing a review essay about history or hypotheticals. We were witnessing an eruption, a shifting of tectonic plates with aftershocks. The decades since 1989 have seen the arrogance of a coded world of national central planning replaced by the arrogance of a coded world of globalized neoliberalism. This is the world wide web of governing algorithms and the numbers they mine and grind: a world where altruism has been driven out, where trust is understood only in terms of expectations, and where nation-states are assessed by their creditworthiness.

In The Lure of Technocracy, Europe’s leading philosophe and public intellectual Jürgen Habermas notes the expansion of system-steering capacities beyond the nation-state: sequential cooperation and coordination in complex networks of purposive action. Detached and distant experts and elites order our lives in more or less different ways with governing and discriminatory algorithms that quantitatively assess performance and risks, with dataveillance as well as surveillance. Can the associated emergent supranational and transnational technocrats—sarcastically dubbed quants—enlarge their basis for legitimation to match the expansion of their network-steering capacities?
Further—as Colin Crouch demonstrates—knowledge produced for network governance often comes from professional consultants rather than from the scientific expertise it purports to rest on. Do citizens increasingly feel that the world is “passing them by”? Do they feel bypassed and marginalized by transnational corporate networks, by transnational law decided in private courts (arbitration in the offices of powerful global law partnerships), by the OECD, the European Union, the International Monetary Fund (IMF), the Bank for International Settlements (BIS), and the World Trade Organization (WTO)? The distant and detached expertise of liberalizing technocrats who increasingly “order” our lives provokes both illiberal populists and a populist newer left: the former without much talent for politics; the latter without thought-out policy alternatives.

Habermas’s volume stirs us to ponder whether the governing algorithms used to sift and sort, collate and code, analyze and store “Big Data” can engage citizens to interact more with both government in their respective nation-states and the politics of their myriad public spheres. Can citizens maintain—or retake—control of their own destiny? Or are we caught in the “undertow” (Der Sog, Habermas’s initial noun in the German title, translated awkwardly as “lure”)—the undertow of technocratic rationality and of applied governing algorithms? Habermas directs us to puzzle over how governance and knowledge mutually constitute and impact each other in the emergent complex networks. These network webs can be understood as recursive feedback loops of mutual recognition, mutual self-limitation, and co-monitoring in systemic network steering by both state and non-state actors. These are multilayered and multi-scalar spaces outside formal government, ordered by governing algorithms.
Public questions like financial ones are transposed into problems solvable by algorithms—formal procedural rules—for reasons of efficiency amidst information overload. In embedded globalized coding, the objectifications and procedures are not value neutral, nor are they apolitical in setting indicators and measures. How do international organizations like the OECD, the EU and the WTO deal with the power of transnational corporations and their influence over data, their massaging and distorting of it? This is what Colin Crouch refers to as “knowledge corrupting.” Crouch points to such recent cases as: (1) Volkswagen’s emissions-cheating software; (2) British Petroleum’s record of pollution; and (3) Libor (the London Interbank Offered Rate), with its fraudulent reports and manipulated submissions affecting hedge-fund derivative markets. These cases demonstrate how executives in profit-maximizing corporations have incentives to ignore or distort knowledge. Worse are cases of physicians over-diagnosing dementia because raising the number of diagnoses results in bonuses: gaming the system, as opposed to subverting the grid. Firms often seek to take control of public knowledge and use it for their own ends, to their own advantage. Left to itself, Crouch continues, market-mindedness only rarely gives a firm incentives to reduce damage to the general environment, as everything is ultimately reduced to money values. Whom, Crouch asks, do technocrats serve in this reduction of all ways of life to financially framed cost/benefit algorithms prescribed by neoliberal economics rather than by the experience of professionals and skilled workers? The answer is finance capital, with its unelected financial specialists who shape the coding of the knowledge technocrats apply. Such finance-oriented framing can have corrupting effects on other forms of assessment and the knowledge they supposedly generate with their metrics. One must ponder what such knowledge tells you, and what it does not. It comes down to the quality of the knowledge.
In Digital Sociology, Deborah Lupton explains that we must understand what goes into these so-called metrics that measure, quantify and monitor performance as digital data flows for comparative purposes. Performance is assessed in terms of observable, digitized algorithmic parameters wherein data is coded, aggregated and interpreted. Specifically, there is the modeling, the coding, the programming, the sorting, the gatekeeping of comparative assessment, and even the decision-making algorithms for network systemic steering. The focus is on outcomes, not causes: how to anticipate, how to predict, how to regulate, how to control. Network protocols are designed by people who establish an authoritative logical container within which we as humans interact, such as the European Stability Mechanism. Our emergent global society of networks is embedded in a series of programmed assumptions about making: how to do something under given conditions. Parameters are set within which policymakers satisfice in reviewing alternatives. These regimes of encoded procedures develop as decentered and distributed networks constructing their own sort of ordering rule of algorithmic expertise, while at the same time trying to cope with their own sources of legitimacy and accountability in terms of some procedural rationality. Habermas—and the late American sociologist of science Robert K. Merton—stress the importance of setting up argumentative procedures in the networking spaces of the co-creation, sharing and transfer of knowledge.1 This is especially important in the cross-border transactions of both our scientific and economic worlds.
Knowledge can only be assured if it is achieved argumentatively, according to a carefully disciplined procedural rationality enabling the practice of validation rules to test justificatory propositions and their application discourses.2 The testing needs to occur within what Merton termed the necessary trials of “organized skepticism,” which are fundamental to a community of inquiry in the advancement of knowledge. Procedural standards must be met to secure reflexive, recursive and responsibly competent deliberation with assured means for self-correction.3 For the later Habermas, this is a procedural rationality to meet practical problems, to test warranted assertions in the production of policy knowledge rather than an engagement with chimerical counterfactuals. To combat the undertow of an unresponsive and unquestioning algorithmic technocracy, Habermas cautions us about a shift from the normative sense of procedural rationality to a purely cognitive one of expertise in cascading networks of complexity. Crouch cautions us about the need to secure the integrity of knowledge. Merton, like Max Weber at the dawn of the twentieth century, stresses the need for policy science to be governed by an ethos of detached scrutiny.

Coded network governance—as Heinz Rothgang and Steffen Schneider’s bulky volume shows—limits a nation-state’s sovereign existence and purposes as ends in themselves, with the emergence of complex and heterogeneous networks of political authorities in which the state is only one authority among others. Yet the volume also shows how these networks remain reliant on the nation-state. Why? Because only the nation-state can provide the complementary resources, which non-state network actors lack, to exercise political authority effectively and legitimately. The nation-state no longer exercises authority directly and exclusively through its own powers and resources. More and more, it indirectly complements the powers and resources of non-state actors. The EU is dependent on national governments to enact and enforce European legislation passed by the European Parliament.
Indeed, EU institutions are beginning to show—as Fritz Scharpf comments elsewhere—“more open access to a wider plurality of organized interests than is true of most member governments.”4 Rothgang and Schneider use the TRUDI model of the state form that developed in OECD countries during the first three decades after World War II: (T) a territorial state that controls the monopoly on the use of coercion; (RU) guarantees the rule of law; (D) features democratic decision-making reflecting the will of the people; and (I) intervenes in economic and social affairs to secure efficiency and social justice. As the regulatory role of the nation-state undergoes marked changes in the global knowledge economy, procedural limitations appear in the form of functional necessities of multilevel governance. Nation-states must thus comply with an unwelcome new sense of legitimacy, one reflecting procedural restraints outside their own respective legitimate law-making.

The volume’s most significant chapter, by Gralf-Peter Calliess, Hermann B. Hoffmann, and Jens-Michael Lobschat, illustrates an emerging resolution of this ambiguity. While nation-states struggle with the transnationalization of commercial law—initiated in exogenous global civil society and not in state legislatures—there are early indications that they recognize the necessity of adapting state courts to the legal particularities of cross-border transactions and of reducing their dependency on private arbitral tribunals. Network nodes are not centrally coordinated; they exhibit a decentered paranodality involving more than a single dominating code. Digital sociological analysis comprehends algorithms as more than abstract technical formulations: they are expressions of deeper human and institutional choices.
As such, they establish forms of order that have political ramifications and engender argumentative debates as to the integrity of the knowledge shaped, stored and transferred.5 Recently, the Indignados movement in Spain explored and expanded “a politics of platforms,” asserting legitimation claims within a mutually referent network. Resisting neoliberal assujetissement, there is a scaling up of local initiatives of social assertion and social capability, agonistically posing a sensibility (a habitus) of a solidarity-based economy of sustainable community movement organizations (SCMOs, in the terminology of socioeconomics). These are developed endogenously within Spanish civil society. Similarly, Network-Europe 21, based in Berlin, envisages a new version of Europe in which regions and metropolitan areas, and the interlinkages between them, become the constitutive hubs of a new Europe stressing transnational and plurinational realities.6

Deborah Lupton notes how software programs can widen expert participation in constructing alternative forms of life, especially by expanding the possibility of getting feedback from more diverse sources. For example, software platforms can be leveraged to give citizens—whose credentials can carry accessible digital badges—more opportunities to speak to their talents. This could add the deliberative democratic input, with procedural restraints, that Habermas invokes in “fulfilling citizens’ political will” in European institutions, and reprise the mission of a more representative and participatory bureaucracy that Max Weber envisioned.

Notes
1. As Merton noted before the dropping of the atom bomb on Hiroshima, in “The Normative Structure of Science,” 267–78, originally published in 1942 as “Science and Technology in a Democratic Order.”
2. Alexy, Theory of Legal Argumentation.
3. Günther, “Welchen Personenbegriff braucht die Diskurstheorie des Rechts?” 87.
4. Scharpf, “Reflections on Multilevel Legitimacy.”
5. Crawford, “Can an Algorithm Be Agonistic?” 77–92; cf. Mouffe, Agonistics.
6. Guérot, “Europe as a Republic.”

Bibliography
Alexy, Robert. A Theory of Legal Argumentation: The Theory of Rational Discourse as Theory of Legal Justification. Translated by Neil MacCormick. Oxford: Clarendon Press, 1999.
Crawford, Kate. “Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics.” Science, Technology, & Human Values 41, no. 1 (2016): 77–92.
Guérot, Ulrike. “Europe as a Republic: The Story of Europe in the Twenty-First Century.” Eurozine. http://www.eurozine.com/articles/2015-07-10-guerot-en.html.
Günther, Klaus. “Welchen Personenbegriff braucht die Diskurstheorie des Rechts?” In Das Recht der Republik, edited by Hauke Brunkhorst and Peter Niesen. Frankfurt a. M.: Suhrkamp, 1999.
Merton, Robert K. “The Normative Structure of Science.” In The Sociology of Science. Chicago: University of Chicago Press, 1973. Originally published as “Science and Technology in a Democratic Order.” Journal of Legal and Political Sociology 1 (1942): 115–26.
Mouffe, Chantal. Agonistics: Thinking the World Politically. London: Verso, 2013.
Scharpf, Fritz W. “Reflections on Multilevel Legitimacy.” Max-Planck-Institut für Gesellschaftsforschung Working Paper 07/3 (2007).
