From P2P Foundation
Discussion
The ecological problem and contemporary cybernetic planning
Nick Dyer-Witheford:"An abundant communist society of high automation, free software, and in-home replicators might, however, as Fraise (2011) suggests, need planning more than ever – not to overcome scarcity but to address the problems of plenty, which perversely today threaten shortages of the very conditions for life itself. Global climate change and a host of interlinked ecological problems challenge all the positions we have discussed to this point. Bio-crisis brings planning back on stage, or indeed calculation – but calculation according to metrics measuring limits, thresholds and gradients of the survival of species, human and otherwise. Discussing the imperatives for such ecosocialist planning, Michael Lowy (2009) points out how this would require a far more comprehensive social steering than mere ‘workers control’, or even the negotiated reconciliation of worker and consumer interests suggested by schemes such as Parecon.
Rather, it implies a far-reaching remaking of the economic systems, including the discontinuation of certain industries, such as industrial fishing and destructive logging, the reshaping of transportation methods, ‘a revolution in the energy-system’ and the drive for a ‘solar communism’ (Lowy, 2009: np).
Such transformations would involve cybernetics along two major axes, as both contributors to the current bio-crisis and as potential means for its resolution. On the first of these axes, the ecological costs of nominally ‘clean’ digital technologies have become increasingly apparent: the electrical energy requirements of cloud computing data-centres; the demands of chip manufacture for fresh water and minerals, the latter from large scale extractive enterprises; and the resulting prodigious quantities of toxic e-waste. Making every home a fab-lab mini-factory will only speed up planetary heat death. Contrary to all idealistic notions of virtual worlds, cybernetics are themselves inextricably part of the very industrial system whose operations have to be placed under scrutiny in a new system of metabolic regulation that aims for both red and green plenty.
However, cybernetic systems are also a potential part of any resolution of the bio-crisis – or, indeed, of even fully recognizing it.
Paul Edwards’ (2010) A Vast Machine analyzes the global system of climatological measurement and projection – the apparatus of weather stations, satellites, sensors, digitally archived records and massive computer simulations, which, like the Internet itself, originated in US Cold War planning – on which comprehension of global warming rests. This infrastructure generates information so vast in quantity and from data platforms so diverse in quality and form that it can be understood only on the basis of computer analysis. Knowledge about climate change is dependent on computer models: simulations of weather and climate; reanalysis models, which recreate climate history from historical data; and data models, combining and adjusting measurements from multiple sources.
By revealing the contingency of conditions for species survival, and the possibility for their anthropogenic change, such ‘knowledge infrastructures’ of people, artifacts, and institutions (Edwards, 2010: 17) – not just for climate measurement, but also for the monitoring of ocean acidification, deforestation, species loss, fresh water availability – reveal the blind spot of Hayek’s catallaxy in which the very grounds for human existence figure as an arbitrary ‘externality’. So-called ‘green capital’ attempts to subordinate such bio-data to price signals. It is easy to point to the fallacy of pricing non-linear and catastrophic events: what is the proper tag for the last tiger, or the carbon emission that triggers uncontrollable methane release? But bio-data and bio-simulations also now have to be included in any concept of communist collective planning. Insofar as that project aims at a realm of freedom that escapes the necessity of toil, the common goods it creates will have to be generated with cleaner energy, and the free knowledge it circulates have metabolic regulation as a priority. Issues of the proper remuneration of labor time require integration into ecological calculations. No bio-deal that does not recognize the aspirations of millions of planetary proletarians to escape inequality and immiseration will succeed, yet labour metrics themselves need to be rethought as part of a broader calculation of the energy expenditures compatible with collective survival." (http://www.culturemachine.net/index.php/cm/article/view/511/526)
Requirements for Cybernetic Communism
Nick Dyer-Witheford:"A new cybernetic communism ... would, we have seen, involve some of the following elements: use of the most advanced super-computing to algorithmically calculate labour time and resource requirements, at global, regional and local levels, of multiple possible paths of human development; selection from these paths by layered democratic discussion conducted across assemblies that include socialized digital networks and swarms of software agents; light-speed updating and constant revision of the selected plans by streams of big data from production and consumption sources; the passage of increasing numbers of goods and services into the realm of the free or of direct production as use values once automation, copy-left, peer-to-peer commons and other forms of micro-replication take hold; the informing of the entire process by parameters set from the simulations, sensors and satellite systems measuring and monitoring the species’ metabolic interchange with the planetary environment.
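Purely as an editorial gloss, the elements enumerated above might be arranged as a loop along the following lines. Every function here is a placeholder for a process the passage only gestures at; none of the names, figures or rules come from Dyer-Witheford's text.

```python
def compute_candidate_paths(labour_budget: float) -> list:
    """Placeholder for the supercomputed enumeration of feasible development paths."""
    return [{"growth": g, "labour_required": labour_budget * g} for g in (0.5, 1.0, 1.5)]

def democratic_selection(paths: list) -> dict:
    """Placeholder for layered assembly deliberation; here it simply picks the middle option."""
    return sorted(paths, key=lambda p: p["growth"])[len(paths) // 2]

def within_planetary_limits(plan: dict, emissions_ceiling: float = 1.0) -> bool:
    """Placeholder for parameters supplied by environmental simulation and monitoring."""
    return plan["growth"] <= emissions_ceiling

# One pass of the loop: enumerate paths, select one, check it against ecological
# parameters, and fall back to a lower-impact path if it exceeds them.
plan = democratic_selection(compute_candidate_paths(labour_budget=40.0))
if not within_planetary_limits(plan):
    plan = min(compute_candidate_paths(40.0), key=lambda p: p["growth"])
print(plan)
```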
This would indeed be a communism heir to Lenin’s ‘soviets plus electricity’, with its roots in red futurism, constructivism, tektology and cybernetics, together with the left-science fiction imaginaries of authors such as Iain M. Banks, Ken MacLeod and Chris Moriarty. It would be a social matrix encouraging increasingly sophisticated forms of artificial intelligence as allies of human emancipation. For those who fear the march of the machine it holds only this comfort: whatever singularities might spring from its networks would not be those of entities initially programmed for unconstrained profit expansion and the military defense of property, but rather for human welfare and ecological protection. Such a communism is consonant with a left accelerationist politics that, in place of anarcho-primitivisms, defensive localism and Fordist nostalgia, ‘pushes towards a future that is more modern, an alternative modernity that neoliberalism is inherently unable to generate’ (Williams & Srnicek, 2013). If it needs a name, one can take the K-prefix with which some designate ‘Kybernetic’ endeavors, and call it ‘K-ommunism’. The possible space for such a communism now exists only between the converging lines of civilizational collapse and capitalist consolidation. In this narrowing corridor, it would arise not out of any given, teleological logic, but piece by piece from countless societal breakdowns and conflicts; a post-capitalist mode of production emerging in a context of massive mid-twenty-first century crisis, assembling itself from a hundred years of non-linear computerized communist history to create the platforms of a future red plenty." (http://www.culturemachine.net/index.php/cm/article/view/511/526)
History
The history of Cybernetic Planning
Nick Dyer-Witheford:"If central planning suffered from a calculation problem, why not just solve it with real calculation machines? This was precisely the point made by Hayek’s opponent, the economist Oskar Lange, who, retrospectively reviewing the ‘socialist calculation’ debate, remarked: ‘today my task would be much simpler. My answer to Hayek … would be: so what’s the trouble? Let us put the simultaneous equations on an electronic computer and we shall obtain the solution in less than a second’ (1967: 159). Such was the project of the cyberneticians featured in Red Plenty, a project driven by the realization that the apparently successful Soviet industrial economy, despite its triumphs in the 1940s and ‘50s, was slowly stagnating amidst organizational incoherence and informational bottlenecks.
Their effort depended on a conceptual tool, the input-output table, whose development is associated with two Russian mathematicians: the émigré Wassily Leontief, who worked in the US, and the Soviet Union’s Kantorovich, the central protagonist of Red Plenty. Input-output tables – which, it was recently discovered, are amongst the intellectual foundations of Google’s PageRank algorithm (Franceschet, 2010) – chart the complex interdependence of a modern economy by showing how outputs from one industry (e.g. steel or cotton) provide inputs for another (say, cars or clothing), so that one can estimate the change in demand resulting from a change in production of final goods. By the 1960s such tables were an accepted instrument of large scale industrial organizations: Leontief’s work played a role in the logistics of the US Air Force’s massive bomber offensive against Germany. However, the complexity of an entire national economy was believed to preclude their application at such a level.
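To make the mechanics of an input-output table concrete, here is a minimal sketch of the standard Leontief calculation for a two-sector economy; the coefficients and sector names are invented for illustration, not drawn from Leontief or Kantorovich.

```python
import numpy as np

# Technical coefficients: A[i][j] = units of good i consumed to produce one unit of good j.
# Row/column order: [steel, cars]; all figures are purely illustrative.
A = np.array([
    [0.10, 0.40],   # steel needed per unit of steel, per unit of cars
    [0.00, 0.05],   # cars needed per unit of steel, per unit of cars
])

final_demand = np.array([100.0, 200.0])   # desired final output of steel and cars

# Gross output x must cover final demand plus intermediate use: x = A x + d,
# i.e. x = (I - A)^(-1) d, the standard Leontief solution.
gross_output = np.linalg.solve(np.eye(2) - A, final_demand)
print(gross_output)   # total production required, including what industry itself consumes
```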
Soviet computer scientists set out to surmount this problem. As early as the 1930s, Kantorovich had improved input-output tables with the mathematical method of linear programming that estimated the best, or ‘optimizing’, combination of production techniques to meet a given target. The cyberneticians of the 1960s aimed to implement this breakthrough on a massive scale by establishing a modern computing infrastructure to rapidly carry out the millions of calculations required by Gosplan, the State Board for Planning that oversaw economic five year plans. After a decade of experimentation, their attempt collapsed, frustrated by the pitiful state of the Soviet computer industry – which, being some two decades behind that of the US, missed the personal computer revolution and did not develop an equivalent to the Internet. It was thus utterly inadequate to the task set for it. All this, alongside political opposition from a nomenklatura that saw in the new scientific planning method a threat to its bureaucratic power, compelled abandonment of the project (Castells, 2000; Gerovitch, 2008; Peters, 2012).
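Linear programming in the spirit of Kantorovich can be illustrated with a small, invented example: choose how many hours to run each of two production techniques so as to meet an output target with the least total machine time. The figures and technique names are the editor's assumptions, not Kantorovich's own plywood problem.

```python
from scipy.optimize import linprog

output_per_hour = [3.0, 5.0]      # units produced per hour by technique 1 and technique 2
target = 300.0                    # required output
hours_available = [80.0, 50.0]    # capacity limit of each technique

# Minimize total hours x1 + x2 subject to 3*x1 + 5*x2 >= 300 and the capacity bounds.
# linprog expects "<=" constraints, so the output requirement is written negated.
result = linprog(
    c=[1.0, 1.0],
    A_ub=[[-output_per_hour[0], -output_per_hour[1]]],
    b_ub=[-target],
    bounds=[(0, hours_available[0]), (0, hours_available[1])],
)
print(result.x, result.fun)   # optimal hours per technique, and the minimal total hours
```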
This was not the only twentieth century project of ‘cybernetic revolutionaries’; as remarkable was the attempt by Salvador Allende’s Chilean regime to introduce a more decentralized version of electronic planning, ‘Project Cybersyn’ (Medina, 2005). Led by the British cybernetician Stafford Beer, this was conceived as a system of communication and control that would enable the socialist regime to collect economic data, and relay it to government decision makers, even while embedding within its technology safeguards against state micro-management and encouragement for many-sided discussions of planning decisions. This was an attempt at socio-technical engineering of democratic socialism that today perhaps seems more attractive than the post-Stalinist manoeuvres of the Soviet computer planners. But it met an even more brutal fate; Project Cybersyn was extinguished in the Pinochet coup of 1973. In the end the failure of the USSR to adapt to a world of software and networks contributed to its economic/military defeat by the United States. Its disintegration, in which, as Alec Nove (1983) demonstrated, information bottlenecks and reporting falsifications played a major role, seemed to vindicate the Austrian economists. Hayek’s praise of market catallaxy thus became central to the ‘neoliberal thought collective’ (Mirowski, 2009) that led the subsequent victory march of global capitalism.
The combined pressure of the practical disaster of the USSR and the theoretical argument of the Austrian school exerted immense force inside what remained of the left, pressuring it to reduce and reset the limit of radical aspiration to, at most, an economy of collectively owned enterprises coordinated by price signals. The many variants on such ‘market socialist’ proposals have evoked rebuttals from Marxists who refuse to concede to commodity exchange. Perhaps because they grant to the market the automatic information processing functions ascribed to it by the Austrian economists and market socialists, such rebuttals may address issues of technological innovation or public data availability, yet do not seem to engage deeply with the potentialities of contemporary computing.
Today, post-crash, claims that markets are infallible information machines may seem less credible than they did a quarter of a century ago. The parasitic energy-theft that underlies price-signal transmissions (exploitation at the point of production); the inability of individual commodity exchanges to register collective consequences (the so-called ‘externalities’); and the recursivity of a chrematistic system that loops back on itself in financial speculation, have all become more salient in the midst of global capital’s economic and ecological implosion." (http://www.culturemachine.net/index.php/cm/article/view/511/526)
Movements
Contemporary schools of thought and the problem of Labour Algorithms
Nick Dyer-Witheford:"Despite the fall of actually-existing socialism, the idea of computerized economic planning continued to be developed by small groups of theorists, who have advanced its conceptual scope further than anything attempted by Soviet cyberneticians. Two schools have been of particular importance: the ‘New Socialism’ of Scottish computer scientists Paul Cockshott and Alan Cottrell (1993); and the German ‘Bremen School’, which includes Peter Arno (2002) and Heinz Dieterich (2006), the latter an advocate of Venezuelan-style ‘Twenty First Century Socialism’. These tendencies have recently converged (Cockshott, Cottrell & Dieterich, 2010). However, because little of the Bremen group’s work is translated, the focus here will be on the New Socialism of Cockshott and Cottrell.
The distinguishing mark of the New Socialist project is its classic Marxist rigor. Accordingly, its twenty-first century super-computer planning follows to the letter the logic of the late nineteenth century Critique of the Gotha Program (Marx, 1970), which famously suggests that at the first, ‘lower’ stage of communism, before conditions of abundance allow ‘to each according to his needs’, remuneration will be determined by the hours of socially necessary labour required to produce goods and services. In the capitalist workplace, workers are paid for the reproduction of the capacity to labour, rather than for the labour actually extracted from them; it is this that enables the capitalist to secure surplus value. The elimination of this state of affairs, Cockshott and Cottrell contend, requires nothing less than the abolition of money – that is, the elimination of the fungible general medium of exchange that, through a series of metamorphoses of money in and out of the commodity form, creates the self-expanding value that is capital. In their New Socialism, work would be remunerated in labour certificates; an hour’s work could be exchanged for goods taking, on a socially average basis, an equivalent time to produce. The certificates would be extinguished in this exchange; they would not circulate, and could not be used for speculation. Because workers would be paid the full social value of their labour, there would be no owner profits, and no capitalists to direct resource allocation. Workers would, however, be taxed to establish a pool of labour-time resources available for social investments made by planning boards whose mandate would be set by democratic decisions on overall social goals.
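As a toy illustration of the certificate mechanism just described, the following sketch credits certificates for hours worked and destroys them, rather than transferring them, when they are redeemed against goods. The class, method names and figures are the editor's invention, not Cockshott and Cottrell's design.

```python
class CertificateLedger:
    """Toy labour-certificate ledger: credits are issued for hours worked and
    extinguished when redeemed; they never circulate between holders."""

    def __init__(self):
        self.balances = {}   # worker -> labour-hours credited

    def credit_hours(self, worker: str, hours: float) -> None:
        """Issue certificates for hours actually worked."""
        self.balances[worker] = self.balances.get(worker, 0.0) + hours

    def redeem(self, worker: str, labour_content: float) -> bool:
        """Exchange certificates for goods embodying an equivalent social labour time."""
        if self.balances.get(worker, 0.0) < labour_content:
            return False
        self.balances[worker] -= labour_content   # extinguished, not passed on
        return True

ledger = CertificateLedger()
ledger.credit_hours("worker_1", 8.0)
print(ledger.redeem("worker_1", 3.5), ledger.balances)   # True {'worker_1': 4.5}
```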
Labour time thus provides the ‘objective unit of value’ for the New Socialism (Cockshott & Cottrell 2003: 3). It is at this point that its proponents invoke the capacities of information technology. Such a system would require an enumeration of the labour time expended, both directly and indirectly, in the creation of goods and services, to assess the number of certificates for which these goods and services can be exchanged, and to enable the planning of their production. The basic tool of the input-output table reappears, with special attention to labour time, both as an input necessary for the production of goods, and as an output that itself requires the inputs of training and education. However, here the New Socialists have to confront a basic objection. Since the fall of the USSR it has been conventionally accepted that the scale of information processing attempted by its cyberneticians was simply too large to be feasible. Writing in the 1980s, Nove (1983) suggested that such an effort, involving the production of some twelve million discrete items, would demand an input-output calculation of a complexity impossible even with computers. This claim was repeated in recent discussions of Red Plenty, with critics of central planning suggesting that, even using a contemporary ‘desktop machine’, solving the equations would take ‘roughly a thousand years’ (Shalizi, 2012).
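The enumeration of direct and indirect labour time can be sketched as follows, with invented coefficients: total labour content v of each good satisfies v = vA + l, where A is the input-output coefficient matrix and l the direct labour per unit of output.

```python
import numpy as np

A = np.array([
    [0.2, 0.3],   # good 1 consumed per unit of good 1, per unit of good 2
    [0.1, 0.1],   # good 2 consumed per unit of good 1, per unit of good 2
])
direct_labour = np.array([2.0, 1.5])   # hours of direct labour per unit of each good

# v = l (I - A)^(-1)  is equivalent to  (I - A)^T v = l, solved here without forming the inverse.
labour_values = np.linalg.solve((np.eye(2) - A).T, direct_labour)
print(labour_values)   # socially necessary labour-hours embodied in one unit of each good
```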
Cockshott and Cottrell’s answer involves new tools, both conceptual and technical. The theoretical advances are drawn from branches of computing science that deal with abbreviating the number of discrete steps needed to complete a calculation. Such analysis, they suggest, shows their opponents’ objections are based on ‘pathologically inefficient’ methods (Cockshott, in Shalizi, 2012).
The input-output structure of the economy is, they point out, ‘sparse’—that is to say, only a small fraction of the goods are directly used to produce any other good. Not everything is an input for everything else: yogurt is not used to produce steel. The majority of the equations invoked to suggest insuperable complexity are therefore gratuitous. An algorithm can be designed to short-cut through input-output tables, ignoring blank entries, iteratively repeating the process until it arrives at a result of an acceptable order of accuracy.
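A rough sketch of the kind of short-cut described, though not Cockshott and Cottrell's actual algorithm: with a sparse coefficient matrix, the system x = Ax + d can be solved by a fixed-point iteration that only ever touches the non-zero entries, stopping once an acceptable tolerance is reached. Matrix size, density and the scaling rule below are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import random as sparse_random

n = 10_000                                            # number of product types (illustrative)
A = sparse_random(n, n, density=0.0005, format="csr", random_state=0)
A = A * (0.9 / A.sum(axis=1).max())                   # scale rows so the iteration is guaranteed to converge

d = np.ones(n)                                        # final demand vector
x = d.copy()
for _ in range(500):                                  # fixed-point iteration: x <- A x + d
    x_next = A @ x + d                                # sparse multiply skips the blank entries
    if np.max(np.abs(x_next - x)) < 1e-6:             # acceptable order of accuracy reached
        x = x_next
        break
    x = x_next
print(x[:5])                                          # gross outputs for the first few products
```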
The time would be further reduced by massive increases in computer processing speed yielded by Moore’s Law. Suggesting high-level economic planning is done on a ‘desktop machine’ is disingenuous. The issue is supercomputing capacity. According to an email communication from Benjamin Peters, in 1969, the time of Red Plenty, the ‘undisputed workhorse’ of the Soviet information economy was the BESM-6 (‘bol’shaya electronicheskaya schetnaya mashina’ – literally the ‘large/major electronic calculating machine’), which could perform at an operating speed of 800,000 flops or ‘floating-point operations per second’ – that is, roughly 0.8 megaflops, on the order of 10^6 flops. By 2013, however, supercomputers used in climate modelling, material testing and astronomical calculations are commonly exceeding 10 quadrillion flops or ten ‘petaflops’. The holder of the crown at the time of writing is Cray’s Titan at the Oak Ridge National Laboratory achieving some 17.6 petaflops (17.6 × 10^15 flops) (Wikipedia, 2013). Supercomputers with an ‘exaflop’ capacity (10^18 flops) are predicted from China by 2019 (Dorrier, 2012). Thus, as Peters (2013) says, ‘giving the Soviets a bit generously 10^7 flops in 1969, we can find (10^18 / 10^7 = 10^11) . . . a 100,000,000,000 fold increase’ by today.
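Restated as a quick calculation, using the capacity figures quoted in the passage: the scale-up is the ratio of the two speeds, which is why the exponents subtract.

```python
besm6_flops = 1e7              # Peters' generous estimate for the late-1960s Soviet machines
exaflop = 1e18                 # projected exascale capacity
print(exaflop / besm6_flops)   # 1e+11, i.e. a hundred-billion-fold increase
```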
With these capacities, Cockshott and Cottrell’s suggestion that the computer requirements for large scale economic planning could be handled by facilities comparable to those now used for meteorological purposes, seems at least plausible. The ‘calculation problem’, however, involves not just data processing but the actual availability of data; Hayek’s claim was not merely that central planners cannot crunch economic numbers fast enough, but that the numbers in a sense do not exist prior to price setting, which provides an otherwise absent measure of production performance and consumption activity. Again, Cockshott and Cottrell suggest the answer lies in computers being used as a means of harvesting economic information. Writing in the early 1990s, and invoking levels of network infrastructure available in Britain at the time, they suggest a coordinating system consisting of a few personal computers in each production unit, using standard programming packages, would process local production data and send it by ‘telex’ to a central planning facility, which every twenty minutes or so would send out a radio broadcast of adjusted statistical data to be input at local levels.
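The reporting-and-broadcast cycle Cockshott and Cottrell describe might be sketched, very loosely, as the following loop. The unit names, the naive equal-share adjustment rule and the fixed number of cycles are editorial placeholders, not their design.

```python
from dataclasses import dataclass

@dataclass
class ProductionUnit:
    name: str
    output: float      # local production data gathered by the unit's own computers

def central_adjustment(reports: dict) -> dict:
    """Stand-in for the central planning facility: aggregate reports and
    broadcast adjusted per-unit targets (here, a naive equal share of the total)."""
    total = sum(reports.values())
    return {name: total / len(reports) for name in reports}

units = [ProductionUnit("plant_a", 120.0), ProductionUnit("plant_b", 80.0)]

for cycle in range(3):                       # stands in for the ~20-minute broadcast interval
    reports = {u.name: u.output for u in units}
    targets = central_adjustment(reports)    # 'telex' up, 'radio broadcast' back
    for u in units:
        u.output += 0.5 * (targets[u.name] - u.output)   # each unit moves toward its target
    print(cycle, {u.name: round(u.output, 1) for u in units})
```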
This is a scenario too reminiscent of the ramshackle techno-futurism of Terry Gilliam’s Brazil. To bring the New Socialists up to date we should instead refer to Fredric Jameson’s iconoclastic vision of Wal-Mart as ‘the shape of a Utopian future looming through the mist’ (2009: 423). His point is that, if one for a moment ignores the gross exploitation of workers and suppliers, Wal-Mart is an entity whose colossal organizational powers model the planned processes necessary to raise global standards of living. And as Jameson recognizes, and other authors document in detail (Lichtenstein, 2006), this power rests on computers, networks and information. By the mid 2000s Wal-Mart’s data-centers were actively tracking over 680 million distinct products per week and over 20 million customer transactions per day, facilitated by a computer system second in capacity only to that of the Pentagon. Barcode scanners and point of sale computer systems identify each item sold, and store this information. Satellite telecommunications link directly from stores to the central computer system, and from that system to the computers of suppliers, to allow automatic reordering. The company’s early adoption of Universal Product Codes had led to a ‘higher stage’ requirement for Radio Frequency Identification (RFID) tags in all products to enable tracking of commodities, workers and consumers within and beyond its global supply chain." (http://www.culturemachine.net/index.php/cm/article/view/511/526)
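The automatic reordering described in the passage can be illustrated by a minimal reorder-point rule (obviously not Wal-Mart's actual system; the SKU, threshold and quantities are invented): each point-of-sale scan decrements stock, and crossing the threshold triggers a replenishment order to the supplier.

```python
REORDER_POINT = 20      # illustrative threshold
ORDER_QUANTITY = 100    # illustrative replenishment size

stock = {"sku_12345": 25}
on_order = set()
orders = []

def record_sale(sku: str, quantity: int = 1) -> None:
    """Decrement stock at the point of sale and reorder once the threshold is crossed."""
    stock[sku] -= quantity
    if stock[sku] <= REORDER_POINT and sku not in on_order:
        orders.append({"sku": sku, "quantity": ORDER_QUANTITY})   # relayed on to the supplier
        on_order.add(sku)

for _ in range(7):
    record_sale("sku_12345")
print(stock, orders)   # {'sku_12345': 18} and a single pending order
```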
On the Cybernetic Self-Management advocated by Cornelius Castoriadis
Nick Dyer-Witheford:"Historically, the anti-statist tendency in Marxism has been largely carried in a very different ‘worker council’ tradition, that, against the powers of party and state, has insisted on the role of workplace assemblies as the loci of decision-making, organization and power. In an essay antediluvian by digital standards, ‘Workers' Councils and the Economics of a Self-Managed Society,’ written in 1957, immediately after the Soviet crushing of Hungary’s Workers Councils, but republished in 1972, Cornelius Castoriadis noted the frequent failure of this tradition to address the economic problems of a ‘totally self-managed society.’ The question, he wrote, had to be situated ‘firmly in the era of the computer, of the knowledge explosion, of wireless and television, of input-output matrices’, abandoning ‘socialist or anarchist utopias of earlier years’ because ‘the technological infrastructures … are so immeasurably different as to make comparisons rather meaningless’ (Castoriadis, 1972: np).
Like the planners of Red Plenty, Castoriadis imagines an economic plan determined with input-output tables and optimizing equations governing overall resource allocation (e.g. the balance between investment and consumption), but with implementation in the hands of local councils. His crucial point, however, is that there should be several plans available for collective selection. This would be the mission of ‘the plan factory’, a ‘highly mechanized and automated specific enterprise’, using ‘a computer’ whose ‘memory’ would ‘store the technical coefficients and the initial productive capacity of each sector’ (Castoriadis, 1972: np). This central workshop would be supported by others studying the regional implications of specific plans, technological innovations, and algorithmic improvements. The ‘plan factory’ would not determine what social targets should be adopted; it would merely generate options, assess consequences, and, after a plan has been democratically chosen, update and revise it as necessary. Castoriadis would agree with Raymond Williams’s (1983) later observation that there is nothing intrinsically authoritarian about planning, providing there is always more than one plan." (http://www.culturemachine.net/index.php/cm/article/view/511/526)
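Castoriadis's 'plan factory' can be caricatured in a few lines: generate several candidate plans by varying the weight placed on consumption versus investment, and leave the choice among them to democratic decision. The two-good structure and all coefficients are invented for illustration and are not from Castoriadis's text.

```python
from scipy.optimize import linprog

total_labour = 1000.0            # available labour-hours
labour_per_unit = [2.0, 4.0]     # hours per unit of consumption goods, investment goods
minimum_output = [50.0, 50.0]    # floor on each category so plans remain comparable

candidate_plans = []
for w in (0.3, 0.5, 0.7):        # alternative social weights on consumption vs. investment
    # Maximize w*consumption + (1-w)*investment, i.e. minimize the negative.
    res = linprog(
        c=[-w, -(1.0 - w)],
        A_ub=[labour_per_unit],             # labour used may not exceed what is available
        b_ub=[total_labour],
        bounds=[(minimum_output[0], None), (minimum_output[1], None)],
    )
    candidate_plans.append((w, res.x))

for w, plan in candidate_plans:              # options presented for collective selection
    print(f"consumption weight {w}: consumption={plan[0]:.0f}, investment={plan[1]:.0f}")
```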
More Information
- Gerovitch, S. (2008) ‘InterNyet: Why the Soviet Union Did Not Build a Nationwide Computer Network’, History and Technology 24 (4): 335-350.
- Greenwood, D. (2007) ‘From Market to Non-Market: An Autonomous Agent Approach to Central Planning’, Knowledge Engineering Review 22 (4): 349-360.
- Medina, E. (2011) Cybernetic Revolutionaries: Technology and Politics in Allende's Chile. Cambridge, MA: MIT Press.
- Mirowski, P. (2002) Machine Dreams: Economics Becomes a Cyborg Science. Cambridge: Cambridge University Press.