Cyborg Sciences: Mirowski Redefines Economics
The modern economy is not a neutral mechanism but a historically shaped computational machine. Philip Mirowski introduces the concept of cyborg sciences as a meta-paradigm in which the boundary between organism and machine, market and code, becomes fluid. In place of the 19th-century metaphors of energy and equilibrium, these sciences put information, control, and probability at the center. Understanding this transformation reveals that today's market is not merely a site of exchange but, above all, a theater of automata, in which the classical definition of rationality must give way to the theory of computability.
Maxwell's Demon and the Informational Foundation of Markets
The central figure of this new ontology is Maxwell's Demon, an entity capable of extracting order from chaos by processing information. In the 20th century, thanks to the work of Norbert Wiener and Claude Shannon, the demon became a figure for the modern data operator. Wiener's cybernetics equated the struggle against entropy with the struggle for meaning and control, while Shannon defined information as a statistical measure of choice, divorcing it from meaning altogether. This approach redefines the market as a system for managing uncertainty.
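As a minimal illustration of that separation (my own sketch, not drawn from the source), Shannon's measure can be computed from probabilities alone: it quantifies the uncertainty of a choice among symbols while remaining completely indifferent to what the symbols mean.

```python
import math

def shannon_entropy(probabilities):
    """Average information of a choice, in bits, computed from the
    probability distribution alone; the meaning of the symbols never
    enters the formula."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely market signals carry 2 bits of information...
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# ...while an almost certain outcome carries next to none.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24
```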
A key limitation, however, is introduced by the Turing machine. Alan Turing proved that there are undecidable problems and uncomputable numbers. Seen through the lens of the cyborg sciences, the neoclassical model of the rational actor falls into a logical contradiction: it assumes that humans routinely solve optimization problems that no physically existing machine could solve in finite time. This undermines the foundations of game theory and neoclassical choice theory.
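The following toy example (entirely my own, with invented numbers) illustrates only the milder, practical half of that contradiction: even when a global optimum is computable in principle, finding it by exhaustive search explodes combinatorially; Turing's undecidable problems are strictly worse, since for them no finite amount of computation suffices at all.

```python
from itertools import product

def brute_force_best(prices, values, budget):
    """Enumerate every subset of n assets (2**n candidates) and return the
    most valuable affordable one, i.e. the global optimum that the textbook
    rational agent is simply assumed to know."""
    best_choice, best_value = None, float("-inf")
    for choice in product([0, 1], repeat=len(prices)):
        cost = sum(c * p for c, p in zip(choice, prices))
        total = sum(c * v for c, v in zip(choice, values))
        if cost <= budget and total > best_value:
            best_choice, best_value = choice, total
    return best_choice, best_value

# Ten assets are trivial: 2**10 = 1,024 candidates.
print(brute_force_best([3, 5, 2, 7, 4, 6, 1, 8, 2, 5],
                       [4, 6, 3, 9, 5, 7, 1, 10, 2, 6], budget=15))
# But the search space doubles with every asset added: a few hundred assets
# already exceed any physically existing machine, and undecidable problems
# have no finite bound at all.
```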
The Market as Processor: From the Cowles Commission to Zoids
The evolution of economic thought within the Cowles Commission shows a transition from structural econometrics (Mark I) to treating the market as an information-processing system (Mark II). Under the influence of von Neumann and Koopmans, the economic agent came to be seen as a utility processor. A breakthrough came with experiments involving zoids, that is, zero-intelligence agents. These experiments showed that market efficiency depends not on the heroic rationality of individuals but on institutional architecture.
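Below is a loose sketch in the spirit of those zero-intelligence experiments, not a reproduction of them; the valuations, the price ceiling of 200, and the random-matching simplification are my own assumptions. The agents bid blindly, and the only "intelligence" in the system is the institutional rule that nobody may trade at a loss.

```python
import random

def zi_double_auction(buyer_values, seller_costs, rounds=5000, seed=1):
    """Zero-intelligence ('zoid') traders in a stripped-down double auction.
    Each round a random buyer and seller shout random prices; the only
    institutional rule is the budget constraint (no trading at a loss)."""
    rng = random.Random(seed)
    buyers, sellers = list(buyer_values), list(seller_costs)
    surplus = 0.0
    for _ in range(rounds):
        if not buyers or not sellers:
            break
        b = rng.randrange(len(buyers))
        s = rng.randrange(len(sellers))
        bid = rng.uniform(0, buyers[b])      # a buyer never bids above their value
        ask = rng.uniform(sellers[s], 200)   # a seller never asks below their cost
        if bid >= ask:                       # the market rule matches them
            surplus += buyers[b] - sellers[s]
            buyers.pop(b)
            sellers.pop(s)
    return surplus

# The efficient allocation here yields a total surplus of 400;
# blind agents under the budget constraint typically capture most of it.
print(zi_double_auction([180, 160, 140, 120, 100], [20, 40, 60, 80, 100]))
```

In a variant without that constraint, where zoids may bid above their values or ask below their costs, loss-making trades become possible and realized surplus drops, which is the sense in which responsibility for the outcome lies in the institutional code rather than in the heads of the participants.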
Different types of markets correspond to different classes of automata with varying computational power. It is the language of bids and algorithmic rules, rather than the psychology of participants, that determines the system's ability to reach equilibrium. In this view, the market designer becomes a computational machine engineer, and responsibility for economic outcomes shifts from individual decisions to the structural institutional code.
Models of Cyborgization and the Radicalism of Artificial Intelligence
Modern macroeconomics masks its cyborg nature: officially, it speaks of equilibrium, but in practice, it operates through algorithms and data filters. Artificial intelligence radicalizes this trend, moving the logic of Command, Control, Communications, and Intelligence (C3I) to the heart of business. Globally, we observe three styles of taming "Maxwell's Demon": the North American model (unrestricted accumulation and digital monopolies), the Gulf model (a hybrid of technology and Sharia law), and the European model (an attempt at constitutionalization and data protection regulation).
Delegating decisions to autonomous algorithms carries the risk of systemic opacity. When markets become an interaction of meta-automata, the classical notion of responsibility vanishes. AI does not solve the problem of rationality—it merely moves it to a higher level of abstraction, where heuristic errors can lead to catastrophes of unprecedented scale, while simultaneously strengthening the concentration of computational power.
The Constitutionalization of Computation and Future Scenarios
To avoid the totalization of data, the constitutionalization of computation is necessary: a right to inspect the logic of automata, a right to contest machine decisions, and the designation of spheres free from automation. Law becomes the last bastion protecting human subjectivity from being reduced to a data point. The future offers three scenarios: asymmetric hegemony by tech corporations, fragmented regional regulation, or a procedural reconstruction in which every algorithmic decision must be publicly justified in human language.
Will we manage to harness Maxwell's Demon to serve human ends, or will we ourselves become cogs in its digital machinery? The future depends on whether we can create a constitution for algorithms before they write one for us.