Introduction
Modern organizations, in their pursuit of maximum efficiency, paradoxically lose their capacity for adaptation and real accountability. Dan Davies diagnoses this phenomenon as institutional lobotomy—the atrophy of higher cognitive functions and the erosion of communication channels. Consequently, decisions become automatic, detached from human consequences, and responsibility dissolves in a thicket of procedures. The core problem lies in the suppression of variety and the lack of correction mechanisms, leading to structures that shield power centers from inconvenient feedback. Understanding this "unaccountability machine" requires a cybernetic perspective that exposes the technical causes of institutional blindness.
Decision Vacuums and the Cybernetics of Control
In large organizations, a decision vacuum emerges: a state in which decisions are made constantly, yet have no identifiable author capable of changing them. This is, at root, a broken feedback loop. According to the cybernetic principle of POSIWID (The Purpose Of a System Is What It Does), a system's real purpose is what it actually does, not what its mission statement declares. If an institution repeatedly generates the same errors, then producing those errors is, in cybernetic terms, its function.
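To make "broken feedback loop" concrete, here is a minimal sketch (all names and numbers are invented for illustration, not taken from the book): the same procedure runs with its error channel either connected or severed, and only in the first case does the system stay on target.

```python
# Minimal sketch of a broken feedback loop (illustrative only;
# the values and names are invented, not from Davies' book).

def run(steps, feedback_connected):
    target, output = 100.0, 100.0
    drift = -3.0                      # the environment keeps changing
    for _ in range(steps):
        output += drift               # disturbance acts on the system
        error = target - output       # the signal a regulator would need
        if feedback_connected:
            output += error           # correction: the error is acted upon
        # else: the error is measured, filed, and ignored
    return output

print(run(10, feedback_connected=True))   # holds the target: 100.0
print(run(10, feedback_connected=False))  # drifts unchecked: 70.0
```

The procedure is identical in both runs; what differs is whether the measured error can reach anything with the authority to act on it.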
The foundation of effective control is Ashby's Law of Requisite Variety: only variety can absorb variety. A regulator must command at least as many responses as there are disturbances in its environment. Stafford Beer, in his Viable System Model (VSM), described an organization as an architecture of five interlocking systems: operations (S1), coordination (S2), internal control (S3), intelligence (S4), and identity or policy (S5). When the higher systems (strategic intelligence and policy) atrophy, the organization loses its ability to learn, becoming defenseless against the complexity of the world.
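Ashby's law has a quantitative reading: a regulator can reduce outcome variety by at most a factor of its own response variety, so V(outcomes) >= V(disturbances) / V(responses). The toy simulation below (disturbance and response sets invented for illustration) shows an "optimized" regulator, stripped of responses, failing to absorb the environment's shocks.

```python
# Toy illustration of the Law of Requisite Variety (invented example).
# A disturbance d is absorbed iff the regulator has a response r with
# r == -d; any nonzero outcome counts as a failure of regulation.

disturbances = [-2, -1, 0, 1, 2]          # 5 kinds of environmental shock

def absorbed(responses):
    # outcome = d + chosen response; the goal state is outcome == 0
    return sum(1 for d in disturbances if -d in responses)

rich_regulator = {-2, -1, 0, 1, 2}        # variety matches the environment
poor_regulator = {-1, 0, 1}               # variety "optimized" away

print(absorbed(rich_regulator), "of", len(disturbances))  # 5 of 5
print(absorbed(poor_regulator), "of", len(disturbances))  # 3 of 5
```

No cleverness in choosing among three responses can cover five distinct disturbances; the shortfall is arithmetic, not a management failure.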
Outsourcing, Bureaucracy, and the Technology of Debt
Outsourcing and bureaucracy are variety-reduction technologies that lead to the "lobotomization" of organizations. Outsourcing replaces living information channels with dead, binary contracts, drastically narrowing the room for maneuver. Bureaucracy, meanwhile, standardizes individual cases, creating what Davies calls accountability sinks: places (such as call centers) where employees absorb emotions and complaints without any authority to change the underlying decisions. The system does not resolve the conflict; it merely localizes it where it cannot threaten management.
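Structurally, an accountability sink can be drawn as a missing reference: the frontline that receives the complaint holds no channel back to the layer that owns the decision. A crude sketch (all classes invented, not code from the book):

```python
# Sketch of an accountability sink (illustrative; all names invented).

class DecisionEngine:
    """Owns the decisions. Note: nothing below holds a reference to it."""
    def decide(self, case):
        return "rejected"             # the standardized, automated outcome

class CallCenter:
    """Absorbs complaints; has no channel back to DecisionEngine."""
    def __init__(self):
        self.absorbed = []            # complaints go in, nothing comes out

    def handle(self, complaint):
        self.absorbed.append(complaint)
        return "We're sorry. The decision is final."

engine, frontline = DecisionEngine(), CallCenter()
print(engine.decide("edge case"))              # rejected
print(frontline.handle("this ruins my life"))  # absorbed, never transmitted
# The conflict is localized at the frontline; the engine never learns of it.
```

The design works exactly as intended: the complaint is handled, in the sense that it goes nowhere.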
Modern cognitive filters, such as the Friedman doctrine, reduce the variety of organizational goals to a single number: profit. This mechanism is reinforced by the technology of debt, which acts like a jamming device: the only signal that cannot be ignored is timely repayment. Debt minimizes the creditor's cognitive costs but strips the debtor of the capacity to adapt and invest in competencies, turning management into the administration of coercion.
Polycrisis, Populism, and the Pathologies of Automation
Phenomena such as the polycrisis and populism are symptoms of a profound failure in control systems. Populism is an algedonic signal: a primal impulse of pain that smashes the institution's window when the normal channels of correction have been walled up. Accountability, in this view, is not a moral virtue but a property of the system: a relationship of agency that allows decisions to actually be changed. Its absence leads to catastrophe, as in the Horizon scandal at the British Post Office, where blind faith in a faulty IT system led to wrongful prosecutions and destroyed many lives.
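In Beer's vocabulary, an algedonic signal is a pain alarm that bypasses the management hierarchy and lands directly on System 5, the policy level. A schematic sketch (the threshold and messages are invented):

```python
# Schematic algedonic channel from Beer's VSM (threshold/names invented).

PAIN_THRESHOLD = 0.8

def route(signal, pain):
    if pain > PAIN_THRESHOLD:
        # bypass S2-S4 entirely: the alarm lands on the policy desk (S5)
        return f"S5 ALERT: {signal}"
    # normal path: filtered, summarized, and possibly attenuated away
    return f"S3 report queue: {signal}"

print(route("branch accounts don't balance", pain=0.3))    # absorbed by routine
print(route("we are prosecuting the innocent", pain=0.95)) # bypasses hierarchy
```

Populism, on this reading, is what happens when the only algedonic channel left is the ballot box or the street.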
Automation without a feedback loop corrodes the social fabric, because AI systems come to treat humans as data errors. Avoiding institutional insanity requires real human oversight, understood as the competence to intervene and to halt the automation. Without it, technology becomes a tool of organized thoughtlessness, in which no one feels guilty for systemically produced suffering.
Summary: The Institutionalization of the Exception
Repairing the architecture of accountability requires institutionalizing the exception. The exception should be treated not as a disturbance but as invaluable information about the limits of the world model within which the organization operates. Accountability is reborn only when the system acquires a "decision owner": someone with the authority to suspend procedure in the name of overriding values. Personalizing decisions is not a simple alternative, however; an individual lacks the bandwidth to replace a system. The solution lies in designing a feedback infrastructure that connects the lived world to the decision-making center.
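Read as an engineering pattern (a sketch under invented assumptions, not a design from the book), institutionalizing the exception means two things at once: every case the procedure cannot classify is escalated to a named decision owner with authority to override, and the exception itself is archived as information about where the world model breaks.

```python
# Sketch: the exception as information, plus a named decision owner
# (all names and rules are invented for illustration).

model_limits = []                     # exceptions archived as signal, not noise

def procedure(case):
    if case in ("standard", "routine"):
        return "approved by rule"
    raise LookupError(case)           # the world exceeded the world model

def decide(case, owner):
    try:
        return procedure(case)
    except LookupError:
        model_limits.append(case)     # feed the exception back as learning
        # a human, by name, with the authority to suspend the procedure
        return f"{owner} overrides: judged on its merits"

print(decide("standard", owner="A. Nowak"))
print(decide("wheelchair user, no web access", owner="A. Nowak"))
print(model_limits)                   # the map of where the model breaks
```

The archive is the point: it is the feedback infrastructure that lets the model be revised rather than defended.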
In the pursuit of optimization, are we condemning ourselves to institutional blindness? Rebuilding accountability in the age of polycrisis is an engineering task on a systemic scale. It requires creating space for the exception, reminding us that behind every form and algorithm, there is ultimately a human being.