Uncertainty: The Foundation and Raison d'Être of Statistics
The modern economy is an arena of decisions made amidst a flood of data. For Peter Kenny, statistics is not merely a collection of techniques but the art of dealing with uncertainty. Uncertainty is a necessary condition for the discipline's existence; its total elimination would also mean the end of freedom and innovation. This article analyzes how practical reason can transform a chaos of facts into statements of probability, offering managers a reasonable wager instead of metaphysical certainty. Readers will learn how to avoid interpretative pitfalls and why responsibility begins where algorithms end.
Data, Sampling, and the Paradox of Scale
The foundation of business analysis is the distinction between descriptive data (categorical, such as occupation) and numerical data (continuous, like height, or discrete, like the number of children). Statistics works on samples representing a population, and the reliability of conclusions is determined by the sampling method: from ideal random selection to systematic, stratified, and cluster sampling. The sample paradox is key: random error depends on the absolute number of observations, not on their proportion of the whole. A reliable survey of a thousand people therefore describes a population of one million with nearly the same precision as a population of ten million.
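The sample paradox can be checked directly. The sketch below (an illustration, not from Kenny's book) computes the 95% margin of error for a proportion estimated from a simple random sample, including the finite-population correction; the population size `N` barely matters once it dwarfs the sample:

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion p from a
    simple random sample of size n drawn from a population of size N."""
    se = z * math.sqrt(p * (1 - p) / n)        # standard sampling error
    fpc = math.sqrt((N - n) / (N - 1))         # finite-population correction
    return se * fpc

# A sample of 1,000 describes both populations almost equally well:
for N in (1_000_000, 10_000_000):
    print(f"N = {N:>10,}: margin of error = ±{margin_of_error(1000, N):.4f}")
```

Both runs print roughly ±0.031 (about three percentage points): the correction factor is so close to 1 for large populations that only the absolute sample size drives the error.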
Three Levels of Uncertainty and Logical Pitfalls
Kenny identifies three levels of uncertainty: primary data errors, processing distortions, and subjective model assumptions. Faith in dataism leads to an aporia revealed by the theses on certainty: if data provided certainty, management would lose its agency; yet if managers bear responsibility, the data cannot be fully representative. Another trap is the naive extrapolation of trends. Malthus's example teaches us that forecasts fail not because of calculation errors, but when the rules of reality change (e.g., a technological revolution), invalidating existing models.
Big Data, AI, and Global Development Models
Big Data and AI are revolutionizing statistics, shifting the emphasis from theory to empirical models derived from data. In market analysis, a crucial caveat remains: correlation is not causation. A strong relationship between variables often masks a third, hidden factor, illustrated by the anecdote about storks and birth rates. Globally, three approaches are visible: the USA focuses on radical pragmatism, the EU on a regulatory framework and ethics, and Arab countries on a hybrid of modernization and control. Business oscillates between technocratic optimism and skepticism toward automated decision-making.
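The stork anecdote is easy to reproduce in a simulation. In the hypothetical sketch below, "rurality" is the hidden third factor driving both stork counts and birth rates; the two variables correlate strongly, yet once the confounder is removed, the relationship vanishes:

```python
import random
import statistics

random.seed(42)

# Hypothetical confounder: degree of rurality of each district.
rurality = [random.gauss(0, 1) for _ in range(1000)]

# Both variables depend on rurality, not on each other.
storks = [z + random.gauss(0, 0.5) for z in rurality]
births = [z + random.gauss(0, 0.5) for z in rurality]

r = statistics.correlation(storks, births)
print(f"corr(storks, births) = {r:.2f}")   # strong, yet causally spurious

# Control for the confounder: correlate what remains after
# subtracting each district's rurality (its true contribution here).
res_storks = [s - z for s, z in zip(storks, rurality)]
res_births = [b - z for b, z in zip(births, rurality)]
print(f"partial corr = {statistics.correlation(res_storks, res_births):.2f}")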
Statistics as a Foundation for Ethics and Responsibility
Choosing a significance threshold is an ethical decision about the distribution of risk between Type I errors (false alarms) and Type II errors (missing a signal). It is important to remember that statistical significance does not always mean economic significance—minimal differences may not be cost-effective to implement. While Kenny’s approach is valuable, it has its limits: it overlooks power structures and the role of data in public discourse. Statistics without an ethical compass masks cracks in the illusion of order. True wisdom lies in the ability to perceive the uncertainty that defines our humanity.
📄 Full analysis available in PDF