Overview
The Milieu#
The period of modernism:
The period under study here belongs to a phase of science still dominated by developments in Europe. Some of the most representative theoretical achievements of that period include logic, probability, and quantum mechanics. […] Since the turn of the century, European science had been facing great changes in all directions. First of all, there is science itself, with the new discoveries in physics perhaps at the forefront. The classical nineteenth-century image of a ready-made mechanical-material world had been shaken by relativity, and the essential discontinuity of the microworld of atoms and quanta finally dislodged what remained of that image. But overthrowing old images was not peculiar to the sciences alone. In the arts, modernist ideas made their breakthrough: painters no longer depicted a 'given' reality, and received ideas of harmony were dismissed in music. Logical empiricism dispensed with the philosophies of the old school, declaring their problems outright meaningless. The period under study here coincides with what has been felt to be the 'golden era' of European science. The year 1933, when Kolmogorov's monograph appeared, also stands as a symbol for the beginning of the end of that era in political terms.
Modern probability came with the renouncement of classical probability, a renouncement virtually demanded by the historical trend of which the emergence of statistical science is one manifestation. The restrictions of classical probability became evident through the growing importance of statistics: statistical or frequentist probability, the dominant intuitive idea in the developments leading to modern probability, is not reducible to the classical notion.
- The shift to modern probability started around the turn of the 20th century (establishment of connections between probability and pure mathematics).
- By the late 1930s probability theory had become an autonomous part of mathematics (establishment of probability theory as an independent mathematical science).
- The underlying semantics of probabilistic language that secured the concepts of chance and statistical law for an autonomous mathematical theory of probability was provided by, among others, quantum mechanics (1925-1927) and, earlier, classical statistical physics (topics from statistical physics, quantum theory, and dynamical systems).
There is a completely different approach to probability, Bayesianism, or subjectivism. Here probability is understood as a measure of “degree of belief”, and probability theory as a systematic guide to “behavior under uncertainty”. The subjectivist theory has gained support mostly since the 1950s, but it was developed almost single-handedly by Bruno de Finetti in the late 1920s and early 1930s.
From Classical Probability to Modern Probability#
Classical Probability#
Classical probability theory was concerned with a finite number of alternative results of a trial. A composite event consists of several elementary events, and its probability is the sum of the probabilities of those elementary events. Continuous quantities do appear in the classical calculus of probability, in geometric probabilities and in limiting distributions, but random variables were handled through their distribution functions with elementary integration theory, without being led to the questions characteristic of modern probability.
Probability's earliest appearance was in connection with games of chance, where elementary events are treated symmetrically: each of the \(m\) elementary events is given the same probability \(1/m\). Equal probabilities are likewise assigned to sequences of repetitions, so that with \(n\) independent repetitions each sequence of outcomes has probability \((1/m)^n\). The calculus of combinatorics was developed to handle more complex cases.
The finitary classical calculus of probability is based on the classical interpretation of probability. There is supposed to be a finite number \(m\) of 'possible cases' which are judged 'equipossible' and hence equiprobable if "there is no reason to think the occurrence of one of them would be more likely than that of any other" (Laplace's rule of insufficient reason, the indifference argument).
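As a concrete illustration of the finitary calculus, here is a minimal sketch; the die, the events, and the numbers are illustrative choices of mine, not examples from the text.

```python
from fractions import Fraction
from itertools import product

# Classical setup: m equipossible elementary events, each with probability 1/m.
m = 6                                   # faces of a die (illustrative choice)
p_elementary = Fraction(1, m)

# A composite event is a set of elementary events; its probability is the sum
# of the probabilities of the elementary events it contains.
even_faces = {2, 4, 6}
p_even = len(even_faces) * p_elementary          # 1/2

# With n independent repetitions every sequence of outcomes has probability (1/m)^n,
# so a probability reduces to counting favourable sequences (combinatorics).
n = 3
sequences = list(product(range(1, m + 1), repeat=n))
favourable = [s for s in sequences if sum(s) == 10]   # "the three throws sum to 10"
p_sum_10 = Fraction(len(favourable), len(sequences))

print(p_even, p_sum_10)   # 1/2 and 27/216 = 1/8
```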
Emergence of connections between pure mathematics and probability#
Mathematics#
On the mathematical side, Borel and Lebesgue created the general theory of measure, whose applicability to probability was immediately evident. A real number is identified with a sequence of natural numbers, so that a probabilistic problem about limiting behavior, e.g. of relative frequency, could be formulated as a problem about the measure of a set of real numbers. On the other hand, the very concept of the real line provides the background for the emergence of modern infinitary probability, since a real number's being rational is an exceptional 'probability 0' property, and probability 0 does not always mean impossibility.
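The 'probability 0 but not impossible' point can be made precise with the standard covering argument, which the text does not spell out: the rationals in \([0,1]\) are countable, and covering the \(n\)-th rational by an interval of length \(\varepsilon/2^n\) gives

\[
\lambda\bigl(\mathbb{Q}\cap[0,1]\bigr) \;\le\; \sum_{n=1}^{\infty} \frac{\varepsilon}{2^{n}} \;=\; \varepsilon
\quad\text{for every } \varepsilon > 0,
\qquad\text{hence}\qquad
\lambda\bigl(\mathbb{Q}\cap[0,1]\bigr) = 0 ,
\]

yet a rational outcome is clearly possible, so measure 0 does not mean impossibility.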
Borel's first serious entrance into modern probability was motivated by a concern over the foundations of mathematics. His mathematical application of denumerable probability, the study of distribution problems in arithmetic sequences, managed to attract considerable interest from mathematicians (number theory, analysis, set theory). It was, however, also influenced by the astronomer Gyldén's problem regarding continued fraction expansions of real numbers, which originated in the perturbation calculations of planetary motions, and by Poincaré's recurrence theorem.
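Gyldén's problem concerned the statistical behaviour of the partial quotients in such continued fraction expansions. Here is a minimal sketch of how those partial quotients are computed; the function name is my own, and floating-point arithmetic limits it to the first several terms.

```python
def partial_quotients(x, n_terms=10):
    """First n_terms partial quotients of the continued fraction expansion of x."""
    quotients = []
    for _ in range(n_terms):
        a = int(x)              # integer part = next partial quotient
        quotients.append(a)
        frac = x - a
        if frac == 0:           # x was rational: the expansion terminates
            break
        x = 1.0 / frac          # continue with the reciprocal of the fractional part
    return quotients

# The golden ratio has expansion [1; 1, 1, 1, ...]; pi begins [3; 7, 15, 1, 292, ...]
print(partial_quotients((1 + 5 ** 0.5) / 2))
print(partial_quotients(3.141592653589793))
```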
A very similar pattern can be found in Hermann Weyl's work on the distribution problems of real numbers. His work on the equidistribution of reals mod 1 is actually a result about ergodicity, concerning whether the long-range behavior in time of a statistical-mechanical system can be determined in a certain way from its physical description; it also bears a close connection to the existence of mean motions in astronomical dynamics. Equidistribution of reals mod 1 can be pictured as marking points on the circle \(U(1)\) in succession, each separated from the previous one by the same arc length \(x\): if \(x\) is irrational, a dense set of points is produced. Weyl notes that there is a connection between such rotations and the epicyclic models of celestial motion of Ptolemy and others in antiquity (in effect, Fourier analysis).
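A numerical sketch of the rotation picture; the bin count and sample size are arbitrary illustrative choices.

```python
import math

# Rotation of the circle by a fixed arc x: the n-th point is the fractional part of n*x.
# For irrational x the points n*x mod 1 become equidistributed (Weyl); for rational x
# they cycle through finitely many positions.
def bin_frequencies(x, n_points=100_000, n_bins=10):
    counts = [0] * n_bins
    for n in range(1, n_points + 1):
        frac = (n * x) % 1.0
        counts[int(frac * n_bins)] += 1
    return [c / n_points for c in counts]

print(bin_frequencies(math.sqrt(2)))   # roughly 0.1 in every bin
print(bin_frequencies(0.25))           # all mass on the 4 positions 0, 0.25, 0.5, 0.75
```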
Physics#
The influence came, of course, from quantum mechanics and statistical physics.
Statistical physics:
There was always a tension between the classical mechanics that was supposed to be valid on the level of the atomic motions, and the macroscopic behavior of matter. Specifically, while mechanical processes are reversible and symmetric in time, heat processes obviously have a preferred direction, namely, toward the equalization of temperature differences. The problem of irreversibility became the crucial one for the kinetic theory. Probabilistic arguments were invented for reconciling the two levels with each other, that is, the levels of the mechanical molecular processes and of the macroscopic observable ones. These arguments, in turn, called for a more sophisticated probability mathematics.
It is natural to apply measure theory to the state space of a statistical mechanical system. Random events following each other in continuous succession came under consideration with Boltzmann's equation, the theory of Brownian motion, Planck's diffusion equation, and Chapman's equation, and through the development of the theories of random processes and, finally, stochastic processes.
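As a minimal illustration of random events following one another in succession, here is a scaled symmetric random walk, the standard discrete stand-in for Brownian motion; the step scaling \(\sqrt{dt}\) and the parameter values are the usual textbook choices, not something taken from the text.

```python
import random

# A symmetric random walk with steps of size sqrt(dt) is the simplest discrete
# approximation to Brownian motion: the positions at successive times form a
# stochastic process.
def random_walk_path(n_steps=10_000, dt=1e-4):
    position, path = 0.0, [0.0]
    step = dt ** 0.5                  # diffusive scaling of the step size
    for _ in range(n_steps):
        position += step if random.random() < 0.5 else -step
        path.append(position)
    return path

path = random_walk_path()
print(path[-1])   # a sample of the (approximately Gaussian) position at time n_steps * dt
```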
In attempting to formulate a purely probabilistic ergodic theory, Richard von Mises created a theory of probability in which randomness is a basic undefined concept, and probability is defined as the limit of relative frequency in a random sequence.
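A finite simulation can only exhibit the stabilisation of relative frequencies that von Mises' limit definition idealizes; the coin probability and the checkpoints below are arbitrary illustrative choices.

```python
import random

# Von Mises defines the probability of an attribute as the limit of its relative
# frequency along a random sequence (a 'collective'). A finite simulation can only
# show the frequencies settling down, not the limit itself.
def relative_frequencies(p=0.5, checkpoints=(10, 100, 1_000, 10_000, 100_000)):
    heads, observed = 0, {}
    for n in range(1, max(checkpoints) + 1):
        heads += random.random() < p      # one more trial of the 0/1 attribute
        if n in checkpoints:
            observed[n] = heads / n
    return observed

print(relative_frequencies())   # values drifting toward 0.5 as n grows
```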
This shift from a classical mechanical to a purely probabilistic theory took place some years after modern quantum mechanics was created. It was felt that through the probabilistic formulation, ergodic theory was freed of the threat posed by quantum mechanics. Oddly enough, these developments were triggered by the application of the mathematical methods of quantum theory to classical dynamics. The application was due to von Neumann, who was also the chief architect of what is known as the Hilbert space formalism of quantum mechanics.
Quantum mechanics: the story is familiar. But the following quote is interesting enough to reproduce:
Von Neumann's view of probability in physics has become standard. In classical physics probabilities are basically nonphysical, epistemic additions to the physical structure, a 'luxury' as von Neumann says, while quantum physics, in contrast, has probabilities which stem from the chancy nature of the microscopic world itself. Epistemic probability is a matter of 'degree of ignorance' or of opinion, if you permit. The quantum mechanical probabilities, instead, are computed out of the wavefunction so that no place seems to be left over at which the knowing subject could inject his ignorance. The standard view is that these two kinds of probabilities, the classical-epistemic and the quantum mechanical-objective, exhaust all possibilities. This view is known, after Popper, by its popular name as the propensity interpretation of probability. Whether his is a viable view of quantum mechanical probabilities can be discussed. But as an account of objective probability in general, the propensity theory raises two fundamental questions: 1. Why should quantum mechanics be the explanatory ground for all objective probabilities? and 2. Why should a mechanical basis turn statistical behavior into something apparent only, into a mere 'scheinbar statistisches Verhalten,' as von Neumann puts it? Was the reality of statistical law not precisely one of the lessons to be learned from classical statistical physics?
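The sense in which the quantum probabilities are 'computed out of the wavefunction' can be shown in a couple of lines; the two-state system and its amplitudes below are illustrative choices of mine, not taken from the text.

```python
import math

# For a state written in a measurement basis, the probability of each outcome is the
# squared modulus of the corresponding amplitude (the Born rule); no further
# 'degree of ignorance' enters the computation.
amplitudes = [complex(1 / math.sqrt(2), 0), complex(0, 1 / math.sqrt(2))]

probabilities = [abs(a) ** 2 for a in amplitudes]
print(probabilities, sum(probabilities))   # [0.5, 0.5], summing to 1 (within rounding)
```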
The Final Stages (1919-1933)#
In his Grundbegriffe der Wahrscheinlichkeitsrechnung of 1933, Kolmogorov gave the definitive formulation of modern probability in our present measure-theoretic sense. This is nowadays presented with a set-theoretic way of thinking about mathematical existence. But traditionally mathematical objects had to be constructive: for proto-intuitionists (Borel, etc.), all mathematical objects were required to be defined by a finite number of words, and it was part of the notion of function that it had to be computable (no one contemplated the existence of noncomputable functions, hence this was hardly spelled out). For them a sequence due to chance does not follow a mathematical law at all (now cf. Van der Waerden's theorem, Randomness, Chance, Coincidence, Lawlessness), and a sequence of integers was defined by defining a function that for each \(n\) gives the \(n\)-th member of the sequence. Borel had to find a way of representing chance mathematically, and hence he had to widen the requirement of finite definability.
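The measure-theoretic formulation can be stated in a line; this is the standard modern phrasing rather than Kolmogorov's original notation:

\[
(\Omega, \mathcal{F}, P):\quad
\mathcal{F}\ \text{a }\sigma\text{-algebra of subsets of }\Omega,\qquad
P:\mathcal{F}\to[0,1],\quad P(\Omega)=1,\qquad
P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr)=\sum_{i=1}^{\infty} P(A_i)\ \text{ for pairwise disjoint } A_i\in\mathcal{F}.
\]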
This was similar to the situation in physics. Mathematical laws were as strict as the deterministic laws of physics, and this led to difficulties in incorporating chance.
Then came Hilbert's idea of existence as consistency. Let chance produce an indefinitely, or even infinitely, long sequence of numbers. Existence as consistency says that there is some mathematical law that the sequence follows, though the law need not be knowable. The limits of mathematical existence were soon raised even higher by set theory's nondenumerable infinities. Many of the most important contributors to probability had reservations about the formalist or set-theoretic foundations of mathematics, including Borel, Weyl, and Kolmogorov himself, the latter two being Brouwerian intuitionists.