Things add up.
The second trend is even more prominent in measure theory, namely, the interplay between stages and actors.
By stages I mean number systems or, more generally, sets; by actors, functions and transformations.
In classical analysis, these two are kept separate and handled quite differently. Sets are like sites on which one constructs things, and functions are the buildings constructed on them; though closely related, the two remain distinct and do not interact.
In measure theory, however, things are different. The two interact intimately, and this interaction is substantial: it ties functions closely to the topology and other properties of the underlying space.
On the one hand, many functions are defined in terms of properties of sets or spaces, while others are constructed precisely to demonstrate something about the space; characteristic functions are a good example of both.
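Indeed, the characteristic function of a measurable set E encodes the set itself, and its integral recovers the measure of the set:

\[
\chi_E(x) = \begin{cases} 1, & x \in E, \\ 0, & x \notin E, \end{cases}
\qquad \int_X \chi_E \, d\mu = \mu(E).
\]

Here the actor (the function \(\chi_E\)) and the stage (the set \(E\)) are essentially two views of the same object.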
On the other hand, sets are manipulated with respect to certain functions. As an example, consider the standard trick of partitioning the whole space into preimages under a function. Another good example comes from the Riesz Representation Theorem for locally compact Hausdorff spaces and positive linear functionals, where the interplay of stages and actors is so apparent and plays such a central role.
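A concrete instance of this partitioning trick is the standard construction of simple approximations to a nonnegative measurable function \(f\): chop the range into dyadic intervals and pull them back to a partition of the space,

\[
E_{n,k} = f^{-1}\Big(\big[\tfrac{k}{2^n}, \tfrac{k+1}{2^n}\big)\Big),
\qquad
s_n = \sum_{k=0}^{n2^n - 1} \frac{k}{2^n}\,\chi_{E_{n,k}} + n\,\chi_{f^{-1}([n,\infty))},
\]

which yields simple functions \(s_n \uparrow f\) pointwise. The sets \(E_{n,k}\) exist only because of the function, and the function is then rebuilt from the sets.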
This phenomenon somehow reminds me of Einstein's General Relativity, which states that spacetime is not a fixed, constant background but is shaped by the energy and matter inside it.
The idea that, given any measure and any nonnegative measurable function, a new measure can be obtained by taking their product likewise resembles the principle in Special Relativity that length, area, and volume are subject to change across different frames of reference. But this is beyond the scope of this post.
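Concretely, the "product" in question is the measure with density \(f\): for a nonnegative measurable \(f\) on \((X, \mu)\), set

\[
\nu(E) = \int_E f \, d\mu,
\]

which is countably additive by the monotone convergence theorem; \(f\) plays the role of the density (the Radon–Nikodym derivative) of \(\nu\) with respect to \(\mu\).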
It occurred to me that measure theory, especially Lebesgue's theory, suggests some trends in the development of modern mathematical analysis.
By the first, I mean that, compared to their predecessors, modern mathematicians are more willing to sacrifice some control over objects (sets, functions, etc.) in return for more useful or elegant theories. Maybe this is their surrender after being tortured for so many years by wild creatures they themselves released, but this reluctant concession does lead to many deep results. After all, the whole point of Lebesgue's theory rests on the philosophy that some small things are negligible.
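The precise sense of "negligible" here is a set of measure zero: functions that agree off a null set are indistinguishable to the integral. If \(f = g\) \(\mu\)-almost everywhere, then

\[
\int_X f \, d\mu = \int_X g \, d\mu,
\]

so the theory deliberately gives up the ability to see what happens on null sets in exchange for robust limit theorems.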
Take Littlewood's famous three principles as an example:
There are three principles, roughly expressible in the following terms: Every set is nearly a finite sum of intervals; every function is nearly continuous; every convergent sequence of functions is nearly uniformly convergent.
To a layman, these "nearly" and "roughly" things do not sound very mathematical. After all, mathematics is all about rigor, and among all its subjects, analysis demands it most. It is difficult to imagine producing rigorous results while surrendering so much control to these nearlys and roughlys.
However, these principles are highly appreciated, and they demonstrate a depth of understanding possessed only by top-class mathematicians: those with a sense of how to perceive the abstract through intuition and how to balance the demands of rigor against the desire for beauty. Only after many years of training can one see that without some loss of control, elegance simply cannot be achieved.
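Each "nearly" does, in fact, have a fully rigorous formulation. Egorov's theorem, for instance, makes the third principle precise: if \(\mu(E) < \infty\) and \(f_n \to f\) almost everywhere on \(E\), then

\[
\forall \varepsilon > 0 \;\; \exists\, A \subseteq E \text{ measurable}: \quad \mu(E \setminus A) < \varepsilon \;\text{ and }\; f_n \to f \text{ uniformly on } A.
\]

The "nearly" is exactly the exceptional set of measure less than \(\varepsilon\), which the theory agrees to ignore.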
Similar examples are abundant: from convergence to convergence a.e. or in measure, from the strong topology to the weak topology, and so on. Without these relaxations, insight would be lost: many beautiful results would drown in ugly details, and others would never be discovered at all.
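For the record, the two relaxed modes of convergence mentioned above are:

\[
f_n \to f \text{ a.e.} \iff \mu\{x : f_n(x) \not\to f(x)\} = 0,
\]
\[
f_n \to f \text{ in measure} \iff \mu\{x : |f_n(x) - f(x)| \ge \varepsilon\} \to 0 \text{ for every } \varepsilon > 0.
\]

Both weaken pointwise convergence everywhere, and it is precisely this weakening that makes theorems like dominated convergence so widely applicable.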
And this is one of the trends of modern analysis: lose some control to achieve elegance and beauty.