Unorthodox Grigori Perelman (Fields Medal in mathematics); (Dec. 13, 2009)
The “Poincaré conjecture” (or hypothesis), stated in 1904, says: “The sphere is the only simply connected compact space in three dimensions”, meaning that any closed curve drawn on the sphere can be deformed continuously until it shrinks to a point. This conjecture occupied the research of many mathematicians, and the Russian mathematician Grigori Perelman finally demonstrated it completely; his proof was confirmed in 2006.
It is to be noted that the “Fermat conjecture” (Fermat’s Last Theorem) required over three and a half centuries to be demonstrated.
The French mathematician and physicist Henri Poincaré (1854-1912) is the founder of modern topology. In order to resolve problems in celestial mechanics, such as the N-body problem, Poincaré had to develop a new branch of mathematics called “analysis situs”, or the geometry of situation.
Topology studies the invariant properties of continuously deformable spaces. For example, a balloon can be deformed continuously into a rugby ball and then to the shape of a bowl.
Grigori Perelman was born in Russia in 1966, taught for several years at the University of California, Berkeley, and then returned to Russia in 1995. Perelman worked on the conjecture in secrecy and posted the first of his three articles on the internet in 2002.
Perelman has already declined two international prizes, including the highest in mathematics, the Fields Medal. The Clay foundation has one million dollars reserved for Perelman, and we are not sure whether he will decline the money too.
Actually, two more mathematicians working on the conjecture received the Fields Medal, in 1966 and in 1986. Stephen Smale demonstrated around 1960 the analogous statement for simply connected compact spaces of dimension five and higher.
In 1982, Michael Freedman demonstrated the conjecture in four dimensions. Topology gained many theories and tools, now standard, that enabled Perelman to build on them.
In the 1970s, William Thurston worked on the spaces of tangents to surfaces. At each point of the sphere, for example, the tangent vectors at that point can be added or multiplied by any number; these vector spaces are called “tangent spaces”. Associated to each tangent space is a scalar product that measures the lengths of vectors and the angles between pairs of them.
The set of scalar products at all points is called a “Riemannian metric”, and a space equipped with such a metric is called a “Riemannian manifold”. Thus, with this branch of mathematics we may define the distance between two points (on the sphere, say) as the length of the shortest curve that joins them, a “geodesic”. Riemannian metrics have two notable characteristics:
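As a small illustrative sketch (the function and variable names here are mine, not the post’s): on the unit sphere with its standard round metric, the geodesic distance between two points is simply the angle between them viewed as unit vectors from the center.

```python
import math

def great_circle_distance(p, q):
    """Geodesic (shortest-path) distance between two points of the unit
    sphere, given as 3D unit vectors: it equals the angle between them."""
    dot = sum(a * b for a, b in zip(p, q))
    # Clamp the dot product to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot)))

# From the north pole to a point on the equator: a quarter of a great
# circle, so the distance is pi/2.
north = (0.0, 0.0, 1.0)
equator = (1.0, 0.0, 0.0)
print(great_circle_distance(north, equator))  # ≈ 1.5708
```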
First, there are infinitely many possible metrics. The special local metrics of Euclidean, spherical, and hyperbolic spaces differ by the number of parallels to a straight line passing through an external point: there exists exactly one parallel in Euclidean space, none on a sphere, and infinitely many in a hyperbolic space.
Second, curvature measures the deviation of geodesics and angles. For example, the differences among the three kinds of spaces show up in the sum of the angles of a triangle: in Euclidean space it is exactly 180 degrees. If the curvature is independent of the point, we say the curvature is constant. Euclidean space has constant zero curvature (it is flat); the spherical geometry has positive curvature, and the hyperbolic one negative curvature.
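A quick numerical check of the spherical case (a sketch of my own; the names are illustrative): take the triangle cut out by the north pole and two equator points 90 degrees apart. Each of its three angles is a right angle, so the sum is 270 degrees, an excess of 90 degrees over the Euclidean plane that reflects the positive curvature.

```python
import math

def angle_at(a, b, c):
    """Angle of the spherical triangle abc at vertex a: the angle between
    the great circles a->b and a->c, measured in the tangent plane at a."""
    def tangent(v):
        # Project v onto the tangent plane at a, then normalize.
        d = sum(x * y for x, y in zip(v, a))
        t = [vi - d * ai for vi, ai in zip(v, a)]
        n = math.sqrt(sum(x * x for x in t))
        return [x / n for x in t]
    u, w = tangent(b), tangent(c)
    dot = sum(x * y for x, y in zip(u, w))
    return math.acos(max(-1.0, min(1.0, dot)))

# Octant triangle: north pole plus two equator points 90 degrees apart.
A, B, C = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
total = angle_at(A, B, C) + angle_at(B, A, C) + angle_at(C, A, B)
print(math.degrees(total))  # ≈ 270 degrees: a 90-degree excess over the flat case
```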
Work and research in geometry in the 20th century led to the following result: the geometry of surfaces in two dimensions is spherical on spheres, flat on torus-like shapes (such as a buoy or inner tube), and hyperbolic on all other surfaces. Thurston thus proposed his conjecture for spaces of three dimensions: “Any compact space of 3 dimensions can be decomposed into particular geometric pieces.”
Peter Scott showed that only 8 such geometries exist in 3 dimensions: the three kinds already met in 2 dimensions (Euclidean, spherical, and hyperbolic) and five geometries possessing important symmetry groups. Thus, it was sufficient to prove the existence of a spherical geometry on the simply connected compact 3-dimensional space. Thurston resolved his own conjecture in many important particular cases but failed to generalize. Analysis was thereby introduced as a tool into topology.
In the same year, 1982, Richard Hamilton introduced a method for minimizing a functional (analogous to an energy) that can be computed on the metrics. Hamilton built a family of metrics “g” by continuous deformation, in such a way that starting from any metric the process could lead to the spherical geometry.
This family of metrics depends on a parameter that can metaphorically be viewed as time. It satisfies the equation dg/dt = −2 Ric(g(t)), where Ric is a notion of curvature called the “Ricci curvature”, a measure of the difference in volume between Euclidean and Riemannian compact spaces. The equation is non-linear (meaning the sum of two solutions is not generally a solution); it is a partial differential equation. The equation admits a unique family of metrics equal to a given metric at time zero, but only for a short time: non-linear equations can diverge in finite time.
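In modern notation, and as a standard worked example added here for illustration (it is not in the original post), the round sphere shows why the flow must run into finite-time singularities: its Ricci curvature is proportional to the metric, so the flow shrinks it to a point in finite time.

```latex
\frac{\partial g}{\partial t} = -2\,\operatorname{Ric}\!\bigl(g(t)\bigr)

% On the round n-sphere of radius r, with g = r^2 g_{S^n} and
% \operatorname{Ric} = (n-1)\, g_{S^n}, the flow reduces to an ODE:
\frac{d}{dt}\, r(t)^2 = -2(n-1)
\quad\Longrightarrow\quad
r(t)^2 = r_0^2 - 2(n-1)\,t ,

% so the sphere collapses to a point at the finite time t = r_0^2 / (2(n-1)).
```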
The evolving scalar product defines how the metric varies with time and is called the “Ricci flow”. Applying this equation to a simply connected compact space of three dimensions should lead to a sphere. Hamilton developed many tools to describe the evolution of these metrics; one of these tools proved that if the starting metric has strictly positive Ricci curvature, then the compact simply connected manifold is a sphere. But a priori knowledge of the sign of the curvature is not among the premises of the Poincaré conjecture, because we must start from a compact manifold about which we know nothing.
Hamilton went even further in his research. He demonstrated that if the curvature becomes infinitely positive somewhere in finite time, then the Ricci flow stops and the metric is no longer Riemannian but is labeled “degenerate”.
To bypass these “singularities”, Hamilton invented the “surgical method”: slicing out the sections that stopped the Ricci flow and attaching standard objects to the sectioned parts in order to resume the flow. Examples of standard objects are cylinders in 3 dimensions and “caps” that resemble half-spheres.
Grigori Perelman described what happens in the vicinity of the points where the Ricci flow ends and used two models of standard objects to resume the flow. At points where the flow is normal we know nothing of the manifold. But if the curvature is large at every point of the surface, then the possible topologies are limited: the object is either a torus in 3 dimensions, if the pieces are cylinders that fit inside one another, or a sphere, if two caps are attached. If the object admits only one cap then it is not compact; if more than two caps were attached then the object is no longer connected.
Consequently, the Poincaré conjecture would be demonstrated if the curvature were large everywhere and only two caps were attached to the sliced sections. Perelman demonstrated that this procedure can be pursued without problems, because in finite time there are only finitely many “surgical” interventions.
Perelman’s proof also demonstrated Thurston’s conjecture. The proof of the Poincaré conjecture rested on a reasoning based on elementary topology, though it cannot confirm that the metric tends to the spherical one.
Richard Hamilton worked for decades on the Poincaré conjecture, and he received his share of prizes, but not the Fields Medal. He developed the methods and tools, yet he shrank from the insurmountable number of “singularities” to consider. Why? Is the computer not a good enough mathematical tool? Is experimenting with variability not within the orthodox mathematical culture of elegance, simplicity, and beauty of deduction?
I still don’t know why Perelman declined the many mathematical prizes, including the Fields Medal. Is it because the order of mathematicians insists on giving preference to pen-and-paper demonstrations, a “Greek” cultural bias? Is it because most mathematicians abhor getting “dirty” using analytical methods that require massive computations over the exhaustive possibilities, the “singularities”?
Maybe the order of mathematicians is ripe for a paradigm shift which states “All tools and methods that contribute to demonstrating mathematical theorems and problems are equally valid”.
Note: The Greek mathematicians constructed a mathematics different from that of the Mesopotamians and Egyptians, who relied on algorithms and techniques based on counting, pacing, and experimental alternatives. Thus, there is a cautious move toward using computers, computational facilities, and newer tools that smack of experimentation.
The order of mathematicians conforms most closely to the Greek cultural bias in solving problems; the last two centuries exacerbated this bias because Europe badly needed to relate to the Greek civilization. The mathematicians basing their ideology on “elegant, beautiful, and simple” deductive methods of proof have lost sight of the purpose of demonstrating theorems. The goal is to prove and move on, so that scientists may adapt theorems to their fields of study and move on in turn.
Efficiency has limits within cultural biases, and within mathematicians…
Efficiency has limits within cultural bias; (Dec. 10, 2009)
Posted December 11, 2009
Sciences that progressed so far have relied on mathematicians: many mathematical theories have proven to be efficacious in predicting, classifying, and explaining phenomena.
In general, fields of sciences that failed to interest mathematicians stagnated or were shelved for periods; maybe with the exception of psychology.
People wonder how a set of abstract symbols that are linked by precise game rules (called formal language) ends up predicting and explaining “reality” in many cases.
Biology has recently received a new invigorating shot: a few mathematicians got interested in working, for example, on the patterns of butterfly wings and mammalian furs using partial differential equations, but nothing of real value is yet expected to further the interest in biology.
Economics, mainly for market equilibrium, has applied methods adapted from dynamical systems, game theory, and topology. Computer science is also attracting some interest.
“Significant mathematics”, those theories that offer classes of invariants relative to operations, transformations, and relationships, almost always finds applications in the real world: it generates new methods and tools, such as group theory and the theory of functions of a complex variable.
For example, knot theory was connected to many applied domains because of its rich manipulation of “mathematical objects” (such as numbers, functions, or structures) that remain invariant when the knot is deformed.
What is the main activity of a modern mathematician?
First of all, they carry out the systematic organization of classes of “mathematical objects” that are equivalent under transformations; for example, surfaces up to homeomorphism, that is, invariant under continuous (plastic) deformations.
There are several philosophical groups within mathematicians.
1. The Pythagorean mathematicians admit that natural numbers are the foundations of the material reality that is represented in geometric figures and forms. Their modern counterparts affirm that real physical structure (particles, fields, and space-time…) is identically mathematical. Math is the expression of reality and its symbolic language describes reality.
2. The “empirical mathematicians” construct models of empirical (experimental) results. They know in advance that their theories are linked to real phenomena.
3. The “Platonist mathematicians” conceive the universe of their ideas and concepts as independent of the world of phenomena. At best, the sensed world is but a pale reflection of their ideas. Their ideas were not invented but are as real though not directly sensed or perceived. Thus, a priori harmony between the sensed world and their world of ideas is their guiding rod in discovering significant theories.
4. There is a newer group of mathematicians who are not afraid of getting “dirty” by experimenting (analytic methods), crunching numbers, adapting to new tools such as the computer, and performing surgery on geometric forms.
This new brand of mathematicians does not care to be limited within the “Greek” cultural bias of doing mathematics: they are ready to try the Babylonian and Egyptian cultural way of doing math, by computation, pacing lands, and experimenting across various branches of mathematics (for example, Perelman, who proved the conjecture of Poincaré with “unorthodox” techniques, and Gromov, who gave geometry a new life and believes the computer to be a great tool for theories that do not involve probability).
Explaining phenomena leads to generalization (reducing a diversity of phenomena, even in disparate fields of science, to a few fundamental principles). Mathematics extends new concepts or strategies for resolving difficult problems that require the collaboration of various branches of the discipline.
For example, the theory elaborated by Hermann Weyl in 1918 to unify gravity and electromagnetism led to “gauge” theory (a cornerstone of modern quantum physics), though the initial theory failed to predict experimental results.
String theory and non-commutative geometry generated new horizons even before they were verified against empirical results.
Axioms and propositions used in different branches of mathematics can be combined to develop new concepts of sets, numbers, or spaces.
Historically, mathematics was never “empirically neutral”: theories required significant work of translation and adaptation before formal descriptions of phenomena could be validated.
Thus, mathematical formalism was acquired bit by bit from the empirical world. For example, the theory of general relativity was effective because it relied on the formal description of invariant tensor calculus, combined with a fundamental equation related to Poisson’s equation in the classical theory of potential.
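For readers who want the formulas behind that remark (standard physics, added here for illustration): Einstein’s field equations reduce, in the weak-field and slow-motion limit, to Poisson’s equation for the classical gravitational potential.

```latex
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
\qquad\longrightarrow\qquad
\nabla^2 \Phi = 4\pi G\, \rho
```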
The same process of adaptation was applied to quantum mechanics, which relied on the algebra of operators combined with Hilbert space theory, and was then applied to the atomic spectrum.
In order to comprehend the efficiency of mathematics, it is important to master the production of mental representations, such as ideas, concepts, images, analogies, and metaphors, that are capable of yielding rich invariants.
Thus, the discovery of the empirical world is done both ways:
First, the learning processes of the senses and
Second, the acquisition processes of mathematical modeling.
Mathematical activities are extensions of our power of perception, written in a symbolic formal language.
Natural sciences grabbed the interest of mathematicians because they managed to extract invariants in natural phenomena.
So far, mathematicians are wary of looking into the invariants of the much more complex human and social sciences. Maybe if they try to find analogies of invariants between the natural and human worlds, then a great first incentive would enrich new theories applicable to fields of vast variability.
It appears that “significant mathematics” basically decodes how the brain perceives invariants in what the senses transmit to it as signals of the “real world”. For example, Stanislas Dehaene opened the way to comprehending how elementary mathematical capacities are generated from the neuronal substrate.
I conjecture that, since individual experiences are what generate intuitive concepts, analogies, and various perspectives for viewing and solving problems, most of the useful mathematical theories were essentially founded on visual and auditory perception.
New breakthroughs in significant theories will emerge when mathematicians start mining the brain’s processes for the other senses (there are far more than the usual six senses). Obviously, the interested mathematician must have exercised his developed senses and experimented with them as hobbies, in order to work on decoding their valuable processes and so construct this seemingly “coherent world”.
A wide range of interesting discoveries about human capabilities could be uncovered if mathematicians dared to direct scientists and experimenters toward the additional varieties of parameters and variables in cognitive processes and brain perception that their theories predict.
Mathematicians have to expand their horizons: the cultural bias toward what is Greek has its limits. It is time to take serious attempts at number crunching, complex computations, complex sets of equations, and adapting to newer available tools.
Note: I am inclined to associate algebra with deductive processes and generalization at the macro-level, while viewing analytic solutions as belonging to the realm of inference by manipulating and controlling possibilities (“singularities”); it is a sort of experimenting with rare events.