Adonis Diaries


Unorthodox Grigori Perelman (Fields Medal of mathematics); (Dec. 13, 2009)

The “Poincaré conjecture”, stated in 1904, says: “The sphere is the only simply connected compact space in three dimensions” — the only such space on which any closed curve can be deformed continuously until it is reduced to a point. This conjecture drove the research of many mathematicians. The Russian mathematician Grigori Perelman finally demonstrated it completely; the verification of his proof was finished in 2006.
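In modern terminology (a standard textbook formulation, not the blog’s own wording), the conjecture reads:

```latex
% Poincare conjecture (1904), modern statement:
% every closed, simply connected 3-manifold is the 3-sphere.
M \text{ compact, without boundary, and } \pi_1(M) = 0
\;\Longrightarrow\; M \text{ is homeomorphic to } S^3
```

Here \(\pi_1(M) = 0\) (“simply connected”) is exactly the condition that every closed curve on \(M\) can be shrunk continuously to a point.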

It is to be noted that the “Fermat conjecture” required about three and a half centuries to be demonstrated.

The French mathematician and physicist Henri Poincaré (1854-1912) is the founder of modern topology.  In order to resolve problems in celestial mechanics, such as the N-body problem, Poincaré had to develop a new branch of mathematics called “analysis situs”, or geometry of situation.

Topology studies the properties of spaces that are invariant under continuous deformation. For example, a balloon can be deformed continuously into a rugby ball and then into the shape of a bowl.

Grigori Perelman was born in Russia in 1966, held research positions for several years in the United States, including at the University of California, Berkeley, and then returned to Russia in 1995.  Perelman worked on the conjecture in secrecy and then posted the first of his three articles on the internet in 2002.

Perelman has already declined two international prizes, including the highest in mathematics, the Fields Medal.  The Clay foundation has one million dollars reserved for Perelman, and we are not sure whether he will decline that money too.

Actually, two more mathematicians working on the conjecture received the Fields Medal, in 1966 and in 1986. Stephen Smale demonstrated in 1960 that the analogous conjecture holds in all compact simply connected spaces of 5 or more dimensions.

In 1982, Michael Freedman demonstrated the conjecture in four dimensions. Topology thereby gained many theories and tools, now standard, that Perelman was able to build on.

In the 1970s, William Thurston worked on the spaces of tangents to surfaces. At each point of the sphere, for example, the vectors tangent to the surface at that point can be added together or multiplied by any number; these vector spaces are called “tangent spaces”. Associated to each tangent space is a scalar product that measures lengths and the angular deviation between pairs of vectors.

The set of scalar products at all points is called a “Riemannian metric”, and a space equipped with such a metric is called a “Riemannian manifold”. With this branch of mathematics we may define a distance between two points (on the sphere, say) as the length of the shortest curve that joins them, a “geodesic”.  These metrics have two characteristics:

First, there are infinitely many possible metrics.  The special local metrics of Euclidean, spherical, and hyperbolic spaces differ by the number of parallels to a straight line passing through an external point: there exists exactly one parallel in Euclidean space, none on a sphere, and infinitely many in hyperbolic space.

Second, curvature measures the deviation of geodesics and angles. For example, the differences among the three kinds of spaces show up in the sum of the angles of a triangle: in Euclidean space it is exactly 180 degrees.  If the curvature is independent of the point, we say the curvature is constant. For Euclidean space the constant curvature is nil, or flat; in the spherical variety it is positive, and for the hyperbolic it is negative.
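A small numerical illustration of the spherical case (my own example, not from the article): on the unit sphere, the triangle whose vertices lie on the three coordinate axes has three right angles, so its angle sum is 270 degrees rather than the Euclidean 180.

```python
import math

def tangent(a, b):
    """Unit tangent at point a (on the unit sphere) toward point b:
    project b onto the plane orthogonal to a, then normalize."""
    d = sum(x * y for x, y in zip(a, b))       # a . b
    t = [y - d * x for x, y in zip(a, b)]      # b - (a.b) a
    n = math.sqrt(sum(x * x for x in t))
    return [x / n for x in t]

def vertex_angle(a, b, c):
    """Spherical angle at vertex a of the geodesic triangle a-b-c."""
    u, v = tangent(a, b), tangent(a, c)
    return math.degrees(math.acos(sum(x * y for x, y in zip(u, v))))

# Octant triangle: one vertex on each coordinate axis.
A, B, C = [1, 0, 0], [0, 1, 0], [0, 0, 1]
total = vertex_angle(A, B, C) + vertex_angle(B, A, C) + vertex_angle(C, A, B)
print(total)  # 270.0 -- exceeds 180, as positive curvature predicts
```

On a hyperbolic surface the same computation would give a sum below 180 degrees.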

Work and research in geometry in the 20th century led to the following result: the geometry of surfaces in two dimensions is spherical on spheres, flat on shapes such as a buoy (a torus), and hyperbolic on all other surfaces.  Thurston then proposed his conjecture on spaces of three dimensions: “Any compact space of 3 dimensions can be decomposed into particular geometric pieces.”

Peter Scott showed that only 8 possible geometries in 3 dimensions existed: the three kinds already present in 2 dimensions (Euclidean, spherical, and hyperbolic) and five geometries possessing important symmetry groups.  Thus, it was sufficient to prove the existence of a spherical geometry on the simply connected compact 3-dimensional space.  Thurston resolved his own conjecture in many important particular cases but failed to generalize.  Analytical methods were then introduced as tools in topology.

In the same year of 1982, Richard Hamilton extended a method of minimizing a functional (analogous to an energy) that can be computed on these spaces.  Hamilton built a family of metrics “g” by continuous deformation, in such a way that starting from any metric it could lead to the spherical geometry.

This family of metrics depends on a parameter that can metaphorically be viewed as time. It satisfies the equation dg/dt = −2 Ric(g(t)), where Ric is a notion of curvature called the “Ricci curvature”, a measure of the difference in volume between balls in Euclidean and in Riemannian compact spaces. The equation is non-linear (meaning the sum of two solutions is not generally a solution); it is a partial differential equation.  The equation admits a unique family of metrics equal to a given metric at time zero, but only for a short time: non-linear equations can diverge in finite time.
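As a toy illustration of the equation (my own sketch, not part of the article): on a perfectly round 3-sphere the Ricci curvature is constant — Ric(g) = 2·g_unit for any radius, since Ricci curvature does not change under constant rescaling — so dg/dt = −2 Ric(g(t)) collapses to a single ordinary differential equation ds/dt = −4 for the squared radius s = r². The sphere shrinks and vanishes at the finite time s₀/4, the simplest example of the flow ending.

```python
# Ricci flow on the round 3-sphere, reduced to ds/dt = -4 where s = r^2.
# Exact solution: s(t) = s0 - 4t, extinction at t = s0 / 4.

def ricci_flow_sphere(s0, t_end, steps):
    """Forward-Euler integration of ds/dt = -4 (toy model)."""
    s, dt = s0, t_end / steps
    for _ in range(steps):
        s += dt * (-4.0)
    return s

s0 = 1.0
s_num = ricci_flow_sphere(s0, 0.2, 10_000)
s_exact = s0 - 4 * 0.2
print(s_num, s_exact)  # both ~0.2; extinction would occur at t = 0.25
```

On a general compact 3-manifold the right-hand side varies from point to point, which is what makes the actual equation a hard non-linear PDE rather than this one-line ODE.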

The equation defines how the metric varies with time and is called the “Ricci flow”.  Applying this equation to a simply connected compact space of three dimensions should lead to a sphere.  Hamilton developed many tools to describe the evolution of these metrics; one of the tools proved that if the starting metric has a strictly positive Ricci curvature, then the compact simply connected manifold is a sphere. But a priori knowledge of the sign of the curvature was not within the premises of the Poincaré conjecture, because we have to start from a compact manifold about which we know nothing.

Hamilton went even further in his research. He demonstrated that if the curvature becomes infinitely positive with time, then the Ricci flow stops and the metric is no longer Riemannian but is labeled “degenerate”.

To bypass these “singularities”, Hamilton invented the “surgical method”: slicing out the sections that stopped the Ricci flow and attaching standard objects to the sectioned parts in order to resume the flow.  Examples of standard objects are cylinders in 3 dimensions and “caps” that resemble half-spheres.

Grigori Perelman described what happens in the vicinity where the Ricci flow ends, and used two models of standard objects to resume the flow. At points where the flow is normal we know nothing of the manifold. But if the curvatures are large at every point of the surface, then the possible topologies are limited: the object is either a torus in 3 dimensions, if the cylinders fit inside one another, or a sphere, if two caps are attached. If the object admits only one cap then it is not compact; if more than two caps were attached then the object is no longer connected.

Consequently, the Poincaré conjecture was demonstrated if the curvatures are large everywhere and exactly two caps are attached to the sliced sections. Perelman demonstrated that we can pursue this procedure without problems, because in finite time there are only finitely many “surgical” interventions.

Perelman’s proof also demonstrated Thurston’s conjecture. The proof of the Poincaré conjecture was done using reasoning based on elementary topology, but it cannot confirm that the metric tends to the spherical one.

Richard Hamilton worked for 30 years on the Poincaré conjecture, and he received his share of prizes, but not the Fields Medal.  He developed the methods and tools; yet he shrank from the insurmountable number of “singularities” to consider. Why? Is the computer not a good enough mathematical tool? Is experimenting with variability not within the orthodox mathematical culture of elegance, simplicity, and beauty of deduction?

I still don’t know why Perelman declined the many mathematical prizes, including the Fields Medal.  Is it because the order of mathematicians insists on giving preference to pen-and-paper demonstrations, the “Greek” cultural bias?  Is it because most mathematicians abhor getting “dirty” using analytical methods that require massive computations over the exhaustive possibilities, or “singularities”?

Maybe the order of mathematicians is ripe for a paradigm shift which states “All tools and methods that contribute to demonstrating mathematical theorems and problems are equally valid”.

Note: The Greek mathematicians constructed a mathematics different from that of the Mesopotamians and Egyptians, who relied on algorithms and techniques based on counting, pacing, and experimental alternatives.  Thus, there is a cautious move toward using computers, computational facilities, and newer tools that smack of experimentation.

The order of mathematicians is most conforming to the Greek cultural bias in solving problems; the last two centuries exacerbated this bias because Europe badly needed to relate to the Greek civilization.  The mathematicians basing their ideology on “elegant, beautiful, and simple” deductive methods of proof have lost sight of the purpose of demonstrating theorems.  The goal is to prove and move on, so that scientists may adapt theorems to their fields of study and move on in turn.

How random and unstable are your phases? (Dec. 7, 2009)

There are phenomena in the natural world that behave randomly, or at least seem chaotic, such as percolation and the “Brownian motion” of gases.  The study of phases in equilibrium in chaotic, random, and unstable physical systems was analyzed first by physicists and then taken up by modern mathematicians.

The mathematician Wendelin Werner (Fields Medal) researched how the borders that separate two phases in equilibrium in random, unstable physical systems behave; he published “Random Planar Curves…”

Initially, the behavior of identical elements (particles) in large numbers might produce deterministic or random results, depending on the case.

For example, if we toss a coin many times we might guess that heads and tails will occur in nearly equal numbers; the trick is that we cannot say whether heads or tails will end up in the majority.

These probabilistic situations inspired the development of purely mathematical tools.  The curves between phases in equilibrium appear to be random, but they have several characteristics:

First, the curves have self-similarity, which means that the study of a small portion can be generalized to the macro-level with the same properties, as with “fractal curves”;

The second characteristic is that even if the general behavior is chaotic, a few properties remain the same (mainly, the random curves have the same “fractal dimension”, or degree of irregularity);

The third is that these systems are very unstable (unlike the game of heads and tails), in the sense that changing the behavior of a small portion leads, by propagation, to large changes on a big scale.  Thus, these systems are classified mathematically as belonging to theories of infinite complexity.

Themes of unstable and random systems were first studied by physicists, a few of whom received Nobel Prizes, such as Kenneth Wilson in 1982.

The research demonstrated that such systems are “invariant” under transformations (the physicists used the term renormalization) that permit passage from one scale to a superior scale.  A concrete example is percolation.

Let us take a net resembling a beehive, where each cavity (alveolus) is colored black or red by flipping an unbiased coin. Then we study how these cells are connected randomly on a plane surface.
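A minimal percolation sketch (my own, using a square lattice with 4 neighbours instead of the article’s honeycomb cells, for simplicity): colour each cell by a fair coin flip, then collect the connected clusters of black cells.

```python
import random
from collections import deque

random.seed(0)
L = 50  # lattice side; a honeycomb version would differ only in the
        # neighbour list (6 neighbours instead of 4)

# Colour each cell black (True) or red (False) with a fair coin flip.
grid = [[random.random() < 0.5 for _ in range(L)] for _ in range(L)]

def cluster_sizes(grid):
    """Sizes of connected black clusters (4-neighbour connectivity, BFS)."""
    n = len(grid)
    seen = [[False] * n for _ in range(n)]
    sizes = []
    for i in range(n):
        for j in range(n):
            if grid[i][j] and not seen[i][j]:
                size, q = 0, deque([(i, j)])
                seen[i][j] = True
                while q:
                    x, y = q.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < n and 0 <= ny < n \
                           and grid[nx][ny] and not seen[nx][ny]:
                            seen[nx][ny] = True
                            q.append((nx, ny))
                sizes.append(size)
    return sizes

sizes = cluster_sizes(grid)
print(len(sizes), max(sizes))  # many clusters, with a wide spread of sizes
```

The borders between the black and red regions of such a picture are the random interfaces whose scaling limit Smirnov and Werner studied.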

The Russian Stanislav Smirnov demonstrated that the borders exhibit “conformal invariance”, a concept developed by Bernhard Riemann in the 19th century using complex numbers. “Conformal invariance” means that it is always possible to warp a rubber disk covered with a thin crisscross pattern so that lines that intersect at right angles before the deformation still intersect at right angles after the deformation.  The set of transformations that preserve angles is large and can be written as power series, a kind of polynomial of infinite degree. The transformations in the percolation problem locally preserve the proportions of distances, i.e. similitudes.
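Angle preservation can be checked numerically for any holomorphic map with non-vanishing derivative; here is a small sketch (my own example, using f(z) = z², which is not from the article). Two tiny steps leaving a point at 60 degrees to each other are mapped to two steps that still meet at 60 degrees.

```python
import cmath, math

def f(z):
    """A sample holomorphic map; any map with f'(z0) != 0 behaves the same."""
    return z * z

def angle_between(u, v):
    """Unsigned angle between two nonzero complex directions."""
    return abs(cmath.phase(v / u))

z0 = 1 + 1j
d1 = 1 + 0j
d2 = cmath.exp(1j * math.pi / 3)   # a direction 60 degrees from d1
h = 1e-6                           # tiny step length

# Images of the two tiny steps, as seen from f(z0).
u = f(z0 + h * d1) - f(z0)
v = f(z0 + h * d2) - f(z0)

before = angle_between(d1, d2)
after = angle_between(u, v)
print(math.degrees(before), math.degrees(after))  # both ~60 degrees
```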

The late Oded Schramm had this idea: suppose two countries share a disk, one country controlling the left part and the other the right part, and suppose that the common border crosses the disk. If we investigate a portion of the common border, we want to forecast the behavior of the next portion.

This task requires iterating random conformal transformations and computing the fractal dimension of the interface. We learn that random behavior on the micro-level exhibits the same behavior on the macro-level; thus, resolving these problems requires algebraic and analytical tools.

The other case is “Brownian motion”, which consists of trajectories where each displacement is independent of the previous displacement (stochastic behavior).  The interfaces of Brownian motion are different in nature from percolation systems.
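A random-walk sketch of Brownian motion (my own illustration; the step rule and sample sizes are arbitrary choices): each step is drawn independently of the past, and the mean squared displacement after n steps grows like n, the diffusive scaling characteristic of Brownian trajectories.

```python
import random

random.seed(1)

def walk(n):
    """2-D simple random walk: each step independent of all previous ones."""
    x = y = 0
    for _ in range(n):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return x, y

# Diffusive scaling: mean squared displacement after n steps is about n.
n, trials = 500, 1000
msd = sum(x * x + y * y for x, y in (walk(n) for _ in range(trials))) / trials
print(msd)  # close to n = 500
```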

Usually, mathematicians associate a probability, a “critical exponent” or intersection exponent, to the event that two motions never meet, at least for a long time.  Two physicists, Duplantier and Kyung-Hoon Kwan, extended the idea that these critical exponents belong to a table of numbers of algebraic origin. The mathematical demonstration of the “conjecture” (hypothesis) of Benoit Mandelbrot on fractal dimension used the percolation interface system.

Werner said: “With the collaboration of Greg Lawler we progressively comprehended the relationship between the interfaces of percolation and the borders of Brownian motion.  Armed with Schramm’s theory, we knew that our approach was going to work and prove the conjecture related to Brownian motion.”

Werner went on: “It is unfortunate that the specialized media failed to mention the great technical feat of Grigori Perelman in demonstrating the Poincaré conjecture.  His proof was not your run-of-the-mill deductive process of progressive purging and generalization; it was an analytic and human proof, where hands get dirty in order to control a bundle of possible singularities.”

These kinds of demonstrations require good knowledge of “underlying phenomena”.

As to what he considers a difficult problem, Werner said: “I take a lattice and then count the paths of length n that do not pass twice through any particular junction. This number increases exponentially with n; we think there is a corrective term of the type n to the power 11/32.  We can now guess the reason for that term, but we cannot demonstrate it so far.”
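The paths Werner describes are known as self-avoiding walks. A brute-force count on the square lattice (my own sketch, not from the article) reproduces the first few values; the conjecture he alludes to is that the count behaves like μⁿ·n^{11/32} for a lattice-dependent growth constant μ.

```python
def count_saw(n):
    """Count self-avoiding walks of length n on the square lattice,
    starting from the origin (brute-force enumeration)."""
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def extend(x, y, visited, remaining):
        if remaining == 0:
            return 1
        total = 0
        for dx, dy in steps:
            nxt = (x + dx, y + dy)
            if nxt not in visited:       # never revisit a junction
                visited.add(nxt)
                total += extend(nxt[0], nxt[1], visited, remaining - 1)
                visited.remove(nxt)
        return total

    return extend(0, 0, {(0, 0)}, n)

print([count_saw(n) for n in range(1, 6)])  # [4, 12, 36, 100, 284]
```

The enumeration cost itself grows exponentially, which is why such corrective exponents are guessed from theory and simulation rather than read off from exhaustive counts.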

Once the capacity to predict the behavior of a phenomenon by studying a portion of it yields a recognized invariant, a theory can most probably find counterparts in the real world; for example, virtual-image techniques use invariance among objects. It has been argued that vision is an operation of the brain exploiting geometric invariances that are characteristic of the image we see.

Consequently, stability in repeated signals generates the perception of reality.  In math, the corresponding rules are called “covariance laws”, which describe how quantities change when the system of reference is changed: for example, the Galileo transformations in classical mechanics and the Poincaré transformations in special relativity.

In a sense, math is codifying the processes of sensing by the brain using symbolic languages and formulations.


adonis49
