Adonis Diaries

Posts Tagged ‘Gauss’

Einstein speaks on General Relativity; (Nov. 20, 2009)

I have already posted two articles in the series “Einstein speaks on…” This article describes Einstein’s theory of restricted relativity and then his concept of General Relativity. It is a theory meant to extend the physics of fields (for example, electrical and magnetic fields, among others) to all natural phenomena, including gravity. Einstein declares that there was nothing speculative in his theory: it was adapted to observed facts.

The fundamentals are that the speed of light is constant in the void and that all systems of inertia are equally valid (each system of inertia has its own metric time). Michelson’s experiment demonstrated these fundamentals. The theory of restricted relativity adopts the continuum of space coordinates and time as absolute, since they are measured by clocks and rigid bodies, but with a twist: the coordinates become relative because they depend on the movement of the selected system of inertia.

The theory of General Relativity is based on the verified numerical correspondence of inertial mass and weight. This insight emerges when coordinate systems possess relative accelerations with respect to one another; thus each system of inertia has its own field of gravitation. Consequently, the movement of solid bodies does not correspond to Euclidean geometry, nor does the movement of clocks. The coordinates of space-time are no longer independent. This new kind of metric already existed mathematically, thanks to the works of Gauss and Riemann.

Ernst Mach realized that in classical mechanics movement is described without reference to its causes; thus, there are no movements except in relation to other movements. In that case, acceleration in classical mechanics can no longer be conceived as relative movement; Newton had to imagine a physical space in which acceleration would exist, and he logically announced an absolute space that did not satisfy him but that worked for two centuries. Mach tried to modify the equations so that they could be used in reference to a space represented by the other bodies under study. Mach’s attempts failed, given the scientific knowledge of his time.

We know that space is influenced by the surrounding bodies, and so far I cannot see how General Relativity may surmount this difficulty satisfactorily except by considering space as a closed universe, assuming that the average density of matter in the universe has a finite value, however small it might be.

Einstein speaks on his thought processes on the origin of General Relativity; (Nov. 21, 2009)

This article is on how Einstein described the thought processes that led him to the theory of restricted relativity and then to his concept of General Relativity. In 1905, restricted relativity established the equivalence of all systems of inertia for formulating the equations of physics.

From a kinematic perspective, there was no way of doubting relative movements. Still, there was a tendency among physicists to extend a privileged physical significance to systems of inertia. The question was: “If speed is relative, do we then have to consider acceleration as absolute?”

Ernst Mach considered that inertia does not resist acceleration except in relation to the acceleration toward other masses. This idea impressed Einstein greatly. Einstein said: “First, I had to establish a law of the gravitational field and suppress the concept of absolute simultaneity. Simplicity urged me to maintain Laplace’s scalar gravity potential and to fine-tune Poisson’s equation.

Given the theorem of the inertia of energy, inertial mass must depend on the gravitational potential; but my research left me skeptical. In classical mechanics, vertical acceleration in a vertical field of gravity is independent of the horizontal component of velocity; it follows that vertical acceleration is exercised independently of the internal kinetic energy of the body in movement.

I discovered that this independence did not exist in my draft theory; this evidence did not coincide with the affirmation that all bodies undergo the same acceleration in a gravitational field. Thus, the principle of equality between inertial mass and weight grew in striking significance. I was convinced of its validity, though I had no knowledge of the results of the experiments done by Eötvös.”

Consequently, the principle of equality between inertial mass and weight can be stated as follows: in a homogeneous gravitational field, all movements are executed as they would be, in the absence of any gravitational field, in relation to a uniformly accelerating system of coordinates. I conjectured that if this principle holds for any events whatsoever, then it can also be applied to systems of coordinates that are not uniformly accelerating.
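As a one-line sketch (my own illustration in modern notation, not Einstein’s wording), take the idealized case of a homogeneous field of strength g:

```latex
% Newton's law for a body falling in a homogeneous gravitational field g:
\ddot{x} = -g
% In coordinates attached to a uniformly accelerating frame,
%   X = x + \tfrac{1}{2} g t^{2},
% the same motion reads
\ddot{X} = \ddot{x} + g = 0 ,
% i.e. force-free motion: the homogeneous field has been transformed away.
```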

These reflections occupied me from 1908 to 1911, and I figured that the principle of relativity needed to be extended (equations should retain their forms under non-uniform accelerations of the coordinates) in order to account for a rational theory of gravitation; the direct physical interpretation of the coordinates (as measured by rules and clocks) had to go.

I reasoned that even if, in reality, “a field of gravitation within a system of inertia” did not exist, the Galilean statement that “a material point in 4-dimensional space is represented by the shortest straight line” could still be preserved. Minkowski had demonstrated that this metric, the square of the line element, is a function of the squares of the coordinate differentials. If I introduced other coordinates by a non-linear transformation, the line element stays homogeneous provided coefficients dependent on the coordinates are added to the metric (this is the Riemann metric in a 4-dimensional space not subjected to any gravitational field). Thus, these coefficients describe the field of gravity in the selected system of coordinates, and the physical significance attaches to the Riemannian metric itself. This dilemma was resolved in 1912.
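In modern textbook notation (a rendering of the argument, not Einstein’s own symbols), the step reads:

```latex
% Minkowski line element in an inertial system (restricted relativity):
ds^{2} = c^{2}\,dt^{2} - dx^{2} - dy^{2} - dz^{2} = \eta_{\mu\nu}\,dx^{\mu}\,dx^{\nu}
% A general (non-linear) change of coordinates turns it into the Riemannian form
ds^{2} = g_{\mu\nu}(x)\,dx^{\mu}\,dx^{\nu}
% with coordinate-dependent coefficients g_{\mu\nu}; these coefficients
% describe the gravitational field in the chosen system of coordinates,
% and a material point still follows an extremal ("shortest") line.
```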

Two other problems had to be resolved from 1912 to 1914 with the collaboration of Marcel Grossmann.

The first problem is stated as follows: “How can we transfer to a Riemannian metric a field law expressed in the language of restricted relativity?” I discovered that Ricci and Levi-Civita had answered it with the absolute differential calculus.

The second problem is: “What are the differential laws that determine the Riemann coefficients?” I needed invariant differential forms of the second order in the Riemann coefficients. It turned out that Riemann had also answered this problem, with the curvature tensor.

“Two years before the publication of my theory of General Relativity,” said Einstein, “I thought that my equations could not be confirmed by experiments. I was convinced that an invariant law of gravitation relative to any transformation of coordinates was not compatible with the causality principle. Astronomical experiments proved me right in 1915.”

Note: I recall that during my last year in high school my physics teacher, an old Jesuit Brother, filled the blackboard with partial derivatives of Newton’s equation for the force applied to a mass; then he integrated and obtained Einstein’s energy equation, E = mc², mass multiplied by the square of the speed of light. At university, whenever I had classical mechanics problems on energy or momentum conservation, I just applied the relativity equation for easy and quick results; pretty straightforward, not like the huge pain of describing or analyzing the movement of an object in coordinate space.
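For reference, the relation behind that shortcut, in standard notation:

```latex
% Relativistic energy of a body of mass m moving at speed v:
E = \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}
% Expanding for v \ll c recovers the rest energy plus the classical
% kinetic energy, which is why one formula covers both regimes:
E \approx m c^{2} + \tfrac{1}{2} m v^{2} + \cdots
% and for a body at rest, E = m c^{2}.
```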

Article #30 of “What is Human Factors in Engineering?”;  December 27, 2005

 “How objective and scientific are experiments?”

If we narrow this article to the statistical analysis of experiments, and without going into details, suffice it to mention a few controversies. First, let us lay out a chronology of the various paradigms in statistics and statistical algorithms. From a European perspective, Pascal is believed to have started probability theory in 1654.

Laplace and Legendre contributed the least-squares algorithm for fitting a model to data (1750-1810).

Gauss developed the geometry and algebra of the multivariate normal distribution (1800s).

Galton studied regression between two variables (1885), and Pearson the correlation coefficient (1895).
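The least-squares fit and the correlation coefficient mentioned in these two entries can be sketched from first principles; the data points below are invented for illustration:

```python
# Least-squares straight-line fit and Pearson's correlation coefficient,
# standard library only; the data points are invented.
from math import sqrt

def least_squares(xs, ys):
    """Fit y = a + b*x by minimizing the sum of squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx        # slope
    a = my - b * mx      # intercept (the fitted line passes through the means)
    return a, b

def pearson_r(xs, ys):
    """Pearson's product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = least_squares(xs, ys)   # roughly y = 0.05 + 1.99 x
r = pearson_r(xs, ys)          # close to 1: nearly linear data
```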

Fisher, Snedecor, and Scheffé concurrently worked on experimental design and the analysis of variance algorithm (ANOVA) for statistically testing population distributions under assumptions of normality, in the 1920s.
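The one-way ANOVA F statistic itself is simple to compute; a minimal sketch with made-up group measurements:

```python
# One-way ANOVA F statistic computed from first principles with only the
# standard library; the three groups of measurements are made up.
from statistics import mean

def f_oneway(*groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total sample size
    grand = mean(x for g in groups for x in g)       # grand mean
    # Between-group sum of squares, k - 1 degrees of freedom
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares, n - k degrees of freedom
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

a = [4.1, 3.9, 4.3]
b = [5.0, 5.2, 4.8]
c = [3.0, 2.9, 3.2]
f = f_oneway(a, b, c)  # large F: means differ far more than within-group scatter
```

An F table (or the F distribution with k − 1 and n − k degrees of freedom) then converts the statistic into a p-value.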

Methods for fitting models to data that exhibit structural features, without assuming an underlying distribution, were developed by Thurstone (Factor Analysis) and by Young and Householder (1935) (the Multidimensional Scaling and Cluster Analysis algorithms).

K. G. Jöreskog developed in 1973 the algorithm of a general method for estimating linear structural equations, labeled LISREL, which analyzes the relationships among latent variables linked to operationalized indicators. This general method includes as special cases path analysis, recursive or non-recursive, as well as Factor Analysis.

John Tukey and Mosteller concentrated on exploratory data analysis, fitting mathematical and geometric models to data in terms of both structural and residual components, thus complementing confirmatory or inferential analyses.

There are divergent paradigms on the following points. First, the suitability of data measurements according to measurement theory versus the distribution properties of the variable of interest (S. S. Stevens versus I. R. Savage in the 60’s). Second, the need to investigate real-world data prior to applying any statistical package (data snooping): if you perform serious detective work on the data and torture it long enough, it will confess and open many ways to understanding its underlying behavior (John Tukey); hence the increased emphasis on graphing individual data points and on plotting for preliminary screening, so as to ensure that the summary statistics selected are truly relevant to the data at hand.

Third, the application of the Bayesian approach from the consumer or decision-maker viewpoint, which provides the final probability of a hypothesis given the evidence, instead of the investigator’s standard acceptance of a p-value for rejecting a hypothesis (read “The Illusion of Objectivity” by James Berger and Donald Berry, 1988).
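Berger and Berry’s point can be illustrated with Bayes’ rule for two exhaustive hypotheses; every number below is invented for illustration:

```python
# Bayes' rule for two exhaustive hypotheses H0 and H1. The point, following
# Berger and Berry, is that the posterior depends on the prior and on the
# likelihood of the data under BOTH hypotheses, not on a p-value alone.
# Every number below is invented for illustration.

def posterior_h1(prior_h1, lik_h1, lik_h0):
    """P(H1 | data) = P(H1) * P(data | H1) / P(data)."""
    prior_h0 = 1.0 - prior_h1
    evidence = prior_h1 * lik_h1 + prior_h0 * lik_h0   # P(data)
    return prior_h1 * lik_h1 / evidence

# Data twice as likely under H1 as under H0, with even prior odds,
# still leave H0 a posterior probability of 1/3 -- far from "rejected".
p = posterior_h1(prior_h1=0.5, lik_h1=0.10, lik_h0=0.05)
```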

Fourth, an investigator’s selection of a statistical package he is familiar with instead of the statistics appropriate to the research question, and the acceptance of untenable assumptions about population distributions, computing unrealistic parameters, simply because the investigator is not trained to understand or interpret the alternative nonparametric or distribution-free methods.

Fifth, there are examples of investigators adopting exploratory statistical packages to torture data into divulging confusing causative variables while, in fact, the science in the domain is already well enough established to determine the causative factors exhaustively, simply because the investigator is not versed in mathematics or physics (“Tom Swift and His Electric Factor Analysis Machine” by J. Scott Armstrong, 1967).

Sixth, there is a need for the “mathematization of the behavioral sciences” (Skelum, 1969), which involves developing mathematically stated theories that lead to quantitative predictions of behavior and to the derivation, from the axioms of the theory, of a multitude of empirically testable predictions. Thus, instead of testing a verbal model against the null hypothesis, an adequate mathematical model accounts for both variability and regularity in behavior, and the appropriate statistical test is implied by the axioms of the model itself. Another advantage is that attention turns to measuring goodness of fit, the range of phenomena handled by the model, and its ability to generate counterintuitive predictions.

This discussion is an attempt to emphasize the concept of experimentation as a structured theory, and to argue that today’s easy and cheap computational power should be subservient to theory, so that data are transformed to answer definite and clear questions. The Human Factors practitioner, who should be multidisciplinary in order to master both the human and the physical sciences, is hard hit by the need to perform complex scientific experiments involving human subjects while being required to yield practical recommendations for the applied engineering fields.

No wonder Human Factors professionals are confused in their purposes and ill appreciated by the other disciplines, unless a hybrid kind of scientist is generated from a structured combination of an engineering discipline with modern experimental methods and statistical algorithms.

However, Human Factors engineers who hold an undergraduate engineering degree and a higher degree with training in experimental research and statistical analysis are better positioned to handle research involving the mathematical modeling of theories in the sciences.

The fixed-mindedness in adolescents reminds us of the fixed minds of old people, despite the assumption that the mind has the potential flexibility to grow while young.

You may look young while masking an old mind, or look older while exhibiting a younger mind; it is your choice how much time and energy you are willing to invest in acquiring knowledge.


