## Posts Tagged ‘**Pascal**’

The French mathematician Pascal wrote: “All the afflictions of mankind come from his restlessness in a room: he never learns to be at rest. Man sincerely thinks he is seeking rest and repose, but in fact he is after agitation.” It is as if life abhors a void, as if remaining at rest were tantamount to worthlessness.

Maybe idleness is the mother of all vices; but it is certainly the mother of all virtues: how else could you meditate, voluntarily inhale deeply, exhale slowly, reflect on your past achievements and plans, comprehend your limitations, capabilities, and potential, set up future action plans, revisit serious talks that passed you by, and regain the courage and will to go forward?

Forced work and an unhappy, unsafe workplace are excellent excuses for running away from pressing responsibilities at home, for delegating important responsibilities, for faking being too busy to listen and deliver on promises. Is working for a salary just a way out: earning some money in order to be freer to make better decisions about better-quality products and services, and to take longer, higher-quality leisure time?

Let me clarify how I comprehend class structure in a capitalist system, in order to drive my point home. The lowest class is represented by the 20% of downtrodden or daily workers. When a government claims that unemployment has reached, say, 10%, it means that the lower middle class is suffering most of that unemployment.

For example, if we say that at least 70% of the “working people” are from the middle class (divided among the lower, middle, and upper middle classes), and that 70% of the middle class belongs to its lower section, then about 4.9% of the population are unemployed lower-middle-class workers (70% × 70% × 10% = 4.9%). The other 5.1% of the unemployed come from all the other classes (excepting the richest 10%, who hoard 50% of the nation’s wealth: their amassed wealth works for them).
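The back-of-the-envelope arithmetic above can be sketched in a few lines; all the shares are the article’s illustrative assumptions, not measured data:

```python
# Illustrative shares from the article (assumptions, not statistics).
unemployment_rate = 0.10   # claimed national unemployment
middle_class_share = 0.70  # share of "working people" in the middle class
lower_middle_share = 0.70  # share of the middle class in its lower section

# Unemployed lower-middle-class workers, as a share of the population:
lower_middle_unemployed = unemployment_rate * middle_class_share * lower_middle_share
other_unemployed = unemployment_rate - lower_middle_unemployed

print(round(lower_middle_unemployed, 3))  # 0.049 -> the article's 4.9%
print(round(other_unemployed, 3))         # 0.051 -> the article's 5.1%
```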

When a government starts drastic budget cuts (except to the military: that goes without saying), they affect primarily the lower middle class, since the 20% of downtrodden are already surviving at the bare minimum and cannot contribute to taxes. Actually, the downtrodden class feels helpless and waits for the middle class to begin marching in revolt, so it can join in and scare the richest class and its political, security, legal, and law institutions for a short while. When unemployment jumps to 20%, affecting 10% of the lower middle class, serious problems await the powers-that-be in capitalist systems: 10% unemployment within the lower middle class is a critical turning point for serious revolts.

It is to the advantage of most citizens in a capitalist society that unemployment increase beyond 20%, so that serious reforms are considered and enacted in the form of laws and of controls, restrictions, and constraints on financial institutions.

Are you being pressured into forced labor? If you are not a teenager and think you have skills, would you work for McDonald’s and invite your kids to eat deadly food? If you hate lying to and scamming people, would you work for Wall Street? If you cannot bear selling products and services you don’t believe in, because they are of low quality or unsafe to use, would you work as a salesman for such a company?

First, do you know what kinds of moral values and ethical conduct best represent your spirit and would make you happy to notice being applied around you and in your neighborhood? Would you take some time to learn and select the work you enjoy doing, and stop this harassing exercise in fleeing? Would you mind getting organized around clear laws that equalize opportunity and fairness in employment?

**Article #30, December 27, 2005**

**“How objective and scientific are experiments?”**

If we narrow this article to the statistical analysis of experiments, then without going into details, suffice it to mention a few controversies. First, let us sketch a chronology of the various paradigms in statistics and statistical algorithms. From a European perspective, Pascal is believed to have begun probability theory in 1654.

Laplace and Legendre contributed to the least-squares algorithm for fitting a model to data (1750–1810).

Gauss developed the geometry and algebra of the multivariate normal distribution (1800s).

Galton studied regression between two variables (1885), and Pearson developed the correlation coefficient in 1895.
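The two ideas in the chronology so far can be shown in a minimal sketch (plain Python, invented toy data): fitting a least-squares line and computing Pearson’s correlation coefficient.

```python
def mean(xs):
    return sum(xs) / len(xs)

def least_squares(x, y):
    """Intercept and slope of the least-squares line y = a + b*x."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return my - b * mx, b  # intercept a, slope b

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient (1895)."""
    mx, my = mean(x), mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / (sxx * syy) ** 0.5

# Toy data, roughly y = 2x, chosen only for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = least_squares(x, y)  # slope b comes out near 2
r = pearson_r(x, y)         # r comes out near 1 (strong linear relation)
```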

Fisher, Snedecor, and Scheffé concurrently worked on experimental design and the analysis of variance algorithm (ANOVA), which statistically tests population distributions under assumptions of normality, in the 1920s.

Distribution-free data analyses, which fit models to data exhibiting structural features, were developed by Thurstone in factor analysis, and by Young and Householder (1935) in multidimensional scaling and cluster analysis algorithms.

K. G. Jöreskog developed in 1973 a general method for estimating linear structural equation systems, labeled LISREL, which analyzes the relationships among latent variables linked to operationalized indicators. This general method includes as special cases path analysis, recursive or non-recursive, as well as factor analysis.

John Tukey and Mosteller concentrated on exploratory data analysis, fitting mathematical and geometric models to data to reveal both structure and residuals, thus complementing confirmatory or inferential analyses.

There are divergent paradigms around the following concepts. First, the suitability of data measurements according to measurement theory versus the distributional properties of the variable of interest (S. S. Stevens versus I. R. Savage, in the 1960s). Second, the need to investigate real-world data prior to applying any statistical package (“data snooping”): if you perform serious detective work on the data and torture it long enough, it will confess and open many ways to understand its underlying behavior (John Tukey). Hence the increased emphasis on graphing individual data points and on plotting as a preliminary screening, to ensure that the summary statistics selected are truly relevant to the data at hand.

Third, the application of the Bayesian approach from the consumer’s or decision maker’s viewpoint, which provides the final probability of a hypothesis given the evidence, instead of the investigator’s standard acceptance of a p-value for rejecting a hypothesis (read “The Illusion of Objectivity” by James Berger and Donald Berry, 1988).
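The Bayesian viewpoint can be sketched with a toy coin-flip example (all numbers hypothetical): rather than asking whether a p-value crosses a threshold, Bayes’ rule yields the posterior probability of each hypothesis given the observed evidence.

```python
from math import comb

def binom_lik(k, n, p):
    """Binomial likelihood of k heads in n flips for head-probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical evidence: 7 heads in 10 flips.
k, n = 7, 10
# Two competing hypotheses with equal prior probability (an assumption).
prior_fair, prior_biased = 0.5, 0.5
lik_fair = binom_lik(k, n, 0.5)    # H0: fair coin
lik_biased = binom_lik(k, n, 0.7)  # H1: coin biased toward heads

# Bayes' rule: posterior probability of H1 given the evidence.
posterior_biased = (lik_biased * prior_biased) / (
    lik_fair * prior_fair + lik_biased * prior_biased)
```

The decision maker reads off a direct probability (here, roughly a 70% posterior chance the coin is biased) instead of a rejection verdict at an arbitrary significance level.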

Fourth, the investigator’s selection of a statistical package he is familiar with, instead of the statistics appropriate to the research in question; and the acceptance of untenable assumptions about population distributions, computing unrealistic parameters, simply because the investigator is not trained to understand or interpret alternative nonparametric or distribution-free methods.

Fifth, there are examples of investigators adopting exploratory statistical packages to torture data into divulging confusing causative variables when, in fact, the science of the domain is already well established enough to determine the causative factors exhaustively; this happens simply because the investigator is not versed in mathematics or physics (“Tom Swift and His Electric Factor Analysis Machine” by J. Scott Armstrong, 1967).

Sixth, there is a need for the “mathematization of behavioral sciences” (Skelum, 1969), which involves developing mathematically stated theories that lead to quantitative predictions of behavior, and deriving from the axioms of a theory a multitude of empirically testable predictions. Thus, instead of testing a verbal model against a null hypothesis, an adequate mathematical model accounts for both the variability and the regularity in behavior, and the appropriate statistical model is implied by the axioms of the theory itself. Another advantage is that attention turns to measuring goodness of fit, the range of phenomena the model handles, and its ability to generate counterintuitive predictions.

This discussion is an attempt to emphasize experimentation as a structured theory: today’s easy and cheap computational power should remain subservient to theory, so that data are transformed to answer definite, clear questions. The Human Factors practitioner, who should be multidisciplinary in order to master both the human and the physical sciences, is hard hit by the need to perform complex scientific experiments involving human subjects while still being required to yield practical recommendations for the applied engineering fields.

No wonder Human Factors professionals are confused about their purpose and ill appreciated by the other disciplines, unless a hybrid kind of scientist is produced from a structured combination of an engineering discipline with modern experimental methods and statistical algorithms. Human Factors engineers who hold an undergraduate engineering degree and a higher degree in experimental research and statistical analysis are better positioned to handle research involving the mathematical modeling of scientific theories.

The fixed-mindedness of some adolescents reminds us of the mental rigidity of the old, despite the assumption that the mind has its potential flexibility to grow while young.

You may look young while masking an old mind, or look older while exhibiting a younger one; it is your choice how much time and energy you are willing to invest in acquiring knowledge.