Adonis Diaries


Einstein speaks on theoretical physics; (Nov. 18, 2009)

The creative character of the theoretical physicist is that the products of his imagination are so indispensably and naturally impressed upon him that they are no longer images of the spirit but evident realities. Theoretical physics comprises a set of concepts and logical propositions from which consequences can be deduced; those deduced propositions are assumed to correspond exactly to our individual experiences. That is why, in a theoretical book, the deductive exercises represent the entire work.

Newton had no hesitation in believing that his fundamental laws were provided directly by experience.  At that period the notions of space and time presented no difficulties: the concepts of mass, inertia, and force, and their direct relationships, seemed to be delivered directly by experience.  Newton realized that no experience could correspond to his notion of absolute space, which implies absolute inertia, or to his reasoning on action at a distance; nevertheless, the success of the theory for over two centuries prevented scientists from realizing that the base of this system is absolutely fictitious.

Einstein said: “The supreme task of a physicist is to search for the most general elementary laws and then acquire an image of the world by pure deductive power. The world of perception determines the theoretical system rigorously, though no logical route leads from perception to the principles of theory.” Mathematical concepts can be suggested by experience, the unique criterion for adopting a mathematical construct, but never deduced from it. The fundamental creative principle resides in mathematics.

Logical deductions from experiments of the validity of the Newtonian system of mechanics were doomed to failure. Research by Faraday and Maxwell on electromagnetic fields initiated the rupture with classical mechanics. There was this interrogation: “If light is constituted of material particles, then where does the matter go when light is absorbed?” Maxwell thus introduced partial differential equations to account for deformable bodies in the wave theory. Electrical and magnetic fields are treated as dependent variables; thus, physical reality no longer had to be conceived as material particles but as continuous fields governed by partial differential equations. Maxwell’s equations, however, still emulate the concepts of classical mechanics.

Max Planck had to introduce the hypothesis of quanta (for small particles moving at slow speeds but with sufficient acceleration), which was later confirmed, in order to compute the results of thermal radiation that were incompatible with classical mechanics (still valid for situations at the limit).  Max Born pronounced: “Mathematical functions have to determine by computation the probabilities of discovering the atomic structure in one location or in movement.”
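The incompatibility Planck resolved can be shown numerically. Here is a small sketch, not from the original text, comparing Planck's quantum radiation law with the classical Rayleigh-Jeans law: the two agree at long wavelengths but the classical law blows up at short ones, the famous "ultraviolet catastrophe".

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength, temp):
    """Planck's law: spectral radiance of a black body (W / m^2 / sr / m)."""
    x = H * C / (wavelength * K * temp)
    return (2 * H * C**2 / wavelength**5) / math.expm1(x)

def rayleigh_jeans(wavelength, temp):
    """Classical prediction, which diverges at short wavelengths."""
    return 2 * C * K * temp / wavelength**4

T = 5800  # roughly the Sun's surface temperature, in kelvin
# At long wavelengths the two laws nearly coincide...
print(rayleigh_jeans(1e-3, T) / planck(1e-3, T))   # close to 1
# ...but at short wavelengths the classical ratio is enormous
print(rayleigh_jeans(1e-7, T) / planck(1e-7, T))
```

The temperature and wavelengths are illustrative choices; any black body shows the same divergence.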

Louis de Broglie and Schrödinger demonstrated the field theory’s operation with continuous functions. Since in the atomic model there is no way of locating a particle exactly (Heisenberg), we may conserve the entire electrical charge at the limit where the density of the particle is considered nil. Dirac and Lorentz showed how the field and the particles of electrons interact on an equal footing to reveal reality. Dirac observed that it would be illusory to describe a photon theoretically, since we have no means of confirming whether a photon passed through a polarizer placed obliquely on its path.

Einstein is persuaded that nature represents what we can imagine exclusively in mathematics as the simplest system of concepts and principles for comprehending nature’s phenomena.  For example, if the metric of Riemann is applied to a continuum of four dimensions, then the theory of relativity of gravity in a void space is the simplest. If fields of anti-symmetrical tensors that can be derived are selected, then the equations of Maxwell are the simplest in void space.

The “spins” that describe the properties of electrons can be related to the mathematical concept of “semi-vectors” in 4-dimensional space, which can describe two kinds of different elementary particles of equal charges but of opposite signs. Those semi-vectors describe the magnetic field of elements in the simplest way, as well as the properties of electrical particles.  There is no need to localize any particle rigorously; we can simply propose a portion of 3-dimensional space at whose limit the electrical density disappears but which retains a total electrical charge represented by a whole number. The enigma of quanta can thus be entirely resolved if such a proposition is revealed to be exact.

Critique

Until the first quarter of the 20th century, sciences were driven by sheer mathematical constructs.  This was a natural development, since most experiments in the natural sciences were done by varying one factor at a time; experimenters never used more than one independent variable and more than one dependent variable (the objective measured variable, or the data).  Although the theory of probability was very advanced, the field of practical statistical analysis of data was not yet developed; it was a real pain and very time-consuming to do all the computations by hand for even slightly complex experimental designs. Sophisticated and specialized statistical packages for different fields of research evolved only after computers, those mass number crunchers, were invented.

Thus, early theoretical scientists refrained from complicating their constructs simply because experimental scientists could not practically deal with complex mathematical constructs. The theoretical scientists therefore promoted the concept, or philosophy, that theories should be the simplest possible, with the fewest axioms (fundamental principles), and did their best to imagine one general causative factor that affected the behavior of natural phenomena or would be applicable to most natural phenomena.

This is no longer the case. The good news is that experiments are more complex and reveal interactions among the factors. Nature is complex; no matter how tightly you control an experiment to reduce the number of manipulated variables to a minimum, there is always more than one causative factor, and these factors are interrelated and interact to produce effects.
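To make the point about interactions concrete, here is a minimal sketch of a 2×2 factorial layout, the smallest design in which an interaction can appear. The factor names and cell means are invented for illustration only.

```python
# Mean response in each cell of a hypothetical 2x2 experiment:
# two factors (A, B), each at two levels.
means = {
    ('low', 'low'): 10.0,
    ('low', 'high'): 14.0,
    ('high', 'low'): 12.0,
    ('high', 'high'): 22.0,
}

# Main effect of A: average change in response when A goes low -> high.
effect_a = ((means[('high', 'low')] - means[('low', 'low')])
            + (means[('high', 'high')] - means[('low', 'high')])) / 2

# Interaction: does the effect of A depend on the level of B?
# It is the difference between A's effect at B=high and at B=low.
interaction = ((means[('high', 'high')] - means[('low', 'high')])
               - (means[('high', 'low')] - means[('low', 'low')]))

print(effect_a)     # 5.0
print(interaction)  # 6.0: A's effect is three times larger when B is high
```

A one-factor-at-a-time experimenter who varied A only while holding B at its low level would report an effect of 2 and miss the interaction entirely.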

Consequently, sophisticated experiments and their corresponding data are making the mathematician’s job more straightforward when pondering a particular phenomenon.  It is possible to synthesize two phenomena at a time before generalizing to a third; mathematicians no longer need to jump to general concepts in one step; they can move forward consistently on a firm basis of data. Mathematics will remain the best synthesis tool for comprehending the behaviors of nature and man.

It is time to account for all the possible causative factors, especially those that are rare in probability of occurrence (at the very end of the tails of the probability graphs) or whose contributing effects are imagined to be small: it is those rare events that have surprised man with catastrophic consequences.

Theoretical scientists of nature’s variability should acknowledge that nature is complex. Simple and beautiful general equations are out the window. Studying nature is worth a set of equations! (You may read my post “Nature is worth a set of equations.”)

Nature is worth a set of equations; (Nov. 17, 2009)

I have been reading speeches and comments of Albert Einstein, a great theoretical physicist in the 20th century.

Einstein is persuaded that mathematics, exclusively, can describe and represent nature’s phenomena, and that all of nature’s complexities can be comprehended and imagined as the simplest system of concepts and principles.

The fundamental creative principle resides in mathematics, and formulas have to be the simplest and most beautifully general. Mathematical concepts can be suggested by experience, the unique criterion for adopting a mathematical construct.

I got into thinking.

I read this dictum when I was graduating in physics, and I have been appreciating this recurring philosophy ever since. The basic goal in theoretical physics for over a century has been to discover the all-encompassing field of energy that can unite the variety of fields that experiments have kept popping up to describe particular phenomena in nature, such as electrical and magnetic fields, as well as the “weak” and “strong” fields of energy emanating from atoms, protons, and all the varieties of smaller elements.

I got into thinking.

Up until the first quarter of the 20th century, most experiments in the natural sciences were done by varying one factor at a time; experiments never used more than one independent variable and more than one dependent variable (the objective measured variable, or the data).  Even today, most engineers perform these kinds of totally inefficient and worthless experiments: no interactions among variables can be analyzed, though these are the most important and fundamental intelligence in all kinds of sciences. These engineers have simply not been exposed to experimental design in their required curriculum!
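The contrast with one-factor-at-a-time experimentation can be sketched in a few lines; the factors and levels below are hypothetical stand-ins for any real study.

```python
from itertools import product, combinations

factors = {
    'temperature': ['low', 'high'],
    'pressure':    ['low', 'high'],
    'catalyst':    ['A', 'B'],
}

# A full 2^3 factorial design: every combination of factor levels.
runs = list(product(*factors.values()))
print(len(runs))  # 8 runs cover all combinations

# One-factor-at-a-time needs only a baseline plus one run per factor,
# but it can never estimate any of these pairwise interactions:
pairs = list(combinations(factors, 2))
print(len(pairs))  # 3 factor pairs left unexamined
```

Eight runs instead of four is a modest price for being able to see every two-factor interaction.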

Although the theory of probability was very advanced, the field of practical statistical analysis of data was not yet developed; it was a real pain and very time-consuming to do all the computations by hand for even slightly complex experimental designs.

Sophisticated and specialized statistical packages for different fields of research evolved only after computers, those mass number crunchers, were invented.

Consequently, early theoretical scientists refrained from complicating their constructs, simply because they had to solve their exercises and compute them by hand in order to verify their contentious theories.

Thus, theoretical scientists knew that experimental scientists could not practically deal with complex mathematical constructs and would refrain from undertaking complex experiments in order to confirm or refute any complex construct.

The trend, paradigm, or philosophy among theoretical scientists was to promote the concept that theories should be the simplest possible, with the fewest axioms (fundamental principles); they did their best to imagine one general causative factor that affected the behavior of natural phenomena or would be applicable to most natural phenomena.

When Einstein mentioned that equations should be beautiful in their simplicity, he did not have graphic design in mind; he meant they should be simple to compute.

This is no longer the case.

Nature is complex; no matter how you control and restrict the scope of an experiment in order to reduce the number of manipulated variables to a minimum, there is always more than one causative factor, and these factors are interrelated and interact to produce effects.

Currently, physicists and natural scientists can observe many independent variables and several dependent variables and analyze huge numbers of data points.

Still, nature’s variables are countable and pretty steady over the course of an experiment, unlike experiments involving “human subjects,” whose variables run into the hundreds and are hard and sensitive to control.

Man is far more complex than nature when it comes to studying his behavior.

Psychologists and sociologists have been using complex experimental designs for decades in order to study man’s behavior and his hundreds of physical and mental characteristics and their variability.

All kinds of mathematical constructs were developed to aid “human scientists” in performing experiments commensurate in complexity with their subject matter.

The dependent variables no longer had to be objectively measurable, and many subjective criteria were adopted.

Certainly, “human scientists” did not have to know the mathematical constructs that the statistical packages were using, just the premises that justified their appropriate use in a particular field.

Anyway, these mathematical models were pretty straightforward, and no sophisticated mathematical concepts were used: the human scientists could understand a construct, if they desired to go deeper into the program, without pursuing higher mathematical education.

Nature is complex, though far less complex than human variability.

Theoretical natural scientists should acknowledge that complexity. And studying nature is worth a set of equations!

Simple and beautiful general equations are out the window.  There are no excuses for engineers and natural scientists not to expand their imagination and focus their intuition on complex constructs that may account for many causative factors and analyze many variables simultaneously for their interactions.

There is no excuse for experimental designs not being set up to handle three independent variables (factors) and two dependent variables; the human brain is capable of visualizing the interactions of nine combinations of variables, two at a time.

Certainly, scientists can throw in as many variables as they need, and powerful computers will crunch the numbers as easily and as quickly as for simple designs; the problem is interpreting the reams and reams of results.

Worse, how is your audience to comprehend your study?

A coherent series of relatively complex experiments can be designed to answer the most complex questions and yet be intelligibly interpreted.

It is time to account for all the possible causative factors, especially those that are rare in probability of occurrence (at the very end of the tails of the probability graphs) or whose contributing effects are imagined to be small: it is those rare events that have surprised man with catastrophic consequences.

If complex man can be studied with simple sets of equations, THEN nature is also worth sets of equations.

Be bold and make these equations as complex as you want; the computer will not care, as long as you understand them for communication’s sake.

How do you value quality of life? (October 20, 2009)

 

French President Sarkozy assembled a committee of Nobel Prize economists, including Joseph Stiglitz and Amartya Sen, to ponder new indicators for measuring economic performance and social progress. This honorable committee submitted its report on September 13, 2009. The report’s conclusions concerning social progress target the well-being of citizens: life expectancy; affordable health care; affordable dwellings; a worthy education system that focuses on individual reflection instead of memorization of data and facts, since the individual will be called upon to act on his decisions; alternatives for organizing our lives around activities that we love; satisfying jobs that we value; the possibility of expressing our opinions in public politics and social meetings; a wholesome environment with clean water and purer breathable air; and feeling secure in the neighborhood.  All these social indicators are more valuable for measuring how a State has been progressing than relying solely on GNP, or on how many cars a family owns, or on the number of household appliances.

Joseph Stiglitz is not welcome in the Obama Administration because he harshly criticized the President’s economic adviser Larry Summers in The New York Times. Stiglitz said: “The plan for financial and economic stability is too modest to be effective. The pumping of money into banks amounts to free gifts offered to Wall Street: only the investors in and creditors of these banks benefit, not the taxpayers.”  Stiglitz leads the line of economists who attack the concept that free markets have the capability to stabilize imbalances efficiently.  His mathematical models have demonstrated that transactions in free markets are biased toward those who specialize in finance and have the necessary data to fool clients; “globalization has created a fresh pool of investors to exploit their ignorance.”

In this post I will ask binary (Yes or No) questions for voting on laws and amendments in three categories of quality of life: the personal, community, and State levels. For example, on the community level, suppose that if people postponed purchasing their first car for a year, the saved money would cover the expense of inoculating all the babies in the community: how would you vote?  Suppose people were asked to postpone replacing their older car with a new one for a year: how would you vote?  Suppose that instead of inoculating babies, the community decided to pay for free complete blood tests for citizens over 45 years of age? Suppose that the community could perform free bypass surgery for patients in dire need, or free kidney dialysis?

What if you could postpone replacing your washing machine for a year to cover the expense of investing in playgrounds for kids, clean water, a new sewer system, a public transport system, upgrading a hospital, or modernizing schools with updated communication and audio-visual systems? How would you vote?

 

On the State level, suppose the current tax break exempts people earning less than $10,000 from taxes.  If the State decided to exempt people earning less than $20,000, would you vote for the new tax break, knowing that the money invested in the previous one is targeted at preserving natural reserves, distributing electricity 24 hours per day at the original rate, establishing affordable State health care for all, paying teachers higher rates for continuing education that encourages individual reflection, increasing rates for nurses along with higher quality of services, investing in clean alternative sources of energy, or salvaging beach resorts, better accommodating camping grounds, and reclaiming greener locations for the public? How would you vote?

 

On the personal level, suppose your family has more than three kids and they attend private schools. If you were to send them to public schools in safe neighborhoods, would you invest the saved money in a new bathroom; building an extra-large room for the kids to assemble and play in; arranging the garden as an attractive playground; taking additional vacations; working part-time so that you may monitor your kids’ schooling after hours; subscribing your kids to various clubs and extracurricular activities; or going out more frequently to movie theaters, musical events, and plays?

The premises are clear: for the same financial saving, you have the choice of improving the quality of life of the many in return for forgoing lavisher personal comforts.  These questionnaires let you state the kinds of quality of life you believe in; they are easy to administer, and the responses can be statistically analyzed using packages specialized for binary responses.  How does your community value quality of life? How does your nation? What do you think of this research project?
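As a sketch of what the simplest such analysis involves (the vote counts below are invented for illustration), a set of Yes/No responses can be summarized as a proportion with a normal-approximation confidence interval:

```python
import math

def proportion_ci(yes, n, z=1.96):
    """Point estimate and ~95% normal-approximation confidence interval
    for the proportion of 'Yes' votes among n binary responses."""
    p = yes / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p, (p - z * se, p + z * se)

# Hypothetical community vote: 312 'Yes' out of 500 respondents.
p, (lo, hi) = proportion_ci(312, 500)
print(round(p, 3))                   # 0.624
print(round(lo, 3), round(hi, 3))    # 0.582 0.666
```

With an interval comfortably above 50%, such a community could be said to favor the communal option; a real package would add finer tools (exact intervals, logistic models), but the core idea is this simple.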

Human Factors in Engineering (Article #29)

“How objective and scientific are research?”

Friend, allow me just a side explanation on experimentation.  Psychologists, sociologists, and marketing graduates are trained to apply various experimentation methods, not just cause-and-effect designs.

There are many statistical packages oriented toward providing dimensions and models for the set of data fed into the experiment, so that a preliminary understanding of the system’s behavior can be comprehended qualitatively.

Every applied science has gone through many qualitative models or schemas, using various qualitative methods, before attempting to quantify its models. However, many chairmen of engineering departments, especially those who have no understanding of the discipline of Human Factors or were never exposed to designing experiments, have the conception that this field is mostly qualitative in nature.

They would ask me to concentrate my courses on the quantitative aspects, such as the environmental factors of lighting, noise, and heat, and any topic that requires computation or has well-defined physics equations.

We have three concepts in the title: objectivity, science, and research. They are related in people’s minds as connoting the same thing.

However, the opposite meanings of these concepts are hard to come by without philosophical divergences or assumptions.

If we define science as a set of historical paradigms, concepts, truths, facts, and methods, most of which keep changing as new technologies and new methodologies enlarge the boundaries of knowledge, then you might be more inclined to discuss the notions with a freer mind.

Could subjectivity be accepted as the opposite of objectivity without agreeing on a number of axioms and assumptions that are not tenable in many cases?  Any agreement on the meaning of objectivity in scientific research procedures and results is basically consensual among the professionals in a discipline, for a period, until the advent of a new paradigm changes the meaning or orientation of the previous consensus.

Could opinions, personal experiences, and recalled facts or events not be accepted in the domain of research, even when they can be found in written documents that have not been thoroughly investigated by a researcher?

So what if you refer to an accredited research article and it turns out that the article was fraught with errors, misleading facts, borderline results, and untenable interpretations?  Would the research be thrown in the dust bin as unscientific or non-objective and thus not worth further investigation?

Research in physics, chemistry, and engineering deals with objects and studies the behavior of physical nature; these kinds of research can arrive at well-established mathematical models because the factors are countable and can be well controlled in experimental settings, and the variability in errors is connected to the technology of the measuring instruments, once the procedure is well defined and established according to experimental standards.

It is when research has to deal with the variability in human nature, as in psychology, psychometrics, sociology, marketing, business management, and econometrics, that the notions of objectivity, research, and science become complex and confusing.

The main problem is to boldly discriminate among research efforts and admit that not every piece of research is necessarily scientific or objective, and that a piece of research has intrinsic value if the investigator is candid about its purpose and nature.

We need to admit that every research project is subjective in nature, because it is the responsibility of the investigator to select the topic, the intentions, the structured theory, the references, the fund providers, the hypotheses, the design, the methodology, the sample size, the populations, the data collection techniques, the statistical package, the emphasis on either Type I or Type II error, the interpretation of results, and so on.
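The choice of emphasis between Type I and Type II error mentioned here is itself a quantifiable trade-off. Here is a minimal sketch for a one-sided z-test; the effect size, sigma, and sample size are invented for illustration:

```python
from statistics import NormalDist

nd = NormalDist()

def type_ii_error(alpha, effect, sigma, n):
    """Chance of missing a true effect of the given size (Type II error)
    in a one-sided z-test at significance level alpha."""
    z_crit = nd.inv_cdf(1 - alpha)        # rejection cutoff under the null
    shift = effect / (sigma / n ** 0.5)   # where the alternative distribution sits
    return nd.cdf(z_crit - shift)

# Tightening the Type I rate from 5% to 1% raises the Type II rate:
b05 = type_ii_error(alpha=0.05, effect=0.5, sigma=2.0, n=50)
b01 = type_ii_error(alpha=0.01, effect=0.5, sigma=2.0, n=50)
print(round(b05, 3), round(b01, 3))  # the stricter alpha gives the larger miss rate
```

Which error the investigator chooses to guard against more strictly is exactly one of those subjective choices that deserves an explicit rationale.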

By admitting the subjective environment prior to a research endeavor, we can confer the qualitative term of objectivity on the research only when the investigators provide full rationales for every subjective choice in the research process.

Every step in the research process is a variation on a paradigm accepted at some point in the history of science, and the mixing of paradigms with no conscious realization of the mixing process should set off a warning alarm about the validity of the research and the many pitfalls it is running through.

Acknowledging the role of subjectivity in the methodology, the data, and its interpretation could open the way for more accurate and flexible judgments as to the extent of the objectivity and scientific tendencies of the research.

Article #29, December 1st, 2005

What are error taxonomies, and other taxonomies in Human Factors in Engineering?

Article #12, written on April 9, 2005

Will you allow me just a side explanation on experimentation, to set the foundations first?

Psychologists, sociologists, and marketing graduates are trained to apply various experimentation methods, not just cause-and-effect designs.

There are many statistical packages oriented toward providing dimensions and models for the set of data fed into the experiment, so that a preliminary understanding of the system’s behavior can be comprehended qualitatively.

Every applied science has gone through many qualitative models or schemas, using various qualitative methods, before attempting to quantify its models.

Many chairmen of engineering departments, especially those who have no understanding of the discipline of Human Factors in engineering and would never touch this body of knowledge and methods with a long pole, ask me to concentrate my courses on the quantitative aspects.

That hint sends an immediate shiver through my rebellious spirit, and I am tempted to ask them: “What taxonomy of methods are you using in teaching engineering courses?”

What taxonomies does Human Factors have to conceive?  How about a classification of the human errors committed when operating a system, their frequencies, and their consequences for the safety of operators and system performance?

Human Factors professionals have attempted to establish various error taxonomies, some within a specific context, such as during the study and analysis of errors that might be committed in the operation of nuclear power plants, and others outside any specific context.

One classification of human errors is based on human behavior and the level of comprehension: mainly skill-based, rule-based, or knowledge-based behavioral patterns.

This taxonomy identifies 13 types of errors and discriminates among the stages and strength of the controlled routines in the mind that precipitate an error during the execution of a task: omitting steps, changing the order or sequence of steps, timing errors, or inadequate analysis and decision-making.
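A sketch of how such a taxonomy might be encoded in software: the three behavior levels follow the skill/rule/knowledge split described above, but the specific error labels and their assignments are illustrative, not the taxonomy's actual 13 types.

```python
from enum import Enum

class BehaviorLevel(Enum):
    SKILL = "skill-based"          # routine, highly practiced actions
    RULE = "rule-based"            # stored if-then procedures
    KNOWLEDGE = "knowledge-based"  # novel situations requiring reasoning

# Illustrative mapping of error types to behavior levels.
ERROR_TAXONOMY = {
    "omitted step":           BehaviorLevel.SKILL,
    "steps out of order":     BehaviorLevel.SKILL,
    "timing error":           BehaviorLevel.SKILL,
    "wrong rule applied":     BehaviorLevel.RULE,
    "rule applied too late":  BehaviorLevel.RULE,
    "inadequate analysis":    BehaviorLevel.KNOWLEDGE,
    "faulty decision-making": BehaviorLevel.KNOWLEDGE,
}

def errors_at(level):
    """All error types classified at the given behavior level."""
    return sorted(e for e, lvl in ERROR_TAXONOMY.items() if lvl is level)

print(errors_at(BehaviorLevel.SKILL))
```

Coding observed incidents against such a table is what makes error frequencies per level countable and comparable across systems.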

With a strong knowledge of the behavior of a system, provided that the mental model is not deficient and the rules are applied consistently, most of the errors will be concentrated at the level of skill achieved in performing a job.

Another taxonomy relies on the theory of information processing and is a literal transcription of the experimental process: mainly, observation of the system’s status, choice of hypothesis, testing of the hypothesis, choice of goal, choice of procedure, and execution of the procedure.  Basically, this taxonomy may answer the problems in rule-based and knowledge-based behavior.

It is useful to specify, in the final steps of a taxonomy, whether an error is one of omission or of commission.  I suggest that errors of commission also be fine-tuned to differentiate among errors of sequence, the kind of sequence, and the timing of execution.

There are alternative strategies for reducing human errors: training, selection of the appropriate applicants, or redesigning the system to fit the capabilities of end users and account for their limitations through preventive designs, exclusion designs, and fail-safe designs.

