Polymath? Dabbling in many fields of study and disciplines? And the advantages are…?
Posted by: adonis49 on: February 4, 2016
How To Be a Polymath
by Steven Mazie
Thinking back on the college recommendations I’ve written over the past few weeks, a pattern leaps up: the most successful students, the ones who are the most lively and engaged in class, the most interesting and most dedicated, are never merely great students.
They are also utterly devoted to six other pursuits.
This used to puzzle me. How can a kid write such detailed and analytically involved nightly reading journals on Augustine and Dante, schedule meetings with me about multiple drafts of her essays, excel in a Dostoevsky seminar, third-semester calculus and painting, and find the time to edit the school newspaper, run the debate club, take photography classes, volunteer at her city councilman’s office, sing in a band and write prize-winning poetry on the side?
I exaggerate, but only slightly. As humbling as it is to write letters for students like these, it’s also enlightening, and it’s not just about the elite few humans who can handle doing more than one thing well. “Our age reveres the specialist,” writes Robert Twigger, “but humans are natural polymaths, at our best when we turn our minds to many things.” It’s not just the youngsters who can join the polymath party:
The pessimistic assumption that learning somehow ‘stops’ when you leave school or university or hit thirty is at odds with the evidence. It appears that a great deal depends on the nucleus basalis, located in the basal forebrain.
Among other things, this bit of the brain produces significant amounts of acetylcholine, a neurotransmitter that regulates the rate at which new connections are made between brain cells. This in turn dictates how readily we form memories of various kinds, and how strongly we retain them.
So what’s the trick to letting the acetylcholine flow more abundantly? Twigger again:
People as old as 90 who actively acquire new interests that involve learning retain their ability to learn. But if we stop taxing the nucleus basalis, it begins to dry up.
In some older people it has been shown to contain no acetylcholine — they have been ‘switched off’ for so long the organ no longer functions.
In extreme cases this is considered to be one factor in Alzheimer’s and other forms of dementia — treated, effectively at first, by artificially raising acetylcholine levels. But simply attempting new things seems to offer health benefits to people who aren’t suffering from Alzheimer’s. After only short periods of trying, the ability to make new connections develops. And it isn’t just about doing puzzles and crosswords; you really have to try and learn something new.
Trying something new. Hmmmm. What kind of thing?
There’s evidence that something as trivial as changing the path you use when you walk home from the subway can rewire your brain for the better. But beyond tweaking your habit trail, there are more meaningful pursuits you might try, or adopt.
Two years ago, while on a fellowship that cut my teaching load in half and brought me from New York City to a bucolic liberal arts campus a couple of hours away, I had enough newfound headspace to write a piece for the New York Times and soon thereafter accepted an offer to launch Praxis here at Big Think.
I had no idea if I’d be able to keep up the writing while being a dad and a teacher and a runner, but I thought I’d give it a try. The experience has been busy, yes, but manageable, and a few months later I started blogging for The Economist as well.
Adding new activities to my plate—not just any activities, but stuff I really enjoyed doing and had some affinity for—seems to have given me a new source of energy, and sometimes when I’m exhausted I’m also, strangely, exhilarated.
Modern capitalist society bears part of the blame for generating generations of “monomaths.” A monomath, in Twigger’s words, is “a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests.”
You can’t have a modern economy without some degree of specialization, but taken too far the division of labor turns individuals, in Marx’s words, into automatons, “appendage[s] of the machine.” It’s the price we pay for our species’ relentless progress and ever-increasing gains in productivity:
For as soon as the distribution of labour comes into being, each man has a particular, exclusive sphere of activity, which is forced upon him and from which he cannot escape. He is a hunter, a fisherman, a herdsman, or a critical critic, and must remain so if he does not want to lose his means of livelihood. (from Marx’s German Ideology)
Does this conundrum sound familiar? You can raise your skeptical eyebrows, all my critical critics, about the plausibility or desirability of Marx’s alternative—my students certainly do—but close your eyes and imagine this for a second:
In communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, herdsman or critic.
This fixation of social activity, this consolidation of what we ourselves produce into an objective power above us, growing out of our control, thwarting our expectations, bringing to naught our calculations, is one of the chief factors in historical development up till now.
Few of us can dream of becoming such radical polymaths. (And some of us may consider this extreme de-specialization to be nightmarish.) But it undervalues our lives to willingly enter into mindless ruts. If you’re in a rut, at least be aware of the fact, and let it spur you to take some action. Take that sabbatical, if you are lucky enough to get one. Make stuff. Pursue a new interest. Learn a new language. Stop this, start that. Consider career changes, even if you don’t actually make one. Do something new. Come on, it’s good for you.
Why does life exist?
Posted by: adonis49 on: February 4, 2016
Why does life exist? Popular hypotheses credit a primordial soup, a bolt of lightning, and a colossal stroke of luck.
But if a provocative new theory is correct, luck may have little to do with it. Instead, according to the physicist proposing the idea, the origin and subsequent evolution of life follow from the fundamental laws of nature and “should be as unsurprising as rocks rolling downhill.”
From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat.
Jeremy England, an assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity.
The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that, under certain conditions, matter inexorably acquires the key physical attribute associated with life.
“You start with a random clump of atoms, and if you shine light on it for long enough, it should not be so surprising that you get a plant,” England said.
England’s theory is meant to underlie, rather than replace, Darwin’s theory of evolution by natural selection, which provides a powerful description of life at the level of genes and populations. “I am certainly not saying that Darwinian ideas are wrong,” he explained. “On the contrary, I am just saying that from the perspective of the physics, you might call Darwinian evolution a special case of a more general phenomenon.”
His idea, detailed in a paper and further elaborated in a talk he delivered at universities around the world, has sparked controversy among his colleagues, who see it as either tenuous or a potential breakthrough, or both.
England has taken “a very brave and very important step,” said Alexander Grosberg, a professor of physics at New York University who has followed England’s work since its early stages. The “big hope” is that he has identified the underlying physical principle driving the origin and evolution of life, Grosberg said.
“Jeremy is just about the brightest young scientist I ever came across,” said Attila Szabo, a biophysicist in the Laboratory of Chemical Physics at the National Institutes of Health who corresponded with England about his theory after meeting him at a conference. “I was struck by the originality of the ideas.”
Others, such as Eugene Shakhnovich, a professor of chemistry, chemical biology and biophysics at Harvard University, are not convinced. “Jeremy’s ideas are interesting and potentially promising, but at this point are extremely speculative, especially as applied to life phenomena,” Shakhnovich said.
England’s theoretical results are generally considered valid. It is his interpretation — that his formula represents the driving force behind a class of phenomena in nature that includes life — that remains unproven. But already, there are ideas about how to test that interpretation in the lab.
“He’s trying something radically different,” said Mara Prentiss, a professor of physics at Harvard who is contemplating such an experiment after learning about England’s work. “As an organizing lens, I think he has a fabulous idea. Right or wrong, it’s going to be very much worth the investigation.”
At the heart of England’s idea is the second law of thermodynamics, also known as the law of increasing entropy or the “arrow of time.”
Hot things cool down, gas diffuses through air, eggs scramble but never spontaneously unscramble; in short, energy tends to disperse or spread out as time progresses.
Entropy is a measure of this tendency, quantifying how dispersed the energy is among the particles in a system, and how diffuse those particles are throughout space. It increases as a simple matter of probability: There are more ways for energy to be spread out than for it to be concentrated.
Thus, as particles in a system move around and interact, they will, through sheer chance, tend to adopt configurations in which the energy is spread out.
Eventually, the system arrives at a state of maximum entropy called “thermodynamic equilibrium,” in which energy is uniformly distributed. A cup of coffee and the room it sits in become the same temperature, for example.
As long as the cup and the room are left alone, this process is irreversible. The coffee never spontaneously heats up again because the odds are overwhelmingly stacked against so much of the room’s energy randomly concentrating in its atoms.
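To make the counting argument concrete, here is a minimal sketch (my own illustration; the model, the numbers N and q, and the function name omega are all assumptions, not from the article). It counts the microstates of two toy “Einstein solids” sharing energy quanta, and shows how heavily the even split dominates:

```python
from math import comb

# Two identical toy solids, each with N oscillators, sharing q energy quanta.
N, q = 100, 100

def omega(q_A):
    # Stars-and-bars count: ways to place q_A quanta in solid A's N oscillators,
    # times the ways to place the remaining q - q_A quanta in solid B.
    return comb(q_A + N - 1, q_A) * comb(q - q_A + N - 1, q - q_A)

total = sum(omega(k) for k in range(q + 1))
print(f"P(all energy in one solid) = {omega(0) / total:.3e}")
print(f"P(even 50/50 split)        = {omega(q // 2) / total:.3e}")
```

Even at this tiny scale, the configuration with all the energy concentrated in one solid is astronomically less likely than the even split; for the roughly 10^23 particles in a real cup of coffee, the imbalance becomes a practical impossibility.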
Although entropy must increase over time in an isolated or “closed” system, an “open” system can keep its entropy low — that is, divide energy unevenly among its atoms — by greatly increasing the entropy of its surroundings. In his influential 1944 monograph “What Is Life?” the eminent quantum physicist Erwin Schrödinger argued that this is what living things must do. A plant, for example, absorbs extremely energetic sunlight, uses it to build sugars, and ejects infrared light, a much less concentrated form of energy. The overall entropy of the universe increases during photosynthesis as the sunlight dissipates, even as the plant prevents itself from decaying by maintaining an orderly internal structure.
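A back-of-the-envelope check, with illustrative numbers that are mine rather than the article’s: if a plant takes in energy Q as sunlight, which is radiated at roughly T_Sun ≈ 5800 K, and re-emits the same Q as infrared at roughly T_Earth ≈ 300 K, then

\[
\Delta S_{\mathrm{universe}} \;\approx\; \frac{Q}{T_{\mathrm{Earth}}} - \frac{Q}{T_{\mathrm{Sun}}} \;=\; Q\left(\frac{1}{300\ \mathrm{K}} - \frac{1}{5800\ \mathrm{K}}\right) \;>\; 0.
\]

The entropy dumped into the surroundings more than pays for the order the plant maintains internally.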
Life does not violate the second law of thermodynamics, but until recently, physicists were unable to use thermodynamics to explain why it should arise in the first place. In Schrödinger’s day, they could solve the equations of thermodynamics only for closed systems in equilibrium. In the 1960s, the Belgian physicist Ilya Prigogine made progress on predicting the behavior of open systems weakly driven by external energy sources (for which he won the 1977 Nobel Prize in chemistry). But the behavior of systems that are far from equilibrium, which are connected to the outside environment and strongly driven by external sources of energy, could not be predicted.
This situation changed in the late 1990s, due primarily to the work of Chris Jarzynski, now at the University of Maryland, and Gavin Crooks, now at Lawrence Berkeley National Laboratory.
Jarzynski and Crooks showed that the entropy produced by a thermodynamic process, such as the cooling of a cup of coffee, corresponds to a simple ratio: the probability that the atoms will undergo that process divided by their probability of undergoing the reverse process (that is, spontaneously interacting in such a way that the coffee warms up). As entropy production increases, so does this ratio: A system’s behavior becomes more and more “irreversible.” The simple yet rigorous formula could in principle be applied to any thermodynamic process, no matter how fast or far from equilibrium.

“Our understanding of far-from-equilibrium statistical mechanics greatly improved,” Grosberg said.

England, who is trained in both biochemistry and physics, started his own lab at MIT two years ago and decided to apply the new knowledge of statistical physics to biology.
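In symbols, this ratio is usually written as a fluctuation relation. A sketch in standard notation (the symbols here are my shorthand, not the article’s):

\[
\Delta S_{\mathrm{tot}} \;=\; k_B \ln \frac{P[x(t)]}{P[\tilde{x}(t)]}
\]

Here P[x(t)] is the probability of the forward trajectory (the coffee cooling) and P[\tilde{x}(t)] the probability of its time-reverse (the coffee spontaneously warming). The more entropy a process produces, the more lopsided the ratio and the more irreversible the process.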
Using Jarzynski and Crooks’ formulation, he derived a generalization of the second law of thermodynamics that holds for systems of particles with certain characteristics: The systems are strongly driven by an external energy source such as an electromagnetic wave, and they can dump heat into a surrounding bath. This class of systems includes all living things.

England then determined how such systems tend to evolve over time as they increase their irreversibility. “We can show very simply from the formula that the more likely evolutionary outcomes are going to be the ones that absorbed and dissipated more energy from the environment’s external drives on the way to getting there,” he said.

The finding makes intuitive sense: Particles tend to dissipate more energy when they resonate with a driving force, or move in the direction it is pushing them, and they are more likely to move in that direction than any other at any given moment.
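For the record, England’s bound takes roughly the following form in his 2013 paper on self-replication; treat this as a hedged reconstruction rather than a quotation:

\[
\beta \,\langle \Delta Q \rangle_{\mathrm{I}\to\mathrm{II}} \;+\; \ln\frac{\pi(\mathrm{II}\to\mathrm{I})}{\pi(\mathrm{I}\to\mathrm{II})} \;+\; \Delta S_{\mathrm{int}} \;\ge\; 0
\]

where β = 1/k_BT is the inverse temperature of the bath, ⟨ΔQ⟩ is the average heat released to the bath during the transition from coarse-grained state I to state II, the π’s are the forward and reverse transition probabilities, and ΔS_int is the change in the system’s internal entropy. The takeaway: the more heat a transition dissipates, the more strongly its reversal can be suppressed.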
“This means clumps of atoms surrounded by a bath at some temperature, like the atmosphere or the ocean, should tend over time to arrange themselves to resonate better and better…”
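Why does “resonating better” mean dissipating more? The textbook driven, damped oscillator makes the point (a standard illustration, not from England’s paper): for m\ddot{x} + m\gamma\dot{x} + m\omega_0^2 x = F_0 \cos\omega t, the steady-state average dissipated power is

\[
\langle P(\omega) \rangle \;=\; \frac{F_0^2}{2m}\,\frac{\gamma\,\omega^2}{(\omega_0^2-\omega^2)^2 + \gamma^2\omega^2},
\]

which peaks exactly at the resonant frequency ω = ω₀. A clump of matter that rearranges itself to resonate with its drive absorbs, and sheds, the most energy.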
Note: the original article is longer, but you got the gist: the more efficiently you capture energy from your surroundings and dissipate it as heat, the more likely you are to be a living, growing entity.