Posts Tagged ‘Maria Popova’
Are you a Procrastinator? A symptom, a cause, a delayed diminishing value?
Posted by: adonis49 on: January 22, 2018
Maria Popova posted:
Animated: Science of Procrastination and How to Manage It
This is where you insert the meta-joke about what else you’re actually supposed to be doing this very moment.
From AsapSCIENCE — who have previously brought us the scientific cure for hangovers, the neurobiology of orgasms, and how music enchants the brain — comes this illustrated explication of the science of procrastination and how to manage it.
A fine addition to these five perspectives on procrastination.
Among the proposed solutions is the Pomodoro technique, a time-management method similar to time boxing that uses timed intervals of work and reward.
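The alternating work-and-reward structure of the Pomodoro technique can be sketched in a few lines. This is a minimal illustration, assuming the commonly cited defaults of 25-minute work intervals and 5-minute breaks, which the text itself does not specify:

```python
# A minimal sketch of a Pomodoro-style schedule: fixed work intervals
# punctuated by short breaks. Interval lengths (25/5 minutes) are the
# commonly cited defaults, assumed here for illustration.

def pomodoro_schedule(cycles, work_min=25, break_min=5):
    """Return the (label, minutes) sequence for `cycles` work intervals."""
    schedule = []
    for i in range(cycles):
        schedule.append((f"work {i + 1}", work_min))
        if i < cycles - 1:  # no break needed after the final interval
            schedule.append((f"break {i + 1}", break_min))
    return schedule

print(pomodoro_schedule(2))
# → [('work 1', 25), ('break 1', 5), ('work 2', 25)]
```

The point of the fixed intervals is precisely the one made below: they move the reward close enough in time that motivation does not collapse.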
Human motivation is highly influenced by how imminent the reward is perceived to be.
Meaning, the further away the reward is, the more you discount its value.
This is often referred to as present bias, or hyperbolic discounting.
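Hyperbolic discounting is usually written V = A / (1 + kD), where A is the reward, D the delay, and k a discount rate. A minimal sketch of the preference reversal it predicts, with an assumed illustrative rate of k = 0.1 per day and made-up dollar amounts:

```python
# A minimal sketch of hyperbolic discounting (present bias).
# Standard form: V = A / (1 + k*D). The rate k = 0.1/day and the
# amounts below are illustrative assumptions, not figures from the text.

def hyperbolic_value(amount, delay_days, k=0.1):
    """Perceived value of a reward `delay_days` away."""
    return amount / (1 + k * delay_days)

# $50 now beats $100 in a month...
assert hyperbolic_value(50, 0) > hyperbolic_value(100, 30)    # 50.0 > 25.0

# ...but push both options roughly a year out and the preference
# reverses: now the larger, later reward wins.
assert hyperbolic_value(100, 365) > hyperbolic_value(50, 335)  # ~2.67 > ~1.45
```

The reversal is the signature of the hyperbolic curve: near-term delays are discounted far more steeply than distant ones, which is why the same trade-off feels different today than it does viewed from a year away.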
For a more metaphysical take on the subject, see the fantastic anthology The Thief of Time: Philosophical Essays on Procrastination.
Hannah Arendt: on Science, Value of Space Exploration, Human Condition
Posted by: adonis49 on: May 10, 2017
Hannah Arendt on Science, the Value of Space Exploration, and How Our Cosmic Aspirations Illuminate the Human Condition
A case against human solipsism
A clarion call for non-egocentric curiosity about the nature of reality.
By Maria Popova
“Who indeed will set bounds to human ingenuity?” Galileo asked in his magnificent letter to the Grand Duchess of Tuscany as he dethroned the human animal from the center of the universe. “Who will assert that everything in the universe capable of being perceived is already discovered and known?”
Half a millennium later, as we continue to make revolutionary discoveries that invite us to revise our understanding of the cosmos and reassess our place in it — discoveries like the detection of gravitational waves, perhaps the greatest breakthrough in astronomy since Galileo pointed his telescope at the heavens — we continue to struggle with the same discomfiting questions: How are we to live with any sense of importance and meaning if the more we find out about the universe, the less significant we seem to be and the more meaningless it becomes? What, then, is the human and humane value of knowing more at all?
That’s what Hannah Arendt (October 14, 1906–December 4, 1975) addresses with great subtlety and uncompromising intellectual rigor in a 1963 essay titled “The Conquest of Space and the Stature of Man,” later included in her altogether spectacular and timely book Between Past and Future: Eight Exercises in Political Thought (public library).
The essay’s title was inspired by a question posed by the editors of the magazine Great Ideas Today for a special feature focusing on “what the exploration of space is doing to man’s view of himself and to man’s condition” —
the question of whether humanity’s so-called conquest of space has increased or diminished the existential stature of human beings.

Five years after she weighed the difference between how art and science illuminate the human condition, Arendt writes:
To understand physical reality seems to demand not only the renunciation of an anthropocentric or geocentric world view, but also a radical elimination of all anthropomorphic elements and principles, as they arise either from the world given to the five human senses or from the categories inherent in the human mind.
The question assumes that man is the highest being we know of, an assumption which we have inherited from the Romans, whose humanitas was so alien to the Greeks’ frame of mind that they had not even a word for it. (The reason for the absence of the word humanitas from Greek language and thought was that the Greeks, in contrast to the Romans, never thought that man is the highest being there is. Aristotle calls this belief atopos, “absurd.”)
This view of man is even more alien to the scientist, to whom man is no more than a special case of organic life and to whom man’s habitat — the earth, together with earthbound laws — is no more than a special borderline case of absolute, universal laws, that is, laws that rule the immensity of the universe.
Surely the scientist cannot permit himself to ask: What consequences will the result of my investigations have for the stature (or, for that matter, for the future) of man? It has been the glory of modern science that it has been able to emancipate itself completely from all such anthropocentric, that is, truly humanistic, concerns.
[…]
For the scientist, man is no more than an observer of the universe in its manifold manifestations. The progress of modern science has demonstrated very forcefully to what an extent this observed universe, the infinitely small no less than the infinitely large, escapes not only the coarseness of human sense perception but even the enormously ingenious instruments that have been built for its refinement.
Although science is, as astrophysicist Janna Levin has memorably noted, “a truly human endeavor,” Arendt argues that the task of the scientist is to stand outside and beyond human solipsism; that setting out to answer such questions as what man’s stature should be, how we differ from other animals, and why we pursue knowledge at all would shackle science to constraining concerns, to a sort of smallness of curiosity. She reflects on the paradox of such questions:
All answers … whether they come from laymen or philosophers or scientists, are non-scientific (although not anti-scientific); they can never be demonstrably true or false. Their truth resembles rather the validity of agreements than the compelling validity of scientific statements.
Even when the answers are given by philosophers whose way of life is solitude, they are arrived at by an exchange of opinions among many men, most of whom may no longer be among the living. Such truth can never command general agreement, but it frequently outlasts the compellingly and demonstrably true statements of the sciences which, especially in recent times, have the uncomfortable inclination never to stay put, although at any given moment they are, and must be, valid for all.
In other words, notions such as life, or man, or science, or knowledge are pre-scientific by definition, and the question is whether or not the actual development of science which has led to the conquest of terrestrial space and to the invasion of the space of the universe has changed these notions to such an extent that they no longer make sense.
So if science ought to be concerned with questions far beyond the human scale, free of human ego, then the very notion of the “conquest” of space and man’s “stature” implies a sort of hunger for power antithetical to the real enterprise of science.
Fifteen years before the pioneering scientist Erwin Chargaff made his beautiful case for the poetics of curiosity, she considers the true animating force of scientists — amplified access to what Einstein famously called the human “passion for comprehension.” Arendt writes:
It is, I think, safe to say that nothing was more alien to the minds of the scientists, who brought about the most radical and most rapid revolutionary process the world has ever seen, than any will to power. Nothing was more remote than any wish to “conquer space” and to go to the moon…
It was indeed their search for “true reality” that led them to lose confidence in appearances, in the phenomena as they reveal themselves of their own accord to human sense and reason. They were inspired by an extraordinary love of harmony and lawfulness which taught them that they would have to step outside any merely given sequence or series of occurrences if they wanted to discover the overall beauty and order of the whole, that is, the universe.
[…]
It is, in fact, quite obvious that the scientists’ strongest intellectual motivation was Einstein’s “striving after generalization,” and that if they appealed to power at all, it was the interconnected formidable power of abstraction and imagination.
She turns to the particular case of space exploration and its immense humanizing value in enlarging not only our knowledge but our humility:
The magnitude of the space enterprise seems to me beyond dispute, and all objections raised against it on the purely utilitarian level — that it is too expensive, that the money were better spent on education and the improvement of the citizens, on the fight against poverty and disease, or whatever other worthy purposes may come to mind — sound to me slightly absurd, out of tune with the things that are at stake and whose consequences today appear still quite unpredictable.
There is, moreover, another reason why I think these arguments are beside the point. They are singularly inapplicable because the enterprise itself could come about only through an amazing development of man’s scientific capabilities. The very integrity of science demands that not only utilitarian considerations but the reflection upon the stature of man as well be left in abeyance.
Has not each of the advances of science, since the time of Copernicus, almost automatically resulted in a decrease in his stature? And is the often repeated argument that it was man who achieved his own debasement in his search for truth, thus proving anew his superiority and even increasing his stature, more than a sophism? Perhaps it will turn out that way.
At any event, man, insofar as he is a scientist, does not care about his own stature in the universe or about his position on the evolutionary ladder of animal life; this “carelessness” is his pride and his glory.
Complement this particular portion of Arendt’s altogether indispensable Between Past and Future with physicist Sean Carroll on how “poetic naturalism” helps us wrest meaning from an impartial universe, then revisit Arendt on the crucial difference between truth and meaning, the power of being an outsider, how tyrants use isolation as a weapon of oppression, and our only effective antidote to the normalization of evil.
Lucid Dreaming: Can it be Controlled? A science to back it up?
Posted by: adonis49 on: March 17, 2017
A week ago, I watched a documentary on lucid dreaming — the ability to be aware that you are dreaming and to decide to alter the course of events in the dream to satisfy your desires…
One handicap in the research is the rarity of finding lucid dreamers willing to submit to the experiments.
Occasionally, I am aware that I am dreaming and do my best to alter the course of the dream, particularly when it is a horror movie or a re-run that I try to cut short. Generally, this attempt to control my dream ends with my waking up, and I continue to edit the dream lying in bed, eyes closed.
The Science of Lucid Dreaming and How to Learn to Control Your Dreams, Animated
Maria Popova posted in Brain Pickings
Trekking the continuum of sleep and wakefulness in a journey into meta-consciousness.
As if the science of sleep and the emotional function of dreaming weren’t fascinating enough in and of themselves, things get even more bewildering when it comes to lucid dreaming — a dream state in which you’re able to manipulate the plot of the dream and your experience in it.
But how, exactly, does that work and can you train yourself to do it? Count on AsapSCIENCE — who have previously explored such mysteries as how music enchants the brain, the neurobiology of orgasms, and the science of procrastination — to shed some light:
Everybody has 3-7 dreams a night — the problem is, we quickly forget them.
(Then again, the probability that you are dreaming this very minute might be one in ten, so it might all be moot.)
For a deeper dive into the scientific nitty-gritty of lucid dreaming, see Stephen LaBerge and Howard Rheingold’s 1991 bible Exploring the World of Lucid Dreaming and LaBerge’s follow-up, Lucid Dreaming: A Concise Guide to Awakening in Your Dreams and in Your Life.
Treat yourself to this fantastic and mind-bending Radiolab episode about how one man cured himself of a recurring nightmare by learning lucid dreaming:
Is the Art of Timing different from the Art of Waiting? How to take pleasure in the Present?
Posted by: adonis49 on: March 15, 2017
The Art of Timing:
How to take pleasure in the Present?
Alan Watts on the Perils of Hurrying and the Pleasures of Presence
by Maria Popova
“For the perfect accomplishment of any art, you must get this feeling of the eternal present into your bones — for it is the secret of proper timing.”
Among the things that made British philosopher Alan Watts not only the pioneer of Zen teachings in the West but also an enduring sage of the ages was his ability to call out our culture’s chronic tendency to confuse things of substance with their simulacra.
Watts had a singular way of dispersing our illusory convictions about such pairings, whether he addressed belief vs. faith or money vs. wealth or productivity vs. presence or ego vs. true self or stimulation vs. wisdom or profit vs. purpose.
In one particularly poignant passage in his altogether soul-expanding 1970 anthology Does It Matter? Essays on Man’s Relation to Materiality (public library), Watts considers another such infinitely important duality — the notions of hurrying and timing.
Echoing Seneca’s ideas about busyness and Bertrand Russell’s famous lament — “What will be the good of the conquest of leisure and health, if no one remembers how to use them?” — Watts considers how we cheat ourselves of the joys of the present moment by grasping after the potential rewards of the future:
Just exactly what is the “good” to which we aspire through doing and eating things that are supposed to be good for us?
This question is strictly taboo, for if it were seriously investigated the whole economy and social order would fall apart and have to be reorganized. It would be like the donkey finding out that the carrot dangled before him, to make him run, is hitched by a stick to his own collar.
For the good to which we aspire exists only and always in the future. Because we cannot relate to the sensuous and material present we are most happy when good things are expected to happen, not when they are happening.
We get such a kick out of looking forward to pleasures and rushing ahead to meet them that we can’t slow down enough to enjoy them when they come.
We are therefore a civilization which suffers from chronic disappointment — a formidable swarm of spoiled children smashing their toys.
In a sentiment that calls to mind Mary Oliver’s thoughts on rhythm, Watts speaks to our one saving grace in countering the momentum of this headfirst rush toward disappointment:
There is indeed such a thing as “timing” — the art of mastering rhythm — but timing and hurrying are … mutually exclusive.
Much of our perilous hurrying, Watts argues, comes from the tyranny of the clock — a paradoxical pathology all the more anguishing given how relative and elastic time actually is. Watts writes:
Clock time is merely a method of measurement held in common by all civilized societies, and has the same kind of reality (or unreality) as the imaginary lines of latitude and longitude. (Until they are drawn on maps and let us measure a location?)
The equator is useless for stringing a rolled roast. To judge by the clock, the present moment is nothing but a hairline which, ideally, should have no width at all — except that it would then be invisible.
If you are bewitched by the clock you will therefore have no present.
“Now” will be no more than the geometrical point at which the future becomes the past. But if you sense and feel the world materially, you will discover that there never is, or was, or will be anything except the present.
Presence, of course, is essential to our ability to experience the “spiritual electricity” of creative flow, something Watts captures unambiguously:
For the perfect accomplishment of any art, you must get this feeling of the eternal present into your bones — for it is the secret of proper timing. No rush. No dawdle.
Just the sense of flowing with the course of events in the same way that you dance to music, neither trying to outpace it nor lagging behind. Hurrying and delaying are alike ways of trying to resist the present.
Does It Matter? is a superb read in its entirety.
Complement it with Watts on how to live with presence, Sam Harris on cultivating mindful living, and Frank Partnoy on the art of waiting, then revisit Annie Dillard’s ever-timely reminder that how we spend our days is how we spend our lives.
Note: Without a project set in the future, well designed and planned, how can we enjoy the moment of its progress?
Montaigne on “Curation,” the Illusion of Originality, and How We Form Our Opinions
by Maria Popova
“I have gathered a posy of other men’s flowers, and nothing but the thread that binds them is mine own.”
I often think of reading not as the acquisition of static knowledge but as the active springboard for thinking and dynamic contemplation — hence the combinatorial, LEGO-like nature of creativity, wherein we assemble building blocks of existing knowledge into new formations of understanding that we consider our original ideas.
But long before our contemporary conceptions of how creativity works, French Renaissance polymath and proto-blogger Michel de Montaigne (February 28, 1533–September 13, 1592) articulated this magpielike quality of the mind, so very central to ideation.
In Michel de Montaigne: The Complete Essays (public domain; public library) — the same indispensable volume that gave us the great philosopher’s ideas on death and the art of living — he writes:
A competent reader often discovers in other men’s writings other perfections than the author himself either intended or perceived, a richer sense and more quaint expression.
Half a millennium before Mark Twain proclaimed that “substantially all ideas are second-hand” and long before we drained the term “curation” of meaning by compulsive and indiscriminate application, Montaigne observed:
I have gathered a posy of other men’s flowers, and nothing but the thread that binds them is mine own.
But what makes Montaigne’s meditation so incisive — and such an urgently necessary fine-tuning of how we think of “curation” today — is precisely the emphasis on the thread.
This assemblage of existing ideas, he argues, is nothing without the critical thinking of the assembler — the essential faculty examining those ideas to sieve the meaningful from the meaningless, assimilating them into one’s existing system of knowledge, and metabolizing them to nurture a richer understanding of the world. Montaigne writes:
We take other men’s knowledge and opinions upon trust; which is an idle and superficial learning. We must make it our own.
We are in this very like him, who having need of fire, went to a neighbor’s house to fetch it, and finding a very good one there, sat down to warm himself without remembering to carry any with him home…
What good does it do us to have the stomach full of meat, if it do not digest, if it be not incorporated with us, if it does not nourish and support us?
Three centuries later, Thoreau — another of humanity’s most quotable and overquoted minds — made a similar point about the perils of mindlessly parroting the ideas of those who came before us, which produces only simulacra of truth.
The mindful reflection and expansion upon existing ideas and views, on the other hand, is a wholly different matter — it is the path via which we arrive at more considered opinions of our own, cultivate our critical faculties, and inch closer to truth itself.
Montaigne writes:
Aristotle ordinarily heaps up a great number of other men’s opinions and beliefs, to compare them with his own, and to let us see how much he has gone beyond them, and how much nearer he approaches to the likelihood of truth;
for truth is not to be judged by the authority and testimony of others; which made Epicurus religiously avoid quoting them in his writings. This is the prince of all dogmatists, and yet we are told by him that the more we know the more we have room for doubt.
Complement Montaigne’s Complete Essays — a timeless trove of wisdom on such diverse facets of existence as happiness, education, fear, and the imagination — with his enduring wisdom on how to live and Salvador Dalí’s rare and whimsical illustrations for his essays.
On Storytelling: Susan Sontag
Posted by: adonis49 on: October 10, 2016
What It Means to Be a Moral Human Being
Susan Sontag (January 16, 1933–December 28, 2004) spent a lifetime contemplating the role of writing in both the inner world of the writer and the outer universe of readers, which we call culture: from her prolific essays and talks on the task of literature to her beautiful letter to Borges to her decades of reflections on writing recorded in her diaries.
But nowhere did she address the singular purpose of storytelling and the social responsibility of the writer with more piercing precision than in one of her last public appearances — a tremendous lecture on South African Nobel laureate Nadine Gordimer titled “At the Same Time: The Novelist and Moral Reasoning,” which Sontag delivered shortly before her death in 2004.
The speech is included in and lends its title to the endlessly enriching posthumous anthology At the Same Time: Essays and Speeches (public library), which also gave us Sontag on beauty vs. interestingness, courage and resistance, and literature and freedom.
Maria Popova posted:
Sontag begins with the quintessential question asked of, and answered by, all prominent writers — to distill their most essential advice on the craft:
I’m often asked if there is something I think writers ought to do, and recently in an interview I heard myself say: “Several things. Love words, agonize over sentences. And pay attention to the world.”
Needless to say, no sooner had these perky phrases fallen out of my mouth than I thought of some more recipes for writer’s virtue.
For instance: “Be serious.” By which I meant: Never be cynical. And which doesn’t preclude being funny.
What might Sontag say of the exponentially more exacting struggle against the cultural momentum of cynicism a mere decade later?
With the disclaimer that “descriptions mean nothing without examples,” Sontag points to Gordimer as the “living writer who exemplifies all that a writer can be” and considers what the South African author’s “large, ravishingly eloquent, and extremely varied body of work” reveals about the key to all great writing:
A great writer of fiction both creates — through acts of imagination, through language that feels inevitable, through vivid forms — a new world, a world that is unique, individual; and responds to a world, the world the writer shares with other people but is unknown or mis-known by still more people, confined in their worlds: call that history, society, what you will.
She cautions that despite all the noble uses of literature, despite all the ways in which it can transcend the written word to achieve a larger spiritual purpose — William Faulkner’s conviction that the writer’s duty is “to help man endure by lifting his heart” comes to mind — storytelling is still literature’s greatest duty:
The primary task of a writer is to write well. (And to go on writing well. Neither to burn out nor to sell out.) … Let the dedicated activist never overshadow the dedicated servant of literature — the matchless storyteller.
Echoing Walter Benjamin’s ideas on how storytelling transmutes information into wisdom — Sontag was a great admirer and rereader of his work — she adds:
To write is to know something. What a pleasure to read a writer who knows a great deal. (Not a common experience these days…) Literature, I would argue, is knowledge — albeit, even at its greatest, imperfect knowledge. Like all knowledge.
Still, even now, even now, literature remains one of our principal modes of understanding.
Everybody in our debauched culture invites us to simplify reality, to despise wisdom. There is a great deal of wisdom in Nadine Gordimer’s work. She has articulated an admirably complex view of the human heart and the contradictions inherent in living in literature and in history.
Nearly half a century after E.B. White proclaimed that the writer’s duty is “to lift people up, not lower them down,” Sontag considers “the idea of the responsibility of the writer to literature and to society” and clarifies the terms:
By literature, I mean literature in the normative sense, the sense in which literature incarnates and defends high standards.
By society, I mean society in the normative sense, too — which suggests that a great writer of fiction, by writing truthfully about the society in which she or he lives, cannot help but evoke (if only by their absence) the better standards of justice and of truthfulness that we have the right (some would say the duty) to militate for in the necessarily imperfect societies in which we live.
Obviously, I think of the writer of novels and stories and plays as a moral agent…
This doesn’t entail moralizing in any direct or crude sense. Serious fiction writers think about moral problems practically. They tell stories. They narrate. They evoke our common humanity in narratives with which we can identify, even though the lives may be remote from our own.
They stimulate our imagination. The stories they tell enlarge and complicate — and, therefore, improve — our sympathies. They educate our capacity for moral judgment.
In a sentiment that calls to mind French polymath Henri Poincaré’s assertion that creativity is the act of choosing the good ideas from among the bad ones, Sontag defines what a writer does and is:
Every writer of fiction wants to tell many stories, but we know that we can’t tell all the stories — certainly not simultaneously. We know we must pick one story, well, one central story; we have to be selective.
The art of the writer is to find as much as one can in that story, in that sequence … in that time (the timeline of the story), in that space (the concrete geography of the story).
A novelist, then, is someone who takes you on a journey. Through space. Through time. A novelist leads the reader over a gap, makes something go where it was not.
Time exists in order that everything doesn’t happen all at once … and space exists so that it doesn’t all happen to you.
[…]
The work of the novelist is to enliven time, as it is to animate space.
Repeating her memorable assertion that criticism is “cultural cholesterol,” penned in her diary decades earlier, Sontag considers the reactive indignation that passes for criticism:
Most notions about literature are reactive — in the hands of lesser talents, merely reactive.
The greatest offense now, in matters both of the arts and of culture generally, not to mention political life, is to seem to be upholding some better, more exigent standard, which is attacked, both from the left and the right, as either naïve or (a new banner for the philistines) “elitist.”
Writing nearly a decade before the golden age of ebooks and some years before the epidemic of crowdsourced-everything had infected nearly every corner of creative culture, Sontag once again reveals her extraordinary prescience about the intersection of technology, society, and the arts.
(Some decades earlier, she had presaged the “aesthetic consumerism” of visual culture on the social web.) Turning a critical eye to the internet and its promise — rather, its threat — of crowdsourced storytelling, she writes:
Hypertext — or should I say the ideology of hypertext? — is ultrademocratic and so entirely in harmony with the demagogic appeals to cultural democracy that accompany (and distract one’s attention from) the ever-tightening grip of plutocratic capitalism.
[But the] proposal that the novel of the future will have no story or, instead, a story of the reader’s (rather, readers’) devising is so plainly unappealing and, should it come to pass, would inevitably bring about not the much-heralded death of the author but the extinction of the reader — all future readers of what is labeled as “literature.”
Returning to the writer’s crucial task of selecting what story to tell from among all the stories that could be told, Sontag points to literature’s essential allure — the comfort of appeasing our anxiety about life’s infinite possibility, about all the roads not taken and all the immensities not imagined that could have led to a better destination than our present one.
A story, instead, offers the comforting finitude of both time and possibility:
Every fictional plot contains hints and traces of the stories it has excluded or resisted in order to assume its present shape. Alternatives to the plot ought to be felt up to the last moment. These alternatives constitute the potential for disorder (and therefore of suspense) in the story’s unfolding.
Endings in a novel confer a kind of liberty that life stubbornly denies us: to come to a full stop that is not death and discover exactly where we are in relation to the events leading to a conclusion.
The pleasure of fiction is precisely that it moves to an ending. And an ending that satisfies is one that excludes. Whatever fails to connect with the story’s closing pattern of illumination the writer assumes can be safely left out of the account.
A novel is a world with borders. For there to be completeness, unity, coherence, there must be borders. Everything is relevant in the journey we take within those borders.
One could describe the story’s end as a point of magical convergence for the shifting preparatory views: a fixed position from which the reader sees how initially disparate things finally belong together.
Once again echoing Walter Benjamin’s wise discrimination between storytelling and information, Sontag considers the two contrasting models “competing for our loyalty and attention”:
There is an essential … distinction between stories, on the one hand, which have, as their goal, an end, completeness, closure, and, on the other hand, information, which is always, by definition, partial, incomplete, fragmentary.
For Sontag, these two modes of world-building are best exemplified by the dichotomy between literature and the commercial mass media.
Writing in 2004, she saw television as the dominant form of the latter, but it’s striking to consider how true her observations hold today if we substitute “the internet” for every mention of “television.”
One can only wonder what Sontag would make of our newsfeed-fetishism and our compulsive tendency to mistake the latest and most urgent for the most important. She writes:
Literature tells stories. Television gives information.
Literature involves. It is the re-creation of human solidarity. Television (with its illusion of immediacy) distances — immures us in our own indifference.
The so-called stories that we are told on television satisfy our appetite for anecdote and offer us mutually canceling models of understanding.
(This is reinforced by the practice of punctuating television narratives with advertising.) They implicitly affirm the idea that all information is potentially relevant (or “interesting”), that all stories are endless — or if they do stop, it is not because they have come to an end but, rather, because they have been upstaged by a fresher or more lurid or eccentric story.
By presenting us with a limitless number of nonstopped stories, the narratives that the media relate — the consumption of which has so dramatically cut into the time the educated public once devoted to reading — offer a lesson in amorality and detachment that is antithetical to the one embodied by the enterprise of the novel.
Indeed, this notion of moral obligation is what Sontag sees as the crucial differentiator between storytelling and information — something I too have tussled with, a decade later, in contemplating the challenge of cultivating wisdom in the age of information, particularly in a media landscape driven by commercial interest whose very business model is predicated on conditioning us to confuse information with meaning.
(Why think about what constitutes a great work of art — how it moves you, what it says to your soul — when you can skim the twenty most expensive paintings in history on a site like Buzzfeed?)
Sontag, who had admonished against reducing culture to “content” half a century before the term became the currency of said commercial media, writes:
In storytelling as practiced by the novelist, there is always … an ethical component. This ethical component is not the truth, as opposed to the falsity of the chronicle.
It is the model of completeness, of felt intensity, of enlightenment supplied by the story, and its resolution — which is the opposite of the model of obtuseness, of non-understanding, of passive dismay, and the consequent numbing of feeling, offered by our media-disseminated glut of unending stories.
Television gives us, in an extremely debased and untruthful form, a truth that the novelist is obliged to suppress in the interest of the ethical model of understanding peculiar to the enterprise of fiction: namely, that the characteristic feature of our universe is that many things are happening at the same time.
(“Time exists in order that it doesn’t happen all at once… space exists so that it doesn’t all happen to you.”)
And therein lies Sontag’s greatest, most timeless, most urgently timely point — for writers, and for human beings:
To tell a story is to say: this is the important story. It is to reduce the spread and simultaneity of everything to something linear, a path.
To be a moral human being is to pay, be obliged to pay, certain kinds of attention.
When we make moral judgments, we are not just saying that this is better than that. Even more fundamentally, we are saying that this is more important than that. It is to order the overwhelming spread and simultaneity of everything, at the price of ignoring or turning our backs on most of what is happening in the world.
The nature of moral judgments depends on our capacity for paying attention — a capacity that, inevitably, has its limits but whose limits can be stretched.
But perhaps the beginning of wisdom, and humility, is to acknowledge, and bow one’s head, before the thought, the devastating thought, of the simultaneity of everything, and the incapacity of our moral understanding — which is also the understanding of the novelist — to take this in.
At the Same Time is an indispensable read in its entirety — an eternally nourishing serving of wisdom from one of the most expansive and luminous minds humanity ever produced.
Complement it with Sontag on love, art, how polarities imprison us, why lists appeal to us, and the joy of rereading beloved books, then revisit this evolving archive of celebrated writers’ advice on writing.
Sometimes I’m sad and I don’t know why. Michael Rosen’s children’s Sad Book
Posted by: adonis49 on: August 22, 2016
Michael Rosen’s Sad Book:
Anatomy of Loss, Illustrated by Quentin Blake
“Sometimes I’m sad and I don’t know why.
It’s just a cloud that comes along and covers me up.”
By Maria Popova
“Grief, when it comes, is nothing like we expect it to be,” Joan Didion wrote after losing the love of her life. “The people we most love do become a physical part of us,” Meghan O’Rourke observed in her magnificent memoir of loss, “ingrained in our synapses, in the pathways where memories are created.”
Those wildly unexpected dimensions of grief and the synaptic traces of love are what celebrated British children’s book writer and poet Michael Rosen confronted when his 18-year-old son Eddie died suddenly of meningitis.
Never-ending though the process of mourning may be, Rosen set out to exorcise its hardest edges and subtlest shapes five years later in Michael Rosen’s Sad Book (public library) — an immensely moving addition to the finest children’s books about loss, illustrated by none other than the great Quentin Blake.
With extraordinary emotional elegance, Rosen welcomes the layers of grief, each unmasking a different shade of sadness — sadness that sneaks up on you mid-stride in the street; sadness that lurks as a backdrop to the happiest of moments; sadness that wraps around you like a shawl you don’t take off even in the shower.
What emerges is a breathtaking bow before the central paradox of the human experience — the awareness that the heart’s enormous capacity for love is matched with an equal capacity for pain, and yet we love anyway and somehow find fragments of that love even amid the ruins of loss.
This is me being sad.
Maybe you think I’m happy in this picture.
Really I’m sad but pretending I’m happy.
I’m doing this because I think people won’t like me if I look sad.
Sometimes sad is very big.
It’s everywhere. All over me.
Then I look like this.
And there’s nothing I can do about it.
What makes me most sad is when I think about my son Eddie. I loved him very much but he died anyway.
With exquisite nuance, Rosen captures the contradictory feelings undergirding mourning — affection and anger, self-conscious introspection and longing for communion — and the way loss lodges itself in the psyche so that the vestiges of a particular loss always awaken the sadness of all loss, that perennial heartbreak of beholding the absurdity of our longing for permanence in a universe of constant change.
Sometimes this makes me really angry.
I say to myself, “How dare he go and die like that?
How dare he make me sad?”
Eddie doesn’t say anything,
because he’s not here anymore.
Sometimes I want to talk about all this to someone.
Like my mum. But she’s not here anymore, either. So I can’t.
I find someone else. And I tell them all about it.
Sometimes I don’t want to talk about it.
Not to anyone. No one at all.
I just want to think about it on my own.
Because it’s mine. And no one else’s.
But what makes the story most singular and rewarding is that it refuses to indulge the cultural cliché of cushioning tragedy with the promise of a silver lining.
It is redemptive not in manufacturing redemption but in being true to the human experience — intensely, beautifully, tragically true.
Sometimes because I’m sad I do crazy things — like shouting in the shower…
Where is sad?
Sad is everywhere.
It comes along and finds you.
When is sad?
Sad is anytime.
It comes along and finds you.
Who is sad?
Sad is anyone.
It comes along and finds you.
Sometimes I’m sad and I don’t know why.
It’s just a cloud that comes along and covers me up.
It’s not because Eddie’s gone.
It’s not because my mum’s gone. It’s just because.
Blake, who has previously illustrated Sylvia Plath’s little-known children’s book and many of Roald Dahl’s stories, brings his unmistakably expressive sensibility to the book, here and there concretizing Rosen’s abstract words into visual vignettes that make you wonder what losses of his own he is holding in the mind’s eye as he draws.
Complement the absolutely breath-stopping Michael Rosen’s Sad Book with Oliver Jeffers’s The Heart and the Bottle and the Japanese masterpiece Little Tree, then revisit Joan Didion on grief.
What makes a great story? The Psychology behind a Great Story: Jerome Bruner
Posted by: adonis49 on: August 19, 2016
The Psychology of What Makes a Great Story
“The great writer’s gift to a reader is to make him a better writer.”
The product of storytelling is wisdom
By Maria Popova
“Stories,” Neil Gaiman asserted in his wonderful lecture on what makes stories last, “are genuinely symbiotic organisms that we live with, that allow human beings to advance.”
But what is the natural selection of these organisms — what makes the ones that endure fit for survival? What makes a great story?
That’s what the great Harvard psychologist Jerome Bruner (October 1, 1915–June 6, 2016), who revolutionized cognitive psychology and pioneered the modern study of creativity in the 1960s, explores in his 1986 essay collection Actual Minds, Possible Worlds (public library).

In an immensely insightful piece titled “Two Modes of Thought,” Bruner writes:
There are two modes of cognitive functioning, two modes of thought, each providing distinctive ways of ordering experience, of constructing reality. The two (though complementary) are irreducible to one another. Efforts to reduce one mode to the other or to ignore one at the expense of the other inevitably fail to capture the rich diversity of thought.
Each of the ways of knowing, moreover, has operating principles of its own and its own criteria of well-formedness.
They differ radically in their procedures for verification. A good story and a well-formed argument are different natural kinds. Both can be used as means for convincing another. Yet what they convince of is fundamentally different: arguments convince one of their truth, stories of their lifelikeness.
The one verifies by eventual appeal to procedures for establishing formal and empirical proof. The other establishes not truth but verisimilitude.

A story (allegedly true or allegedly fictional) is judged for its goodness as a story by criteria that are of a different kind from those used to judge a logical argument as adequate or correct.

Bruner notes that the Western scientific and philosophical worldview has been largely concerned with the question of how to know truth, whereas storytellers are concerned with the question of how to endow experience with meaning — a dichotomy Hannah Arendt addressed brilliantly more than a decade earlier in her 1973 Gifford Lecture on thinking vs. knowing and the crucial difference between truth and meaning.
One could go even further and argue, after Walter Benjamin, that the product of the analytical mode is information, whereas the product of storytelling is wisdom.
Bruner calls these two contrasting modes the paradigmatic or logico-scientific, characterized by a mathematical framework of analysis and explanation, and the narrative. Each, he argues, is animated by a different kind of imagination:
The imaginative application of the paradigmatic mode leads to good theory, tight analysis, logical proof, sound argument, and empirical discovery guided by reasoned hypothesis.
But paradigmatic “imagination” (or intuition) is not the same as the imagination of the novelist or poet. Rather, it is the ability to see possible formal connections before one is able to prove them in any formal way.
The imaginative application of the narrative mode leads instead to good stories, gripping drama, believable (though not necessarily “true”) historical accounts.
It deals in human-like intention and action and the vicissitudes and consequences that mark their course. It strives to put its timeless miracles into the particulars of experience, and to locate the experience in time and place.
[…]
In contrast to our vast knowledge of how science and logical reasoning proceed, we know precious little in any formal sense about how to make good stories.
Perhaps one of the reasons for this is that story must construct two landscapes simultaneously. One is the landscape of action, where the constituents are the arguments of action: agent, intention or goal, situation, instrument, something corresponding to a “story grammar.”
The other landscape is the landscape of consciousness: what those involved in the action know, think, or feel, or do not know, think, or feel.

Bruner considers the singular landscape of narrative:
Narrative deals with the vicissitudes of human intentions. And since there are myriad intentions and endless ways for them to run into trouble — or so it would seem — there should be endless kinds of stories. But, surprisingly, this seems not to be the case.
[…]
We would do well with as loose fitting a constraint as we can manage concerning what a story must “be” to be a story. And the one that strikes me as most serviceable is the one with which we began: narrative deals with the vicissitudes of intention.
But this matter of intention remains forever mediated by the reader’s interpretation.
What young Sylvia Plath observed of poetry — “Once a poem is made available to the public,” she told her mother, “the right of interpretation belongs to the reader.” — is true of all art and storytelling, whatever the medium.
Bruner considers how the psychology of this interpretation factors into the question of what makes a great story:
It will always be a moot question whether and how well a reader’s interpretation “maps” on an actual story, does justice to the writer’s intention in telling the story, or conforms to the repertory of a culture.
But in any case, the author’s act of creating a narrative of a particular kind and in a particular form is not to evoke a standard reaction but to recruit whatever is most appropriate and emotionally lively in the reader’s repertory.
So “great” storytelling, inevitably, is about compelling human plights that are “accessible” to readers.
But at the same time, the plights must be set forth with sufficient subjunctivity to allow them to be rewritten by the reader, rewritten so as to allow play for the reader’s imagination.
One cannot hope to “explain” the processes involved in such rewriting in any but an interpretive way, surely no more precisely, say, than an anthropologist “explains” what the Balinese cockfight means to those who bet on it…
All that one can hope for is to interpret a reader’s interpretation in as detailed and rich a way as psychologically possible.

This essential “subjunctivity” is the act of designating a mood for the story. “To be in the subjunctive mode,” Bruner explains, means “to be trafficking in human possibilities rather than in settled certainties.”
Out of this drive toward unsettled possibilities arises the ultimate question of “how a reader makes a strange text his own,” a question of “assimilating strange tales into the familiar dramas of our own lives, even more than transmuting our own dramas in the process” — something Bruner illustrates brilliantly with an exchange between Marco Polo and Kublai Khan from Italo Calvino’s masterwork Invisible Cities, which takes place after Marco Polo describes a bridge stone by stone:
“But which is the stone that supports the bridge?” Kublai Khan asks.
“The bridge is not supported by one stone or another,” Marco answers, “but by the line of the arch that they form.”
Kublai Khan remains silent, reflecting. Then he adds: “Why do you speak to me of the stones? It is only the arch that matters to me.”
Polo answers: “Without stones there is no arch.”
Bruner extracts from this an allegory of the key to great storytelling:
But still, it is not quite the arch. It is, rather, what arches are for in all the senses in which an arch is for something — for their beautiful form, for the chasms they safely bridge, for coming out on the other side of crossings, for a chance to see oneself reflected upside down yet right side up.
So a reader goes from stones to arches to the significance of arches in some broader reality — goes back and forth between them in attempting finally to construct a sense of the story, its form, its meaning.
As our readers read, as they begin to construct a virtual text of their own, it is as if they were embarking on a journey without maps — and yet, they possess a stock of maps that might give hints, and besides, they know a lot about journeys and about mapmaking.
First impressions of the new terrain are, of course, based on older journeys already taken. In time, the new journey becomes a thing in itself, however much its initial shape was borrowed from the past. The virtual text becomes a story of its own, its very strangeness only a contrast with the reader’s sense of the ordinary.
The fictional landscape, finally, must be given a “reality” of its own — the ontological step. It is then that the reader asks that crucial interpretive question, “What’s it all about?” But what “it” is, of course, is not the actual text — however great its literary power — but the text that the reader has constructed under its sway. And that is why the actual text needs the subjunctivity that makes it possible for a reader to create a world of his own.
Bruner concurs with Barthes’s conviction that the writer’s greatest gift to the reader is to help her become a writer, then revises it to clarify and amplify its ambition:
The great writer’s gift to a reader is to make him a better writer.
Actual Minds, Possible Worlds is a remarkable read in its totality, exploring the psychological realities of language, thought and emotion, and the self.
Complement this particular portion with Susan Sontag on the task of storytelling, Oliver Sacks on its curious psychology, and Martha Nussbaum on how it remaps our interior lives, then revisit Bruner on creative wholeness, art as a mode of knowing, and the six essential conditions for creativity.
Art of Living, Having vs. Being, Free from the Chains of Our Culture: Erich Fromm
Posted by: adonis49 on: August 14, 2016
The Art of Living:
The Great Humanistic Philosopher Erich Fromm on Having vs. Being,
and How to Set Ourselves Free from the Chains of Our Culture
“The full humanization of man requires the breakthrough from the possession-centered to the activity-centered orientation, from selfishness and egotism to solidarity and altruism.”
By Maria Popova
A pioneer of what he called “radical-humanistic psychoanalysis,” the great German social psychologist and philosopher Erich Fromm (March 23, 1900–March 18, 1980) was one of the most luminous minds of the twentieth century and a fountain of salve for the most abiding struggles of being human.
In the mid-1970s, twenty years after his influential treatise on the art of loving and four decades after legendary anthropologist Margaret Mead turned to him for difficult advice, Fromm became interested in the most basic, most challenging art of human life — the art of being.
At the height of a new era that had begun prioritizing products over people and consumption over creativity, Fromm penned a short, potent book titled To Have or To Be? — an inquiry into how the great promise of progress, seeded by the Industrial Revolution, failed us in our most elemental search for meaning and well-being.
But the question proved far too complex to tackle in a single volume, so Fromm left out a significant portion of his manuscript.
Those pages, in many ways even richer and more insightful than the original book, were later published as The Art of Being (public library) — a sort of field guide, all the timelier today, to how we can shift from the having mode of existence, which is systematically syphoning our happiness, to a being mode.
Fromm frames the inquiry:
The full humanization of man requires the breakthrough from the possession-centered to the activity-centered orientation, from selfishness and egotism to solidarity and altruism.
But any effort to outline the steps of this breakthrough, Fromm cautions, must begin with the foundational question of what the goal of living is — that is, what we consider the meaning of life to be, beyond its biological purpose. He writes:
It seems that nature — or if you will, the process of evolution — has endowed every living being with the wish to live, and whatever he believes to be his reasons are only secondary thoughts by which he rationalizes this biologically given impulse.
[…]
That we want to live, that we like to live, are facts that require no explanation. (Though experiments to confirm this given are still needed in our current times)
But if we ask how we want to live — what we seek from life, what makes life meaningful for us — then indeed we deal with questions (and they are more or less identical) to which people will give many different answers.
Some will say they want love, others will choose power, others security, others sensuous pleasure and comfort, others fame; but most would probably agree in the statement that what they want is happiness. This is also what most philosophers and theologians have declared to be the aim of human striving.
However, if happiness covers such different, and mostly mutually exclusive, contents as the ones just mentioned, it becomes an abstraction and thus rather useless. What matters is to examine what the term “happiness” means… (Happiness is a newly invented term)
Most definitions of happiness, Fromm observes, converge at some version of having our needs met and our wishes fulfilled — but this raises the question of what it is we actually want. (As Milan Kundera memorably wrote, “we can never know what to want.”)
It’s essentially a question about human nature — or, rather, about the interplay of nature and nurture mediated by norms. Adding to the vocabulary of gardening as a metaphor for understanding happiness and making sense of mastery, Fromm illustrates his point:
This is indeed well understood by any gardener. The aim of the life of a rosebush is to be all that is inherent as potentiality in the rosebush: that its leaves are well developed and that its flower is the most perfect rose that can grow out of this seed.
The gardener knows, then, in order to reach this aim he must follow certain norms that have been empirically found. The rosebush needs a specific kind of soil, of moisture, of temperature, of sun and shade.
It is up to the gardener to provide these things if he wants to have beautiful roses. But even without his help the rosebush tries to provide itself with the optimum of needs. It can do nothing about moisture and soil, but it can do something about sun and temperature by growing “crooked,” in the direction of the sun, provided there is such an opportunity. Why would not the same hold true for the human species?
Even if we had no theoretical knowledge about the reasons for the norms that are conducive to man’s optimal growth and functioning, experience tells us just as much as it tells the gardener.
Therein lies the reason that all great teachers of man have arrived at essentially the same norms for living, the essence of these norms being that the overcoming of greed, illusions, and hate, and the attainment of love and compassion, are the conditions for attaining optimal being.
Drawing conclusions from empirical evidence, even if we cannot explain the evidence theoretically, is a perfectly sound and by no means “unscientific” method, although the scientists’ ideal will remain, to discover the laws behind the empirical evidence. (Note: “empirical” means grounded in observation and experiment, the foundation of the sciences)
He distills the basic principle of life’s ultimate aim:
The goal of living [is] to grow optimally according to the conditions of human existence and thus to become fully what one potentially is; to let reason or experience guide us to the understanding of what norms are conducive to well-being, given the nature of man that reason enables us to understand.
But one of the essential ingredients of well-being, Fromm notes, has been gruesomely warped by capitalist industrial society — the idea of freedom and its attainment by the individual:
Liberation has been exclusively applied to liberation from outside forces; by the middle class from feudalism, by the working class from capitalism, by the peoples in Africa and Asia from imperialism.
Such external liberation, Fromm argues, is essentially political liberation — an inherently limiting pseudo-liberation, which can obscure the emergence of various forms of imprisonment and entrapment within the political system. He writes:
This is the case in Western democracy, where political liberation hides the fact of dependency in many disguises… Man can be a slave even without being put in chains…
The outer chains have simply been put inside of man. The desires and thoughts that the suggestion apparatus of society fills him with, chain him more thoroughly than outer chains. This is so because man can at least be aware of outer chains but be unaware of inner chains, carrying them with the illusion that he is free.
He can try to overthrow the outer chains, but how can he rid himself of chains of whose existence he is unaware?
Any attempt to overcome the possibly fatal crisis of the industrialized part of the world, and perhaps of the human race, must begin with the understanding of the nature of both outer and inner chains; it must be based on the liberation of man in the classic, humanist sense as well as in the modern, political and social sense…
The only realistic aim is total liberation, a goal that may well be called radical (or revolutionary) humanism.
The two most pernicious chains keeping us from liberation, Fromm observes, are our culture’s property-driven materialism and our individual intrinsic tendencies toward narcissism. He writes:
If “well-being” — [defined as] functioning well as a person, not as an instrument — is the supreme goal of one’s efforts, two specific ways stand out that lead to the attainment of this goal: Breaking through one’s narcissism and breaking through the property structure of one’s existence.
He offers the crispest definition of narcissism I’ve encountered (something that took Kafka a 47-page letter to articulate):
Narcissism is an orientation in which all one’s interest and passion are directed to one’s own person: one’s body, mind, feelings, interests… For the narcissistic person, only he and what concerns him are fully real; what is outside, what concerns others, is real only in a superficial sense of perception; that is to say, it is real for one’s senses and for one’s intellect. But it is not real in a deeper sense, for our feeling or understanding.
He is, in fact, aware only of what is outside, inasmuch as it affects him. Hence, he has no love, no compassion, no rational, objective judgment. The narcissistic person has built an invisible wall around himself. He is everything, the world is nothing. Or rather: He is the world.
But because narcissism can come in many guises, Fromm cautions, it can be particularly challenging to detect in oneself in order to then eradicate — and yet without doing so, “the further way to self-completion is blocked.”
A parallel peril to well-being comes from the egotism and selfishness seeded by our ownership-driven society, a culture that prioritizes having over being by making property its primary mode of existence. Fromm writes:
A person living in this mode is not necessarily very narcissistic. He may have broken through the shell of his narcissism, have an adequate appreciation of reality outside himself, not necessarily be “in love with himself”; he knows who he is and who the others are, and can well distinguish between subjective experience and reality.
Nevertheless, he wants everything for himself; has no pleasure in giving, in sharing, in solidarity, in cooperation, in love. He is a closed fortress, suspicious of others, eager to take and most reluctant to give.
Growth, he argues, requires a dual breakthrough — of narcissism and of property-driven existence. Although the first steps toward this breaking from bondage are bound to be anxiety-producing, this initial discomfort is but a paltry price for the larger rewards of well-being awaiting us on the other side of the trying transformation:
If a person has the will and the determination to loosen the bars of his prison of narcissism and selfishness, when he has the courage to tolerate the intermittent anxiety, he experiences the first glimpses of joy and strength that he sometimes attains. And only then a decisive new factor enters into the dynamics of the process.
This new experience becomes the decisive motivation for going ahead and following the path he has charted… [An] experience of well-being — fleeting and small as it may be — … becomes the most powerful motivation for further progress…
Awareness, will, practice, tolerance of fear and of new experience, they are all necessary if transformation of the individual is to succeed. At a certain point the energy and direction of inner forces have changed to the point where an individual’s sense of identity has changed, too. In the property mode of existence the motto is: “I am what I have.” After the breakthrough it is “I am what I do” (in the sense of unalienated activity); or simply, “I am what I am.”
In the remainder of The Art of Being, Fromm explores the subtleties and practicalities of enacting this transformation. Complement it with legendary social scientist John W. Gardner, a contemporary of Fromm’s, on the art of self-renewal, then revisit Fromm’s abiding wisdom on what is keeping us from mastering the art of love.
Culture and Costs of Anxiety? Global chronic stress? Or due to Globalization
Posted by: adonis49 on: August 11, 2016
The Culture and Costs of Anxiety
“Few people today would dispute that chronic stress is a hallmark of our times or that anxiety has become a kind of cultural condition of modernity.”
By Maria Popova
“Anxiety … makes others feel as you might when a drowning man holds on to you,” Anaïs Nin wrote.
“Anxiety may be compared with dizziness. He whose eye happens to look down the yawning abyss becomes dizzy,” Kierkegaard observed.
“There is no question that the problem of anxiety is a nodal point at which the most various and important questions converge, a riddle whose solution would be bound to throw a flood of light on our whole mental existence,” Freud proclaimed in his classic introductory lectures on psychoanalysis.
And yet the riddle of anxiety is far from solved — rather, it has swelled into a social malady pulling countless numbers of us underwater daily.
Among those most mercilessly fettered by anxiety’s grip is Scott Stossel, familiar to most as the editor of The Atlantic.
In his superb mental health memoir, My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind (public library | IndieBound), Stossel follows in the tradition of Montaigne to use the lens of his own experience as a prism for illuminating insight on the quintessence of our shared struggles with anxiety.
From his personal memoir he weaves a cultural one, painting a portrait of anxiety through history, philosophy, religion, popular culture, literature, and a wealth of groundbreaking research in psychology and neuroscience.

Why? Because anxiety and its related psychoemotional disorders turn out to be the most common, prevalent, and undertreated form of clinically classified mental illness today, even more common than depression. Stossel contextualizes the issue with some striking statistics that reveal the cost — both financial and social — of anxiety:
According to the National Institute of Mental Health, some 40 million Americans, nearly one in seven of us, are suffering from some kind of anxiety disorder at any given time, accounting for 31% of the expenditures on mental health care in the United States.
According to recent epidemiological data, the “lifetime incidence” of anxiety disorder is more than 25 percent — which, if true, means that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetimes.
And it is debilitating: Recent academic papers have argued that the psychic and physical impairment tied to living with an anxiety disorder is equivalent to living with diabetes — usually manageable, sometimes fatal, and always a pain to deal with.
A study published in The American Journal of Psychiatry in 2006 found that Americans lose a collective 321 million days of work because of anxiety and depression each year, costing the economy $50 billion annually;
a 2001 paper published by the U.S. Bureau of Labor Statistics once estimated that the median number of days missed each year by American workers who suffer from anxiety or stress disorders is twenty-five.
In 2005 — three years before the recent economic crisis hit — Americans filled 53 million prescriptions for just two antianxiety drugs: Ativan and Xanax.
(In the weeks after 9/11, Xanax prescriptions jumped 9 percent nationally — and by 22 percent in New York City.)
In September 2008, the economic crash caused prescriptions in New York City to spike: as banks went belly up and the stock market went into free fall, prescriptions for anti-depressant and antianxiety medications increased 9 percent over the year before, while prescriptions for sleeping pills increased 11 percent.
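The statistics above can be sanity-checked with a little arithmetic. A minimal sketch, assuming a rough US population of about 310 million (my own ballpark figure, not one quoted by Stossel):

```python
# Back-of-the-envelope checks on the statistics quoted above.
# All inputs except the population estimate come from the excerpt.
us_population = 310e6   # rough US population (my assumption)
anxious = 40e6          # NIMH estimate of Americans with an anxiety disorder

ratio = us_population / anxious
print(round(ratio, 1))  # 7.8 -> in the ballpark of "nearly one in seven"

lost_days = 321e6       # workdays lost annually to anxiety and depression
annual_cost = 50e9      # estimated annual cost to the economy, in dollars
print(round(annual_cost / lost_days))  # 156 -> about $156 per lost workday
```

The numbers hang together: 40 million sufferers out of roughly 310 million people is on the order of one in seven or eight, and $50 billion spread over 321 million lost workdays implies a cost of about $156 per day.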
To be sure, this isn’t a purely American phenomenon.
Stossel points to similar findings in Britain and Canada, for instance — which invites the facetious observation that perhaps speaking English is what is giving us anxiety. In all seriousness, however, the scale of the epidemic is nothing short of gobsmacking. Stossel writes:
Primary care physicians report that anxiety is one of the most frequent complaints driving patients to their offices — more frequent, by some accounts, than the common cold.
Few people today would dispute that chronic stress is a hallmark of our times or that anxiety has become a kind of cultural condition of modernity. We live, as has been said many times since the dawn of the atomic era, in an age of anxiety — and that, cliché though it may be, seems only to have become more true in recent years as America has been assaulted in short order by terrorism, economic calamity and disruption, and widespread social transformation.
Fittingly, Alan Watts’s The Wisdom of Insecurity: A Message for an Age of Anxiety, written in the very atomic era that sparked the dawn of our present predicament, remains one of the best meditations on the subject.
But, as Stossel points out, the notion of anxiety as a clinical category appeared only about thirty years ago. He traces anxiety’s rise to cultural fame through the annals of academic history, pointing out that there were only three academic papers published on the subject in 1927, fourteen in 1941, and thirty-seven in 1950.
It wasn’t until psychologist Rollo May published his influential treatise on anxiety in 1950 that academia paid heed. Today, a simple Google Scholar search returns nearly 3 million results, and entire academic journals are dedicated to anxiety.
But despite anxiety’s catapulting into cultural concern, our understanding of it — especially as far as mental health stereotypes are concerned — remains developmentally stunted, having evolved very little since the time of seventeenth-century Jewish-Dutch philosopher Baruch Spinoza, who asserted that anxiety was a mere problem of logic and could thus be resolved with tools of reason.
This is hardly different from present cognitive-behavioral therapy approaches or the all too common just-get-over-it cultural attitude towards this particular problem of mental health, which of course misses the debilitating dimensionality of what makes anxiety as crippling as it is. Looking back on his own messy lineage of Jewishness and antisemitism, and describing himself as “Woody Allen trapped in John Calvin,” Stossel counters such oversimplification with a case for the layered, complex causality of the disorder:
The truth is that anxiety is at once a function of biology and philosophy, body and mind, instinct and reason, personality and culture.
Even as anxiety is experienced at a spiritual and psychological level, it is scientifically measurable at the molecular level and the physiological level. It is produced by nature and it is produced by nurture.
It’s a psychological phenomenon and a sociological phenomenon. In computer terms, it’s both a hardware problem (I’m wired badly) and a software problem (I run faulty logic programs that make me think anxious thoughts).
The origins of a temperament are many faceted; emotional dispositions that may seem to have a simple, single source — a bad gene, say, or a childhood trauma — may not.
True to this complexity, different epochs have attributed anxiety to various causes. Stossel probes what these historical patterns reveal:
The differences in how various cultures and eras have perceived and understood anxiety can tell us a lot about those cultures and eras.
Why did the ancient Greeks of the Hippocratic school see anxiety mainly as a medical condition, while the Enlightenment philosophers saw it as an intellectual problem?
Why did the early existentialists see anxiety as a spiritual condition, while Gilded Age doctors saw it as a specifically Anglo-Saxon stress response — a response that they believed spared Catholic societies — to the Industrial Revolution?
Why did the early Freudians see anxiety as a psychological condition emanating from sexual inhibition, whereas our own age tends to see it, once again, as a medical and neurochemical condition, a problem of malfunctioning biomechanics?
Do these shifting interpretations represent the forward march of progress and science? Or simply the changing, and often cyclical, ways in which cultures work?
Today, the definition of Generalized Anxiety Disorder in the DSM-IV — the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders, the bible of psychotherapy — is at once rather strict and yet strangely horoscope-like in its ability to speak to things that feel uncomfortably familiar, at least on some level, to most of us:
Excessive anxiety about a number of events or activities, occurring more days than not, for at least 6 months. The person finds it difficult to control the worry. The anxiety and worry are associated with at least three of the following six symptoms (with at least some symptoms present for more days than not, for the past 6 months):
- Restlessness or feeling keyed up or on edge
- Being easily fatigued
- Difficulty concentrating or mind going blank
- Irritability
- Muscle tension
- Sleep disturbance
Of course, our relationship with anxiety and its potential cures is colored by our culture’s characteristic ambivalence. In fact, even Kierkegaard believed that anxiety was a necessary component of creativity, and that without it, genius would be incomplete. One of the experts Stossel interviews — David Barlow, the founder and director emeritus of Boston University’s Center for Anxiety and Related Disorders — echoes this sentiment:
Without anxiety, little would be accomplished. The performance of athletes, entertainers, executives, artisans, and students would suffer; creativity would diminish; crops might not be planted. And we would all achieve that idyllic state long sought after in our fast-paced society of whiling away our lives under a shade tree. This would be as deadly for the species as nuclear war.
The rest of the altogether excellent My Age of Anxiety goes on to explore such facets of our modern epidemic as what anxiety actually is, how early childhood experience might precipitate separation anxiety in adulthood, the promise and perils of antianxiety drugs, and what it takes to cultivate resilience, the trait modern psychology has identified as the most powerful immunization against anxiety and depression.
Complement it with Anaïs Nin’s illustrated meditation on anxiety and Alan Watts on how to heal the essential anxiety of existence.