Adonis Diaries


On modern warfare weapons: actual testing in many pre-emptive wars around the world

Back in 1997, Barbara Ehrenreich went after the human  attraction to violence in her book Blood Rites: Origins and History of the Passions of War.

In it, among other brilliant insights, she traced the beginnings of our  modern blood rites not to Man, the Aggressor, but to human beings, the  prey (in a dangerous early world of predators).

In an updated,  adapted version of an afterword she did for the British edition of that book, she turns from the origins of war to its end point, suggesting in her usual provocative way that drones and other warrior robotics may, in  the end, do us one strange favor: they may finally bring home to us that war is Not a human possession, that it is not what we are and must  be.

(To catch Timothy MacBain’s latest TomCast audio interview in which  Ehrenreich discusses the nature of war and how to fight against it,  click here, or download it to your iPod here.) Tom

War Without Humans: Modern Blood Rites Revisited
By Barbara Ehrenreich

For a book about the all-too-human “passions of war,” my 1997 work Blood Rites ended on a strangely inhuman note: I suggested that, whatever  distinctly human qualities war calls upon — honor, courage, solidarity,  cruelty, and so forth — it might be useful to stop thinking of war in  exclusively human terms.

After all, certain species of ants wage war  and computers can simulate “wars” that play themselves out on-screen  without any human involvement.

More generally, we should define war as a self-replicating  pattern of activity that may or may not require human participation.

In  the human case, we know it is capable of spreading geographically and  evolving rapidly over time — qualities that, as I suggested somewhat fancifully, make war a metaphorical successor to the predatory animals  that shaped humans into fighters in the first place.

A decade and a half later, these musings do not seem quite so airy  and abstract anymore. The trend, at the close of the twentieth century,  still seemed to be one of ever more massive human involvement in war —  from armies containing tens of thousands in the sixteenth century, to  hundreds of thousands in the nineteenth, and eventually millions in the  twentieth century world wars.

It was the ascending scale of war that originally called forth the existence of the nation-state as an administrative unit capable of maintaining mass armies and the infrastructure — for taxation, weapons manufacture, transport, etc. — that they require.

War has been, and we still expect it to be, the most massive collective project human beings undertake. But it has been evolving quickly in a very different direction, one in which human beings have a much smaller role to play.

One factor driving this change has been the emergence of a new kind of enemy, so-called “non-state actors,” meaning popular insurgencies and loose transnational networks of fighters, none of which are likely to field large numbers of troops or maintain expensive arsenals of their own.

In the face of these new enemies, typified by al-Qaeda, the mass armies of nation-states are highly ineffective, cumbersome to deploy, difficult to maneuver, and from a domestic point of view, overly dependent on a citizenry that is both willing and able to fight, or at least to have their children fight for them.

Yet just as U.S. military cadets continue, in defiance of military reality, to sport swords on their dress uniforms, our leaders, both military and political, tend to cling to an idea of war as a vast, labor-intensive effort on the order of World War II.

Only slowly, and with a reluctance bordering on the phobic, have the leaders of major states begun to grasp the fact that this approach to warfare may soon be obsolete.

Consider the most recent U.S. war with Iraq.

According to then-president George W. Bush, the casus belli was the 9/11 terror attacks.  The causal link between that event and our chosen enemy, Iraq, was, however, imperceptible to all but the most dedicated inside-the-Beltway intellectuals.

Nineteen men had hijacked airplanes and flown them into the Pentagon and the World Trade Center — 15 of them Saudi Arabians, none of them Iraqis — and we went to war against… Iraq?

Military history offers no ready precedents for such wildly misaimed retaliation. The closest analogies come from anthropology, which provides plenty of cases of small-scale societies in which the death of any member, for any reason, needs to be “avenged” by an attack on a more or less randomly chosen other tribe or hamlet.

Why Iraq?

Neoconservative imperial ambitions have been invoked in explanation, as well as the American thirst for oil, or even an Oedipal contest between George W. Bush and his father.

There is no doubt some truth to all of these explanations, but the targeting of Iraq also represented a desperate and irrational response to what was, for Washington, an utterly confounding military situation.

We faced a state-less enemy — geographically diffuse, lacking uniforms and flags, invulnerable to invading infantries and saturation bombing, and apparently capable of regenerating itself at minimal expense.

From the perspective of Secretary of Defense Donald Rumsfeld and his White House cronies, this would not do. (Meaning from Israel’s point of view, or that of the “Christian” Evangelical Zionists)

Since the U.S. was accustomed to fighting other nation-states — geopolitical entities containing such identifiable targets as capital cities, airports, military bases, and munitions plants — we would have to find a nation-state to fight, or as Rumsfeld put it, a “target-rich environment.”

Iraq, pumped up by alleged stockpiles of “weapons of mass destruction,” became the designated surrogate for an enemy that refused to play our game.

The effects of this atavistic war are still being tallied: in Iraq, we would have to include civilian deaths estimated at possibly hundreds of thousands, the destruction of civilian infrastructure, and devastating outbreaks of sectarian violence of a kind that, as we should have learned from the dissolution of Yugoslavia, can readily follow the death or removal of a nationalist dictator.

But the effects of war on the U.S. and its allies may end up being almost as tragic.

Instead of punishing the terrorists who had attacked the U.S., the war seems to have succeeded in recruiting more such irregular fighters, young men (and sometimes women) willing to die and ready to commit further acts of terror or revenge.

By insisting on fighting a more or less randomly selected nation-state, the U.S. may only have multiplied the non-state threats it faces.

Unwieldy Armies

Whatever they may think of what the U.S. and its allies did in Iraq, many national leaders are beginning to acknowledge that conventional militaries are becoming, in a strictly military sense, almost ludicrously anachronistic. Not only are they unsuited to crushing counterinsurgencies and small bands of terrorists or irregular fighters, but mass armies are simply too cumbersome to deploy on short notice.

In military lingo, they are weighed down by their “tooth to tail” ratio — a measure of the number of actual fighters in comparison to the support personnel and equipment the fighters require. Both hawks and liberal interventionists may hanker to airlift tens of thousands of soldiers to distant places virtually overnight, but those soldiers will need to be preceded or accompanied by tents, canteens, trucks, medical equipment, and so forth.

“Flyover” rights will have to be granted by neighboring countries; air strips and eventually bases will have to be constructed; supply lines will have be created and defended — all of which can take months to accomplish.

The sluggishness of the mass, labor-intensive military has become a constant source of frustration to civilian leaders. Irritated by the Pentagon’s hesitation to put “boots on the ground” in Bosnia, Madeleine Albright famously demanded of Chairman of the Joint Chiefs of Staff Colin Powell, “What good is this marvelous military force if we can never use it?”

In 2009, the Obama administration unthinkingly proposed a troop surge in Afghanistan, followed by a withdrawal within a year and a half that would have required some of the troops to start packing up almost as soon as they arrived. It took the U.S. military a full month to organize the transport of 20,000 soldiers to Haiti in the wake of the 2010 earthquake — and they were only traveling 700 miles to engage in a humanitarian relief mission, not a war.

Another thing hobbling mass militaries is the increasing unwillingness of nations, especially the more democratic ones, to risk large numbers of casualties. It is no longer acceptable to drive men into battle at gunpoint or to demand that they fend for themselves on foreign soil.

Once thousands of soldiers have been plunked down in a “theater,” they must be defended from potentially hostile locals, a project that can easily come to supersede the original mission.

We may not be able clearly to articulate what American troops were supposed to accomplish in Iraq or Afghanistan, but without question one part of their job has been “force protection.” In what could be considered the inverse of “mission creep,” instead of expanding, the mission now has a tendency to contract to the task of self-defense.

Ultimately, the mass militaries of the modern era, augmented by ever more expensive weapons systems, place an unacceptable economic burden on the nation-states that support them — a burden that eventually may undermine the militaries themselves.

Consider what has been happening to the world’s sole military superpower, the United States. The latest estimate for the cost of the wars in Iraq and Afghanistan is, at this moment, at least $3.2 trillion, while total U.S. military spending equals that of the next 15 countries combined, and adds up to approximately 47% of all global military spending.

To this must be added the cost of caring for wounded and otherwise damaged veterans, which has been mounting precipitously as medical advances allow more of the injured to survive.  The U.S. military has been sheltered from the consequences of its own profligacy by a level of bipartisan political support that has kept it almost magically immune to budget cuts, even as the national debt balloons to levels widely judged to be unsustainable.

The hard right, in particular, has campaigned relentlessly against “big government,” apparently not noticing that the military is a sizable chunk of this behemoth.

In December 2010, for example, a Republican senator from Oklahoma railed against the national debt with this statement: “We’re really at war. We’re on three fronts now: Iraq, Afghanistan, and the financial tsunami  [arising from the debt] that is facing us.” Only in recent months have some Tea Party-affiliated legislators broken with tradition by declaring their willingness to cut military spending.

How the Warfare State Became the Welfare State

If military spending remains for the most part sacrosanct, then the cuts required to shrink “big government” must come from domestic spending, especially social programs for the poor, who lack the means to finance politicians and, all too often, the incentive to vote as well.

From the Reagan years on, the U.S. government has chipped away at dozens of programs that had helped sustain people who are underpaid or unemployed, including housing subsidies, state-supplied health insurance, public transportation, welfare for single parents, college tuition aid, and inner-city economic development projects.

Even the physical infrastructure — bridges, airports, roads, and tunnels — used by people of all classes has been left at dangerous levels of disrepair. Antiwar protestors wistfully point out, year after year, what the cost of our high-tech weapon systems, our global network of more than 1,000 military bases, and our various “interventions” could buy if applied to meeting domestic human needs. But to no effect.

This ongoing sacrifice of domestic welfare for military “readiness” represents the reversal of a historic trend. Ever since the introduction of mass armies in Europe in the seventeenth century, governments have generally understood that to underpay and underfeed one’s troops — and the class of people that supplies them — is to risk having the guns pointed in the opposite direction from that which the officers recommend.

In fact, modern welfare states, inadequate as they may be, are in no small part the product of war — that is, of governments’ attempts to appease soldiers and their families. In the U.S., for example, the Civil War led to the institution of widows’ benefits, which were the predecessor of welfare in its Aid to Families with Dependent Children form. It was the bellicose German leader Otto von Bismarck who first instituted national health insurance.

World War II spawned educational benefits and income support for American veterans and led, in the United Kingdom, to a comparatively generous welfare state, including free health care for all.

Notions of social justice and fairness, or at least the fear of working class insurrections, certainly played a part in the development of twentieth century welfare states, but there was a pragmatic military motivation as well: if young people are to grow up to be effective troops, they need to be healthy, well-nourished, and reasonably well-educated.

In the U.S., the steady withering of social programs that might nurture future troops even serves, ironically, to justify increased military spending. In the absence of a federal jobs program, Congressional representatives become fierce advocates for weapons systems that the Pentagon itself has no use for, as long as the manufacture of those weapons can provide employment for some of their constituents.

With diminishing funds for higher education, military service becomes a less dismal alternative for young working-class people than the low-paid jobs that otherwise await them. The U.S. still has a civilian welfare state consisting largely of programs for the elderly (Medicare and Social Security). For many younger Americans, however, as well as for older combat veterans, the U.S. military is the welfare state — and a source, however temporarily, of jobs, housing, health care and education.

Eventually, however, the failure to invest in America’s human resources — through spending on health, education, and so forth — undercuts the military itself. In World War I, public health experts were shocked to find that one-third of conscripts were rejected as physically unfit for service; they were too weak and flabby or too damaged by work-related accidents.

Several generations later, in 2010, the U.S. Secretary of Education reported that “75 percent of young Americans, between the ages of 17 to 24, are unable to enlist in the military today because they have failed to graduate from high school, have a criminal record, or are physically unfit.”

(Wonderful news: Drop the Gendarme notion of controlling the world)

When a nation can no longer generate enough young people who are fit for military service, that nation has two choices: it can, as a number of prominent retired generals are currently advocating, reinvest in its “human capital,” especially the health and education of the poor, or it can seriously reevaluate its approach to war.

The Fog of (Robot) War

Since the rightward, anti-“big government” tilt of American politics more or less precludes the former, the U.S. has been scrambling to develop less labor-intensive forms of waging war. In fact, this may prove to be the ultimate military utility of the wars in Iraq and Afghanistan: if they have gained the U.S. no geopolitical advantage, they have certainly served as laboratories and testing grounds for forms of future warfare that involve less human, or at least less governmental, commitment.

One step in that direction has been the large-scale use of military contract workers supplied by private companies, which can be seen as a revival of the age-old use of mercenaries.  Although most of the functions that have been outsourced to private companies — including food services, laundry, truck driving, and construction — do not involve combat, they are dangerous, and some contract workers have even been assigned to the guarding of convoys and military bases.

Contractors are still men and women, capable of bleeding and dying — and surprising numbers of them have indeed died.  In the initial six months of 2010, corporate deaths exceeded military deaths in Iraq and Afghanistan for the first time. But the Pentagon has little or no responsibility for the training, feeding, or care of private contractors.

If wounded or psychologically damaged, American contract workers must turn, like any other injured civilian employees, to the Workers’ Compensation system, hence their sense of themselves as a “disposable army.”  By 2009, the trend toward privatization had gone so far that the number of private contractors in Afghanistan exceeded the number of American troops there.

An alternative approach is to eliminate or drastically reduce the military’s dependence on human beings of any kind.  This would have been an almost unthinkable proposition a few decades ago, but technologies employed in Iraq and Afghanistan have steadily stripped away the human role in war. Drones, directed from sites up to 7,500 miles away in the western United States, are replacing manned aircraft.

Video cameras, borne by drones, substitute for human scouts or information gathered by pilots. Robots disarm roadside bombs. When American forces invaded Iraq in 2003, no robots accompanied them; by 2008, there were 12,000 participating in the war.

Only a handful of drones were used in the initial invasion; today, the U.S. military has an inventory of more than 7,000, ranging from the familiar Predator to tiny Ravens and Wasps used to transmit video images of events on the ground.  Far stranger fighting machines are in the works, like swarms of lethal “cyborg insects” that could potentially replace human infantry.

These developments are by no means limited to the U.S. The global market for military robotics and unmanned military vehicles is growing fast, and includes Israel, a major pioneer in the field, Russia, the United Kingdom, Iran, South Korea, and China.

Turkey is reportedly readying a robot force for strikes against Kurdish insurgents. (Not likely. The Kurds have advanced robots)

Israel hopes to eventually patrol the Gaza border with “see-shoot” robots that will destroy people perceived as transgressors as soon as they are detected. (Won’t need much programming: whoever you detect, shoot to kill)

It is hard to predict how far the automation of war and the substitution of autonomous robots for human fighters will go. On the one hand, humans still have the advantage of superior visual discrimination.  Despite decades of research in artificial intelligence, computers cannot make the kind of simple distinctions — as in determining whether a cow standing in front of a barn is a separate entity or a part of the barn — that humans can make in a fraction of a second.

Thus, as long as there is any premium on avoiding civilian deaths, humans have to be involved in processing the visual information that leads, for example, to the selection of targets for drone attacks. If only as the equivalent of seeing-eye dogs, humans will continue to have a role in war, at least until computer vision improves.

On the other hand, the human brain lacks the bandwidth to process all the data flowing into it, especially as new technologies multiply that data. In the clash of traditional mass armies, under a hail of arrows or artillery shells, human warriors often found themselves confused and overwhelmed, a condition attributed to “the fog of war.”

Today, that fog is growing a lot thicker. U.S. military officials, for instance, put the blame on “information overload” for the killing of 23 Afghan civilians in February 2010, and the New York Times reported that:

“Across the military, the data flow has surged; since the attacks of 9/11, the amount of intelligence gathered by remotely piloted drones and other surveillance technologies has risen 1,600 percent. On the ground, troops increasingly use hand-held devices to communicate, get directions and set bombing coordinates. And the screens in jets can be so packed with data that some pilots call them ‘drool buckets’ because, they say, they can get lost staring into them.”

When the sensory data coming at a soldier is augmented by a flood of instantaneously transmitted data from distant cameras and computer search engines, there may be no choice but to replace the sloppy “wet-ware” of the human brain with a robotic system for instant response.

War Without Humans

Once set in place, the cyber-automation of war is hard to stop.  Humans will cling to their place “in the loop” as long as they can, no doubt insisting that the highest level of decision-making — whether to go to war and with whom — be reserved for human leaders. But it is precisely at the highest levels that decision-making may most need automating.

A head of state faces a blizzard of factors to consider, everything from historical analogies and satellite-derived intelligence to assessments of the readiness of potential allies. Furthermore, as the enemy automates its military, or in the case of a non-state actor, simply adapts to our level of automation, the window of time for effective responses will grow steadily narrower. Why not turn to a high-speed computer? It is certainly hard to imagine a piece of intelligent hardware deciding to respond to the 9/11 attacks by invading Iraq.

So, after at least 10,000 years of intra-species fighting — of scorched earth, burned villages, razed cities, and piled up corpses, as well, of course, as all the great epics of human literature — we have to face the possibility that the institution of war might no longer need us for its perpetuation.

Human desires, especially for the Earth’s diminishing supply of resources, will still instigate wars for some time to come, but neither human courage nor human blood-lust will carry the day on the battlefield.

Computers will assess threats and calibrate responses; drones will pinpoint enemies; robots might roll into the streets of hostile cities. Beyond the individual battle or smaller-scale encounter, decisions as to whether to match attack with counterattack, or one lethal technological innovation with another, may also be eventually ceded to alien minds.

This should not come as a complete surprise. Just as war has shaped human social institutions for millennia, so has it discarded them as the evolving technology of war rendered them useless. When war was fought with blades by men on horseback, it favored the rule of aristocratic warrior elites. When the mode of fighting shifted to action-at-a-distance weapons like bows and guns, the old elites had to bow to the central authority of kings, who, in turn, were undone by the democratizing forces unleashed by new mass armies.

Even patriarchy cannot depend on war for its long-term survival, since the wars in Iraq and Afghanistan have, at least within U.S. forces, established women’s worth as warriors. Over the centuries, human qualities once deemed indispensable to war fighting — muscular power, manliness, intelligence, judgment — have one by one become obsolete or been ceded to machines.

What will happen then to the “passions of war”?

Except for individual acts of martyrdom, war is likely to lose its glory and luster. Military analyst P.W. Singer quotes an Air Force captain musing about whether the new technologies will “mean that brave men and women will no longer face death in combat,” only to reassure himself that “there will always be a need for intrepid souls to fling their bodies across the sky.”

Perhaps, but in a 2010 address to Air Force Academy cadets, an under secretary of defense delivered the “bad news” that most of them would not be flying airplanes, which are increasingly unmanned.

War will continue to be used against insurgencies as well as to “take out” the weapons facilities, command centers, and cities of designated rogue states. It may even continue to fascinate its aficionados, in the manner of computer games. But there will be no triumphal parades for killer nano-bugs, no epics about unmanned fighter planes, no monuments to fallen bots.

And in that may lie our last hope. With the decline of mass militaries and their possible replacement by machines, we may finally see that war is not just an extension of our needs and passions, however base or noble.

Nor is it likely to be even a useful test of our courage, fitness, or national unity. War has its own dynamic or — in case that sounds too anthropomorphic — its own grim algorithms to work out. As it comes to need us less, maybe we will finally see that we don’t need it either. We can leave it to the ants.

Barbara Ehrenreich is the author of a number of books, including Nickel and Dimed: On (Not) Getting By in America and Bright-Sided: How the Relentless Promotion of Positive Thinking Has Undermined America. This essay is a revised and updated version of the afterword to the British edition of Blood Rites: Origins and History of the Passions of War (Granta, 2011).

Copyright 2011 Barbara Ehrenreich

On War, Robot War, Drone War, Electronic War… Stop injustices, Respect human dignity…

It is not possible for a sane person to sincerely promote the killing of another person.

Ask anyone in the front line how he felt before he shot at an “adversary” and how he felt after the “enemy” fell.

No sane person is able to forget “that he did kill someone else”.

The memory is there for the remainder of his life, and life is rotten and very unpleasant.

People like to claim the “self-defense” excuse, any kind of self-defense, thinking that the neighbor will be understanding and forgiving.

What can the neighbor do to you if your soul and mind are unable to erase the fact of having ended the life of someone else?

Recently, the Chinese had more than two dozen models in some stage of development on display at the Zhuhai Air Show, some of  which they are evidently eager to sell to other countries.

There was a time in our history when bows and arrows were not yet put to use, not even for shooting down an animal to eat. Battles were short and not many died in the field.

People’s life expectancy was very short, and those fighting were plagued with all kinds of diseases: they needed to rest after a short engagement, and maybe they sat to shoot the breeze, faking that they would get up again to resume the fight. Not a chance.

Killing from long range is the skill of the coward and the totally useless soldier: too much shooting for nothing.

If you really need to claim self-defense, engage in close body fight: A few wounds will go a long way into avoiding the promotion of war.

In July 2011, Barbara Ehrenreich published this piece. It is reposted on TomDispatch.

Last week, William Wan and Peter Finn of the Washington Post reported that at least 50 countries have now purchased or developed pilotless military drones.

So three cheers for a thoroughly drone-ified world.

In my lifetime, I’ve repeatedly seen advanced weapons systems or mind-boggling technologies of war hailed as  near-Utopian paths to victory and future peace (just as the atomic bomb  was soon after my birth).

Include in that the Vietnam-era, “electronic  battlefield,” President Ronald Reagan’s Strategic Defense Initiative  (aka “Star Wars”), the “smart bombs” and smart missiles of the first  Gulf War, and in the twenty-first century, “netcentric warfare,” that Rumsfeld high-tech favorite.

You know the results of this sort of magical thinking about wonder weapons (or technologies) just as well as I do.

The atomic bomb led to an almost half-century-long nuclear superpower standoff/nightmare, to  nuclear proliferation, and so to the possibility that someday even terrorists might possess such weapons.

The electronic battlefield was incapable of staving off defeat in Vietnam.

Reagan’s “impermeable” anti-missile shield in space never came even faintly close to making it into the heavens. (And the currently deployed steel domes are no better)

Those “smart bombs” of the Gulf War proved remarkably dumb, while the 50 “decapitation” strikes the Bush administration launched against Saddam Hussein’s regime on the  first day of the 2003 invasion of Iraq took out not a single Iraqi  leader, but dozens of civilians.

And the history of the netcentric military in Iraq is well known. Its “success” sent Secretary of Defense Rumsfeld into retirement and ignominy.

In the same way, robot drones as assassination weapons will prove to be just another weapons system rather than a panacea for American  warriors.

None of these much-advertised wonder technologies ever turns  out to perform as promised, but that fact never stops them, as with  drones today, from embedding themselves in our world.

From the atomic  bomb came a whole nuclear landscape that included the Strategic Air  Command, weapons labs, production plants, missile silos, corporate  interests, and an enormous world-destroying arsenal (as well as  proliferating versions of the same, large and small, across the planet).

Nor did the electronic battlefield go away.

Quite the opposite — it  came home and entered our everyday world in the form of sensors,  cameras, surveillance equipment, and the like, now implanted from our borders to our cities.

Rarely do wonder weapons or wonder technologies disappoint enough to disappear.

And those latest wonders, missile- and bomb-armed drones,  are now multiplying like so many electronic rabbits.

And yet there is  always hope. (Like what practical decisions and how to generate such hope?)

 

How America criminalised poverty

From TomDispatch, part of the Guardian Comment Network, Wednesday 10 August 2011

Note: Occasionally, I like to republish interesting articles before I butt in with a few comments. This article explains the reactions to the book Nickel and Dimed (2001).

A homeless person sits wrapped in a blanket near the White House in Washington DC. Photograph: Robyn Beck/EPA

“There’s just no end to it once the cycle (of poverty) starts. It just keeps accelerating,” says Robert Solomon of Yale Law School.

The viciousness of State officials toward the poor and homeless is breathtaking, trapping them in a cycle of poverty.

A Florida woman wrote to tell me that, before reading the book, she’d always been annoyed at the poor for what she saw as their self-inflicted obesity. Now she understood that a healthy diet wasn’t always an option. And if I had a quarter for every person who’s told me he or she now tipped more generously, I would be able to start my own foundation.

How to define poverty?

Three months after the book was published, the Economic Policy Institute in Washington DC issued a report entitled “Hardships in America: The Real Story of Working Families”, which found an astounding 29% of American families living in what could be more reasonably defined as poverty, meaning that they earned less than a barebones budget covering housing, child care, health care, food, transportation, and taxes — though not, it should be noted, any entertainment, meals out, cable TV, Internet service, vacations, or holiday gifts.

I completed the manuscript for Nickel and Dimed in a time of seemingly boundless prosperity. Technology innovators and venture capitalists were acquiring sudden fortunes, buying up McMansions, like the ones I had cleaned in Maine and much larger. Even secretaries in some hi-tech firms were striking it rich with their stock options.

There was loose talk about a permanent conquest of the business cycle, and a sassy new spirit infecting American capitalism. In San Francisco, a billboard for an e-trading firm proclaimed, “Make love not war,” and, down at the bottom, “Screw it, just make money.”

When the book Nickel and Dimed was published in May 2001, cracks were appearing in the dot-com bubble and the stock market had begun to falter, but the book still evidently came as a surprise, even a revelation, to many. In that first year or two after publication, people came up to me and opened with the words, “I never thought …” or “I hadn’t realised …”

To my own amazement, Nickel and Dimed quickly ascended to the bestseller list and began winning awards. Criticisms have accumulated over the years. But for the most part, the book has been far better received than I could have imagined it would be, with an impact extending well into the more comfortable classes.

Even more gratifying to me, the book has been widely read among low-wage workers. In the last few years, hundreds of people have written to tell me their stories: the mother of a newborn infant whose electricity had just been turned off, the woman who had just been given a diagnosis of cancer and has no health insurance, the newly homeless man who writes from a library computer.

At the time I wrote Nickel and Dimed, I wasn’t sure how many people it directly applied to – only that the official definition of poverty was way off the mark, since it defined an individual earning $7 an hour, as I did on average, as well out of poverty.

29% is a minority, but not a reassuringly small one, and other studies in the early 2000s came up with similar figures.

The big question, 10 years later, is whether things have improved or worsened for those in the bottom third of the income distribution.

For example, the people who clean hotel rooms, work in warehouses, wash dishes in restaurants, care for the very young and very old, and keep the shelves stocked in our stores. The short answer is that things have gotten much worse, especially since the economic downturn that began in 2008.

Post-meltdown poverty

Mind you, the hardships I encountered while researching my book – the skipped meals, the lack of medical care, the occasional need to sleep in cars or vans – occurred in the best of times. The economy was growing, and jobs, if poorly paid, were at least plentiful.

In 2000, I had been able to walk into a number of jobs pretty much off the street.

Less than a decade later, many of these jobs had disappeared and there was stiff competition for those that remained. It would have been impossible to repeat my Nickel and Dimed “experiment”, had I been so inclined, because I would probably never have found a job.

For the last couple of years, I have attempted to find out what was happening to the working poor in a declining economy – this time using conventional reporting techniques like interviewing. I started with my own extended family, which includes plenty of people without jobs or health insurance, and moved on to trying to track down a couple of the people I had met while working on Nickel and Dimed.

This wasn’t easy, because most of the addresses and phone numbers I had taken away with me had proved to be inoperative within a few months, probably due to moves and suspensions of telephone service. I had kept in touch with “Melissa” over the years; she was still working at Wal-Mart, where her wages had risen from $7 to $10 an hour, but in the meantime her husband had lost his job.

Caroline, now in her 50s and partly disabled by diabetes and heart disease, had left her deadbeat husband and was subsisting on occasional cleaning and catering jobs. Neither seemed unduly afflicted by the recession, but only because they had already been living in what amounts to a permanent economic depression.

Media attention has focused, understandably enough, on the “nouveau poor” – formerly middle and even upper-middle class people who lost their jobs, their homes, and/or their investments in the financial crisis of 2008 and the economic downturn that followed it, but the brunt of the recession has been borne by the blue-collar working class, which had already been sliding downwards since de-industrialisation began in the 1980s.

In 2008 and 2009, for example, blue-collar unemployment was increasing three times as fast as white-collar unemployment, and African American and Latino workers were three times as likely to be unemployed as white workers. Low-wage blue-collar workers, like the people I worked with in this book, were especially hard hit for the simple reason that they had so few assets and savings to fall back on as jobs disappeared.

How have the already-poor attempted to cope with their worsening economic situation?

One obvious way is to cut back on health care.

The New York Times reported in 2009 that one-third of Americans could no longer afford to comply with their prescriptions and that there had been a sizable drop in the use of medical care. Others, including members of my extended family, have given up their health insurance.

Food is another expenditure that has proved vulnerable to hard times, with the rural poor turning increasingly to “food auctions“, which offer items that may be past their sell-by dates.

And for those who like their meat fresh, there’s the option of urban hunting.

In Racine, Wisconsin, a 51-year-old laid-off mechanic told me he was supplementing his diet by “shooting squirrels and rabbits and eating them stewed, baked and grilled”. In Detroit, where the wildlife population has mounted as the human population ebbs, a retired truck driver was doing a brisk business in raccoon carcasses, which he recommends marinating with vinegar and spices.

The most common coping strategy, though, is simply to increase the number of paying people per square foot of dwelling space – by doubling up or renting to couch-surfers.

It’s hard to get firm numbers on overcrowding, because no one likes to acknowledge it to census-takers, journalists, or anyone else who might be remotely connected to the authorities.

In Los Angeles, housing expert Peter Dreier says that “people who’ve lost their jobs, or at least their second jobs, cope by doubling or tripling up in overcrowded apartments, or by paying even 70% of their incomes in rent“.

According to a community organiser in Alexandria, Virginia, the standard apartment in a complex occupied largely by day labourers has two bedrooms, each containing an entire family of up to five people, plus an additional person laying claim to the couch.

No one could call suicide a “coping strategy”, but it is one way some people have responded to job loss and debt.

There are no national statistics linking suicide to economic hard times, but the National Suicide Prevention Lifeline reported more than a four-fold increase in call volume between 2007 and 2009, and regions with particularly high unemployment, such as Elkhart, Indiana, have seen troubling spikes in their suicide rates. Foreclosure is often the trigger for suicide – or, worse, murder-suicides that destroy entire families.

“Torture and Abuse of Needy Families”: TANF, or Temporary Assistance to Needy Families

We do of course have a collective way of ameliorating the hardships of individuals and families – a government safety net that is meant to save the poor from spiralling down all the way to destitution.

But its response to the economic emergency of the last few years has been spotty at best. The food stamp program has responded to the crisis fairly well, to the point where it now reaches about 37 million people, up about 30% from pre-recession levels.  Welfare – the traditional last resort for the down-and-out until it was “reformed” in 1996 – only expanded by about 6% in the first two years of the recession.

What’s the difference between the two programs, the food stamp program and welfare?

There is a right to food stamps. You go to the office and, if you meet the statutory definition of need, they help you. For welfare, the street-level bureaucrats can, pretty much at their own discretion, just say no.

Take the case of Kristen and Joe Parente, Delaware residents who had always imagined that people turned to the government for help only if “they didn’t want to work”. Their troubles began well before the recession, when Joe, a fourth-generation pipe-fitter, sustained a back injury that left him unfit for even light lifting. He fell into a profound depression for several months, then rallied to ace a state-sponsored retraining course in computer repairs – only to find that those skills are no longer in demand. The obvious fallback was disability benefits, but – catch-22 – when Joe applied he was told he could not qualify without presenting a recent MRI scan. This would cost $800 to $900, which the Parentes do not have; nor has Joe, unlike the rest of the family, been able to qualify for Medicaid.

When they married as teenagers, the plan had been for Kristen to stay home with the children. But with Joe out of action and three children to support by the middle of this decade, Kristen went out and got waitressing jobs, ending up, in 2008, in a “pretty fancy place on the water”. Then the recession struck and she was laid off.

Kristen is bright, pretty, and to judge from her command of her own small kitchen, probably capable of holding down a dozen tables with precision and grace. In the past she’d always been able to land a new job within days; now there was nothing.

Like 44% of laid-off people at the time, Kristen failed to meet the fiendishly complex and sometimes arbitrary eligibility requirements for unemployment benefits. Their car started falling apart.

So the Parentes turned to what remains of welfare – TANF, or Temporary Assistance to Needy Families.

TANF does not offer straightforward cash support like Aid to Families with Dependent Children, which it replaced in 1996. It’s an income supplementation program for working parents, and it was based on the sunny assumption that there would always be plenty of jobs for those enterprising enough to get them.

After Kristen applied, nothing happened for six weeks – no money, no phone calls returned. At school, the Parentes’ seven-year-old’s class was asked to write out what wish they would present to a genie, should a genie appear. Brianna’s wish was for her mother to find a job because there was nothing to eat in the house, an aspiration that her teacher deemed too disturbing to be posted on the wall with the other children’s requests.

When the Parentes finally got into “the system” and began receiving food stamps and some cash assistance, they discovered why some recipients have taken to calling TANF “Torture and Abuse of Needy Families.”

From the start, the TANF experience was “humiliating”, Kristen says. The caseworkers “treat you like a bum. They act like every dollar you get is coming out of their own paychecks”.

The Parentes discovered that they were each expected to apply for 40 jobs a week, although their car was on its last legs and no money was offered for gas, tolls, or babysitting. In addition, Kristen had to drive 35 miles a day to attend “job readiness” classes offered by a private company called Arbor, which, she says, were “frankly a joke”.

Nationally, according to Kaaryn Gustafson of the University of Connecticut Law School, “applying for welfare is a lot like being booked by the police“. There may be a mug shot, fingerprinting, and lengthy interrogations as to one’s children’s true paternity. The ostensible goal is to prevent welfare fraud, but the psychological impact is to turn poverty itself into a kind of crime.

How the safety net became a dragnet

The most shocking thing I learned from my research on the fate of the working poor in the recession was the extent to which poverty has indeed been criminalised in America.

Perhaps the constant suspicions of drug use and theft that I encountered in low-wage workplaces should have alerted me to the fact that, when you leave the relative safety of the middle class, you might as well have given up your citizenship and taken residence in a hostile nation.

Most cities, for example, have ordinances designed to drive the destitute off the streets by outlawing such necessary activities of daily life as sitting, loitering, sleeping, or lying down. (It is the same tactic in every generation.)

Urban officials boast that there is nothing discriminatory about such laws: “If you’re lying on a sidewalk, whether you’re homeless or a millionaire, you’re in violation of the ordinance,” a St Petersburg, Florida, city attorney stated in June 2009, echoing Anatole France’s immortal observation that “the law, in its majestic equality, forbids the rich as well as the poor to sleep under bridges.”

In defiance of all reason and compassion, the criminalisation of poverty has actually intensified as the weakened economy generates ever more poverty. So concludes a recent study from the National Law Centre on Poverty and Homelessness, which finds that the number of ordinances against the publicly poor has been rising since 2006, along with the harassment of the poor for more “neutral” infractions like jaywalking, littering, or carrying an open container.

The report lists America’s 10 “meanest” cities – the largest of which include Los Angeles, Atlanta and Orlando – but new contestants are springing up every day. In Colorado, Grand Junction’s city council is considering a ban on begging; Tempe, Arizona, carried out a four-day crackdown on the indigent at the end of June. And how do you know when someone is indigent? As a Las Vegas statute puts it, “an indigent person is a person whom a reasonable ordinary person would believe to be entitled to apply for or receive” public assistance.

That could be me before the blow-drying and eyeliner, and it’s definitely Al Szekeley at any time of day. A grizzled 62-year-old, he inhabits a wheelchair and is often found on G Street in Washington DC – the city that is ultimately responsible for the bullet he took in the spine in Phu Bai, Vietnam, in 1972.

He had been enjoying the luxury of an indoor bed until December 2008, when the police swept through the shelter in the middle of the night looking for men with outstanding warrants.

It turned out that Szekeley, who is an ordained minister and does not drink, do drugs, or cuss in front of ladies, did indeed have one – for “criminal trespassing“, as sleeping on the streets is sometimes defined by the law. So he was dragged out of the shelter and put in jail.

“Can you imagine?” asked Eric Sheptock, the homeless advocate (himself a shelter resident) who introduced me to Szekeley. “They arrested a homeless man in a shelter for being homeless?”

The viciousness of the official animus toward the indigent can be breathtaking.

A few years ago, a group called Food Not Bombs started handing out free vegan food to hungry people in public parks around the nation. A number of cities, led by Las Vegas, passed ordinances forbidding the sharing of food with the indigent in public places, leading to the arrests of several middle-aged white vegans.

One anti-sharing law was just overturned in Orlando, but the war on illicit generosity continues.

Orlando is appealing the decision, and Middletown, Connecticut, is in the midst of a crackdown. More recently, Gainesville, Florida, began enforcing a rule limiting the number of meals that soup kitchens may serve to 130 people in one day, and Phoenix, Arizona, has been using zoning laws to stop a local church from serving breakfast to homeless people.

For the not-yet-homeless, there are two main paths to criminalisation, and one is debt.

Anyone can fall into debt, and although we pride ourselves on the abolition of debtors’ prison, in at least one state, Texas, people who can’t pay fines for things like expired inspection stickers may be made to “sit out their tickets” in jail.

More commonly, the path to prison begins when one of your creditors has a court summons issued for you, which you fail to honour for one reason or another, such as that your address has changed and you never received it. OK, now you’re in “contempt of court”.

Or suppose you miss a payment and your car insurance lapses, and then you’re stopped for something like a broken headlight (about $130 for the bulb alone). Now, depending on the state, you may have your car impounded and/or face a steep fine – again, exposing you to a possible court summons. “There’s just no end to it once the cycle starts,” says Robert Solomon of Yale Law School. “It just keeps accelerating.”

The second – and by far the most reliable – way to be criminalised by poverty is to have the wrong colour skin.

Indignation runs high when a celebrity professor succumbs to racial profiling, but whole communities are effectively “profiled” for the suspicious combination of being both dark-skinned and poor. Flick a cigarette and you’re “littering”; wear the wrong colour T-shirt and you’re displaying gang allegiance. Just strolling around in a dodgy neighbourhood can mark you as a potential suspect. And don’t get grumpy about it or you could be “resisting arrest“.

In what has become a familiar pattern, the government defunds services that might help the poor while ramping up law enforcement.

Shut down public housing, then make it a crime to be homeless. Generate no public-sector jobs, then penalise people for falling into debt. The experience of the poor, and especially poor people of colour, comes to resemble that of a rat in a cage scrambling to avoid erratically administered electric shocks.

And if you should try to escape this nightmare reality into a brief, drug-induced high, it’s “gotcha” all over again, because that of course is illegal too.

One result is our staggering level of incarceration, the highest in the world.

Today, exactly the same number of Americans – 2.3 million – reside in prison as in public housing. And what public housing remains has become ever more prison-like, with random police sweeps and, in a growing number of cities, proposed drug tests for residents. The safety net, or what remains of it, has been transformed into a dragnet.

It is not clear whether economic hard times will finally force us to break the mad cycle of poverty and punishment.

With even the official level of poverty increasing – to over 14% in 2010 – some states are beginning to ease up on the criminalisation of poverty, using alternative sentencing methods, shortening probation, and reducing the number of people locked up for technical violations like missing court appointments.

But others, diabolically enough, are tightening the screws: not only increasing the number of “crimes”, but charging prisoners for their room and board, guaranteeing they’ll be released with potentially criminalising levels of debt.

So what is the solution to the poverty of so many of America’s working people?

Ten years ago, when Nickel and Dimed first came out, I often responded with the standard liberal wish list – a higher minimum wage, universal health care, affordable housing, good schools, reliable public transportation, and all the other things we, uniquely among the developed nations, have neglected to do.

Today, the answer seems both more modest and more challenging: if we want to reduce poverty, we have to stop doing the things that make people poor and keep them that way. Stop underpaying people for the jobs they do. Stop treating working people as potential criminals and let them have the right to organise for better wages and working conditions.

Stop the institutional harassment of those who turn to the government for help or find themselves destitute in the streets.

Maybe, as so many Americans seem to believe today, we can’t afford the kinds of public programs that would genuinely alleviate poverty – though I would argue otherwise. At least, we should decide, as a bare minimum principle, to stop kicking people when they’re down.” (End of article)

This article is hugely important to me.

I wrote my diary mainly to recollect the miseries I experienced during 20 years of living in the USA, to face those conditions, and to find closure.

I earned a PhD in industrial engineering, but I graduated in 1991, at the peak of a recession under Bush senior.

Worse, I had no residency status, so I could not even hope for a decent job, and I had no relatives to support or back me up.

I recall periods of utter helplessness. I was living in Kensington (Maryland), and people who knew me assumed I had AIDS or a terminal disease, simply because I looked it. I spent my last $10 visiting a local dispensary, only to be told that I was suffering from malnutrition.

At least I had a professional opinion that I had no terminal disease. I was not entitled to food stamps or welfare programs either (I think); otherwise I would have jumped at the chance, since I turned over every stone for survival’s sake.

I returned to Lebanon: I would not die of hunger, or in a ditch like a dog, or in the one-room basement that was frequently flooded and humid.

Note 1: George Orwell described this situation very accurately in https://adonis49.wordpress.com/2008/09/23/down-and-out-in-paris-and-london-by-george-orwell/

Note 2: You may read https://adonis49.wordpress.com/2011/02/20/no-mass-demonstrations-in-the-us-so-far-is-youth-in-the-us-practically-illiterate/

Are you searching for a Job? (May 29, 2009)

 

I recall that in 1991 the US was in a serious recession during the Bush Sr administration, and jobs were frighteningly scarce. I had graduated with a PhD in Industrial/Human Factors engineering and had missed better periods for hiring academicians. I was working as assistant to the manager of a retirement community in downtown San Francisco and visited an employment center on Van Ness. It was a center meant to help you rewrite your CV for the nth time, whenever you wanted to apply for the scarce job announcements posted there. People swarmed this center just to feel busy and serious about searching for a job, though not that keen on finding one. I guess the center was one of hundreds of facilities whose sole purpose was to blame citizens for failing to do their due diligence and compete, since no one was about to beg you to work for them. If you failed to rewrite your CV and spend more money on useless stamps, then you were not making good use of this “valuable” help facility.

This was a period when ridiculous denials were the custom of the land: for example, the custodian at NASA who claimed he was contributing to sending astronauts to the moon, or janitors redefining their jobs as sanitation “engineering”. I recall that I was forced to accept a job cleaning and vacuuming the main library while working on my dissertation. I fooled my spirit into believing that as long as I did my job perfectly and with excitement, I was learning the value of a job well done, a sort of training period for toughening my character. A single state of denial is not a bad reaction; it is the string of successive denials that can be deleterious to your development.

This is no time for denial; get on it and find your own line of business.

 

If you loved your fields of “expertise” before you were fired, then it is a matter of dignity to continue your education, update your knowledge, and “recycle” your skills in what you do best. Otherwise, consider this liberation from an unsatisfactory job an occasion to revisit what you would love to do next, something that would transform your wretched life into something of value to your pocket, your nerves, and your soul. Mindlessly resuming the waste of time and energy on “procedures” that millions follow, without true hunger for the work you like or the conditions that suit your character, is the road to hell.

 

Unemployment is increasing at a faster pace, and the jobless in developed nations are anxious, except in the USA. Not that the US has the potential to create new jobs quickly, but US citizens have learned that unemployment is a period for working harder at locating a new job; they were indoctrinated to believe the unemployed should invest 14 hours a day searching, simply because the miserly unemployment benefit is targeted at those searching full time for a non-existent job. For example, people still listen to the “guru” Harvey Mackay, who said: “Once you are fired, you have a new job, a harder job than your previous remunerating one, because you should realize that you are required to invest 16 hours searching for a job.” It is a way of enslaving the minds of citizens and diverting them from getting on the march to demand answers about the state of affairs they are paying so dearly for.

 

Sir, there are no jobs for hire. People who managed to retain their jobs are frantic about ways to keep them, and for how long they can. Owners of enterprises are re-organizing and “re-structuring” their lines of business until they figure out and absorb the new legal loopholes to cheat the government out of fresh money.

Sir, it is time you figured out the special skills that you have forgotten, or never believed people appreciated. The time of mass consumerism for redundant and similar items is coming to an end. People are searching for value-added products that express individuality, personal skills, and talent.

 

Sir, this is the time to militate for State health coverage, to join organizations helping the unemployed recycle their skills, to re-connect with your professional associations, and to work for bureaucratic changes, new openings, training, and facilities. This is an excellent time to join the activists who are re-thinking alternative economic and financial systems. The last thing you need to believe (erase it from your mind with utmost prejudice) is that searching for a job is indeed a full-time job! Recycling skills that you hate and abhor is not a panacea either.

One thing is true: you are free at last to think straight, to reflect on your life, your strengths and weaknesses, your set of values, and the kind of world you want to live in. You have vast potential if you focus on your capabilities and concentrate on your forgotten skills.

 

Note: The theme was partly inspired from a short article by Barbara Ehrenreich.

