Adonis Diaries

Posts Tagged ‘engineering’

Repetitive illnesses: Shouldn’t beasts of burden enjoy the rights that humans refuse themselves?

Note: Repost of 2004 “What are the rights of the beast of burden; like a donkey?” 

Article #4: Human Factors in Engineering

People used to own donkeys for special kinds of work, and they still do in many places.

Donkeys are relatively cheap, if you can find them:  They are quite obedient and resilient.

Donkeys can endure hardships if you provide food and minimal care.

Low-level employees, such as in data-input jobs, are far less loved and appreciated than the aforementioned hot-blooded mammals.

They helplessly endure repetitive musculoskeletal pains. Ironically, many of the clerks do proudly claim these pains as a badge of honor.

They are remunerated more cheaply than donkeys, because all their job entails is to sit and do monotonous work.

They suffer all the sedentary diseases: neck, head, shoulders, and back pains.

They suffer irremediable hand, finger, and wrist handicaps for the rest of their wretched lives.

Graphic designers are certainly a tad better off: They are paid slightly better; not for their artistic imagination, but maybe because they can also use a few more computer application programs.

Historically, the design of the characters on the first typewriters was meant to slow down typing:

Fast typing used to jam the arms of the mechanical typewriters.

A large order from a big company at the time hampered any redesign of the character layout, even as typewriter manufacturing technology advanced.

Still, secretaries had to awkwardly learn typing fast to meet production and greed.

The benefits of redesigning the shapes and forms of computer keyboards, which could temporarily alleviate the many cumulative musculoskeletal disorders from harsh continuous and daily typing, did not reach the common typists and data entry clerks.

These low level employees were not worth any investment in upgraded keyboards.

Higher level employees, who barely use computers for any productive task, were honored with the latest gizmos.

In fact, I believe that even the best ergonomically designed keyboards cannot solve these disorders:

Heavy computer users, for 8 hours daily, are still performing repetitive movements, sitting still, eyes riveted to a display.

They are still asked to perform maximally, under the watchful and tireless computer supervisor:

An efficient program is embedded in the computer itself, a program meant to collect data and analyze the performance of the donkey clerk.

Employees should not demand any redesign of the characters on keyboards.

Any faster-typing design will be to their detriment, and they will pay the price bitterly.

Their task will carry higher risks to their health and safety, with no increase in wages.

They should know that faster standards will then be required of them;

Instead of 60 words per minute, Mr. Greed might ask them to be able to type 300 wpm.

It is not enough to improve technology; we need to restrain its consequences.

Bless the French Rabelais who said: “Science without conscience is the ruin of the soul”.

Note: Nothing has improved with the new communication technologies, but with small mobile phones people don’t have to sit still in one place. People can lie down, move around, and cause traffic accidents while talking on and manipulating their new gizmos.

Human Factors in Design

The term Design is all the rage.

Any professional in any field feels it imperative to add Design in the title.

Engineers, graphic professionals, photographers, dancers, environmentalists, climatologists, scientists… they all claim to be designers first.

And this is very refreshing.

Have you heard of this new field of Design Anthropology? https://adonis49.wordpress.com/2012/02/06/design-anthropology-why-are-there-designs-not-meant-for-human/

Dori Tunstall said in an interview with  Debbie Millman:

“Design translates values into tangible experiences… Design can help make values such as equality, democracy, fairness, integration, connection… (values that we have lost to some extent) more tangible, and express how we can use them to make the world a better place…”

Looks like Tunstall expanded the term design to overlap with the political realm of Congress jobs, law makers, political parties, election laws…

It is about time that everyone “thinks design” when undertaking any project or program.

Anything we do is basically designed, explicitly or implicitly: Either we are generating products and programs for mankind, or it is mankind who is in charge of executing, controlling and managing what has been conceived.

So long as humans are directly involved in using a product or a program, any design must explicitly study and research the safety, health, and mistakes that operators and users will encounter.

The design might as well be explicit about the attributes of health, safe usage, and the errors that might generate serious consequences, materially, mentally, or physically.

Four decades ago, there was a field of study called Human Factors.

The term Human Factors was considered too general to be taken seriously in Engineering.

The implicit understanding was that “Of course, when an engineer designs anything, it is the human who is targeted….”

However, besides applying standards and mathematical formulas, engineers are the least directly concerned with the safety and health of users: The standards are supposed to take care of these superfluous attributes…

And who are the people concerned in setting standards?

Standards are arrived at in a consensus process between politicians and business people, and the concerned users and consumers are rarely invited to participate in the debate, except in later sessions when the standards are already drafted…

And how explicitly were experiments designed to allow users to test, and give feedback on, the standards handed down by successive standard-setting bodies…?

Countless engineers and scientists are directly engaged in putting rovers on Mars and launching shuttles and… and the human in the project is taken for granted…

If you ask them whether they have human factors engineers in their teams, they don’t understand what you mean.

The project is supposed to be an engineering project, and “where the hell did you bring this human thing in the picture?”

Anything that is designed must consider health, safety, and how people of various ages, genders, and ethnic idiosyncrasies might use the product or the program.

Take all the time needed in the design process. People are not supposed to be used as guinea pigs, with redesigns coming only after countless lawsuits, pains, and suffering…

This is a preliminary draft. Any input and replies?

Note: https://adonis49.wordpress.com/2008/10/04/whats-that-concept-of-human-factors-in-design/

Convention Without Walls: ‘Digital Divide’ Overlooked by the live-streaming technology?

With a steady stream of blog posts, tweets, Facebook posts and YouTube videos, even the Republican party convention is live-streaming on YouTube.

The presidential campaigns have increasingly embraced the web as a way to speak directly to voters.

The Republican National Convention in Tampa, which is calling itself the “Convention Without Walls,” is releasing a mobile app and encouraging Facebook users to share their photos and videos.

The upcoming Democratic National Convention in Charlotte has planned similar digital outreach.

Yet, millions of Americans won’t be able to participate. They are blocked from experiencing much of the online world:  Simply, they don’t have access to high-speed Internet.

About one-third of Americans (100 million people) do not subscribe to broadband. This so-called “digital divide” will likely receive little, if any, attention during the political conventions.

Gerry Smith in the HuffPost wrote:

“Bridging the technology gap fits squarely within the candidates’ platforms for reducing unemployment, increasing access to health care and education, and helping the country compete in a globalized economy, experts say.

Almost every aspect of today’s society — from looking for jobs to accessing online medicine and classrooms — now requires a broadband connection, and those without access are quickly being left behind.

“I feel like I’m at a disadvantage,” said James Brunswick, a 51-year-old Philadelphia resident who is looking for a job but can’t afford a computer.

There are different reasons why Americans are disconnected.

1. About 19 million people, mostly in rural areas, don’t have high-speed Internet because phone and cable companies don’t provide service to their location.

2. Many low-income Americans can’t afford broadband subscriptions.

3. About 40% of adults with household incomes less than $20,000 have broadband at home, compared to 93% of those with household incomes greater than $75,000, according to the F.C.C.

4. A growing number of people who can’t afford computers or Internet service are turning to smartphones as a more affordable way to get online.

Experts warn that mobile devices — with their small screens, data caps and slower speeds — are no substitute for a computer with a high-speed connection.

To help more people join the digital age, the Obama administration set aside $7.2 billion to deploy high-speed Internet to unserved and low-income areas. The Federal Communications Commission has overhauled its Lifeline program to provide discounted Internet service to families in need and has partnered with major cable providers to supply $10 Internet access to households with a child enrolled in the national school lunch program.

Again, experts say more must be done.

A few of the experts argue the next administration needs to regulate broadband providers to promote competition, which would give consumers more choices and lower prices for broadband service.

“We can throw subsidies at the problem all day, but it’s not going to close the digital divide unless we have a robust, competitive market that will lead to lower prices and more attractive services,” said Derek Turner, research director at the public-interest group Free Press.

There are other reasons why people don’t get online.

1. Some are not comfortable with the Internet, while others think the web is a waste of their time, surveys show.

2. And while the price of computers is falling, many low-income Americans still can’t afford them and must rely on public libraries to get online — a digital safety net that is starting to fray.

3. More than half of libraries say their Internet connections are not fast enough, and libraries nationwide are facing budget cuts that have forced them to close on weekends and evenings, according to the American Library Association.

“We are suffering from the perfect storm,” said Emily Sheketoff, the executive director of the American Library Association’s Washington office.

About 80% of schools and libraries receiving federal funding for Internet service say their connections “do not fully meet their needs,” according to an FCC report issued last week.

Stephanie Thomas is a history and government teacher at Broad Street High School in Shelby, Miss., a rural town of 2,000 people where nearly half of families live in poverty.

Thomas often wants to show her students online videos or conduct interactive lessons, but the school’s limited bandwidth makes that impossible.

“We have the Internet but it can be extremely slow,” Thomas said. “There are times where I’ve wanted to show YouTube videos and I spend half of the class period waiting for it to load.”

The FCC’s National Broadband Plan, which was released in 2010, offers a blueprint for helping more people join the digital age.

The plan suggests:

1. That the commission provide wireless spectrum to companies on the condition that they offer free or low-cost broadband service to low-income customers.

2. It recommends Congress provide more funding to teach low-income Americans how to use the Internet and help people with disabilities and Native Americans, who have especially low rates of broadband adoption, gain access to the web.

Turner said there is another reason why both presidential candidates should be concerned about the millions of Americans who are not online: They need their votes, and the Internet has become an increasingly popular platform for candidates to reach voters and voters to learn more about them.

“The Internet is rapidly becoming an indispensable tool for democratic participation,” Turner said. “And we need to be concerned that there is a social cost to those who can’t participate in that conversation.”

It is about time the old, effective method of door-to-door connection were relaunched: When will voters get to meet the candidates’ coordinators and relay their concerns face to face?

Nikola Tesla: The genius geek of all time

Do you know who invented or discovered most of modern time technology?

1. Alternating current (AC)

2. Radio transmission

3. Radar

4. X-Rays

5. Hydro-electric plant

6. Resonant frequency of the earth

7. Remote control

8. Neon lighting

9. Ball lightning

10. Earthquake measuring machine

11. Electric motor

12. Wireless communication…and much more?

Nikola Tesla, a Serbian-American inventor, was born in 1856.

He lived to be 86 and remained celibate. He was 6’6″ tall and mastered 8 languages.

Tesla survived on milk and Nabisco crackers… and died penniless, while Edison reaped the profits of many of his inventions and was acclaimed as the inventor of the century…

Click on this comic for fascinating details: http://theoatmeal.com/comics/tesla

Atmospheric physics? Who is Toufic Hawaat?

For 50 years, scientists have been visiting the North and South Poles to study the atmosphere and matching their various models to data gathered painstakingly, under harsh climatic conditions…

Toufic Hawaat is one of those researchers and physicists. He spent 8 years at the Poles, three months at a stretch during the polar summer seasons: The temperature drops to minus 122 degrees in winter, and you’ll be stuck there for an entire year if the team makes the grave mistake of visiting the Poles in winter time.

The summer temperature at the South Pole is a mild minus 80 degrees, and the team members live within iced walls and feed on beef- and pizza-flavored biscuits and canned food…

Toufic Hawaat was born in Lebanon, in the town of Bte3bourat in the district of Koura.  He got his engineering degree from the University of Lebanon and pursued higher education in Paris, receiving his Ph.D. on the physics of atmospheric composition and phenomena…

For 6 years, Toufic worked for the French public CNF and traveled to the North Pole in 1994 at the age of 25. He visited the North Pole 6 times and planted the Lebanese flag there (a first).

In 1998, Toufic met with an American research team from the University of Denver, doing the same kind of research. The US team offered Toufic the Green Card and dispatched him to the South Pole.

Antarctica is about one and a half times as vast as the USA, its ice is up to some 4,800 meters thick, and it takes 38 hours to reach the destination.

So far, Dr. Hawaat has published 48 peer-reviewed scientific articles and will deliver a speech at Rio+20. The collected and analyzed data reveal a steady increase in natural calamities since 1972, getting worse by the year.

Toufic does visit his hometown in Lebanon now and then with his family (his wife is also from Lebanon), but he claims that the air and water in Lebanon are too polluted to reside here. He has no plans of settling in Lebanon…

Note: This post was inspired by an interview conducted by Pascal Azar for the Lebanese daily Al Nahar.

Natural energy regeneration: Sun plus water produce methane and oxygen

Oxygen and methane combine to generate heat, power, water, and carbon dioxide: sort of clean energy resources…

There are several methods for generating “non-toxic” energy from photo-chemical reactions, to micro-organism photosynthesis using photo-bioreactors of algae and bacteria…

New technologies have demonstrated that it is possible to produce methane and oxygen using just sun rays and water… How does it work?

Step 1: Liquid prisms containing water redirect sun rays into bundles of parallel rays.

Step 2: The parallel rays get concentrated and focused using Fresnel-type lenses (the kind used in lighthouses).

Step 3: The focused rays hit an “optofluidic reactor” made of microscopic translucent tubes. Water and CO2 are injected into the tubes.

Step 4: The tubes are coated with a titanium-dioxide catalyst that accelerates the chemical decomposition into methane and O2.
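The overall reaction in the four steps above is the reverse of methane combustion. A back-of-the-envelope energy check makes clear why concentrated sunlight is needed (a sketch: the reactor details are the article’s, the enthalpy values are standard-table figures, and the arithmetic is mine):

```python
# Reverse-combustion reaction driven by the concentrated sunlight:
#   CO2 + 2 H2O -> CH4 + 2 O2
# Standard enthalpies of formation (kJ/mol, 25 deg C, liquid water):
dHf = {"CO2": -393.5, "H2O": -285.8, "CH4": -74.8, "O2": 0.0}

products = dHf["CH4"] + 2 * dHf["O2"]
reactants = dHf["CO2"] + 2 * dHf["H2O"]
dH_reaction = products - reactants  # positive => energy must be supplied

print(f"dH = {dH_reaction:+.1f} kJ per mol of CH4 produced")
```

The positive sign shows the reaction is strongly endothermic: producing one mole of methane absorbs roughly the same ~890 kJ that burning it releases, which is the energy the focused sun rays have to supply.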

Scientific researcher Demetri Psaltis at the Lausanne Polytechnic School (EPFL) published this mechanism in the magazine “Nature Photonics”.

Microscale tubes increase chemical reaction rates a thousandfold, but industrial production of these special micro-tubes may be the main difficulty for industrial production of O2 and methane.  In any case, this was a problem for solar cells 20 years ago, and it was resolved once governments got involved and poured in the necessary funds.

The other hurdle is how to clean the millions of micro-tubes, as organic matter and bacteria will pollute the “reactors”.

In any case, Greek researchers are adding pieces of cheese and spoiled milk to industrial batteries to increase performance.

Note: You may access this piece electronically on http://www.courrierinternational.com. There is no lock on this article and you may visualize the schematics.

“A short history of nearly everything” by Bill Bryson

Eco-system

Thomas Midgley Junior was an engineer by training, and he developed an interest in the industrial applications of chemistry.  With an instinct for the regrettable that was almost uncanny, Midgley invented the chlorofluorocarbons (CFCs) that are eating up the ozone layer in our stratosphere.

Midgley also developed tetraethyl lead, which spread devastation to human health, killing millions through lead contamination and increasing the lead content in our bones and blood to 650 times the natural level.

Tetraethyl lead was used to significantly reduce the “juddering” condition known as engine knock.  GM, Du Pont, and Standard Oil of New Jersey formed a joint enterprise called the Ethyl Gasoline Corporation with a view to making as much tetraethyl lead as the world was willing to buy, and introduced the product in 1923.

Lead can be found in all manner of consumer products; food came in cans sealed with lead solder, water was stored in lead-lined tanks, and lead arsenate was sprayed onto fruit as a pesticide and even as part of the composition of toothpaste tubes.

However, lead’s most lasting danger came as an additive to motor fuel.

Clair Patterson turned his attention to the question of all the lead in the atmosphere: about 90% of it appeared to come from car exhaust pipes.  He set about comparing lead levels in the atmosphere now with the levels that existed before 1923.

His ingenious idea was to evaluate these levels from samples of the ice cores in places like Greenland. This notion became the foundation of ice-core studies, on which much modern climatological work is based.

Patterson found no lead in the atmosphere before 1923.  The Ethyl Corporation counter-attacked by cutting off all research grants that Patterson received.  Although Patterson was unquestionably America’s leading expert on atmospheric lead, the National Research Council panel excluded him in 1971.

Eventually, his efforts led to the Clean Air Act of 1970 and to the removal from sale of all leaded petrol in the USA in 1986.  Lead levels in the blood of Americans fell by 80% almost within a year; but since the atmosphere still contains so much lead, which cannot be eliminated and lasts forever, we have to live with a permanently higher concentration of lead in our bloodstream and our bones.

Lead in paint was also banned in 1993, 44 years after Europe had banned it.  Leaded gasoline is still being sold overseas.  Ironically, all the research on lead’s health effects was funded by the Ethyl Corporation; one doctor spent 5 years taking samples of urine and feces instead of blood and bones, where lead accumulates.

Refrigerators in the 1920s used dangerous gases, and leaks killed more than a hundred people in a Cleveland hospital in 1929.  Thomas Midgley came to the rescue with a safe, stable, non-corrosive, and non-flammable gas called CFC.

A single kilo of chlorofluorocarbon can capture and annihilate 70,000 kilos of atmospheric ozone.  The ozone layer, which if compressed would be no thicker than a couple of millimeters around the stratosphere, has the benefit of capturing dangerous ultraviolet rays.

CFCs are also great heat sponges, 10,000 times more efficient than carbon dioxide, the gas responsible for the greenhouse effect of increasing atmospheric temperature.

CFCs were banned in 1974 in the USA, but 27 million kilos a year are still being introduced into the market in other forms, in deodorants or hairsprays for example.  CFCs will not be banned in third-world countries until 2010.

The natural level of carbon dioxide in the atmosphere should be about 280 parts per million, but it has increased to 360; it is rising roughly 0.025% a year and might be around 560 by the end of the century.
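A quick compound-growth check on these figures (the ppm values and the quoted rate are the post’s; the arithmetic is mine, and it suggests the quoted annual rate is far too small to reach 560 ppm by century’s end):

```python
# Compound-growth check on the CO2 figures quoted above.
ppm_now, ppm_target, years = 360.0, 560.0, 90

# What annual growth rate would take 360 ppm to ~560 ppm in ~90 years?
required_rate = (ppm_target / ppm_now) ** (1 / years) - 1
print(f"Required annual growth: {100 * required_rate:.2f}% per year")

# And what the quoted 0.025%/year would actually produce:
ppm_at_quoted_rate = ppm_now * (1 + 0.00025) ** years
print(f"At 0.025%/yr, after {years} years: {ppm_at_quoted_rate:.0f} ppm")
```

The required rate comes out near half a percent per year, twenty times the quoted 0.025%, so one of the two figures in the paragraph above cannot be right as stated.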

The seas soak up tremendous volumes of carbon and safely lock it away.  Since the Sun is burning 25% more brightly than when the solar system was young, what keeps our Earth stable and cool?

It seems that there are trillions upon trillions of tiny marine organisms that capture carbon from the rainfall and use it to make tiny shells. These marine organisms lock up the carbon and prevent it from re-evaporating into the atmosphere; otherwise, the greenhouse effect of warming the atmosphere would have done much damage a long time ago. These tiny organisms fall to the bottom of the sea when they die, where they are compressed into limestone.

Volcanoes and the decay of plants return carbon to the atmosphere at a rate of 200 billion tonnes a year, and it falls back to Earth in rain.  The cycle takes 500,000 years for a typical carbon atom.  Fortunately, most of the rain falls on the oceans, because 60% of the rain that falls on land evaporates within a couple of days.

Humans have disturbed this cycle since the heavy industrialization era and are lofting about 7 billion tonnes each year.
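To put the human contribution in perspective against the natural flux quoted above (both figures are the book’s; the division is mine):

```python
# Comparing the human carbon flux with the natural one quoted above.
natural_flux_gt = 200.0  # billion tonnes/year, volcanoes + plant decay
human_flux_gt = 7.0      # billion tonnes/year, industrial-era emissions

extra = human_flux_gt / natural_flux_gt
print(f"Human emissions add {100 * extra:.1f}% on top of the natural flux")
```

A few percent on top of a balanced cycle sounds small, but it is a one-way addition that the cycle has no matching sink for, which is the point of the threshold warning below.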

There is a critical threshold where the natural biosphere stops buffering us from the effects of our emissions and actually starts to amplify them.

Brace for the worst-case nuclear scenario. What is “systemic degradation of work ethics in nuclear power plants”?

You are warned: Brace for the worst-case nuclear scenario.  The next catastrophe is within a year!

Japan has uncovered other problems in yet another series of nuclear power plants, not the Fukushima plants that are no longer under control.  This is the tip of the iceberg, not the worst-case scenario that is setting the world community in turmoil.  Nor is it that earthquakes are spreading westward to Myanmar, India, Pakistan, Iran… countries with many nuclear power plants that are not very well maintained.

What is hiding beneath the iceberg of catastrophe is the “systemic degradation of work ethics in nuclear power plants”.  Four decades ago, most civilian nuclear power plants were built, maintained, and controlled by governments.  Each plant had its own trained and professional personnel and experts.  The operators knew exactly the problems of their plants and kept accurate and frank diagnostic procedures.

The operators were taught to immediately bring problems, however minor, to the attention of their superiors.  Frank discussions ensued.  Engineers and scientists were not as confident as they sound nowadays:  They were on a mission to learn, as they progressed, about the dangers and modeling of nuclear power.  The operators were dedicated and stable in their jobs.

Two decades ago, hell broke loose:  Government-owned nuclear power plants were “privatized”.  The owners were supposed to infuse fresh investment into upgrading the plants.  It didn’t work that way.

The owners quickly started to give trouble to the dedicated professional operators and laid them off as soon as they insisted on the existence of a problem in the functioning of the plants.  The owners’ purpose was to make a profit out of the remaining life of a plant:  So long as electricity was produced, the less money spent on repairs and maintenance, the better.

Incidents judged minor were ignored.  Accidents that had the potential to deteriorate were hushed up.  The exact numbers of incidents and minor accidents were not divulged.  Maintenance of nuclear plants was subcontracted, and maintenance periods were spaced out and shortened to save money.

The contracting companies hired low-waged operators, constantly on the road, not of the professional-operator kind.  These subcontracted operators did what they were told to do and left the premises without knowing exactly what the real problems were.  Many of them received heavy doses of radiation without knowing their medium-term health prospects.

The owners of the plants subcontracted in order not to bear any financial or legal liabilities:  If any catastrophe hits, everyone will be off the hook.  Most of the nuclear power plants in France have experienced many accidents, a few of them over 30 accidents each, without these being formally disclosed to public attention.  Most of France’s more than 55 nuclear plants are owned by private electrical companies with degraded work ethics:  A chain reaction of nuclear catastrophes can be expected to spread in France.

It is urgent that all governments re-nationalize their nuclear power plants and pay off those rascals of private owners from the profits generated.

It is urgent that several independent panels of experts, on the government payroll, review the conditions of the plants and close degraded and old plants immediately.

It is urgent that each plant have its own dedicated and stable operators, on the government payroll.

Note:  When I published this post three weeks ago, it was obvious that vast misrepresentation of the level of danger was underway at the highest levels.  It appears that the Fukushima catastrophe is far worse than Chernobyl.  The nuclear scientists would have to increase the scale to over 9 in order to categorize the consequences of the danger of Fukushima.  So far, the ocean is heavily contaminated over 400 miles out, meaning any living form in the water is dead or soon to be dead.  It is like taking a swim in contaminated Lake Baikal, in Russia, and dying.  The water of Lake Baikal is clear and cobalt blue and terribly dead, like the water used to cool the nuclear reactor!  If river water is also contaminated, how would you feel living in Japan?

So far, the nuclear meltdowns at Japan’s nuclear power plants are the third worst, after Chernobyl (Ukraine) and Three Mile Island (USA).  I don’t know:  With the successive aftershocks of about 6 on the Richter scale and a fourth plant acting up, consequences are not that encouraging for the foreseeable future. The collective intelligence of the Japanese didn’t wait for their government to announce any warnings or reports:  They just fled; the farther, the sooner, the quicker, the better.  Every year, the Japanese mourn the victims of the Hiroshima and Nagasaki nuclear bombs dropped by the USA in World War II.

The latest news is that this catastrophe has surpassed Three Mile Island and that levels of radiation contamination have increased tenfold.  Only 50 professional operators have been working around the clock, and if no foreign specialized teams are flown in to replace the exhausted Japanese workers, things might get out of hand.

I am wondering: if Japan invested in technologies for storing and transferring tsunami power, wouldn’t tsunamis become a friendly yearly event, like rain, sunshine, wind…?

The main reasons for Japan’s decision to be dotted with nuclear power plants are understandable:  Japan has no oil or gas resources and has to import its fuel and liquefied gas by giant sea carriers.  Actually, Japan has been developing mini transportable nuclear plants that are self-maintained, last five years, and cost a fraction of traditional plants https://adonis49.wordpress.com/2010/04/10/mini-nuclear-reactors-manufactured-in-series/.

France already generates 60% of its energy from nuclear sources, and it has excess refined car fuel that it cannot find a market for, because France has shifted to energy-efficient cars and substituted cleaner fuels.  France is about to be a major market for performing electric cars.  Germany decided a couple of days ago on a 3-month moratorium on constructing nuclear power plants. The US was readying a program to re-launch a series of such plants:  Most probably, that program will be revised.

The drawback is that Japan is an unstable island, geologically stuck smack on a major volcanic and seismic tectonic-plate fault.  The earthquake that hit Japan, centered 150 miles out in the deep ocean, was several times more powerful than the worst earthquake the nuclear power plant was built for (the Richter scale works logarithmically: the 0.7 difference between the 8.2 that the plants were built for and the 8.9 that happened means about 5 times the ground-motion amplitude, not 0.7 times). So the first hooray for Japanese engineering: everything held up.
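The logarithmic comparison can be checked directly (a sketch of the standard scaling: ground-motion amplitude goes as 10 to the magnitude difference, radiated energy as 10 to 1.5 times it):

```python
# Magnitude difference between what the plant was designed for (8.2)
# and what occurred (8.9):
dM = 8.9 - 8.2

amplitude_ratio = 10 ** dM       # ground-motion amplitude ratio
energy_ratio = 10 ** (1.5 * dM)  # radiated seismic energy ratio

print(f"Amplitude: {amplitude_ratio:.1f}x stronger")
print(f"Energy:    {energy_ratio:.1f}x more")
```

So the quake shook the plants with about 5 times the design-basis amplitude, and released on the order of 11 times the design-basis energy.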

When the earthquake hit with 8.9 force, the nuclear reactors all went into automatic shutdown. Within seconds after the earthquake started, the control rods had been inserted into the core, and the nuclear chain reaction of the uranium stopped.

The cooling system then has to carry away the residual heat, which is about 3% of the heat load under normal operating conditions.  The earthquake destroyed the external power-supply lines of the nuclear reactor. This “plant blackout” emergency receives a lot of attention when designing the backup systems.  Since the power plant had been shut down, it could no longer produce any electricity by itself to keep the coolant pumps working.
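The 3% figure can be put in context with the classic Way–Wigner decay-heat approximation (a sketch: the formula is a textbook rule of thumb, and the 2,400 MW thermal rating and one-year operating history are illustrative assumptions, not figures from this post):

```python
# Decay-heat sketch using the Way-Wigner approximation.
def decay_heat_fraction(t_s, T_s=1.0e7):
    """Fraction of full thermal power t_s seconds after shutdown,
    for a reactor operated T_s seconds (~4 months+) before shutdown."""
    return 0.0622 * (t_s ** -0.2 - (t_s + T_s) ** -0.2)

# A hypothetical 2,400 MW-thermal BWR (illustrative rating only):
P0 = 2400.0  # MW thermal
for t, label in [(60, "1 minute"), (3600, "1 hour"), (86400, "1 day")]:
    f = decay_heat_fraction(t)
    print(f"{label} after shutdown: {100*f:.1f}% -> {P0*f:.0f} MW to remove")
```

The point of the exercise: the fraction starts in the few-percent range (consistent with the 3% quoted above) and decays slowly, so tens of megawatts still have to be carried away hours and days after the chain reaction has stopped, which is why losing the coolant pumps is so serious.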

Things were going well for an hour. One of the multiple sets of emergency diesel power generators kicked in and provided the electricity that was needed. Then the tsunami hit, much bigger than people had expected when building the power plant. The tsunami took out all the sets of backup diesel generators.

When the diesel generators were gone, the reactor operators switched to emergency battery power. The batteries were designed as one of the backups to the backups, to provide power for cooling the core for 8 hours. And they did.

Within the 8 hours, another power source had to be found and connected to the power plant. The power grid was down due to the earthquake. The diesel generators were destroyed by the tsunami. So mobile diesel generators were trucked in.

This is where things started to go seriously wrong. The external power generators could not be connected to the power plant (the plugs did not fit).  Another proof that you cannot design fail-safe dangerous systems. So after the batteries ran out, the residual heat could not be carried away any more.

At this point the plant operators began to follow the emergency procedures in place for a “loss of cooling event”. This is again a step along the “Defense in Depth” lines. The power to the cooling systems should never have failed completely, but it did, so they “retreated” to the next line of defense. All these procedures are part of the day-to-day training you go through as an operator, right through to managing a core meltdown.

It was at this stage that people started to talk about core meltdown:  If cooling cannot be restored, the core will eventually melt, and the last line of defense, the core catcher and third containment, would come in handy for a little while.

But the goal at this stage was to manage the core while it was heating up, and ensure that the first containment (the Zircaloy tubes that contain the nuclear fuel), as well as the second containment (the pressure cooker), remained intact and operational for as long as possible, to give the engineers time to fix the cooling systems.

Because cooling the core is such a big deal, the reactor has a number of cooling systems, each in multiple versions (the reactor water cleanup system, the decay heat removal, the reactor core isolating cooling, the standby liquid cooling system, and the emergency core cooling system). Which cooling system failed is not clear at this point in time.

The plants at Fukushima are of the Boiling Water Reactor (BWR) type. They are similar to a pressure cooker: The nuclear fuel heats water, the water boils and creates steam, the steam drives turbines that create the electricity. The steam is then cooled and condensed back into water, and the water is sent back to be heated by the nuclear fuel. The pressure cooker operates at about 250 °C.

The nuclear fuel is uranium oxide, manufactured as small ceramic cylindrical pellets, like Lego bricks, with a very high melting point of about 3,000 °C. The fuel pellets are inserted into a long tube made of Zircaloy, with a melting point of 2,200 °C, and sealed tight. The assembly is called a fuel rod. These fuel rods are put together to form larger packages, and a number of these packages constitute the reactor core.

The Zircaloy casing is the first containment. It separates the radioactive uranium fuel from the rest of the world. The core is placed in the “pressure vessel”, the pressure cooker. The pressure vessel is the second containment. This is one sturdy piece of a pot, designed to safely contain the core at temperatures of several hundred °C. That covers the scenarios where cooling can be restored at some point.

The entire “hardware” of the nuclear reactor – the pressure vessel and all pipes, pumps, coolant (water) reserves, are then encased in the third containment.

The third containment is a hermetically sealed (air-tight), very thick bubble of the strongest steel. The third containment is designed, built and tested for one single purpose: to contain, indefinitely, a complete core meltdown. For that purpose, a large and thick concrete basin filled with graphite is cast under the pressure vessel, all inside the third containment. This is the “core catcher”. If the core melts and the pressure vessel bursts (and eventually melts), the basin will catch the molten fuel and everything else. It is arranged in such a way that the nuclear fuel will be spread out, so it can cool down faster.

This third containment is surrounded by the reactor building, an outer shell that is supposed to keep the weather out, but nothing in. According to the initial information, this is the part that was damaged in the explosion.

The uranium fuel generates heat by nuclear fission. Uranium atoms, the biggest of atoms, are split into smaller atoms when hit by neutrons. This fission generates heat plus more neutrons. When one of those neutrons hits another uranium atom, it splits in turn, sustaining the “nuclear chain reaction”.
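The chain-reaction bookkeeping can be sketched in a few lines. The numbers are purely illustrative; the only real physics here is the effective multiplication factor k, which tells whether the neutron population grows (k > 1), holds steady (k = 1), or dies out (k < 1):

```python
def neutron_population(n0, k, generations):
    """Neutron count after a number of generations with multiplication factor k."""
    n = n0
    for _ in range(generations):
        n *= k  # each generation, every neutron yields k neutrons on average
    return n

# Illustrative runs (arbitrary starting population):
print(neutron_population(1000, 1.0, 10))  # critical: population is steady
print(neutron_population(1000, 0.9, 10))  # subcritical: population dies out
```

Inserting the control rods described below amounts to forcing k well under 1.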

The nuclear fuel in a reactor can NEVER cause a nuclear explosion of the kind produced by a nuclear bomb. In Chernobyl, the explosion was caused by excessive pressure buildup, a hydrogen explosion and the rupture of all containments, propelling molten core material into the environment, a “dirty bomb”.

In order to control the nuclear chain reaction, the reactor operators use “control rods” that absorb the neutrons and kill the chain reaction instantaneously. When operating normally, the control rods are all pulled out, and the coolant water carries away the heat at the same rate as the core produces it, around the standard operating point of 250 °C.

The challenge is that after inserting the rods and stopping the chain reaction, the core still keeps producing heat. The uranium itself has “stopped” its chain reaction, but a number of intermediate radioactive elements (like cesium and iodine isotopes) were created by the uranium during its fission process. These isotopes are radioactive versions of ordinary elements; they will eventually split into smaller atoms and cease to be radioactive.

Those isotopes keep decaying and producing heat. But because they are no longer regenerated by the uranium, they get less and less, and the core cools down over a matter of days, until those intermediate radioactive elements are used up. This residual heat is what is causing the current headaches.
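The shape of that fall-off can be sketched with a classic textbook rule of thumb, the Way-Wigner approximation for decay heat. This is a generic empirical formula, not Fukushima-specific data, and the one-year operating time assumed below is an illustration:

```python
def decay_heat_fraction(t_seconds, operating_seconds):
    """Way-Wigner empirical approximation: decay heat as a fraction of
    full power, t_seconds after shutdown, following operating_seconds
    of steady operation."""
    return 0.0622 * (t_seconds ** -0.2 - (t_seconds + operating_seconds) ** -0.2)

one_year = 365 * 24 * 3600  # assumed prior operating time, seconds
for t in (10, 3600, 24 * 3600, 7 * 24 * 3600):  # 10 s, 1 hour, 1 day, 1 week
    pct = 100 * decay_heat_fraction(t, one_year)
    print(f"t = {t:>7d} s after shutdown: {pct:.2f}% of full power")
```

The formula reproduces the “about 3%” figure just after shutdown and shows why the danger shrinks over days rather than disappearing at the flip of a switch.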

There is a second type of radioactive material, created outside the fuel rods, with very short half-lives of seconds, after which it splits into non-radioactive materials. So if these materials are released into the environment, the released radioactivity is not dangerous at all. They include N-16, the radioactive isotope of nitrogen (from the air), and a few noble gases such as xenon: the neutron flux is the cause of these short-lived radioactive elements released into the environment.
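Why a half-life of seconds is harmless in practice is simple arithmetic: the fraction of an isotope remaining after time t is (1/2)^(t / half-life). The N-16 half-life of about 7.1 s is a standard reference value:

```python
def fraction_remaining(t_seconds, half_life_seconds):
    """Fraction of a radioactive isotope left after t_seconds."""
    return 0.5 ** (t_seconds / half_life_seconds)

n16_half_life = 7.1  # seconds (standard reference value for N-16)
print(f"After 1 minute: {fraction_remaining(60, n16_half_life):.4%} of N-16 remains")
```

After a single minute, well under 1% of the N-16 is left; by the time vented steam disperses, essentially nothing radioactive of this type remains.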

So imagine our pressure cooker on the stove, heat on low, but on. The operators use whatever cooling system capacity they have to get rid of as much heat as possible, but the pressure starts building up. The priority now is to maintain integrity of the first containment (keep temperature of the fuel rods below 2200°C), as well as the second containment, the pressure cooker.

In order to maintain integrity of the pressure cooker (the second containment), the pressure has to be released from time to time. Because the ability to do that in an emergency is so important, the reactor has 11 pressure release valves. The operators now started venting steam from time to time to control the pressure. The temperature at this stage was about 550°C.

This is when the reports about “radiation leakage” started coming in: Venting the steam is, technically, releasing radiation into the environment. But the radioactive nitrogen and the noble gases do not pose a threat to human health because of their half-lives of seconds.

At some stage during this venting, the explosion occurred. It took place outside of the third containment (the “last line of defense”) but inside the reactor building. The operators had decided to vent the steam from the pressure vessel not directly into the environment, but into the space between the third containment and the reactor building, to give the radioactivity in the steam more time to subside.

The problem is that at the high temperatures the core had reached at this stage, water molecules can “dissociate” into oxygen and hydrogen, an explosive mixture. And it did explode, outside the third containment, damaging the reactor building around it. It was that sort of explosion, but inside the pressure vessel, that led to the explosion at Chernobyl.

That scenario is believed never to have been a risk at Fukushima. The problem of hydrogen-oxygen formation is one of the biggies when you design a power plant, so the reactor is built and operated in a way that it cannot happen inside the containment. It happened outside, which was not intended but was a foreseen and acceptable scenario, because it did not pose a risk to the containment.

So the pressure was under control, as steam was vented. Now, if you keep boiling your pot, the problem is that the water level will keep falling and falling. The core is covered by several meters of water in order to allow for some time to pass (hours, days) before it gets exposed. Once the rods start to be exposed at the top, the exposed parts will reach the critical temperature of 2200 °C after about 45 minutes. This is when the first containment, the Zircaloy tube, would fail.

And this started to happen. The cooling could not be restored before there was some damage to the casing of some of the fuel. The nuclear material itself was still intact, but the surrounding Zircaloy shell had started melting.

What happened now is that some of the byproducts of the uranium decay, radioactive cesium and iodine, started to mix with the steam. The big problem, the uranium itself, was still under control, because the uranium oxide pellets are good up to 3,000 °C. It is confirmed that a very small amount of cesium and iodine was measured in the steam that was released into the atmosphere.

It seems this was the “go” signal for a major plan B. The small amounts of cesium that were measured told the operators that the first containment on one of the rods was about to give. Plan A had been to restore one of the regular cooling systems to the core. Why that failed is unclear. One plausible explanation is that the tsunami took away or polluted all the clean water needed for the regular cooling systems.

The water used in the cooling system is very clean, like distilled water: Pure water does not get activated much and stays practically radioactive-free. Dirt or salt in the water will absorb the neutrons quicker, becoming more radioactive. This has no effect whatsoever on the core – it does not care what it is cooled by. But it makes life more difficult for the operators and mechanics when they have to deal with activated (i.e. slightly radioactive) water.

But Plan A had failed – cooling systems down or additional clean water unavailable – so Plan B came into effect. This is what it looks like happened:

In order to prevent a core meltdown, the operators started to use sea water to cool the core. The nuclear fuel has by now been cooled down: because the chain reaction was stopped a long time ago, only very little residual heat is being produced, and the large amount of cooling water is sufficient to take it up. With that much water, the core no longer produces enough heat to build up any significant pressure.
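Why “a lot of water” suffices is again a one-line heat balance, Q = ṁ·c·ΔT. The residual-heat and temperature-rise figures below are assumptions for illustration; the specific heat of water is a physical constant:

```python
# Seawater flow needed to carry away the residual heat, from Q = m_dot * c * dT.
residual_heat_w = 10e6   # assumed residual heat at this late stage, ~10 MW
c_water = 4186.0         # specific heat of water, J/(kg*K)
delta_t = 50.0           # assumed allowed temperature rise of the water, K

flow_kg_per_s = residual_heat_w / (c_water * delta_t)
print(f"Required seawater flow: about {flow_kg_per_s:.0f} kg/s")
```

A flow of a few tens of kilograms per second is modest next to what fire pumps and the sea can supply, which is why seawater injection works once pride in saving the reactor is abandoned.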

Boric acid has been added to the seawater. Boric acid is a “liquid control rod”: whatever decay is still going on, the boron will capture the neutrons and further speed up the cooling of the core.

The plant came close to a core meltdown. Here is the worst-case scenario that was avoided: If seawater could not have been used, the operators would have continued to vent the steam to avoid pressure buildup, and the third containment would have been completely sealed to allow the core meltdown to happen without releasing radioactive material.

After the meltdown, there would have been a waiting period for the intermediate radioactive materials to decay inside the reactor, and all radioactive particles to settle on a surface inside the containment. The cooling system would have been restored eventually, and the molten core cooled to a manageable temperature. The containment would have been cleaned up on the inside. The messy job of removing the molten core from the containment would have begun, packing the (now solid again) fuel bit by bit into transportation containers to be shipped to processing plants.

Before the news of the failure in the fourth reactor, one could surmise that the damage would be contained and repaired within 5 years. The reactor cores would be dismantled and transported to a processing facility, just as during a regular fuel change, and the fuel rods and the entire plant would be checked for potential damage.

The safety systems on all Japanese plants will be upgraded to withstand a 9.0 earthquake and tsunami (or worse), but the most significant problem will be a prolonged power shortage. About half of Japan’s nuclear reactors will probably have to be inspected, reducing the nation’s power generating capacity by 15%. This will probably be covered by running gas power plants, usually reserved for peak loads, to cover some of the base load as well. That will increase electricity bills and could lead to power shortages during peak demand in Japan.

The lesson learned so far: Japan suffered an earthquake and tsunami of unprecedented proportion that has caused unbelievable damage to every part of their infrastructure, and death of very large numbers of people. The media have chosen to report the damage to a nuclear plant which was, and still is, unlikely to harm anyone.

From early Saturday morning, nuclear activists were on TV labelling this “the third worst nuclear accident ever”. This was no accident (in the sense of man-made operating negligence or a fault in fundamental design according to the latest technologies); this was damage caused by truly one of the worst earthquakes and tsunamis ever.

The second lesson is for the engineers: We all know that a water reactor has one principal characteristic, when it shuts down, that has to be looked after. Water must flow around the fuel rods, and it must be possible to inject more into the reactor if some is lost through a sticking relief valve or any other cause; for this, the plant must have backup power for the pumps and injection systems.

The designers apparently could not imagine a tsunami of these proportions, and the backup power systems, multiple outside power lines, banks of diesels to produce backup power, and finally banks of batteries to back those up, were disabled one after another. There is still a lot the operators can do, did and are doing. But reactors were damaged, and they need not have been, even by this unthinkable earthquake, had the backup power systems been designed to be impregnable, not an impossible thing for an engineer to do.

So we have damage that probably could have been avoided, and reporting of almost stunning inaccuracy and ignorance. Still, the odds are that no one will be hurt by radioactivity; a few workers were injured in falls or in the hydrogen explosions, but that is tiny on the scale of the damage and death all around the plant.

A few arrogant nuclear States will voice their ignorance: “Of course our nuclear program is not going to be affected by an earthquake in Japan.” Beware of the louder voices: cataclysms hit when nobody is ready, and in devious, unforeseeable ways.

Note 1: In 1938, the German chemist Otto Hahn bombarded a uranium atom with a neutron. It was assumed that the uranium core would absorb the neutron and increase in weight. Instead, the uranium split into two lighter atoms, releasing neutrons. Lise Meitner, an Austrian physicist, calculated the energy released, and it fit Einstein’s equation: the lost mass multiplied by the square of the speed of light. A race to harness the nuclear chain reaction and create the first atomic bomb was on.
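The Meitner calculation can be redone in miniature. The roughly 200 MeV released per U-235 fission is a standard textbook value, and it corresponds to a mass loss of about 0.2 atomic mass units via E = mc²:

```python
# Energy released by one fission from the mass defect, E = m * c^2.
c = 2.998e8               # speed of light, m/s
amu_kg = 1.6605e-27       # one atomic mass unit in kg
mass_defect_amu = 0.215   # approximate mass lost per U-235 fission

energy_j = mass_defect_amu * amu_kg * c ** 2
energy_mev = energy_j / 1.602e-13  # 1 MeV = 1.602e-13 J
print(f"Energy per fission: about {energy_mev:.0f} MeV")
```

Per atom that is a minuscule amount of energy, but a gram of uranium contains on the order of 10²¹ atoms, which is why fission dwarfs any chemical fuel.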

Note 2:  Information on the construction of the Fukushima nuclear power plant and the consequences were extracted from articles published on wordpress.com

Scientists have been claiming for the last 30 years that science is stranger than science-fiction stories. So far, neither interviewers nor respondents have attempted to clarify what is meant by “strange” before extending answers and comments. Nor do we have a clear idea of what is meant by “science”: are we talking about the natural sciences (labeled the hard sciences), or are we also including the life, human and social sciences such as biology, neuroscience, psychology, medicine…

For example, with the launching of space programs in the late 70’s, many editors of science fiction complained that actual space programs had made fictional space-trip stories look redundant.

It appears that what is meant by science here is hard science. The natural or “hard” sciences are indeed stranger than science-fiction stories. Why?

First, the sciences are not backed by any validation process accessible to ordinary people, not even through advanced technology: few people are specialized and involved in the sciences, while most common people take the word of scientists for granted for a single day, until they read or hear alternative “truths or facts”. Science fiction, by contrast, is supported by narrative logic and fictitious rationality, made easy to understand by well-written stories.

Second, scientists claim that the sciences are neutral. I don’t think anyone can get excited by a neutral approach that does not disturb their state of mind, though scientists are big liars in matters of neutrality. Science fiction is based on current frustration, disorientation, doubt, fear… and thus is not neutral: it offers a release valve for believing in a better future.

Third, the sciences talk about cosmology, nano particles, the expansion of the universe, quantum mechanics, relativity theory, chaos theory… none of which concerns common people. Science fiction describes possibilities of living in different societies, with different customs, in highly man-made environments managed and controlled by robots. Science fiction extends our horizon and forces us to re-evaluate our values and the meaning of man and life.

Fourth, the sciences are no longer driving technological advances. Technology is short-circuiting the sciences and has reduced them to an “after-thought” validation of a technological invention or process, trying to explain why the technology actually works. Technology is interested only in how it works: just try to comprehend the manual of how any device functions. Common people do not care why a device works and are ready to experiment with and use it, even if safety and health factors were not investigated and tested before the release of a version. Science fiction tries to describe the why and the how in layman’s terms, along with the implications of technology in our daily life and its consequences for our near future.

Fifth, the sciences are boring and insipid to common people, while science fiction is here to last in our dreams.

Sixth, the sciences are done within clubs of professionals reading “peer-reviewed” articles, while science-fiction authors communicate with many sources of intelligence and a wide audience: safety, health and survival matter more in how heroes and protagonists interact in the story.

Seventh, the sciences are not perceived as factors for change; technology and science fiction are. Science fiction is admitted to be a literature for change, a literature that catalyzes children into growing up in worlds radically different from their parents’.

Eighth, technology gave science fiction a big boost via video games and new kinds of movies such as “Star Wars”, “The Matrix”, and the 3D versions. The sciences do not appear to have made a comparable impact on the imagination of science-fiction authors.

Science fiction was originally based on theories of the hard sciences, particularly on mechanical inventions… It is no surprise that transistors and computer technologies were not predicted in science fiction: when Galvani experimented on the reactions of frog muscles in the 18th century, applying electrical impulses or shocks, it was done on live subjects, in a period when all inventions were focused on mechanical devices, on manufacturing mass-production tools for the “industrial age”, and on boosting colonial expansion… For example, Jules Verne’s fictions invariably treated indigenous peoples as a second-grade species, fit for extermination if they retarded “colonial development”…

Suppose the question were: “Which is stranger: the social sciences or science fiction?” I bet that both common people and social scientists would admit that science fiction is far stranger. Why? Every one of us considers himself an expert in psychology and sociology based on personal experience, even if it is a single experience that hurt deeply.

In any case, what we call science fiction nowadays refers to predicting social and human transformations, as well as organizational and control mechanisms. Hard science is not exciting and has stopped inspiring science-fiction authors’ imaginations.
