Adonis Diaries

Archive for the ‘professional articles’ Category

No gluten sensitivity anymore? And are the reactions the consequences of the “sensitivities” we are subjected to?

Posted on April 4, 2016

People can go back now to just eating wheat?

In one of the best examples of science at work, a researcher who provided key evidence of (non-celiac disease) gluten sensitivity recently published follow-up papers that show the opposite.

The paper came out last year in the journal Gastroenterology.

This backstory makes us cheer: The study was a follow up on a 2011 experiment in the lab of Peter Gibson at Monash University in Australia.

The scientifically sound – but small – study found that gluten-containing diets can cause gastrointestinal distress in people without celiac disease, a well-known auto-immune disorder triggered by gluten. They called this non-celiac gluten sensitivity.

Maha Issa shared this post By JENNIFER WELSH, Business Insider

Gluten is a protein composite found in wheat, barley, and other grains. It gives bread its chewiness and is often used as a meat substitute: If you’ve ever had ‘wheat meat’, seitan, or mock duck at a Thai restaurant, that’s gluten.

Gluten is a big industry: 30% of people want to eat less gluten. Sales of gluten-free products are estimated to hit $15 billion by 2016.

Although experts estimate that only 1% of Americans – about 3 million people – actually suffer from celiac disease, 18% of adults now buy gluten-free foods.

Since gluten is a protein found in any normal diet, Gibson was unsatisfied with his finding. He wanted to find out why the gluten seemed to be causing this reaction and if there could be something else going on.

He therefore went to a scientifically rigorous extreme for his next experiment, a level not usually expected in nutrition studies. (Really? Health issues, and specifically nutrition studies, don’t require rigorous experiments?)

For a follow-up paper, 37 self-identified gluten-sensitive patients were tested. According to Real Clear Science’s Newton Blog, here’s how the experiment went:

Subjects would be provided with every single meal for the duration of the trial.

Any and all potential dietary triggers for gastrointestinal symptoms would be removed, including lactose (from milk products), certain preservatives like benzoates, propionate, sulfites, and nitrites, and fermentable, poorly absorbed short-chain carbohydrates, also known as FODMAPs.

And last, but not least, nine days’ worth of urine and faecal matter would be collected. With this new study, Gibson wasn’t messing around.

The subjects cycled through high-gluten, low-gluten, and no-gluten (placebo) diets, without knowing which diet plan they were on at any given time.

In the end, all of the treatment diets – even the placebo diet – caused pain, bloating, nausea, and gas to a similar degree. It didn’t matter if the diet contained gluten. (Read more about the study.)

“In contrast to our first study… we could find absolutely No specific response to gluten,” Gibson wrote in the paper.

A third, larger study published this month has confirmed the findings.

It seems to be a ‘nocebo’ effect – the self-diagnosed gluten sensitive patients expected to feel worse on the study diets, so they did. They were also likely more attentive to their intestinal distress, since they had to monitor it for the study.
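The crossover logic behind this nocebo result can be sketched as a toy simulation. Everything numeric here – the baseline score, the size of the expectation effect, the dose values – is invented for illustration; only the shape of the design (37 blinded subjects cycling through three arms) comes from the article.

```python
import random
from statistics import mean

random.seed(0)

def symptom_score(expects_harm: bool, gluten_dose: float, true_effect: float = 0.0) -> float:
    """Symptom score (roughly 0-10) for one subject on one diet.

    Under a pure nocebo effect the score is driven by expectation,
    not by the actual gluten dose (true_effect = 0).
    """
    baseline = 2.0
    nocebo = 3.0 if expects_harm else 0.0
    noise = random.gauss(0, 1)
    return baseline + nocebo + true_effect * gluten_dose + noise

# 37 self-identified gluten-sensitive subjects, each cycled through
# high-gluten, low-gluten, and placebo (no-gluten) diets, blinded:
# they expect harm on every arm because they cannot tell which is which.
doses = {"high": 1.0, "low": 0.5, "placebo": 0.0}
scores = {arm: [symptom_score(True, d) for _ in range(37)] for arm, d in doses.items()}

for arm in doses:
    print(arm, round(mean(scores[arm]), 1))
```

Because every subject expects harm on every blinded arm, the three mean scores come out roughly equal, mirroring the finding that symptoms did not track gluten content.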

On top of that, these other potential dietary triggers – specifically the FODMAPs – could be causing what people have wrongly interpreted as gluten sensitivity. FODMAPs are frequently found in the same foods as gluten.

That still doesn’t explain why people in the study negatively reacted to diets that were free of all dietary triggers.

You can go ahead and smell your bread and eat it too. Science. It works. (But those who suffer dangerous reactions because they are indeed sensitive to gluten should Not?)

Guess what my job is: Human Factors in Engineering?

Posted on June 25, 2009 (Written in November 13, 2005)

“Guess what my job is”

It would be interesting to have a talk with the freshly enrolled engineering students from all fields as to the objectives and meaning of designing products, projects and services.

This talk should be intended to orient engineers toward a procedure that might give their design projects the substance needed to become marketable, and to reduce the pitfalls of having to redesign for failing to consider the health and safety implications of what they produce and conceive.

This design behavior should start right at the freshman level while taking formal courses so that prospective engineers will naturally apply this acquired behavior in their engineering career.

In the talk, the students will have to guess what the Human Factors discipline is from the case studies, exercises and problems that will be discussed.

The engineers will try to answer a few questions that might be implicit, but are never formally or explicitly explained or learned in engineering curriculums, because the necessary courses are generally offered outside their traditional discipline.

A sample of the questions might be as follows:

1. What is the primary job of an engineer?

2. What does design mean?  What do you perceive designing to look like?

3. For whom are you designing?  What category of people?

4. Who are your target users? Engineers, consumers, support personnel, operators?

5. What are your primary criteria in designing?  An error-free application product?

6. Who commits errors?  Can a machine make errors?

7. How can we categorize errors?  Any exposure to an error taxonomy?

8. Can you foresee errors, near accidents, accidents?  Take a range oven for example: expose the foreseeable errors and accidents in the design, specifically the display and control idiosyncrasies.

9. Who is at fault when an error is committed or an accident occurs?

10. Can we practically account for errors without specific task taxonomy?

11. Do you view yourself as responsible for designing interfaces to your design projects depending on the target users?

12. Would you relinquish your responsibilities for being in the team assigned to design an interface for your design project?

13. What kinds of interfaces are needed for your design to be used efficiently?

14. How do engineers solve problems?  By searching for the applicable formulas? Can you figure out the magnitude of the answer?  Have you memorized the allowable range for your answers from the given data and the restrictions imposed in the problem, after solving so many exercises?

15. What are the factors or independent variables that may affect your design project?

16. How can we account for the interactions among the factors?

17. Have you memorized the dimensions of your design problem?

18. Have you been exposed to reading research papers? Can you understand, analyze and interpret the research paper data? Can you have an opinion as to the validity of an experiment?

19. Would you accept the results of any peer-reviewed article as facts that may be readily applied to your design projects? Can you figure out whether the paper is biased or presents confounded results?

20. Do you expect to be in charge of designing any new product or program or procedures in your career?

21. Do you view most of your job career as a series of supporting responsibilities; like just applying already designed programs and procedures?

22. Are you ready to take elective courses in psychology, sociology, marketing, and business targeted to learn how to design experiments and know more about the capabilities, limitations and behavioral trends of target users?

23. Are you planning to go for graduate studies?  Do you know what elective courses might suit you better in your career?

And what are the Brain cells Survival Skills?

Posted on March 4, 2013

Fear beyond the Amygdala
Ranya Bechara posted on Feb. 6, 2013


For decades now, scientists have thought that fear could not be experienced without the amygdala.

This almond-shaped structure, located deep in the brain, has been shown to play an important role in fear-related behaviours, emotions, and memories, and patients with damage to the amygdala on both sides of the brain were thought to be incapable of feeling afraid.

However, a recent study in Nature Neuroscience reports that these ‘fearless’ patients do experience fear if made to inhale carbon dioxide – a procedure that induces feelings of suffocation and panic.

The patients reported being quite surprised at their own fear, and that it was a novel experience for them!

Scientists behind the study have suggested that the way the brain processes fear information depends on the type of stimulus. The results of this study could have important implications for people who suffer from anxiety disorders such as panic attacks and post-traumatic stress disorder (PTSD).

More details can be found here

And how the brain can momentarily react to oxygen deficiency from Strokes?

Can scientists use the brain’s inherent survival mechanisms to develop better stroke treatment?

Strokes are a major cause of death and disability worldwide, with 150,000 people affected in the UK every year.

Most strokes happen when a blood vessel that supplies blood to the brain is blocked due to blood clots or fat deposits. Once blood is cut off from an area of the brain, brain cells are starved for oxygen and nutrients and start to die within minutes.

In a new study in Nature Medicine, scientists at the University of Oxford reveal a novel way in which the brain protects itself in response to stroke.

Ranya Bechara posted on Feb. 27, 2013 “Stroke Vs Brain: Harnessing the Brain’s Survival Skills”

Current treatments for stroke are focussed on breaking up the clots, improving blood flow to the affected area, and ultimately reducing the brain damage caused by the stroke. However, the so-called ‘clot-busters’ are only effective if given within one to two hours of the stroke.

Other ways of protecting the brain against stroke damage are in high demand.

In this study, the research team from Oxford University (in collaboration with researchers from Greece, Germany, Canada, and the UK) decided to try a new approach. They investigated a phenomenon that has been known for years: some brain cells have an inherent defence mechanism that allows them to survive when deprived of oxygen.

These cells are located in the part of the brain responsible for forming memories: a seahorse-shaped structure called the hippocampus.

The scientists analysed the proteins produced by these cells and found that the key to their survival is a protein called hamartin. This protein is released by the cells in response to oxygen deprivation, and when its production was suppressed, the cells became more vulnerable to the effects of stroke.

Original article is available here

Photo credit: http://www.vascularinfo.co.uk


An exercise: taxonomy of methods

Posted on: June 10, 2009

Article #14 in Human Factors

I am going to let you have a hand at classifying methods by providing a list of various methods that could be used in Industrial engineering, Human Factors, Ergonomics, and Industrial Psychology.

This first list of methods is organized in the sequence used to analyze part of a system or a mission.

The second list is not necessarily randomized, though it is thrown in without much order; otherwise it would not be a good exercise.

First, let us agree that a method is a procedure, or a step-by-step process, that our forerunners of geniuses and scholars have tested, found good, agreed on by consensus, and offered for you to use for the benefit of progress and science.

Many of you will still try hard to find shortcuts to anything, including methods, on the petty argument that the best criterion for discriminating among clever people is who wastes time on methods and who the nerds are.

Actually, the main reason I don’t try to teach many new methods in this course (Human Factors in Engineering) is that students might run smack into real occupational stress, which they are Not immune to, especially since methods in human factors are complex and time consuming.

Here is this famous list of a few methods, and you are to decide which ones are still in the conceptual phase and which have been “operationalized”.

The first list contains the following methods:

Operational analysis, activity analysis, critical incidents, function flow, decision/action, action/information analyses, functional allocation, task, fault tree, failure modes and effects analyses, timeline, link analyses, simulation, controlled experimentation,  operational sequence analysis, and workload assessment.

The second list is constituted of methods that human factors are trained to utilize if need be such as:

Verbal protocol, neural network, utility theory, preference judgments, psycho-physical methods, operational research, prototyping, information theory, cost/benefit methods, various statistical modeling packages, and expert systems.

Just wait, let me resume.

There are those that are intrinsic to artificial intelligence methodology such as:

Fuzzy logic, robotics, discrimination nets, pattern matching, knowledge representation, frames, schemata, semantic network, relational databases, searching methods, zero-sum games theory, logical reasoning methods, probabilistic reasoning, learning methods, natural language understanding, image formation and acquisition, connectedness, cellular logic, problem solving techniques, means-end analysis, geometric reasoning system, algebraic reasoning system.

If your education is multidisciplinary you may catalog the above methods according to specialty disciplines such as:

Artificial intelligence, robotics, econometrics, marketing, human factors, industrial engineering, other engineering majors, psychology or mathematics.

The most logical grouping is along the purpose, input, process/procedure, and output/product of the method. Otherwise, it would be impossible to define and understand any method.

Methods could be used to analyze systems, provide heuristic data about human performance, make predictions, generate subjective data, discover the cause and effects of the main factors, or evaluate the human-machine performance of products or systems.

The inputs could be qualitative or quantitative – declarative, categorical, or numerical data generated from structured observations, records, interviews, questionnaires, computer output, or the outputs of prior methods.

The outputs could be point data, behavioral trends, graphical in nature, context specific, generic, or reduction in alternatives.

The process could be a creative graphical or pictorial model; a logical hierarchy or network alternative; operational, empirical, informal, or systematic.
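The purpose/input/process/output grouping can be sketched as a small catalog. The two entries below, and their attribute values, are one plausible reading of the methods listed earlier, not an authoritative classification.

```python
from dataclasses import dataclass

@dataclass
class Method:
    name: str
    purpose: str    # what the method is used for
    input: str      # what data it consumes
    process: str    # how it proceeds
    output: str     # what it produces

catalog = [
    Method("fault tree analysis",
           purpose="discover cause and effect of failures",
           input="categorical event data from records and interviews",
           process="logical hierarchy (top-down deductive tree)",
           output="graphical model and point probabilities"),
    Method("controlled experimentation",
           purpose="evaluate human-machine performance",
           input="numerical data from structured observation",
           process="systematic and empirical",
           output="behavioral trends and statistical estimates"),
]

# Group by purpose, the first axis of the suggested taxonomy:
by_purpose = {}
for m in catalog:
    by_purpose.setdefault(m.purpose, []).append(m.name)
print(by_purpose)
```

The same one-line grouping works for any of the other axes proposed below (mathematical branch, measurement scale, representation form, and so on) by swapping the attribute used as the key.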

You may also group these methods according to their mathematical branches such as algebraic, probabilistic, or geometric.

You may collect them as to their deterministic, statistical-sampling, or probabilistic character.

You may differentiate the methods as belonging to categorical, ordinal, discrete or continuous measurements.

You may wish to investigate the methods as parametric or non-parametric, distribution-free or assuming normally distributed populations.

You may separate them on their representation forms such as verbal, graphical, pictorial, or in table.

You may discriminate them on heuristic, observational, or experimental scientific values.

You may bundle these methods on qualitative or quantitative values.

You may as well separate them on their historical values or modern techniques based on newer technologies.

You may sort them by state of the art: older methods whose validity new information and new paradigms have refuted, versus recently developed ones.

You may define the methods as those digitally or analytically amenable for solving problems.

You may choose to draw several lists of those methods that are economically sound, esoteric, or just plainly fuzzy sounding.

You may opt to differentiate these methods into those requiring a level of mathematical reasoning beyond your capability and those that can be comprehended through persistent effort.

You could as well sort them according to which ones fit nicely into courses you have already taken, but failed to recollect were indeed methods worth acquiring for your career.

You may use any of these taxonomies to answer an optional exam question with no guarantees that you might get a substantial grade.

It would be interesting to collect statistics on how often these methods are being used, by whom, for what rationale, in which line of business, and at which universities.

It would be interesting to translate these methods into Arabic, Chinese, Japanese, Hindi, or Russian.

Vaccines Don’t Mean We’ll See the Last of Covid, Experts Warn

Past immunization campaigns suggest the disease may never be fully eradicated

Why this prediction that the Covid virus will never be eradicated? Because there will always be a large proportion of susceptible people in the community who are Not vaccinated.

By John Lauerman and James Paton. December 20, 2020

In record speed, vaccines are here, and more are on their way. 

Less than a year since the coronavirus began ravaging the world, the first shots are raising hopes for wiping the Covid-19 pandemic from the face of the earth.

Today’s programs in the U.S. and the U.K. are precursors to immunization campaigns intended to reach the planet’s entire population — all 8 billion people in every corner of the globe.

Is there reason for optimism? 

Vaccines are the best way to eliminate infectious disease: Smallpox has been eradicated and polio is on the brink, with just two countries where transmission persists. (How about the countries succumbing to sanctions from receiving vaccines, basic medication and basic food? Like Yemen, Libya, Syria, Palestinians in Gaza and West Bank…?)

But global vaccine campaigns take time — usually decades — suggesting that even with the latest technologies, money and power behind the unprecedented global drive to knock out Covid-19, the disease is unlikely to be eliminated any time soon.

“I would be surprised to see an actual eradication of this virus now that it’s all over the world,” said Walter Orenstein, associate director of the Emory Vaccine Center in Atlanta and former head of the U.S. Centers for Disease Control and Prevention’s immunization program. “I’d be shocked, given how contagious it is.”

Snags in supply and distribution have already arisen in the opening days of the U.S. campaign, and the U.K., the first Western country to begin immunizing, vaccinated just 138,000 people in its first week. Meanwhile, Europe has yet to start inoculations, and probably won’t do so until after Christmas.

Concerns are growing over how long it will take to immunize vast swaths of the world beyond a group of wealthy countries that have snapped up early supplies.

A global program called Covax, which aims to deploy Covid vaccines around the globe, has secured deals with developers including Johnson & Johnson and AstraZeneca Plc.

But some of those supplies are expected to come from an experimental inoculation from Sanofi and GlaxoSmithKline Plc that’s been delayed and may not be ready until late next year.

“It’s really, really complicated to make sure we get those vaccines produced and distributed in an equitable way globally, for both moral and economic reasons,” Mark Suzman, chief executive officer of the Bill & Melinda Gates Foundation, told reporters on a Dec. 9 call.

Suzman pointed to research showing that broad access to vaccines could deliver significant economic benefits to all countries and save many lives.

Since wealthy nations will likely have more than enough doses to vaccinate their entire populations, they should consider the reallocation of some supplies to those most in need, he said.

Smallpox Vaccination - NYC outbreak 1947
People line up for smallpox vaccinations outside a hospital in the Bronx after an outbreak in New York City in 1947. Photographer: Bettmann/Getty Images

Mass vaccination has been one of the most successful public health interventions in the world and has played an important part in raising U.S. life expectancy by more than 50% over the last century.

About a third of U.S. deaths in 1900 occurred in children under age 5, many of them from diseases like smallpox, measles and whooping cough that are now preventable by immunization.

Some new vaccines have also gained quick and widespread use, like shots that prevent pneumococcal infections that can cause severe illness in children and adults. Introduction of the shingles vaccination has offered prevention of the painful disease to millions of people over the past two decades.

A veteran of the World Health Organization effort to eradicate smallpox, Orenstein would often immunize himself in front of entire villages to assuage safety fears.

The agency resolved to try to eradicate the disease in 1959 when it still afflicted many developing countries, but the effort didn’t kick into high gear until 1967 when more funds and personnel were committed by the WHO and its members.

The smallpox effort initially targeted entire populations, but that turned out to be impractical, recalled William Schaffner, a Vanderbilt University infectious-disease specialist who has advised the government on vaccination. The turnaround came when the strategy switched to identifying cases and then vaccinating everyone in proximity, sometimes hundreds of households.

This approach of creating a vaccination ring around cases was only possible, however, because smallpox can be a disfiguring disease, making it easy to identify, and spreads relatively slowly.

“It has this reputation of spreading rapidly but it actually spreads rather slowly,” Schaffner said. “You also need rather close contact for transmission to occur.”

Those features allowed vaccination teams to identify patients just as they were becoming infectious and close off all opportunities for transmission. Even so, it took two decades for the worldwide effort to contain the last outbreak in 1977.

A better comparison to Covid might be polio, an intestinal virus that sometimes causes permanent, severe disease. Polio is similar to Covid in that only a minority of infected people — about one in 100 — become extremely ill.

Sabin Sunday
Children and parents line up outside the Children’s Hospital to receive polio vaccines in Cincinnati, Ohio, on April 24, 1960. Photographer: Cincinnati Museum Center/Getty Images

That’s created one of the problems anticipated in widespread Covid vaccination: People who don’t believe they’re vulnerable to the disease may not want to be vaccinated, even though it may benefit others by keeping hospital intensive-care units free and possibly preventing transmission of the disease.

An important difference with polio, however, is that it can cause severe disease in young children, leaving them with lifelong paralysis, Orenstein said. That’s unlike Covid, which mainly strikes the elderly and chronically ill. That’s left some portions of the public indifferent.

“We’re getting more than a death a minute — on some days two deaths a minute,” he said. “It’s very disturbing to see the lack of concern in other people.”

Yet even with the specter of children paralyzed from polio and a vaccine available for some 65 years, global elimination of that disease still hasn’t been reached.

Two countries, Afghanistan and Pakistan, continue to see transmission because of insufficient vaccination rates, according to the Global Polio Eradication Initiative.


To defeat Covid, “we’ve got to convince people to take the vaccine,” said Anthony Fauci, the top U.S. government infectious-disease specialist, in an interview.

“If you have a highly effective vaccine and only 50% of the people take it, you’re not going to have the impact that you’d need to essentially bring a pandemic down to such a low level that it’s no longer threatening society. And that’s the goal of a vaccine, the same way we did with measles, the same way we did with polio, the same way the world did with smallpox.”

Most standard immunizations provide protection for years to decades.  We still don’t know how long Covid vaccines will last, Fauci pointed out.

And it isn’t clear whether they prevent transmission along with protection against symptoms, although studies may soon shed light on that.

The logistics and supply-chain challenge the world faces today is “more complicated than usual because for the first time in history we’ll be introducing multiple vaccines against the same target at the same time,” Rajeev Venkayya, president of Takeda Pharmaceutical Co.’s vaccines business, said in an interview.

That means countries will need databases to track the rollout and ensure people are getting the doses at the right times, as well as systems to monitor potential side effects and share the information with the public, he said.

Early on, countries plan to prioritize the most vulnerable people as well as health-care workers and other critical staff, which will reduce deaths and suffering considerably, said Venkayya, former special assistant for biodefense to U.S. President George W. Bush.

“But transmission won’t go down dramatically in the beginning. It’s going to take time to get to a sufficient level of vaccine-driven population immunity before we begin to dampen transmission.”

Potentially by the middle of next year countries such as the U.K. and U.S. will be able to see a “real dampening of transmission,” he said. “That timeline is going to be delayed in many other parts of the world that don’t have this kind of early access to vaccines.”

Unvaccinated populations always threaten to reintroduce disease into areas where herd immunity appears to have taken over.

Just last year, the annual number of worldwide reported measles cases rose more than six-fold to about 870,000, the most since 1996, as immunization rates flagged.

The world is likely to see the same level of viral persistence from the coronavirus, said Klaus Stohr, a former Novartis AG vaccine executive and WHO official who championed efforts to prepare for pandemics.

“The prediction is pretty clear: The virus will never be eradicated. Why? Because there will always be a large proportion of susceptible population in the community,” Stohr said.

— With assistance by Jason Gale


A “Transparent accounting”. Away from biased Elite Class “Net profit legal” laws?

Transparent accounting: Based on revenue and posted 9 years ago,

This is one of my Daydream ideas.

Revenue is the one item in the balance sheet that No corporation is about to cheat on; Not even criminal gangs and drug organizations cheat on it. Why?

You cannot have a balance sheet, working statement, or any other accounting gimmick without accurate revenue.

Members of the board of directors take their cuts directly from the total revenue.

They know how much the company is generating in gross profit, excluding side revenues and under the table bonuses and favors…

For example, their first cut, taken ahead of time as a percentage of revenue, must correspond to 50% of the gross profit; all other “cost/expense” items can then be adjusted to produce the expected net profit.

Even without huge amounts of financial and economic data, companies in each particular line of business have an appreciation of the gross profit before the fiscal year starts, based on previous revenue and very accurate forecasting.

Every item in the balance sheet is known as a percentage of the revenue.

You change a percentage and you know what the managers should do as a consequence: fire employees (and how many), reduce facility costs, save on energy, training, quality of spare parts, inspection, quality control…

Actually, all the accounting standards and accounting schools and degrees awarded to graduates are Not meant to fine-tune the accounting records of anything. 

Mostly, these degrees are to know and apply the laws “legal cheating” that benefit the Elite Classes in a society.

The government and the corporations have no need for all the accurate numbers and inspection of records and papers: They know the revenue and the proper percentage on the revenue that each item is measured accordingly.

Government can just as easily, and more accurately, collect taxes on revenue instead of waiting for the gross-profit computation, sparing citizens the exasperation of loopholes as large as the State of Montana.

If the financial and business communities consider the tax rate on revenue high or exaggerated, they can lobby to simply reduce the rate of the percentage on the revenue…What’s the big deal?
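The daydream can be made concrete with invented numbers. The revenue figure, cost percentages, and tax rates below are entirely hypothetical; the point is only that a flat rate on revenue can be chosen to raise the same amount as a tax on declared net profit, without auditing any expense items.

```python
# Balance-sheet items expressed as fixed percentages of revenue,
# as the text claims companies already treat them in practice.

revenue = 10_000_000.0  # hypothetical annual revenue

# Hypothetical cost structure, each item a fraction of revenue:
items = {
    "wages": 0.35,
    "materials": 0.25,
    "facilities_energy": 0.10,
    "training_quality_control": 0.05,
}
net_profit = revenue * (1 - sum(items.values()))   # 25% of revenue here

profit_tax_rate = 0.30     # classic tax on declared net profit
revenue_tax_rate = 0.075   # equivalent flat rate on revenue

tax_on_profit = net_profit * profit_tax_rate
tax_on_revenue = revenue * revenue_tax_rate

print(round(tax_on_profit), round(tax_on_revenue))  # -> 750000 750000
```

With these made-up percentages the two schemes collect identical amounts, but the revenue-based one leaves nothing to adjust after the fact: shifting any “cost/expense” item changes net profit, not the tax bill.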

Is transparency anathema to governing?

Should government persist on creating more mysterious laws than the citizens are ready to swallow?

Is governing meant to constantly resume the financial emulation of cult organizations with code-names, secrecy, childish gimmick…?

Why do the top 1% of corporations have to skim 20% of total revenue, for example, then rearrange all the items in the balance sheet so that the workers and employees sweat out negotiating a better minimum wage?

Who is taking advantage of the small print in the footnotes of the balance sheet and other accounting gimmicks?

Why should the nation need experts to comprehend the meaning of the footnotes, if transparency is the goal in transactions?

Occupy Wall Street protests should demand that accounting ratios should be transparent on a special accounting sheet:  Citizens must know how much the top 1% are actually paid, how much the middle management is paid, and how much the rank-and-file of workers are paid as a proportion of the total revenue…

Actually, who is generating the profit if Not the workers and employees, and who is making the economy grow, and who is defending the interests of the top 1%?

The Occupy Wall Street protests’ task is to demand transparency in all financial undertakings, starting with transparent accounting.

What does Meritocracy look like in the US and elsewhere?

Why Poor kids who do everything right don’t do better than rich kids who do everything wrong

Is the propaganda that “America is the land of opportunity“ just more true for some than for others?

In large part, inequality starts in the crib, in the socio-political system

Rich parents can afford to spend more time and money on their kids, and that gap has only grown the past few decades.

Economists Greg Duncan and Richard Murnane calculate that, between 1972 and 2006, high-income parents increased their spending on “enrichment activities” for their children by 151% in inflation-adjusted terms, compared to 57% for low-income parents.

By Matt O’Brien, October 18, 2014. Chart: “Poor Grads, Rich Dropouts.” Source: data from Richard Reeves and Isabel Sawhill.

It’s not just a matter of dollars and cents. It’s also a matter of letters and words.

Affluent parents talk to their kids three more hours a week on average than poor parents, which is critical during a child’s formative early years.

That’s why, as Stanford professor Sean Reardon explains, “rich students are increasingly entering kindergarten much better prepared to succeed in school than middle-class students,” and they’re staying that way.

It’s an educational arms race that’s leaving many kids far, far behind.

It’s depressing, but not nearly so much as this:

Even poor kids who do everything right don’t do much better than rich kids who do everything wrong.

Advantages and disadvantages tend to perpetuate themselves.

You can see that in the above chart, based on a new paper from Richard Reeves and Isabel Sawhill, presented at the Federal Reserve Bank of Boston’s annual conference, which is underway.

Specifically, rich high school dropouts remain at the top about as often as poor college grads stay stuck at the bottom — 14% versus 16%, respectively. Not only that, but these low-income strivers are just as likely to end up at the bottom as these wealthy ne’er-do-wells. Some meritocracy.

What’s going on? Well, it’s all about glass floors and glass ceilings.

Rich kids who can go work for the family business — and, in Canada at least, 70% of the sons of the top 1% do just that — or inherit the family estate don’t need a high school diploma to get ahead.

It’s an extreme example of what economists call “opportunity hoarding.” That includes everything from legacy college admissions to unpaid internships that let affluent parents rig the game a little more in their children’s favor.

But even if they didn’t, low-income kids would still have a hard time getting ahead.

That’s, in part, because they’re targets for diploma mills that load them up with debt, but not a lot of prospects.

And even if they do get a good degree, at least when it comes to black families, they’re more likely to still live in impoverished neighborhoods that keep them disconnected from opportunities.

It’s not quite a heads-I-win, tails-you-lose game where rich kids get better educations, yet still get ahead even if they don’t—but it’s close enough.

And if it keeps up, the American Dream will be just that.

Note: Kids of struggling, hard-working parents learn to save money and to appreciate the value of hard work. Kids of very rich families often fail to learn the value of money or of hard work when young.

Unless the rich kid goes to work for his parents’ business and is given countless second chances, he is unable to make it on his own.

It is not the rich parents’ fault so much as their inability to convince the kid that hard work matters, when he sees the family’s wealth surrounding him in the house and everything coming his way the easy way.

Which machine learning algorithm should I use? How many and which one is best?

Note: in the early 1990’s, I took graduate classes in Artificial Intelligence (AI) (the If…Then series of questions and answers of experts in their fields of work) and in neural networks developed by psychologists.

The concepts are the same, though upgraded with new algorithms and automation.

I recall a book with a table (like the Mendeleev table in chemistry) that contained the terms, mental processes, and mathematical concepts behind the ideas that formed the AI trend…

There are several lists of methods, depending on the field of study you are more concerned with.

One list consists of methods that human factors professionals are trained to utilize if need be, such as:

Verbal protocol, neural network, utility theory, preference judgments, psycho-physical methods, operational research, prototyping, information theory, cost/benefit methods, various statistical modeling packages, and expert systems.

There are those that are intrinsic to artificial intelligence methodology such as:

Fuzzy logic, robotics, discrimination nets, pattern matching, knowledge representation, frames, schemata, semantic network, relational databases, searching methods, zero-sum games theory, logical reasoning methods, probabilistic reasoning, learning methods, natural language understanding, image formation and acquisition, connectedness, cellular logic, problem solving techniques, means-end analysis, geometric reasoning system, algebraic reasoning system.

Hui Li posted this on Subconscious Musings, April 12, 2017, under Advanced Analytics | Machine Learning.

This resource is designed primarily for beginner to intermediate data scientists or analysts who are interested in identifying and applying machine learning algorithms to address the problems of their interest.

A typical question asked by a beginner, when facing a wide variety of machine learning algorithms, is “which algorithm should I use?”

The answer to the question varies depending on many factors, including:

  • The size, quality, and nature of data.
  • The available computational time.
  • The urgency of the task.
  • What you want to do with the data.

Even an experienced data scientist cannot tell which algorithm will perform the best before trying different algorithms.

We are not advocating a one-and-done approach, but we do hope to provide some guidance on which algorithms to try first, depending on some clear factors.

The machine learning algorithm cheat sheet

Flow chart shows which algorithms to use when

The machine learning algorithm cheat sheet helps you to choose from a variety of machine learning algorithms to find the appropriate algorithm for your specific problems.

This article walks you through the process of how to use the sheet.

Since the cheat sheet is designed for beginner data scientists and analysts, we will make some simplified assumptions when talking about the algorithms.

The algorithms recommended here result from compiled feedback and tips from several data scientists and machine learning experts and developers.

There are several issues on which we have not reached agreement, and for these issues we try to highlight the commonalities and reconcile the differences.

Additional algorithms will be added later, as the library grows to encompass a more complete set of available methods.

How to use the cheat sheet

Read the path and algorithm labels on the chart as “If <path label> then use <algorithm>.” For example:

  • If you want to perform dimension reduction then use principal component analysis.
  • If you need a numeric prediction quickly, use decision trees or logistic regression.
  • If you need a hierarchical result, use hierarchical clustering.
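The “If <path label> then use <algorithm>” reading of the chart can be sketched as ordinary branching code. This is only an illustration of the three example paths above, not the full cheat sheet; the function name and task labels are invented for the sketch.

```python
# Toy encoding of a few cheat-sheet paths as plain if/then branches.
# Labels are illustrative only; the real chart has many more paths.

def recommend(task: str, need_speed: bool = False) -> str:
    """Return a first-try algorithm for a coarse task description."""
    if task == "dimension reduction":
        return "principal component analysis"
    if task == "numeric prediction" and need_speed:
        return "decision trees or logistic regression"
    if task == "hierarchical result":
        return "hierarchical clustering"
    # No branch matched: the chart's advice is to try several algorithms.
    return "no single match: try several algorithms"

print(recommend("dimension reduction"))  # → principal component analysis
```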

Sometimes more than one branch will apply, and other times none of them will be a perfect match.

It’s important to remember these paths are intended to be rule-of-thumb recommendations, so some of the recommendations are not exact.

Several data scientists I talked with said that the only sure way to find the very best algorithm is to try all of them.

(Is that a process to find an algorithm that matches your world view on an issue? Or an answer that satisfies your boss?)

Types of machine learning algorithms

This section provides an overview of the most popular types of machine learning. If you’re familiar with these categories and want to move on to discussing specific algorithms, you can skip this section and go to “When to use specific algorithms” below.

Supervised learning

Supervised learning algorithms make predictions based on a set of examples.

For example, historical sales can be used to estimate the future prices. With supervised learning, you have an input variable that consists of labeled training data and a desired output variable.

You use an algorithm to analyze the training data to learn the function that maps the input to the output. This inferred function maps new, unknown examples by generalizing from the training data to anticipate results in unseen situations.

  • Classification: When the data are being used to predict a categorical variable, supervised learning is also called classification. This is the case when assigning a label or indicator, either dog or cat, to an image. When there are only two labels, this is called binary classification. When there are more than two categories, the problem is called multi-class classification.
  • Regression: When predicting continuous values, the problem becomes a regression problem.
  • Forecasting: This is the process of making predictions about the future based on past and present data. It is most commonly used to analyze trends. A common example might be estimating next year’s sales based on the sales of the current year and previous years.
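A minimal sketch of the classification and regression settings above, assuming scikit-learn and NumPy are available; the toy data and numbers are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Classification: predict a categorical label (0 or 1) from a feature.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y_class = np.array([0, 0, 1, 1])           # labeled training data
clf = LogisticRegression().fit(X, y_class)

# Regression: predict a continuous value from the same feature.
y_reg = np.array([1.0, 3.0, 5.0, 7.0])     # exactly y = 2x + 1
reg = LinearRegression().fit(X, y_reg)

print(clf.predict([[2.5]]))   # a class label (1 here)
print(reg.predict([[2.5]]))   # a continuous value (6.0 here)
```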

Semi-supervised learning

The challenge with supervised learning is that labeling data can be expensive and time consuming. If labels are limited, you can use unlabeled examples to enhance supervised learning. Because the machine is not fully supervised in this case, we say the machine is semi-supervised. With semi-supervised learning, you use unlabeled examples with a small amount of labeled data to improve the learning accuracy.
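A rough sketch of that idea, assuming scikit-learn is available: its `SelfTrainingClassifier` marks unlabeled samples with -1 and pseudo-labels them using a base classifier. The tiny dataset is invented; two well-separated groups with only one labeled example each.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.semi_supervised import SelfTrainingClassifier

# Two well-separated 1-D groups; only one labeled example per group,
# the rest marked unlabeled with -1.
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
y = np.array([0, -1, -1, 1, -1, -1])   # -1 = unlabeled

# The base classifier is retrained as confident pseudo-labels are added.
model = SelfTrainingClassifier(GaussianNB()).fit(X, y)
print(model.predict([[0.15], [5.15]]))
```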

Unsupervised learning

When performing unsupervised learning, the machine is presented with totally unlabeled data. It is asked to discover the intrinsic patterns that underlie the data, such as a clustering structure, a low-dimensional manifold, or a sparse tree and graph.

  • Clustering: Grouping a set of data examples so that examples in one group (or one cluster) are more similar (according to some criteria) than those in other groups. This is often used to segment the whole dataset into several groups. Analysis can be performed in each group to help users to find intrinsic patterns.
  • Dimension reduction: Reducing the number of variables under consideration. In many applications, the raw data have very high dimensional features and some features are redundant or irrelevant to the task. Reducing the dimensionality helps to find the true, latent relationship.
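Both unsupervised tasks above can be sketched in a few lines, assuming scikit-learn and NumPy are available (the four-point dataset is invented):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

X = np.array([[0, 0], [0, 1], [10, 10], [10, 11]], dtype=float)

# Clustering: group the unlabeled points into 2 clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Dimension reduction: project 2-D data onto its dominant direction.
X1 = PCA(n_components=1).fit_transform(X)

print(labels)     # two groups; cluster ids themselves are arbitrary
print(X1.shape)   # (4, 1)
```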

Reinforcement learning

Reinforcement learning analyzes and optimizes the behavior of an agent based on the feedback from the environment.  Machines try different scenarios to discover which actions yield the greatest reward, rather than being told which actions to take. Trial-and-error and delayed reward distinguishes reinforcement learning from other techniques.

Considerations when choosing an algorithm

When choosing an algorithm, always take these aspects into account: accuracy, training time and ease of use. Many users put the accuracy first, while beginners tend to focus on algorithms they know best.

When presented with a dataset, the first thing to consider is how to obtain results, no matter what those results might look like. Beginners tend to choose algorithms that are easy to implement and can obtain results quickly. This works fine, as long as it is just the first step in the process. Once you obtain some results and become familiar with the data, you may spend more time using more sophisticated algorithms to strengthen your understanding of the data, hence further improving the results.

Even in this stage, the best algorithms might not be the methods that have achieved the highest reported accuracy, as an algorithm usually requires careful tuning and extensive training to obtain its best achievable performance.

When to use specific algorithms

Looking more closely at individual algorithms can help you understand what they provide and how they are used. These descriptions provide more details and give additional tips for when to use specific algorithms, in alignment with the cheat sheet.

Linear regression and Logistic regression


Linear regression is an approach for modeling the relationship between a continuous dependent variable y and one or more predictors X. The relationship between y and X can be linearly modeled as y = β^T X + ε. Given the training examples {x_i, y_i}, i = 1…N, the parameter vector β can be learnt.

If the dependent variable is not continuous but categorical, linear regression can be transformed to logistic regression using a logit link function. Logistic regression is a simple, fast yet powerful classification algorithm.

Here we discuss the binary case, where the dependent variable y takes only the binary values {y_i ∈ (−1, 1)}, i = 1…N (this can be easily extended to multi-class classification problems).

In logistic regression we use a different hypothesis class to try to predict the probability that a given example belongs to the “1” class versus the probability that it belongs to the “−1” class. Specifically, we will try to learn a function of the form: p(y_i = 1 | x_i) = σ(β^T x_i) and p(y_i = −1 | x_i) = 1 − σ(β^T x_i).

Here σ(x) = 1 / (1 + exp(−x)) is a sigmoid function. Given the training examples {x_i, y_i}, i = 1…N, the parameter vector β can be learnt by maximizing the log-likelihood of β given the data set.

(Charts: Group-By Linear Regression; Logistic Regression in SAS Visual Analytics.)
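The formulas above can be turned into a tiny gradient-ascent sketch in plain NumPy (toy data invented): with labels in {−1, 1}, we maximize the log-likelihood Σ log σ(y_i β^T x_i).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D data with labels in {-1, +1}, matching the text's convention.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1, -1, 1, 1])

# Maximize the log-likelihood sum(log sigmoid(y_i * beta^T x_i))
# by plain gradient ascent.
beta = np.zeros(1)
for _ in range(500):
    margins = y * (X @ beta)
    grad = ((1 - sigmoid(margins)) * y) @ X   # gradient of the log-likelihood
    beta += 0.1 * grad

# p(y = 1 | x) should be high for positive x, low for negative x.
print(sigmoid(2.0 * beta[0]))    # close to 1
print(sigmoid(-2.0 * beta[0]))   # close to 0
```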

Linear SVM and kernel SVM

A support vector machine (SVM) training algorithm finds the classifier represented by the normal vector w and bias b of the hyperplane. This hyperplane (boundary) separates the different classes by as wide a margin as possible. The problem can be converted into a constrained optimization problem:

minimize ||w|| subject to y_i (w^T x_i − b) ≥ 1, i = 1, …, n.


Linear and kernel SVM charts

When the classes are not linearly separable, a kernel trick can be used to map a non-linearly separable space into a higher dimension linearly separable space.

When most dependent variables are numeric, logistic regression and SVM should be the first try for classification. These models are easy to implement, their parameters are easy to tune, and their performance is also pretty good. So these models are appropriate for beginners.
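The linear-versus-kernel distinction can be sketched on the classic XOR pattern, which no straight line separates; this assumes scikit-learn is available, and the `gamma`/`C` values are illustrative choices, not recommendations.

```python
import numpy as np
from sklearn.svm import SVC

# XOR pattern: not linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X, y)

print(linear.score(X, y))  # below 1.0: no hyperplane fits XOR
print(rbf.score(X, y))     # 1.0: the kernel trick separates it
```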

Trees and ensemble trees

A decision tree for prediction model.

Decision trees, random forest and gradient boosting are all algorithms based on decision trees.

There are many variants of decision trees, but they all do the same thing – subdivide the feature space into regions with mostly the same label. Decision trees are easy to understand and implement.

However, they tend to overfit the data when we exhaust the branches and go very deep with the trees. Random forest and gradient boosting are two popular ways to use tree algorithms to achieve good accuracy as well as to overcome the over-fitting problem.
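A minimal sketch of that overfitting contrast, assuming scikit-learn and NumPy are available; the noisy synthetic dataset is invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = (X[:, 0] + 0.5 * rng.randn(200) > 0).astype(int)  # noisy labels

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# A deep, unpruned tree memorizes the noise in the training set...
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
# ...while averaging many randomized trees smooths the noise out.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print(tree.score(X_train, y_train))   # 1.0: perfectly memorized
print(tree.score(X_test, y_test))     # lower on unseen data
print(forest.score(X_test, y_test))   # typically at least as good as one tree
```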

Neural networks and deep learning

Neural networks flourished in the mid-1980s due to their parallel and distributed processing ability.

Research in this field was impeded by the ineffectiveness of the back-propagation training algorithm that is widely used to optimize the parameters of neural networks. Support vector machines (SVM) and other simpler models, which can be easily trained by solving convex optimization problems, gradually replaced neural networks in machine learning.

In recent years, new and improved training techniques such as unsupervised pre-training and layer-wise greedy training have led to a resurgence of interest in neural networks.

Increasingly powerful computational capabilities, such as graphics processing units (GPU) and massively parallel processing (MPP), have also spurred the revived adoption of neural networks. The resurgent research in neural networks has given rise to the invention of models with thousands of layers.

A neural network

Shallow neural networks have evolved into deep learning neural networks.

Deep neural networks have been very successful for supervised learning.  When used for speech and image recognition, deep learning performs as well as, or even better than, humans.

Applied to unsupervised learning tasks, such as feature extraction, deep learning also extracts features from raw images or speech with much less human intervention.

A neural network consists of three parts: input layer, hidden layers and output layer. 

The training samples define the input and output layers. When the output layer is a categorical variable, then the neural network is a way to address classification problems. When the output layer is a continuous variable, then the network can be used to do regression.

When the output layer is the same as the input layer, the network can be used to extract intrinsic features.

The number of hidden layers defines the model complexity and modeling capacity.
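The three-part structure above can be sketched as a forward pass in plain NumPy; the layer sizes, random weights, and activation choices are invented for the sketch.

```python
import numpy as np

# Forward pass of a tiny fully connected network:
# input layer (3 units) -> hidden layer (4 units, tanh) -> output layer (2 units).
rng = np.random.RandomState(0)
W1, b1 = rng.randn(3, 4), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.randn(4, 2), np.zeros(2)   # hidden -> output weights

def forward(x):
    h = np.tanh(x @ W1 + b1)             # hidden activations
    logits = h @ W2 + b2                 # output layer
    e = np.exp(logits - logits.max())    # softmax -> class probabilities
    return e / e.sum()

p = forward(np.array([1.0, -0.5, 0.2]))
print(p.shape)   # (2,): one probability per output class
print(p.sum())   # 1.0: a valid probability distribution
```

Training would adjust W1, b1, W2, b2 by back-propagation; only the forward pass is shown here.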

Deep Learning: What it is and why it matters

k-means/k-modes, GMM (Gaussian mixture model) clustering


K-means/k-modes and GMM clustering aim to partition n observations into k clusters. K-means defines a hard assignment: each sample is associated with one and only one cluster. GMM, however, defines a soft assignment for each sample: each sample has a probability of being associated with each cluster. Both algorithms are simple and fast enough for clustering when the number of clusters k is given.
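The hard-versus-soft distinction can be sketched side by side, assuming scikit-learn and NumPy are available (the 1-D toy data are invented):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

X = np.array([[0.0], [0.5], [9.5], [10.0]])

# k-means: hard assignment, exactly one cluster id per sample.
hard = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# GMM: soft assignment, a probability for each sample-cluster pair.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
soft = gmm.predict_proba(X)

print(hard)         # e.g. two ids; the ids themselves are arbitrary
print(soft.shape)   # (4, 2); each row sums to 1
```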

DBSCAN

A DBSCAN illustration

When the number of clusters k is not given, DBSCAN (density-based spatial clustering) can be used by connecting samples through density diffusion.
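A minimal DBSCAN sketch, assuming scikit-learn is available; the 1-D data, `eps`, and `min_samples` values are invented to show that k is never specified and that isolated points become noise.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two dense groups plus one far-away point; note that k is never given.
X = np.array([[0.0], [0.2], [0.4], [5.0], [5.2], [5.4], [20.0]])
labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(X)

print(labels)   # two clusters (0 and 1) and one noise point labeled -1
```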

Hierarchical clustering

Hierarchical partitions can be visualized using a tree structure (a dendrogram). It does not need the number of clusters as an input and the partitions can be viewed at different levels of granularities (i.e., can refine/coarsen clusters) using different K.
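The build-once, cut-at-any-granularity idea can be sketched with SciPy's hierarchical clustering tools (dataset invented; `average` linkage is an illustrative choice):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[0.0], [0.3], [5.0], [5.3], [10.0], [10.3]])
Z = linkage(X, method="average")   # the full merge tree (dendrogram data)

# The same tree can be cut at different levels of granularity:
coarse = fcluster(Z, t=2, criterion="maxclust")   # 2 clusters
fine = fcluster(Z, t=3, criterion="maxclust")     # 3 clusters

print(len(set(coarse)))   # 2
print(len(set(fine)))     # 3
```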

PCA, SVD and LDA

We generally do not want to feed a large number of features directly into a machine learning algorithm, since some features may be irrelevant or the “intrinsic” dimensionality may be smaller than the number of features. Principal component analysis (PCA), singular value decomposition (SVD), and latent Dirichlet allocation (LDA) can all be used to perform dimension reduction.

PCA is an unsupervised dimension-reduction method which maps the original data space into a lower-dimensional space while preserving as much information as possible. PCA basically finds a subspace that best preserves the data variance, with the subspace defined by the dominant eigenvectors of the data’s covariance matrix.

The SVD is related to PCA in the sense that SVD of the centered data matrix (features versus samples) provides the dominant left singular vectors that define the same subspace as found by PCA. However, SVD is a more versatile technique as it can also do things that PCA may not do.
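That PCA–SVD relationship can be checked numerically in plain NumPy (random data invented; here the data matrix is samples × features, so the dominant right singular vector plays the role the text assigns to the left singular vectors of the transposed, features-versus-samples matrix):

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(20, 3)
Xc = X - X.mean(axis=0)          # center the data (features in columns)

# PCA subspace via eigenvectors of the covariance matrix...
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]             # dominant principal direction

# ...and via the SVD of the centered data matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
v1 = Vt[0]                       # dominant right singular vector

# Same direction, up to sign: |pc1 . v1| = 1.
print(abs(pc1 @ v1))
```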

For example, the SVD of a user-versus-movie matrix is able to extract the user profiles and movie profiles which can be used in a recommendation system. In addition, SVD is also widely used as a topic modeling tool, known as latent semantic analysis, in natural language processing (NLP).

A related technique in NLP is latent Dirichlet allocation (LDA). LDA is a probabilistic topic model: it decomposes documents into topics in a similar way as a Gaussian mixture model (GMM) decomposes continuous data into Gaussian densities. Unlike the GMM, LDA models discrete data (words in documents), and it constrains the topics to be a priori distributed according to a Dirichlet distribution.

Conclusions

This is a workflow that is easy to follow. The takeaway messages when trying to solve a new problem are:

  • Define the problem. What problems do you want to solve?
  • Start simple. Be familiar with the data and the baseline results.
  • Then try something more complicated.
Dr. Hui Li is a Principal Staff Scientist of Data Science Technologies at SAS. Her current work focuses on Deep Learning, Cognitive Computing and SAS recommendation systems in SAS Viya. She received her PhD degree and Master’s degree in Electrical and Computer Engineering from Duke University.

Before joining SAS, she worked at Duke University as a research scientist and at Signal Innovation Group, Inc. as a research engineer. Her research interests include machine learning for big, heterogeneous data, collaborative filtering recommendations, Bayesian statistical modeling and reinforcement learning.

Is Improvisation in Jazz a conversation? And how the brains work?

Does the brain work in the same way for all kinds of languages?

For the better part of the past decade, Mark Kirby has been pouring drinks and booking gigs at the 55 Bar in New York City’s Greenwich Village.

The cozy dive bar is a neighborhood staple for live jazz that opened on the eve of Prohibition in 1919.

It was the year Congress agreed to give American women the right to vote, and jazz was still in its infancy.

Nearly a century later, the den-like bar is an anchor to the past in a city that’s always changing.

ADRIENNE LAFRANCE published in The Atlantic this Feb. 19 2014:

How Brains See Music as Language

A new Johns Hopkins study looks at the neuroscience of jazz and the power of improvisation.

For Kirby, every night of work offers the chance to hear some of the liveliest jazz improvisation in Manhattan, an experience that’s a bit like overhearing a great conversation.

“There is overlapping, letting the other person say their piece, then you respond. Threads are picked up then dropped. There can be an overall mood and going off on tangents.”

Brain areas linked to meaning shut down during improvisational jazz interactions: this music is syntactic, not semantic.

(Photo: A member of the Preservation Hall Jazz Band performs at the New Orleans Jazz and Heritage Festival in New Orleans. Gerald Herbert/AP)

The idea that jazz can be a kind of conversation has long been an area of interest for Charles Limb, an otolaryngological surgeon at Johns Hopkins. Limb, a musician himself, decided to map what was happening in the brains of musicians as they played.

He and a team of researchers conducted a study that involved putting a musician in a functional MRI machine with a keyboard, and having him play a memorized piece of music and then a made-up piece of music as part of an improvisation with another musician in a control room.

What researchers found:

1. The brains of jazz musicians who are engaged with other musicians in spontaneous improvisation show robust activation in the same brain areas traditionally associated with spoken language and syntax.

Improvisational jazz conversations “take root in the brain as a language,” Limb said.

“It makes perfect sense,” said Ken Schaphorst, chair of the Jazz Studies Department at the New England Conservatory in Boston. “I improvise with words all the time—like I am right now—and jazz improvisation is really identical in terms of the way it feels. Though it’s difficult to get to the point where you’re comfortable enough with music as a language where you can speak freely.”

2. Along with the limitations of musical ability, there’s another key difference between jazz conversation and spoken conversation that emerged in Limb’s experiment.

During a spoken conversation, the brain is busy processing the structure and syntax of language, as well the semantics or meaning of the words.

But Limb and his colleagues found that brain areas linked to meaning shut down during improvisational jazz interactions: this kind of music is syntactic but it’s not semantic.

“Music communication, we know it means something to the listener, but that meaning can’t really be described,” Limb said. “It doesn’t have propositional elements or specificity of meaning in the same way a word does. So a famous bit of music—Beethoven’s dun dun dun duuuun—we might hear that and think it means something but nobody could agree what it means.”

So if music is a language without set meaning, what does that tell us about the nature of music?

3. “The answer to that probably lies more in figuring out what the nature of language is than what the nature of music is,” said Mike Pope, a Baltimore-based pianist and bassist who participated in the study.

When you’re talking about something, you’re not thinking about how your mouth is moving and you’re not thinking about how the words are spelled and you’re not thinking about grammar.

With music, it’s the same thing.” Many scientists believe that language is what makes us human, but the brain is wired to process acoustic systems that are far more complicated than speech.

Pope says even improvisational jazz is built around a framework that musicians understand. This structure is similar to the way we use certain rules in spoken conversation to help us intuit when it’s time to say “nice to meet you,” or how to read social clues that signal an encounter is drawing to a close.

4. “In most jazz performances, things are Not nearly as random as people would think,” Pope said. “If I want to be a good bass player and I want to fill the role, idiomatically and functionally, that a bass player’s supposed to fulfill, I have to act within the confines of certain acceptable parameters. I have to make sure I’m playing roots on the downbeat every time the chord changes. It’s all got to swing.”

5. But Limb believes his finding suggests something even bigger, something that gets at the heart of an ongoing debate in his field about what the human auditory system is for in the first place.

“If the brain evolved for the purpose of speech, it’s odd that it evolved to a capacity way beyond speech. So a brain that evolved to handle musical communication—there has to be a relationship between the two. I have reason to suspect that the auditory brain may have been designed to hear music and speech is a happy byproduct.”

Back in New York City, where the jazz conversation continues at 55 Bar almost every night, bartender Kirby makes it sound simple:

“In jazz, there is no lying and very little misunderstanding.”

Overwhelming power of prosecutors in US justice system.

When a kid commits a crime, the US justice system has a choice: prosecute to the full extent of the law, or take a step back and ask if saddling young people with criminal records is the right thing to do every time.

In this searching talk, Adam Foss, a prosecutor with the Suffolk County District Attorney’s Office in Boston, makes his case for a reformed justice system that replaces wrath with opportunity, changing people’s lives for the better instead of ruining them.

Patsy Z and TEDxSKE shared a link.

“A prosecutor’s vision for a better justice system” | ted.com | By Adam Foss

Adam Foss. Juvenile justice reformer

By shifting his focus from incarceration to transforming lives, Adam Foss is reinventing the role of the criminal prosecutor. Full bio

The following are my opinions, and do not reflect the opinions or policies of any particular prosecutor’s office. 

I am a prosecutor. I believe in law and order. I am the adopted son of a police officer, a Marine and a hairdresser. 

I believe in accountability and that we should all be safe in our communities. I love my job and the people that do it. I just think that it’s our responsibility to do it better.

By a show of hands: how many of you, by the age of 25, had either acted up in school, gone somewhere you were specifically told to stay out of, or drunk alcohol before your legal age?

How many of you shoplifted, tried an illegal drug or got into a physical fight, even with a sibling?

And how many of you ever spent one day in jail for any of those decisions?

How many of you sitting here today think that you’re a danger to society or should be defined by those actions of youthful indiscretion?

When we talk about criminal justice reform, we often focus on a few things, and that’s what I want to talk to you about today.

But first I’m going to give you a confession on my part.

I went to law school to make money. I had no interest in being a public servant, I had no interest in criminal law, and I definitely didn’t think that I would ever be a prosecutor.

Near the end of my first year of law school, I got an internship in the Roxbury Division of Boston Municipal Court

I knew of Roxbury as an impoverished neighborhood in Boston, plagued by gun violence and drug crime.

My life and my legal career changed the first day of that internship. I walked into a courtroom, and I saw an auditorium of people who, one by one, would approach the front of that courtroom to say two words and two words only: “Not guilty.”

They were predominately black and brown. And then a judge, a defense attorney and a prosecutor would make life-altering decisions about that person without their input. They were predominately white.

As each person, one by one, approached the front of that courtroom, I couldn’t help but think: How did they get here? I wanted to know their stories. And as the prosecutor read the facts of each case, I was thinking to myself, we could have predicted that. That seems so preventable… not because I was an expert in criminal law, but because it was common sense.

Over the course of the internship, I began to recognize people in the auditorium, not because they were criminal masterminds but because they were coming to us for help and we were sending them out without any.

My second year of law school I worked as a paralegal for a defense attorney, and in that experience I met many young men accused of murder. Even in our “worst,” I saw human stories.

And they all contained childhood trauma, victimization, poverty, loss, disengagement from school, early interaction with the police and the criminal justice system, all leading to a seat in a courtroom.

Those convicted of murder were condemned to die in prison, and it was during those meetings with those men that I couldn’t fathom why we would spend so much money to keep this one person in jail for the next 80 years when we could have reinvested it up front, and perhaps prevented the whole thing from happening in the first place.

My third year of law school, I defended people accused of small street crimes, mostly mentally ill, mostly homeless, mostly drug-addicted, all in need of help. They would come to us, and we would send them away without that help. 

They were in need of our assistance. But we weren’t giving them any. Prosecuted, adjudged and defended by people who knew nothing about them.

The staggering inefficiency is what drove me to criminal justice work. The unfairness of it all made me want to be a defender. The power dynamic that I came to understand made me become a prosecutor.

I don’t want to spend a lot of time talking about the problem. We know the criminal justice system needs reform, we know there are 2.3 million people in American jails and prisons, making us the most incarcerated nation on the planet.

We know there’s another 7 million people on probation or parole, we know that the criminal justice system disproportionately affects people of color, particularly poor people of color.

And we know there are system failures happening everywhere that bring people to our courtrooms. But what we do not discuss is how ill-equipped our prosecutors are to receive them.

When we talk about criminal justice reform, we, as a society, focus on three things. We complain, we tweet, we protest about the police, about sentencing laws and about prison. We rarely, if ever, talk about the prosecutor.

In the fall of 2009, a young man, whom we’ll call Christopher, was arrested by the Boston Police Department. He was 18 years old, he was African American and he was a senior at a local public school. He had his sights set on college but his part-time, minimum-wage job wasn’t providing the financial opportunity he needed to enroll in school.

In a series of bad decisions, he stole 30 laptops from a store and sold them on the Internet. This led to his arrest and a criminal complaint of 30 felony charges. The potential jail time he faced is what stressed Christopher out the most. But what he had little understanding of was the impact a criminal record would have on his future.

I was standing in arraignments that day when Christopher’s case came across my desk. And at the risk of sounding dramatic, in that moment, I had Christopher’s life in my hands. 

I was 29 years old, a brand-new prosecutor, and I had little appreciation for how the decisions I would make would impact Christopher’s life. Christopher’s case was a serious one and it needed to be dealt with as such, but I didn’t think branding him a felon for the rest of his life was the right answer.

For the most part, prosecutors step onto the job with little appreciation of the impact of our decisions, regardless of our intent. Despite our broad discretion, we learn to avoid risk at all cost, rendering our discretion basically useless.

History has conditioned us to believe that somehow, the criminal justice system brings about accountability and improves public safety, despite evidence to the contrary.

We’re judged internally and externally by our convictions and our trial wins, so prosecutors aren’t really incentivized to be creative at our case dispositions, or to take risks on people we might not otherwise. We stick to an outdated method, counterproductive to achieving the very goal that we all want, and that’s safer communities.

Yet most prosecutors standing in my space would have arraigned Christopher. They have little appreciation for what we can do. Arraigning Christopher would give him a criminal record, making it harder for him to get a job, setting in motion a cycle that defines the failing criminal justice system today.

With a criminal record and without a job, Christopher would be unable to find employment, education or stable housing.

Without those protective factors in his life, Christopher would be more likely to commit further, more serious crime.

The more contact Christopher had with the criminal justice system, the more likely it would be that he would return again and again and again — all at tremendous social cost to his children, to his family and to his peers. And, ladies and gentlemen, it is a terrible public safety outcome for the rest of us.

When I came out of law school, I did the same thing as everybody else. I came out as a prosecutor expected to do justice, but I never learned what justice was in my classes — none of us do. None of us do.

And yet, prosecutors are the most powerful actors in the criminal justice system. Our power is virtually boundless.

In most cases, not the judge, not the police, not the legislature, not the mayor, not the governor, not the President can tell us how to prosecute our cases.

The decision to arraign Christopher and give him a criminal record was exclusively mine. I would choose whether to prosecute him for 30 felonies, for one felony, for a misdemeanor, or at all. I would choose whether to leverage Christopher into a plea deal or take the case to trial, and ultimately, I would be in a position to ask for Christopher to go to jail. 

These are decisions that prosecutors make every day unfettered, yet we are unaware of, and untrained in, the grave consequences of those decisions.

One night this past summer, I was at a small gathering of professional men of color from around the city. As I stood there stuffing free finger sandwiches into my mouth, as you do as a public servant, I noticed across the room a young man waving and smiling at me and approaching me.

And I recognized him, but I couldn’t place from where, and before I knew it, this young man was hugging me. And thanking me. “You cared about me, and you changed my life.” It was Christopher.

 I never arraigned Christopher. He never faced a judge or a jail, he never had a criminal record. Instead, I worked with Christopher; first on being accountable for his actions, and then, putting him in a position where he wouldn’t re-offend.

We recovered 75 percent of the computers that he sold and gave them back to Best Buy, and came up with a financial plan to repay Best Buy for the computers we couldn’t recover.

Christopher did community service. He wrote an essay reflecting on how this case could impact his future and that of the community. He applied to college, he obtained financial aid, and he went on to graduate from a four-year school.

After we finished hugging, I looked at his name tag and learned that Christopher was the manager of a large bank in Boston, making a lot more money than me.

He had accomplished all of this in the six years since I had first seen him in Roxbury Court. I can’t take credit for Christopher’s journey to success, but I certainly did my part to keep him on the path.

There are thousands of Christophers out there, some locked in our jails and prisons. We need thousands of prosecutors to recognize that and to protect them.

An employed Christopher is better for public safety than a condemned one. It’s a bigger win for all of us. In retrospect, the decision not to throw the book at Christopher makes perfect sense. When I saw him that first day in Roxbury Court, I didn’t see a criminal standing there. I saw myself — a young person in need of intervention.

As an individual caught selling a large quantity of drugs in my late teens, I knew firsthand the power of opportunity as opposed to the wrath of the criminal justice system. Along the way, with the help and guidance of my district attorney, my supervisor and judges, I learned the power of the prosecutor to change lives instead of ruining them.

And that’s how we do it in Boston. We helped a woman who was arrested for stealing groceries to feed her kids get a job.

Instead of putting an abused teenager in adult jail for punching another teenager, we secured mental health treatment and community supervision.

A runaway girl who was arrested for prostituting herself to survive on the streets needed a safe place to live and grow — something we could help her with.

I even helped a young man who was so afraid of the older gang kids showing up after school that one morning, instead of a lunchbox, he put a loaded 9-millimeter into his backpack.

Instead of spending months and months prepping cases for trial down the road, we would spend that time coming up with real solutions to the problems as they presented themselves.

Which is the better way to spend our time? How would you prefer your prosecutors to spend theirs?

Why are we spending 80 billion dollars on a prison industry that we know is failing, when we could take that money and reallocate it into education, into mental health treatment, into substance abuse treatment and to community investment so we can develop our neighborhoods?

So why should this matter to you? Well, one, we’re spending a lot of money.

Our money. It costs 109,000 dollars in some states to lock up a teenager for a year, with a 60 percent chance that that person will return to the very same system. That is a terrible return on investment.

Number two: it’s the right thing to do. If prosecutors were a part of creating the problem, it’s incumbent on us to create a solution and we can do that using other disciplines that have already done the data and research for us.

Number three: your voice and your vote can make that happen. The next time there’s a local district attorney’s election in your jurisdiction, ask candidates these questions.

One: What are you doing to make me and my neighbors safer?

Two: What data are you collecting, and how are you training your prosecutors to make sure that it’s working?

Three: If it’s not working for everybody, what are you doing to fix it? If they can’t answer the questions, they shouldn’t be doing the job.

Each one of you that raised your hand at the beginning of this talk is a living, breathing example of the power of opportunity, of intervention, of support and of love. While each of you may have faced your own brand of discipline for whatever malfeasances you committed, barely any of you needed a day in jail to make you the people that you are today — some of the greatest minds on the planet.

Every day, thousands of times a day, prosecutors around the United States wield power so great that it can bring about catastrophe as quickly as it can bring about opportunity, intervention, support and yes, even love. 

Those qualities are the hallmarks of a strong community, and a strong community is a safe one. If our communities are broken, don’t let the lawyers that you elect fix them with outdated, inefficient, expensive methods.

Demand more; vote for the prosecutor who’s helping people stay out of jail, not putting them in.

Demand better. You deserve it, your children deserve it, the people who are tied up in the system deserve it, but most of all, the people that we are sworn to protect and do justice for demand it.

We must, we must do better.

