Our response to the Flint water crisis

22 06 2016

 

An editorial by Nicholas Kristof entitled “Are You a Toxic Waste Disposal Site?” was published in the February 13, 2016, issue of the New York Times. We think Mr. Kristof makes some great points, so we’ve published the entire editorial below:

EVEN if you’re not in Flint, Mich., there are toxic chemicals in your home. For that matter, in you.

Scientists have identified more than 200 industrial chemicals — from pesticides, flame retardants, jet fuel — as well as neurotoxins like lead in the blood or breast milk of Americans, indeed, of people all over our planet.

These have been linked to cancer, genital deformities, lower sperm count, obesity and diminished I.Q. Medical organizations from the President’s Cancer Panel to the International Federation of Gynecology and Obstetrics have demanded tougher regulations or warned people to avoid them, and the cancer panel has warned that “to a disturbing extent, babies are born ‘pre-polluted.’”

They have all been drowned out by chemical industry lobbyists.

So we have a remarkable state of affairs:

■ Politicians are (belatedly!) condemning the catastrophe of lead poisoning in Flint. But few acknowledge that lead poisoning in many places in America is even worse than in Flint. Kids are more likely to suffer lead poisoning in Pennsylvania or Illinois or even most of New York State than in Flint. More on that later.

■ Americans are panicking about the mosquito-borne Zika virus and the prospect that widespread infection may reach the United States. That’s a legitimate concern, but public health experts say that toxic substances around us seem to pose an even greater threat.

“I cannot imagine that the Zika virus will damage any more than a small fraction of the total number of children who are damaged by lead in deteriorated, poor housing in the United States,” says Dr. Philip Landrigan, a prominent pediatrician and the dean for global health at the Icahn School of Medicine at Mount Sinai. “Lead, mercury, PCBs, flame retardants and pesticides cause prenatal brain damage to tens of thousands of children in this country every year,” he noted.

Yet one measure of our broken political system is that chemical companies, by spending vast sums on lobbying — $100,000 per member of Congress last year — block serious oversight.[1] Almost none of the chemicals in products we use daily have been tested for safety.

Maybe, just maybe, the crisis in Flint can be used to galvanize a public health revolution.

In 1854, a British doctor named John Snow started such a revolution. Thousands were dying of cholera at the time, but doctors were resigned to the idea that all they could do was treat sick patients. Then Snow figured out that a water pump on Broad Street in London was the source of the cholera[2]. The water company furiously rejected that conclusion, but Snow blocked use of the water pump, and the cholera outbreak pretty much ended. This revelation led to the germ theory of disease and to investments in sanitation and clean water. Millions of lives were saved.

Now we need a similar public health revolution focusing on the early roots of many pathologies.

For example, it’s scandalous that 535,000 American children ages 1 to 5 still suffer lead poisoning, according to the Centers for Disease Control and Prevention[3]. The poisoning is mostly a result of chipped lead paint in old houses or of lead-contaminated soil being tracked into homes, although some areas like Flint also have tainted tap water. (Note:  fabrics often contain lead in the dyes used and as a catalyst in the dyeing process.)


While the data sets are weak, many parts of America have even higher rates of child lead poisoning than Flint, where 4.9 percent of children tested have had elevated lead levels in their blood. In New York State outside New York City, it’s 6.7 percent. In Pennsylvania, 8.5 percent. In parts of Detroit, it’s 20 percent. The victims are often poor or black.[4]

Infants who absorb lead are more likely to grow up with shrunken brains and diminished I.Q.[5] They are more likely as young adults to engage in risky sexual behavior, to disrupt school and to commit violent crimes. Many researchers believe that the worldwide decline in violent crime beginning in the 1990s is partly a result of lead being taken out of gasoline in the late 1970s. The stakes are enormous, for individual opportunity and for social cohesion.

Fortunately, we have some new Dr. Snows for the 21st century.

A group of scholars, led by David L. Shern of Mental Health America, argues that the world today needs a new public health revolution focused on young children, parallel to the one mounted for sanitation after Snow’s revelations about cholera in 1854. Once again, we have information about how to prevent pathologies, not just treat them — if we will act.

The reason for a new effort is a vast amount of recent research showing that brain development at the beginning of life affects physical and mental health decades later. That means protecting the developing brain from dangerous substances and also from “toxic stress”— often a byproduct of poverty — to prevent high levels of the stress hormone cortisol, which impairs brain development.

A starting point of this public health revolution should be to protect infants and fetuses from toxic substances, which means taking on the companies that buy lawmakers to prevent regulation. Just as water companies tried to obstruct the 19th-century efforts, industry has tried to block recent progress.

Back in 1786, Benjamin Franklin commented extensively on the perils of lead poisoning, but industry ignored the dangers and marketed lead aggressively. In the 1920s, an advertisement for the National Lead Company declared, “Lead helps to guard your health,” praising the use of lead pipes for plumbing and lead paint for homes. And what the lead companies did for decades, and the tobacco companies did, too, the chemical companies do today.


Lead poisoning is just “the tip of the iceberg,” says Tracey Woodruff, an environmental health specialist at the University of California at San Francisco. Flame-retardant chemicals have very similar effects, she says, and they’re in the couches we sit on.

The challenge is that the casualties aren’t obvious, as they are with cholera, but stealthy and long term. These are silent epidemics, so they don’t generate as much public alarm as they should.

“Industrial chemicals that injure the developing brain” have been linked to conditions like autism and attention deficit hyperactivity disorder, noted The Lancet Neurology, a peer-reviewed medical journal. Yet we still don’t have a clear enough sense of what is safe, because many industrial chemicals aren’t safety tested before they are put on the market. Meanwhile, Congress has dragged out efforts to strengthen the Toxic Substances Control Act and test more chemicals for safety.

The President’s Cancer Panel recommended that people eat organic if possible, filter water and avoid microwaving food in plastic containers. All good advice, but that’s like telling people to avoid cholera without providing clean water.

And that’s why we need another public health revolution in the 21st century.

 

[1] http://www.opensecrets.org/lobby/indusclient.php?id=N13&year=2015

[2] http://www.bbc.co.uk/history/historic_figures/snow_john.shtml

[3] http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6213a3.htm

[4] http://www.nytimes.com/2016/02/07/opinion/sunday/america-is-flint.html

[5] http://journalistsresource.org/studies/society/public-health/lead-poisoning-exposure-health-policy?utm_source=JR-email&utm_medium=email&utm_campaign=JR-email&utm_source=Journalist%27s+Resource&utm_campaign=63b82f94eb-2015_Sept_1_A_B_split3_24_2015&utm_medium=email&utm_term=0_12d86b1d6a-63b82f94eb-79637481





Toxic lies

14 07 2015

Julie Gunlock wrote a blog post entitled “The ‘toxic’ lies behind Jessica Alba’s booming baby business” (to read the post, click here). We’re not necessarily fond of Jessica Alba or her Honest Company, but the statements made by Julie Gunlock need to be addressed. She contends that the Honest Company’s main commodity is fear and the “false promise that their products are safer than others.”

I will not comment on her admonitions about how The Honest Company’s products are full of chemicals (as this should be obvious), or on her observation that Alba had recognized that “many people – particularly women (sic) – have been convinced that common chemicals are a bogeyman that lurks, waiting to harm them” – since everything is made of chemicals, some harmful to us and some not. We aren’t part of the “man made is absolutely bad, natural is absolutely good” camp.

What I will address is her claim that chemicals used in products are “there for a reason” and are completely safe because “chemicals are regulated under nearly a dozen federal agencies and regulations.” She states: “chemicals in products … are used in trace amounts, often improve the safety of those products and have undergone hundreds of safety tests.”

As she herself says, nothing could be further from the truth.

First, let’s address her contention that “chemicals in products…are used in trace amounts.”

The idea that chemicals won’t harm us because the amounts used are so tiny is not new; industry has leaned on it for many years. However, new research is profoundly changing that old belief system. We used to think that a little dose of a poison would do a little bit of harm, and a big dose would do a lot of harm (i.e., “the dose makes the poison”) – because water, as Julie Gunlock herself reminds us, can kill you just as surely as arsenic, given sufficient quantity. The new paradigm shows that exposure to even tiny amounts of chemicals (in the parts-per-trillion range) can have significant impacts on our health – in fact, some chemicals affect the body profoundly in the parts-per-trillion range yet do little harm at much greater doses. The old belief system also did not address how chemicals can change the subtle organization of the brain. According to Dr. Laura Vandenberg of the Tufts University Center for Regenerative and Developmental Biology,[1] “we found chemicals that are working at that really low level, which can take a brain that’s in a girl animal and make it look like a brain from a boy animal, so, really subtle changes that have really important effects.”

In making a risk assessment of any chemical, we now also know that the timing and order of exposure are critical – exposures can happen all at once or one after the other, and that can make a world of difference. And we know another thing: mixtures of chemicals can make each other more toxic. For example, a dose of mercury that would kill 1 out of 100 rats, when combined with a dose of lead that would kill 1 out of 1,000 rats, kills every rat exposed.

And finally, the new science of epigenetics is finding that pollutants and chemicals might be altering the 20,000 to 25,000 genes we’re born with – not by mutating or killing them, but by sending subtle signals that silence them or switch them on or off at the wrong times. This can set the stage for diseases that can be passed down for generations. So exposure to chemicals can alter genetic expression not only in your children, but in your children’s children – and their children too. Researchers at Washington State University found that when pregnant rats were exposed to permethrin, DEET or any of a number of industrial chemicals, the mother rats’ great-granddaughters had a higher risk of early puberty and malfunctioning ovaries – even though those later generations had never been exposed to the chemicals.[2] Another recent study has shown that men who started smoking before puberty fathered sons with significantly higher rates of obesity. And obesity is just the tip of the iceberg – many researchers believe that epigenetics holds the key to understanding cancer, Alzheimer’s, schizophrenia, autism, and diabetes. Other studies are being published which corroborate these findings.[3]

So that’s the thing: we’re exposed to chemicals all day, every day – heavy metals and carcinogenic particles in air pollution; industrial solvents, household detergents, Prozac (and a host of other pharmaceuticals) and radioactive wastes in drinking water; pesticides in flea collars; artificial growth hormones in beef and arsenic in chicken; synthetic hormones in bottles, teething rings and medical devices; formaldehyde in cribs and nail polish; and even rocket fuel in lettuce. Pacifiers are now manufactured with silver nanoparticles so they can be sold as “antibacterial.” These exposures all add up – and while the body can flush out some of these chemicals, it cannot excrete others. Chlorinated pesticides such as DDT, for example, can remain in the body for 50 years. Scientists call the chemicals in our body our “body burden”. Everyone alive carries at least 700 contaminants within their body.[4]
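To make the idea of a growing “body burden” concrete, here is a minimal sketch – purely illustrative, not a real toxicokinetic model – contrasting a chemical the body flushes out quickly with a persistent one like DDT. The daily intake and half-life figures are invented assumptions, chosen only to show the pattern.

    # Rough sketch only: constant daily intake with simple first-order elimination.
    # The intakes and half-lives are invented for illustration, not measured values
    # for any real chemical.
    import math

    def body_burden(daily_intake, half_life_days, days):
        """Amount remaining in the body after `days` of daily intake plus decay."""
        k = math.log(2) / half_life_days          # first-order elimination rate
        burden = 0.0
        for _ in range(days):
            burden = burden * math.exp(-k) + daily_intake
        return burden

    years = 30
    quick = body_burden(daily_intake=1.0, half_life_days=2, days=365 * years)
    persistent = body_burden(daily_intake=1.0, half_life_days=365 * 7, days=365 * years)

    print(f"Readily excreted chemical after {years} years: ~{quick:.1f} units")
    print(f"Persistent chemical after {years} years:       ~{persistent:.0f} units")
    # The readily excreted chemical levels off at a few days' worth of intake;
    # the persistent one keeps climbing toward thousands of days' worth.

The numbers mean nothing in themselves; the point is the shape of the curves – when elimination is slow, each day’s exposure piles on top of what is already there.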

This cumulative exposure could mean that at some point your body reaches a tipping point and, like falling dominoes, the stage is set for something disastrous happening to your health.

The generations born from 1970 on are the first to be raised in a truly toxified world. Based on the findings of many studies of children’s health issues, probably one in three of the children you know suffers from a chronic illness.[5] It could be cancer or birth defects – perhaps asthma, or a problem that affects the child’s mind and behavior, such as a learning disorder, ADHD, autism or even a peanut allergy. We do know, for example:

  • Childhood cancer, once a medical rarity, is now the second leading cause of death (after accidents) in children aged 5 to 14 years.[6]
  • According to the American Academy of Allergy Asthma & Immunology, for the period 2008-2010, asthma prevalence was higher among children than adults – and asthma rates for both continue to grow.[7]
  • Autism rates have increased by at least 200 percent.
  • Miscarriages and premature births are on the rise.
  • The ratio of male to female babies is declining.
  • Teenage girls increasingly face endometriosis.

Dr. Warren Porter delivered a talk at the 25th National Pesticide Forum in 2007, in which he explained that a lawn chemical mixture used across the country (2,4-D, mecoprop and dicamba) was tested to see if it would alter the capacity of mice to keep fetuses in utero. The test found that the lowest dose of the mixture had the greatest effect – a common endocrine response.[8]

Illness does not necessarily show up in childhood. Environmental exposures, from conception through early life, can set a person’s cellular code for life and can cause disease at any time, into old age. And the new science of epigenetics is showing us that these exposures can impact not only us, but our children, grandchildren and great-grandchildren.

I think that pretty much demolishes the argument that chemicals in “trace amounts” don’t do us any harm.

Second, what about her contention that chemicals “are regulated under nearly a dozen federal agencies and regulations” and “have undergone hundreds of safety tests”?

 The chief legal authority for regulating chemicals in the United States is the 1976 Toxic Substances Control Act (TSCA).[9]

It is widely agreed that the TSCA is not doing the job of protecting us, and that the United States needs profound change in this area. Legislation entitled the 2013 Chemical Safety Improvement Act, introduced by a bipartisan group of 26 senators, is designed to improve the outdated TSCA, but it is still in committee. Meanwhile, the chemicals market values function, price and performance over safety. That poses a barrier to the scientific and commercial success of green chemistry in the United States, and could ultimately hinder the U.S. chemical industry’s competitiveness in the global marketplace as green technologies accelerate under the European Union’s requirements.

We assume the TSCA is testing and regulating chemicals used in industry.[10] It is not:

  • Of the more than 60,000 chemicals in use prior to 1976, most were “grandfathered in”; only 263 were tested for safety and only 5 were restricted. Today over 80,000 chemicals are routinely used in industry, and the number that have been tested for safety has not materially changed since 1976. So we cannot know the risks of exposing ourselves to certain chemicals. The default position is: no information about a chemical = no action.
  • The chemical spill which occurred in West Virginia in 2014 was of “crude MCHM”, or 4-methylcyclohexanemethanol, one of the chemicals grandfathered in under the Toxic Substances Control Act of 1976. That means that nobody knows for sure what that chemical can do to us.
    • Carcinogenic effects? No information available.
    • Mutagenic effects? No information available.
    • Developmental toxicity? No information available.

Lack of information is the reason the local and federal authorities were so unsure of how to advise the local population about their drinking water supplies. (And by the way, in January 2014 a federal lawsuit was filed in Charleston, WV, claiming that the manufacturer of MCHM hid the “highly toxic and carcinogenic properties” of components of MCHM, hexane and methanol, both of which have been tested and found to cause diseases such as cancer.)

We assume that the TSCA requires manufacturers to demonstrate that their chemicals are safe before they go into use. It does not:

  • The EPA requires only a “Premanufacture Notification” for a new chemical; no test data of any kind is required.[11] The EPA receives between 40 and 50 of these notices each week, and 8 out of 10 are approved, with or without test data, with no restrictions on their proposed use. As 3M puts it on their PMN forms posted on EPA’s web site, “You are not required to submit the listed test data if you do not have it.”
  • The TSCA says the government has to prove actual harm caused by the chemical in question before any controls can be put in place. The catch-22 is that chemical companies don’t have to develop toxicity data or submit it to the EPA for an existing product unless the agency finds that it will pose a risk to humans or the environment – which is difficult to establish if there is no data in the first place. Lack of evidence of harm is taken as evidence of no harm.

We assume that the TSCA requires manufacturers to list all ingredients in a product, so that if we have an allergy or reaction to certain chemicals we can check whether a product is free of them. It does not:

  • The TSCA allows chemical manufacturers to keep ingredients in some products secret.   Nearly 20% of the 80,000 chemicals in use today are considered “trade secrets”.  This makes it impossible for consumers to find out what’s actually in a product.  And there is no time limit on the period in which a chemical can be considered a trade secret.

These limitations all help to perpetuate the chemical industry’s failure to innovate toward safer chemical and product design.  It’s one of the reasons the USA is one of the few nations in the world in which asbestos is not banned.

Finally, and because I just couldn’t resist: her example of using what she concedes are “toxic fragrances” to cover up that “other toxic stink – the one coming out of your baby” speaks for itself.

In conclusion, I don’t think that we’re being alarmist in trying to find better alternatives for products we use every day.  Nor are the promises of companies like Alba’s false.

 

[1] Living on Earth, March 16, 2012, http://www.loe.org/shows/segments.html?programID=12-P13-00011&segmentID=1

[2] Sorensen, Eric, “Toxicants cause ovarian disease across generations”, Washington State University, http://news.wsu.edu/pages/publications.asp?Action=Detail&PublicationID=31607

[3] http://www.sciguru.com/newsitem/13025/Epigenetic-changes-are-heritable-although-they-do-not-affect-DNA-structure; see also http://www.eeb.cornell.edu/agrawal/documents/HoleskiJanderAgrawal2012TREE.pdf and http://www.the-scientist.com/?articles.view/articleNo/32637/title/Lamarck-and-the-Missing-Lnc/

[4] http://www.chemicalbodyburden.org/whatisbb.htm

[5] Theofanidis, D, MSc., “Chronic Illness in Childhood: Psychosocial and Nursing Support for the Family”, Health Science Journal, http://www.hsj.gr/volume1/issue2/issue02_rev01.pdf

[6] Ward, Elizabeth, et al; Childhood and adolescent cancer statistics, 2014, CA: Cancer Journal for Clinicians, Vol 64, issue 2, pp. 83-103, March/April 2014

[7] http://www.aaaai.org/about-the-aaaai/newsroom/asthma-statistics.aspx

[8] Porter, Warren, PhD; “Facing Scientific Realities: Debunking the “Dose Makes the Poison” Myth”, National Pesticide Forum, Chicago, 2007; http://www.beyondpesticides.org/infoservices/pesticidesandyou/Winter%2007-08/dose-poison-debunk.pdf

[9] The “regulations” mentioned, all of which fall under the TSCA, might include:

  • the Environmental Protection Agency’s Chemical Action Plans for certain chemicals – to date, 10 chemicals have Chemical Action Plans in place. These plans attempt to outline the risks each chemical may present and identify the specific steps the agency is taking to address the concerns.
  • Confidential Business Information (CBI) – designed to protect intellectual property and confidential business information.
  • Chemical Data Reporting (CDR) Rule: use and exposure information to help the EPA screen and prioritize chemicals for additional review.
  • Chemical Prioritization: allows the EPA to identify which chemicals in commerce warrant additional review.
  • Risk Assessment: Under TSCA, EPA assesses chemicals using conservative assumptions about the possible hazards a chemical may pose.

[10] http://www.chemicalindustryarchives.org/factfiction/testing.asp

[11] Ibid.





Climate change and the Louisiana delta

8 09 2014

 

In the August 28, 2014 issue of Huff Post Green, an article by Bob Marshall of The Lens caught my eye, because it’s another instance of climate change affecting the landscape in one of our most vulnerable areas: the Louisiana delta. I’ve excerpted some of it below; if you want to read the full article, click here. The NEXT post will be about how the textile industry is contributing to climate change!

[Graphic: Al Shaw of ProPublica and Brian Jacobs of Knight-Mozilla Open News]

In just 80 years, some 2,000 square miles of Louisiana’s coastal landscape have turned to open water, wiping places off maps, bringing the Gulf of Mexico to the back door of New Orleans and posing a lethal threat to an energy and shipping corridor vital to the nation’s economy.

And it’s going to get worse, even quicker.

Scientists now say one of the greatest environmental and economic disasters in the nation’s history is rushing toward a catastrophic conclusion over the next 50 years, so far unabated and largely unnoticed.

At the current rates that the sea is rising and land is sinking, National Oceanic and Atmospheric Administration scientists say by 2100 the Gulf of Mexico could rise as much as 4.3 feet across this landscape, which has an average elevation of about 3 feet. If that happens, everything outside the protective levees — most of Southeast Louisiana — would be underwater.

 The effects would be felt far beyond bayou country. The region best known for its self-proclaimed motto “laissez les bons temps rouler” — let the good times roll — is one of the nation’s economic linchpins.

 This land being swallowed by the Gulf is home to half of the country’s oil refineries, a matrix of pipelines that serve 90 percent of the nation’s offshore energy production and 30 percent of its total oil and gas supply, a port vital to 31 states, and 2 million people who would need to find other places to live.

 The landscape on which all that is built is washing away at a rate of a football field every hour, 16 square miles per year.

For years, most residents didn’t notice because they live inside the levees and seldom travel into the wetlands. But even those who work or play in the marshes were misled for decades by the gradual changes in the landscape. A point of land eroding here, a bayou widening there, a spoil levee sinking a foot over 10 years. In an ecosystem covering thousands of square miles, those losses seemed insignificant. There always seemed to be so much left.

Now locals are trying to deal with the shock of losing places they had known all their lives — fishing camps, cypress swamps, beachfronts, even cattle pastures and backyards — with more disappearing every day.

The story of how that happened is a tale of levees, oil wells and canals leading to destruction on a scale almost too big to comprehend — and perhaps too late to rebuild. It includes chapters on ignorance, unintended consequences and disregard for scientific warnings. It’s a story that is still unfolding.

By the time New Orleans was founded in 1718, the main channel of the river was the beating heart of a system pumping sediment and nutrients through a vast circulatory network that stretched from present-day Baton Rouge south to Grand Isle, west to Texas and east to Mississippi. As late as 1900, new land was pushing out into the Gulf of Mexico.

A scant 70 years later, that huge, vibrant wetlands ecosystem would be at death’s door. The exquisite natural plumbing that made it all possible had been dismantled, piece by piece, to protect coastal communities and extract oil and gas.

 For communities along its banks, the Mississippi River has always been an indispensable asset and their gravest threat. The river connected their economies to the rest of the world, but its spring floods periodically breached locally built levees, quickly washing away years of profits and scores of lives. Some towns were so dependent on the river, they simply got used to rebuilding.


That all changed with the Great Flood of 1927.

Swollen by months of record rainfall across the watershed, the Mississippi broke through levees in 145 places, flooding the midsection of the country from Illinois to New Orleans. Some 27,000 square miles went under as much as 30 feet of water, destroying 130,000 homes, leaving 600,000 people homeless and killing 500.

Stunned by what was then the worst natural disaster in U.S. history, Congress passed the Flood Control Act of 1928, which ordered the U.S. Army Corps of Engineers to prevent such a flood from ever happening again. By the mid-1930s, the corps had done its job, putting the river in a straitjacket of levees.

But the project that made the river safe for the communities along the river would eventually squeeze the life out of the delta. The mud walls along the river sealed it off from the landscape sustained by its sediment. Without it, the sinking of land that only occurred during dry cycles would start, and never stop.

If that were all we had done to the delta, scientists have said, the wetlands that existed in the 1930s could largely be intact today. The natural pace of sinking — scientists call it subsidence — would have been mere millimeters per year.

But we didn’t stop there. Just as those levees were built, a nascent oil and gas industry discovered plentiful reserves below the delta’s marshes, swamps and ridges.

At the time, wetlands were widely considered worthless — places that produced only mosquitoes, snakes and alligators. The marsh was a wilderness where few people could live, or even wanted to.

There were no laws protecting wetlands. Besides, more than 80 percent of this land was in the hands of private landowners who were happy to earn a fortune from worthless property.

Free to choose the cheapest, most direct way to reach drilling sites, oil companies dredged canals off natural waterways to transport rigs and work crews. The canals averaged 13 to 16 feet deep and 140 to 150 feet wide — far larger than natural, twisting waterways.

 Eventually, some 50,000 wells were permitted in the coastal zone. The state estimates that roughly 10,000 miles of canals were dredged to service them, although that only accounts for those covered by permitting systems. The state began to require some permits in the 1950s, but rigorous accounting didn’t begin until the Clean Water Act brought federal agencies into play in 1972.

“Once the oil companies come in and started dredging all the canals, everything just started falling apart,” said Joseph Bourgeois, 84, who grew up and still lives in the area.

From 1930 to 1990, as much as 16 percent of the wetlands was turned to open water as those canals were dredged. But as the U.S. Department of the Interior and many others have reported, the indirect damages far exceeded that:

  • Saltwater crept in

Canal systems leading to the Gulf allowed saltwater into the heart of freshwater marshes and swamps, killing plants and trees whose roots held the soils together. As a side effect, the annual supply of plant detritus — one way a delta disconnected from its river can maintain its elevation — was seriously reduced.

  • Shorelines crumbled

Without fresh sediment and dead plants, shorelines began to collapse, increasing the size of existing water bodies. Wind gained strength over ever-larger sections of open water, adding to land loss. Fishers and other boaters used canals as shortcuts across the wetlands; their wakes also sped shoreline erosion. In some areas, canals grew twice as wide within five years.

  • Spoil levees buried and trapped wetlands

When companies dredged canals, they dumped the soil they removed alongside, creating “spoil levees” that could rise higher than 10 feet and twice as wide.

The weight of the spoil on the soft, moist delta caused the adjacent marshes to sink. In locations of intense dredging, spoil levees impounded acres of wetlands. The levees also impeded the flow of water — and sediments — over wetlands during storm tides.

If there were 10,000 miles of canals, there were 20,000 miles of levees. Researchers estimate that canals and levees eliminated or covered 8 million acres of wetlands.

 All this disrupted the delta’s natural hydrology — its circulatory system — and led to the drowning of vast areas. Researchers have shown that land has sunk and wetlands have disappeared the most in areas where canals were concentrated.

There are other forces at work, including a series of geologic faults in the delta and the rock layers beneath, but a U.S. Department of Interior report says oil and gas canals are ultimately responsible for 30 to 59 percent of coastal land loss. In some areas of Barataria Bay, it’s close to 90 percent.

 Even more damage was to come as the oil and gas industry shifted offshore in the late 1930s, eventually planting about 7,000 wells in the Gulf. To carry that harvest to onshore refineries, companies needed more underwater pipelines. So they dug wider, deeper waterways to accommodate the large ships that served offshore platforms.

 Congress authorized the Corps of Engineers to dredge about 550 miles of navigation channels through the wetlands. The Department of Interior has estimated that those canals, averaging 12 to 15 feet deep and 150 to 500 feet wide, resulted in the loss of an additional 369,000 acres of coastal land.

 Researchers eventually would show that the damage wasn’t due to surface activities alone. When all that oil and gas was removed from below some areas, the layers of earth far below compacted and sank. Studies have shown that coastal subsidence has been highest in some areas with the highest rates of extraction.

 The oil and gas industry, one of the state’s most powerful political forces, has acknowledged some role in the damages, but so far has defeated efforts to force companies to pay for it.

Even as politicians fought such efforts in court, it was hard to deny what was happening on the ground.

By 2000, coastal roads that had flooded only during major hurricanes were going underwater when high tides coincided with strong southerly winds. Islands and beaches that had been landmarks for lifetimes were gone, lakes had turned into bays, and bays had eaten through their borders to join the Gulf.

Today, in some basins around New Orleans, land is sinking an inch every 30 months. At this pace, by the end of the century this land will sink almost 3 feet in an area that’s barely above sea level today.

Meanwhile, global warming is causing seas to rise worldwide. Coastal landscapes everywhere are now facing a serious threat, but none more so than Southeast Louisiana.

The federal government projects that seas along the U.S. coastline will rise 1.5 to 4.5 feet by 2100. Southeast Louisiana would see “at least” 4 to 5 feet, said NOAA scientist Tim Osborn.

 The difference: This sediment-starved delta is sinking at one of the fastest rates of any large coastal landscape on the planet at the same time the oceans are rising.

Maps used by researchers to illustrate what the state will look like in 2100 under current projections show the bottom of Louisiana’s “boot” outline largely gone, replaced by a coast running practically straight east to west, starting just south of Baton Rouge. The southeast corner of the state is represented only by two fingers of land – the areas along the Mississippi River and Bayou Lafourche that currently are protected by levees.

 Similar predictions had been made for years. But Hurricane Katrina finally galvanized the state Legislature, which pushed through a far-reaching coastal restoration plan in 2007.

 The 50-year, $50 billion Master Plan for the Coast (in 2012 dollars) includes projects to build levees, pump sediment into sinking areas, and build massive diversions on the river to reconnect it with the dying delta.

The state’s computer projections show that by 2060 — if projects are completed on schedule — more land could be built annually than is lost to the Gulf.

But there are three large caveats.

  • The state is still searching for the full $50 billion. Congress so far has been unwilling to help.
  • If the plan is to work, sea-level rise can’t be as bad as the worst-case scenario.
  • Building controlled sediment diversions on the river, a key part of the land-building strategy, has never been done before. The predictions, then, are largely hypothetical, although advocates say the concept is being proven by an uncontrolled diversion at West Bay, near the mouth of the river.

 Trying to keep pace with the vanishing pieces of southeast Louisiana today is like chasing the sunset; it’s a race that never ends.

Signs of the impending death of this delta are there to see for any visitor.

Falling tides carry patches of marsh grass that have fallen from the ever-crumbling shorelines.

Pelicans circle in confusion over nesting islands that have washed away since last spring.

Pilings that held weekend camps surrounded by thick marshes a decade ago stand in open water, hundreds of yards from the nearest land — mute testimony to a vanishing culture.

Shrimpers push their wing nets in lagoons that were land five years ago.

The bare trunks of long-dead oaks rise from the marsh, tombstones marking the drowning of high ridges that were built back when the river pumped life-giving sediment through its delta.

“If you’re a young person you think this is what it’s supposed to look like,” Lambert said. “Then when you’re old enough to know, it’s too late.”

 





How we’re protected from chemical exposures.

4 03 2014

I always thought I wouldn’t have to worry about some things – like, oh,  incoming missiles,  terrorist plots, and chemicals which could destroy me – because I thought my government would have something in place to protect me.  But the recent chemical spill in West Virginia changed that: for those of you who don’t know, that was a spill of  about 10,000 gallons of what is called a “coal cleaner”  into the Elk River, contaminating the water supply of 300,000 people.

When I first began looking into the chemicals used in fabrics, and finding out that the soft, luscious fabrics we surround ourselves with every day are filled with chemicals that can cause me grievous harm, I was stopped in my tracks when someone suggested that the government wouldn’t let those chemicals into products sold in the USA – so how could fabrics contain them? I didn’t have an answer for that, because at the time I too thought that “of course the government must have laws in place to make sure we aren’t exposed to dangerous chemicals”!

The current regulation of chemicals in the US dates back to 1976 and the Toxic Substances Control Act (TSCA), which regulates the introduction of new or already existing chemicals.

But before talking about the TSCA, let’s first take a quick look at what’s changed since 1976,  because our understanding of the extent and pathways of chemical exposures has fundamentally changed since then.

We now know that the old belief that “the dose makes the poison” (i.e., the higher the dose, the greater the effect) is simply wrong. Studies are finding that even tiny quantities of chemicals – in the parts-per-trillion range – can have significant impacts on our health. We’re also finding that mixtures of chemicals, each below its “no observed effect level”, may have greater environmental impacts than the chemicals alone. In other words, toxins can make each other more toxic: a dose of mercury that would kill 1 out of 100 rats, when combined with a dose of lead that would kill 1 out of 1,000 rats, kills every rat exposed.

We also now know that the timing and order of exposure are critical – exposures can happen one after the other, or all at once. The number of possible combinations of exposures is huge, and knowledge about the effects of mixed exposures is limited. During gestation and through early childhood the body grows rapidly under a carefully orchestrated process that depends on a series of events. When one of those events is interrupted, the next event is disrupted – and so on – until permanent and irreversible changes result. These changes could be very subtle, like an alteration in how the brain develops that affects, for example, learning ability. Or they could modify the development of an organ, predisposing it to cancer later in life.

Add to that the concept of individual susceptibility. For instance, a large part of the population is unable to effectively excrete heavy metals, so their body burden accumulates faster and their illnesses are more obvious. They are the “canaries in the coal mine” in an environment that’s becoming ever more toxic.

We’re finding that chemicals migrate from products into the environment (and remember, we are part of the environment).

And this is where it gets really interesting:

Each of us starts life with a particular set of genes, 20,000 to 25,000 of them. Scientists are now amassing a growing body of evidence that pollutants and chemicals might be altering those genes – not by mutating or killing them, but by sending subtle signals that silence them or switch them on at the wrong times. This can set the stage for diseases which can be passed down for generations. The study of these heritable changes in gene expression – the chemical reactions that switch parts of the genome off and on at strategic times and locations – is called “epigenetics”.

They’re finding that exposure to chemicals can alter genetic expression, not only in your children, but in your children’s children – and their children too. Researchers at Washington State University found that when pregnant rats were exposed to permethrin, DEET or any of a number of industrial chemicals, the mother rats’ great-granddaughters had a higher risk of early puberty and malfunctioning ovaries – even though those later generations had never been exposed to the chemicals.[1] Another recent study has shown that men who started smoking before puberty fathered sons with significantly higher rates of obesity. And obesity is just the tip of the iceberg – many researchers believe that epigenetics holds the key to understanding cancer, Alzheimer’s, schizophrenia, autism, and diabetes. Other studies are being published which corroborate these findings.[2]

With the advent of biomonitoring, and a growing recognition of the importance of early life exposures, low dose effects and epigenetics, the science linking environmental exposures to biological effects (i.e., disease) is becoming overwhelming.

And here’s why the Toxic Substances Control Act of 1976 is not doing the job of protecting us:

  • We assume the TSCA is testing and regulating chemicals used in industry. It is not:
    • Of the more than 60,000 chemicals in use prior to 1976, most were “grandfathered in”; only 200 were tested for safety and only 5 were restricted. Today over 80,000 chemicals are routinely used in industry, and the number that have been tested for safety has not materially changed since 1976. So we cannot know the risks of exposing ourselves to certain chemicals. The default position is: no information about a chemical = no action.
    • For those of you who don’t know, the spill in West Virginia was of “crude MCHM”, or 4-methylcyclohexanemethanol, one of the chemicals grandfathered in under the Toxic Substances Control Act of 1976. That means that nobody knows for sure what that chemical can do to us.
      • Carcinogenic effects? No information available.
      • Mutagenic effects? No information available.
      • Developmental toxicity? No information available.
    Lack of information is the reason the local and federal authorities were so unsure of how to advise the local population about their drinking water supplies. (And by the way, in January 2014 a federal lawsuit was filed in Charleston, WV, claiming that the manufacturer of MCHM hid the “highly toxic and carcinogenic properties” of components of MCHM, hexane and methanol, both of which have been tested and found to cause diseases such as cancer.)
  • We assume that the TSCA requires manufacturers to demonstrate their chemicals are safe before they go into use.  It does not:
    • The law says the government has to prove actual harm caused by the chemical in question before any controls can be put in place. The catch-22 is that chemical companies don’t have to develop toxicity data or submit it to the EPA for an existing product unless the agency finds that it will pose a risk to humans or the environment – which is difficult to establish if there is no data in the first place. Lack of evidence of harm is taken as evidence of no harm.
  • We assume that the TSCA requires manufacturers to list all ingredients in a product, so that if we have an allergy or reaction to certain chemicals we can check whether a product is free of them. It does not:
    • TSCA allows chemical manufacturers to keep ingredients in some products secret.   Nearly 20% of the 80,000 chemicals in use today are considered “trade secrets”.  This makes it impossible for consumers to find out what’s actually in a product.  And there is no time limit on the period in which a chemical can be considered a trade secret.

These limitations all help to perpetuate the chemical industry’s failure to innovate toward safer chemical and product design. It’s one of the reasons the USA is one of the few industrialized nations in which asbestos is still allowed in many products.

In 2013 the Chemical Safety Improvement Act (CSIA) was introduced. While it is an improvement over the TSCA, it does not deliver the critical fixes that are needed. The Natural Resources Defense Council suggests some steps that we must take to reform the TSCA, and these apply to the CSIA as well:

  • Require new and existing chemicals be assessed for safety – with mandatory and enforceable deadlines.  “Innocent until proven guilty” should not apply to chemicals.
  • Establish safety standards, especially with regard to children and other vulnerable groups.
  • Give the EPA the authority to protect the public from unsafe chemicals, including expedited action for those deemed the most toxic.
  • Avoid “grandfathering in” existing chemicals, which spells trouble for the future.
  • Ensure the public’s right to know about the safety and use of chemicals.
  • Allow states to maintain laws which exceed federal protections to safeguard their citizens.




What does the new TB117-2013 mean to you?

16 12 2013

California has approved a new flammability standard for residential furniture that is receiving widespread praise among environmentalists. But we’d like you to examine, with us, some details about the new standard that you’ll need to know to keep you and your family safe from extremely toxic flame retardant chemicals.

California is the only state in the U.S. with a mandatory flammability standard for residential furniture. The original law, TB117, was passed with all the good will in the world – to protect people from dying in house fires by giving them time to escape. But, as is often the case, there were unintended consequences: the fire retardant chemicals have been found to be linked to cancer, developmental problems, reduced IQ, impaired fertility – and more. These chemicals both persist (i.e., last a long time) and bioaccumulate (i.e., are absorbed faster than the body can eliminate them, leading to a risk of chronic poisoning) in human systems. And the final straw: ironically, the chemicals don’t protect us from fires – they just allow the material to pass the flammability test. In actual fires the treated materials do burn, just as massively as untreated foam, releasing toxic smoke into the air; one pundit has said that firefighters have more to fear from the smoke than from the actual fire.

Recently there has been growing pressure to change California’s “Technical Bulletin 117,” which effectively required furniture manufacturers to inject flame retardant chemicals into the polyurethane foam used in all upholstered furniture sold in the state. (Please note: the law only pertained to filling materials.) Because California is such a huge market, this law has become a de facto national standard. The pressure was fueled by a series of articles in the Chicago Tribune entitled “Playing with Fire” (click here to read the articles), and more recently by the HBO film Toxic Hot Seat, both of which exposed the considerable health risks of flame retardant chemicals, as well as the chemical industry’s attempts to thwart reform.

Why are flame retardant chemicals required in polyurethane foam? Because polyurethane is basically solid gasoline – an accelerant. The old standard required the foam to withstand an open flame for 12 seconds without igniting. Because untreated foam cannot do that, the chemicals were added to prevent ignition.

What makes the new TB117-2013 different is that the test methods have changed. Legislators decided to amend the manner in which flammability is measured. They reasoned that most house fires start from smoldering cigarettes, which cause the cover fabric to smolder and catch fire – not from within the foam of the cushion. Upholstery cover fabrics therefore play a more important role in fire behavior than filling materials: flames start on the fabric, not deep within the cushions, so the best way to keep the foam from igniting is to make sure the surface materials don’t smolder in the first place.

So the new test did away with the 12-second open flame test and replaced it with a smolder-only test. In this test, a lighted cigarette (not an open flame) is placed on the surface of the furniture. If the resulting char is 2 inches or less, the furniture passes. This is a much easier test to pass than the open flame test.

So the new TB117-2013 enables foam manufacturers to reduce or eliminate flame retardant chemicals – but it doesn’t forbid their use. The law was designed to let manufacturers drop the flame retardants, but using them is not illegal; it’s up to each manufacturer to decide how to meet the new standard.

Most fabrics used in upholstery today are synthetics or synthetic blends (natural fiber/synthetic). And synthetics are created from crude oil – so they too are basically solid gasoline. An accelerant. Fabrics can be flame retarded easily and cheaply, and it’s very commonly done. So although foam manufacturers can (if they so choose) eliminate flame retardant chemicals in the foam, the burden of passing the smolder test now falls on the fabric. It seems to me that the flame retardant chemicals are now simply going to be found in the fabrics rather than the foam.

The new law was originally supposed to go into effect on July 1, 2014, but manufacturers, who said they “needed the additional times to deplete current supplies and effectuate the new regulatory changes,” won an extension to January 1, 2015. However, starting in January 2014, manufacturers will be able to sell furniture with a “TB117-2013” tag – so consumers should make sure to ask whether the sofa or chair has been treated with flame retardant chemicals. Manufacturers are not required to disclose whether they use flame retardants, and few label their products.

If you really want to be sure, the Center for Environmental Health can test foam to detect the presence of flame retardants.  The tests only indicate whether certain elements are present, such as chlorine or bromine.  If so, it is likely the foam was treated with flame retardants.  If you want information on how to use this free service, click here.

Even if the foam is  tested and found not to contain flame retardants, that is by no means a clean bill of health for your sofa, because the fabrics may well contain flame retardants.  And a TB117-2013 label on a piece of furniture is not a guarantee that there are no flame retardants used in the piece.

And we think it’s pretty critical to add this final caveat: flame retardant chemicals are just ONE of the many chemicals which may be found in your fabrics. Textile production uses a lot of chemicals, most of which have toxicity profiles every bit as unsavory as those of flame retardants: consider formaldehyde, perfluorocarbons (PFCs), benzene, APEOs, polychlorinated biphenyls (PCBs) and bisphenol A in synthetics, and heavy metals such as lead, mercury and cadmium. So limiting yourself to eliminating flame retardant chemicals from the fabrics or furniture you live with – as wonderful as that is – means you’re not seeing the forest for the trees.





I know the polyester fabric costs less, but what else comes with it?

19 06 2013

When plastic was introduced in 1869, it was advertised as being able to replace natural products like ivory and tortoiseshell in items such as jewelry, combs and buttons – so it would “no longer be necessary to ransack the earth in pursuit of substances which are constantly growing scarcer.”(1)

What a success: plastics are versatile – they can be hard or soft, flexible or brittle, durable, lightweight and formable – in fact, they’re so versatile that they’ve become a vital manufacturing ingredient for nearly every existing industry. They are practically ubiquitous. And now we’re beginning to find that our relationship with plastic is not healthy. Manufactured from dwindling fossil fuels, plastic leaches toxic chemicals into our groundwater, litters landscapes and destroys marine life. As Susan Freinkel points out in her book, Plastic: A Toxic Love Story, discoveries of plastic’s toxic effects are being made in a world that contains at least ten times more plastic than it did half a century ago. In the ’60s, an American might have used about 30 pounds of plastic a year; in 2011, 300 pounds. And we’re producing 300 million tons more every year.(2)

Plastics were marketed as “the material of the future”. And how true that is, because large polymers take practically forever to break down, so much of the plastic that has ever been manufactured is still with us – in landfills, in the plastic-filled gyres found in our oceans (where the mass of plastic exceeds that of plankton sixfold)(3), and in the stomachs of northern seabirds. And it will stay there for hundreds if not thousands of years.

Some chemicals can affect children’s bodies much more than adults’, and, as Judith Shulevitz reminds us in the New Republic, “plastic totally dominates the world of the child. Children drink formula in baby bottles and water in sippy cups, eat food with plastic spoons on bright melamine trays, chew on bath books and rubber ducks, and, if they don’t do these things at your house, they’ll do them at someone else’s or at school, no matter how many notes you write or mad-housewife-ish you’re willing to appear.” (4)

There are many studies to support the belief that these plastics are changing us. But what has really changed is the scientific understanding of how these chemicals poison us, which has undergone a conceptual revolution: our grandchildren may look back on our current attitudes about living with these chemicals the way we now look at the doctors of the 1950s who appeared in ads for cigarettes.

Old toxicological notions are being stood on their heads. Take the old “dose makes the poison” notion, first expressed by Paracelsus in the 16th century, which holds that a substance can only be toxic if it is present in the body at a high enough concentration – because all things are poisonous in the right amounts. He wrote: “All substances are poisons; there is none which is not a poison. The right dose differentiates a poison from a remedy”. But today scientists are finding that the timing of exposure might be the critical factor – a fetus might respond to a chemical at a hundredth (or less) of the concentration that would affect an adult, and remain altered for life even after the chemical is taken away. Another theory is known as the “developmental origins of health and disease,” or DOHaD (for more about DOHaD, click here), and it paints a picture of almost unimaginably impressionable bodies, responsive to biologically active chemicals until the third generation.(5)

New methods have taken much of the guesswork out of what were once theories. Biomonitoring means that scientists can now measure the degree to which people have actually been exposed to poisonous substances, where in the past their conclusions were largely inferred. And microarray profiling means we’re beginning to understand how tiny doses of certain chemicals switch genes on or off in harmful ways during exquisitely sensitive periods of development.

Exposure to all that plastic has a cumulative effect. Now toxicologists can see that lots of tiny doses from many different estrogen-mimicking chemicals entering the body by multiple pathways can have a big impact. “If you’re being exposed to two-hundred fifty chemicals and only thirty of them have estrogenic activity, but they’re each very low, still, thirty of them might add up to be significant,” says Jerrold Heindel, of the National Institute of Environmental Health Sciences (NIEHS).
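Heindel’s point about many small doses adding up can be illustrated with a back-of-the-envelope sketch. Under a simple concentration-addition assumption – one common way toxicologists reason about mixtures of similarly acting chemicals – each chemical’s exposure is weighted by a potency relative to estradiol and then summed. Every name, exposure and potency below is hypothetical, invented purely to show the arithmetic.

    # Hypothetical illustration of dose addition for estrogen-mimicking chemicals.
    # Each chemical alone contributes very little estrogenic activity, but thirty
    # such small contributions sum to thirty times any single one.
    exposure = {f"chemical_{i:02d}": 0.5 for i in range(1, 31)}   # made-up daily exposures
    potency = {name: 0.001 for name in exposure}                  # made-up potency, estradiol = 1.0

    total = sum(exposure[name] * potency[name] for name in exposure)

    for name in list(exposure)[:3]:
        print(f"{name}: {exposure[name] * potency[name]:.4f} estradiol-equivalent units")
    print(f"Combined load from 30 chemicals: {total:.3f} estradiol-equivalent units")

The values are placeholders, but the design point stands: under dose addition, the combined estrogenic load is the sum over the whole mixture, which can be far larger than any individual ingredient suggests on its own.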

Judith Shulevitz asks why, if we live in this plastic environment, we’re not sicker than we are – and whether we are sicker than we used to be. “The answer is, we’re healthier in some ways and sicker in others. Medical advances mean we’re likelier than ever to survive our illnesses, but all kinds of diseases are on the rise. Childhood cancers are up 20 percent since 1975. Rates of kidney, thyroid, liver, and testicular cancers in adults have been steadily increasing. A woman’s risk of getting breast cancer has gone from one in ten in 1973 to one in eight today. Asthma rates doubled between 1980 and 1995, and have stayed level since. Autism-spectrum disorders have arguably increased tenfold over the past 15 years. According to one large study of men in Boston, testosterone levels are down to a degree that can’t be accounted for by factors such as age, smoking, and obesity. Obesity, of course, has been elevated to the status of an epidemic.”(6)

There are many ways to explain upticks in the rates of any particular ailment; for starters, a better-informed populace and better tools for detecting disease mean more diagnoses. Other possible contributors include Americans’ weirdly terrible eating habits, our sedentary lifestyle, and stress itself. But why can’t we just figure this out and come to some conclusions about certain chemicals as the cause of certain diseases? The biologist John Vandenberg explains the difficulty: “Well, one of the problems is that we would have to take half of the kids in the kindergarten and give them BPA and the other half not. Or expose half of the pregnant women to BPA in the doctor’s office and the other half not. And then we have to wait thirty to fifty years to see what effects this has on their development, and whether they get more prostate cancer or breast cancer. You have to wait at least until puberty to see if there is an effect on sexual maturation. Ethically, you are not going to go and feed people something if you think it harmful, and, second, you have this incredible time span to deal with.”(7)

Determining which diseases, exactly, have fetal origins, which chemicals have the power to sidetrack development, and how, was the goal of a giant, 21-year study of 100,000 children called the National Children's Study (NCS), under the auspices of the National Institutes of Health. In 2013, however, it was announced that the decade-old effort would undergo radical restructuring to cut costs(8), and the NIH ultimately cancelled the study at the end of 2014.

Meanwhile, what can you do to protect yourself and your family, since the government isn’t doing that job?  I’ll have some ideas next week.

(1) Freinkel, Susan, "Plastic: Too Good to Throw Away", The New York Times, March 17, 2011.
(2) Ibid.
(3) Moore, C.J., et al., "Density of Plastic Particles Found in Zooplankton Trawls from Coastal Waters of Northern California to the North Pacific Central Gyre", Algalita Marine Research Foundation.
(4) Shulevitz, Judith, "The Toxicity Panic", The New Republic, April 7, 2011.
(5) Ibid.
(6) Ibid.
(7) Groopman, Jerome, "The Plastic Panic", The New Yorker, May 31, 2010.
(8) Belli, Brita, "Changes to Children's Study Threaten Its Value, Experts Say", Simons Foundation Autism Research Initiative, March 7, 2013.


The new ecoliteracy

16 05 2013

This blog is supposed to be "textile specific," meaning we try to keep the topics restricted to those that apply to the growing of fibers, the manufacture of synthetic fibers, and the processing of those fibers into cloth.

But society seems to have tunnel vision about many things, such as chemical use. Bisphenol A (BPA) is supposed to be bad for us, so it has been prohibited in baby bottles by legislation, and manufacturers of water bottles advertise that their bottles are "BPA-free." Yet BPA is used in many other products, from dental sealants to paper cash register receipts; in textiles, it's used in printing ink emulsions.

I had been bothered by the banning of a chemical in some products but not others (as if BPA in a cash register receipt were somehow less harmful than BPA in a water bottle) when I found this quote from John Muir:

“Whenever we try to pick out anything by itself, we find it hitched to everything else in the universe.”

And then I found Fritjof Capra.

Fritjof Capra, a physicist and systems theorist, is a co-founder of the Center for Ecoliteracy, which supports and advances education for sustainable living. Dr. Capra says that we are all part of an interconnected and self-organizing universe of changing patterns and flowing energy – the “web of life”. Everything is interrelated. He suggests that a full understanding of the critical issues of our time requires a new ecological understanding of life (a new “ecological literacy”) as well as a new kind of “systemic” thinking – thinking in terms of relationships, patterns, and context.

So, in order to understand why world hunger is rising again after a long and steady decline, or what food prices have to do with the price of oil, or why it is so important to grow food locally and organically, we need this new systemic thinking. Fritjof Capra wrote an essay about how to do this, based on a speech he gave at Columbia University in 2008, some of which is excerpted here:

To understand how nature sustains life, we need to move from biology to ecology, because sustained life is a property of an ecosystem rather than a single organism or species. Over billions of years of evolution, the Earth’s ecosystems have evolved certain principles of organization to sustain the web of life. Knowledge of these principles of organization, or principles of ecology, is what we mean by “ecological literacy.”

…In a nutshell: nature sustains life by creating and nurturing communities. No individual organism can exist in isolation. Animals depend on the photosynthesis of plants for their energy needs; plants depend on the carbon dioxide produced by animals, as well as on the nitrogen fixed by bacteria at their roots; and together plants, animals, and microorganisms regulate the entire biosphere and maintain the conditions conducive to life.

Sustainability, then, is not an individual property but a property of an entire web of relationships.

It always involves a whole community. This is the profound lesson we need to learn from nature. The way to sustain life is to build and nurture community. A sustainable human community interacts with other communities – human and nonhuman – in ways that enable them to live and develop according to their nature. Sustainability does not mean that things do not change. It is a dynamic process of co-evolution rather than a static state.

The fact that ecological sustainability is a property of a web of relationships means that in order to understand it properly, in order to become ecologically literate, we need to learn how to think in terms of relationships, in terms of interconnections, patterns, context. In science, this type of thinking is known as systemic thinking or “systems thinking.” It is crucial for understanding ecology, because ecology – derived from the Greek word oikos (“household”) – is the science of relationships among the various members of the Earth Household.

…systems thinking involves a shift of perspective from the parts to the whole. The early systems thinkers coined the phrase, “The whole is more than the sum of its parts.”

What exactly does this mean? In what sense is the whole more than the sum of its parts? The answer is: relationships. All the essential properties of a living system depend on the relationships among the system’s components. Systems thinking means thinking in terms of relationships.

Once we become ecologically literate, once we understand the processes and patterns of relationships that enable ecosystems to sustain life, we will also understand the many ways in which our human civilization, especially since the Industrial Revolution, has ignored these ecological patterns and processes and has interfered with them. And we will realize that these interferences are the fundamental causes of many of our current world problems.

It is now becoming more and more evident that the major problems of our time cannot be understood in isolation. They are systemic problems, which means that they are all interconnected and interdependent. One of the most detailed and masterful documentations of the fundamental interconnectedness of world problems is the new book by Lester Brown, Plan B (Norton, 2008). Brown, founder of the Worldwatch Institute, demonstrates in this book with impeccable clarity how the vicious circle of demographic pressure and poverty leads to the depletion of resources – falling water tables, wells going dry, shrinking forests, collapsing fisheries, eroding soils, grasslands turning into desert, and so on – and how this resource depletion, exacerbated by climate change, produces failing states whose governments can no longer provide security for their citizens, some of whom in sheer desperation turn to terrorism.

When you read this book, you will understand how virtually all our environmental problems are threats to our food security – falling water tables; increasing conversion of cropland to non-farm uses; more extreme climate events, such as heat waves, droughts, and floods; and, most recently, increasing diversion of grains to biofuel.

A critical factor in all this is the fact that world oil production is reaching its peak. This means that, from now on, oil production will begin to decrease worldwide, extraction of the remaining oil will be more and more costly, and hence the price of oil will continue to rise. Most affected will be the oil-intensive segments of the global economy, in particular the automobile, food, and airline industries.

The search for alternative energy sources has recently led to increased production of ethanol and other biofuels, especially in the United States, Brazil, and China. And since the fuel-value of grain is higher on the markets than its food-value, more and more grain is diverted from food to producing fuels. At the same time, the price of grain is moving up toward the oil-equivalent value. This is one of the main reasons for the recent sharp rise of food prices. Another reason, of course, is that a petrochemical, mechanized, and centralized system of agriculture is highly dependent on oil and will produce more expensive food as the price of oil increases. Indeed, industrial farming uses 10 times more energy than sustainable, organic farming.

The fact that the price of grain is now keyed to the price of oil is only possible because our global economic system has no ethical dimension. In such a system, the question, “Shall we use grain to fuel cars or to feed people?” has a clear answer. The market says, “Let’s fuel the cars.”

This is even more perverse in view of the fact that 20 percent of our grain harvest will supply less than 4 percent of automotive fuel. Indeed, the entire ethanol production in this country could easily be replaced by raising average fuel efficiency by 20 percent (i.e. from 21 mpg to 25 mpg), which is nothing, given the technologies available today.
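
A rough back-of-the-envelope check of the arithmetic in that last paragraph, using only the figures quoted in the excerpt (not additional data from Brown or Capra): raising fuel economy by 20 percent cuts the fuel needed for the same miles driven by roughly 17 percent, which is far more than the less-than-4-percent share of automotive fuel that ethanol supplies.

```python
# Back-of-envelope check of the fuel-efficiency vs. ethanol comparison.
# Figures are the illustrative ones quoted in the excerpt above.

miles_driven = 1.0            # normalize total miles driven to 1
mpg_old, mpg_new = 21, 25.2   # a 20% improvement in average fuel economy

fuel_old = miles_driven / mpg_old
fuel_new = miles_driven / mpg_new

savings = 1 - fuel_new / fuel_old   # fraction of fuel no longer needed
ethanol_share = 0.04                # ethanol supplies < 4% of automotive fuel

print(f"Fuel saved by the efficiency gain:  {savings:.1%}")       # ~16.7%
print(f"Fuel currently supplied by ethanol: <{ethanol_share:.0%}")
# The efficiency gain alone more than replaces the ethanol contribution.
```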

The recent sharp increase in grain prices has wreaked havoc in the world's grain markets, and world hunger is now on the rise again after a long steady decline. In addition, increased fuel consumption accelerates global warming, which results in crop losses from heat waves that make crops wither and from the loss of glaciers that feed the rivers essential to irrigation. When we think systemically and understand how all these processes are interrelated, we realize that the vehicles we drive, and other consumer choices we make, have a major impact on the food supply to large populations in Asia and Africa.

All these problems, ultimately, must be seen as just different facets of one single crisis, which is largely a crisis of perception. It derives from the fact that most people in our society, and especially our political and corporate leaders, subscribe to the concepts of an outdated worldview, a perception of reality inadequate for dealing with our overpopulated, globally interconnected world.

The main message of Lester Brown's Plan B is that there are solutions to the major problems of our time, some of them even simple. But they require a radical shift in our perceptions, our thinking, our values. And, indeed, we are now at the beginning of such a fundamental change of worldview, a change of paradigms as radical as the Copernican Revolution. Systems thinking and ecological literacy are two key elements of the new paradigm, very helpful not only for understanding the interconnections between food, health, and the environment, but also for understanding the profound transformation that is needed globally for humanity to survive.