As the War on Weed Winds Down, Will Monsanto Be the Big Winner?


By Ellen Brown
The Web Of Debt
June 23, 2016

The war on cannabis that began in the 1930s seems to be coming to an end. Research shows that this natural plant, rather than posing a deadly danger to health, has a wide range of therapeutic benefits. But skeptics question the sudden push for legalization, which is largely funded by wealthy investors linked to Big Ag and Big Pharma.

In April, Pennsylvania became the 24th state to legalize medical cannabis, a form of the plant popularly known as marijuana. That makes nearly half of US states. A major barrier to broader legalization has been the federal law under which all cannabis – even the very useful form known as industrial hemp – is classed as a Schedule I controlled substance that cannot legally be grown in the US. But that classification could change soon. In a letter sent to federal lawmakers in April, the US Drug Enforcement Administration said it plans to release a decision on rescheduling marijuana in the first half of 2016.

The presidential candidates are generally in favor of relaxing the law. In November 2015, Senator Bernie Sanders introduced a bill that would repeal all federal penalties for possessing and growing the plant, allowing states to establish their own marijuana laws. Hillary Clinton would not go that far but would move cannabis from Schedule I (a dangerous drug with no accepted medical use and high potential for abuse) to Schedule II (a dangerous drug with accepted medical use but high potential for abuse). Republican candidate Donald Trump says we are losing badly in the war on drugs, and that to win that war all drugs need to be legalized.

But it is Green Party presidential candidate Dr. Jill Stein who has been called “weed’s biggest fan.” Speaking from the perspective of a physician and public health advocate, Stein notes that hundreds of thousands of patients suffering from chronic pain and cancers are benefiting from the availability of medical marijuana under state laws. State economies are benefiting as well. She cites Colorado, where retail marijuana stores first opened in January 2014. Since then, Colorado’s crime rates and traffic fatalities have dropped; and tax revenue, economic output from retail marijuana sales, and jobs have increased.

Among other arguments for changing federal law is that the marijuana business currently lacks access to banking facilities. Most banks, fearful of FDIC sanctions, won’t work with the $6.7 billion marijuana industry, leaving 70% of cannabis companies without bank accounts. That means billions of dollars are sitting around in cash, encouraging tax evasion and inviting theft, to which an estimated 10% of profits are lost. But that problem too could be remedied soon. On June 16, the Senate Appropriations Committee approved an amendment to prevent the Treasury Department from punishing banks that open accounts for state-legal marijuana businesses.

Boosting trade in the new marijuana market is not a good reason for decriminalizing it, of course, if it actually poses a grave danger to health. But there have been no recorded deaths from cannabis overdose in the US. Not that the herb can’t have problematic effects, but the hazards pale compared to alcohol (30,000 deaths annually) and to patented pharmaceuticals, which are now the leading cause of death from drug overdose. Prescription drugs taken as directed are estimated to kill 100,000 Americans per year.

Behind the War on Weed: Taking Down the World’s Largest Agricultural Crop

The greatest threat to health posed by marijuana seems to come from its criminalization. Today over 50 percent of inmates in federal prison are there for drug offenses, and marijuana tops the list. Cannabis cannot legally be grown in the US even as hemp, a form with very low psychoactivity. Why not? The answer seems to have more to do with economic competition and racism than with health.

Cannabis is actually one of the oldest domesticated crops, having been grown for industrial and medicinal purposes for millennia. Until 1883, hemp was also one of the largest agricultural crops (some say the largest). It was the material from which most fabric, soap, fuel, paper and fiber were made. Before 1937, it was also a component of at least 2,000 medicines.

In early America, it was considered a farmer’s patriotic duty to grow hemp. Cannabis was legal tender in most of the Americas from 1631 until the early 1800s. Americans could even pay their taxes with it. Benjamin Franklin’s paper mill used cannabis. Hemp crops produce nearly four times as much raw fiber as equivalent tree plantations; and hemp paper is finer, stronger and lasts longer than wood-based paper. Hemp was also an essential resource for any country with a shipping industry, since it was the material from which sails and rope were made.

Today hemp is legally grown for industrial use in dozens of countries outside the US. A 1938 article in Popular Mechanics claimed it was a billion-dollar crop (the equivalent of about $16 billion today), useful in 25,000 products ranging from dynamite to cellophane. New uses continue to be found. Claims include eliminating smog from fuels, creating a cleaner energy source that can replace nuclear power, removing radioactive contaminants from soil, eliminating deforestation, and providing a very nutritious food source for humans and animals.

To powerful competitors, the plant’s myriad uses seem to have been the problem. Cannabis competed with the lumber industry, the oil industry, the cotton industry, the petrochemical industry and the pharmaceutical industry. In the 1930s, the plant in all its forms came under attack.

Its demonization accompanied the demonization of Mexican immigrants, who were then flooding over the border and were widely perceived to be a threat. Pot smoking was part of their indigenous culture. Harry Anslinger, called “the father of the war on weed,” was the first commissioner of the Federal Bureau of Narcotics, a predecessor to the Drug Enforcement Administration. He fully embraced racism as a tool for demonizing marijuana. He made such comments as “marijuana causes white women to seek sexual relations with Negroes, entertainers and any others,” and “Reefer makes darkies think they’re as good as white men.” In 1937, sensational racist claims like these caused recreational marijuana to be banned; and industrial hemp was banned with it.

Classification as a Schedule I controlled substance came in the 1970s, with President Richard Nixon's War on Drugs. The Shafer Commission, tasked with studying the issue and issuing a report, recommended against the classification; but Nixon ignored the commission.

According to an April 2016 article in Harper’s Magazine, the War on Drugs had political motives. Top Nixon aide John Ehrlichman is quoted as saying in a 1994 interview:

The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. . . . We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.

Competitor or Attractive New Market for the Pharmaceutical Industry?

The documented medical use of cannabis goes back two thousand years, but the Schedule I ban has seriously hampered medical research. Despite that obstacle, cannabis has now been shown to have significant therapeutic value for a wide range of medical conditions, including cancer, Alzheimer’s disease, multiple sclerosis, epilepsy, glaucoma, lung disease, anxiety, muscle spasms, hepatitis C, inflammatory bowel disease, and arthritis pain.

New research has also revealed the mechanism for these wide-ranging effects. It seems the active pharmacological components of the plant mimic chemicals produced naturally by the body called endocannabinoids. These chemicals are responsible for keeping critical biological functions in balance, including sleep, appetite, the immune system, and pain. When stress throws those functions off, the endocannabinoids move in to restore balance.

Inflammation is a common trigger of the disease process in a broad range of degenerative ailments. Stress triggers inflammation, and cannabis relieves both inflammation and stress. THC, the primary psychoactive component of the plant, has been found to have twenty times the anti-inflammatory power of aspirin and twice that of hydrocortisone.

CBD, the most-studied non-psychoactive component, also comes with an impressive list of therapeutic uses, including against cancer and as a super-antibiotic. CBD has been shown to kill “superbugs” that are resistant to currently available drugs. This is a major medical breakthrough, since for some serious diseases antibiotics have reached the end of their usefulness.

Behind the Push for Legalization

The pharmaceutical industry has both much to gain and much to lose from legalization of the cannabis plant in its various natural forms. Patented pharmaceuticals have succeeded in monopolizing the drug market globally. What that industry does not want is to have to compete with a natural plant that anyone can grow in his backyard, one that works better than very expensive pharmaceuticals and without their side effects.

Letitia Pepper, who suffers from multiple sclerosis, is a case in point. A vocal advocate for the decriminalization of marijuana for personal use, she says she has saved her insurance company $600,000 in the last nine years, using medical marijuana in place of a wide variety of prescription drugs to treat her otherwise crippling disease. That is $600,000 the pharmaceutical industry has not made, on just one patient. There are 400,000 MS sufferers in the US, and 20 million people who have been diagnosed with cancer sometime in their lives. Cancer chemotherapy is the biggest of big business, which would be directly threatened by a cheap natural plant-based alternative.

The threat to big industry profits could explain why cannabis has been kept off the market for so long. More suspicious to Pepper and other observers is the sudden push to legalize it. They question whether Big Pharma would allow the competition, unless it had an ace up its sleeve. Although the movement for marijuana legalization is a decades-old grassroots effort, the big money behind the recent push has come from a few very wealthy individuals with links to Monsanto, the world’s largest seed company and producer of genetically modified seeds. In May of this year, Bayer AG, the giant German chemical and pharmaceutical company, made a bid to buy Monsanto. Both companies are said to be working on a cannabis-based extract.

Natural health writer Mike Adams warns:

[W]ith the cannabis industry predicted to generate over $13 billion by 2020, becoming one of the largest agricultural markets in the nation, there should be little doubt that companies like Monsanto are simply waiting for Uncle Sam to remove the herb from its current Schedule I classification before getting into the business.

. . . [O]ther major American commodities, like corn and soybeans, are on average between 88 and 91 percent genetically modified. Therefore, once the cannabis industry goes national, and that is most certainly primed to happen, there will be no stopping the inevitability of cannabis becoming a prostituted product of mad science and shady corporate monopoly tactics.

With the health benefits of cannabis now well established, the battlefield has shifted from its decriminalization to who can grow it, sell it, and prescribe it. Under existing California law, patients like Pepper are able to grow and use the plant essentially for free. New bills purporting to legalize marijuana for recreational use impose regulations that opponents say would squeeze home growers and small farmers out of the market, would heighten criminal sanctions for violations, and could wind up replacing the natural cannabis plant with patented, genetically modified (GMO) plants that must be purchased year after year. These new bills and the Monsanto/Bayer connection will be the subject of a follow-up article. Stay tuned.

_________

Ellen Brown is an attorney, Founder of the Public Banking Institute, and author of twelve books, including the best-selling Web of Debt. Her latest book, The Public Bank Solution, explores successful public banking models historically and globally. Her 300+ blog articles are at EllenBrown.com. She can be heard biweekly on “It’s Our Money with Ellen Brown” on PRN.FM.

https://ellenbrown.com/2016/06/23/the-war-on-weed-is-winding-down-but-will-monsanto-be-the-winner/


Smoke and Fumes: Six Decades of Oil-Tobacco Nexus of Deception and Attacks on Science


By Brendan DeMelle
The DeSmog Blog
July 20, 2016

The Center for International Environmental Law (CIEL) today expanded its website SmokeandFumes.org, featuring a new video and more internal industry documents dating back to the 1950s that reveal the nexus between the oil and tobacco industries’ shared campaigns to undermine science in order to delay accountability and political action to curtail their deadly products.

CIEL has uncovered new evidence showing that it was the work performed for the oil industry by PR firms (particularly Hill & Knowlton) that attracted the tobacco industry to follow suit — in contrast to the prevailing narrative that Big Oil deployed the Tobacco Playbook to ward off responsibility for climate change resulting from its fossil fuel pollution.

“Again and again we found both the PR firms and the researchers worked first for oil, then for tobacco,” said CIEL President Carroll Muffett in a statement. “It was a pedigree the tobacco companies recognized and sought out.”

ExxonMobil’s excuse in the face of #ExxonKnew has, in part, relied on the defense that oil is not the new tobacco. At the end of the day, as Muffett points out in the video below, the final result is the same, regardless of who was first to devise the strategies of deception and attacks on inconvenient science.

The infamous “Doubt is our product” tobacco memo articulated the strategy most succinctly, but the whole package of deception, delay, and attacks on science has been shared, refined and endlessly deployed by both industries (and many others) since the 1950s.

It reminds me of that old “I learned it by watching you” anti-drug PSA. You’re both still busted, tobacco and oil industries. It doesn’t matter who came first.

Watch the video for the whole story, and check out SmokeandFumes.org for the incredible cache of internal documents uncovered by the Center for International Environmental Law.

And check out the earlier videos produced by CIEL about Smoke and Fumes too:

 

http://www.desmogblog.com/2016/07/20/smoke-and-fumes-six-decades-oil-tobacco-nexus-deception-and-attacks-science

 


Are Oil Trains Just Too Heavy? No Regulations, No Weigh To Know


By Justin Mikulka
The DeSmog Blog
July 7, 2016


The cause of the most recent bomb train derailment and fire, in Mosier, Oregon, has been determined to be lag bolts that had sheared off. This once again raises concerns that unit trains of oil are putting too much stress on the tracks due to their excessive weight and length.

There is precedent for this issue, according to rail consultant and former industry official Steven Ditmeyer. In the early 1990s, there was a similar problem with some double-stacked container cars being too heavy for the infrastructure — because of overloaded containers — resulting in sheared rail spikes.

“This sounds like a very similar circumstance to what was happening in the early 1990s with overloaded double stack container cars,” Ditmeyer told DeSmog.

So, since double-stacked containers are currently in wide use but there are no longer derailment issues like in the 1990s, what changed?

“Once they began weighing the containers before they loaded them — and they made sure the center plates of the trucks under the cars were lubricated so they could swivel more easily — the problem basically went away,” Ditmeyer explained.

So, by implementing a practice of weighing the containers before loading them, it was possible to avoid overloaded rail cars. Nothing too far-fetched in that reasoning.

These Oil Trains Are Too Heavy, Too Long, Too Fast

While the Mosier accident provides more evidence that unit trains of oil are putting more stress on the tracks, it isn’t the first time we have learned of this problem.

As the LA Times reported in 2015, investigators at Canada’s Transportation Safety Board suspect that the oil trains are causing unusual track damage.

“Petroleum crude oil unit trains transporting heavily loaded tank cars will tend to impart higher than usual forces to the track infrastructure during their operation,” the safety board said in a report. “These higher forces expose any weaknesses that may be present in the track structure, making the track more susceptible to failure.”

One of the other suspected causes of the oil train derailments is the length of the trains, which create repetitive stresses on the tracks not made by shorter trains. Doug Finnson, president of the Teamsters Rail Conference of Canada, expressed concerns about this to CBC News after an oil train derailment in Canada last year saying, “These trains are likely too long, too heavy and going too fast for the track conditions in place.”

Federal Railroad Administration Concerned About Train Weight in 2013 

In the weeks following the oil train disaster in Lac-Megantic, a lot of important questions were asked.

Some of these questions were posed by the Federal Railroad Administration (FRA) in a July 2013 letter to the American Petroleum Institute (API).

In the July 2013 letter, Thomas J. Herrmann, Acting Director of the Office of Safety Assurance and Compliance, addressed API CEO Jack Gerard, outlining several safety concerns, including the following:

FRA notes that tank cars overloaded by weight are typically identified when the tank cars go over a weigh-in-motion scale at a railroad’s classification yard. As indicated above, crude oil is typically moved in unit trains, and the cars in a unit train do not typically pass over weigh-in-motion scales in classification yards.

So, we know that the weight of oil trains is a safety concern. And in 2013 the FRA had information showing that the industry wasn’t using weigh-in-motion scales to check loaded tank car weights.

And we also know that using scales to weigh containers before double stacking them solved the overloading problem in the 1990s.

So what was the API’s response to this letter? We don’t know yet because last week the Federal Railroad Administration told DeSmog it will require a Freedom of Information request to get a copy. (DeSmog’s last FOI request submitted to the FRA took almost two years to get a response.)

American Petroleum Institute Makes the Rules

So why was the FRA asking the oil industry’s most powerful lobbying group questions about oil-by-rail safety?

The answer to that helps explain why unsafe oil trains continue to roll through communities across America. The American Petroleum Institute is one of the most powerful lobbying groups in America with a CEO paid as much as $13 million a year to advance the oil industry agenda.

However, the API also happens to be the organization responsible for writing the standards for “safety” for oil trains.

A Washington Post article summed it up nicely saying, “API, which still sets global technical standards…”

Still. Anyone see a problem with letting an organization that has denied the scientific facts about climate change be responsible for determining the facts and science regarding the safe operation of oil trains?

“The development of standards is a major part of API’s ongoing work to enhance safety throughout our industry,” said API President and CEO Jack Gerard in 2014. “This particular standard is one element of a much broader approach to safety improvement.”

Gerard was referring to the 2014 standard on filling oil tank cars.

It is important to remember that the more oil you can put in a tank, the more money you can make. So the oil industry has an incentive to overfill tank cars.

Despite the evidence that weighing double-stacked containers to make sure the trains weren’t too heavy eliminated the issue of sheared rail spikes, scales are not being used to avoid overloaded oil tank cars.

So, if as the FRA noted in its 2013 letter, overloaded tank cars are typically identified in the rail industry by weigh-in-motion scales — but those aren’t used for oil unit trains — the question is why not?

And the answer is because the American Petroleum Institute makes the rules. And this is what the API standard says about weighing tank cars to check for overweight cars.

Static (stationary) or weigh-in-motion (dynamic) weigh scales (railway track scales) are acceptable methods for quantity determination of crude oil. If an operator considers weigh scales as an option…

Acceptable, not required, and up to the operator. And thus the scales are not currently used.

If The Oil Is Boiling At Room Temperature, Testing Results May Be “Erroneous”

So if you choose to not weigh your tank cars to see if they are too heavy, what are your options? According to the API standard, there is the option of “Calculating the Loading Target Quantity (LTQ).”

To do this the standard says “the LTQ calculation system or process will require density as an input variable.”

And it goes on to say:

Multiple test methods exist for measuring density. Methods for determining density include API MPMS Ch. 9.1, API MPMS Ch. 9.3 or ASTM D5002 [13]. Application of the test method for density requires a dead crude oil (3.7) or field stabilization of the crude oil prior to sampling and testing. Indications of un-stabilized crude oil are visible bubbling, foaming, and/or boiling.

Never mind the basic fact that the industry may be shipping oil that is boiling at room temperature — which should make everyone worried. The problem with this approach is that all of the Bakken crude is unstabilized. So then the standard includes this note:

NOTE Use of density test methods on un-stabilized/live crude oils can yield erroneous results due to loss of light components.

The API standard notes that testing unstabilized crude for density can yield erroneous results. Which is a bit of a problem since Bakken crude is not stabilized before it’s loaded into rail tank cars.
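To make the stakes of a bad density number concrete, here is a minimal sketch of the kind of loading-target arithmetic the standard describes: the allowable cargo weight is fixed by the car's weight limit, and the volume to load is back-calculated from a measured density. Every figure and variable name below is an illustrative assumption, not a value taken from the API standard.

# Hypothetical sketch of a loading-target-quantity (LTQ) style calculation.
# All figures are illustrative assumptions, not values from the API standard.

def loading_target_volume(max_gross_lb, tare_lb, density_lb_per_gal):
    """Volume to load so the car reaches, but does not exceed, its weight limit."""
    allowable_cargo_lb = max_gross_lb - tare_lb
    return allowable_cargo_lb / density_lb_per_gal

def loaded_weight(volume_gal, actual_density_lb_per_gal, tare_lb):
    """Gross weight of the car once that volume of crude is actually loaded."""
    return tare_lb + volume_gal * actual_density_lb_per_gal

# Assumed inputs: a 286,000 lb gross rail limit, an 80,000 lb tare weight, and a
# density of 7.0 lb/gal measured on a "dead" (stabilized) sample.
MAX_GROSS, TARE = 286_000, 80_000
measured_density = 7.0  # lb/gal
target_gal = loading_target_volume(MAX_GROSS, TARE, measured_density)

# Any error in the density input flows straight into the loaded gross weight,
# and with no scales in use there is nothing downstream to catch it.
for actual_density in (6.8, 7.0, 7.2):
    gross = loaded_weight(target_gal, actual_density, TARE)
    print(f"actual density {actual_density} lb/gal -> gross {gross:,.0f} lb "
          f"({gross - MAX_GROSS:+,.0f} lb vs. limit)")

The particular numbers do not matter; the sensitivity does. On this sketch, a roughly three percent error in measured density works out to about 6,000 pounds per car, in either direction, and a weigh-in-motion scale is exactly the kind of independent check that would catch it.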

This is what happens when you let oil companies and oil lobbyists write the standards on how to safely operate oil trains.

Industry Plays By The Rules It Writes, And Wins 

It is clear that heavier trains can increase the likelihood of things like lag bolts shearing as they did in Mosier, Oregon, causing an oil train derailment and subsequent fire and spill. That is basic physics.

And in the 1990s, the problem of overloaded container trains was solved in part by weighing the containers to eliminate overloading.

After the Mosier accident, FRA Administrator Sarah Feinberg commented to the Associated Press on concerns about the oil tank cars being too heavy.

“Feinberg said tank cars that haul crude oil and other products have weight limits, but there’s been no suggestion Union Pacific’s cars exceeded them.”

The FRA was concerned about this issue in 2013. We know Canada’s Transportation Safety Board said, “Petroleum crude oil unit trains transporting heavily loaded tank cars will tend to impart higher than usual forces to the track infrastructure during their operation.”

And in the case of Mosier the sheared lag bolts certainly suggest that the trains could be too heavy.

But the reality is that whether anyone suggests it or not, there is currently no way to know. Because the American Petroleum Institute is writing the rules.

http://www.desmogblog.com/2016/07/07/are-oil-trains-just-too-long-and-heavy-tracks


Rail Industry Requests Massive Loophole in Oil-by-Rail Safety To Extend Bomb Trains Well Beyond 2025


By Justin Mikulka
The DeSmog Blog
July 21, 2016

In the most recent oil-by-rail accident, in Mosier, Oregon, the Federal Railroad Administration (FRA) concluded that the tank cars involved — the jacketed CPC-1232 type — “performed as expected.” So an oil train derailing at the relatively slow speed of 25 mph should be “expected” to have breached cars, resulting in fiery explosions.

Current regulations allow those tank cars to continue rolling on the track carrying volatile Bakken crude oil and ethanol until 2025 with no modifications.

Yet industry lobbying group the Railway Supply Institute (RSI) has now requested the Federal Railroad Administration to essentially allow these jacketed CPC-1232 tank cars to remain on the tracks for decades beyond 2025.

This was just one of the troubling facts that came to light at the National Transportation Safety Board (NTSB) roundtable on tank car safety on July 13th, and perhaps the one of greatest concern to anyone living in an oil train blast zone like Mosier, Oregon.

Just Re-Stencil It and Call It a DOT 117

One of the biggest risks with Bakken oil train accidents is that often the only way to deal with the fires is to let them burn themselves out. This can result in full tank cars becoming engulfed in flames for hours or days in what is known as a pool fire. This can lead to a “thermal tear” in the tank and the signature mushroom cloud of fire so often seen with these derailments.

The new regulations address this issue by requiring tank cars to have a layer of ceramic insulation covering the entire tank car to prevent the oil from heating up to the point of creating a thermal tear (ceramic shown in pink in the image below.)

Image credit: NTSB

However, the RSI has asked the FRA to exempt the existing jacketed CPC-1232 cars, like the ones in the Mosier accident, from the ceramic thermal protection requirement.

The industry’s argument is that the current fiberglass insulation on the CPC-1232 is sufficient protection. However, the fact that the fiberglass insulation was not designed to protect the contents of a tank car from fire does not seem to bother the RSI.

At the same time the RSI is arguing against thermal protection for CPC-1232s, the RSI has helpful videos on its website explaining the new safety features for DOT-117 tank cars — including “thermal protection.”

The NTSB’s Robert Sumwalt summed up what this request would mean in one simple statement at the July 13 round table event saying, “the same type of cars as in Mosier can be re-stenciled as DOT-117R with nothing more than a new bottom outlet valve.” [R stands for retrofit.]

So, they are essentially asking to paint over the CPC-1232 label on the tank cars with a DOT-117 while doing nothing more than changing the bottom outlet valve. Which means we should expect many more accidents like Mosier in the future since most of these CPC-1232 cars are only a few years old and they have an expected working life of 30-40 years.

As Robert Sumwalt said in his opening statement explaining why we should expect many more fiery oil train derailments with the existing tank car fleet, “just do the math.”

Industry Arguments Laughable If Not For the Consequences

Would you believe that one of the arguments made at the roundtable in favor of not requiring thermal protection on these cars was that the oil itself acts as a heat sink? Which is true. Until the point where the oil absorbs so much heat from the fire that the tank car explodes.

However, the reason this argument is given credibility is that the regulations only require a tank car to endure sitting in a pool fire for 100 minutes without exploding. Forget the fact that many of the Bakken oil train accidents have involved fires that burned for days.

This 100-minute limit was the same reasoning used to justify the fiberglass insulation on the current jacketed CPC-1232 as offering sufficient protection, as per the industry request. Which led to the following exchange between the NTSB’s Sumwalt and RSI representative John Byrne.

Byrne: “In our own modeling the fiberglass insulation system met the federal requirement for thermal protection.”

Sumwalt: “But in reality in the fiberglass situation, doesn’t the fiberglass all just melt… doesn’t it also melt and all end up pooling down in the bottom in the void between the blanket and the shell?”

Byrne: “Basically yes…but at the same time, that whole system acts as a thermal protection system in that it meets the requirement based on the federal law.”

Sumwalt: “Ok, thanks. So it meets the requirements.”

So, along with the oil itself being offered as adequate thermal protection, we also get fiberglass that melts in a fire being offered as protection for anyone in the blast zone.

So what did the regulators have to say about this absurd argument?

FRA’s Karl Alexy made it clear that “industry” concerns were receiving serious consideration saying, “we’re not taking it lightly, we understand what it means to industry… be certain that we are taking this very seriously.”

Well, we do understand what it means to the industry. Adding ceramic thermal protection would cut into profits. And one thing that was made clear repeatedly during the day’s discussion was that this was all about the money and that safety was only for people worried about “risk.”

As usual when there is a discussion about oil train safety, the oil industry lobbying group the American Petroleum Institute had a seat at the table. API representative Susan Lemieux cut to the heart of the issue with some actual honesty.

“In the industry we don’t see transportation as a risk, it is just a function of business.”

Why try to improve the situation when you don’t see any risk?

The FRA and the Pipeline and Hazardous Materials Safety Administration have informed DeSmog that they will issue a formal response to the industry’s request to allow the fiberglass to qualify as thermal protection in the near future.

The Ground Rules – Profits Over Safety

In the NTSB slide of the DOT-117 shown above, there is one other important thing to note. The shells on those tank cars are 9/16 of an inch thick. The shells of the jacketed CPC-1232 are 7/16 of an inch thick. This difference has safety implications, as the thinner shells rupture more easily. The RSI points out this fact in a video on its website about the advantages of the thicker shells on the DOT-117, which it says are “less prone to puncture.”

But the more important difference, as we have pointed out repeatedly at DeSmog, is that safer car designs are heavier, which means they can transport less oil per car. That lower capacity again cuts into profits. This point was made by ExxonMobil in a slide they presented to regulators arguing against thicker tank shells.

[Image: ExxonMobil slide presented to regulators on oil-by-rail tank car weight]
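To illustrate the trade-off behind that slide, here is a back-of-envelope sketch. The tank dimensions, steel and crude densities below are assumptions for illustration; the only inputs taken from the article are the 7/16-inch and 9/16-inch shell thicknesses and the fact that loaded cars run up against a fixed weight limit.

import math

# Back-of-envelope sketch of the shell-thickness vs. payload trade-off.
# Dimensions and densities are illustrative assumptions, not specs from the article.
STEEL_DENSITY_LB_FT3 = 490.0  # carbon steel, approximate
CRUDE_DENSITY_LB_GAL = 7.0    # assumed density of the crude being shipped

def extra_shell_weight(diameter_ft, length_ft, extra_thickness_in):
    """Added steel weight from thickening the cylindrical shell (heads ignored)."""
    shell_area_ft2 = math.pi * diameter_ft * length_ft
    extra_volume_ft3 = shell_area_ft2 * (extra_thickness_in / 12.0)
    return extra_volume_ft3 * STEEL_DENSITY_LB_FT3

# Going from a 7/16" to a 9/16" shell adds 1/8" of steel over the whole barrel.
added_lb = extra_shell_weight(diameter_ft=9.5, length_ft=50.0, extra_thickness_in=1 / 8)

# Because the loaded car's gross weight is capped by the railroads' weight limit,
# every added pound of steel displaces a pound of crude.
lost_gal_per_car = added_lb / CRUDE_DENSITY_LB_GAL

print(f"Extra steel per car: ~{added_lb:,.0f} lb")
print(f"Payload lost per car: ~{lost_gal_per_car:,.0f} gal")
print(f"Across a 100-car unit train: ~{100 * lost_gal_per_car:,.0f} gal")

Whatever the exact figures, the direction is the point: under a fixed gross weight limit, a thicker, harder-to-puncture shell comes straight out of revenue-earning capacity, which is why the economics keep pulling the industry back toward the thinner cars.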

While Exxon was not at the roundtable, plenty of oil and rail industry representatives were, and they made this point very clear.

Gabe Claypool, President of oil train operator Dakota Plains, explained why it made economic sense to use CPC-1232s over DOT-117s.

“A lot of it’s economics as well…we were just having a conversation around the sizing of the car, the 1232 car type is very much in abundance and it is also a larger car. In the current category of still trying to be profitable, if I can get that extra volume in a larger car that is still regulatorally [sic] compliant, they’re [sic] gonna stick with that.”

Richard Kloster of rail consulting firm Alltranstek was one of the more vocal participants during the roundtable, and he repeatedly made points about the economics of retrofitting the CPC-1232 over buying the new DOT-117, saying, “The retrofit is always going to win economically.”

Kloster also made it clear where the industry put its priorities when it came to safety versus profit saying, “There has got to be a balance between safety and the economic viability of moving these products by rail” and that there were a “lot of cases, you know, where economics wins all the time but risk trumps economics in some cases.”

Economics wins all the time.

There was one representative from labor at the roundtable who did not offer a comment until the final closing segment, but he too acknowledged what was driving the decision-making when, discussing the need for safety, he stated, “I know it’s about money.”

ExxonMobil Wins Again

So, in the end, ExxonMobil and the oil industry have won again. Watching this roundtable, along with the many congressional hearings and previous NTSB events of the past few years, and seeing the lack of progress on real safety improvements, it almost seems as if this was all orchestrated from the start.

In the years leading up to the latest tank car rulemaking, the industry essentially ordered a whole new fleet of CPC-1232 cars, which it is currently using. The CPC-1232 cars have the thinner tank shells, which makes them more prone to puncture and also more profitable. And they are OK to use, unchanged, until 2025. If the industry request is approved, those cars will just need new bottom outlet valves after 2025.

Regardless, they will always have the thinner tank shells, like Exxon wanted.

At the end of the July 13 event, Robert Sumwalt made an interesting statement. He said, “some of us met yesterday to go over the ground rules.”

The meeting where they went over the ground rules was not open to the public or media. If one were to hazard a guess as to what the first and foremost ground rule set was, it would be a safe bet to posit it was that “economics wins all the time.”

http://www.desmogblog.com/2016/07/19/rail-industry-lobby-petitions-massive-oil-rail-safety-loophole


Oil Change International Report: Planned Gas Pipeline Construction on East Coast Puts Climate at Risk


By Sharon Kelly
The DeSmog Blog
July 22, 2016

Nineteen now-pending pipeline projects, if constructed, would let enough natural gas flow out of the Appalachian basin to cause the entire US to blow through its climate pledges, ushering the world into more than 2 degrees Celsius of global warming, a newly released report by Oil Change International concludes.

Even if the Environmental Protection Agency’s recently-announced methane rules manage to slash leaks from new natural gas infrastructure as planned, building those pipelines would be catastrophic for the climate, the researchers warn.

“All together, these 19 pending pipeline projects would enable 116 trillion cubic feet of additional gas production by 2050,” the report, entitled A Bridge Too Far:  How Appalachian Basin Gas Pipeline Expansion Will Undermine U.S. Climate Goals, says. “The currently planned gas production expansion in Appalachia would make meeting U.S. climate goals impossible, even if the [Obama] Administration’s newly proposed methane rules are successful in reducing methane leakage by 45 percent.”

Why do these pipelines matter so much?

In part, because right now there’s a pipeline bottleneck that’s keeping some of the gas from the East Coast – states like Pennsylvania and West Virginia where the Marcellus shale has sparked a drilling rush – from being tapped. But building these proposed lines would let drillers pump much more gas out over the next several decades. “All together, these 19 pending pipeline projects would enable 116 trillion cubic feet of additional gas production by 2050,” the report concludes.

And in part, those 19 pipeline projects matter so much because building pipelines sets in motion changes that will last for decades. “New gas power plants and pipelines are designed to last at least 40 years,” the report, which was also endorsed by roughly a dozen national and local environmental organizations including 350.org, Earthworks, and the Sierra Club, notes. “Once the initial capital has been spent on them, they will likely operate even at a loss to the detriment of cleaner sources.”

“Expanded natural gas production is a bridge to climate disaster,” said Stephen Kretzmann, Executive Director of Oil Change International. “Our report shows that even if we entirely eliminated emissions from coal and oil, the emissions from the natural gas boom alone would still blow our climate budget.”

Building out gas infrastructure is risky for consumers as well as the climate. While natural gas is historically cheap at the moment, there’s always the potential for things to change dramatically over the next several decades, since the natural gas market is infamous for its sharp swings, dramatic price collapses followed by sudden spikes.

The natural gas industry often argues that gas can help with the transition to renewables because wind and solar are intermittent sources of power – the wind’s not always blowing and the sun’s not always shining. So if power companies can switch over to natural gas-fired plants when wind and solar ebb (possible because natural gas plants can be fired up or switched off faster than coal-fired plants), then the power grid can be kept stable, gas advocates argue.

The new report tackles that issue head-on, arguing that renewables are ready to stand on their own. “In many parts of the U.S., renewable energy is today the lowest-cost and lowest-impact means to add generation capacity to our electricity system,” the authors write. “Battery storage and grid management technology are ready to even out the intermittency of wind and solar. Widely held assumptions about the need for fossil fuel baseload power and limits to renewable energy penetration are unravelling fast.”

Failing to switch away from natural gas will carry serious climate consequences. If current U.S. policies are kept in place, the Energy Information Administration predicts that gas production will rise 55 percent by 2040 – and carbon emissions from energy production will drop just 4 percent, according to the Annual Energy Outlook 2016. But, the Oil Change report notes, staying below 2 degrees requires the U.S. to slash emissions 83 percent from 2005 levels by 2050.

“In other words,” it adds, “even if gas were the only source of greenhouse gases in 2040, it would still blow the U.S. carbon budget.”

And that’s taking the EPA‘s new methane rules, announced in May, into account and using conservative estimates for how much natural gas will leak from newly built pipelines. The researchers used methane leakage rates of 3.8 percent – even though they acknowledge that peer-reviewed research concludes that leaks may in reality be as high as 12 percent.
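To see why that leakage assumption matters so much, here is a rough sensitivity check that is not taken from the report: it compares the 20-year warming effect of the methane that escapes against the CO2 released when the delivered gas is burned, treating the gas as pure methane and using the 20-year global warming potential of 86 cited elsewhere on this page.

# Rough sensitivity check on the methane leakage rate (illustrative only,
# not a calculation from the Oil Change International report).
GWP20 = 86                        # 20-year global warming potential of methane
CO2_PER_CH4_BURNED = 44.0 / 16.0  # kg of CO2 produced per kg of CH4 combusted

def co2e_per_kg_burned(leak_rate):
    """20-year warming, in kg CO2e, per kg of methane actually delivered and burned."""
    leaked_per_burned = leak_rate / (1.0 - leak_rate)  # kg leaked per kg burned
    return CO2_PER_CH4_BURNED, leaked_per_burned * GWP20

for rate in (0.038, 0.12):  # the report's 3.8% assumption vs. the 12% upper estimate
    combustion, leakage = co2e_per_kg_burned(rate)
    print(f"leak rate {rate:.1%}: combustion {combustion:.1f} kg CO2e, "
          f"leakage {leakage:.1f} kg CO2e per kg of gas burned")

On these assumptions, even at the conservative 3.8 percent the near-term warming from the leaks already rivals the CO2 from burning the gas itself, and at 12 percent it dwarfs it.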

Meanwhile, crucial decisions for the climate are being made right now, at a time when a shale drilling rush powered by fracking has helped drive natural gas prices to historic lows and when coal-fired plants are increasingly being retired.

But right now, the U.S. is building infrastructure that will make the country increasingly reliant on natural gas. “In 2010, there were only 493 natural gas-powered electric plants in the United States,” the Bulletin of the Atomic Scientists reported this week. “By 2015, the number of plants more than tripled to 1,740, and nearly all of the new plants are combined-cycle ones,” or natural gas plants that are designed to provide power 24 hours a day – not as backup for renewable energy sources.

Globally, there is a rapidly closing window to make key infrastructure decisions – a window that could close as soon as next year. “Most recently, a study out of Oxford University examined the ‘2°C Capital Stock’ to see how close the world is to building the electricity generation infrastructure that, if utilized to the end of its economic life, would take the world past the 2°C goal,” the Oil Change report notes. “The disturbing conclusion they came to is that we will be there in 2017.”

The federal government has taken some steps to reduce methane leaks writ large. Earlier this month, the EPA announced rules that it said would reduce methane from landfills by roughly a third – adding up to the equivalent of 8.2 million metric tons of carbon dioxide emissions. Unlike the agency’s new rules for the oil and gas industry, the landfill rules apply to both new and existing locations.

Environmentalists have called on the Obama administration to expand its recently announced methane controls to cover existing pipelines, wells, and other oil and gas infrastructure as well. And in a new report the Government Accountability Office focused on methane leaks from federal lands, faulting the Bureau of Land Management for poor record-keeping on flaring, venting and leaks on government-owned land.

But the federal government isn’t the only regulator involved, since states shoulder most of the responsibility for oil and gas controls due to exceptions and exemptions for the drilling industry from many federal environmental statutes.

Yesterday, California moved forward with proposed rules that would drive methane emissions from new and existing sites down by as much as 45 percent. Those rules would make California the second state after Colorado to issue its own methane rules.

“California’s new methane protections are an important step forward and will rein in pollution from one of the dirtiest industries in the state,” said Andrew Grinberg of Clean Water Action in a statement yesterday responding to that announcement. “EPA should follow California’s lead and enact national methane standards for existing oil and gas infrastructure.”

Meanwhile along the East Coast, whistleblowers have warned that another major new pipeline project, the “Algonquin Incremental Market” pipeline – not among the 19 projects covered by the new report – is being constructed without proper inspection, increasing the risk of leaks and explosions, as DeSmog reported earlier this week.

“There’s a strong and vital movement of people throughout Appalachia who are standing up to protect their communities and their land from pipelines and the increase in fracking they will bring,” Mr. Kretzmann added. “This report makes it clear that all of these people are fighting for our climate too.”

http://www.desmogblog.com/2016/07/22/planned-gas-pipeline-construction-east-coast-puts-climate-risk-report


Former Inspectors Describe Dangerous Flaws in Construction of Major East Coast Gas Pipeline

 


By Sharon Kelly
The DeSmog Blog
July 19, 2016

In April, a massive explosion ripped through rural Salem Township, Pennsylvania when natural gas from a pipeline buried in a field suddenly ignited.

The Salem Township explosion offers a glimpse at how dangerous a natural gas pipeline accident can be — the blast when the 30-inch pipeline ignited blew a 12-foot deep hole in the ground and scorched 40 acres, sending one man to the hospital with burns on 75 percent of his body.

“It looked like you were looking down into hell,” a local fire chief, Bob Rosatti, told ABC News. “As far across my windshield as I could see was just a massive fireball.”

That incident is just the latest accident for Texas Eastern Transmission, a Spectra Energy subsidiary running the pipeline that exploded. In the last decade, Texas Eastern has caused dozens of accidents, resulting in over $13.7 million in property damage, according to Pipeline and Hazardous Materials Safety Administration (PHMSA) records.

A preliminary PHMSA report into the Salem Township explosion revealed that the pipeline at the explosion site had become corroded, and highlighted problems with the coating on a weld made in 1981 (though the final results of the federal investigation are still pending).

Meanwhile, Spectra is working on a billion dollar pipeline expansion project — the “Algonquin Incremental Market” (AIM) pipeline, which will dramatically extend the reach of the Texas Eastern Transmission network — the same pipeline system that exploded in Salem Township.

RELATED DESMOG ARTICLE: While Reviewing Spectra Energy Gas Pipeline Project, FERC Contractor Did Not Disclose Its Hiring by Spectra for Five Other Projects

Two former inspectors on the AIM project have come forward, alleging serious problems and lawbreaking during the construction of the AIM pipelines that put the safety of workers and the public at risk.

After unexpected delays put construction behind schedule, workers were pressured to complete sections of the pipeline quickly, leading to dangerous cut corners, the former inspectors explained.

These allegations are especially worrisome given that the AIM pipeline’s path takes it past the Indian Point nuclear power plant right outside New York City, and into New England.

When the AIM project is completed — which Spectra says will happen in November — those 42” pipelines will carry highly pressurized natural gas within 1,200 feet of Indian Point’s nuclear reactor #3. They are a foot wider than the pipeline that exploded in Salem Township, destroying a home roughly 1,500 feet away (an estimate that hasn’t been verified; PHMSA preliminarily estimated a burn radius of about a quarter of a mile).

DeSmog obtained audio recordings of interviews conducted by grassroots activists and reporters with the two whistleblowers. Both of the former inspectors independently contacted activists at The FANG Collective over their concerns about safety, environmental and historic preservation violations they personally witnessed during construction of Spectra’s AIM project, which they describe as rushed and extraordinarily unsafe.

DeSmog also interviewed one of the whistleblowers, who provided new information about risks taken by Spectra during pipeline construction.

In the recordings, the whistleblowers, both of whom have said they want to remain anonymous, detail a litany of concerns including injured workers and accidents on AIM job sites.

“We’ve had guys break their legs, burns, cuts, near misses, dropped objects, slough off in holes, working in standing water in holes, not monitoring spaces, huge violations,” Whistleblower One, a former safety inspector on the AIM project, said.

Even more troubling than the individual workplace safety violations are the consequences of Spectra’s repeated practice, as described by that whistleblower, of burying pipes before weld inspection results can be viewed — which means that if x-rays reveal weld problems, inspectors are put in an ethical bind.

In the recordings — excerpts from which are posted online and which include interviews conducted by The FANG Collective and others — Whistleblower One describes it like this:

“They’re short cutting things. They’re not inspecting stuff properly. They’re covering stuff up before an inspector’s had a chance to look at it. So what — so that inspector just watched me get fired. That inspector just watched me get fired for doing my job. Do you think that same inspector is going to go ‘look man, we’re gonna need you to dig that all back up so we can check the flange’? He ain’t going to say that. He ain’t gonna say that. Even though that’s the only way for me to verify — number one, it needs to be un-dug and taken completely apart. And then put back together. Q: Where – where – so that’s like a specific- ? A: Oh that happens all the time. That’s what I’m saying.”

That whistleblower adds:

“You know, people think that welds are weak. Welds are strong. The weld’s the strongest part of the pipe. You got a flanged connection, that’s weak. Every weld is cataloged. I know who did the weld, when they did the weld, what was their rod temperature, what was the surface temperature of the metal before they welded it. I know when it was x-rayed, how it was x-rayed. I know what the film looked like and I can go back and get it. OK? Now what you don’t do is you don’t weld a pipe or bolt something up or doing something like that and bury it underground, all done. And now what? Now you don’t have that information. So now you’re asking a guy, a Level 3 NDT [non-destructive testing] guy, to say, ‘it’s fine.’ Because if it ever fails, it’s coming back on him. But he’s got a family to feed too. And that’s the situation that I was in. I was being asked to look the other way on safety issues.”

Normally, companies wait until x-ray films are developed and inspected before burying welded pipes, industry experts say.

“The biggest problem is that once the pipe is in the ground, the burden of proof is on the inspector,” said Don Deaver, a pipeline and oil and gas industry expert who worked for Exxon Pipeline Company for over three decades and was asked about the whistleblowers’ allegations by DeSmog.

“It’s a big deal.”

Mr. Deaver also reviewed a photograph provided by Whistleblower Two, depicting a wood-and-dirt bridge crossing over a pipeline trench. That bridge could only support a 40-ton load but was crossed by a 50-ton rig, which “broke the bridge,” according to the whistleblower.

“I’ve never seen anything like it,” Mr. Deaver said, noting that if the whistleblower’s description of the incident is accurate, the four pipelines in that trench could easily have been damaged.

The last step before completely burying pipelines is supposed to be hydrostatic testing — which generally involves pumping pipelines full of water at higher than the maximum pressure they’re designed to withstand for many hours and then checking for leaks or stress points — according to Spectra company documents.

But according to Whistleblower Two, that hydrostatic testing was not properly supervised at the site where he worked.

Both inspectors described other alleged lawbreaking at job sites, including the desecration of an indigenous historical site, and a safety culture that was callous rather than careful.

Whistleblower One described workers illegally hauling potentially contaminated scrap metals to recycling yards to make some pocket change on the side. And then there were the preventable accidents, like an excavator flipping over and a crane that dropped its “headache ball,” striking a plate that fortunately wasn’t pressurized — but if it had been, that accident could have resulted in a catastrophic deadly explosion.

“It was a madhouse,” the second whistleblower told DeSmog, estimating that between 30 and 50 inspectors out of roughly 150 were fired or quit the project, an unusually high turnover rate. Asbestos-covered pipes were cut without proper safety measures, hydrostatic testing was done without supervision from qualified inspectors, and pipeline sections with faulty coatings were buried even though that risks exposing bare metal to the elements, the former inspector said. “They wanted gas to flow and that’s it, period.”

In the rush to build out pipelines nationwide over the last several years, the pipeline accident rate has skyrocketed, according to watchdog groups. Gas pipelines built between 2000 and 2010 had an accident rate of 1.289 per 10,000 miles; pipes laid since then have a rate of 6.64 per 10,000 miles, according to the Pipeline Safety Trust.

“With all the advances in technology and engineering over the past couple of decades, we would hope to see a real drop in the rate of pipeline failures,” Samya Lutz, a trust spokesperson, told Al Jazeera in December. “But that is not the case.”

The movement against the AIM pipeline expansion has been gaining traction.

In February, New York Governor Andrew Cuomo urged the federal government to stop construction, citing ongoing problems at the Indian Point nuclear reactor and announcing that New York state planned to conduct its own review of the project’s risks.

In May, Senators Chuck Schumer and Kirsten Gillibrand also urged the Federal Energy Regulatory Commission to suspend AIM construction.

“As I have said before, I have serious concerns with the Algonquin gas pipeline project because it poses a threat to the quality of life, environmental, health and safety of residents across the Hudson Valley and New York State without any long-term benefit to the communities it would impact,” said Sen. Schumer. “It presents even more safety concerns given its proximity to Indian Point.”

The Indian Point nuclear power plant is just 45 miles north of midtown Manhattan, so if a serious accident did occur, millions of people could be in harm’s way.

“Instead of responding to the concerns from the community,” said Nick Katkevich of The FANG Collective, which has sponsored protests and civil disobedience against the pipeline citing the risk of a nuclear meltdown or other catastrophe, “Spectra sped up construction and cut corners, making the project even riskier.”

Officials from Entergy Corp., which owns Indian Point, have said that the larger AIM pipeline will be buried deeper than the existing one and covered with concrete slabs.

Representatives for Spectra declined to respond to questions from DeSmog about the allegations from its former inspectors.

When previously asked about some of the allegations by Al Jazeera, Marylee Hanley, a Spectra spokesperson, “wrote in an email that the company follows ‘federal, state and applicable local regulations as well as our own rigorous standards and procedures.’ Contractors are required to report safety incidents on the job, she said, which are ‘investigated and documented, and critical learnings are shared within the organization.’”

Spectra has been caught skipping pipeline tests in the past

In a May 2, 2013 letter, PHMSA told Spectra it planned to fine the company for skipping periodic pipeline integrity tests required by law.

“When asked to produce the records showing the evaluations and assessments needed to assure the integrity of Line 10 and Line 15 of its pipeline, SET [Spectra Energy Transmission] could not comply,” PHMSA wrote. “SET’s manager for pipeline integrity of the Northeast Region, Roderick Rheume, explained that SET could not provide the records because SET had never conducted the evaluations.”

A year before, Spectra’s Texas Eastern Transmission subsidiary was in hot water for neglecting vital inspections. “By failing to inspect removed pipe, an operator can easily miss visible signs of corrosion that could result in a pipeline failure,” PHMSA wrote on December 21, 2012 as it cited the company, adding that “PHMSA investigators observed disbonded coating, atmospheric corrosion, and severe pitting in some locations” along the Texas Eastern pipeline in Louisiana.

Pipeline leaks not only pose the risk of catastrophic explosions, they also carry climate changing consequences. Natural gas is mostly made of the greenhouse gas methane, which warms the atmosphere 86 times as much as CO2 in the first two decades after it leaks.

Leaks in the nation’s roughly 300,000 miles of “transmission pipelines” — or pipelines like the Texas Eastern that carry highly pressurized gas over long distances — can be harder to detect on site because the sulfuric-smelling odorant mercaptan is generally mixed into gas after it leaves the transmission lines.

Between 2010 and 2015, 12.8 billion cubic feet of methane leaked from the nation’s natural gas gathering and transmission lines in nearly 700 incidents reported to PHMSA. Those incidents also killed 70 people and injured 300.
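For a sense of scale, a back-of-envelope conversion follows. The methane mass per standard cubic foot is an outside approximation and the warming factor is the 20-year figure cited above; neither comes from the PHMSA incident reports.

# Back-of-envelope CO2-equivalent of the reported transmission and gathering leaks.
# The per-cubic-foot methane mass is an approximation, not a PHMSA figure.
LEAKED_SCF = 12.8e9      # 12.8 billion cubic feet leaked, 2010-2015
CH4_KG_PER_SCF = 0.0192  # approximate mass of methane in one standard cubic foot
GWP20 = 86               # 20-year global warming potential of methane

leaked_ch4_tonnes = LEAKED_SCF * CH4_KG_PER_SCF / 1000.0
co2e_million_tonnes = leaked_ch4_tonnes * GWP20 / 1e6

print(f"Leaked methane: ~{leaked_ch4_tonnes:,.0f} metric tons")
print(f"20-year CO2-equivalent: ~{co2e_million_tonnes:.0f} million metric tons")

On those assumptions, the reported leaks work out to roughly a quarter of a million metric tons of methane, or on the order of twenty million metric tons of CO2-equivalent over a 20-year horizon.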

And since oftentimes companies can simply pass the cost of leaked gas on to consumers, there is less incentive to proactively hunt for leaks once gas lines are laid.

“Nationally, consumers paid at least $20 billion from 2000-2011 for gas that was unaccounted for and never used,” a 2013 Congressional staff report concluded.

As for the AIM project, the former inspectors say that state and federal laws governing pipeline safety were repeatedly broken and corners were cut, with worrisome consequences.

“I have had inspectors that have come up to me in the field and have said to me that there is a pipe buried underground that was not inspected appropriately. And the reason that it was not excavated and inspected is that it cost too much money,” said Whistleblower One. “The right thing for the inspector to do is to make them dig it back up. That’s the right thing to do. With the pressure you receive from Spectra, you will never do that.”

http://www.desmogblog.com/2016/07/19/former-inspectors-describe-dangerous-flaws-construction-major-east-coast-gas-pipeline


High-Level EPA Adviser Accused of Scientific Fraud in Methane Leak Research

High-Level EPA Adviser Accused of Scientific Fraud in Methane Leak Research

By Sharon Kelly
The DeSmog Blog
June 28, 2016

It’s one of the highest-stakes debates in the battle over climate change policy action: how much methane is spewing from oil and gas sites nationwide, and what do we do as a result? If enough of the odorless, colorless methane gas leaks or is vented into the air, scientists say, then burning natural gas — marketed as a green fuel that can help wean the U.S. off of high-carbon fuels — will actually be worse for the climate than coal, long seen as the fuel that contributes the most to global warming.

Recently, over 100 community and environmental groups sent a letter urging the Environmental Protection Agency’s internal watchdog to investigate claims that a top methane researcher had committed scientific fraud and charging that he had made false and misleading statements to the press in response to those claims.

Earlier this month, NC WARN, an environmental group, presented the EPA Inspector General with evidence it said showed that key research on methane leaks was tainted, and that one of the EPA’s top scientific advisors fraudulently concealed evidence that a commonly used tool for collecting data from oil and gas wells gives artificially low methane measurements.

The 68-page complaint dated June 8 laid out evidence that David Allen, a professor of engineering at the University of Texas who served as the chairman of the EPA’s Science Advisory Board from 2012 to 2015, disregarded red flags that his methane-measuring equipment malfunctioned when collecting data from fracked well sites, a problem that caused his University of Texas study to lowball leak rates.

“We used the terms scientific fraud and cover-up because we believe there’s possible criminal violations involved,” said NC WARN executive director Jim Warren. “The consequence is that for the past 3 years the industry has been arguing, based largely on the 2013 study, that emissions are low enough that we shouldn’t regulate them.”

Dr. Allen’s research is a part of a high-profile but controversial research series sponsored by the Environmental Defense Fund that received one third of its funding from the oil and gas industry.

In response to the NC WARN complaint, Dr. Allen issued a statement saying that his team’s data was unaffected because “we had 2-3 additional, independent measurement systems” other than the error-prone tool. But the new letter to the Inspector General labeled that response misleading, saying that in fact there “was virtually no back-up” testing and that Dr. Allen’s response continued “the pattern of covering up the underreporting of methane emissions”.

The sharp rise of the U.S. gas drilling industry over the past decade or so means that it’s crucial for policy-makers and the public to know exactly how much methane — the key ingredient in natural gas, and itself a powerful greenhouse gas that over short time frames warms the climate roughly 100 times as much as an equal amount of carbon dioxide — leaks or is deliberately vented into the atmosphere by the oil and gas industry.

In March, federal energy experts predicted that 2016 will be the first year that the U.S. burns more gas than coal to generate electricity — and if enough of that methane leaks, the switch from coal to gas may spell disaster rather than relief for the climate, scientists warn.

The problem with Dr. Allen’s research wasn’t simply that the team used a faulty tool — the Bacharach Hi-Flow methane sampler is widely used by researchers and the industry — but that Dr. Allen rejected warnings from Touche Howard, the man who invented the technology used in the tool, that the readings were artificially low, without any sound scientific justification for doing so, the complaint says.

“The problems Mr. Howard identified have not been openly addressed or corrected, resulting in the failure of the EPA to accurately report methane emissions for more than two years, much less require reductions,” NC WARN, a North Carolina-based environmental group, wrote in its complaint to the EPA’s internal watchdog. “Meanwhile, the faulty data and measuring equipment are still being used extensively throughout the natural gas industry worldwide.”

While Dr. Allen’s research is not the only time that the flawed tool was used to collect data, his two studies have been used by the oil and gas industry and its supporters to support claims that leaks and venting are too low to require federal regulation. “The Allen studies are high-profile studies that have been widely cited (197 times as of April 2016) and presented before White House and Congressional staff,” the complaint to the EPA said, “and, as such, have given policy makers and the public an incorrect view of methane emissions from production sites.”

The data Allen collected seemed to show far lower levels of methane than most other studies from the past several years.

“From the start, the paper by Allen and colleagues in 2013 seemed to have unusually low estimates for methane emissions from shale gas, certainly in comparison to most of the other recent literature,” said Cornell University’s Dr. Robert Howarth, who in 2011 authored a now-famous paper showing that natural gas might be worse for the climate than coal, just as the shale rush was beginning to take off. “Howard makes a convincing case that instrument failure explains at least part of the problem with the work of Allen and colleagues, and quite possibly with other studies upon which the US EPA has relied.”

As DeSmog reported last August, “The study’s key contribution to the science on methane leaks was that researchers were allowed to access oil and gas wells, including 27 wells where fracking was underway, and test individual pieces of equipment. ‘This is actual data, and it’s the first time we’ve had the opportunity to get actual data from unconventional natural gas development,’ Mark Brownstein, an Environmental Defense Fund associate vice president, told FuelFix when the UT study was published.”

But the Bacharach Hi-Flow methane sampler, a backpack-size portable device that is the only tool on the market designed to collect instantaneous readings of methane levels, suffers from a flaw that can compromise the data it collects. The sampler contains two meters, a sensitive one that sniffs out low levels of methane, and a more powerful one used to report higher concentrations of the gas. Mr. Howard discovered that the meter can sometimes fail to switch from low to high — meaning that big leaks would look as much as 100 times smaller than they actually were.
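To make that failure mode concrete, here is a simplified sketch of what a stuck range transition does to a reading. The threshold and concentrations below are invented for illustration and do not reflect the actual specifications or firmware of the Bacharach instrument:

```python
# Hypothetical illustration of the range-transition failure described above.
# The ceiling and test values are assumptions chosen only to show how a
# sampler stuck in its low range can make a large leak look up to ~100x
# smaller; this is not the Bacharach Hi-Flow's real calibration or logic.

LOW_RANGE_CEILING = 0.01   # hypothetical maximum reading of the low-range sensor

def reported_reading(true_fraction: float, transition_works: bool) -> float:
    """Return the methane fraction the sampler would report."""
    if transition_works or true_fraction <= LOW_RANGE_CEILING:
        return true_fraction           # healthy instrument: faithful reading
    return LOW_RANGE_CEILING           # stuck in low range: reading is capped

for true in (0.005, 0.05, 0.5, 1.0):   # hypothetical true methane fractions at a leak
    healthy = reported_reading(true, transition_works=True)
    failed = reported_reading(true, transition_works=False)
    print(f"true={true:.3f}  healthy={healthy:.3f}  failed={failed:.3f}  "
          f"understated ~{true / failed:.0f}x")
```

The final row shows the worst case: a reading capped at the low-range ceiling understates the largest leak by a factor of about 100, the scale of error Mr. Howard describes.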

That means that Allen’s conclusions could have seriously underestimated how bad the leaks were. “Over 98% of the [methane] inventory calculated from their own data and 41% of their compiled national inventory may be affected by this measurement failure,” the Howard paper concluded.

While the problems were reported in the peer-reviewed literature, Dr. Allen failed to adjust his findings, the complaint alleges. “Rather than act on this information by disclosing it to other research participants, Dr. Allen misused his authority, and gave false or misleading information to the EDF Production Group between October 2013 and January 2014, minimizing Mr. Howard’s concerns,” NC WARN said in its complaint.

“I’ve gone to Dr. Allen repeatedly and asked him to address these issues, and since they haven’t been addressed, unfortunately, at this point, I think that is the only solution,” Howard told InsideClimate News.

After NC WARN’s complaint was made public, Dr. Allen said that he stood by the data he had collected. “Our study team strongly asserts that the instrument we used and the measurements we made were not impacted by the claimed failure,” Allen said in a statement. Dr. Allen previously published a peer-reviewed note responding to Mr. Howard’s research, but Mr. Howard said that the note had failed to adequately address concerns.

But in a June 24 letter, NC WARN’s Jim Warren said that Dr. Allen had withdrawn his rebuttal and that the independent measuring systems that Dr. Allen had described “actually confirmed the occurrence of BHFS sensor failure”.

For its part, the Environmental Defense Fund downplayed the importance of Allen’s research to its overall conclusions on methane leak rates, but said that it planned to continue incorporating Allen’s data in its final analysis.

“As always, EDF welcomes – and indeed, we encourage – honest and open review of any science we’re involved with, including these two papers,” EDF said in its statement. “But the overall impact of the questions raised on national methane emissions rates is limited. It’s important not to overestimate their individual significance in [the] vast catalog of studies published in the two-and-a-half years since Dr. Allen’s first study was released.”

Overall, scientific fraud is more common than one might think. “Every day, on average, a scientific paper is retracted because of misconduct,” The New York Times reported last year. “Two percent of scientists admit to tinkering with their data in some kind of improper way. That number might appear small, but remember: Researchers publish some 2 million articles a year, often with taxpayer funding.”

But the stakes in methane leak research are extraordinarily high, given both the extreme consequences of climate change and the financial might of the oil and gas industry, which argues that regulation could drive companies out of business.

NC WARN’s allegations drew the attention of D.C.’s oldest and most widely circulated newspaper, The Washington Post.

“It’s time to listen to Howard and revisit the study that found a lower level of methane emissions, given what’s known about the quirks in the device that monitored them,” the Post said in its coverage of the complaint.

Earlier this year, the EPA announced new rules designed to reduce methane leaks — but those rules will only apply to new and modified oil and gas operations, not to the nation’s extensive oil and gas facilities already in place. So the new rules won’t apply to aging and decaying pipeline networks or older gas wells, for example. Instead, the EPA is collecting more data about those leaks, a process that won’t end before the Obama administration leaves office.

And given the accelerating pace of climate change, environmentalists are chafing at the slow pace of regulation.

“The EPA’s failure to order feasible reductions of methane leaks and venting has robbed humanity of crucial years to slow the climate crisis,” said Mr. Warren. “The cover-up by Allen’s team has allowed the industry to dig in for years of delay in cutting emissions at the worst possible time.”

http://www.desmogblog.com/2016/06/28/high-level-epa-adviser-accused-scientific-fraud-methane-leak-research
