Wednesday, January 18, 2017



Greenie climate models fail on a regional scale too

Are General Circulation Models Ready for Operational Streamflow Forecasting for Water Management in the Ganges and Brahmaputra River Basins?

Safat Sikder, et al.

Abstract

This study asks the question of whether GCMs are ready to be operationalized for streamflow forecasting in South Asian river basins, and if so, at what temporal scales and for which water management decisions they are likely to be relevant. The authors focused on the Ganges, Brahmaputra, and Meghna basins, for which there is a gridded hydrologic model calibrated for the 2002–10 period. The North American Multimodel Ensemble (NMME) suite of eight GCM hindcasts was applied to generate precipitation forecasts for each month of the 1982–2012 (30-year) period at up to 6 months of lead time, which were then downscaled according to the bias-corrected statistical downscaling (BCSD) procedure to daily time steps. A global retrospective forcing dataset was used for this downscaling procedure. The study clearly revealed that a regionally consistent forcing for BCSD, which is currently unavailable for the region, is one of the primary conditions to realize reasonable skill in streamflow forecasting. In terms of relative RMSE (normalized by reference flow obtained from the global retrospective forcings used in downscaling), streamflow forecast uncertainty was found to be 38%–50% at monthly scale and 22%–35% at seasonal (3-monthly) scale. The Ganges River (regulated) experienced higher uncertainty than the Brahmaputra River (unregulated). In terms of anomaly correlation coefficient (ACC), streamflow forecasting at seasonal (3-monthly) scale was found to have less uncertainty (ACC less than 0.3) than at monthly scale (ACC less than 0.25). The forecast skill in the Brahmaputra basin showed more improvement than in the Ganges basin when the time horizon was aggregated from monthly to seasonal. Finally, the skill assessment for the individual seasons revealed that flow forecasting using NMME data had less uncertainty during the monsoon season (July–September) in the Brahmaputra basin and the postmonsoon season (October–December) in the Ganges basin.
Overall, the study indicated that GCMs can have value for management decisions only at seasonal or annual water balance applications at best if appropriate historical forcings are used in downscaling. The take-home message of this study is that GCMs are not yet ready for prime-time operationalization for a wide variety of multiscale water management decisions for the Ganges and Brahmaputra River basins.
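For readers unfamiliar with the two skill metrics quoted in the abstract, they can be sketched in a few lines (a minimal illustration assuming simple one-dimensional flow series; this is not the authors' code, and the function and array names are hypothetical):

```python
import numpy as np

def relative_rmse(forecast, reference):
    """Relative RMSE: root-mean-square error normalized by the mean reference flow."""
    rmse = np.sqrt(np.mean((forecast - reference) ** 2))
    return rmse / np.mean(reference)

def anomaly_correlation(forecast, observed, climatology):
    """ACC: correlation between forecast and observed anomalies,
    both taken relative to the same climatology."""
    fa = forecast - climatology   # forecast anomaly
    oa = observed - climatology   # observed anomaly
    return np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2))
```

Relative RMSE normalizes the error by the mean reference flow, so basins with very different discharge volumes can be compared on one scale, while ACC measures whether forecast departures from climatology move in the same direction as the observed departures.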

SOURCE





Safe and healthy (not pristine) air

Federal air quality rules must be based on science – not used to stifle energy and industry

Paul Driessen

It’s called the Clean Air Act, but it was never intended to ensure pure, pristine air. Congress wanted America to have safe, healthy air, and regulations based on solid scientific and medical studies.

The law says costs cannot be considered where human health and safety are actually at stake. But legislators also understood that efforts to bring emissions to zero are unnecessary, technologically impossible, extremely expensive, harmful to electricity generation, factory output, job creation and retention, and living standards – and thus likely to reduce human health, wellbeing and longevity.

The Obama Environmental Protection Agency ignored these facts and employed highly dubious analyses to justify stringent new emission standards that impose enormous costs for no health benefits. The new Congress and Trump Administration must now restore integrity, rigor and balance to the process.

A good place to begin is with EPA’s rules for fine particulates: PM2.5, soot particles smaller than 2.5 microns (a fraction of the size of pollen and mold spores). EPA claims reducing PM2.5 emissions from power plants, factories, refineries, petrochemical plants, cars, light trucks, and diesel-powered vehicles and heavy equipment will save countless lives. In fact, it says, nearly all the (supposed) benefits from its Clean Power Plan and other recent rules are actually “ancillary benefits” from reducing PM2.5 levels.

Premature mortality is “associated with” fine particle pollution “at the lowest levels measured,” Obama EPA Administrator Gina McCarthy has said. “There is no level at which premature death does not occur.” If we could further reduce particulate pollution, previous Obama EPA chief Lisa Jackson told Congress, it would be like “finding a cure for cancer” – hundreds of thousands of lives saved.

These assertions have no basis in reality. Even EPA’s own studies show they are predicated on two things: epidemiological analyses that count deaths within normal variations in death rates and attribute them to soot emissions; and experiments that unethically exposed humans to PM2.5 concentrations at levels which EPA says cause cardiovascular and respiratory disease, cancer and people “dying sooner than they should.”

The agency’s air pollution epidemiological studies are compromised by uncontrollable “confounding factors.” No data exist on actual individual exposure levels, so researchers cannot reliably attribute specific deaths to particulates, emergency room physician John Dunn explains. Moreover, PM2.5 particles emitted by vehicles, power plants and factories cannot be separated from particles from volcanoes, forest fires, construction projects, dust storms, agricultural activities, and even cigarettes, which send hundreds of times more tiny particles into lungs than the levels EPA says are lethal when they come from sources it regulates.

Nor does a death certificate determine whether a death was caused by airborne particles – or by viruses, bacteria, dietary and exercise habits, obesity, smoking, diabetes, cold weather or countless other factors.

If particulates are a short-term cause of death, there should be a clear association between bad air and deaths within clusters of similar areas, and effects should be consistent across clusters, notes statistician Stan Young in discussing causation versus association. However, a recent re-analysis of 1969-1974 data from 533 US counties confirmed the earlier finding: improved air quality did not reduce mortality.

Similarly, in 2002, Canadian forest fires sent massive amounts of smoke (composed largely of PM2.5 particles) into Boston and New York City. EPA doctrine says deaths should have shot up, but they did not. In 2008, forest fires in California engulfed Los Angeles in smoke and PM2.5 soot, but again deaths did not increase. In fact, they were below normal as soot levels soared during the fires.

EPA has not proposed a plausible medical explanation to support its claim that super-tiny particles cause multiple diseases and kill people by getting into their lungs or bloodstreams. It just counts deaths during arbitrarily chosen intervals of days, and says differences in the number dying in relation to air pollution levels represent “premature” deaths – rather than the fact that more people die on some days than others.

People certainly did die during some atmospheric inversions that trapped large quantities of airborne chemicals in urban areas like London in 1952. However, those pollutants have been dramatically reduced in America’s air. For example, since 1970 US cars have reduced tailpipe pollutants by 99% and coal-fired power plants have eliminated over 90% of their particulate, sulfur dioxide and nitrogen oxide emissions.

EPA thus sponsored 20 years of lab experiments that exposed human test subjects to high air pollution levels. That raises legal, ethical and scientific problems. US laws, the Nuremberg Code, the Helsinki Accords and EPA Rule 1000.17 make it unethical or illegal to conduct toxicity experiments on humans.

In addition, researchers failed to advise volunteers that EPA claims the pollution they were going to breathe is toxic, carcinogenic and deadly. Moreover, many of the human guinea pigs were elderly, asthmatics, diabetics, people with heart disease and even children – the very people EPA claims are at greatest risk and most susceptible to getting sick or dying from the pollutants volunteers would breathe.

Finally, test subjects were exposed to eight, thirty or even sixty times more particulates per volume of inhaled air than they would breathe outdoors during routine physical activities, for periods of up to two hours. And yet, they did not get seriously ill or die. That raises important questions:

* If PM2.5 particulates are dangerous or lethal when emitted by factories or vehicles, and there is no safe threshold – how can those same pollutants be harmless to people who were intentionally administered pollution many times higher, and for longer periods, than they would encounter in their daily lives? Why didn’t those test subjects have seizures, develop lung, cardiac or cancer problems, or die?

* If they did not, how can EPA say there is no safe level, all PM2.5 particulates are toxic, its regulations are saving countless lives, and regulatory benefits vastly outweigh their multi-billion-dollar annual costs?

Simply put, there is no basis for these claims – or for the Obama EPA’s war on fossil fuels and factories.

America’s air is healthy and safe. EPA’s PM2.5 emission standards and regulations are clearly based on bald assertions, rank conjecture, epidemiological studies that provide no scientific support for the agency, and human testing that actually proves small particulates pose no toxic or lethal risk to human health, even at levels dozens of times higher than what EPA claims are dangerous or lethal in outdoor air.

Any computer models based on these assertions and studies are thus garbage-in, garbage-out exercises that provide no valid basis for claims about lives saved or regulatory benefits exceeding costs.

(A thorough analysis of this untenable situation can be found in JunkScience.com director Steve Milloy’s new book, Scare Pollution: Why and how to fix the EPA, which documents the ways EPA uses deceptive tactics to frighten people into believing the air they breathe is likely to sicken or kill them.)

The incoming Trump EPA needs to conduct its own internal review of existing agency PM2.5 claims, documents, emission levels and regulations – and fund an independent review by respected medical experts – to determine whether they are based on honest, replicable science. If they are not, everything based on the fraudulent PM2.5 pollution narrative should be subjected to a total do-over.

While all that is being done, EPA should suspend implementation of all policies, guidelines and rules based on the scheme. It must also inform legislators, journalists and citizens about the facts – and clearly and vigorously address inevitable environmentalist objections and denunciations.

The new EPA and Congress should also require that all past, current and future researchers make their raw data and methodologies available for outside peer review. They should stop funding activist groups that have engaged in collusive lawsuits or rubberstamped EPA actions, including the American Lung Association. Last, they should fully reform the agency’s supervisory panels, board of scientific counselors and Clean Air Act Scientific Advisory Committee (CASAC) – and repopulate them with experts who do not have government grant or other conflicts, and will bring integrity and rigor to the scientific process.

These steps will help make EPA credible and accountable, and its actions based on solid science.

Via email





In EPA rebuke, judge orders quick evaluation on coal jobs

CHARLESTON, W.Va. — A judge has ordered federal regulators to quickly evaluate how many power plant and coal mining jobs are lost because of air pollution regulations.

U.S. District Judge John Preston Bailey in Wheeling made the ruling after reviewing a response from outgoing U.S. Environmental Protection Agency Administrator Gina McCarthy.

McCarthy had responded to the judge’s previous order, in a lawsuit brought against her by Murray Energy Corp., that the EPA must start doing an analysis it hadn’t done in decades.

According to Wednesday’s order, McCarthy asserted it would take the agency up to two years to devise a methodology to use to try to comply with the earlier ruling.

“This response is wholly insufficient, unacceptable, and unnecessary,” Bailey wrote.

The judge said the EPA is required by law to analyze the economic impact on a continuing basis when enforcing the Clean Air Act and McCarthy’s response “evidences the continued hostility on the part of the EPA to acceptance of the mission established by Congress.”

Bailey ordered the EPA to identify facilities harmed by the regulations during the Obama presidency by July 1. That includes identifying facilities at risk of closure or reductions in employment.

The EPA had contended that analyzing job loss would not change global energy trends.

The judge also set a Dec. 31 deadline for the EPA to provide documentation on how it is continuously evaluating the loss and shifts in employment that may result from administration and enforcement of the Clean Air Act.

The EPA said it was reviewing the ruling, first reported by the Wheeling Intelligencer and News-Register. A Murray Energy spokesman didn’t immediately offer comment.

Murray Energy and other coal companies have blamed thousands of layoffs this decade on President Barack Obama’s anti-global-warming push that imposes limits on carbon pollution from coal-fired power plants. The U.S. Supreme Court has delayed implementation of Obama’s Clean Power Plan until legal challenges are resolved.

West Virginia’s economy is reliant on coal mining and gets 96 percent of its electricity from coal-fired plants.

McCarthy has said no administration has interpreted federal law to require job impact analysis for rulemaking since 1977. She said the most that the EPA does is “conduct proactive analysis of the employment effects of our rulemaking actions,” but that has not included investigating power plant and mine closures and worker dislocations on an ongoing basis, according to the order.

Bailey wrote that the EPA can recommend amendments to Congress if it feels strongly enough.

“EPA does not get to decide whether compliance with (the law) is good policy, or would lead to too many difficulties for the agency,” Bailey wrote. “It is time for the EPA to recognize that Congress makes the law, and EPA must not only enforce the law, it must obey it.”

President-elect Donald Trump, who has promised to overturn many of the EPA’s regulations on coal, has selected Oklahoma Attorney General Scott Pruitt to head the agency. Pruitt has repeatedly sued the EPA since becoming Oklahoma’s attorney general in 2011, including joining with other Republican attorneys general in opposing the Clean Power Plan.

SOURCE





Scrutinizing Sen. Carper's Questions for EPA Nominee Pruitt

Yesterday, The Hill’s Timothy Cama reported that Sen. Tom Carper (D-Del.), ranking member of the Senate Environment and Public Works Committee, is unwilling to hold a confirmation hearing on President-elect Trump’s choice for Environmental Protection Agency administrator until the nominee, Oklahoma Attorney General Scott Pruitt, answers a seven-page questionnaire.

Below, I’ve answered 13 representative questions, in no particular order. A few of Carper’s questions don’t make a lick of sense (see questions 2, 3, 10). Others were based on factual inaccuracies or otherwise demonstrate the Senator’s ignorance of how the EPA works (see questions 8, 12, and 13). Some questions served to demonstrate the excesses of the EPA during the Obama era (see questions 9 and 11). Finally, at least one of his questions serves no purpose other than to gum up the nomination process (see question 7).

Question #1: Do you agree with this statement from NASA: “97% or more of actively publishing climate scientists agree: Climate-warming trends over the past century are extremely likely due to human activities.”? If not, please explain why you do not agree.

Response #1: According to Professor Richard Tol, who has been involved with the Intergovernmental Panel on Climate Change since 1994, the “97% consensus” claim is a “bogus” number that is based on a statistical manipulation. But assuming for the sake of argument that it’s true, the statement is of minimal utility for policymaking at the EPA. For starters, it tells us nothing about how a changing climate influences human well-being, which must be the primary metric from a policymaking standpoint. In fact, there is great uncertainty regarding the magnitude and effect of projected climate change. Finally, the statement tells us nothing about costs and feasibility of greenhouse gas controls, which must be considered under the Clean Air Act. As such, the nominee’s agreement or disagreement with the statement is immaterial.

Question #2: What is your definition of sound science?

Response #2: Carper is asking Pruitt to define the modifier “sound,” which is inherently subjective. Personally, I believe that there must be at least a 98% consensus before science is “sound”; a 97% consensus is insufficient. But seriously, this question is impossibly imprecise. It’s like asking someone to define the color blue.

Question #3: Prior to your nomination, how have you acquired scientific information relevant to the missions of the EPA? And since your nomination?

Response #3: Again, this is a strangely imprecise question. Does it count when Pruitt watches Shark Week with his kids? I assume Pruitt processes “scientific information” constantly, in addition to “legal information” and “family information” and “sports information.”

Question #4: Please list all undergraduate and postgraduate science courses that you have taken. Please describe any other science education that you have completed over the years beyond high school.

Response #4: Is this some sort of litmus test? If so, does this mean that Janet McCabe, the head of EPA’s Office of Air & Radiation—which is the most powerful public health regulatory body at EPA—is unfit for the job? According to her bio, she went to Harvard Law School and then worked as an Assistant Attorney General in Massachusetts, which is a very similar background to Pruitt. In a similar vein, does Al Gore’s science-free tertiary education render him unfit to head the EPA?

Question #5: What degree of scientific certainty should the EPA have about a potential health or environmental threat before acting to protect people from that threat?

Response #5: Of course, the answer depends on the statutory provision in question. The EPA only exists to the extent it has been created by Congress through organic laws that empower the agency with its authorities. So the necessary degree of scientific certainty, and whether or not costs factor into policymaking decisions, are determined by the law. I suspect it is Pruitt’s intention as EPA head to follow the law, unlike the current administration, which has expansively interpreted the law so as to grow the agency’s authority at the expense of Congress and the states.

Question #6: Please provide a list of all financial contributors to your attorney general and state senate campaigns, including their total donations and affiliations.

Response #6: I presume the silly logic behind this question is that Pruitt has been bought. Does this mean that Obama was bought by “Big Oil” when he took $900,000 from them in 2008? Better yet, let me see a list of all financial contributors to your campaigns, Senator Carper.

Question #7: Please provide a list of all the cases, briefs and other legal actions that your office has filed while you have served as attorney general.

Response #7: This looks suspiciously like a tactic to bog down the nomination process. Why would Carper want the non-environmental “cases, briefs, and other legal actions” filed by the Oklahoma Attorney General? We’re talking about the nominee for the Environmental Protection Agency, and Sen. Carper is the ranking member of the Senate Environment & Public Works Committee. Why would he want documents relating to larceny or murder? Complying would likely entail hundreds of thousands of pages, none of which have anything to do with environmental policy. Does that make sense?

Question #8: Every year during your tenure as Oklahoma Attorney General, the American Lung Association gave Oklahoma counties a failing grade for not meeting ozone air pollution health standards. In fact, your home town of Tulsa is ranked 18th out of 228 metropolitan areas for high alert ozone days. Are you concerned about the impacts of soot and smog pollution on Oklahoma citizens? What efforts have you undertaken as Oklahoma Attorney General to protect Oklahomans from soot and smog pollution?

Response #8: EPA is required to set ambient air quality standards for smog and soot at a level that is “requisite to protect public health” with an “adequate margin of safety.” That is, national standards must be set even more stringently than what is strictly necessary to protect public health. According to the EPA, no counties in Oklahoma fail to attain these stringent health standards. So I don’t have any idea what the American Lung Association is talking about. I should note that the accuracy of ALA’s annual air quality grades was recently challenged by Colorado air quality officials.

Question #9: In your joint brief against the Mercury and Air Toxics Standards, it stated “human exposure to methylmercury from coal-fired electric generating units is exceedingly small.” What is the scientific basis for this statement?

Response #9: The scientific basis for this statement is the EPA itself, according to which it was “appropriate and necessary” to regulate mercury from power plants in order to protect a putative population of pregnant subsistence fisherwomen who, during their pregnancies, eat more than 200 pounds of self-caught or family-caught fish drawn exclusively from the top ten percent most polluted bodies of fresh inland water, despite all of the signs that say “DO NOT EAT FISH FROM THIS RIVER IF YOU ARE PREGNANT.” I don’t believe these women exist, and EPA did not provide any examples; instead, they were modeled to exist. While the rule’s “benefits” are indeed “exceedingly small,” its costs of $10 billion annually are exorbitant.

Question #10: Who serves as your scientific advisor for climate change related issues during your time as attorney general? Please provide their name, their title and when they served as your science advisor.

Response #10: This is a very silly question. Why would the Oklahoma Attorney General have a climate change science advisor? Oklahoma is litigating one case related to climate change—should it have a discrete science advisor for each case it undertakes? What role would such a climate science advisor play? I fail to see how interpreting the limits of EPA’s statutory authority under the Clean Air Act requires knowledge of climate science.

Question #11: In 2013, you argued that the EPA’s decision to impose a Federal Implementation Plan on Oklahoma to address Regional Haze would cost more than $1 billion over 5 years. It is three years later. Do you still agree with this cost assessment? If not, why not?

Response #11: The $1 billion referred to the cost of four sulfur-dioxide scrubbers at four coal-fired power plants operated by Oklahoma Gas & Electric. Indeed, the primary justification for EPA’s federal plan was that the agency disagreed with the state’s estimate of what the scrubbers would cost. In 2014, the utility proposed a $1.1 billion plan for the four scrubber retrofits, but the plan was rejected by state regulators. As a result, the utility submitted a different plan to spend $500 million on two scrubbers and about $70 million on converting the other two coal-fired power plants to gas-fired plants. Due to these changes, the current cost of the EPA federal plan is about $570 million.

However, the actual costs of the scrubbers align with what Oklahoma had estimated—i.e., the basis for EPA’s federal plan was disproved. More importantly, the “benefits” of the rule are literally imperceptible to the human eye. Thus, EPA’s takeover of the Oklahoma Regional Haze program demonstrates much of what was wrong with the Obama-era EPA.

After the state of Oklahoma spent countless hours and resources putting together a visibility strategy, EPA rejected the state plan and then imposed a federal plan which cost $570 million more in order to achieve a visibility “improvement” that is literally invisible. Afterwards, events demonstrated that Oklahoma had been right all along.

Question #12: As attorney general, what types of environmental justice cases have you pursued? Please provide a list of cases and outcomes.

Response #12: I can’t find any evidence that Obama’s EPA brought a single environmental justice case. Given that environmental justice is a federal concept, and that EPA hasn’t brought any such cases during Obama’s tenure, I don’t understand why the Attorney General of Oklahoma would have done so.

Question #13: Would you explain your recent challenges to EPA’s finding that it is appropriate and necessary to regulate the emissions of carbon dioxide and hazardous air pollutants from power plants?

Response #13: This question gets the Clean Air Act wrong. Along with more than 20 other states in addition to Oklahoma, Pruitt did indeed challenge EPA’s determination that it was “appropriate and necessary” to regulate the emissions of hazardous air pollutants from power plants. He did so based on the EPA’s own science, as I explain in Response #9. Almost two years ago, the Supreme Court sided with Pruitt and the other challengers in determining that EPA was required to take costs into account when it rendered this “appropriate and necessary” determination.

However, contrary to Sen. Carper’s query, the EPA never issued a finding that it is “appropriate and necessary” to regulate greenhouse gases from power plants. Instead, at the end of 2009 EPA issued a determination that tailpipe emissions of greenhouse gases from motor vehicles “endangered” public health and welfare. This was the determination that Pruitt unsuccessfully challenged based, inter alia, on the argument that the structure and design of the Clean Air Act strongly suggest that it was not intended to regulate greenhouse gases.

SOURCE






Australia: Unhinged electricity policy of the Leftist Queensland government

Everyone remembers the slogan: Queensland — beautiful one day, perfect the next. I have to inform you there has been an update: Queensland — beautiful one day, insane the next.

The idea that the state could achieve a target of 50 per cent of electricity generated by renewable energy by 2030 is bizarre, unachievable and mischievous — in a word, it is insane. And it is not just because such a target would drive up electricity prices for households and businesses to the high levels of South Australia — probably higher. It also would destroy the value of most of the electricity assets held by the Queensland government. Talk about shooting yourself in the foot.

Given Queensland’s extreme level of government debt, there is no doubt that, in due course, most of the government-owned corporations will be sold, particularly if the cost of servicing the debt were to escalate. The tragedy is that it is likely the value of most of these assets will have fallen through the floor by then.

In the meantime, the flow of dividends that the government is relying on to create the appearance of fiscal rectitude will dry up, even if the present unconventional directive of ordering a payout ratio of 100 per cent of profits of the government-owned corporations continues.

An important question is: why would the Palaszczuk government opt for such an economically harmful and foolish policy? We should not forget that Queensland has the lowest percentage of electricity generated by renewable energy — at just more than 4 per cent.

So the policy involves an increase of 46 percentage points in the penetration of renewable energy as a source of electricity generation in the space of 13 years. Pull the other one.

To provide cover for this madcap policy, the Queensland government appointed a “renewable energy expert panel” to provide a veneer of credibility to the feasibility of the target.

With carefully chosen panel members, the draft report — unsurprisingly — concluded that there were no problems with reaching the target and that electricity costs to households and businesses in Queensland would probably stay steady. Again, pull the other one, but I am running out of other ones.

We should just take a look at the figures. There will need to be between 4000 and 5500 megawatts (MW) of new large-scale renewable energy capacity between 2020 and 2030 – something that has not been achieved even for Australia as a whole over a comparable period. The consensus view is that 1500MW of additional renewable energy a year is the top of the range for Australia, and Queensland accounts for only about 15 per cent of that total.
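The arithmetic behind that objection can be laid out explicitly, using only the figures already quoted (the 15 per cent share is the author's rough estimate of Queensland's slice of national build capacity, and the 10-year window is the 2020–2030 period in the text):

```python
# Back-of-envelope feasibility check of Queensland's renewables target,
# using the figures quoted in the article.

national_ceiling_mw_per_year = 1500  # consensus top of the range for all of Australia
queensland_share = 0.15              # Queensland's rough share of that national total
years = 10                           # the 2020-2030 build window

achievable_mw = national_ceiling_mw_per_year * queensland_share * years
required_mw_low, required_mw_high = 4000, 5500

print(achievable_mw)  # 2250.0 MW achievable, versus 4000-5500 MW required
```

On these assumptions Queensland could plausibly add about 2250MW over the decade, roughly half the low end of what the target demands.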

And don’t you just love the prediction of the panel that electricity prices will remain steady for households and business in Queensland as a result of the government’s bold, go-it-alone policy? The background to this, as noted by the Queensland Productivity Commission, is that “since 2007, Australian residential retail electricity prices have increased faster than any other OECD country and Queensland prices have increased faster than any other state or territory”.

Mind you, it is clear why the Palaszczuk government didn’t simply ask the Queensland Productivity Commission to analyse the feasibility of the 50 per cent state renewable energy target. That would be because it wouldn’t be seen as “reliable”, having made the wholly rational suggestion last year that the state government withdraw the generous and unjustified subsidies to households with solar panels on their roofs.

Premier Annastacia Palaszczuk was not having a bar of that idea. How could she continue to conflate small-scale solar panels with large-scale renewable energy, thereby buttressing the support of the public (well, the better-heeled part of the public that can afford solar panels) for anything called renewable energy? If X is good, 2X must be better and 12X must be a blast. Continuing to subsidise households with solar panels is part of the political game, hang other electricity users.

So what does that “independent” panel conclude about the impact of the 50 per cent renewables energy target on electricity pricing? The answer is “broadly cost neutral to electricity consumers where the cost of funding the policy action is recovered through electricity market mechanisms”. (This is code for: we could always skin taxpayers or ask Canberra to chip in.)

But here’s the rub: “This occurs as a result of increased renewable generation placing downward pressure on wholesale electricity prices, which is projected in the modelling to offset the payments to renewables.”

Mind you, the point is added that “the pricing outcome is not guaranteed and could differ, for example, if existing generation capacity is withdrawn from the market, especially coal-fired generation”.

Think about this. What the panel is saying is: if existing generators, which are owned by the government in Queensland, are driven out of the market, which is likely because of the renewables energy target — see the South Australian and Victorian cases as live examples — then prices will rise. And the capital value of these withdrawn government-owned generators will be close to zero, having probably experienced years of underinvestment in maintenance.

This leaves the question: why would the Queensland government decide on such a dimwitted, self-defeating and economically damaging policy position?

In keeping with the rule of following the money, it is clear that the lobbying efforts of the clean energy rent-seekers have been directed at the Queensland government, in particular.

After all, the large energy providers generally have a foot in both camps — conventional electricity generation plus renewable energy assets.

But they don’t stand to lose anything in Queensland by virtue of the astronomical state renewable energy target because the conventional electricity generation assets are all owned by the government. If these generators are driven out of business, it’s a big plus for them, not a negative.

Silly estimates of the gains in employment and billions of dollars of investment, mainly in the regions, associated with renewable energy make gormless politicians simply salivate. The sad thing is that it will be lose-lose for Queenslanders down the track.

The challenge for federal Energy Minister Josh Frydenberg is to convince state governments to junk their vacuous, go-it-alone renewable energy targets that will lead to even higher electricity ­prices and further threaten the reliability of the grid.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Tuesday, January 17, 2017



Vatican Invites ‘Population Bomb’ Hoaxer Paul Ehrlich to Address Biodiversity Conference

The Vatican has invited the most notorious population alarmist in recent history to speak at an upcoming Vatican-run conference titled “Biological Extinction.”

The conference, sponsored jointly by the Pontifical Academy of Science and the Pontifical Academy of Social Sciences, will address issues of biodiversity, “great extinctions” of history, population and demographics.

Dr. Paul R. Ehrlich—who has defended mass sterilization, sex-selective abortion and infanticide—will speak on “Causes and Pathways of Biodiversity Losses: Consumption Preferences, Population Numbers, Technology, Ecosystem Productivity.”

To allow women to have as many children as they want, Ehrlich has said, is like letting people “throw as much of their garbage into their neighbor’s backyard as they want.”

Ehrlich became famous through the publication of his 1968 doomsday bestseller, The Population Bomb, which generated mass hysteria over the future of the world and the earth’s ability to sustain human life.

In the book, Ehrlich launched a series of frightening predictions that turned out to be spectacularly wrong, creating the myth of unsustainable population growth.

He prophesied that hundreds of millions would starve to death in the 1970s (and that 65 million of them would be Americans), that already-overpopulated India was doomed, and that odds were fair that “England will not exist in the year 2000.”

Ehrlich concluded that “sometime in the next 15 years, the end will come,” meaning “an utter breakdown of the capacity of the planet to support humanity.”

Mankind stood on the brink of Armageddon, the book proposed, because there was no way to feed the exponentially increasing world population. The opening line set the tone for the whole work: “The battle to feed all of humanity is over.”

Despite Ehrlich’s utter failure to predict humanity’s ability to feed itself, his theories will be dusted off and re-proposed in the Vatican in late February.

In its brochure for the upcoming workshop, the Vatican asserts in Ehrlichian doomspeak that “Earth cannot sustain” our desire for “enhanced consumption.”

Humanity is presently using about 156 percent of “the Earth’s sustainable capacity” every year, the text contends, and it is therefore essential to address “the question whether the Earth system is able to support the demands that humanity has been making on it” and “how global inequality and poverty relate to that.”

The conference will also feature a speaker from an environmental advocacy group called the Global Footprint Network (GFN), which each year calculates the day when the year’s available resources supposedly run out and mankind begins overconsuming nature.

“We use more ecological resources and services than nature can regenerate through overfishing, overharvesting forests and emitting more carbon dioxide into the atmosphere than forests can sequester,” the group exclaims on its website.

“On August 8, 2016, we will have used as much from nature as our planet can renew in the whole year,” GFN announced last summer.
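GFN does not publish its formula in the article, but the arithmetic it describes (dividing the length of the year by the ratio of humanity's footprint to the planet's regenerative capacity) is simple to sketch. The 1.65 ratio and the function name below are illustrative assumptions, not GFN figures:

```python
from datetime import date, timedelta

def overshoot_date(year, footprint_ratio):
    """Estimate an 'overshoot day': the date by which a year's worth of
    regenerated resources would be used up, assuming consumption runs at
    `footprint_ratio` times the planet's annual regenerative capacity.
    This is an illustrative guess at GFN's method, not their actual model.
    """
    days_in_year = (date(year + 1, 1, 1) - date(year, 1, 1)).days
    day_of_year = int(days_in_year / footprint_ratio)
    return date(year, 1, 1) + timedelta(days=day_of_year - 1)

# An assumed ratio of 1.65 lands on the article's date of August 8, 2016;
# the brochure's 156 percent figure would put it nearly two weeks later.
print(overshoot_date(2016, 1.65))   # 2016-08-08
print(overshoot_date(2016, 1.56))   # 2016-08-21
```

Note, incidentally, that the two figures quoted above (156 percent overuse, and an August 8 overshoot day) are not quite mutually consistent: August 8 of a 366-day year implies a ratio closer to 1.66.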

In their brochure, the Pontifical Academies make the counterintuitive claim that biodiversity means “everything” for the human race, but then proceed to acknowledge that only 103 of an estimated 425,000 plant species produce about 90 percent of our food worldwide. Moreover, just three grains (maize, rice, and wheat) produce about 60 percent of the total, the text notes.

Since none of these species are under any danger of extinction, one wonders how biodiversity can mean “everything” for humanity.

Just why the Vatican would wish to showcase the purveyor of debunked, apocalyptic theories is anybody’s guess, but it certainly cannot bode well for the relationship between faith and science.

SOURCE





Where's that food shortage Warmists are always predicting?

A lot of grain silos are so full there is no room for more. Many grains are in glut (oversupply), driving prices down -- and causing farmers to switch to other crops.

United States farmers have planted their smallest winter wheat crop in 108 years, according to the US Department of Agriculture (USDA).

Its monthly World Agriculture Supply and Demand Estimates (WASDE) revealed winter wheat plantings were down 1.5 million hectares to just over 13 million hectares.

The report also contained projections of cuts to US and global soybean production as well as lower-than-expected ending stocks, which sent Chicago Board of Trade March soybean contracts soaring to a three-and-a-half-week high.

The USDA's latest report echoes similar reports from global agencies detailing the globe's enormous stocks of grain, with wheat stocks tipped to reach levels not seen in three decades.

"Global wheat supplies for 2016/17 are raised 1.3 million tonnes on a production increase that is only partially offset by lower beginning stocks," the report said.

"The largest increases are for Argentina, Russia and the European Union."

The cuts to soybean production and wheat plantings reflect farmers' moves to plant higher value crops during a period of major over-supply, according to Chicago trader and PRICE Futures Group vice-president Jack Scoville.

"I think a lot of producers are very unhappy with the wheat price here in the United States.

"They're looking for alternatives, and given where the reductions are out in the great plains, I'm sure we're going to hear about more cotton in the coming growing season, this US summer, and also maybe some more soybeans and perhaps a little bit more sorghum."

Mr Scoville said despite a good session on the CBOT, wheat, corn and soybean prices were all trading near four or five year lows.

"That's creating quite a problem for producers, they really need more money than that."

SOURCE





Global Temperature Trend Propaganda Video: Who Needs Peer Review?

Ronald Bailey

I repeat, once again, that I believe that the balance of the evidence suggests that man-made global warming could become a significant problem for humanity as this century unfolds. OK, that is now out of the way. So let's turn to a sleazy attempt by some climate scientists (activists?) to undercut scientific findings by other researchers that call into question their assertions about global temperature trends.

University of Alabama at Huntsville climate scientists John Christy and Roy Spencer have been reporting data from NOAA satellites that measure the temperature of the mid-troposphere since 1979. Their data show that global average temperature has been essentially flat for the past 18 years. This is very inconvenient for rival researchers whose climate models have projected that significant warming should have occurred during this period as humans continue to burn more fossil fuels and load up the atmosphere with global-warming carbon dioxide. In addition, there is a significant mismatch between the surface temperature data sets that show higher rates of warming than do the satellite data.
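Claims like “essentially flat” refer to the ordinary least-squares slope fitted to the monthly anomaly series, usually quoted in degrees per decade. A minimal sketch of that calculation, using made-up numbers rather than the actual UAH data:

```python
def trend_per_decade(anomalies):
    """Ordinary least-squares slope of a monthly temperature-anomaly
    series (deg C), converted to deg C per decade."""
    n = len(anomalies)
    xs = range(n)                                  # time in months
    mean_x = sum(xs) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, anomalies))
    var = sum((x - mean_x) ** 2 for x in xs)
    return (cov / var) * 12 * 10                   # per month -> per decade

# Illustrative series only (not real satellite data):
flat = [0.2] * 216                         # 18 years of identical anomalies
rising = [0.001 * m for m in range(216)]   # 0.001 C/month = 0.12 C/decade
print(trend_per_decade(flat))      # 0.0
print(trend_per_decade(rising))    # ~0.12
```

Whether a measured trend is statistically distinguishable from zero also depends on the noise and autocorrelation of the series, which is precisely what the competing groups dispute.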

So what to do? What good scientists would do is try to reconcile the datasets and debate the issues in the scientific journals. Well, that's messy, slow, and the results are not pre-determined. So what a trio of climate scientists - Michael Mann, Kevin Trenberth, and Ben Santer - have evidently decided to do is participate in a video project funded by a climate activist foundation whose chief aim is to cast doubt on the satellite data.

Why now? Because various government agencies are shortly going to declare that 2015 is the warmest year ever in the historical surface temperature records. The climate scientists in the video evidently fear that "climate deniers" will dismiss these dire declarations by pointing to the satellite data which show a considerably slower rate of warming. Solution: Deny data that contradicts their preferred narrative. This is not science!

Over at Breitbart, Christy responds to the video:

There are too many problems with the video on which to comment, but here are a few.

First, the satellite problems mentioned here were dealt with 10 to 20 years ago. Second, the main product we use now for greenhouse model validation is the temperature of the Mid-Troposphere (TMT) which was not erroneously impacted by these problems.

The vertical “fall” and east-west “drift” of the spacecraft are two aspects of the same phenomenon – orbital decay.

The real confirmation bias brought up by these folks to smear us is held by them.  They are the ones ignoring information to suit their world view.  Do they ever say that, unlike the surface data, the satellite datasets can be checked by a completely independent system – balloons? Do they ever say that one of the main corrections for time-of-day (east-west) drift is to remove spurious WARMING after 2000?  Do they ever say that the important adjustment to address the variations caused by solar-shadowing effects on the spacecraft is to remove a spurious WARMING?  Do they ever say that the adjustments were within the margin of error?

In addition, another group, Remote Sensing Systems, established explicitly to independently evaluate the satellite temperature data, finds the same overall temperature trend as the folks at the University of Alabama. See Christy's version of the mismatch between model projections and satellite and weather balloon temperature trends below.



If these researchers have any real arguments showing that the satellite data are wrong, the place to prove that is in the peer-reviewed scientific literature - not a propaganda video.

SOURCE




Venezia ghiacciata per il freddo. Prima volta nella storia (Venice canals freeze for the "1st time in history")



In recent weeks Italy has been hit by cold and snow; even the south has seen heavy snowfalls. But this start to 2017 will be remembered above all for an event without precedent in history. The lagoon and some canals of Venice have completely frozen over; the water has turned to ice, and the “freeze” is expected to last another couple of days. Venice is used to heavy snowfalls in winter, but the low temperatures, combined with unusual humidity, have frozen the canals solid.

Tourists, and residents too, are as surprised as they are fascinated. They may have to forgo the odd gondola ride in the larger canals (most of them remain navigable anyway), but the spectacle they are witnessing is unprecedented. The shallowness and clarity of the water have given the ice an extraordinary luminosity, especially in some of the more central parts of the city. It appears almost blue with whitish tints, and particularly bright. The meteorologist, Lieutenant Colonel Giovanni Piastrucci, explained that the phenomenon is due to the hygroscopic properties of the Venetian waters, which, together with sub-zero temperatures, create a strange refraction effect on light rays.

SOURCE

Whoops! The above pic is a Photoshop job. The canals did ice up in 2012, however. The pic below is from Feb. 6, 2012






Australian energy bills soar in shift from coal power stations

Electricity companies have begun hiking consumer prices around the country, blaming the closure of coal-fired generators and the increased cost of renewable energy for higher-than-predicted increases of more than $130 this year.

EnergyAustralia and AGL have increased electricity tariffs in Victoria by $135 and $132 on average for the year respectively — greatly exceeding state government modelling that concluded bills would rise by $27 to $100.

The Victorian price rises will flow from this week but the companies’ customers in other states, including South Australia and NSW, face a yet-to-be announced price rise in June.

Red Energy, the retailing arm of Snowy Hydro, informed customers in NSW its rates would increase this week because of “increases in the wholesale cost of electricity and the large-scale renewable energy certificates”.

Some tariffs were raised by almost 25 per cent.

The consumer price rises will increase political pressure on state and federal governments to deal with escalating energy costs that have sparked business warnings that rising power charges are undermining competitiveness.

The Australian Energy Council has warned the impact will be greatest in Victoria and South Australia, which face the biggest wholesale price increases.

The South Australian government is under pressure over its heavy reliance on renewable energy, particularly with the closure of the Northern power station and blackouts sparked by severe storms. Queensland, which has a regulated market, is reviewing its energy tariffs with results expected by the middle of the year.

The Energy Council’s corporate affairs general manager, Sarah McNamara, said the Victorian wholesale price increases were a “byproduct of the reduction in the state’s generation capacity by around 20 per cent, a direct consequence of the upcoming closure of the Hazelwood power station in March”. The Energy Council, which represents major electricity and gas producers, has repeatedly called for a national strategy to deal with supply issues and price volatility as older power stations are retired and an increasing amount of large-scale renewable energy is made available.

An EnergyAustralia spokesman said the average $11 a month increase in Victoria reflected “higher generation, general business and government green-scheme costs”. In that state, there was an increase in the cost of buying electricity for 2017 from about $40 a megawatt hour in January to more than $60 a megawatt hour in November, he said.

“The closure of the Northern power station in South Australia, increased demand for gas by large LNG projects in Queensland, reliability issues and … the market’s reaction to the closure of Hazelwood were among the main factors,” he said.

AGL, through a spokesman, said residential electricity prices would rise by $2.59 a week, on average, or a 9.9 per cent increase, while small and medium-size businesses would see costs increase by 13.4 per cent.

Despite the higher charges, the closure of Hazelwood could boost earnings at AGL, which owns the Loy Yang A power station, by up to 10 per cent, according to analysts at investment bank JP Morgan. That analysis, released late last year, assumed the closure of Hazelwood would increase wholesale prices by 15 per cent in Victoria and 10 per cent in NSW.

Victorian coal generators will also face increased royalty costs this year, with the levy, intended to make renewable energy more attractive, rising from 7.6c to 22.8c a gigajoule for companies mining brown coal, netting the government about $250 million over four years.

The Minerals Council’s Victorian executive director, Gavin Lind, said the brown coal royalty increases introduced by the Andrews government were harmful and ignored the practicalities of the electricity market.

“The expected increase in electricity costs will hit Victorian businesses hard, especially the manufacturing sector where uncertain economic conditions are already placing the industry under strain,” he said. “The Victorian government seems intent on increasing the state’s dependence on expensive and part-time energy sources and committing Victorian households and industry to higher energy prices. It will pass the cost of the scheme on to electricity users via their energy bills. In so doing, it will subsidise uneconomic renewable energy projects while driving out affordable, reliable coal-fired energy.’’

A government spokesman defended the increase. “The royalty rate has not changed in a decade, and this will simply bring Victoria into line with the other states. We are ensuring Victorians get a fair return for the use of our state’s natural resources,” he said.

In Queensland, the state’s Competition Authority is in the final stages of setting electricity tariffs for 2017-18, with the Palaszczuk government unveiling a 50 per cent renewable energy target by 2030 that could slash earnings at the government-owned electricity generators.

Renewable energy schemes were blamed by Red Energy for this week’s increase in retail prices, although Snowy Hydro declined to provide details about the increases. “There are a number of factors that can push energy prices higher for consumers and the need to source renewable energy certificates to cover a portion of the energy consumed by customers is one of them,” a spokesman said. “We cover the resulting REC liability through a combination of RECs generated by the Snowy Scheme with the remainder sourced from the market.”

The price of those certificates has jumped in recent months, netting some electricity retailers windfall gains, as concerns grow that Australia will not reach its 2020 renewable energy target.

The spot price of those certificates rose to about $87 at the end of last month compared with an average of $54 in 2015, although the largest retailers can obtain RECs as part of the normal course of business or at lower contract rates.

SOURCE

***************************************


*****************************************


Monday, January 16, 2017



German Greenies turn German water supply brown

Which has been very vexing. When water comes out of your faucet brown, you know something bad has happened.  You expect that only in poor countries like India. I have seen it in India.  The German authorities do in fact manage to bleach the water before it goes out to households but that's expensive. So why are German streams running brown anyway?

It's because of the German government's hostility to industry. High electricity prices and other policies have chased a lot of German industry to saner countries, and the remaining industries are heavily regulated in order to reduce pollution of all sorts.

And one effect of that has been a reduction in industrial emissions of nitrogen compounds into the air. But such compounds do not stay in the air forever. They gradually fall out into the soil. And in the soil they react with a lot of other stuff, binding it so that it stays put. So in the absence of all those nitrates, various other compounds are set free and get washed into the rivers. And among those are brown plant wastes: "dissolved organic carbon".

So where to now?  Nowhere to go.  They just have to spend more money on treating the water before it is reticulated.  Extensive chemical treatment of the water supply before people drink it doesn't seem very Green, though, does it? Maybe brown drinking water is the way ahead for Germans! LOL

The abstract below puts what I have just said into more precise scientific terms

An interesting sidelight. Andreas Musolff  has written a book on Hitler which dodges the fact that Hitler's policies were socialist.  How did he get from history into hydrology?


Unexpected release of phosphate and organic carbon to streams linked to declining nitrogen depositions

Andreas Musolff et al.

Abstract

Reductions in emissions have successfully led to a regional decline in atmospheric nitrogen depositions over the past 20 years. By analyzing long-term data from 110 mountainous streams draining into German drinking water reservoirs, nitrate concentrations indeed declined in the majority of catchments. Furthermore, our meta-analysis indicates that the declining nitrate levels are linked to the release of dissolved iron to streams likely due to a reductive dissolution of iron(III) minerals in riparian wetland soils. This dissolution process mobilized adsorbed compounds, such as phosphate, dissolved organic carbon and arsenic, resulting in concentration increases in the streams and higher inputs to receiving drinking water reservoirs. Reductive mobilization was most significant in catchments with stream nitrate concentrations less than 6 mg/L. Here, nitrate, as a competing electron acceptor, was too low in concentration to inhibit microbial iron(III) reduction. Consequently, observed trends were strongest in forested catchments, where nitrate concentrations were unaffected by agricultural and urban sources and which were therefore sensitive to reductions of atmospheric nitrogen depositions. We conclude that there is strong evidence that the decline in nitrogen deposition toward pre-industrial conditions lowers the redox buffer in riparian soils, destabilizing formerly fixed problematic compounds, and results in serious implications for water quality.

SOURCE





Coral not dead after all

There have been incessant fake-news proclamations from Greenies about the northern third of Australia's Great Barrier Reef being all but dead. Problem: People who go there find some bleached bits, but most of it is fine. The report below, from a very northerly part of the reef, says it is in superb condition



RAINE Island, located about 620 kilometres northwest of Cairns on the way to Cape York, is the largest green turtle nesting ground on planet Earth.

The 32-hectare coral island lies in the far north section of the reef.

Cairns local Jemma Craig recently dived at the island for the first time, documenting her experience with a series of incredible pictures.

In October last year, an environmental writer wrote a snarky obituary, declaring the World Heritage Site dead at 25 million years of age.

It was premature, but just one month later a team of scientists wrote an article for news.com.au saying that two-thirds of coral in the northern part of the reef have died in the worst-ever bleaching event.

Raine Island, however, appears to have escaped with its life.

“I grew up on the Great Barrier Reef, I have worked and dived here for many years and have ventured to the far corners of the Coral Sea in a quest to see more, but nothing; nothing I have ever seen compares to my dive on the reef surrounding Raine Island,” Ms Craig said.

The 24-year-old works as a host on board the MV Spoilsport with Mike Ball Dive Expeditions, which operates out of Cairns.

She said she found it hard to comprehend that this part of the reef looked so good.

“The reef flat is simply covered in beautiful, colourful hard coral, turtles cruising and marine life from one end to the other. I didn’t know where to look.”

SOURCE





Ontario mother's powerful plea to Justin Trudeau on soaring energy bills

And Pretty Boy just passes the buck

An Ontario mother struggling to pay her hydro bill took Prime Minister Justin Trudeau to task in an impassioned plea that captured the plight of many Ontarians overwhelmed by high electricity rates.

In a question-and-answer session in Peterborough, Ont., Kathy Katula said she works 15-hour days and is a single mother to four and grandmother to three in rural Buckhorn, Ont. Trudeau is in the midst of a cross-country tour.

"Something's wrong now, Mr. Trudeau. My heat and hydro now cost me more than my mortgage," she said. "I now not only work 75 hours a week, I stay and work 15 hours a day just so I don't lose my home."

"I make almost $50,000 a year, Mr. Trudeau, and I'm living in energy poverty. Please tell me how you are going to fix that."

Holding her hydro bill, Katula challenged Trudeau on carbon pricing and said her hydro bill is upwards of $1,000 as people in the crowd shouted "shame."

She said at one point in the summer, she went without electricity for five days, despite paying a $680 bill.  "I called and I begged our hydro company. They wouldn't do nothing. Five days, I lived in that heat."

She continued to press Trudeau in the emotional exchange.

"How do you justify to a mother of four children, three grandchildren, physical disabilities and working up to 15 hours a day; how is it justified for you to ask me to pay a carbon tax when I only have $65 left of my paycheque every two weeks to feed my family?"

"I am asking you to fix our hydro system. I am asking you to fix Canada."

Trudeau said he acknowledged her concern while defending his government's policy on climate change and clarifying that hydro bills fall under provincial jurisdiction.

"We're facing a challenge where we have to change behaviours. It is important that those changes happen in a way that doesn't penalize our most vulnerable; that doesn't make it more difficult for families who are already stretched thin to succeed."

The prime minister noted that carbon pricing revenues would stay with provincial governments to be used at their discretion, adding that the national carbon pricing requirement does not take effect until 2019.

"We are a country, in which anyone with a quarter of your strength, of your drive, should be thriving and focused on how are you going to spoil your grandchildren with all your energy as opposed to how are you going to get through the week or the day," Trudeau told Katula.

SOURCE




Reality-based climate forecasting

Continuing to focus on carbon dioxide as the driving force will just bring more bogus predictions

By Paul Driessen

After diving recently among Key West’s fabled ship-destroying barrier reefs, I immersed myself in exhibits from the Nuestra Senora de Atocha, the fabled Spanish galleon that foundered during a ferocious hurricane in 1622. The Mel Fisher Maritime Museum now houses many of the gold, silver, emeralds, and artifacts that Mel and Deo Fisher’s archeological team recovered after finding the wreck in 1985.

Also featured prominently in the museum is the wreck of a British slave ship, the Henrietta Marie. It sank in a hurricane off Key West in 1700, after leaving 190 Africans in Jamaica, to be sold as slaves.

As Fisher divers excavated the Henrietta wreck, at 40 feet below the sea surface they found – not just leg shackles and other grim artifacts from that horrific era – but charred tree branches, pine cones and other remnants from a forest fire 8,400 years ago! The still resinous smelling fragments demonstrate that this area (like all other coastal regions worldwide) was well above sea level, before the last ice age ended and melting glaciers slowly raised oceans to their current level: 400 feet higher than during the frigid Pleistocene, when an enormous portion of Earth’s seawater was locked up in glaciers.

Climate change has clearly been “real” throughout earth and human history. The question is, exactly how and how much do today’s human activities affect local, regional, or global climate and weather?

Unfortunately, politicized climate change researchers continue to advance claims that complex, powerful, interconnected natural forces have been replaced by manmade fossil fuel emissions, especially carbon dioxide; that any future changes will be catastrophic; and that humanity can control climate and weather by controlling its appetite for oil, gas, coal, and modern living standards.

If you like your climate, you can keep it, they suggest. If you don’t, we can make you a better one.

Not surprisingly, climate chaos scientists who’ve relied on the multi-billion-dollar government gravy train are distraught over the prospect that President Donald Trump will slash their budgets or terminate their CO2-centric research. Desperate to survive, they are replacing the term “climate change” with “global change” or “weather” in grant proposals, and going on offense with op-ed articles and media interviews.

“This is what the coming attack on science could look like,” Penn State modeler and hockey stick creator Michael Mann lamented in a Washington Post column. “I fear what may happen under Trump. The fate of the planet hangs in the balance.” (Actually, it’s his million-dollar grants that hang in the balance.)

A “skeptic” scientist has warmed to the idea that a major Greenland ice shelf may be shrinking because of climate change, a front-page piece in the Post claimed. Perhaps so. But is it manmade warming? Does it portend planetary cataclysm, even as Greenland’s interior and Antarctica show record ice growth? Or are warm ocean currents weakening an ice shelf that is fragile because it rests on ocean water, not land?

The fundamental problem remains. If it was substandard science and modeling under Obama era terminology, it will be substandard under survivalist jargon. The notion that manmade carbon dioxide now drives climate and weather – and we can predict climate and weather by looking only at plant-fertilizing CO2 and other “greenhouse gases” – is just as absurd now as before.

Their predictions will be as invalid and unscientific as divining future Super Bowl winners by modeling who plays left guard for each team – or World Cup victors by looking at center backs.

As climate realists take the reins at the EPA and other federal and state agencies, the Trump Administration should ensure that tax dollars are not squandered on more alarmist science that is employed to justify locking up more fossil fuels, expanding renewable energy and “carbon capture” schemes, reducing U.S. living standards, and telling poor countries what living standards they will be “permitted” to have.

Reliable forecasts, as far in advance as possible, would clearly benefit humanity. For that to happen, however, research must examine all natural and manmade factors, and not merely toe the pretend-consensus line that carbon dioxide now governs climate change.

That means government grants must not go preferentially to researchers who seek to further CO2-centrism, but rather to those who are committed to a broader scope of solid, dispassionate research that examines both natural and manmade factors. Grant recipients must also agree to engage in robust discussion and debate, to post, explain and defend their data, methodologies, analyses, and conclusions.

They must devote far more attention to improving our understanding of all the forces that drive climate fluctuations, the roles they play, and the complex interactions among them. Important factors include cyclical variations in the sun’s energy and cosmic ray output, winds high in Earth’s atmosphere, and decadal and century-scale circulation changes in the deep oceans, which are very difficult to measure and are not yet well enough understood to predict or be realistically included in climate models.

Another is the anomalous warm water areas that develop from time to time in the Pacific Ocean and then are driven by winds and currents northward into the Arctic, affecting U.S., Canadian, European, and Asian temperatures and precipitation. The process of cloud formation is also important, because clouds help retain planetary warmth, reflect the sun’s heat, and provide cooling precipitation.

Many scientists have tried to inject these factors into climate discussions. However, the highly politicized nature of U.S., IPCC, and global climate change funding, research, regulatory, and treaty-making activities has caused CO2-focused factions to discount, dismiss, or ignore the roles these natural forces play.

The political situation has also meant that most research and models have focused on carbon dioxide and other assumed human contributions to climate change. Politics, insufficient data and inadequate knowledge also cause models to reflect unrealistic physics theories, use overly simplified and inadequate numerical techniques, and fail to account adequately for deep-ocean circulation cycles and the enormity and complexity of natural forces and their constant, intricate interplay in driving climate fluctuations.

Speedier, more powerful computers simply make any “garbage in-garbage out” calculations, analyses, and predictions occur much more quickly – facilitating faster faulty forecasts … and policy recommendations.

The desire to secure research funding from Obama grantor agencies also perpetuated a tendency to use El Niño warming spikes, and cherry-pick the end of cooling cycles as the starting point for trend lines that allegedly “prove” fossil fuels are causing “unprecedented” temperature spikes and planetary calamity.

Finally, the tens of billions of dollars given annually in recent years to “keep it in the ground” anti-fossil fuel campaigners, national and international regulators, and renewable energy companies have given these vested interests enormous incentives to support IPCC/EPA pseudo-science – and vilify and silence climate realists who do not accept “catastrophic manmade climate change” precepts.

The Trump Administration and 115th Congress have a unique opportunity to change these dynamics, and ensure that future research generates useful information, improved understanding of Earth’s complex climate system, and forecasts that are increasingly accurate. In addition to the above, they should:

* Reexamine and reduce (or even eliminate) the role that climate model “projections” (predictions) play in influencing federal policies, laws and regulations – until modeling capabilities are vastly and demonstrably improved, in line with the preceding observations.

* Revise the Clean Air Act to remove the EPA’s authority to regulate carbon dioxide – or compel the EPA to reexamine its “endangerment” finding, to reflect the previous bullet, information, and commentary.

* Significantly reduce funding for climate research, the IPCC and the EPA, and science in general. Funding should be more broadly based, not monopolistic, especially when the monopoly is inevitably politicized.

This is not an “attack on science.” It is a reaffirmation of what real science is supposed to be and do.

Via email





It’s the facts the BBC leaves out about climate change that are important

Christopher Booker

Last November, when news that the “climate denier” Donald Trump had been elected president reached the thousands of climate zealots gathered in sunny Marrakech for the UN’s annual dronefest on how to save the planet from global warming, they were reportedly plunged into an almost clinical depression, many bursting into tears.

Last Tuesday, the BBC’s Roger Harrabin picked up on this harrowing scene in a much-trailed Radio 4 documentary, Climate Change: The Trump Card, which amounted to the BBC’s first major fightback against the horror looming up.

The essence of Harrabin’s message was that whatever the dreadful Mr Trump does to reverse President Obama’s world-leading role in keeping global temperatures from rising by more than two degrees, at least we can look for hope to India and China, both now firmly committed to clean, green, “renewable” energy.

His programme began with him enjoying a solar-heated shower in a “backpackers’ camp” on an island off southern India, seguing into India’s prime minister Narendra Modi promising the UN’s mammoth 2015 Paris climate conference “a huge expansion in the power of the sun”.

We heard an interview in Potsdam with one of the high priests of climate alarmism, Hans Schellnhuber, predicting that by 2100 global temperatures could have risen by five or six degrees, with assurances that, whatever Trump does, this will not knock Germany or the EU “off their low carbon course”.

If the US under Trump leaves a “vacuum”, already poised to fill it is China, now the world-leader in producing wind turbines and solar panels. And Harrabin ended back in India, gazing down on “the world’s biggest solar farm”, as “a spectacular monument to India’s energy policy”.

No mention of the fact that, before that Paris conference, China and India formally notified the UN that, to keep their economies growing, they intend between them to build more than 800 new coal-fired power stations; and that by 2030 – as already the world’s first and third largest emitters of CO2 – they plan to double and treble those emissions. Even by the BBC’s standards, as one expert observer put it, this farrago of “deluded groupthink was stunning”.

As always, what was striking was not just what it did say, but how much more it was careful to leave out. How this squares with the BBC’s statutory obligation to report with “accuracy and impartiality” has long been one of the puzzles of the age.  But back in the real world, that dreaded “Trump card” is now fast approaching.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Sunday, January 15, 2017


Trump meets with Princeton scientist who called ‘global warming’ fears ‘pure belief disguised as science’

President-Elect Donald Trump met with prominent Princeton University physics professor Dr. Will Happer, an outspoken climate skeptic, on Friday in New York.

Happer, who has authored 200 peer-reviewed scientific papers, has testified to the U.S. Senate that the Earth is currently in a "CO2 famine." As he explained to Congress in 2009: "Warming and increased CO2 will be good for mankind... CO2 is not a pollutant and it is not a poison and we should not corrupt the English language by depriving 'pollutant' and 'poison' of their original meaning."

In 2014, Happer ridiculed “global warming” fears. “The incredible list of supposed horrors that increasing CO2 will bring the world is pure belief disguised as science,” he noted.

Princeton University is also home to another prominent climate skeptic, the renowned physicist Freeman Dyson, who has said: "I'm 100% Democrat and I like Obama. But he took the wrong side on the climate issue, and the Republicans took the right side."

Happer’s meeting with Trump gave rise to speculation about a role in the administration.

The Washington Post reported: “Happer did not answer questions on his way into the elevator to meet with Trump, according to pool reports. He did not immediately respond to requests for comment from the Post.

E&E News, which was apparently first to report on the meeting, noted that it was ‘unclear’ whether Happer might be under consideration for energy or science positions in the administration. There certainly remain many of those to fill.”

Climate skeptics would rejoice at the prospect of Happer joining a Trump administration. Happer served as the former director of DOE’s Office of Energy Research — now the Office of Science — from 1991 until 1993.

Happer has also directed his scientific ire at the United Nations, declaring that policies to combat “global warming” are “based on nonsense.” “Policies to slow CO2 emissions are really based on nonsense. We are being led down a false path. To call carbon dioxide a pollutant is really Orwellian. You are calling something a pollutant that we all produce. Where does that lead us eventually?,” he asked.

“Many people don’t realize that over geological time, we’re really in a CO2 famine now. Almost never has CO2 levels been as low as it has been in the Holocene (geologic epoch) – 280 (parts per million – ppm) – that’s unheard of. Most of the time [CO2 levels] have been at least 1000 (ppm) and it’s been quite higher than that,” Happer told the Senate Committee. “Earth was just fine in those times,” Happer added. “The oceans were fine, plants grew, animals grew fine. So it’s baffling to me that we’re so frightened of getting nowhere close to where we started,” Happer explained.

SOURCE





Adapting to Warming in Japan: What Has Been Discovered to Date
 
Paper Reviewed: Ng, C.F.S., Boeckmann, M., Ueda, K., Zeeb, H., Nitta, H., Watanabe, C. and Honda, Y. 2016. Heat-related mortality: Effect modification and adaptation in Japan from 1972 to 2010. Global Environmental Change 39: 234-243.

Writing as background for their work, Ng et al. (2016) state that "excessive heat is a health risk," but they also say that "previous studies have observed a general decline in sensitivity to heat despite increasing temperatures." Noting, therefore, that "conclusive evidence is lacking on whether long-term changes of this sensitivity can be attributed to specific adaptation measures, such as air conditioning [AC], or should be linked to societal adaptation such as improved healthcare systems or socioeconomic well-being," they proceed to analyze "daily total [from natural causes], cardiovascular and respiratory disease mortality and temperature data from 1972 to 2010 for 47 prefectures," using a Poisson generalized linear model "to estimate the effect of heat on mortality," along with "a random effects model to obtain the mean national effect estimates, and meta-regression to explore the impact of prefecture-level characteristics." And what did they find in so doing?

Ng et al. report that their data "show a general decrease in excess heat-related mortality over the past 39 years despite increasing temperatures [of approximately 1°C]," demonstrating, in their words, "that some form of adaptation to extreme temperatures has occurred in Japan." More specifically, as illustrated in the figure below, their data revealed a national reduction of 20, 21 and 46 cases of deaths per 1,000 due to natural, cardiovascular, and respiratory causes, respectively, which reductions correspond to astounding respective percentage drops of 69, 66 and 81 percent! Similar percentage declines were also noted when analyzing the number of deaths by age group, with the most elderly population age group experiencing the greatest death rate percentage declines (Figure 1b).

In commenting on these notable health improvements, Ng et al. write that an "increase of AC prevalence was not associated with a reduction of excess mortality over time," yet they note that "prefectures and populations with improved economic status documented a larger decline of excess mortality," adding that "healthcare resources were associated with fewer heat-related deaths in the 1970s, but the associations did not persist in the more recent period (i.e., 2006-2010)." Whatever the cause or causes, one thing is certain; whereas the temperature rose, human death rates declined ... and we would call that a good thing!
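For readers curious what the authors' core method looks like in practice, a Poisson generalized linear model of daily death counts against temperature can be sketched in a few lines. This is a toy illustration on simulated data, not the authors' actual analysis: the temperatures, rates, and the hand-rolled Newton-Raphson fitter below are all made up for demonstration.

```python
import numpy as np

def fit_poisson_glm(X, y, iters=30):
    """Fit log(mu) = X @ beta by Newton-Raphson (equivalent to IRLS)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())            # start at the intercept-only model
    for _ in range(iters):
        mu = np.exp(X @ beta)             # Poisson mean under current beta
        grad = X.T @ (y - mu)             # score vector
        hess = X.T @ (X * mu[:, None])    # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
temp = rng.uniform(15, 35, size=5000)       # simulated daily mean temperature, deg C
true_rate = np.exp(0.5 + 0.03 * temp)       # deaths rise ~3% per extra degree
deaths = rng.poisson(true_rate)             # simulated daily death counts

X = np.column_stack([np.ones_like(temp), temp])
beta = fit_poisson_glm(X, deaths)
print(beta)  # the fitted slope should recover roughly 0.03
```

A declining heat sensitivity of the kind Ng et al. report would show up as this slope coefficient shrinking when the model is refitted on successive decades of data.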

SOURCE





Activist Criticism Again Misses Mark on EPA Nominee Pruitt

In a previous post, which has since been discussed by Bre Payton of the Federalist, I wrote about a mendacious television attack ad produced by the Environmental Defense Action Fund. In a nutshell, the ad wrongly accuses Environmental Protection Agency Administrator-designate Scott Pruitt of denying mercury’s hazardous effects on human health.

Yesterday, the Sierra Club opened a new line of attack. After being queried by E&E News’s Benjamin Storrow about Pruitt’s association with a political action committee that received almost $210,000 from energy interests, the Sierra Club responded, “The fact that Scott Pruitt intends to take big cash from the very same big polluters he is supposed to be monitoring as EPA administrator is unprecedented and a clear danger to the health of our families.”

I was struck by the Sierra Club’s averment that Pruitt’s association with a PAC that took more than $200,000 from oil and gas interests is an “unprecedented” threat to “the health of our families.” Of course, President Obama has ultimate authority over the EPA, and he took more than $900,000 from oil and gas companies and their employees in 2008, according to U.S. News & World Report. It stands to reason that he reaped a similar haul when he ran for re-election. The upshot is that Pruitt is associated with a PAC that took in from the oil and gas sector only a fraction of the donations that Obama accepted. So it’s implausible to argue that Pruitt’s actions are “unprecedented.”

SOURCE




UK: The Folly of Swansea Bay Tidal Lagoon

If ever there was a textbook example of how to go about Government lobbying and project development, then it is the Swansea Bay Tidal Lagoon project. The developer, Tidal Lagoon Power, has done a frankly incredible job of promoting the project to policymakers and financiers. The project has gone from an interesting idea on paper a few years ago, to being backed financially by investment bank Macquarie amongst others, to garnering significant political support by the likes of the Rt. Hon. Sir Ed Davey (as Secretary of State for Energy and Climate Change), other Coalition Government Cabinet members, and the Welsh Government.

Tomorrow represents a make or break point for the project. The Government will publish a long-awaited review on the potential for tidal power in the UK, led by former Energy Minister Charles Hendry. Both Hendry and the Government have been tight-lipped about the contents of the review, but the terms of reference are to “assess the strategic case for tidal lagoons and whether they could play a cost effective role as part of the UK energy mix.”

Based on my own knowledge of the project and technology, I suggest that it would be folly for the Government to agree to progress the Swansea Bay project further.

The main reason for this is simply the cost of the technology. The developer’s latest estimates are that the Swansea Bay project would cost £1.3 billion to construct. Interestingly, this headline cost has already increased by more than 40% compared to earlier estimates – a 2014 report to the developers assumed a lower capital cost of £913 million.

In order to get a better handle on the relative cost of the technology, it is informative to consider the cost per unit of electrical output (£/MWh) – often referred to as the ‘Levelised Cost of Energy’. The same report from 2014 put the cost of Swansea Bay at £168/MWh, roughly four times the current wholesale price of electricity. By comparison, the Government’s own estimates show that other low carbon technologies are considerably cheaper.
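For readers unfamiliar with the metric, a levelised cost is simply lifetime costs divided by lifetime electrical output, with both streams discounted back to the present. A minimal sketch; the capex, running-cost, output, and discount-rate figures below are round hypothetical numbers, not values from the 2014 report:

```python
def lcoe(capex, opex_per_year, mwh_per_year, years, discount_rate):
    """Levelised cost of energy in GBP/MWh: discounted costs / discounted output."""
    costs = capex + sum(opex_per_year / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(mwh_per_year / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Hypothetical round numbers for a lagoon-sized project:
# GBP 1.3bn capex, GBP 20m/yr running costs, ~500 GWh/yr, 35 years, 6% rate.
print(round(lcoe(1.3e9, 20e6, 500_000, 35, 0.06), 1))
```

Note how sensitive the result is to the capital cost: plugging in the earlier £913 million estimate instead of £1.3 billion lowers the figure substantially, which is why the 40%-plus capex escalation matters so much.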

Swansea Bay Tidal Lagoon makes wind, solar, Carbon Capture and Storage, and nuclear look cheap. Moreover, if the question is whether tidal lagoons could play a "cost effective role", then it is worth considering even cheaper ways to cut carbon emissions, such as improving energy efficiency. It would be far better to spend the £1.3 billion insulating our homes properly (see our recent report on cost effective routes to decarbonise heating). Backing an expensive technology such as tidal lagoons would leave less space within the Levy Control Framework funding envelope to spend on other, cheaper technologies. The Levy Control Framework budget has already been overspent to 2020 – a point exposed in a Policy Exchange report in July 2015.

The project developer has suggested that the headline “strike price” (or subsidy support level) required by the project is far lower than £168/MWh – potentially similar to the £92.50/MWh agreed strike price for Hinkley Point. However this is only possible by tweaking parameters of the Contract for Difference (CfD) subsidy contract such as the term and indexing rate, and assuming some form of Government grant to the project. These tweaks simply obfuscate the true cost of the project. All technologies could achieve a lower strike price if they were given the same treatment.
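The arithmetic behind the term-lengthening tweak is ordinary annuity maths: spreading capital recovery over more years lowers the headline £/MWh without changing the project's underlying cost. A hedged sketch with made-up figures (the real CfD calculation also involves indexation and operating costs, omitted here):

```python
def required_strike_price(capex, mwh_per_year, years, discount_rate):
    """GBP/MWh needed just to recover capex over the contract term.

    Uses the standard annuity formula: the level annual payment that
    repays `capex` at `discount_rate` over `years`, divided by annual output.
    """
    r = discount_rate
    annual_payment = capex * r / (1 - (1 + r) ** -years)
    return annual_payment / mwh_per_year

# Same hypothetical project financed over 25 vs 35 years at 6%:
price_25 = required_strike_price(1.3e9, 500_000, 25, 0.06)
price_35 = required_strike_price(1.3e9, 500_000, 35, 0.06)
print(round(price_25, 1), round(price_35, 1))  # longer term -> lower headline GBP/MWh
```

The same capital is recovered in both cases; only the headline number changes, which is the obfuscation the paragraph above describes.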

The project developers (and other proponents of tidal range) claim that subsequent tidal lagoons in Cardiff and Newport will be considerably cheaper than Swansea Bay – potentially cheaper than offshore wind or nuclear. This is due to the larger size of the subsequent projects and resulting economies of scale. In reality this claim is totally untested and purely based on desk-based analysis by the project developer.

The difficulty with tidal range is that it is caught somewhere between being a mature and an immature technology. There is already one significant example of the technology – the 240MW tidal barrage at La Rance in France, commissioned in 1966 – but as yet the technology has not been deployed widely. If tidal range is considered ‘mature’, and the Cardiff/Newport projects are so much cheaper than the Swansea Bay project, then surely we should progress with them instead? Or conversely, if tidal range is still considered ‘immature’ and a further ‘experiment’ is required, then is it appropriate for this to be a £1.3 billion experiment paid for by bill-payers or taxpayers? Even if the Government wishes to pursue tidal range as a technology, it is not unreasonable to question whether Swansea Bay is the ‘right’ project to demonstrate the technology, or whether a smaller scale experiment should be undertaken first.

Proponents also make the case for the Swansea Bay project on the basis that it will create jobs, and could help to develop an export industry (for example see another report commissioned by the developer). A group of manufacturers recently wrote an open letter to this effect, which was published in the Financial Times. It is undoubtedly true that building tidal lagoons will create UK jobs – both in civil engineering and in manufacturing components. No doubt this will also be a politically attractive proposition, since many of these jobs will be situated in South Wales – helping to rebalance the economy away from London.

However, the fact that it will create jobs is in itself not a sufficiently strong argument – spending £1.3 billion on any infrastructure project would create jobs. The Government could instead choose to spend the same money on alternative sources of energy, or other forms of infrastructure such as roads, hospitals or schools. All of these projects would create jobs. The relevant question here is: what is the return on investment (for UK plc) in supporting the Swansea Bay project versus alternatives? Which forms of investment would best tackle the UK’s longstanding economic challenges such as sluggish productivity growth (as highlighted in a recent Policy Exchange report, The New Industrial Strategy)?

It should not be forgotten that a tidal lagoon would need to be paid for via subsidies on energy bills, passed on to energy users (households and businesses). Levy-funded policies push up energy bills, which means that households and businesses have less money to spend or invest in other areas. You cannot simply look at the jobs created by building a tidal lagoon – you also need to look at the wider effects across the economy.

It is also not clear that supporting the Swansea Bay project will establish a significant export industry as claimed. One of the reasons that there has been ongoing interest in tidal is that the UK has some of the best tidal resources in the world. Tidal range resources tend to be highly concentrated in very specific areas. There is interest in developing tidal lagoons and barrages in a number of countries such as Canada and India, but they cannot be developed everywhere. Total global installed capacity of tidal energy devices (tidal range and tidal stream) stands at around 0.5 GW. If the UK wishes to become a market leader in low carbon technologies, then it might be better to focus on technologies with truly massive global potential – such as solar, wind, nuclear, Carbon Capture and Storage, or electric vehicles.

Finally, the proponents of tidal power claim that it is different from other forms of renewable energy such as wind and solar, due to the fact that it is predictable. Whilst that may be true (the timing of tides can be predicted decades in advance), tidal lagoons still do not produce power 100% of the time, and in that sense are not wholly reliable. In any case, the intermittency of renewables can be managed using a range of technologies including thermal power generation, storage, demand response, and interconnection – as discussed in our recent report, Power 2.0.

To summarise, in our view the Government should resist the urge to back tidal range, or the Swansea Bay project, any more than other low carbon technologies. Whilst the project would deliver many jobs and benefits, the same could be said of other energy or infrastructure projects. If tidal range projects can compete on cost terms, then of course they should be supported, but that is not the proposition which is currently on offer. However the developers try and dress it up, the Swansea Bay project is considerably more expensive than other low carbon technologies. If the Government chooses to back this project, then it will have a job to explain why this represents good value for money. Indeed, since the Competition and Markets Authority’s report on the energy market, the Government has an obligation to carry out an Impact Assessment and defend any decision to award CfDs on a non-competitive basis.

SOURCE




Australia: The latest Bureau of Meteorology shenanigans

This summer has been very frustrating for the BOM.  As tireless global warming missionaries, they wanted the Sydney summer to be the "hottest yet".  And the headlines they generated have on several occasions claimed just that.

But the thermometers have in fact been unobliging.  If you read the small print, coastal Sydney has failed to get into the 40s. It was only localities that are normally hot which did that.

And hanging over their heads is the awful truth that the temperature in coastal Sydney reached 42 degrees (108F) in 1790, long before there were any power stations, SUVs and all the other Greenie bugaboos in Sydney.

So what to do? They have had a brainwave (below). Instead of reporting maximum temperatures they are now reporting MINIMUM temperatures. They say that various minimum (night-time) temperatures have been unusually hot. But global warming is supposed to cause high maximum temperatures, so it is a pretty desperate bit of fake news.


SYDNEY residents sweltered through the harbour city’s hottest January night in recorded history last night.

But the good news for the sleepless masses is relief is in sight, with a cool change on its way.

Temperature records tumbled across Sydney as the extreme heatwave peaked overnight.

Among the places to set new records were Observatory Hill, where the temperature dropped only to 26.4C, Bankstown (26.2C), Camden (27.1C), Penrith (28.6C), Richmond (28.2C), Horsley Park (26.2C), and Terrey Hills (26.9C).

But relief is on its way.

Conditions across the southern half of NSW are expected to ease over the weekend but the mercury will likely remain in the low to mid 40s in the state’s north.

After copping temperatures up to 45C on Friday, Sydney’s west is forecast for a milder maximum of 35C on Saturday while in the coastal parts of the city it is due to reach 31C.

But for those in the far north it is expected to remain hot with a predicted high of 41C at Grafton.

Queenslanders who have been in the grip of the same heatwave are set to endure another day of blistering conditions before conditions cool on Sunday.

A top of 34C is forecast for Brisbane on Saturday, which is five degrees above the average maximum for this time of year.

(Rubbish! The temperature in my anteroom regularly tracks the official observations for Brisbane, and at 34.5C yesterday afternoon it did go higher on my thermometer than the forecast. But it had been right on 34C for a week or so)

SOURCE

*****************************************