UK Government To Ban The Sale Of Combustion Engine Cars And Vans By 2040

An ambitious target to phase out sales of new petrol and diesel cars by 2030 could halve UK oil imports, a study by environmental and aid organisations suggests.
The government has announced plans to ban the sale of conventional combustion engine cars and vans by 2040 as part of its efforts to tackle air pollution and climate change, a move the groups welcomed as a step in the right direction.
But a stronger ambition for all new cars and vans to produce zero emissions by 2030 could reduce UK foreign oil imports by 51% in 2035 compared to current projections, as well as cut pollution and boost investment in UK infrastructure such as charging points.
The report, produced by Green Alliance and supported by Cafod, Christian Aid, Greenpeace, the RSPB and WWF, comes ahead of the government’s publication of the clean growth plan, which will outline how the UK will cut carbon emissions to meet legal targets.
Ministers have signalled that the delayed plan will be published after parliament’s summer recess, and the aim is for it to be “as ambitious, robust and clear a blueprint as it can be”.
In 2016, transport accounted for 40% of the UK’s total energy consumption, with road transport accounting for three-quarters of this. Other countries, such as Norway and India, are ahead of the UK in their ambitions to switch to electric vehicles, the new study said.
Along with a tougher target for the move to zero emission vehicles, the study calls for additional funding for renewables between 2020 and 2025, the introduction of zero-carbon standards for new homes, and improvements to building energy efficiency.
Gareth Redmond-King, head of climate and energy at WWF, said the UK could and must go faster than the 2040 goal.
“To ensure the UK doesn’t miss out on the jobs and investment opportunity in clean, modern vehicles, the UK should up its ambition. Cleaning up transport and boosting home energy efficiency must be priorities for the UK government in the forthcoming clean growth plan.
“Both measures will create jobs for UK businesses and reduce costs to the NHS caused by noxious air pollution and cold, leaky homes,” he said.
Laura Taylor, head of advocacy at Christian Aid, said: “The UK government’s long-overdue clean growth plan needs to prove that this government is serious about speeding up the low carbon transition, not slackening the pace.
“The benefits to citizens are enormous but areas like home energy efficiency and heating are lagging behind and need urgent political attention.”
source:the guardian

CHINA DOUBLES END OF DECADE SOLAR POWER TARGET

China has more than doubled its end-of-decade solar power target, with new installations dramatically outstripping expectation, according to the government’s energy agency.
By the end of July this year, China’s solar PV capacity topped 112GW, after installing a stunning 35GW in just seven months — more than twice as much as installed by any other country in all of 2016.
As a result, total solar PV capacity now exceeds the government’s 2020 goal of 105GW, set as recently as last year.
This could have created a very confusing situation for the industry – after years of record-setting installations, there was no target to hit – but the National Energy Administration (NEA) responded by setting new, ambitious annual installation targets.
These targets would take capacity to 213GW in 2020, five times the current capacity of the United States.
That would mean covering an area of land equivalent to greater London, about 1,500km2, with solar panels.
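As a rough plausibility check on that land-area figure: a crystalline-silicon module delivers very roughly 150W of rated capacity per square metre of panel surface. The density value below is an illustrative assumption, not a number from the article.

```python
# Rough plausibility check of the land-area claim above.
# Assumption (not from the article): ~150 W of rated capacity
# per m^2 of crystalline-silicon panel surface.
target_gw = 213
panel_density_w_per_m2 = 150

area_km2 = target_gw * 1e9 / panel_density_w_per_m2 / 1e6
print(f"~{area_km2:,.0f} km^2 of panel surface")  # ~1,420 km^2
# Close to the ~1,500 km^2 (greater London) figure quoted above,
# counting panel surface only and ignoring spacing between arrays.
```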
Current growth rates suggest China could even surpass that new, higher target.

Wind is also doing well
China is on track to install at least 110.4GW of onshore wind capacity over the next three years.
This would bring the country’s cumulative wind power capacity to about 264GW by 2020, far exceeding the original target of 210GW set during the 13th Five-Year Plan period.
It’s also considerably more than the total wind power capacity of all of Europe (and that’s including the UK).
New targets
That’s not all the new targets imply.
By 2020, China is aiming to build 54.5GW of large-scale solar projects, including PV power stations and projects that combine solar with agriculture and animal husbandry.
That alone surpasses the total solar capacity of both the UK and Germany combined.
In addition, 8GW of new showcase projects using higher-efficiency solar PV will be built every year.
The new target reflects the huge potential for distributed solar — electricity that is produced at the same spot where it is used. Under the new regulation there are no limitations on the installation of distributed renewables, meaning innovations like rooftop solar panels are on track to soar.

Over the past year, distributed solar installations have shot up.
According to NEA statistics, 7.11 GW of distributed solar PV was added in the first half of 2017, an approximately threefold increase year-on-year.
The astounding growth of wind and solar power in China means the country is on track to generate the equivalent of Germany’s total electricity consumption from these sources by 2020. Generation from wind and solar would then amount to around 9% of China’s own consumption, up from 5.2% last year.
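To see why Germany-scale generation is still only a single-digit share of Chinese demand, here is a quick sanity check; the consumption figures are rough 2016-era estimates assumed for illustration, not numbers from the article.

```python
# Rough sanity check (assumed 2016-era estimates, not article data):
germany_consumption_twh = 600    # Germany's annual electricity use
china_consumption_twh = 6000     # China's annual electricity use

share = germany_consumption_twh / china_consumption_twh
print(f"Germany-scale generation ~ {share:.0%} of China's consumption")
# ~10%, consistent with the "around 9%" figure quoted above.
```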
Curtailment
But it’s not all rosy.
China’s wind and solar power sectors are still battling a huge curtailment crisis.
In the first half of 2017, the national wind curtailment rate stood at 13.6%, with solar curtailment in five northwest provinces at 15.5%.
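For readers unfamiliar with the term, a curtailment rate is the share of potential renewable generation that the grid declines to accept. A small illustration follows; the half-year generation figure is hypothetical, only the 13.6% rate comes from the text above.

```python
# Illustration of what a curtailment rate means. The potential
# generation figure below is hypothetical, not from the article.
potential_wind_twh = 150      # wind energy that could have been produced
curtailment_rate = 0.136      # the national rate quoted above

curtailed_twh = potential_wind_twh * curtailment_rate
print(f"Curtailed: {curtailed_twh:.1f} TWh; "
      f"delivered: {potential_wind_twh - curtailed_twh:.1f} TWh")
# At a 13.6% rate, roughly one in seven units of potential wind
# output never reaches consumers.
```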
The NEA’s new targets, however, acknowledge the problem and take two key steps to tackle it.
First, provinces with serious wind and solar curtailment problems, such as the western provinces Gansu, Xinjiang and Ningxia, are not permitted to install more capacity.
This should have the effect of nudging these provinces’ governments towards effectively utilising the enormous capacity they have already installed.
And that will mean challenging coal’s dominance in the energy mix.
Second, seven provinces, including Beijing and Shanghai, are allowed to install as much solar capacity as they want with the important caveat that the new capacity does not cause curtailment in these areas.
That suggests China could in fact smash its own 2020 solar target. Again.
source:greenpeace

HURRICANE HARVEY WORSENED BY CLIMATE CHANGE

What can we say about the role of climate change in the unprecedented disaster that is unfolding in Houston with Hurricane Harvey? There are certain climate change-related factors that we can, with great confidence, say worsened the flooding.


Sea level rise attributable to climate change – some of which is due to coastal subsidence caused by human disturbance such as oil drilling – is more than half a foot (15cm) over the past few decades. That means the storm surge was half a foot higher than it would have been just decades ago, meaning far more flooding and destruction.
In addition to that, sea surface temperatures in the region have risen about 0.5C (close to 1F) over the past few decades from roughly 30C (86F) to 30.5C (87F), which contributed to the very warm sea surface temperatures (30.5-31C, or 87-88F). 
There is a simple thermodynamic relationship known as the Clausius-Clapeyron equation that tells us there is a roughly 3% increase in average atmospheric moisture content for each 0.5C of warming. Sea surface temperatures in the area where Harvey intensified were 0.5-1C warmer than current-day average temperatures, which translates to 1-1.5C warmer than “average” temperatures a few decades ago. That means 3-5% more moisture in the atmosphere.
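As a sketch of where that scaling comes from (using standard textbook constants, not figures from the article), the fractional change in saturation vapour pressure with temperature is:

```latex
% Sketch of the moisture scaling, with standard constants:
% L_v ~ 2.5e6 J/kg (latent heat), R_v ~ 461 J/(kg K), T ~ 300 K.
\[
  \frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T}
    = \frac{L_v}{R_v T^{2}}
    \approx \frac{2.5\times10^{6}}{461\times300^{2}}
    \approx 0.06\ \mathrm{K^{-1}}
\]
% i.e. about 6% more saturation moisture per 1C of warming,
% or roughly 3% per 0.5C, matching the figure quoted above.
```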

That large amount of moisture creates the potential for much greater rainfalls and greater flooding. The combination of coastal flooding and heavy rainfall is responsible for the devastating flooding that Houston is experiencing.
Not only are the surface waters of the Gulf of Mexico unusually warm right now, but there is a deep layer of warm water that Harvey was able to feed upon when it intensified at near record pace as it neared the coast. Human-caused warming is penetrating down into the ocean. It’s creating deeper layers of warm water in the Gulf and elsewhere.
Harvey was almost certainly more intense than it would have been in the absence of human-caused warming, which means stronger winds, more wind damage and a larger storm surge. (As an example of how this works, we have shown that climate change has led to a dramatic increase in storm surge risk in New York City, making devastating events like Hurricane Sandy more likely.)
Finally, the more tenuous but potentially relevant climate factors: part of what has made Harvey such a devastating storm is the way it has stalled near the coast. It continues to pummel Houston and surrounding regions with a seemingly endless deluge, which will likely top out at nearly 4ft (1.22m) of rainfall over a days-long period before it is done.
The stalling is due to very weak prevailing winds, which are failing to steer the storm off to sea, allowing it to spin around and wobble back and forth. This pattern, in turn, is associated with a greatly expanded subtropical high pressure system over much of the US at the moment, with the jet stream pushed well to the north. This pattern of subtropical expansion is predicted in model simulations of human-caused climate change.

More tenuous, but possibly relevant still, is the fact that very persistent, nearly “stationary” summer weather patterns of this sort, where weather anomalies (both high-pressure dry hot regions and low-pressure stormy/rainy regions) stay locked in place for many days at a time, appear to be favoured by human-caused climate change. We recently published a paper in the academic journal Scientific Reports on this phenomenon.
In conclusion, while we cannot say climate change “caused” Hurricane Harvey (that is an ill-posed question), we can say that it exacerbated several characteristics of the storm in a way that greatly increased the risk of damage and loss of life. Climate change worsened the impact of Hurricane Harvey.
source:the guardian

Is Poor Air Quality Masking Global Warming Effects?

During the 20th century, the average temperature of the continental United States rose by almost 1 degree Fahrenheit (0.5 degree Celsius) — everywhere, that is, except in the Southeast. There, until the 1980s, the temperature actually decreased slightly. Climate scientists dubbed this peculiar phenomenon the "warming hole," and it was the cause of much speculation. But beginning in the 1990s, temperatures in the Southeast began to warm again, and in the early years of the 21st century this warming has accelerated.
A new study published in the journal Remote Sensing presents evidence that a significant improvement in air quality in the region may have contributed to the disappearance of the warming hole after about 1990 — and that other polluted regions outside the United States, such as China and India, may experience the same phenomenon.
One major factor in poor air quality is airborne aerosols — tiny particles of dust, soot from wood burning, coal and oil combustion, or sulfates created by precursor gases emitted from factories and car exhaust, to name a few sources. Aerosols can decrease temperature by dimming sunlight at Earth's surface and by increasing the amount and lifetimes of clouds, which reflect sunlight back into space.
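The dimming effect described here can be sketched with a simple Beer-Lambert attenuation law; the irradiance and optical depth values below are illustrative assumptions, not data from the study.

```python
import math

# Direct sunlight passing through an aerosol layer is attenuated as
# I = I0 * exp(-tau), where tau is the aerosol optical depth (AOD).
# The AOD and irradiance values are assumptions, not study data.
I0 = 1000.0                     # clear-sky direct irradiance, W/m^2

for tau in (0.30, 0.24):        # e.g. a ~20% drop in aerosol burden
    irradiance = I0 * math.exp(-tau)
    print(f"AOD {tau:.2f}: ~{irradiance:.0f} W/m^2 at the surface")
# Less aerosol -> less extinction -> more sunlight reaching the
# ground: the brightening effect the study links to the warming.
```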
After the warming hole mysteriously disappeared, various studies proposed possible causes: changes in cloud cover, precipitation or in the amount of aerosols produced by air pollution. In 2006, the U.S. Environmental Protection Agency (EPA) began implementing a more stringent cap on the concentration of aerosol particles smaller than about 1/10,000th of an inch (2.5 micrometers) in diameter. To comply with the regulation, many U.S. power utilities and industrial companies began reducing their use of coal and installing filters to reduce emissions.
A similar change to temperature trends occurred in Europe in the 1980s after new regulations improved air quality there. Because reduced aerosol particle concentrations allow more sunlight to reach Earth's surface, the scientists hypothesized that the improvements in U.S. air quality could also be responsible for the temperature change over the Southeast.
To test this hypothesis, a team led by Mika Tosca, a researcher at NASA's Jet Propulsion Laboratory in Pasadena, California (who is now with the School of the Art Institute of Chicago), used three surface temperature data sets. The data sets were compiled by the University of Delaware, the University of California (UC) at Berkeley, and the Global Historical Climatology Network (which compiles surface temperature and precipitation data). They also used aerosol data from two satellite instruments: the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite, launched in 1999, and the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite, a joint mission between NASA and the French space agency, CNES, launched in 2006.
The data show that between 2000 and 2015, while summertime temperatures in the Southeast United States increased by roughly 1.5 degrees Fahrenheit (0.75 degree Celsius), significantly faster than the increase in the continental United States during the 20th century, the amount of summertime aerosols decreased overall by about 20 percent, with a much steeper decline after 2007. The timing of this decline coincided with the implementation of the new EPA standards.
To help determine how much of the temperature change was caused by the changes in aerosols, Tosca and colleagues used a model that simulates how the sun's energy travels through Earth's atmosphere, using the MISR and CALIOP satellite data as inputs. The increase in sunlight shown in the model results matches well with daily measurements taken at a National Oceanic and Atmospheric Administration (NOAA) solar radiation monitoring station in Goodwin Creek, Mississippi, suggesting that the decrease in aerosols is a plausible explanation for most of the disappearance of the warming hole.
Tosca acknowledges that linkages between aerosols and clouds could also play a role. The next step would be to run a more sophisticated climate model that takes into account clouds and the aerosols' effects on them. The team would also like to apply this kind of analysis to other areas with high air pollution levels, such as China and India. They hypothesize that these areas might have "warming holes" of their own — regions where the effects of climate change are being muted by the high concentrations of aerosols in the atmosphere. If these areas reduce air pollution in the future, they might experience a sudden temperature jump as well.
"Overall, the goal is to more accurately predict what will happen to our planet," Tosca said. "This type of observation-based research gives us better models, better models give us better forecasts, and better forecasts enable better policy."
The study is titled "Attributing Accelerated Summertime Warming in the Southeast United States to Recent Reductions in Aerosol Burden: Indications from Vertically-Resolved Observations." Other institutions participating in the study included the Joint Center for Earth Systems Technology, a cooperative agreement between NASA's Goddard Space Flight Center in Greenbelt, Maryland, and the University of Maryland, Baltimore County; the Naval Research Laboratory in Monterey, California; and the University of North Dakota in Grand Forks. MISR was built and is managed by JPL, and CALIOP is jointly administered by NASA and the French space agency, Centre National d'Etudes Spatiales.
source:climate nasa

Is Denying Climate Change Science A Mental Health Problem?

The elevation of science to a central theme in American politics is an extraordinary development in the co-evolution of science and society. Three months after Donald Trump’s inauguration, 40,000 or so people turned out in the rain in Washington, DC for the March for Science, with similar numbers in other cities. Given Trump’s all-out attack on the role and size of government—his proposed 2018 budget slashes almost all programs other than national defence—there could just as easily have been a March for Education or a March for Affordable Housing.
But the high profile of science in national politics has been building since the turn of the millennium, fuelled by controversies around embryonic stem cell research, and of course climate change. Starting with the year 2000 presidential campaign between George W. Bush and Al Gore, Democrats explicitly began positioning themselves as the party of science. During the 2004 campaign, Democratic candidate John Kerry pledged that “I will listen to the advice of our scientists, so I can make the best decisions. . . . This is your future, and I will let science guide us, not ideology.”
A year later, journalist Chris Mooney published a book whose catchy title, The Republican War on Science, later got picked up by the Democratic party, with a statement on its 2008 campaign website that “We will end the Bush administration’s war on science, restore scientific integrity, and return to evidence-based decision-making.” Indeed, Barack Obama’s 2008 inauguration speech included the memorable promise that he would “restore science to its rightful place.”

So by the time of Trump’s election, science was already a strong issue for Democrats. But everyone wants science on their side, and even Donald Trump insisted, on the day of the science march, that “Rigorous science is critical to my administration’s efforts to achieve the twin goals of economic growth and environmental protection.”
Having science on your side, however, requires a strong voice for expertise in political discussions. And as we all know, one of the more common diagnoses of political pathologies leading to Trump, as well as to the Brexit vote in the UK, is that the voice of experts has been rejected by the citizenry.
So the rhetorical stakes around science and politics are pushed even further. “We live in an age that denigrates knowledge, dislikes expertise and demonizes experts,” wrote Anne Applebaum in the Washington Post last May. Tom Nichols, who teaches at the US Naval War College, and wrote The Death of Expertise, fleshes out the diagnosis: “Americans have reached a point where ignorance—at least regarding what is generally considered established knowledge in public policy—is seen as an actual virtue.”
What makes this so discomfiting is that it cuts close to the bone of our identity as rational humans struggling to make sense of a complex world. Everyone, even Trump, says they want science on their side because being modern and rational is all about basing decisions on reliable knowledge—all the more so in the face of challenges such as climate change, pandemics and cyberwar, to name just a few apocalyptic horsemen. So you can’t make any claim to authority without implying that you have some rational, empirical basis for preferring one course of action over another.

How, then, have we at the same time come to live in a world of post-truth politics, fake news, alternative facts, and counter-narrative?
Amidst the bruising debates over issues like climate change, GM crops, stem cell research, vaccines, and so on, a number of social and behavioural scientists have begun to investigate the question of why people come to the beliefs they have about science. The larger agenda here is to understand how our cognition limits our capacity to act in the way that the Enlightenment model of rationality tells us we should be acting. Particular attention is being focused on why people don’t more readily accept the findings of scientific experts on politically controversial issues with scientific elements.
For example, John Cook, a cognitive scientist at George Mason University, writes: “Science denial, as a behaviour rather than a label, is a consequential and not-to-be ignored part of society…When people ignore important messages from science, the consequences can be dire.” This idea – that “science denial” is a “behaviour rather than a label” – turns the act of people not accepting what experts tell them from an act of individual (and perhaps ill-informed) judgment into a coherent phenomenon that experts themselves can do research on.
Efforts to pathologize “science denial” link to a growing body of work about human cognitive limits that can be traced, in part, to the wonderful set of studies carried out by Daniel Kahneman and Amos Tversky, starting in the 1970s, on judgment under uncertainty. These established that the heuristics most humans readily use to make sense of the world on a daily basis also introduce significant biases into our understanding of the world. Kahneman of course eventually won a Nobel prize for this line of research.
If people are naturally limited and biased in their abilities to see and assess the probabilistic constitution of many of the decisions that they face, it is only a short step to ask if they are, as a matter of evolutionary cognitive development, similarly limited in their more general capacity to think scientifically. And if people naturally look at the world in systematically biased ways, and if certain classes of people – say political conservatives – consistently reject the findings of science, then one might begin to explore the question of whether these two observations could be causally related.
And so, experts have begun studying why experts don’t get more respect. Scienceblind and The Knowledge Illusion are two such books by cognitive scientists published this year. As the titles suggest, they take up the question of why people understand so little about the world around them. The first of these, by Andrew Shtulman, focuses on why we don’t intuit scientific truths about the world. It looks in particular at how children’s misunderstanding of the world can help us see how difficult it is even for adults to acquire correct understanding of how things work.
Shtulman’s central premise is that we need to leave our childish intuitions behind and accept the findings of science in order to act effectively in the world. “Intuitive theories,” Scienceblind tells us, “are about coping with the present circumstances, the here and the now. Scientific theories are about the full causal story—from past to future, from the observable to the unobservable, from the minuscule to the immense.” And the book concludes, “While science denial is problematic from a sociological point of view, it’s unavoidable from a psychological point of view. There is a fundamental disconnect between the cognitive abilities of individual humans and the cognitive demands of modern society.”
The second book, The Knowledge Illusion, by Steven Sloman and Philip Fernbach, looks not only at how little we know, but also at how we know a lot less than we think we do. “Because we confuse the knowledge in our heads with the knowledge we have access to, we are largely unaware of how little we understand.” While the authors recognise that teaching people more facts about science might not change their beliefs about the world, they also believe that if people realised how little they actually do know, they would moderate their positions on key issues, and be open to a wider range of possibilities. “Getting people to think beyond their own interests and experiences may be necessary for reducing their hubris and thereby reducing polarization.” The book attributes “antiscientific thinking” to false causal models that individuals hold in their heads, often in common with their social groups.

Both of these books share the perspective that we’re all dumb but it’s not our fault; we’re born that way. The first step is to recognise how little we each understand of the world, rather like accepting original sin.
It’s hard not to sympathise with this perspective: a little more humility in a lot more people could be good for the world. But we didn’t need cognitive science to tell us that. After recognising our ignorance, the second step must therefore be an acceptance of what scientific experts tell us. Otherwise, what would be the point of accepting our ignorance?
I find this emerging intellectual programme around science denial problematic on so many levels that it’s hard to know where to start. Certainly one part of the problem with the idea of an innate cognitive stance toward science, and with discussions about science in the political world more generally, is the undisciplined way in which the word “science” gets used – as if particle physics, climate modelling, epidemiology and cultural anthropology have so much in common that they are substitutable for “science” in any sentence. Which science does “science denial” pertain to?
Moreover, the entire programme fetishises individual cognition and understanding by positioning the innate ignorance of the individual as the bottleneck at the intersection of knowledge, uncertainty, expertise, and political disagreement. The idea that these books implicitly endorse is that progress in tackling the complex problems of modernity is being blocked by individuals who do not accept new causal knowledge generated by science.
The effort to provide a behavioural explanation for why people might not accept the opinions of experts strikes me as not entirely dissimilar in its implications from the early ambitions for eugenics, in that it seeks in the biology of the individual an explanation for complex social phenomena. It makes one wonder what the appropriate treatment for science denial might actually be.

Meanwhile, the situation in the science enterprise itself is hardly reassuring. There is a reasonable case to be made—and I have tried elsewhere to make it —that much of science is on the verge of a crisis that threatens its viability, integrity, legitimacy and utility. This crisis stems from a growing awareness that much of the science being produced today is, by the norms of science itself, of poor quality; that significant areas of research are driven by self-reinforcing fads and opportunities to game the funding system, or to advance particular agendas; that publication rates continue to grow exponentially with little evidence that much of what is published actually gets read; and that the promises of social benefit made on behalf of many avenues of science are looking increasingly implausible, if not ridiculous.
Maybe a little science denial is actually in order these days? The emergence of science denial as a pathology designed to explain why science is not leading to improved political decision-making seems, if nothing else, completely overwhelmed by the precisely opposite condition.
The vast scale of the knowledge-production enterprise, combined with the likelihood that much of what’s produced is not much good, makes it possible for anyone to get whatever science they need to support whatever beliefs they might have about how best to address any problem they are concerned about – with little, if any, capacity to assess the quality of the science being deployed.
Twenty-five years ago, Silvio Funtowicz and Jerry Ravetz developed their concept of “post-normal science” to help understand the role of knowledge and expertise when facts are uncertain, values in dispute, stakes are high, and decisions are urgent. Under such conditions—which are common to many of today’s societal problems—Funtowicz and Ravetz describe how the “traditional distinction between ‘hard’, objective scientific facts and ‘soft’, subjective value-judgements is now inverted.” That is to say, facts become soft, and values hard.

Under such conditions, our expectations for Enlightenment ideals of applied rationality are themselves irrational. We are asking science to do the impossible: to arrive at scientifically coherent and politically unifying understandings of problems that are inherently open, indeterminate and contested – to provide, as Scienceblind promises us, “the full causal story.”
Meanwhile, the reliability of the very types of science that underlie books like Scienceblind and The Knowledge Illusion is increasingly called into question as evidence of irreproducibility continues to mount, including across many fields of research that make strong generalisations about human behaviour.
Our biggest problem is not science denial; it’s post-normal science denial.

source:the guardian 

AFRICA AND CLIMATE CHANGE: JUST HOW DOES IT AFFECT THE POOREST CONTINENT?

To find evidence of how climate change is affecting Africa, we have to look at how changes in food production and water availability have affected the health, livelihoods and security of the African people.

A study on climate vulnerability showed that seven of the ten countries most at risk from climate change are in Africa.
Southern Africa and parts of central Africa have seen a decrease in rainfall over the last 25 years, while floods, droughts and other weather-related disasters have doubled. Because of this, diseases such as cholera, dengue fever and malaria are on the rise, and as many as 67 million people are at risk. Health, already compromised by a range of factors, could be further hit by the negative impacts of climate change and climate variability.

Agricultural production in many African countries and regions will be severely affected by climate change. Under certain climate projections, agricultural losses could be severe in several areas, accompanied by changes in the length of growing periods that affect mixed rain-fed, arid and semi-arid farming systems. In some countries, yields from rain-fed agriculture could be reduced by up to 50% by 2020. This would lead to loss of livelihoods and social unrest.

Climate change will result in a loss of species and extinction of many plants and animals. Changes in a variety of ecosystems are already being detected, particularly in southern African ecosystems, at a faster rate than anticipated as a result of a variety of factors, including the influence of climate change.  
Some regions in East Africa have become drier due to changes in land use patterns and climate. Water sources are becoming intermittent or disappearing; streams that used to run year-round are now seasonal. Some assessments project that 75-250 million people will be exposed to increased water stress due to climate change by 2020. Some assessments, for example, show severely increased water stress and possible increased drought risk for parts of northern and southern Africa, and increases in run-off in East Africa. Water access is, however, threatened not only by climate change, but also by complex river basin management. This, coupled with increased demand, will adversely affect livelihoods. Changes in the ecosystem have also had a significant impact on wild sources of food, which have become hard to find.

Africa faces the biggest development challenges of any continent because of an increase in the number of people at risk of water stress, exposure to malaria, and a drop in agricultural yields. Climate change has the potential to undermine sustainable development and increase poverty, and climate change and variability will seriously hinder the continent’s future development. The biophysical effects of climate change on agriculture induce changes in production and prices, which play out through the economic system as farmers and other market participants adjust autonomously, altering crop mix, input use, production, food demand, food consumption, and trade.

There are a number of ways to minimise the negative impacts of climate change on the African continent. These include communication and outreach, adaptation and mitigation options, and climate change research that supports decision-making.

Carbon offsetting with companies that support projects in Africa, such as Shop2SaveThePlanet, is a very effective way for everyone to take part and do something about climate change. The projects are usually based in developing countries and are most commonly designed to reduce future emissions. This might involve rolling out clean energy technologies or investing in carbon offsets from an emissions trading scheme. To find out more about the projects that Shop2SaveThePlanet supports, CLICK HERE.
