Ground zero.

mmghosh's picture

Bjorn Lomborg's latest spin on things being not too bad after all is pretty silly.  For one, he says

if the main effort to cut emissions is through subsidies for chic renewables like wind and solar power, virtually no good will be achieved—at very high cost.

and then just a little later

President Obama should focus on dramatically ramping up investments into the research and development of green energy. Put another way, it is the difference between supporting an inexpensive researcher who will discover more efficient, future solar panels—and supporting a Solyndra at great expense to produce lots of inefficient, present-technology solar panels.

Why support your inexpensive researcher if solar is a "chic renewable", and what do you mean by support if not subsidy...and so forth.  Nevertheless, he does pick up on the real message

The U.N. Climate Panel in 2012 concluded: "Some regions of the world have experienced more intense and longer droughts, in particular in southern Europe and West Africa, but in some regions droughts have become less frequent, less intense, or shorter, for example, in central North America and northwestern Australia."

So - places historically affected by drought will be drier, and those affected by floods will be wetter. How much more dramatic can this get than Australia? A continent always prone to weather extremes, particularly La Nina and El Nino events, it is rapidly becoming Ground Zero for the spectacular changes that climate change is bringing us. Once-in-a-decade events have become annual ones: the Queensland floods have now recurred every year for the past three years, and again this year.

However, many people in the towns of Grantham and Laidley are fleeing their homes, fearful of a repeat of the floods that claimed so many lives in January 2011.
Many of the communities hit hardest in the 2010/11 floods have been inundated again and pummelled by tornadoes generated by the remnants of ex-tropical cyclone Oswald.

Meanwhile, fires and abnormal temperatures continue in the South.

With temperatures soaring into the forties on Saturday - despite the first signs of an unexpected cool change - a total fire ban across the state was extended for a further 24 hours in response to the threat. Australia is now a country under climate siege.

The fires stem from the continent's lingering seasonal droughts. Cruel, arid conditions now permeate even its humid climatic regions.

While the merest stretch of dry years brings the hazard of fire, the summer of 2012/13 has eclipsed most heatwaves on record and witnessed the outbreak of hundreds of fires in virtually every Australian state.

Drought conditions conducive to widespread outbreaks of wildfire now extend into Australia's Humid Subtropical and even Marine climate zones. One of the most damaging fires this week occurred in Tasmania, an island state with a climate more usually equated with the highlands of Scotland.

And all this with relatively mild insolation! An abnormally weak sun and few major El Nino events in the past 15 years have meant less heat for the earth's oceans to absorb. But because the underlying temperature trend has kept climbing, as nicely explained by Tamino, an extreme temperature fluctuation in the next few years could make for truly spectacular weather.

 

Real data — temperature, for instance — are almost always the combination of signal and noise, which we could also refer to as trend and fluctuation. Fluctuations are ubiquitous, they happen all the time. Sometimes they go up, sometimes down, sometimes a little and sometimes a lot, but the one thing they don’t do is stop.

That’s why, even in a stable climate, we’re sure to see extremes. Heat waves will happen. So will floods, drought, and giant storms. It’s the nature of the beast, those things can happen for no apparent reason — for no reason at all, really, just because they are random fluctuations. When extremes arrive, they often bring trouble with them. It’s good to be prepared for such fluctuations, because they’re unavoidable.

How extreme they are, and how often they occur, depends on the nature of the fluctuations. By measuring conditions over long periods of time, we not only get to know what the average conditions are, we also learn about the fluctuations. That enables us to define climate as the mean (average) and variation (fluctuations) of weather.

---

Fortunately such fluctuations are exceedingly rare. A once-in-a-thousand-years heat wave only happens, well, once in a thousand years. On average, that is … we could get two such events in rapid succession just because of (very very) bad luck. Fortunately, such concordances of extreme extremes are very very exceedingly rare.

In just the last decade we’ve seen a number of extreme extremes. Even if we only count the heat waves, recent history is remarkable. Europe in 2003, Australia in 2009 (not once but twice in a single year), Russia in 2010, the U.S. in 2012. And now Australia (again!) in 2013. All these heat waves were extreme, some were extreme extreme, and they all brought disaster. Their frequency is just as remarkable as their severity, having come one after another in rapid succession.

That could be just a coincidence — one hell of a whopper of the worst weather luck imaginable.

---

Just this month, Australians experienced a similar large upward fluctuation in nationwide temperature. Again, all by itself that’s not such a big deal. Heat waves and wildfires happen, and since Australia is a pretty hot place anyway those who live there are well prepared for such fluctuations. But when they are added on top of a substantial trend it becomes like nothing they’ve seen before. 

 

He then goes on to show the estimated trend

 

 

The dot with a red circle around it is the temperature in 2012. That was a national disaster. Now imagine that the endpoint of the red line, so much hotter than what brought about a national disaster, becomes the norm.

 

What will life be like when unprecedented disaster becomes the norm? We are not prepared.

 

Alas, “the post-1975 trend continues in the U.S.” is an optimistic forecast. It only leads to an average temperature anomaly of about +6°F by the year 2100. That’s on the low side of actual forecasts by legitimate climate scientists.



Misreading

(#299620)
Bird Dog's picture

Lomborg was talking about investments in R&D, not in picking favorites a la Solyndra. Oh, and it looks like the 6-degree jump in temperature by 2100 is a major exaggeration.

Another interesting tidbit. Big Oil isn't Shell or Exxon or BP. It's the countries that have nationalized their oil companies. Hooray for socialism.

 

"Transparency and the rule of law will be the touchstones of this presidency."

--Barack Obama, January 2009

A Single Study From Norway

(#299662)

Norway, the oil and gas capital of Europe.

 

Let's say I am a Norway skeptic.

I am not a pessimist. I am an incompetent optimist.

A single *unpublished* study (nt)

(#299771)

...

"I don't want us to descend into a nation of bloggers." - Steve Jobs

I don't think the Norwegian study is particularly relevant

(#299664)
mmghosh's picture

here is Gavin Schmidt recently on uncertainty in the assessment of climate sensitivity. The range (as in IPCC AR4) is between 2°C and 4°C, depending on the criteria.

 

In this case, as in other matters to do with climate, it is well to remember that uncertainty is not our friend.

In practice, people often mean different things when they talk about sensitivity. For instance, the sensitivity only including the fast feedbacks (e.g. ignoring land ice and vegetation), or the sensitivity of a particular class of climate model (e.g. the ‘Charney sensitivity’), or the sensitivity of the whole system except the carbon cycle (the Earth System Sensitivity), or the transient sensitivity tied to a specific date or period of time (i.e. the Transient Climate Response (TCR) to 1% increasing CO2 after 70 years). As you might expect, these are all different and care needs to be taken to define terms before comparing things.

---

In the meantime, the ‘meta-uncertainty’ across the methods remains stubbornly high with support for both relatively low numbers around 2ºC and higher ones around 4ºC, so that is likely to remain the consensus range.

Edit: I see Andy Revkin has picked up on this, too.

Is there a missing link or typo somewhere? What did I miss?

(#299636)
brutusettu's picture

After applying data from the past decade, the results showed temperatures may rise 1.9 degrees Celsius if CO2 levels double by 2050, below the 3 degrees predicted by the Intergovernmental Panel on Climate Change.

You mentioned 2100, as does the quote in the diary that I'm seeing; your link mentions 2050.

 

The quote I'm seeing in the diary uses the Fahrenheit scale (6°F by 2100); your link uses the Celsius scale (3°C by 2050).

 

 

A 3-degree Celsius change is a 5.4-degree Fahrenheit change.
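The conversion is worth pinning down: temperature *differences* scale by 9/5 with no +32 offset, since the offset cancels when two temperatures are subtracted. A minimal sketch:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature DIFFERENCE from Celsius to Fahrenheit.

    Unlike absolute temperatures, differences need no +32 offset,
    because the offset cancels when two temperatures are subtracted.
    """
    return delta_c * 9.0 / 5.0

delta_c_to_f(3.0)  # -> 5.4, the figure quoted above
delta_c_to_f(1.9)  # -> 3.42, the Norwegian study's estimate in Fahrenheit
```

Either way, the diary's 6°F by 2100 and the link's 3°C by 2050 are different quantities on different time horizons, which is the mismatch being pointed out here.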

"Jazz, the music of unemployment."

 

Frank Zappa

Misreading

(#299639)
Bird Dog's picture

The 3-degree Celsius change is an overstatement, by over 50%.

"Transparency and the rule of law will be the touchstones of this presidency."

--Barack Obama, January 2009

Overstatement

(#299658)
Jay C's picture

Even assuming that the Norwegian estimates pan out at their revised levels, i.e. a 2°C rise in average world temperatures by 2050 (vs. 3°C) - and one would then assume (I'm assuming one would assume here; I don't know enough about climate science to make more than a guess) a 4°C rise by 2100 (vs. 6°C) - it still foretells a significant enough change in global climate (and, needless to add, weather patterns) that we should at least TRY to mitigate the effects, if at all possible.

Back to the usage of "exaggerate" (+ the linked-to link)

(#299640)
brutusettu's picture

When the researchers at CICERO and the Norwegian Computing Center applied their model and statistics to analyse temperature readings from the air and ocean for the period ending in 2000, they found that climate sensitivity to a doubling of atmospheric CO2 concentration will most likely be 3.7°C, which is somewhat higher than the IPCC prognosis.

But the researchers were surprised when they entered temperatures and other data from the decade 2000-2010 into the model; climate sensitivity was greatly reduced to a “mere” 1.9°C.

 

Professor Berntsen says this temperature increase will first be upon us only after we reach the doubled level of CO2 concentration (compared to 1750) and maintain that level for an extended time, because the oceans delay the effect by several decades.

We used a method that enables us to view the entire earth as one giant ‘laboratory’ where humankind has been conducting a collective experiment through our emissions of greenhouse gases and particulates, deforestation, and other activities that affect climate, explains professor Terje Berntsen at UiO.
Natural changes also a major factor
The figure of 1.9°C as a prediction of global warming from a doubling of atmospheric CO2 concentration is an average. When researchers instead calculate a probability interval of what will occur, including observations and data up to 2010, they determine with 90% probability that global warming from a doubling of CO2 concentration would lie between 1.2°C and 2.9°C.

 

“These results are truly sensational,” says Dr Leck. “If confirmed by other studies, this could have far-reaching impacts on efforts to achieve the political targets for climate.”

 

"Jazz, the music of unemployment."

 

Frank Zappa

The US Government doesn't invest in industry? Doesn't

(#299633)
mmghosh's picture

pick winners in industry? That is news to me. Most governments spend taxpayers' money doing just that. The criteria for the loan program behind the Solyndra loan were established after bipartisan agreement, including a requirement to raise private equity first. In fact, I'm a little surprised that Solyndra went for a US Government loan at all, as it meant necessarily locating their factories in the USA, spending the loan in the USA itself, and paying high American wages. Perhaps if they had gone for a conventional bank loan, or a Chinese bank loan, they could have located in China and would probably be alive today. It may not yet be profitable in that industry to have manufacturing located in the USA.

 

As for investing in R&D, a quantum of R&D is also done within industry, so supporting technology with a loan achieves the same purpose.

 

Anything above a 4 degree F temperature rise will have consequences that are quite likely to be detrimental to the lives of people in rich countries.

When you read ThinkProgress,

(#299638)
Bird Dog's picture

you only get half the story. The loan program under Obama was different from the program under Bush (link). Whether under Bush or Obama, a government picking winners and losers is a dicey proposition and more likely than not a waste of taxpayer money. When it comes to R&D, corporations may of course get funds, but everyone benefits from that research, not just the corporation in question. That's not the same as giving them loan guarantees, subsidies and so forth.

 

"Transparency and the rule of law will be the touchstones of this presidency."

--Barack Obama, January 2009

But governments always look for potential winners

(#299657)
mmghosh's picture

when it comes to government funding of projects. No one, especially a politician, wants to pick losers. From construction to the military to IT, getting government funding has always played a significant part in the development of many industries.

 

Take pharmaceuticals - Fleming and Florey may have discovered penicillin, but it was US government funding and research that kicked off the industrial manufacture of penicillin and the antibiotic revolution.

Florey next visited his old friend Alfred Newton Richards, then vice president for medical affairs at the University of Pennsylvania. More importantly, Richards was chair of the Committee on Medical Research (CMR) of the Office of Scientific Research and Development (OSRD). The OSRD had been created in June, 1941, to assure that adequate attention was given to research on scientific and medical problems relating to national defense. Richards had great respect for Florey and trusted his judgment about the potential value of penicillin. He approached the four drug firms that Florey indicated had shown some interest in the drug (Merck, Squibb, Lilly, and Pfizer) and informed them that they would be serving the national interest if they undertook penicillin production and that there might be support from the federal government.

 

 

Richards convened a meeting in Washington, D.C., on October 8, 1941, to exchange information on company and government research and to plan a collaborative research program to expedite penicillin production. In addition to representatives of the CMR, the National Research Council, and the U.S. Department of Agriculture, participants included research directors Randolph T. Major of Merck; George A. Harrop of the Squibb Institute for Medical Research; Jasper Kane of Pfizer; and Y. SubbaRow of Lederle. The next CMR penicillin conference, held in New York in December, ten days after Pearl Harbor and U.S. entry into the Second World War, was more decisive. At this meeting, which was attended by the heads of Merck, Squibb, Pfizer, and Lederle, as well as the company research directors, Robert Coghill's report on the success at the NRRL with corn steep liquor was encouraging to the industry leaders present.

The fact of the matter is that governments always have to make decisions about funding - direct funding, subsidies, or loans - and industrial development happens best when government and private equity funding mesh. Sometimes they don't work out, but often they do. The principle of government funding is not wrong per se; the real question is whether it involves kickbacks and/or insider deals, same as with private equity. As for Solyndra, at least the taxpayers' money was spent in-country. As I mentioned before, with the benefit of hindsight, Solyndra may have been better served by manufacturing in China, or at least Mexico.

Uh huh

(#299626)
HankP's picture

Exxon is positively tiny.

I blame it all on the Internet

Thank you

(#299674)
Bird Dog's picture

For replying to something I didn't say. I didn't say that ExxonMobil was tiny. I said that Big Oil is smaller than state-owned oil companies, in case you didn't understand my comment.

"Transparency and the rule of law will be the touchstones of this presidency."

--Barack Obama, January 2009

Crowdsourcing funding for research into Arctic ice

(#299612)
mmghosh's picture

for the first time

 

http://darksnowproject.org/

 

It's an interesting conundrum about how science should be funded. If this were such an important question, why is it not being funded in the usual way, via research grants? Admittedly, getting a grant takes time (after my experience with grant-getting, I reckon at least 2 years in my field), and you could argue that getting answers within the next 5 years could be crucial. It could also serve as a kickstart to other programs.

 

As a side note about private funding of research, it appears that Alessandro Volta got the idea about electric currents from a dilettante (disclaimer: I'm a sucker for scientific-dilettante tales).

A retired British East India Company servant called John Walsh financed a naval expedition in 1772 to discover whether the torpedo fish, which stuns its victims by electric discharge, was producing the same lightning-generated electricity discovered by Benjamin Franklin.

 

The dissection of the cells of torpedo fish into negative and positive charges gave the inspiration for the creation of the first battery by Italian scientist Alessandro Volta in 1800. The storage of electricity created electro-magnetic power and later telegraphy, together with the invention of the steam engine, launched the industrial and telecommunication technology revolution in the West.

How was a former civil servant able to finance such a scientific expedition? The curious history was that Walsh was the former secretary to Lord Clive, the conqueror of India for the British empire. He was awarded £56,000 for his contribution to the British victory at the 1757 Battle of Plassey, when the Indians and their French allies were defeated. Walsh returned to England with an estimated fortune of £147,000 (equivalent to over $12 million today, but worth probably much more since property in those days was much cheaper) and became a scientist.

 

The Battle of Plassey was won partly because the British bribed the leading general of the nawab of Bengal to change sides for £60,000, which surely meant that what Walsh received was a ransom for a kingdom. Clive returned to Britain with a fortune of at least £300,000. It paid to be conquerors.

At the end of the 18th century, the difference in population and GDP size between the East and the West was amazing. There is no accurate data for 1760 or 1800, but according to leading economic historian Angus Maddison, the population of Western Europe in 1820 was only 133 million, whereas China had 381 million and India 209 million. The US population was only 10 million.

 

At that time, China accounted for 32.9 percent of world GDP, compared with 16 percent for India, 23 percent for Europe and 1.8 percent for the US. By 1950, when China and India became new republics, they had declined respectively to 4.5 percent and 4.2 percent of world GDP, whereas the US accounted for 27.3 percent and Western Europe 26.2 percent.

The difference could not have been more contrasting in terms of knowledge and economic power. The Chinese Imperial Encyclopedia, commissioned by the Emperor Kangxi in 1800 and completed after 26 years, contained all extant knowledge in China at that time and comprised 10,000 volumes and 170 million characters. But that knowledge was useless in the face of superior scientific and practical technology that propelled the West in the Industrial Revolution.


Shorter Bjorn:

(#299591)
brutusettu's picture

"We cannot use any effort to slow the leak in the boat, not until we find a plug that stops the leak even better."

 

Bjorn still seems to carry way too many artifacts from his AGW-denialism era; he reads like he's concern-trolling non-denialists.

 

fwiw


"Jazz, the music of unemployment."

 

Frank Zappa

We don't really need efficient photovoltaics.

(#299589)

What we need is cheap photovoltaics, which is another thing entirely. It's not as if useless land and sunlight are scarce resources that have to be used efficiently. I don't know the details of the Solyndra deal, but in general, development work on manufacturing more or less conventional cells cheaply is probably a higher priority than getting efficiency from 20% to 40%.

 

That's a bit of an American POV

(#299606)

Land is scarce in parts of Asia and Europe, and in some parts of those sunlight is scarce as well.

 

Even in some American urban environments, like the northeast, higher efficiency is an important trait.

 

I'll grant that on a global level cost is more important. But cost has come down consistently for a long time and is now approaching the long-sought dollar per watt level, which should prove to be an enabler.

 

You can thank massive subsidies, mostly in Europe, for the economies of scale that have gotten us to this point.

I am not a pessimist. I am an incompetent optimist.

If energy payback time is a concern

(#299610)

the trophy should be for one-dollar-per-watt solar cells made (starting from sand) in a plant that is itself powered by solar cells.

Maybe in Texas, where land and sunlight are non-scarce

(#299602)
mmghosh's picture

resources.  And you have efficient and intelligent grids.  

 

I disagree about efficient PVs. Of course they have to be cheap. But there are several other issues. Over here, we're looking at solar as disruptive tech - in the same way mobile telephony bypassed landlines. We then don't have to build out grids, and staying off-grid is always useful when hit by a natural calamity. Add in very dense populations, so you have a small roof area. To be truly effective, a solar unit should power (1) lighting (2) a water pump (3) a cooker (4) extra necessities - fan, fridge, phone charger etc. We need efficient solar, and efficient battery tech, too.

 

Efficiency also works better where the winter sun is weak.

Really?

(#299608)

There are people in India who want solar but are limited by roof area rather than income?   Even in very wealthy neighborhoods here few people could afford to cover their whole roof with solar cells.  Until prices get to that level I'd say W/$ is more important than W/m^2.

 

Also, electric grids are expensive and unreliable, but nowhere close to the cost, safety, and environmental burden of having three days' worth of battery in every house. And again, I'd argue that J/kg is less important than J/$ and finding battery technology that's more benign.
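The W/$ versus W/m² trade-off can be made concrete with a rough sketch. All numbers here are illustrative assumptions: ~1000 W/m² is the standard test-condition peak insolation, and the efficiencies, prices, and roof size are hypothetical, not vendor data.

```python
# Rough sketch of the W/$ vs W/m^2 trade-off for rooftop solar.
# All numbers are illustrative assumptions, not vendor data.
PEAK_INSOLATION = 1000.0  # W/m^2, standard test condition

def panel_metrics(efficiency, price_per_watt, roof_area_m2):
    """Return (peak watts obtainable from the roof, cost to fill the roof)."""
    watts = PEAK_INSOLATION * efficiency * roof_area_m2
    return watts, watts * price_per_watt

# A small shared roof (say 20 m^2 per flat) with assumed-cheap 15% panels
# versus assumed-pricier 20% panels:
cheap = panel_metrics(0.15, 1.0, 20)   # -> (3000.0 W, $3000.0)
prem  = panel_metrics(0.20, 1.5, 20)   # -> (4000.0 W, $6000.0)
```

When roof area is the binding constraint, as in the dense-city case above, the extra watts from the higher-efficiency panel may justify its premium; when budget is the binding constraint, the cheaper panel's W/$ wins.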

Yes. Myself, for one.

(#299634)
mmghosh's picture

Fair enough

(#299641)

I now know you have either a very small roof or a lot of money,  or both.

 

Super-high-energy-density batteries would be very cool for a lot of things; you could even have electric airliners. Unfortunately, after talking to chemist colleagues who do battery research, it doesn't appear that we'll be adding extra digits to energy density anytime soon. I think the best bet would be reversible fuel cells that are robust, closed-cycle, and low-maintenance, so that they are a "battery" from the user's point of view. Even there, the ultimate storage density is not going to be much higher than (say) gasoline, but then gasoline is pretty good. Something the size of a microwave oven might be able to run a house for 24 hours.

I live in an apartment, with a shared roof (8 flats)

(#299661)
mmghosh's picture

a standard concrete block - like a lot of people who live in apartments.

 

Perhaps most Americans don't live in apartments, but 10 billion people by 2050 means a huge increase in apartment living.

 

Even a fridge-sized battery pack would be acceptable. Now, maybe you haven't lived in the 3rd world, but for most of us 3rd-worlders a 12-hour power supply, and therefore 12-hour inverters, are a fact of life, and we plan for them; so it would not be a huge move for us to have inverters (and their batteries) charged via an efficient off-grid solar power system.

 

Even ACs can run off inverters. Ignoring the advertising babble, Hitachi shows the way forward. We don't have it yet, but we are seriously considering it.

i-TEC, the DC INVERTER air conditioner (Cooling type), an air conditioner which is made in India and made for India. It combines 8 Direct Efficient Technologies that are packed into its compact and stylish indoor unit, ensuring energy saving and ultimate comfort.

Fuel Cells, Not Batteries, Are the Future

(#299643)

As another anecdatal point to your chemist friends' comments on the limitations of chemical batteries: several of the world's largest auto manufacturers have recently announced collaborative efforts to develop fuel cell powertrains. (BMW and Toyota* have formed one such collaboration; Daimler, Ford, and Nissan have formed another.)

 

----------------

*An "axis of fuel cell"?

The math doesn't work, though.

(#299646)

Where is the hydrogen going to come from? Right now it comes from natural gas. Producing hydrogen from water requires way too much energy.

 

The only thing going for fuel cells is that hydrogen distribution looks like gasoline distribution. You need to go to a hydrogen station to fuel up. And, it would be produced by oil companies. In other words, the fossil fuel industry wants this and is likely to be behind the effort in one way or another.

 

Fuel cell car prototypes have been around for a decade or more. That collaboration you mention is talking about starting production in 2017, though no specific models are mentioned. Always in the future.

I am not a pessimist. I am an incompetent optimist.

No (net) hydrogen

(#299650)

What I'm suggesting is closed-cycle fuel cells. Not a power source, just power storage. You input electricity to run them backwards and charge them like a battery; they generate (and store internally) the hydrogen. Then you let them run in regular fuel-cell mode when you want the power.

 

The electricity needs to come from....nuclear, at least for several decades.

 

That's OK in principle...

(#299656)

But what is the efficiency of conversion? I don't think it's better than batteries. I think it's worse actually.

 

Though, if you did it that way, you could keep the oxygen too and avoid bringing in impurities from the air that would damage the fuel cell membranes.

I am not a pessimist. I am an incompetent optimist.

Definitely not as efficient as batteries

(#299663)

you're right about that.   But hydrogen+oxygen stores a lot more energy per unit weight than a battery,  so I think you could (ultimately, not now) get more range between charges.
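The per-kilogram comparison can be sketched with commonly cited ballpark figures. These are assumptions, not measurements: hydrogen's lower heating value is about 33 kWh/kg, a typical lithium-ion cell stores roughly 0.2 kWh/kg, and the tank, stack, and conversion losses eat heavily into the raw advantage.

```python
# Ballpark specific energies - commonly cited figures, not measurements.
H2_LHV_KWH_PER_KG = 33.3   # lower heating value of hydrogen
LI_ION_KWH_PER_KG = 0.20   # typical lithium-ion cell

def system_kwh_per_kg(fuel_kwh_per_kg, fuel_mass_fraction, efficiency):
    """Usable electrical energy per kg of the WHOLE storage system.

    fuel_mass_fraction: share of system mass that is actually fuel
    (tank and stack take the rest); efficiency: conversion to electricity.
    """
    return fuel_kwh_per_kg * fuel_mass_fraction * efficiency

# Even if only ~5% of a pressurized system's mass is hydrogen and the
# fuel cell converts at ~50%, the system still beats the raw cell figure:
h2_system = system_kwh_per_kg(H2_LHV_KWH_PER_KG, 0.05, 0.50)  # ~0.83 kWh/kg
```

So the weight advantage is real, but a factor of a few rather than the raw two-orders-of-magnitude gap the headline numbers suggest.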

Sure

(#299665)

But hydrogen isn't that easy to store. You need a really high pressure tank and associated hardware, valves and so on, and then the fuel cell stack.

 

Batteries like those on the Model S are much better today, but they are still far from ideal. Power-to-weight is not something I care about strongly right now. It's good enough to launch a Model S from 0 to 60 in a bit over four seconds and to give a range of 260 miles. The weight of the battery below the floor gives a low center of gravity, and puts mass for crash safety right below the cabin, where it is most useful.

 

I care about power to cost, loss of capacity, and product lifecycle (specifically that metals such as cobalt be totally recoverable).

 

If you could have batteries just like Panasonic batteries in the Model S, but at half the cost and twice the lifetime, you could have a very good, practical, long-range and long-lasting EV for USD 30K or so. If the batteries can be fully recycled, then you can produce them pretty much indefinitely.

 

I am not saying a higher power density would not be nice. It would probably be coupled with lower cost since the unit capacity of each cell would be greater, but the cost of manufacture would be similar. Also, even longer range is a nice to have. But from what I have seen, cost and lifetime are bigger issues right now.

I am not a pessimist. I am an incompetent optimist.

Time Will Tell

(#299648)

My suggestion is that these five very large auto manufacturers, who have been designing, manufacturing, and marketing battery-based alt vehicles, are sinking resources into fuel cells because their experience with batteries tells them batteries are a bridge to something better.

 

More anecdata: I stopped at a Shell gas station in Orange County a few weeks back that had an island for hydrogen. So big oil is in favor. That's not terrific, but then again, they've got a ton of resources to throw at the problem. I'm not good at math, but surely you get some CO2 relief from the switch to natural gas, centralized at an H2 refinery and therefore easier to clean and sequester than tailpipe emissions (so even more CO2 savings). Besides, most H2 advocates see natural gas cracking as a bridge, too.

 

And more: Honda, which doesn't have quite the resources of the five noted above, still leases its fuel cell vehicle in So. Cal., AFAICT. Sure, the program is limited to a few hundred cars, and expensive, not unlike GM's limited rollout of its EV-1 (also a pie-in-the-sky dream according to skeptical observers) so long ago. But that's an actual consumer automobile, on the road today.

 

These things don't just drop out of researchers' labs and hit the road and go. They are iterated. 

Actual Consumer What?

(#299649)

There is nothing consumer about the Clarity. It is highway legal, but it's not even in low volume mass production, cannot be bought, and has the cost of a small airplane.

 

There are about 50 Honda FCX Clarity cars in the world, 20 of them in SoCal. It has been estimated that they cost a million dollars apiece to build. It can only be leased.

 

For comparison, Tesla builds that many Model S fully electric sedans every day.

 

I liked the hydrogen idea - who likes batteries? I even spent $100 on a toy reversible fuel cell car, which I experimented with (the fuel cell turned out to be short-lived, even with some care on my part).

 

But the reality is that there is no raw hydrogen on Earth, and no process to get hydrogen anywhere close to economically from water and electricity. The reality is also that fuel cells are far more expensive than batteries. But the worst problem is that it is a fossil fuel play.

 

The reason these car makers are doing this is not that it's sensible. They are doing it because it is oil company sanctioned greenwashing.

I am not a pessimist. I am an incompetent optimist.

Greenwashing

(#299670)
Bird Dog's picture

Good word.

"Transparency and the rule of law will be the touchstones of this presidency."

--Barack Obama, January 2009

Solyndra

(#299593)

As I understand it, the company was essentially blindsided by the rapid drop in prices due to Chinese production. It also lacked good cost control, but arguably, less wasteful spending would not have solved the underlying mismatch between its business model and the changed market conditions.

I am not a pessimist. I am an incompetent optimist.