> And no, electric cars aren’t the answer either; the power to run them has to come from somewhere. The best case is that people will charge them off the grid at night. This will require power plants to be burning just as much additional fuel as if the cars themselves were doing it, perhaps more given transmission losses.
It has to be said. This guy is simply lazy, and did not do the research necessary to have informed opinions.
Here are the facts I gathered in about 5 minutes. (It's pretty sloppy research, but hey -- I'm just posting a comment!)
Also factor in the alternative sources we could use to create electricity that are not practical for having on the car itself. Wind, solar, geothermal, etc.
If we could produce endless supplies of cheap renewable energy, it would be a no-brainer to switch to electric cars.
That's not necessarily true. To decentralize production it merely needs to be the better economic choice. There is no fundamental reason why polluting energy sources should be cheaper: this is only true with most current technologies in most current situations.
Also, electric car manufacturers seem to be great at making cars that get more km per MJ. They get 2.18 km/MJ, which is more than four times the efficiency of the Prius.
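Quick sanity check on that figure. (The km/MJ numbers are from memory of Tesla's white paper, so treat them as assumptions; the conversion itself is just arithmetic.)

```python
# Sanity check on the km/MJ comparison. The two input figures are
# assumed values recalled from Tesla's white paper, not measured data.
roadster_km_per_mj = 2.18   # EV, including charging losses (assumption)
prius_km_per_mj = 0.56      # hybrid, tank-to-wheel (assumption)

ratio = roadster_km_per_mj / prius_km_per_mj

# Convert km/MJ to the more familiar Wh/km: 1 MJ = 1e6 J, 1 Wh = 3600 J.
wh_per_km = (1e6 / 3600) / roadster_km_per_mj

print(f"EV advantage: {ratio:.1f}x")             # ~3.9x, i.e. "about four times"
print(f"EV consumption: {wh_per_km:.0f} Wh/km")  # ~127 Wh/km
```

So "more than four times" is roughly right if you accept the white-paper inputs.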
That seems like an excellent point. It looks like electric motors are pretty efficient, particularly big, powerful ones. Electric cars with 100 hp or more should achieve about 95% efficiency, according to this:
You don't connect them directly to the battery pack, though. You need "something" in between (power electronics) to drive the motor. I'd estimate the efficiency of such a controller to be around 80-90%.
"[In comments Eric S. Raymond] Says:
Um, what global warming? There hasn’t been any since 1998. Recently global average temperature has actually been dropping rather dramatically, enough to wipe out the last century of warming trend."
Good lord, did I really consider this guy a luminary in my teens? I am ashamed.
The thing about being a contrarian is that you end up wrong sometimes...possibly a lot of the time. I think we still need them, though, to keep us questioning common wisdom. And ESR, for all his flaws, is still an interesting character and a useful member of society.
My opinion of him isn't too high. He likes to bandy himself about as some sort of hacker hero, but in truth, he hasn't really contributed much to OSS. Combined with him standing up a packed audience at my college without even bothering to call and cancel, and his general nutjobbery (see his rants about guns or mysticism), I'm somewhat less than a fan.
We need wackos like Stallman. He's a little south of sanity, but damn, he's slung a lot of code and really changed the landscape of technology with his idealism.
> My opinion of him isn't too high. He likes to bandy himself about as some sort of hacker hero, but in truth, he hasn't really contributed much to OSS.
OSS projects ESR has been involved in or founded:
Fetchmail,
GPSd,
CML2,
The Cathedral and the Bazaar,
The Jargon File,
Terminfo/Termcap,
VC Mode/GUD in Emacs (ESR is the second biggest Lisp contributor to Emacs after RMS),
Contribs to Gnuplot, Gnome, Python, Groff and Nethack,
GNU sed,
Hexdump,
gif2png,
Bogofilter,
Countless Howtos at the LDP
With the exception of the above, you're right: he's not contributed much to OSS. I'm not saying ESR's not a polarising person (I've met both ESR and RMS; both can be very black-and-white people), but he has contributed a lot to OSS.
As noted below, his versions of (or contributions to) most of those software projects were pretty trivial. And listing "projects" like hexdump is kind of cute -- it's 211 lines of code.
The one claim that stood out there -- the one that made a really testable statement that would have surprised me if true -- was, "ESR is the second biggest Lisp contributor to Emacs after RMS". I thought, hey, there'd be a surprise, so I decided to run cvstat on the Emacs lisp subdir:
- RMS contributed the 2nd most code to Emacs' Lisp with 217542 lines of changes.
- ESR contributed the 39th most code to Emacs' Lisp with 6367 lines of changes.
The reason that I don't like the guy so much is because he claims to speak for a movement that, by his own description, is a meritocracy, and I don't feel like he has the credibility for that. Combined with the fact that I think a lot of what he says is bozo-riffic, I'd prefer he step back from his self-appointed spokesman position.
Pretty much. Most influential wackos tend to be single minded about some ideological point. Sometimes that gets them a peace prize, other times it involves invading neighboring countries. ;-)
I've talked to him. It's amusing. And inspiring. And a little frustrating. At least as of a couple years ago he'd still interrupt you every single time you said "Linux" to stick a "GNU" in there.
That's a long list, but most of them are toys, scripts or quick hacks. Even the ones that almost look significant at first glance -- e.g. bogofilter or sed -- not so much.
His version of bogofilter was 900 lines of code and his version of sed 1700. I'd guess that the total amount of code that he's written that gets packaged for a standard Linux distribution is maybe 5k LOC. That's being generous, honestly. And that's what I find disingenuous when he's busy talking about his prolific OSS background.
The comments about Stallman's sanity weren't meant to be taken literally. I assumed that was obvious.
Ok, you don't think that's much code/of very high quality/of much usefulness . . . very well, I now respectfully ask you to tell us how much/of what quality/how useful code YOU have contributed.
There is a difference between a contrarian and someone who is wrong on basic facts:
"And no, electric cars aren’t the answer either; the power to run them has to come from somewhere. The best case is that people will charge them off the grid at night. This will require power plants to be burning just as much additional fuel as if the cars themselves were doing it, perhaps more given transmission losses." --ESR
"Pacific Northwest National Laboratory calculates that there is enough excess nighttime generating capacity nationwide to charge 84 percent of the 198 million cars, pickups, and SUVs on the road today. Ideally, the energy charging those cars would come from carbon-free sources, which would reduce almost to zero the greenhouse gas emissions per mile. But a joint study by Electric Power Research Institute and the Natural Resources Defense Council found that even if a plug-in vehicle got all its electricity from coal-fired plants (the U.S. electricity grid is about 50 percent coal), it would still emit only two-thirds of the greenhouse gases released by a conventional car. Over the next thirty years, the study concluded, widespread adoption of V2G could eliminate 450 million tons of carbon dioxide annually, the equivalent of retiring a third of the current fleet." From Earth: The Sequel (p. 226)
"This only means no new power plants. The current ones would have to step up production at night."
I realize that. However, also realize that wind power is running at peak production during the night, which is when people would be charging their cars. Also realize the 2/3 number was for if 100% of the power was coming from coal, which it obviously wouldn't be.
However, it accounts for about a third of all new electricity production. You have to remember that a cap-and-trade bill is going to be what makes both V2G and renewable energy profitable, so if it becomes profitable to build V2G cars then it will also be profitable to increase wind capacity. I forgot to mention that solar thermal is also an excellent candidate for creating electricity at night, since you can store excess hot water very efficiently during the day and then use a small amount of natural gas to make sure it's at the optimally efficient temperature to spin the turbines.
[Windpower] accounts for about a third of all new electricity production.
No. Here is the Wikipedia quote: "Wind power accounted for 35% of all new U.S. electric generating capacity in 2007." (Emphasis mine.) Capacity is not production. Capacity-factor helps us understand how much production we can expect from a given capacity. Here are the capacity-factors of two recently-installed wind turbines:
Hull 1 is a turbine that was built to replace an older one that -- as frequently happens to wind turbines, but has never happened to a nuclear power reactor -- had been destroyed in a storm. (http://www.hullwind.org/history.php) Being destroyed in a storm further hurts lifetime capacity-factor.
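To make the capacity-vs-production distinction concrete, here's a toy calculation. (The capacity factors below are rough, commonly cited ballpark figures, not measured data for any particular plant.)

```python
# Capacity is not production: expected annual output =
# nameplate capacity * capacity factor * hours in a year.
HOURS_PER_YEAR = 8760

def annual_mwh(nameplate_mw, capacity_factor):
    """Expected yearly energy output in MWh."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR

# Ballpark capacity factors (assumptions): ~0.30 for onshore wind,
# ~0.90 for a US nuclear plant.
wind = annual_mwh(100, 0.30)
nuclear = annual_mwh(100, 0.90)

print(f"100 MW wind:    {wind:,.0f} MWh/yr")
print(f"100 MW nuclear: {nuclear:,.0f} MWh/yr")
# Same nameplate "capacity", roughly three times the "production".
```

Which is why "35% of new capacity" overstates wind's share of new production considerably.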
Only if these percentages are in series. Since we're talking about the same power plants, it seems that 1/3 is all we get.
Checking the study in question (first hit for 'study "Electric Power Research Institute" "Natural Resources Defense Council"'), it seems they did _not_ take into account the production and maintenance costs of each type of vehicle.
I'm skeptical of technologies that promise incremental conservation through massive consumption of other resources (battery material).
"Only if these percentages are in series. Since we're talking about the same power plants, it seems that 1/3 is all we get."
To quote E:TS again:
"Without any major breakthroughs, vehicles that are little different from today's could use one-third the energy per mile, says John DeCicco, Environmental Defense Fund's specialist in automotive strategies. That alone would radically reduce greenhouse gas emissions. If those cars ran on a biofuel made from renewable feedstocks with one-fourth the lifecycle greenhouse gas emissions of today's gasoline (Amyris's biogasoline, for instance, or Verenium's cellulosic ethanol), then the emissions per mile would be one-twelfth what they are today, a reduction of 92 percent. It was the feasibility of such options that, in September 2007, caused Vermont U.S. District Judge William K. Sessions to reject manufacturers' challenges and rule that they could meet California's new standards, requiring carbon dioxide emissions in new cars to be cut about 22 percent in the first phase (2009 through 2012) and 30 percent in the mid-term phase (2013 to 2016). Given the expected doubling by midcentury of vehicle miles traveled in the United States, however, California and the rest of the country will have to go much further--reducing automobile emissions about 80 percent." (p. 229)
The book lists a whole boatload of other ways to save massive amounts of energy with today's technology, many of which are listed in my notes of chapter 9: http://alexkrupp.com/earth.html
The sections on weather prediction, clean cement, carpeting, and fans are insanely cool.
Yeah, exactly. I was picking on electric cars because they only address a subproblem; other technologies are certainly more promising.
...
"Another innovation came in the carpet pattern itself. The company's top-selling pattern, called Entropy, mimics the disorder of a forest floor with its strewn leaves, pebbles, and twigs. That randomness means that the pattern needn't match up from tile to tile, but can be laid out in any direction, eliminating the huge amounts of scrap normally generated at installation. It means few tiles are rejected at the factor: imperfections get lost in the wandering variations of color. It also means the carpet lasts a long time, because worn or stained tiles can be swapped out without replacing the rest." (p. 215)
It's pretty good. (Disclaimer: my dad is the co-author.) It's a little technical in places, especially in the solar photovoltaic chapter, but overall it's far superior to getting your education on renewable energy from articles in Wired or the thinly rewritten press releases that get posted to Reddit and news.yc. It's a current events type book and I don't think it's so insightful that people will be reading it fifty years from now, but it's still definitely worth a read since we're about to see the biggest economic boom in history (as soon as cap-and-trade gets enacted next year) based on the technologies and ideas in the book. I read it twice and I feel like I have a pretty good grasp on the issues now, especially after taking notes the second time. I'd still like to read more books in the same topic area, but right now I'm working through a pile of educational theory stuff.
It's 2/3, but it can come from coal. There are very significant political/economic advantages to that. A simultaneous 1/3 environmental advantage is icing on the cake.
Sure. You can gauge a hacker's grip on reality based on what specific year they stopped taking him seriously. One does not get granularity that fine very often. Few people spend a day a week composing more and more whacky odes to junk science and more and more possessed rants against the cultural tapestries of entire continents. Even fewer people manage to slowly but consistently rack it up over the better part of a decade. Every field should get itself some Raymond or other, really.
Now excuse me while I go sip some latte, nibble some brie, and appease some Islamofascists with my fellow Idiotarians.
In one important study, Dr. Clenton Owensby and colleagues studied ruminants grazing in a FACE range with a doubled CO2 level on Kansas's grassy rangelands. Their studies into the impact of increasing carbon dioxide on those rangelands revealed a surprising effect: the grass these animals foraged from the FACE range had less nitrogen and correspondingly less protein. The protein content and digestibility, even by the four highly efficient stomachs of ruminants, were reduced at increased CO2 levels.
Geothermal is like hydropower, economically speaking, but requires unusual geology. Basically the only place it can work on a large scale is in Iceland, home of a full third of the world’s active volcanoes.
Apparently, advances in drilling technology, and developing the tech to fracture hot dry rock, could greatly expand the range of possible geothermal sites.
What I found most sad about this is that in the entire post he didn't post a single link. Not one.
There are a lot of new technologies discussed, and I for one would like more information so I can make an informed evaluation. This post doesn't provide it.
Playing around with some figures sourced from Wikipedia, and assuming a best-case scenario, meeting the demand for oil would require an area the size of Germany to be turned over to algae oil production.
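For what it's worth, here's a back-of-envelope version of that estimate. Every input is a rough assumption (approximate late-2000s US oil consumption and an optimistic algae yield), so take the output as order-of-magnitude only:

```python
# Back-of-envelope check of the "area the size of Germany" claim.
# All inputs are rough assumptions, not measured figures.
us_oil_bbl_per_day = 20e6          # approx US oil consumption (assumption)
liters_per_bbl = 159
algae_yield_l_per_ha_yr = 50_000   # optimistic open-pond oil yield (assumption)

demand_l_per_yr = us_oil_bbl_per_day * liters_per_bbl * 365
area_ha = demand_l_per_yr / algae_yield_l_per_ha_yr
area_km2 = area_ha / 100           # 1 km^2 = 100 ha

print(f"Area needed: ~{area_km2:,.0f} km^2")
# Germany is about 357,000 km^2, so this lands in the same ballpark.
```

With these inputs it comes out a bit smaller than Germany, but the order of magnitude holds either way.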
One of the points raised in the post that I didn't quite understand was tidal power being presented (although it isn't explicitly stated) as though it isn't time variable. The last time I checked the tide went in and then out twice a day. You can trap it in a reservoir and then slowly release it but there are going to be periods when it generates no power.
You can trap it in a reservoir and then slowly release it but there are going to be periods when it generates no power.
Why not use pumped storage? What is needed is elevation to overcome the head-deficiency inherent to the equilibrium periods. This is achievable by sacrificing some of the stored water, to pump a smaller amount of water to a higher elevation.
A valid point. A similar approach can be taken for solar and wind. Use excess energy to pump water into a higher reservoir which can be released during overcast/still periods.
The only difference I can see being that with tidal you know exactly when the next burst of energy will come and how large it will be.
Pumped storage is quite low capacity and low efficiency for the cost and surface area. 1 kg, that is, 1 liter of water, raised 20 meters is only about 200 joules. An Apple laptop uses 40 watts, which is that amount every five seconds.
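The arithmetic checks out, for what it's worth:

```python
# Gravitational potential energy: E = m * g * h
g = 9.81          # m/s^2
mass_kg = 1.0     # 1 liter of water
height_m = 20.0

energy_j = mass_kg * g * height_m   # ~196 J, i.e. "only 200 joules"

laptop_w = 40.0
seconds = energy_j / laptop_w       # ~4.9 s of laptop runtime

print(f"{energy_j:.0f} J, about {seconds:.1f} s at {laptop_w:.0f} W")
```

Utility-scale pumped storage works because the heads and volumes are huge (hundreds of meters, millions of tonnes), not because the per-liter numbers are good.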
I prefer not to pay attention to pseudo-racists with strong opinions and few proofs. Give me numerical proof any day over the type of race-baiting, unproductively negative rambling that ESR seems to love.
It's fairly well established that the mean IQ (i.e. g, or general intelligence factor) of people of African descent is lower than that of people of European descent, which is in turn lower than that of people of East Asian or Ashkenazi Jewish descent.
Quite apart from the argument you are proposing there -- what does that have to do with skin color? Everybody who lives at the equator, in all parts of the world, has dark skin; it has nothing to do with whether a person comes from Africa or not.
And it's not "fairly well established". Not in any way that one would deem scientific. It's fairly well established in the minds of bigots and in the minds of people who are looking to justify their denial of rights to certain groups of people, but it's not established in any meaningful way.
For example, take a look at the body of evidence showing that the earth is round. Now, using the same SCIENTIFIC criteria, show me the evidence that people of African descent are less intelligent than people of European descent -- you'll find that the 'studies' that were done are very fuzzy, very open to interpretation, use very skewed sample sizes, and are not pure at all.
What you are really trying to say is that in tests conducted in the 70s with tiny groups of African-American people of mixed race, who came from long-disadvantaged communities, there was a difference in problem-solving ability according to a certain metric, compared to affluent children from educated homes.
The evidence is weak and very unscientific. If you REALLY want to prove that IQ is lower, then I demand sample groups approaching 2000, I demand that the tests be taken in Africa and written in an African language, I demand that the socio-economic backgrounds of the participants be the same or similar, and I demand that all questions related to verbal ability be removed from IQ tests.
IQ has not been measured properly, and having lived on several continents, nothing I have seen indicates to me that there is any significant difference in IQ between people living on different continents, once you factor in their family background.
But people like you once found a study that fitted in with your narrow perception, and you hang on to it like it was the holy grail. It's flawed, and it's an argument similar to creationism.
nothing I have seen indicates to me that there is any significant difference in IQ between people living on different continents, once you factor in their family background.
What if family background is ignored?
.
I demand that all questions related to verbal ability be removed from IQ tests.
it appears that the variance [IQ subtests] share can be reliably and accurately indexed by reaction time on a task where subjects must merely press a lighted button. The correlation between such simple tasks and g is around .62, which is higher than the correlation between many subscales of IQ tests and the g factor to which they contribute.
.
IQ has not been measured properly
Is that what you gleaned from studying those two books (both listed as ISI Citation Classics)?
What I glean from this discussion is that people who are obsessed with proving another people inferior in some regard are not people I want to associate with. Particularly when I belong to the subgroup they want to prove as being inferior.
I mean what's your point? What do you hope to gain by telling me that people like me are less intelligent than people like you? And how do you think it makes me feel to read that?
This is not an attack on you personally, nor is it an attempt to scientifically "prove" that all whites are smarter than all blacks, or some other such nonsense. I'm of European descent and I'm positive that for any given race/ancestry, you can find someone who is strictly smarter than I am (by any metric you want).
The point you're missing by interpreting our statements as personal attacks is that we're talking about group mean differences. And the reason this is relevant is that for the past 40 years in the United States, inequality of outcome has been taken as evidence of inequality of opportunity (i.e., lower average African-American and Hispanic educational/economic achievement is evidence of discrimination by whites), thereby justifying affirmative action and other forms of income redistribution. This is a perfectly reasonable conclusion if you assume that all races have an equal distribution of cognitive abilities. What we're trying to say is that this assumption is unfounded.
"The combination of these problems [storage and transmission aren't 100% efficient] means that household energy conservation is mainly a way for wealthy Westerners to feel virtuous rather than an actual attack on energy costs. Household conservation slightly decreases the maximum capacity needed locally where the conservation is being practiced, but has little impact further away, where demand has to be supplied by different plants."
Errm... doesn't the fact that transmission is less than 100% efficient just multiply up the gain? If I lose 50% of power sending it to a house, and that house saves Y kWh, I need to generate 2Y fewer kWh? I.e., the worse the transmission percentage, the more effect I get from efficiency at the point of use?
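That's the right way round, I think. A one-liner makes it clear (the 50% and 7% figures are illustrative assumptions, not real grid numbers):

```python
# If transmission efficiency is eta, then saving 1 kWh at the plug
# avoids 1/eta kWh of generation: losses AMPLIFY end-use savings.
def generation_avoided(kwh_saved_at_plug, transmission_efficiency):
    """Generation (kWh) that no longer needs to be produced upstream."""
    return kwh_saved_at_plug / transmission_efficiency

print(generation_avoided(1.0, 0.50))  # 50% losses: 2.0 kWh avoided
print(generation_avoided(1.0, 0.93))  # assumed ~7% losses: ~1.075 kWh
```

So the worse the grid, the more leverage conservation at the point of use has, exactly as you say.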
I haven't read the whole post yet but what I think he means is that if energy efficiency goes up locally there will be a bunch of power stations sat idling because it isn't efficient to send the energy long distances to areas where demand hasn't fallen. For example cutting demand in the US by 100MW won't stop a 100MW power station from being built in India.
It makes sense but only if you confuse efficiency and demand. Increased efficiency should decrease demand assuming all other things are equal. Surprisingly all other things aren't equal.
Well, the demand generally increases over time, so efficiency just means you can delay building that next 100MW plant in the US a while longer - which is useful.
I agree with his basic point that "a dense form of energy is useful for transport". But he doesn't seem to have much beyond that.
He also neglects the point that turning power -> useful work done is more efficient for electricity from a battery, which is pretty relevant. And for fuels, is more efficient in bigger engines/power plants.
Wikipedia tells me diesel engines get in the region of 45% efficient, petrol 30%, electric 90%.
High energy density is indeed key for mobile applications. This is why Diesels are so popular. 45% efficiency is 1/2 that of 90%, but the fuel is very energy dense and the fuel tank is compact. In addition, the storage and transfer technologies are simple, low-tech, and cheap.
The key is that not everyone needs that kind of power. Most of the people here in Texas running around in work trucks would be just fine with one of those little euro coupe utility vehicles that's as small as a compact car. It's only in a minority of cases where you need the heavy hauling capability of even an F-150.
Absolutely. Demand is, to my knowledge, rising and showing no signs of stopping any time soon. The aim of efficiency drives is to slow this rise, not lead to the decommissioning of power stations.
This is strongly false. Utility scale flywheel systems can store massive amounts of energy, and can do so quite cheaply.
So can compressed air. If you look at household electricity usage, an enormous amount is taken up by refrigeration, and air conditioning. Much of the rest is taken up by appliances: washers, dryers, that sort of thing. Finally there's heating and lighting and electronics. The major energy expenses are now transport, which we're trying to address, stuff, which is hard, and air travel, which will have to be scaled down because it's polluting the environment like crazy.
Utility scale flywheel systems can store massive amounts of energy, and can do so quite cheaply.
Yet, electric utilities use lead-acid batteries, instead (providing ~30 seconds of backup power).
If utility scale flywheel systems are cheap, why aren't they being added to nuclear power plants (so those plants can be used more-effectively for peak power)?
Utility scale flywheel systems are being added to new powerplants, but it's been slow going. Utilities are not exactly dynamic organizations; my cofounder used to consult for one, getting them to change anything was an enormous pain. There are many growing companies in this area: Beacon Power is one in particular.
They aren't being added to nuclear powerplants because nuclear powerplants have a wide and controllable dynamic range (they can run higher or lower depending) and new ones aren't being created.
There are large scale installations of flywheel systems in government use, though. The Princeton Plasma Physics Lab used a flywheel system for peak power delivery, as did many other government labs; including particle accelerators.
Utility scale flywheel systems are being added to new powerplants
Can you name one?
Beacon Power
Has Beacon Power ever made a single sale in its 11-year existence? I see it is trading at $1.25 (up from 84 cents, within the last year). The NASDAQ keeps threatening to delist it.
On making sales, the short answer is yes: they announced a sale as far back as 2001, uncovered from shallow googling. This was for a telecom utility though, not a powerplant scale utility.
[edit: in H1 2007 they produced sales of $842,034, with $30,244 gross profit, minus substantial operating expenses of $6,443,344. Such magnitudes may or may not be expected in a startup company]
On adding to powerplants, the intent with the Beacon Power demonstration plant is to actually commercially deploy the system and make revenues from regulation services. Technically this is not adding to an existing powerplant, though it is adding to the power grid. Eventually the aim is to add to existing power plants where possible, as this avoids transmission losses.
I'm not intimately familiar with the state of the two large utility-scale demonstrations. The public information, posted in their shareholders' results release, is that they're building a 5 MW plant, have tested a 1 MW system, are siting another (10 MW), and are ramping up to production, following approval in the open-bid regulation market.
Beacon expects to have frequency regulation facilities in two locations before the end of 2008 with a total of five megawatts of capacity. To that end, the Company has initiated the process of establishing up to five megawatts of frequency regulation capacity on its Tyngsboro site, and is actively pursuing potential locations in the PJM Interconnection in addition to a site in Stephentown, New York. On July 17, 2008, the Company received a land-use permit it had requested from the town of Stephentown, New York, and subsequently exercised its option to purchase the land. Pending approval of an active interconnection request to NYISO and any other implementation requirements of the NYISO, a possible location for Beacon's first 20 megawatt frequency regulation plant will be in Stephentown.
There's also Pentadyne, which manufactures smaller scale UPS systems, used, for example, in process plants. The economics are much the same though, since it competes against batteries. If anything, economies of scale should tweak toward the utility scale implementation.
Actually, I don't think so. The accuracy of one's predictions must be weighed against the expected importance of their consequences on how one might decide to act. Obviously this is an issue important to both of us. But the energy problem is a big problem, perhaps the problem, with monstrous amounts of data being generated faster than one can argue about it in detail.
The goal is to act well, not to make zero mistakes. The arguments we've had have unearthed interesting data, but I would not suggest it's worthwhile to continue much further, unless, for either of us, it is genuinely productive. Some of the arguments and contrary positions you've taken have been very helpful for me. But I must admit, I am getting tired, and there is work to do.
They aren't being added to nuclear powerplants because nuclear powerplants have a wide and controllable dynamic range (they can run higher or lower depending)
Nuclear powerplants, typically, are capital intensive. For this reason, their operators aim for high capacity-factors. Unless economic forces are being ignored, they cannot easily be throttled. They run at 100%, all of the time, except when they are being refueled. The only nation that uses its nuclear powerplants in load-following mode is France -- since such a high proportion, some 80%, of its electricity is nuclear -- and France pays a price for that.
Additionally, it is not mechanically good for a nuclear powerplant to throttle up and down. Such cycling causes parts to wear out faster. If it is going to be used at all, it is best to keep it on, at a steady output.
There are other considerations. The nature of reactor poison build-up is such that throttling-up immediately after throttling-down tends, in typical reactor designs, to be difficult to do.
Unlike nuclear-powerplants, hydro and gas-turbine powerplants load-follow well (though the more-efficient gas-turbine powerplants take longer to start-up and might have to burn fuel, idling).
I will confess I know less about fission power plants than you seem to. Good point on the capital intensity. I suppose this does answer why major flywheel installations are not present at nuclear powerplants.
One thought: why aren't there flywheel systems at wind turbines, moderated by a CVT? This would get rid of a lot of transformation losses, and since a motor is unneeded it would be fairly cheap.
There are two methods to throttle nuclear plants: by control rods, moderating neutron flux, and by temperature, moderating coolant flow (in, say, a pebble bed reactor). Either of these works well for generating heat, though coolant flow is simpler technically. Beyond this the plant is a steam turbine, which would have the same maintenance problems as any other steam turbine designed for the load. It's not as fast as gas, but the fuel is cheaper. But as DabAsteroid pointed out, nuclear plants are typically base-loaded, in the US at least, so that's not terribly relevant.
I don't understand enough of the specifics of the particular designs Dab alludes to to comment on reactor poison buildup. I know that in pebble bed reactors throttling was intended by design.
I think that within the US market (which was, erroneously on my part, what I assumed the conversation was about), the point about new power plants not yet being created still holds: whatever load balancing one would need is an investment already made (though this is mostly irrelevant to the point of discussion, where nuclear plants don't supply peak demand). Of the proposals in which nuclear plants would supply peak power, the question is still out there: how will they balance the load? There are many possibilities, but they're mostly guesses. There is, increasingly, a diverse market for load-balancing technology, and companies, struggling with technology and sales and tradition and economy, trying to fill that need.
There are also resistive shunts that can be used in conjunction with an entire national 100%-nuclear fleet running at full throttle full-time. The resistive shunts would be used to simply dump any excess power. We use resistive shunts for this, today (which is how we get rid of wind and solar power that grid utilities are required to "buy" from the public), but in the proposed scenario, they would be used even more. To encourage demand during times of excess supply, a real-time free-market could help.
I envision that demand fluctuations would be smaller in the future, because of larger overall demand dwarfing weather/seasonal/earthspin-related (paganistic) heating and cooling fluctuations, and because of a global move toward a continuous day (after all, many of us are up in the "middle of the night", here, making our "day" when we feel like it).
Alt-energy contrarians very often make the same argument: "We'll never be able to supply all power needs from renewables. Therefore the whole alt-energy movement is a doomed, loopy pipe dream."
Has it ever been the goal of serious alt-energy boosters to replace all power with renewables?
Asymptotically. We have three problems: power capacity, power on demand, and power portability. Renewable energy sources can solve the capacity problem, if our energy consumption goes down through efficiency and changing habits (in particular, if we stop flying everywhere...). The main thing we must optimize for is energy per day versus cost.
Power on demand is another major problem: we need to supply a 'peak' of power when people demand it. Renewable energy production wavers, so we need to store some proportion of its energy somehow. The main thing to optimize for here is energy capacity versus cost, assuming you can get the power out fast enough. Efficiency also factors in, though not hugely; anything over 50% is okay.
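To see why capacity cost can matter more than round-trip efficiency once you're above roughly 50%, here is a toy calculation. All prices, cycle lives, and efficiencies below are invented for illustration; none are real market figures.

```python
# Toy model of stored-energy cost: amortized capital plus the extra
# generation needed to cover round-trip losses. All numbers are
# illustrative assumptions, not real data.

def cost_per_delivered_kwh(capital_per_kwh, cycle_life,
                           gen_cost_per_kwh, round_trip_eff):
    """Cost of one kWh delivered from storage, amortized over its cycle life."""
    amortized_capital = capital_per_kwh / cycle_life
    generation = gen_cost_per_kwh / round_trip_eff  # losses mean buying extra energy
    return amortized_capital + generation

# A cheap 60%-efficient store versus an expensive 90%-efficient one:
cheap = cost_per_delivered_kwh(100.0, 5000, 0.05, 0.60)
fancy = cost_per_delivered_kwh(400.0, 5000, 0.05, 0.90)
print(f"cheap/60%: ${cheap:.3f}/kWh   fancy/90%: ${fancy:.3f}/kWh")
```

With these assumed numbers the cheaper, less efficient store delivers energy at a lower total cost, which is why efficiency "factors in, though not hugely."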
Portable power is most problematic. We need an energy source that's clean, but that has a high energy density, measured relative to both weight and volume. Solving each problem leads to different possibilities.
For example, if you solve the weight problem, but it takes up tons of room (hydrogen), you might be able to power airplanes. You can also power trains.
If you solve the volume problem, you can easily power automobiles and two-wheeled vehicles (for example, hydrocarbon fuel cells work pretty well here, and we're trying to make compressed-air vehicles that will too).
If you have solved the low-cost and efficiency problems, and the fuel doesn't weigh too much, you can power ships -- in particular, container ships. If the fuel is buoyant you can make a double-hulled vessel and store the fuel in it (compressed air works very well here).
If ... it doesn't weigh too much, you can power ships, in particular, container ships. If the fuel is buoyant you can make a double hulled vehicle and store the fuel in it. (compressed air works very well here).
Why would the fuel need to be buoyant? Ships use steel hulls, steel is denser than seawater, and, despite that, ships float. Some ships are even powered by lead-acid batteries (http://en.wikipedia.org/wiki/Submarine#The_snorkel) -- though these batteries are recharged by onboard diesel generators -- and yet they float. Aircraft carriers and icebreakers are even powered by uranium -- about 19 times the density of water.
Pressurizing the space between the two hulls of a double-hulled boat would require thicker steel than normal. (For a cylinder, the tensile load per inch of seam is the air pressure in psi multiplied by the radius in inches; for a sphere, divide by two.) Since steel is denser than seawater, this would tend to make the vessel sink (so the fuel would hardly be buoyant unless it were further contained between the hulls in carbon-fiber cells -- but that would obviate the need for two hulls). It would also affect the shape of the boat, since containing pressure is easier with a smaller radius of curvature (e.g. the usual flat sides of container ships would work against the task of retaining air pressure).
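The thin-wall pressure-vessel arithmetic above can be checked numerically. The pressure, radius, and allowable stress below are made-up illustrative values, not a real design:

```python
import math

# Thin-wall pressure-vessel check for a compressed-air cargo hull.
# Every number below is an invented illustrative assumption.
P = 3000.0             # storage pressure, psi (assumed)
r = 60.0               # hull tube radius, inches (assumed; ~3 m diameter)
sigma_allow = 30000.0  # allowable steel stress, psi (assumed)

# Cylinder hoop stress sigma = P * r / t  ->  required wall thickness:
t_cyl = P * r / sigma_allow

# Rough buoyancy check per unit length of tube (ignoring the weight of
# the compressed air itself, which would only make things worse):
rho_steel = 7.85   # g/cm^3
rho_sea = 1.025    # g/cm^3
area_displaced = math.pi * (r + t_cyl) ** 2         # outer cross-section, in^2
area_steel = math.pi * ((r + t_cyl) ** 2 - r ** 2)  # steel shell cross-section, in^2
buoyant = area_displaced * rho_sea > area_steel * rho_steel

print(f"required wall: {t_cyl:.1f} in; steel shell alone buoyant: {buoyant}")
```

With these assumed numbers the steel needed to contain the pressure outweighs the seawater it displaces, illustrating the point that a pressurized steel hull tends toward sinking; at much lower storage pressures the balance can flip, but then the stored energy shrinks too.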
An efficient design of such a container-ship might have two or three long, parallel tubes as hulls, with a container-platform mounted above on blade-pylons (to slice through the water). Optimally, the tubes would remain entirely submerged during cruising, so as not to interact with surface waves (which interaction normally causes efficiency losses).
The principle was to store the air in the volume enclosed by the hull, not in between the hull casings.
One of the major costs in storing air at high pressure is the impact shell. This can be merged with a thick hull, which is already designed with that purpose in mind.
True, there are other alternatives. One can have a heavy fuel and a much larger hull. But if the fuel and tank and hull combo are buoyant you minimize the required material, lowering cost.
Alt-energy contrarians very often make the same argument: "We'll never be able to supply all power needs from renewables.
ESR said the opposite.
The industrial base load is the life blood of technological civilization; without it, we’d have a hideous global population crash, and then revert to pre-1750 conditions in which the economy is almost entirely subsistence farming and life is nasty, brutish, and short.
Hence, we could easily supply all of our power needs from renewables -- if we didn't mind getting medieval.
These designs are quite silly. In fact, the best location for a solar updraft tower is between the forest floor and its canopy. Indeed, there are already billions of implementations of the concept, taking advantage of the entropy differential between the strata of forest biospheres: trees.
Accessing those 70,000 years' worth is very costly. The carbon in accessible fossil fuels is around 1600 gigatons, which will be exhausted much closer to 100 years from now than 70,000. Burning it all would push our atmospheric carbon content to around 2200 gigatons, nearly four times the current content of 600 gigatons.
Based on physical models pointing to similar conclusions from a plethora of different directions, this will raise temperatures. By how much? We'd reach approximately 1650 ppm of CO2. At only 1000 ppm, many models predict a temperature rise of 3.5 to almost ten degrees C (varying across the world). Most importantly, the heat will mostly be circulated to the polar regions, melting them. The last step of the argument is well known: at 4.5 degrees C, the land-borne glaciers will begin to melt, drowning entire cities and countries. Before that happens, 20-50% of species will be lost, and billions will face water shortages. The resulting ecological niches and prevalence of human hosts may end up creating many new diseases.
That might objectively be the case, but it is not the case, according to ESR -- and it is his article that we are discussing. Did you read the article?:
[I] don’t believe [CO2 emissions are] driving global warming. ...
The pressing question, then, remains: What’s going to replace oil?
So, aside from greenhouse-gas considerations, why replace oil?
.
accessible fossil fuels ... will be exhausted in much closer to 100 years than 70,000.
For 150 years, oil supplies have continuously been pronounced to be on the verge of running out. And for 150 years, oil production has continuously increased.
• 1879 -- US Geological Survey formed in part because of fear of oil shortages.
• 1882 -- Institute of Mining Engineers estimates 95 million barrels of oil remain. With 25 million barrels per year output, "Some day the cheque will come back indorsed no funds, and we are approaching that day very fast," Samuel Wrigley says. (Pratt, p. 124). ...
• 1906 -- Fears of an oil shortage are confirmed by the U.S. Geological Survey (USGS). Representatives of the Detroit Board of Commerce attended hearings in Washington and told a Senate hearing that car manufacturers worried "not so much [about] cost as ... supply."
• 1919 -- Scientific American notes that the auto industry could no longer ignore the fact that only 20 years worth of U.S. oil was left. "The burden falls upon the engine. It must adapt itself to less volatile fuel, and it must be made to burn the fuel with less waste.... Automotive engineers must turn their thoughts away from questions of speed and weight... and comfort and endurance, to avert what ... will turn out to be a calamity, seriously disorganizing an indispensable system of transportation."
• 1920 -- David White, chief geologist of USGS, estimates total oil remaining in the US at 6.7 billion barrels. "In making this estimate, which included both proved reserves and resources still remaining to be discovered, White conceded that it might well be in error by as much as 25 percent." ...
• 1928 -- US analyst Ludwell Denny in his book "We Fight for Oil" noted the domestic oil shortage and says international diplomacy had failed to secure any reliable foreign sources of oil for the United States. Fear of oil shortages would become the most important factor in international relations, even so great as to force the U.S. into war with Great Britain to secure access to oil in the Persian Gulf region, Denny said.
• 1932 -- Federal Oil Conservation Board estimates 10 billion barrels of oil remain.
• 1944 -- Petroleum Administrator for War estimates 20 billion barrels of oil remain.
• 1950 -- American Petroleum Institute says world oil reserves are at 100 billion barrels. ...
• 2000 -- Remaining proven oil reserves put at 1016 billion barrels.
.
Oil production has continuously increased as society has gotten continuously better at finding and exploiting the oil in the earth's crust. Why should we assume that that process would stop any time soon -- especially in the face of estimates of total in-place oil that put our supply lifetime in the tens of thousands of years?
You cannot separate the argument from the objective reality the argument claims to describe.
Regarding the wild swings in the estimates of remaining oil, the largest discrepancy comes from the fact that oil in different places requires different amounts of effort to extract. At a certain point, that effort is too much to pay for all but a few applications (where energy density is needed most, or where the stored hydrocarbons can be used for other purposes).
Already the oil infrastructure is some of the most complicated and costly equipment in the world. We don't know where that economic break-even point will be, but we do realize that one must exist. This, among other things, is driving futures. Each new type of oil requires a new type of capital investment, and the parties involved are making gambles on future technology. Further influencing this is the fact that oil is polluting enough to, potentially, push much of the world away from it, thereby reducing the attractiveness of such bets.
One might gather that price will similarly affect heating costs and flight usage. Fuel economy is one of the foremost items on the minds of Americans now, according to Gallup (#2 among national issues, apparently).
Countries might increasingly wish to reduce their economic dependency on the oil market, as it becomes cheaper and cheaper to do so. There has been much support for an ethanol economy in the US; for example the bipartisan bill to introduce biofuel installations in gas stations.
Already Brazil has an ethanol economy, provoked by the 1973 oil crisis. Nations with similar agricultural capability and high oil dependency have the same incentives to guard against future market downturns.
Finally, there's the global warming problem. I assume you don't believe CO2 has a major role. But from a pure economic standpoint, it is likely that governments and the populace will increasingly believe it has a major role and will probably do something about it, either collectively or individually. This will probably cause demand to drop, though perhaps not suddenly. Anticipating this, oil companies face a riskier bet when making large capital investments in heavy-oil extraction. Increasingly, such companies will try to diversify into the broader energy business, as many are already making efforts to do.
The oil price today, unlike twenty years ago, is determined behind closed doors in the trading rooms of giant financial institutions like Goldman Sachs, Morgan Stanley, JP Morgan Chase, Citigroup, Deutsche Bank or UBS. The key exchange in the game is the London ICE Futures Exchange (formerly the International Petroleum Exchange). ICE Futures is a wholly-owned subsidiary of the Atlanta Georgia International Commodities Exchange. ICE in Atlanta was founded in part by Goldman Sachs which also happens to run the world’s most widely used commodity price index, the GSCI, which is over-weighted to oil prices.
As I noted in my earlier article, (‘Perhaps 60% of today’s oil price is pure speculation’), ICE was the focus of a recent congressional investigation. It was named both in the Senate's Permanent Subcommittee on Investigations' June 27, 2006, Staff Report and in the House Committee on Energy & Commerce's hearing in December 2007 which looked into unregulated trading in energy futures. Both studies concluded that energy prices' climb to $128 and perhaps beyond is driven by billions of dollars' worth of oil and natural gas futures contracts being placed on the ICE. Through a convenient regulation exception granted by the Bush Administration in January 2006, the ICE Futures trading of US energy futures is not regulated by the Commodities Futures Trading Commission, even though the ICE Futures US oil contracts are traded in ICE affiliates in the USA. And at Enron’s request, the CFTC exempted the Over-the-Counter oil futures trades in 2000.
So it is no surprise to see in a May 6 report from Reuters that Goldman Sachs announces oil could in fact be on the verge of another "super spike," possibly taking oil as high as $200 a barrel within the next six to 24 months.
.
Yes, an economic break-even point for oil exploitation exists. And it continuously moves higher, because oil-exploitation technology continuously improves. The current extraction price of $5/bbl (http://www.google.com/search?q=oil+cost+%22%245+per+barrel%2...) is not dangerously close to the $200/bbl, or $500/bbl, or $1,000/bbl that the market might accept -- is it? Speaking of what the market might accept, that, too, continuously moves higher, because technology continuously improves the efficiency of end-use. As gas-powered devices become more efficient, the world's practical oil supply grows. More on how this process works can be found here:
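One piece of that process -- end-use efficiency stretching the practical supply -- can be sketched with a toy depletion model. The reserve and consumption figures below are invented for illustration, not estimates:

```python
# Toy depletion model: how end-use efficiency gains stretch a fixed
# reserve. All numbers are invented, not forecasts.

def years_of_supply(reserves, annual_use, efficiency_gain_per_year):
    """Years until the reserve runs out if demand shrinks each year."""
    years = 0
    while reserves > 0 and years < 10000:
        reserves -= annual_use
        annual_use *= 1 - efficiency_gain_per_year
        years += 1
    return years

flat = years_of_supply(1000.0, 30.0, 0.00)       # constant consumption
shrinking = years_of_supply(1000.0, 30.0, 0.02)  # 2%/yr efficiency gain
print(f"flat demand: {flat} years; with 2%/yr efficiency gains: {shrinking} years")
```

Even a modest 2% yearly efficiency gain stretches the assumed reserve well beyond the naive reserves-divided-by-consumption lifetime; rising extraction capability (the other half of the argument above) pushes in the same direction.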
It's not "bias" just because you sympathize with the argument. Extraordinary claims require extraordinary evidence; instead of offering any, this commenter simply says "oil is rechargeable and we have enough of it for the foreseeable future", despite a welter of evidence contradicting those statements. That strikes me as less of an argument than a cry for attention.
Science is also not loudly saying "LALALA I CAN'T HEAR YOU". In many cases he's simply dead wrong, but people listen because what he says is more pleasant than the truth.
We're running out of resources (not just oil: copper, platinum, I believe... indium? are all going to be in short supply in a short few hundred years). The climate is changing; man-made or not, it's harming our living conditions. We're still polluting our lakes -- the Great Lakes have large segments of horribly toxic sediment, for example (although cleanup is going well, and costing only a few billion dollars).
And yes, I am a physics student taking some extra classes on analyzing human damage to the environment, and a number of my professors are researching or otherwise involved with this stuff, from attempting to create efficient solar cells to cleaning up the great lakes. I'm not an expert on our environmental challenges, but I think I have a somewhat better-than-average grasp of the situation.
>We're running out of resources (not just oil. Copper, platinum, I believe... indium? are all going to be in short supply in a short few hundred years).
Take a look around you on this board. Who are the people saying LALALALA? It's certainly not the skeptics. The skeptics are more than willing to engage in intelligent debate.
There's only one side of this discussion who doesn't want to hear the truth -- that's the side that keeps down-modding everything that doesn't fit into their worldview.
It has to be said. This guy is simply lazy, and did not do the research necessary to have informed opinions.
Here are the facts I gathered in about 5 minutes. (It's pretty sloppy research, but hey -- I'm just posting a comment!)
1) Electric grid transmission efficiency: ~90%
http://www.energetics.com/gridworks/grid.html
2) Internal Combustion Engine Efficiency: ~20%
http://en.wikipedia.org/wiki/Internal_combustion_engine#Ener...
3) Power Plant Efficiency: ~33%
http://cleantechnica.com/2008/06/26/electricity-generation-e...
4) Battery Efficiency: ~80%
http://xtronics.com/reference/batterap.htm
0.33 × 0.9 × 0.8 = 0.2376, i.e. 23.76%
23.76% > 20%
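Written out explicitly, using the comment's own rounded figures:

```python
# Well-to-wheel chain using the rounded efficiency figures quoted above.
plant_eff = 0.33    # central power plant
grid_eff = 0.90     # grid transmission
battery_eff = 0.80  # battery charge/discharge round trip
ice_eff = 0.20      # internal combustion engine, for comparison

electric_chain = plant_eff * grid_eff * battery_eff
print(f"electric chain: {electric_chain:.4f} ({electric_chain:.1%}) vs ICE: {ice_eff:.0%}")
```

Note that a fuller comparison would also multiply in electric-motor and controller efficiency (discussed elsewhere in this thread), which narrows the gap between the two chains.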