Further to my previous post, I shall continue my merry trolling of AGW theory.
Here's a graphic to illustrate the point. The AGW theory is that radiation (at certain frequencies) emitted below the altitude at which the CO2 concentration falls to about 0.5 g/m3 cannot escape to space, and instead warms the atmosphere, oceans and land. If CO2 levels increased by 50% above current levels, the effective emitting altitude (for those wavelengths) would go up from 5 km to 9.5 km. As a result, everything below that altitude warms by about 1.5 degrees. All figures are expressed in terms of a 10 km-high column of air with a 1 m2 cross-section.
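If you want to see where a 5 km to 9.5 km jump could come from, here's a back-of-envelope sketch in Python. The exponential fall-off of CO2 density with height, the surface density of 0.78 g/m3 and the 11 km scale height are my own round-number assumptions picked to reproduce those figures, not values read off the graphic.

import math

# Assumed exponential profile: rho(z) = rho0 * exp(-z / H)
rho0 = 0.78       # g/m3, surface CO2 density at roughly 420 ppm (assumed)
H = 11.0          # km, scale height chosen to fit the 5 km -> 9.5 km shift (assumed)
threshold = 0.5   # g/m3, the 'blocking' concentration quoted above

def blocking_altitude(surface_density):
    # altitude (km) at which CO2 density falls to the threshold
    return H * math.log(surface_density / threshold)

print(blocking_altitude(rho0))        # ~4.9 km today
print(blocking_altitude(1.5 * rho0))  # ~9.4 km with 50% more CO2
# The shift is simply H * ln(1.5), about 4.5 km, whatever the starting altitude.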
1. The amount of energy required to heat such a column of air by 1.5 degrees is about 15 million Joules (the rough arithmetic for points 1 to 3 is sketched just after this list).
2. The additional CO2 in each such column which has this effect is the amount above the dashed line and between the orange and red lines, which looks to be about 1.5 kg.
3. Therefore, each kg of extra CO2 above the line must be able to 'trap' about 10 million Joules, or about 12 hours' worth of all the radiation from each m2 of Earth's effective emitting surface (which is two-thirds clouds).
4. In case you're wondering, that is a huge number. The biggest number in this context is the latent heat of vaporisation of water, which is 2.256 million Joules/kg. Remember, the additional energy required to get boiling water to turn to water vapour (i.e. 'dry' steam rather than visible steam, which is a mix of water vapour and water droplets) is over five times as much as the energy required to get water from just above freezing point to boiling point.
5. Or to put it another way, if 1 kg of CO2 could absorb 10 million Joules and convert it all to thermal energy without being able to cool down, it would end up at about 10,000 degrees (difficult to estimate, as specific heat capacity rises the hotter you go). "That's hot", as Paris Hilton would say, nearly twice as hot as the surface of the Sun. Looked at yet another way, if you did CO2 capture from the air, collected one-third of it, heated it to 10,000 degrees, released it back into the atmosphere and let it mix again, the average temperature of the atmosphere would go up by about 1.5 degrees (short term). The working for points 4 and 5 is also sketched below the list.
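Here's the rough arithmetic behind points 1 to 3, again in Python so the numbers can be fiddled with. The column mass and specific heat are standard round figures of my choosing; the 1.5 kg of extra CO2 is simply read off the graphic. (Strictly, only about three-quarters of the atmosphere sits below 10 km, but using the full column mass keeps the sums tidy and doesn't change the conclusion.)

# Points 1-3: warm a 1 m2 column by 1.5 degrees, then divide by the extra CO2.
column_mass = 10_000    # kg of air above each m2 (full column; assumed round figure)
cp_air = 1_005          # J/(kg K), specific heat of air at constant pressure
warming = 1.5           # degrees

energy_needed = column_mass * cp_air * warming     # ~15 million Joules (point 1)

extra_co2 = 1.5                                    # kg per column, read off the graphic (point 2)
joules_per_kg = energy_needed / extra_co2          # ~10 million Joules per kg (point 3)

olr = 240                                          # W/m2, standard round figure for radiation leaving the emitting surface
hours_of_radiation = joules_per_kg / (olr * 3600)  # ~12 hours' worth per m2 (point 3)

print(energy_needed, joules_per_kg, hours_of_radiation)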
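And the arithmetic behind points 4 and 5, with my own assumed constants: a fixed specific heat for CO2 of about 0.85 kJ per kg per degree (in reality it rises with temperature, hence "difficult to estimate"), and 1.5 kg of reheated CO2 per m2, which is roughly a quarter to a third of the ~6.5 kg of CO2 sitting above each m2.

# Points 4-5: compare 10 MJ/kg to the latent heat of water, then see how hot
# 1 kg of CO2 would have to get to hold 10 MJ as thermal energy.
latent_heat_water = 2.256e6        # J/kg, latent heat of vaporisation at 100 C
water_0_to_100 = 4_186 * 100       # J/kg, to take water from just above freezing to boiling

print(latent_heat_water / water_0_to_100)   # ~5.4, the 'over five times' in point 4

cp_co2 = 850                       # J/(kg K), CO2 near room temperature (held constant here)
joules_per_kg = 10e6               # J per kg of extra CO2, from point 3
temp_rise = joules_per_kg / cp_co2 # ~12,000 degrees at constant cp; a rising cp pulls it
                                   # back towards the ~10,000 in point 5

# Point 5's thought experiment: reheat 1.5 kg of captured CO2 per m2 and stir it back in.
captured_co2 = 1.5                 # kg per m2 (assumed)
column_mass = 10_000               # kg of air per m2
cp_air = 1_005                     # J/(kg K)
mixed_warming = captured_co2 * cp_co2 * temp_rise / (column_mass * cp_air)

print(temp_rise, mixed_warming)    # ~12,000 degrees; ~1.5 degrees after mixing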
This just does not seem plausible, does it? Especially as this extra energy seems to be both radiation (electro-magnetic energy, which has no particular temperature) and warmth (thermal energy, which is not electro-magnetic energy on the large scale) at the same time. That's never made clear, is it? Is it one or t'other? Or would we need double that amount of Joules, constantly flipping back and forth between the two forms?