Strange as this may seem to you, a local decrease in incoming shortwave radiation of around 1360 W/m² at the equator does tend to have quite a rapid cooling effect. :roll: Only the intentionally self-delusional could persuade themselves that the setting sun proves something about the climate response time of changing CO2 concentrations. Even so, the fact that hot rocks remain warm for many hours after sunset, or that lakes and oceans are only marginally warmer even after eight or nine hours of exposure to a summer day's +1360 W/m², does help illustrate our planet's thermal inertia despite the flawed analogy.
I have tried to help you out of your confusion - and this is not the first time - but you are determined to remain hopelessly mired in ignorance. If you ever find yourself able to honestly answer this question, you might begin to understand:
What units are used to measure the direct effect of atmospheric CO2 changes?
Here's a hint - it's not degrees Celsius.
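To make the point concrete, here's a minimal sketch using the simplified expression for CO2 radiative forcing from Myhre et al. (1998), the one adopted in IPCC TAR. Note what comes out: a flux imbalance in W/m², not a temperature in degrees Celsius.

```python
import math

def co2_forcing(c_new: float, c_ref: float) -> float:
    """Simplified CO2 radiative forcing (Myhre et al. 1998, as used in IPCC TAR).

    c_new, c_ref: CO2 concentrations in ppm.
    Returns a radiative forcing in W/m^2 -- a flux imbalance, NOT a temperature.
    """
    return 5.35 * math.log(c_new / c_ref)

# Doubling CO2 from a 280 ppm pre-industrial baseline:
print(round(co2_forcing(560.0, 280.0), 2))  # -> 3.71 (W/m^2)
```

Converting that W/m² number into a temperature change requires a climate sensitivity parameter, which is exactly where the slow ocean response enters.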
Neither IPCC TAR nor AR5 supports your self-delusion. You're talking about radiative forcing, but did you even bother to read where that term is explained in your link?
Radiative forcing and forcing variability
In an equilibrium climate state the average net radiation at the top of the atmosphere is zero. A change in either the solar radiation or the infrared radiation changes the net radiation. The corresponding imbalance is called “radiative forcing”. In practice, for this purpose, the top of the troposphere (the tropopause) is taken as the top of the atmosphere, because the stratosphere adjusts in a matter of months to changes in the radiative balance, whereas the surface-troposphere system adjusts much more slowly, owing principally to the large thermal inertia of the oceans.
No, I guess you didn't bother to read that, did you? You prefer cherry-picking quotes (and misrepresenting even those) and hopelessly flawed logic to actually learning what you're talking about :lol:
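The quoted passage's point about ocean thermal inertia can be sketched with a textbook slab-ocean energy balance, C dT/dt = F − λT, whose e-folding time is τ = C/λ. The mixed-layer depth (70 m) and feedback parameter (1.2 W/m²/K) below are illustrative assumptions, not values from the IPCC text:

```python
# Minimal sketch of why the surface-troposphere system adjusts slowly:
# a slab-ocean energy balance C*dT/dt = F - lam*T has e-folding time tau = C/lam.
# Assumed illustrative values -- not taken from IPCC TAR or AR5.
RHO = 1025.0    # seawater density, kg/m^3
CP = 3990.0     # seawater specific heat, J/(kg K)
DEPTH = 70.0    # assumed ocean mixed-layer depth, m
LAM = 1.2       # assumed climate feedback parameter, W/(m^2 K)

C = RHO * CP * DEPTH                        # heat capacity per unit area, J/(m^2 K)
tau_years = C / LAM / (3600.0 * 24.0 * 365.0)
print(f"e-folding response time ~ {tau_years:.1f} years")
```

With these numbers the response time comes out on the order of several years, versus months for the stratosphere, which is precisely the contrast the quoted paragraph draws.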