cc: Anders , Eduardo.Zorita@gkss.de, hegerl@duke.edu, esper@wsl.ch, k.briffa@uea.ac.uk, m.allen1@physics.ox.ac.uk, weber@knmi.nl, t.osborn@uea.ac.uk
date: Fri, 4 Aug 2006 10:45:44 +0100
from: Martin Juckes
subject: Re: McIntyre, McKitrick & MITRIE ...
to: Anders Moberg
Dear Anders and others,
I agree that some clarification is needed here: there is a distinction to be
made between the use of different proxies in the inverse regression, which
MM2003 followed, and the use of different selections of proxies to Principal
Components for the different steps, which they did not follow. The key
section in MM2003 is 2 (i). It is claimed that MBH98 calculated 28 proxy PCs
and that 16 started prior to available data. In Table 7 they state the
"Available Period" for data from North American tree-rings as 1619-1971. We
all know that there was plenty of such data available prior to this period.
In fact, MBH98 used a total of 52 proxy PCs, with a maximum of 28 in any one
time period.
The supplementary data for MM2003 (which is clearer than that for MBH1998)
makes it clear that they did not recalculate the proxy PCs for different
steps of the reconstruction, but simply selected proxies from a table of 112:
84 raw proxies plus 28 proxy PCs calculated in 5 sub-regions for periods when
ALL the data in those sub-regions was available.
MM2005 (Energy and Environment) correct this error (without admitting to it),
and claim that centering of the proxies is a major issue. However, the
published code fragments suggest that they omitted the standardisation of the
proxies, and their results can only be replicated by omitting
standardisation. The "centering" issue is thus a red herring which
conveniently diverts attention away from the serious flaw in their 2003
paper. MM2003 at no point mention the number of proxies going into the
1400-1450 step of their reconstruction. An unpublished manuscript does
acknowledge that no North American PC was used for the 1400-1450
reconstruction (that is, the first revision of the manuscript acknowledges
this omission [attached]: the first submission, which is also available from
McIntyre's web-site, makes no reference to the fact they were omitted and
implies that only the method of calculation is an issue -- I guess they can
get that kind of thing past some editors but not others).
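The effect of the omitted standardisation is easy to demonstrate. Here is a minimal sketch with synthetic numbers (a plain composite standing in for the full PC calculation, not anyone's actual code): without standardisation, a single high-variance series dominates everything.

```python
import random
import statistics

random.seed(0)
n = 200
# Synthetic proxy network: one high-variance series plus four low-variance
# ones (all numbers are illustrative, not MBH98 or MM2005 data).
proxies = [[random.gauss(0, 5.0) for _ in range(n)]] + \
          [[random.gauss(0, 0.2) for _ in range(n)] for _ in range(4)]

def composite(series, standardise):
    """Average the proxies, optionally scaling each to unit variance first."""
    if standardise:
        series = [[(x - statistics.fmean(s)) / statistics.stdev(s) for x in s]
                  for s in series]
    return [statistics.fmean(col) for col in zip(*series)]

def corr(a, b):
    """Pearson correlation of two equal-length series."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return cov / var

raw = composite(proxies, standardise=False)
std = composite(proxies, standardise=True)
# The raw composite is almost entirely the high-variance series; the
# standardised composite weights all five proxies roughly equally.
```

The same imbalance appears, more strongly, in the leading principal component when the input series are not scaled to unit variance first.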
I'll get some supplementary information on this together to submit with the
manuscript.
The numbers on figure 2 need correcting -- thanks for pointing that out.
I think that the statement that there is no North American tree-ring data
prior to 1619 is a "serious" flaw and that we should say so.
The McIntyre and McKitrick (2005) Energy and Environment paper claims that
they reproduce their 2003 result while including the North American proxy
data, but it is abundantly obvious from looking at the figures in the two
papers that this is untrue: their 2003 reconstruction showed 1400-1450 as
0.3K warmer than 1950, whereas their 2005 reconstruction estimates 1400-1450
to be about the same as 1950. In terms of MBH1998 error bars, MM2003
disagreed with MBH1998 by 3 sigma, MM2005 disagrees by 1.5 sigma.
It appears that this disagreement is largely due to the failure to standardise
the proxies. Both this technical flaw and the inaccurate reference to their
own work are, I think, serious.
I can understand the reservations about getting into a long dispute. That is
one reason why I have simplified the new calculations used: not using any
temperature PCs and not doing any stepwise reconstruction leads to
considerable simplification.
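For what it's worth, the simplified composite-plus-variance-matching calculation amounts to the following sketch. The series here are made up; the two steps are the ones described in the manuscript's Appendix A.

```python
import random
import statistics

random.seed(1)
years = list(range(1856, 1980))
# Made-up calibration data: an instrumental series with a warming trend, and
# two already-standardised proxies tracking it with different noise levels.
instrumental = [0.005 * (y - 1900) + random.gauss(0, 0.15) for y in years]
proxy_a = [t + random.gauss(0, 0.3) for t in instrumental]
proxy_b = [t + random.gauss(0, 0.5) for t in instrumental]

# Step 1: composite -- a simple mean of the proxies.
composite = [statistics.fmean(p) for p in zip(proxy_a, proxy_b)]

# Step 2: variance matching -- rescale so the composite's variance over the
# calibration period equals the instrumental variance, then match the means.
scale = statistics.stdev(instrumental) / statistics.stdev(composite)
offset = statistics.fmean(instrumental) - scale * statistics.fmean(composite)
reconstruction = [scale * c + offset for c in composite]
```

Outside the calibration period the same `scale` and `offset` are applied to the composite, which is what makes the method transparent compared with the stepwise PC machinery.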
cheers,
Martin
On Friday 04 August 2006 09:18, Anders Moberg wrote:
> Dear Martin and all others,
>
> Having read the new manuscript, I would like to draw the attention of
> all of you to the section about McIntyre&McKitrick vs Mann et al. I am
> not entirely happy with this section. It may be that I am not fully
> updated about all details on their dispute, but there appear to be some
> mistakes in this section of our manuscript. Therefore, I ask all of you
> to check how this section can be improved and clarified. This is very
> important! If we refer incorrectly to the MM-Mann dispute, I am
> convinced that all of us will be involved in lengthy frustrating e-mail
> discussions later on. I anticipate this from personal experience! Let's
> do our best to avoid this.
>
> The problematic bit of text starts on p. 16, para 4: ("The failure of
> MM2003 ... is partly due to a misunderstanding of the stepwise
> reconstruction method") and slightly below: ("MM2003 only calculate
> principal components for the period when all chronologies are present").
>
> I read through the MM2003 paper yesterday. From what is written there,
> on p. 763-765, it appears that they were well aware of the stepwise
> method. On p. 763, about at the middle of the page, they write:
> "Following the description of MBH98 ... our construction is done
> piecewise for each of the periods listed in Table 8, using the roster of
> proxies available through the period and the selection of TPCs for each
> period listed in Table 8".
>
> This is clearly at odds with what is written in our manuscript. Has it
> been documented somewhere else that MM2003, despite what they wrote,
> really misunderstood the stepwise technique? If it is so, we need to
> insert a reference. If this is not the case, we need to omit the lines
> about the misunderstanding. We also need to explain better why the
> MM2003 calculations differ from MBH.
>
> Moreover, our sentence ("MM2003 only calculate principal components for
> the period when all chronologies are present") implies that MM2003 only
> calculated PCs for the period 1820-1971, as this would be the period
> when all chronologies are present according to the MM2003 Table 8.
> Obviously, they calculated PCs beyond 1820, as their calculations
> actually extend back to 1400.
>
> The problem continues in the legend to our Fig. 2. (" Each of the 212
> data series is shown ... The red rectangle indicates the single block
> used by MM2003, neglecting all data prior to 1619"). The last sentence
> is inconsistent with the information in MM2003 in three ways: a) MM2003
> clearly show in their Table 8 that they analysed the same blocks of data
> as MBH. b) The year 1619 as a starting point of a data block is
> inconsistent with MM Table 8. Where does the year 1619 come from? It is
> not mentioned anywhere in MM2003. c) The red block implies that MM2003
> made calculations back only to 1619, but they went back to 1400.
>
> Moreover, the numbers given in the graph of our Fig. 2 indicate that the
> total number of series is 211, whereas the text in the legend and also
> in the main text on p. 16 says 212. Which number is correct?
>
> I suppose that some of you others will know this subject much better
> than I. I have just read the MM2003 paper, and find our reference to it
> to be inconsistent with it. I hope you all can make efforts to make this
> bit crystal clear. If not, I fear we will get problems!
>
> Finally, I would like to draw your attention to the related sentence in
> our conclusions on p. 26: ("Papers which claim to refute ... have been
> reviewed and found to contain serious flaws"). Are all of you happy with
> this statement? Would it sound better with a somewhat less offensive
> sentence, something like:
>
> "Papers which claim to refute ... have been reviewed and found to
> essentially contribute with insignificant information that does not
> affect the consensus, and even to include some flaws."
>
> I attach the MM2003 paper.
>
> I will send some comments to the other parts of the text in a separate mail.
>
> Cheers,
> Anders
>
>
>
> Martin Juckes wrote:
> > Hello All,
> >
> > here is another draft. I've added a new reconstruction, using 19 independent
> > proxy series from Jones et al., Mann et al., Esper et al. and Moberg et al.
> > This gives a good fit to the calibration data, such that 2 recent years exceed
> > the maximum pre-industrial estimate by 4 sigma levels. I've included this
> > because without it I found it hard to draw precise and useful conclusions
> > from the 4 partially overlapping reconstructions I had done before.
> >
> > cheers,
> > Martin
> >
> > ------------------------------------------------------------------------
> >
> > \documentclass[cpd,11pt]{egu}
> >
> > \input macs
> > \voffset 5cm
> > \hoffset 1.5cm
> >
> > \begin{document}
> >
> > \title
> > {\bf Millennial Temperature Reconstruction Intercomparison and Evaluation
> > }
> >
> > \runningtitle{Millennial Temperature}
> > \runningauthor{M.~N.~Juckes et al}
> > \author{Martin Juckes$^{(1)}$,
> > Myles Allen$^{(2)}$,
> > Keith Briffa$^{(3)}$,
> > Jan Esper$^{(4)}$,
> > Gabi Hegerl$^{(5)}$,
> > Anders Moberg$^{(6)}$,
> > Tim Osborn$^{(3)}$,
> > Nanne Weber$^{(7)}$,
> > Eduardo Zorita$^{(8)}$}
> > \correspondence{Martin Juckes (M.N.Juckes@rl.ac.uk)}
> > \affil{
> > British Atmospheric Data Centre, SSTD,
> > Rutherford Appleton Laboratory
> > Chilton, Didcot,
> > Oxfordshire, OX11 0QX,
> > United Kingdom
> > }
> >
> > \affil{1: Rutherford Appleton Laboratory,
> > 2: University of Oxford,
> > 3: University of East Anglia,
> > 4: Swiss Federal Research Institute,
> > 5: Duke University,
> > 6: Stockholm University,
> > 7: Royal Netherlands Meteorological Institute (KNMI),
> > 8: GKSS Research Centre
> > }
> > \date{Manuscript version from 31 Oct 2005 }
> > \msnumber{xxxxxx}
> >
> > \pubyear{}
> > \pubvol{}
> > \pubnum{}
> >
> > \received{}
> > %\pubacpd{} % ONLY applicable to ACP
> > \revised{}
> > \accepted{}
> >
> > \firstpage{1}
> >
> > \maketitle
> >
> > \begin{abstract}
> > There has been considerable recent interest in paleoclimate reconstructions
> > of the temperature history of
> > the last millennium. A wide variety of techniques have been used.
> > The interrelation among the techniques is sometimes unclear, as different
> > studies often
> > use distinct data sources as well as distinct methodologies.
> > Recent work is reviewed with the aim of clarifying the import of
> > the different approaches.
> > A range of proxy data collections used by different authors are passed
> > through two reconstruction algorithms: firstly, inverse regression and,
> > secondly, compositing followed by variance matching.
> > It is found that the first method tends to give large weighting to
> > a small number of proxies and that the second approach is more robust
> > to varying proxy input.
> > A reconstruction using 19 proxy records extending back to 1000AD shows a
> > maximum pre-industrial temperature of 0.227K (relative to the 1866 to 1970
> > mean).
> > The standard error on this estimate, based on the residual in the calibration
> > period, is 0.149K. Two recent years (1998 and 2005) have exceeded the
> > pre-industrial
> > estimated maximum by more than 4 standard errors.
> > \end{abstract}
> >
> >
> > %%\openup 1\jot
> >
> > \introduction\label{sec:intro}
> >
> > The climate of the last millennium has been the subject of much
> > debate in recent years, both in the scientific literature
> > and in the popular media.
> > This paper reviews reconstructions of past temperature,
> > on the global, hemispheric, or near-hemispheric scale, by
> > \citet{jones_etal1998} [JBB1998],
> > \citet{mann_etal1998a} [MBH1998],
> > \citet{mann_etal1999} [MBH1999],
> > \citet{huang_etal2000} [HPS2000],
> > \citet{crowley_lowery2000} [CL2000],
> > \citet{briffa_etal2001} [BOS2001],
> > \citet{esper_etal2002b} [ECS2002],
> > \citet{mann_jones2003} [MJ2003],
> > \citet{moberg_etal2005} [MSH2005],
> > \citet{oerlemans2005} [OER2005],
> > \citet{hegerl_etal2006+} [HCA2006].
> > %%The criticism
> > %%directed at them (mainly MBH1999) by \citet{mcintyre_mckitrick2003}
> > %%[MM2003] and others.
> >
> >
> > Climate variability can be partitioned into contributions from
> > internal variability of the climate system and response to forcings,
> > with the forcings being further partitioned into natural and
> > anthropogenic.
> > The dominant change in forcing in the late 20th century
> > arises from human impact in the form of
> > greenhouse gases \citep[primarily carbon dioxide, methane and
> > chloro-fluoro carbons:][]{IPCC2001}.
> > The changes in concentration of these gases in the atmosphere
> > are well documented and their radiative properties which reduce,
> > for a given temperature difference, radiative loss of heat to space
> > from the mid and lower troposphere
> > \citep[for carbon dioxide, this was first documented by][]{arrhenius1896}
> > are beyond dispute.
> >
> > However, there remains some uncertainty on two issues:
> > firstly, how much of the observed change is due to greenhouse forcing as
> > opposed to natural forcing and internal variability;
> > secondly, how significant, compared to past natural changes, are the
> > changes which we now observe and expect in the future?
> >
> > The first question is not answered by the IPCC conclusion cited above because
> > that conclusion only compares the anthropogenic forcing of the late 20th century
> > with the natural forcings of the same period. Further back in the past, it is
> > harder to make definitive statements about the amplitude of variability in natural
> > forcings. The second question reflects the uncertainty in the response of the
> > climate system to a given change in forcing. In the last century both the
> > variations in forcing and the variations in response have been measured with
> > some detail, yet there remains uncertainty about the contribution of
> > natural variability to the observed temperature fluctuations.
> > In both cases, investigation is hampered by the fact that
> > estimates of global mean temperature based on reliable direct measurements
> > are only available from 1856 onwards \citep{jones_etal1986}.
> >
> > Climate models are instrumental in addressing both questions,
> > but they are still burdened with
> > some level of uncertainty and there is a need for more detailed knowledge
> > of the behaviour of the actual climate on multi-centennial timescales
> > both in order to evaluate the climate models and in order to address the
> > above questions directly.
> >
> > The scientific basis for proxy based climate reconstructions may be stated
> > simply: there are
> > a number of physical indicators
> > which contain information about the past environmental variability.
> > As these are not direct measurements, the term proxy is used.
> >
> >
> > \citet{jones_mann2004} review evidence for climate change in
> > the past millennium and conclude that there had been a
> > global mean cooling since the 11th century
> > until the warming period initiated in the 19th century, but the issue remains
> > controversial. This paper reviews recent contributions and evaluates the impact
> > of different methods and different data collections used.
> >
> > Section 2 discusses recent contributions, which have developed a range of new
> > methods to address aspects of the problem.
> > Section 3 discusses the technique used by MBH1998/9
> > in more detail in the context of criticism by \citet{mcintyre_mckitrick2003}
> > (hereafter MM2003).
> > Section 4 presents some new results using the data collections from 5
> > recent studies.
> >
> >
> > \section{A survey of recent reconstructions}
> >
> > This section gives brief reviews of recent
> > contributions, displayed in Fig.~1.
> > Of these, 5 are estimates of the Northern Hemisphere mean temperature
> > (MBH1999, HPS2000, CL2000, MSH2005, HCA2006),
> > 2 of the Northern Hemisphere extra tropical mean temperature (BOS2001, ECS2002)
> > and 3 of the global mean temperature (JBB1998, MJ2003, OER2005).
> > All, except the inherently low resolution reconstructions of HPS2000 and OER2005,
> > have been smoothed with a 40 year running mean.
> > With the exception of HPS2000 and OER2005, the reconstructions
> > use partly overlapping methods and data, so they
> > cannot be viewed as independent from a statistical viewpoint.
> > In addition to exploiting a range of different data sources,
> > the above works also use a range of techniques.
> > The subsections below cover different scientific themes,
> > ordered according to the date of key publications.
> > Some reconstructions which do not extend all the way
> > back to 1000AD are included because of their
> > importance in addressing specific issues.
> > The extent to which the global, northern hemisphere and northern hemisphere
> > extratropical reconstructions might be expected to agree
> > is discussed in Sect.~2.10 below.
> >
> > \subsection{High-resolution paleoclimate records}
> >
> > \citet{jones_etal1998} [JBB1998] present the first annually resolved
> > reconstructions of temperatures back to 1000AD, using
> > a composite of 10 standardised proxies for the northern hemisphere and 7
> > for the southern,
> > with variance damped in the early part of the series to account for the
> > lower numbers of proxies present (6 series extend back to 1000AD),
> > following \citet{osborn_etal1997}.
> > The composites are
> > scaled by variance matching (Appendix A) against the annual mean summer
> > temperatures for 1931-1960.
> > Climate models are also employed to investigate the temperature coherency
> > between proxy sites and it is shown that there are strong large scale
> > coherencies in the proxy data which are not reproduced by
> > the climate model. An evaluation of each individual
> > proxy series against instrumental data from 1881 to 1980
> > shows that tree-rings and historical reconstructions
> > are more closely related to temperature than those
> > from corals and ice-cores.
> >
> > With regard to the temperatures of the last millennium,
> > the primary conclusion of JBB1998 is that
> > the twentieth century was the warmest of the millennium.
> > There is clear evidence of a cool period from 1500 to 1900,
> > but no strong ``Medieval Warm Period" [MWP] (though the second warmest
> > century in the northern hemisphere reconstruction is
> > the 11th). The MWP is discussed further in Sect.~2.4 below.
> >
> > JBB1998 draw attention to the limitations of some of the proxies
> > on longer timescales (see Sect.~3.5 below).
> > Homogeneity of the data record and
> > its relation with temperature may not be guaranteed on longer timescales.
> > This is an important issue, since
> > many climate reconstructions assume a constant relationship between
> > temperature anomalies and the proxy indicators
> > (there are also problems associated with timescale-dependency in the
> > relationship which are discussed further in Sect.~2.6 below).
> >
> > MJ2003 include some additional proxy series and extend the study period back a
> > further millennium and conclude that the late 20th century warmth
> > is unprecedented in the last two millennia.
> >
> > \subsection{Climate field reconstruction}
> >
> > \citet{mann_etal1999} published
> > the first reconstruction of the last thousand years' northern hemispheric mean
> > temperature which included objective error bars,
> > based on the analysis of the residuals in the calibration period.
> > The authors concluded not only
> > that their estimate of the temperature over the whole period 1000AD to 1860AD
> > was colder than the late twentieth century, but also that 95\% certainty limits
> > were below the last decade of the twentieth century.
> > The methods they used were presented in MBH1998
> > which described a reconstruction back to 1400AD.
> >
> > MBH1998 use a collection of 415 proxy time indicators, many more than used
> > in \citet{jones_etal1998},
> > but many of these are too close geographically to be considered
> > as independent, so they are combined into a smaller number of representative
> > series.
> > The number of proxies also decreases significantly with age:
> > only 22 independent proxies extend back to 1400AD,
> > and, in
> > MBH1999, 12 extend back to 1000AD (7 in the Northern Hemisphere).
> > MBH1998 and MBH1999 have been the subject of much debate since the latter
> > was cited in the IPCC (2001) report, though the IPCC
> > conclusions\footnote{\citet{IPCC2001} concluded that
> > ``The 1990s are likely to have been the warmest decade of the millennium in
> > the Northern Hemisphere, and 1998 is likely to have been the warmest
> > year,'' where ``likely'' implies a greater than 66\% probability.
> > Since 2001 it has been recognised that there is a need to explicitly
> > distinguish between an expression of confidence, as made by the IPCC in
> > this quote, which should include expert assessment of the robustness of
> > statistical methods employed, and simple citation of the results of
> > statistical tests.
> > In the language of
> > \citet{manning_etal2004} we can say that MBH1999 carried out statistical
> > tests which concluded that the 1990s have been the warmest decade of the
> > millennium with 95\% likelihood, while IPCC (2001), after assessing all
> > available evidence, had a 66\% confidence in the same statement.}
> > were weaker than those of MBH1999.
> >
> > This work also differs from Jones et al. (1998) in using spatial patterns
> > of temperature
> > variability rather than hemispheric mean temperatures. In this way the
> > study aims
> > to exploit proxies which are related to temperature indirectly: for
> > instance, changes in temperature may be associated with changes in
> > wind and rainfall which might affect proxies more strongly than
> > temperature. Since wind and rainfall are correlated with
> > changes in temperature patterns, it is argued, there may be important non-local
> > correlations between proxies and temperature.
> >
> > Different modes of atmospheric variability are evaluated through an
> > Empirical Orthogonal Function [EOF] analysis of the time period 1902 to 1980,
> > expressing the global field as a sum of spatial patterns (the EOFs) multiplied by
> > Principal Components (PCs -- representing the temporal evolution).
> > Earlier instrumental data are too sparse to be used for this purpose:
> > instead they are used in a validation calculation to determine how
> > many EOFs should be included in the reconstruction.
> > Time series for each mode of variability are then reconstructed from the
> > proxy data using an optimal least squares inverse regression.
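The inverse regression referred to here can be sketched for a single temperature PC. The sensitivities and noise levels below are invented, and this one-PC toy is not the MBH1998 code; it does show why such a scheme can give large weighting to a small number of proxies.

```python
import random
import statistics

random.seed(2)
n_cal = 80
# Invented calibration data: one temperature PC and three proxies responding
# linearly to it with different sensitivities and noise levels.
pc = [random.gauss(0, 1) for _ in range(n_cal)]
sens = [1.0, 0.6, 0.3]
noise = [0.3, 0.5, 0.8]
proxies = [[a * t + random.gauss(0, s) for t in pc]
           for a, s in zip(sens, noise)]

def fit(y, x):
    """Slope of y on x and the residual variance (forward regression)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    resid = [yi - my - a * (xi - mx) for xi, yi in zip(x, y)]
    return a, statistics.pvariance(resid)

fits = [fit(p, pc) for p in proxies]

def invert(values):
    """Least-squares estimate of the PC from one year's proxy values: each
    proxy is weighted by sensitivity / residual variance, so one
    well-fitting proxy can dominate the reconstruction."""
    num = sum((a / v) * x for (a, v), x in zip(fits, values))
    den = sum(a * a / v for a, v in fits)
    return num / den

weights = [a / v for a, v in fits]
```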
> >
> > Finally, the skill of the regression of each PC is tested using the
> > 1856 to 1901 validation data.
> > Prior to 1450AD it is determined that only
> > one PC can be reconstructed with
> > any accuracy. This means that the main advantage of the
> > Climate Field Reconstruction method does not apply at earlier dates.
> > The methodology will be discussed further in Sect.~3 below.
> >
> > The reconstructed temperature evolution (Fig.~1) is rather less variable
> > than that of Jones et al. (1998),
> > but the differences are not statistically significant.
> > The overall picture is of gradual cooling until the mid 19th century,
> > followed by rapid warming matching that evaluated by the earlier work.
> >
> > \subsection{Borehole temperatures}
> >
> > \citet{huang_etal2000} [HPS2000] estimate northern hemisphere temperatures
> > back to 1500AD using
> > measurements made in 453 boreholes (their paper also presents global and
> > southern hemisphere results using an additional 163 southern hemisphere
> > boreholes).
> > The reconstruction is included here, even though it does not extend back
> > to 1000AD, because it has the advantage of being completely
> > independent of the other reconstructions shown.
> > Temperature fluctuations at the surface propagate slowly downwards, so
> > that measurements made in the boreholes at depth contain a record of past
> > surface temperature fluctuations.
> > HPS2000 used measurements down to around 300m.
> > The diffuse nature of the temperature anomaly means that short time scale
> > fluctuations cannot be resolved. Prior to the 20th century, the typical
> > resolution is about 100 years.
> >
> > \citet{mann_etal2003} analyse the impact of changes in land use and snow cover
> > on borehole temperature reconstructions and conclude that
> > it results in significant errors.
> > This conclusion has been refuted by
> > \citet{pollack_smerdon2004} (on statistical grounds), \citet{gonzalez-rouco_etal2003}
> > (using climate simulations) and \citet{huang2004} (using an expanded network of 696
> > boreholes in the northern hemisphere).
> >
> > \subsection{Medieval Warm Period}
> >
> > Despite much discussion
> > \citep[e.g.][]{hughes_diaz1994, bradley_etal2003}, there is no clear quantitative
> > understanding of what is meant by the ``Medieval Warm Period'' [MWP].
> > \citet{crowley_lowery2000}
> > [CL2000] discuss the evidence for a global MWP, which they interpret as
> > a period of unusual warmth in the 11th century. All the reconstructions
> > of the 11th century temperature shown
> > in Fig.~1 estimate that century to have been warmer than most of the
> > past millennium. However, the question of practical importance is not
> > whether it was warmer than the 12th to 19th centuries, which is
> > generally accepted, but whether it was a period of comparable
> > warmth to the late 20th century. MBH1999 concluded, with 95\% confidence, that
> > this was not so. CL2000 revisit the question
> > using 15 proxy records, of which 9 were not used in the studies
> > described above. Several of the series used have extremely low temporal resolution.
> > %%CL2000 sought to select tree ring chronologies with consistent quality
> > %%throughout their length, as measured by the "sample replication"
> > %%\citep{cook_etal2004}.
> > %%[check usage of "sample replication" -- cook etal (QSR) is available
> > %%from Jan's website]]
> >
> > They draw attention to the spatial localization of the MWP in their proxy
> > series:
> > it is strong in North America, North Atlantic and Western Europe, but not
> > clearly present elsewhere. Periods of unusual warmth
> > do occur in other regions, but these are short and asynchronous.
> >
> > Their estimate of northern hemispheric temperature over the past
> > millennium is consistent
> > with the works discussed above. They conclude that the occurrence of decades of
> > temperatures similar to those of the late 20th century cannot be unequivocally ruled
> > out, but that there is, on the other hand, no evidence to support the claims
> > that such an extended period of large-scale warmth occurred.
> >
> > \citet{soon_baliunas2003} carry out an analysis of local climate reconstructions.
> > They evaluate the number of such reconstructions which show (a) a sustained
> > ``climate anomaly'' during 800-1300AD, (b) a sustained ``climate anomaly''
> > during 1300-1900AD and (c)
> > their most anomalous 50 year period in the 20th century.
> > Their definition of a ``sustained climate anomaly'' is 50 years of warmth,
> > wetness or dryness for (a) and (c) and 50 years of coolness, wetness
> > or dryness in (b).
> > It should be noted that they do not carry out evaluations which allow
> > direct comparison between the 20th century and earlier times:
> > they compare the number of extremes occurring in the 20th century with the
> > number of anomalies occurring in periods of 3 and 4 centuries in the past.
> > Both the use of sampling periods of differing length and different
> > selection criteria make interpretation of their results problematic.
> > They have also been criticised for interpreting
> > regional extremes which occur at distinct times as being indicative of global
> > climate extremes \citep{jones_mann2004}. This issue is discussed further in
> > Sect.~2.9 below.
> > \citet{osborn_briffa2006} perform a systematic analysis along the lines of
> > \citet{soon_baliunas2003}
> > and conclude that the proxy records alone, by-passing the problem of proxy
> > calibration against instrumental temperatures, show an unprecedented anomaly
> > in the 20th century.
> >
> > \subsection{Segment length curse}
> >
> > \citet{briffa_etal2001} and \citet{briffa_etal2002} discuss the impact of
> > the ``segment length curse'' \citep{cook_etal1995a, briffa_etal1996,
> > briffa2000} on temperature reconstructions from tree rings.
> > Tree rings have been shown to have much greater sensitivity
> > than other proxies on short timescales (JBB1998), but there is a concern
> > that this may not be true on longer timescales. Tree ring chronologies are
> > often made up of
> > composites of many trees of different ages at one site.
> > The width of the annual growth ring
> > depends not only on environmental factors but also on the age of the
> > tree. The age dependency on growth is often removed by subtracting
> > a growth curve from the tree ring data for each tree. This process,
> > done empirically, will not only remove age related trends but also any
> > environmental trends which span the entire life of the tree.
> > \citet{briffa_etal2001} use a more sophisticated method
> > (Age Band Decomposition [ABD], which
> > forms separate chronologies from tree rings in different age bands,
> > and then averages all the age-band chronologies)
> > to construct northern hemisphere
> > temperatures back to 1400AD, and show that
> > a greater degree of long term variability is preserved.
> > The reconstruction lies between those
> > of MBH1999 and JBB1998, showing the cold 17th century of the former,
> > but the relatively mild 19th century of the latter.
> >
> > The potential impact of the segment length limitations is analysed further
> > by \citet{esper_etal2002b, esper_etal2003}, using `Regional Curve
> > Standardisation' (RCS) \citep{briffa_etal1992}.
> > In RCS composite growth curves (different curves reflecting
> > different categories of growth behaviour) are obtained from all the trees
> > in a region and this, rather than a fitted curve, is subtracted
> > from each individual series. Whereas ABD circumvents the need to
> > subtract a growth curve, RCS seeks to evaluate a growth curve which
> > is not contaminated by climate signals.
> > The ECS2002 analysis agrees well with that of MBH1999 on short
> > time scales, but has greater centennial variability \citep{esper_etal2004}.
> > ECS2002 suggest that this may be partly due to the lack of tropical proxies
> > in their work, which they suggest should be regarded as an extratropical
> > Northern Hemisphere estimate. The extratropics are known to have
> > greater variability than the tropics.
> > %[check]:from eduardo:: Table 1 in MBH GRL 99 --add ref??
> > However, it should also be noted that among the proxies used by MBH1999
> > (12 in total), just 2 of them are located in the tropics, both at one
> > location (see table 1 below).
> >
> > \citet{cook_etal2004} study the data used by ECS2002 and pay particular
> > attention to potential loss of quality in the earlier parts of tree-ring
> > chronologies when a relatively small number of tree samples are available.
> > Their analysis suggests that tree ring chronologies prior to 1200AD should
> > be treated with caution.
> >
> > \subsection{Separating timescales}
> >
> > \citet{moberg_etal2005} follow BOS2001 and ECS2002 in trying to address
> > the ``segment length curse'', but rather than trying to improve the
> > tree-ring chronologies by improving the standardizations,
> > they discard the low-frequency component of the tree-ring data
> > and replace it with low-frequency information from proxies with lower
> > temporal resolution.
> > A wavelet analysis is used to filter different temporal scales.
> >
> > Each individual proxy series is first scaled to unit variance and then
wavelet transformed.
> > Averaging of the wavelet transforms is made separately for tree ring data
> > and the low-resolution data.
> > The average wavelet transform of tree-ring data for timescales less than
80
> > years is combined with the averaged wavelet transform of the
low-resolution data for
> > timescales longer than 80 years to form one single wavelet transform
covering all timescales.
> > This composite wavelet transform is inverted to create a dimensionless
temperature
> > reconstruction, which is calibrated against the instrumental record of
> > northern hemisphere mean temperatures, AD 1856-1979, using a variance
matching method.
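The band-splitting and variance-matching steps described above can be sketched as follows. This is a minimal illustration only: a centred running mean stands in for the wavelet filter actually used by MSH2005, and all function and variable names are our own.

```python
import numpy as np

def lowpass(x, window):
    """Centred running mean (reflecting at the ends); a crude stand-in
    for the 80-year wavelet filter of MSH2005."""
    pad = window // 2
    xp = np.pad(x, pad, mode="reflect")
    kernel = np.ones(window) / window
    return np.convolve(xp, kernel, mode="valid")[: len(x)]

def moberg_composite(tree_rings, low_res, instrumental, calib, window=80):
    """Combine sub-80-year variability from tree rings with longer
    timescales from low-resolution proxies, then calibrate the result
    by variance matching over the calibration slice `calib`."""
    # each series is first scaled to unit variance, then averaged
    tr = np.mean([s / s.std() for s in tree_rings], axis=0)
    lr = np.mean([s / s.std() for s in low_res], axis=0)
    # timescales shorter than `window` from tree rings,
    # longer than `window` from the low-resolution proxies
    combined = (tr - lowpass(tr, window)) + lowpass(lr, window)
    # variance matching: calibration-period mean and variance are set
    # equal to those of the instrumental record
    c = combined[calib]
    return (combined - c.mean()) / c.std() * instrumental.std() \
        + instrumental.mean()
```

By construction the reconstruction reproduces the instrumental mean and variance over the calibration period; the low-frequency amplitude outside that period is inherited from the low-resolution proxies.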
> >
> > Unfortunately, the calibration period is too short to independently
calibrate the
> > low frequency component. The variance matching represents a form of
cross-calibration.
> > In all calibrations against instrumental data, the long period
(multi-centennial)
> > response is determined by a calibration which is dominated by
> > sub-centennial variance. The MSH2005 approach makes this explicit and
> > shows a level of centennial variability which is much larger than in the
> > MBH1999 reconstruction and
> > similar to that in simulations of the past millennium with two
> > different climate models, ECHO-G \citep{storch_etal2004} and NCAR CSM
> > (``Climate System Model'') \citep{mann_etal2005}.
> >
> > \subsection{Glacial advance and retreat}
> >
> > \citet{oerlemans2005} provides another independent estimate of the global
mean temperature
> > over the last 460 years from an analysis of glacial advance and retreat.
> > As with the borehole based estimate of HPS2000, this work uses a
> > physically based model rather than an empirical calibration.
> > The resulting curve lies within the
> > range spanned by the high-resolution proxies, roughly midway between
> > the MBH1999 Climate Field Reconstruction and the HPS2000 borehole
> > estimate.
> >
> > Unlike the borehole estimate, but consistent with most other works
presented
> > here, this analysis shows a cooling trend prior to 1850, related to
glacial
> > advances over that period.
> > It should be noted that
> > the technique used to generate the borehole estimate
> > \citep{pollack_etal1998}
> > assumes a constant temperature prior to 1500AD. The
> > absence of a cooling trend after this date may be influenced by this
> > boundary condition.
> >
> > \subsection{Regression techniques}
> >
> > Many of the reconstructions listed above depend on empirical relationships
> > between proxy records and temperature. \citet{storch_etal2004} suggest
> > that the regression technique used by MBH1999
> > under-represents\footnote{This is sometimes referred to as
``underestimating'',
> > which will mean the same thing to many people, but something slightly
different
> > to statisticians. Any statistical model (that is, a set of assumptions
about the
> > noise characteristics of the data being examined) will deliver estimates
of
> > an expected value and variability. The variability of the expected value
is
> > not generally the same as the expected value of the variability.}
> > the variability of past climate.
> > This conclusion is drawn after applying a method similar to that of
> > MBH1999 to output from a
> > climate model using a set of pseudo-proxies: time series generated from
> > the model output and degraded with noise which is intended to match the
noise
> > characteristics of actual proxies.
> > \citet{mann_etal2005} use the same approach and arrive at a different
conclusion:
> > namely, that their regression technique is sound.
> > \citet{mann_etal2005} show several implementations of their
> > Climate Field Reconstruction Method in the CSM simulation, using different
levels
> > of white noise in their synthetic pseudo proxies.
> > For a case of pseudo-proxies with a realistic signal-to-noise ratio of
0.5, they use
> > a calibration period (1856-1980) which is longer than that
> > used in MBH1998 and MBH1999 (1901-1980).
> > It turns out that the difference in the length of the calibration period
> > is critical for the skill of the method \citep{burger_etal2006}.
> >
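The construction of pseudo-proxies described above can be sketched as follows. This is a minimal illustration, assuming the signal-to-noise ratio is defined as the ratio of signal to noise standard deviations and that the added noise is white; the function name is our own.

```python
import numpy as np

def make_pseudo_proxies(model_temps, snr, rng):
    """Degrade model grid-point temperature series with white noise to
    a prescribed signal-to-noise ratio (by standard deviation), giving
    pseudo-proxies for testing a reconstruction method."""
    out = []
    for t in model_temps:
        noise = rng.normal(size=t.shape) * t.std() / snr
        out.append(t + noise)
    return out
```

With snr = 0.5, as in the case discussed above, the noise standard deviation is twice that of the underlying model signal.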
> > There is some uncertainty about the true nature of noise on the proxies,
and
> > on the instrumental record, as will be discussed further below.
> > The optimal least squares estimation technique of MBH1998 effectively
> > neglects the uncertainties in the proxy data relative to uncertainties
> > in the temperature.
> > Instead,
> > \citet{hegerl_etal2006+} use total least squares regression
\citep{allen_stott2003, adcock1878}.
> > This approach
> > allows the partitioning of noise between instrumental temperatures
> > and proxy records to be estimated, on the assumption that the instrumental
> > noise is known. \citet{hegerl_etal2006+} show that this approach leads to
greater variability in the reconstruction.
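A minimal sketch of total least squares (orthogonal) regression via the SVD follows. The simple two-variable setup with equal noise in proxy and temperature is an assumption of this illustration, not a description of the multivariate method of \citet{allen_stott2003}; the function name is our own.

```python
import numpy as np

def tls_slope(x, y):
    """Total least squares fit of y ~ a + b*x, treating x and y as
    equally noisy (errors-in-variables).  The fitted line is orthogonal
    to the right singular vector of the smallest singular value of the
    centred data matrix."""
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    v = vt[-1]           # direction of least variance
    b = -v[0] / v[1]     # line v[0]*x + v[1]*y = 0 through the means
    a = y.mean() - b * x.mean()
    return a, b
```

Unlike ordinary least squares, which attenuates the slope when the predictor is noisy, TLS remains consistent when the assumed noise partitioning is correct; this is why it yields greater reconstructed variability.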
> >
> > \citet{rutherford_etal2005} take a different view. They compare
reconstructions
> > from 1400AD to present using a regularised expectation maximisation
technique \citep{schneider2001}
> > and the MBH1998 climate field reconstruction method and find only minor
differences.
> > Standard regression techniques assume that we have a calibration period,
in which
> > both sets of variables are measured, and a reconstruction (or prediction)
period
> > in which one variable is estimated, by regression, from the other.
> > The climate reconstruction problem is more complex:
> > there are hundreds of instrumental records
> > which are all of different lengths, and similar numbers of proxy records,
> > also of varying length. The expectation maximisation technique
> > \citep{little_rubin1987}
> > is well suited to deal with this: instead of imposing an
> > artificial separation between a calibration period and a reconstruction
> > period, it fills in the gaps in a way which exploits all data present.
> > Regularised expectation maximisation is a generalisation
> > developed by \citet{schneider2001} to deal with ill-posed problems.
> > Nevertheless, there is still a simple regression equation at the heart of
the technique.
> > That used by \citet{rutherford_etal2005} is similar to that used by
> > %new: corrected
> > MBH1998, so the issue raised by \citet{hegerl_etal2006+} is unanswered.
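The gap-filling idea can be sketched with a much simplified ridge-regularised EM imputation. This is an illustration of the principle only, not the full RegEM algorithm of \citet{schneider2001}; function names, the fixed ridge parameter and the fixed iteration count are our own simplifications.

```python
import numpy as np

def regem_sketch(X, mask, ridge=1.0, n_iter=20):
    """Repeatedly re-estimate missing entries (mask==True) of each
    column by ridge regression on all the other columns, starting from
    column means of the observed values."""
    X = np.asarray(X, dtype=float)
    col_means = np.nanmean(np.where(mask, np.nan, X), axis=0)
    Xf = np.where(mask, col_means, X)
    n_vars = X.shape[1]
    for _ in range(n_iter):
        for j in range(n_vars):
            rows = mask[:, j]
            if not rows.any():
                continue
            others = [k for k in range(n_vars) if k != j]
            A = Xf[~rows][:, others]        # predictors where col j observed
            b = Xf[~rows, j]
            Am, bm = A.mean(0), b.mean()
            Ac = A - Am
            # ridge-regularised normal equations
            w = np.linalg.solve(Ac.T @ Ac + ridge * np.eye(len(others)),
                                Ac.T @ (b - bm))
            Xf[rows, j] = (Xf[rows][:, others] - Am) @ w + bm
    return Xf
```

In the reconstruction setting the columns would be proxy and instrumental series of differing lengths; the pre-instrumental temperature values are simply the imputed entries, with no explicit calibration/reconstruction split.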
> >
> > \subsection{Natural variability and forcings}
> >
> > Global temperature can fluctuate through internally generated variability
of
> > the climate system (as in the El Ni\~no phenomenon), through
> > variability in natural forcings (solar insolation, volcanic aerosols,
> > natural changes to greenhouse gas concentrations) and through human
> > influences.
> > Reconstructions of variations in the external forcings for the last
> > millennium have been
> > put forward \citep{crowley2000}, although recent studies have
> > suggested a lower amplitude
> > of low-frequency solar forcing \citep{lean_etal2002, foukal_etal2004}.
> >
> > Analysis of reconstructed temperatures of MBH1999 and CL2000 and
> > simulated temperatures using reconstructed solar and volcanic forcings
> > shows that changes in the forcings can explain the reconstructed long
> > term cooling through most of the millennium
> > and the warming in the late 19th century \citep{crowley2000}.
> > The relatively cool climate in the second half of the 19th century may be
> > attributable to cooling from deforestation \citep{bauer_etal2003}.
> > \citet{hegerl_etal2003} analyse the correlations between four
> > reconstructions (MBH1999, BOS2001, ECS2002, and a modified version of
> > CL2000)
> > and estimated forcings \citep{crowley2000}.
> > They find that natural forcing, particularly by
> > volcanism, explains a substantial fraction of decadal variance.
> > Greenhouse gas forcing is detectable
> > with high significance levels in all analyzed reconstructions except
> > MSH2005, which ends in 1925.
> > \citet{weber2005b} carries out a similar analysis with a wider range
> > of reconstructions. It is shown that the regression of reconstructed
> > global temperatures on the forcings has a similar dependence on timescale
> > as regressions derived from the climate model. The role of solar forcing
is
> > found to be larger for longer timescales, whereas volcanic forcing
dominates
> > for decadal timescales.
> > The trend component over the period 1000 to 1850 is, however, in all
> > reconstructions larger than the trend implied by the forcings.
> >
> > The methods employed by
> > \citet{hegerl_etal2006+} attribute about a third of the early 20th
> > century warming, and sometimes more in high-variance reconstructions,
> > to greenhouse gas forcing.
> > These results indicate that enhanced variability in the past does not
> > make it more difficult to detect greenhouse warming, since a large
> > fraction of the variability can be attributed to external forcing.
> > Quantifying the influence of external forcing on the proxy records is
> > therefore more relevant to understanding climate variability and its
> > causes than determining if past periods were possibly as warm as the
> > 20th century.
> >
> > \citet{goosse_etal2005} investigate the role of internal variability using
> > an ensemble of 25 climate model simulations of the last millennium
> > and forcing estimates from \citet{crowley2000}.
> > They conclude that internal variability dominates local and regional
> > scale temperature anomalies, implying that most of the variations
> > experienced by a region such as Europe over the last millennium could
> > be caused by internal variability. On the hemispheric and global scale,
> > however, the forcing dominates.
> > This agrees with results from a long
> > solar-forced model simulation by \citet{weber_etal2004}.
> > %%similar This reinforces similar statements made by JOS1998. [where does
this come from?]
> > \citet{goosse_etal2005}
> > make the new point that noise can lead to regional temperature anomalies
> > peaking at different times from the forcing, so that disagreements in
> > timing between proxy series should not necessarily be interpreted as
> > meaning there is no common forcing.
> >
> > \subsection{The long view}
> >
> > The past sections have drawn attention to the problems of calibrating
> > temperature reconstructions using a relatively short
> > period over which instrumental records are available.
> > For longer reconstructions, with lower temporal resolution,
> > other methods are available. Pollen
> > reconstructions of climate match the ecosystem types with those
> > currently occurring at different latitudes. The changes in
> > ecosystem can then be mapped to the temperatures at which
> > they now occur \citep[e.g.][]{bernabo1981, gajewski1988}.
> > These reconstructions cannot resolve decadal variability,
> > but they provide an independent estimate of local low-frequency
> > temperature variations. The results of \citet{weber_etal2004}
> > and
> > \citet{goosse_etal2005} suggest that such estimates of
> > centennial mean temperatures can provide some information about
> > global mean anomalies, as they strongly reflect the external forcings on
> > centennial and longer timescales. However, there has, as yet,
> > been no detailed intercomparison between the pollen based
> > reconstructions and the higher resolution reconstructions.
> >
> >
> > \section{Critics of the IPCC consensus on millennial temperatures}
> >
> > The temperature reconstructions described in the previous section
> > represent (including their respective differences and similarities)
> > the scientific consensus, based on objective analysis
> > of proxy data sources which are sensitive to temperature.
> > Nevertheless, there are many who are strongly attached to the view that
past
> > temperature variations were significantly larger and that, consequently,
> > the warming trend seen in recent decades should not be considered
> > as unusual.
> >
> >
> > The criticism has been directed mainly at the \citet{mann_etal1998a,
mann_etal1999}
> > work.
> > Therefore, this section focuses mainly on this criticism.
> > %new
> > Though some of the critics identify the consensus with the MBH1998 work,
> > this is not the case: the consensus rests on a broader body of work, and
> > as formulated by IPCC2001 is less strong than the conclusions of
> > MBH1998 (Sect.~3.2).
> > \citet{mcintyre_mckitrick2003} [MM2003]
> > criticize MBH1998 on many counts, some related to deficiencies
> > in the description of the data used and possible irregularities in the
data
> > themselves. These issues have been largely resolved in
\citet{mann_etal2004}.
> > %%\footnote{ftp://holocene.evsc.virginia.edu/pub/MANNETAL1998}.
> >
> > As noted above, the MBH1998 analysis is considerably more complex than
others,
> > and uses a greater volume of data.
> > There are 3 main stages of the algorithm: (1) sub-sampling of
> > regions with disproportionate numbers of proxies, (2) regression,
> > (3) validation and uncertainty estimates.
> >
> > Stage (1) is necessary because some parts of the globe, particularly
> > North America and Northern Europe, have a disproportionate number of
> > proxy records. Other authors have dealt with this by using only
> > a small selection of the available data or using regional
> > averages \citep[BOS2001;][]{hegerl_etal2006+}. MBH1998
> > use a principal component analysis to extract the common signal from the
records in
> > densely sampled regions.
> >
> > The failure of MM2003 to replicate the MBH1998 results is partly due to
> > a misunderstanding of the stepwise reconstruction method. MBH1998 use
> > different subsets of their proxy database for different time periods.
> > This allows more data to be used for more recent periods.
> >
> > For example, Fig.~2 illustrates
> > how the stepwise approach applies to the North American tree ring network.
> > Of the total of 212 chronologies, only 66 extend back beyond 1400AD.
> > MM2003 only calculate principal components for the period when all
> > chronologies are present. Similarly, MBH1998 use one principal
> > component calculated from 6 drought-sensitive tree-ring chronologies
> > from South West Mexico,
> > and these data are omitted in MM2003.
> > %%[is this clear now?? (AM)]]
> > %new
> > %%Table 7 of MM2003 indicates only 20 series for the region, as the
> > %%supplementary information provided with MBH2003 omitted 2
> > %%\citep{mann_etal2004}.
> > %endnew
> > \citet{mcintyre_mckitrick2005a} [MM2005] continue the criticism of the
techniques
> > used by MBH1998 and introduce a ``hockey stick index'', defined in terms of
the ratio
> > of the variance at the end of a time series
> > to the variance over the remainder of the series.
> > MM2005 argue that the way in which
> > a principal component analysis is carried out in MBH generates an
artificial
> > bias towards a high ``hockey-stick index'' and that the statistical
> > significance of
> > the MBH results may be lower than originally estimated.
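The index can be sketched as follows; the particular choices here (a variance ratio, an 80-point "blade") are our own illustrative assumptions rather than the exact definition of MM2005.

```python
import numpy as np

def hockey_stick_index(series, n_blade=80):
    """Ratio of the variance over the final n_blade values (the
    ``blade'') to the variance over the remainder of the series (the
    ``shaft''), in the spirit of the MM2005 index described in the
    text."""
    series = np.asarray(series, dtype=float)
    blade, shaft = series[-n_blade:], series[:-n_blade]
    return blade.var() / shaft.var()
```

A stationary noise series gives an index near 1; a series with a pronounced late trend (a "hockey stick") gives a substantially larger value.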
> >
> > The issue arises because the tree ring chronologies are standardized:
> > this involves subtracting a mean and dividing by a standard deviation.
> > MBH1998 use the mean and standard deviation of the detrended series
> > evaluated over the calibration period. MM2005 are of the view that this
> > is incorrect.
> > They suggest that each series should instead be standardised with respect
> > to the mean and standard deviation of its full length.
> >
> > The code used by MM2005 is not, at the time of writing, available,
> > but the code fragments included in the text imply
> > that their calculation used data which had been
> > centred (mean removed) but had not been normalized to unit variance
(standardised).
> > Figure 3 shows the effect of the changes, applied to the
> > North American tree ring sub-network of the data used by MBH1998,
> > using those chronologies which extend back to 1400AD.
> > The calculation used here does not precisely reproduce the archived
MBH1998
> > result, but the differences may be due to small differences in
> > mathematical library routines used to do the decomposition.
> > The effect of replacing the MBH1998 approach with centering and
> > standardising on the whole time series is small; the effect of
> > omitting the standardisation, as in MM2005, is much larger:
> > this omission causes the 20th century trend to be removed from the
> > first principal component.
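The three pre-PCA conventions at issue can be sketched as follows. This is an illustration only: the MBH branch omits the detrending step used by MBH1998 before computing the scale, and all names are our own.

```python
import numpy as np

def pc1(X):
    """Leading principal component time series of the columns of X."""
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    return u[:, 0] * s[0]

def standardise(X, method, calib):
    """Three pre-PCA conventions discussed in the text; `calib` is a
    slice covering the calibration period."""
    if method == "mbh":     # centre/scale on the calibration period
        # (MBH1998 detrend before computing the scale; omitted here)
        return (X - X[calib].mean(0)) / X[calib].std(0)
    if method == "full":    # full-period standardisation (MM2005's view)
        return (X - X.mean(0)) / X.std(0)
    if method == "centre":  # centring only, as in the MM2005 code fragments
        return X - X.mean(0)
    raise ValueError(method)
```

With "centre", columns keep their raw variances, so high-variance series dominate the leading principal component; with either standardised variant all columns enter on an equal footing.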
> >
> > \citet{storch_zorita2005} look at some of the claims made in MM2005
> > and analyse them in the context of a climate simulation.
> > They find the impact of the modifications suggested by McIntyre and
McKitrick to
> > be minor.
> > \citet{mcintyre_mckitrick2005b} clarify their original claim, stating that
the
> > standardisation technique used by MBH98 does not create the
> > ``hockey-stick'' structure
> > but does ``steer" the selection of this structure in principal component
> > analysis.
> >
> > \citet{mcintyre_mckitrick2005c} [MM2005c] revisit the MM2003 work and
correct
> > their earlier error by taking the stepwise reconstruction technique into
account.
> > They assert that the results of MM2003, which show a 15th century
> > reconstruction 0.5K warmer than found by MBH1998,
> > are reproduced with only minor changes to the MBH1998 proxy data base.
> > Examination of the relevant figures, however, shows that this is not
entirely
> > true. The MM2005c predictions for
> > the 15th century are 0.3K warmer than the MBH1998
> > result: this is still significant, but, unlike the discredited MM2003
result, it
> > would not make the 15th century the warmest on record.
> >
> > MM2005c and \citet{wahl_ammann2005} both find that
> > excluding the North American bristlecone pine data from the proxy
> > data base removes the skill from the 15th century reconstructions.
> > MM2005c justify this removal on the grounds that the first principal
component
> > of the North American proxies, which is dominated by the
> > bristlecone pines, is a statistical outlier with respect to the joint
distribution
> > of $R^2$ and the difference in mean between 1400 to 1450 and 1902 to 1980.
> > %%first ref to table 1
> > Table 1, which lists a range of proxies extending back to 1000,
> > shows that the North American first principal component (``ITRDB [pc01]''
in that table)
> > is not an outlier
> > in terms of its coherence with northern hemispheric mean temperature from
1856 to 1980.
> >
> > \begin{table}[t]
> > \small
> > %% output from mitrie/pylib/multi_r2.py, edited
> > \begin{tabular}{|p{7.0cm}|r|r|l|r|l|}
> > \hline
> > Name & Lat. & Lon. & Id & $R^2$ & Type \cr
> > \hline
> > GRIP: borehole temperature (degC) (Greenland)$^1$ & 73 & -38 & *,Mo &
0.67 & [IC] \cr
> > China: composite (degC)$^2$ & 30 & 105 & *,Mo & 0.63 &
[MC] \cr
> > Taymir (Russia) & 72 & 102 & He & 0.60 & [TR
C] \cr
> > Eastern Asia & 35 & 110 & He & 0.58 & [TR
C] \cr
> > Polar Urals$^3$ & 65 & 67 & Es, Ma & 0.51
& [TR] \cr
> > Tornetraesk (Sweden)$^4$ & 58 & 21 & Mo & 0.50 &
[TR] \cr
> > ITRDB [pc01] & 40 & -110 & Ma & 0.49 & [TR
PC] \cr
> > Mongolia & 50 & 100 & He & 0.46 & [TR
C] \cr
> > Arabian Sea: Globigerina bull$^5$ & 18 & 58 & *,Mo & 0.45 &
[CL] \cr
> > Western Siberia & 60 & 60 & He & 0.44 & [TR
C] \cr
> > Northern Norway & 65 & 15 & He & 0.44 & [TR
C] \cr
> > Upper Wright (USA)$^6$ & 38 & -119 & *,Es & 0.43 &
[TR] \cr
> > Shihua Cave: layer thickness (degC) (China)$^7$ & 40 & 116 & *,Mo &
0.42 & [SP] \cr
> > Western Greenland & 75 & -45 & He & 0.40 &
\cr
> > Quelcaya 2 [do18] (Peru)$^8$ & -14 & -71 & *,Ma & 0.37 &
[IC] \cr
> > Boreal (USA)$^6$ & 35 & -118 & *,Es & 0.32 &
[TR] \cr
> > Tornetraesk (Sweden)$^9$ & 58 & 21 & *,Es & 0.31 &
[TR] \cr
> > Taymir (Russia)$^{10}$ & 72 & 102 & *,Es, Mo &
0.30 & [TR] \cr
> > Fennoscandia$^{11}$ & 68 & 23 & *,Jo,Ma &
0.28 & [TR] \cr
> > Yamal (Russia)$^{12}$ & 70 & 70 & *,Mo & 0.28
& [TR] \cr
> > Northern Urals (Russia)$^{13}$ & 66 & 65 & *,Jo & 0.27
& [TR] \cr
> > \hline
> > \end{tabular}
> > \caption{Continued overleaf.}
> > \end{table}
> >
> > \renewcommand{\thetable}{\arabic{table}}
> > \addtocounter{table}{-1}
> > \begin{table}[t]
> > \small
> > \begin{tabular}{|p{7.0cm}|r|r|l|r|l|}
> > \hline
> > Name & Lat. & Lon. & Id & $R^2$ & Type \cr
> > \hline
> > ITRDB [pc02] & 42 & -108 & Ma & 0.21 & [TR
PC] \cr
> > Lenca (Chile)$^{14}$ & -41 & -72 & Jo & 0.18 & [TR] \cr
> > Crete (Greenland)$^{15}$ & 71 & -36 & *,Jo & 0.16
& [IC] \cr
> > Methuselah Walk (USA) & 37 & -118 & *,Mo & 0.14 &
[TR] \cr
> > Greenland stack$^{15}$ & 77 & -60 & Ma & 0.13 &
[IC] \cr
> > Morocco & 33 & -5 & *,Ma & 0.13 &
[TR] \cr
> > North Patagonia$^{16}$ & -38 & -68 & Ma & 0.08 &
[TR] \cr
> > Indian Garden (USA) & 39 & -115 & *,Mo & 0.04 &
[TR] \cr
> > Tasmania$^{17}$ & -43 & 148 & Ma & 0.04 &
[TR] \cr
> > ITRDB [pc03] & 44 & -105 & Ma & -0.03 & [TR
PC] \cr
> > Chesapeake Bay: Mg/Ca (degC) (USA)$^{18}$ & 38 & -76 & *,Mo & -0.07
& [SE] \cr
> > Quelcaya 2 [accum] (Peru)$^{8}$ & -14 & -71 & *,Ma & -0.14
& [IC] \cr
> > France & 44 & 7 & *,Ma & -0.17 &
[TR] \cr
> > \hline
> > \end{tabular}
> > \caption{(continued)
> > The primary reference for each data set is indicated by the superscript in
the first column as
> > follows:
> > 1: \citep{dahl-jensen_etal1998}, 2: \citet{yang_etal2002}, 3:
\citet{shiyatov1993}, 4: \citet{grudd_etal2002}, 5: \citet{gupta_etal2003},
> > 6: \citet{lloyd_graumlich1997}, 7: \citet{tan_etal2003}, 8:
\citet{thompson1992},
> > 9: \citet{bartholin_karlen1983}, 10: \citet{naurzbaev_vaganov1999}, 11:
\citet{briffa_etal1992},
> > 12: \citet{hantemirov_shiyatov2002}, 13: \citet{briffa_etal1995}, 14:
\citet{lara_villalba1993},
> > 15: \citet{fisher_etal1996}, 16: \citet{boninsegna1992}, 17:
\citet{cook_etal1991}, 18: \citet{cronin_etal2003}.
> > The ``Id'' in column 4 refers to the reconstructions in which the data
> > were used.
> > The type of proxy is indicated in column 6: tree-ring [TR], tree-ring
> > composite [TR C], tree-ring principal component [TR PC], coral [CL],
> > sediment [SE], ice core [IC],
> > multi-proxy composite [MC]. The 19 proxy series marked with a ``*'' in
> > column 4 are used in the ``Union'' reconstruction.
> > }
> > \end{table}
> >
> > \citep[][; MM2005c]{briffa_osborn1999} suggest that
> > rising CO$_2$ levels may have contributed significantly to the
> > 19th and 20th century increase in growth rate in some trees,
> > particularly the bristlecone pines, but such an
> > effect has not been reproduced in controlled experiments with mature trees
> > \citep{korner_etal2005}.
> >
> > Once a time series purporting to represent past temperature has been
> > obtained, the final, and perhaps most important, step is to verify it
> > and to estimate uncertainty limits. This is discussed further in the
> > next section.
> >
> > \section{Varying methods vs. varying data}
> >
> > One factor which complicates the evaluation of the various reconstructions
is
> > that different authors have varied both method and data collections. Here
we will
> > run a representative set of proxy data collections through two algorithms:
> > inverse regression and scaled composites. These two methods, and the
different
> > statistical models from which they may be derived, are explained in the
> > Appendix A.
> >
> > \citet{esper_etal2005} investigated the differing calibration approaches
> > used in the recent literature, including
> > regression and scaling techniques, and concluded that the methodological
differences in calibration result in differences
> > in the reconstructed temperature amplitude/variance of about 0.5K.
> > This magnitude is equivalent to the mean annual temperature change for the
Northern Hemisphere reported in the last
> > IPCC report for the 1000-1998 period.
> > \citet{burger_etal2006} take another approach and investigate a family of
32 different regression algorithms
> > derived by adjusting 5 binary switches, using pseudo-proxy data.
> > They show that these choices, which
> > have all been defended in the literature, can lead to a wide variety of
different
> > reconstructions given the same data.
> > They also point out that the uncertainty is greater when we
> > attempt to estimate the climate of periods which lie outside the range
experienced
> > during the calibration period. The relevance of this point to the last
millennium is
> > under debate: the glacier based temperature estimates of OER2005 suggest
that the
> > coldest northern hemisphere mean temperatures occurred close to the start
of
> > the instrumental record, in the 19th century. The borehole
reconstructions,
> > however, imply that there were colder temperatures experienced in the 16th
to 18th centuries.
> > For the question as to whether the warmth of the latter part of the
calibration
> > period has been experienced in the past, however,
> > this particular issue is not directly relevant.
> >
> > As noted above, much of the MBH1999 algorithm is irrelevant to
reconstructions
> > prior to AD 1450, because before that date the data only suffice,
> > according to estimates in that paper, to determine one degree of freedom.
> > Hence, we will only look at direct evaluation of the hemispheric mean
temperature.
> >
> > Several authors have evaluated composites and calibrated those composites
> > against instrumental temperature. Many of the composites contain more
samples in later
> > periods, so that the calibration may be dominated by samples which do
> > not extend into the distant past. Here, we will restrict attention to
> > records which span the entire reconstruction period.
> > The data series used are listed in table 1.
> >
> > \subsection{Proxy data quality issues}
> >
> > As noted previously, there has been especially strong criticism of
> > MBH1998, 1999, partly concerning some aspects of their data collection.
> > Figures 4 and 5 show reconstructions made using the MBH1999 and MBH1998
data respectively.
> > Regression against northern hemispheric mean temperature from 1856 to 1980
is used
> > instead of regression against principal components of
> > temperature from 1902 to 1980. There are differences, but key features
remain.
> > MM2003 draw attention to the fact that one time series,
> > ``CANA036'' in the ITRDB classification, contributed
> > by Gasp\'e, appears twice in the MBH1998 database.
> > This error is corrected in the red dashed curve of Fig.~5,
> > which is almost identical to the green curve, which retains the
duplication.
> >
> > \subsection{Reconstruction using a union of proxy collections}
> >
> > The following subsection will discuss a range of reconstructions using
different
> > data collections. The first 5 of these collections are defined as those
proxies used by
> > JBB1998, MBH1999, ECS2002, MSH2005 and HCA2006, respectively, which extend
back to 1000AD.
> > These will be referred to below as the JBB, MBH, ECS, MSH, HCA composites
below
> > to distinguish them from the composites used in the published articles,
which include
> > additional, shorter, proxy data series.
> > Finally there is a `Union' composite made using 19 independent northern
> > hemisphere proxy series marked with ``*'' in table 1. Apart from the China
> > composite record, all the data used are individual series. The PCs used by
> > MBH1999 have been omitted in favour of individual series used in other
> > studies.
> > Two southern hemisphere tropical series, both from the Quelcaya glacier,
> > Peru, are included to ensure adequate representation of tropical
> > temperatures.
> > This `Union' collection contains 11 tree-ring series, 4 ice-cores, and one
> > each of coral, speleothem, lake sediment and a composite record including
> > historical data.
> >
> > \subsection{Intercomparison of proxy collections}
> >
> > Figure 6 shows reconstructions back to 1000AD using
> > composites of proxies and variance matching [CVM] (for the proxy
> > principal components in the MBH1998, MBH1999 data collections the sign
> > is arbitrary: these series have, where necessary, had the sign reversed so
that
> > they have a positive correlation with the northern hemisphere
> > temperature record).
> > Surprisingly, the `Union' does not lie in the range spanned by the other
reconstructions,
> > and reaches colder temperatures than any of them. It does, however, fit
the calibration period
> > data better than any of the sub-collections.
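The composite-plus-variance-matching [CVM] procedure, including the sign reversal for series (such as proxy PCs) that are anti-correlated with the instrumental record, can be sketched as follows; function and variable names are our own.

```python
import numpy as np

def cvm(proxies, instrumental, calib):
    """Composite-plus-variance-matching: average the standardised
    proxies (flipping signs where the calibration-period correlation
    with the instrumental record is negative), then rescale so the
    composite's calibration-period mean and variance match the
    instrumental record."""
    comp = 0.0
    for p in proxies:
        z = (p - p.mean()) / p.std()
        # sign is arbitrary for some series (e.g. proxy PCs)
        if np.corrcoef(z[calib], instrumental)[0, 1] < 0:
            z = -z
        comp = comp + z
    comp = comp / len(proxies)
    c = comp[calib]
    return (comp - c.mean()) / c.std() * instrumental.std() \
        + instrumental.mean()
```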
> >
> > The reconstructions shown in Fig.~7 use the same data, this time
> > using inverse regression [INVR] (Appendix A), as used by MBH1998
> > (the method used here differs from that of MBH1998 in using northern
hemisphere
> > temperature to calibrate against, having a longer calibration period,
> > and reconstructing only a single variable instead of multiple EOFs).
> > The spread of values is substantially increased relative to the CVM
reconstruction.
> >
> > With INVR, only one reconstruction (that using the ECS2002
> > data) shows temperatures warmer than the mid 20th century.
> > The inverse regression technique applies weights to the
> > individual proxies which are proportional to the
> > correlation between the proxies and the calibration temperature
> > signature.
> > For this time series the 5 proxies are weighted as:
> > 1.7 (Boreal); 2.9 (Polar Urals); 1.7 (Taymir); 1.8 (Tornetraesk); and 2.3
(Upper Wright).
> > Firstly, it should be noted that this collection samples North America and
the
> > Eurasian arctic only. The bias towards the arctic is strengthened by the
weights
> > generated by the inverse regression algorithm, such that the
reconstruction has poor geographical coverage.
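The correlation weighting at the heart of INVR can be sketched as follows. This is a simplified univariate illustration of the weighting described above (weights proportional to the calibration-period correlation, followed by a least-squares rescaling); it is not the full MBH1998 multivariate procedure, and all names are our own.

```python
import numpy as np

def inverse_regression(proxies, instrumental, calib):
    """Weight each standardised proxy by its calibration-period
    correlation with the instrumental record, then calibrate the
    weighted composite by ordinary least squares."""
    Z = np.column_stack([(p - p.mean()) / p.std() for p in proxies])
    w = np.array([np.corrcoef(Z[calib, i], instrumental)[0, 1]
                  for i in range(Z.shape[1])])
    comp = Z @ w
    c = comp[calib]
    beta = np.cov(c, instrumental, ddof=0)[0, 1] / c.var()
    alpha = instrumental.mean() - beta * c.mean()
    return alpha + beta * comp
```

Because the weights scale with the calibration correlations, proxies that happen to correlate strongly with the instrumental record dominate the result, which is how the geographical bias noted above is amplified.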
> >
> > The MBH1999 and HPS2000 published reconstructions are shown in Fig.~6 for
comparison: the MBH1999
> > reconstruction lies near the centre of the spread of estimates, while the
HPS2000 reconstruction
> > is generally at the lower bound.
> >
> > Much of the current debate revolves around the level of
> > centennial scale variability in the past.
> > The CVM results generally suggest
> > a low variance scenario comparable to MBH1999. The inverse regression
> > results, however, suggest greater variability. It should be noted
> > that the MBH1999 inverse regression result uses greater volumes of
> > data for recent centuries, so that the difference in Fig.~7 between the
> > dashed red curve and the full green curve in the 17th
> > century is mainly due to reduced proxy data input in the latter
> > (there is also a difference because MBH1999 used inverse regression
> > against temperature principal components rather than northern hemisphere
> > mean temperature as here).
> >
> > Table 2 shows the cross correlations of the reconstructions in Fig.~6,
> > for high pass (upper right) and low pass (lower left) components
> > of the series, with low pass being defined by a 40 year running mean.
> > The low pass components are highly correlated.
> >
> > \begin{table}[t]
> > %% output from mitrie/pylib/pp.py
> > \begin{tabular}{|l|c|c|c|c|c|c|}
> > \hline
> > & Ma & Mo & Es & Jo & He & Union\cr
> > \hline
> > Ma & -- & 14\% & 25\% & 60\% & 20\% & 61\% \cr
> > Mo & 69\% & -- & 37\% & 11\% & 13\% & 60\% \cr
> > Es & 64\% & 77\% & -- & 14\% & 36\% & 57\% \cr
> > Jo & 62\% & 51\% & 46\% & -- & 11\% & 35\% \cr
> > He & 72\% & 75\% & 85\% & 53\% & -- & 26\% \cr
> > Union & 67\% & 71\% & 62\% & 45\% & 84\% & -- \cr
> > \hline
> > \end{tabular}
> > \caption{Cross correlations between reconstructions from
> > different proxy data bases: Mann et al (Ma), Moberg et al (Mo),
> > Esper et al (Es), Jones et al (Jo), Hegerl et al (He).
> > The lower left block corresponds to low-pass filtered series,
> > the upper right to high-pass filtered series.}
> > \end{table}
> >
> > The significance of the correlations between these five proxy data samples
> > and the instrumental temperature data during the calibration period
(1856-1980)
> > has been evaluated using a Monte-Carlo simulation
> > with (1) a first order Markov model and (2) random time series
> > which reproduce the lag correlation structure of the data samples (see
> > Appendix A).
> > Figure 8 shows the lag correlations. The instrumental record has a
> > pronounced anti-correlation on the 40-year time-scale. This may be an
> > artifact of the short
> > data record, but it is retained in the significance calculation as the
best available
> > estimate which is independent of the proxies.
> > The `Union' composite shows multi-centennial correlations which are not
present in the other data.
> > The MBH and JBB composites clearly underestimate the decadal scale
> > correlations, while
> > the HCA and `Union' composites overestimate them.
> > %%first ref to table 3
> > Results are shown in table 3.
> > If the full lag correlation structure of the data were known, it would be
true,
> > as argued by MM2005, that the first order approach generally
> > leads to an overestimate of significance. Here, however, we only have an
> > estimated correlation structure based on a small sample. Using this finite
> > sample correlation is likely to overestimate long-term correlations and
hence
> > lead to an underestimate of significance. Nevertheless, results are
presented here
> > to provide a cautious estimate of significance.
> > For the MBH and JBB composites, which have short lag-correlations, the
difference
> > between the two methods is minimal. For other composites there is a
substantial difference.
> > In all cases the $R^2$ values exceed the 99\% significance level. When
> > detrended data are used the $R^2$ values are lower, but still above the
95\%
> > level -- with the exception of the Hegerl et al. data. This data has only
decadal
> > resolution, so the lower significance in high frequency variability is to
be expected.
> >
> >
> > \begin{table}[t]
> > %% output from mitrie/pylib/sum_ac.py
> > \begin{tabular}{|l||c|c||c|c||c||c|p{1.1cm}|}
> > \hline
> > Source & $R^2_{95|h}$ & $R^2_{95|AR}$ & $R^2$ & $R^2_{detr}$ & $\sigma$ &
Signif. & Signif. (detrended) \cr
> > \hline
> > Mann et al. & 0.205 & 0.170 & 0.463 & 0.286 & 0.186 & 99.99\% &
98.75\%\cr
> > \hline
> > Moberg et al., (hi+lo)/2 & 0.225 & 0.183 & 0.418 & 0.338 & 0.153 &
99.87\% & 99.25\%\cr
> > \hline
> > Esper et al. & 0.335 & 0.220 & 0.613 & 0.412 & 0.158 & 99.96\% &
98.11\%\cr
> > \hline
> > Jones et al. & 0.187 & 0.180 & 0.371 & 0.274 & 0.203 & 99.93\% &
99.17\%\cr
> > \hline
> > Hegerl et al. & 0.440 & 0.266 & 0.618 & 0.357 & 0.133 & 99.56\% &
90.13\%\cr
> > \hline
> > Union & 0.337 & 0.236 & 0.655 & 0.414 & 0.149 & 99.98\% & 97.91\%\cr
> > \hline
> > \end{tabular}
> > \caption{
> > $R^2$ values evaluated using the Northern Hemisphere mean temperature
(1856 to 1980) and various
> > proxy records.
> > Columns 2 and 3 show $R^2$ values for the 95\% significance
> > levels, evaluated using a Monte Carlo simulation with 10,000 realisations.
In columns
> > 2, 7 and 8 the full lag-correlation structure of the data is used, in
column
> > 3 a first order auto-regressive model is used, based on the lag one
auto-correlation.
> > Column 4 shows the $R^2$ value obtained from the data and column 5 shows
the same
> > using detrended data.
> > Column 6 shows the standard error (root-mean-square residual) from the
calibration
> > period. Columns 7 and 8 show significance levels, estimated using
> > Monte Carlo simulations as in column 2, for the full and detrended $R^2$
values.
> > }
> > \end{table}
> >
> > Figure 9 plots this reconstruction,
> > with the instrumental data
> > in the calibration period.
> > The composite tracks the changes in northern hemisphere temperature well,
> > capturing the steep rise between 1910 and 1950 and much of the decadal
> > scale variability. This is reflected in the significance scores (Tab.~3)
> > which are high both for the full series and for the detrended series.
> > The highest temperature in the reconstructed data, relative to the
1866-1970 mean, is
> > 0.227K in 1091AD. This temperature was first exceeded in the instrumental
record in 1878,
> > again in 1937 and frequently thereafter. The instrumental record has not
gone below this level since 1986.
> > Taking $\sigma=0.149$ as the root-mean-square residual in the calibration period,
> > 1990 is the first year when the 1091 maximum was exceeded by $2\sigma$.
> > This happened again in 1995 and every year since 1997.
> > 1998 and every year since 2001 have exceeded the preindustrial maximum by
$3\sigma$.
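> > These exceedance levels follow directly from the two numbers quoted above;
> > a quick illustrative check (Python, our own sketch, using only the values
> > given in the text):

```python
# Check the thresholds implied by the text: pre-industrial maximum
# 0.227 K (1091AD) and calibration residual sigma = 0.149 K, both
# relative to the 1866 to 1970 mean.
max_1091 = 0.227
sigma = 0.149

two_sigma_level = round(max_1091 + 2 * sigma, 3)
three_sigma_level = round(max_1091 + 3 * sigma, 3)

print(two_sigma_level)    # 0.525 (K)
print(three_sigma_level)  # 0.674 (K)
```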
> >
> > \conclusions\label{sec:end}
> >
> > There is general agreement that global temperatures cooled
> > over the majority of the last millennium and have risen sharply
> > since 1850. In this respect, the recent literature has not produced
> > any change to the conclusions of JBB1998, though there remains
> > substantial uncertainty about the magnitude of centennial scale
variability
> > superimposed over longer term trends.
> >
> > The IPCC 2001 conclusion that temperatures of the past millennium
> > are unlikely to have been as warm, at any time prior to the 20th
> > century, as the last decades of the 20th century is supported
> > by subsequent research and by the results obtained here.
> >
> > The greatest range of disagreement among independent
> > assessments occurs during the coolest centuries, from 1500 to
> > 1900, when the departure from recent climate conditions
> > was strongest and may have been outside the range of
> > temperatures experienced during the later
> > instrumental period.
> >
> > There are many areas of uncertainty and disagreement within
> > the broad consensus outlined above, and also some who
> > dissent from that consensus. Papers which claim to refute the
> > IPCC2001 conclusion on the climate of the past millennium have been
> > reviewed and found to contain serious flaws.
> >
> > A major area of uncertainty concerns the accuracy of the long time-scale
> > variability in the reconstructions. This is particularly
> > so for timescales of a century and longer. There does not appear to be any
> > doubt that the proxy records would capture rapid change on
> > a 10 to 50 year time scale such as we have experienced in recent decades.
> >
> > Using two different reconstruction methods on a range of proxy data
> > collections, we have found that inverse regression
> > tends to give large weighting to
> > a small number of proxies and that the relatively simple
> > approach of compositing all the series and using variance matching to
> > calibrate the result gives more robust estimates.
> >
> > A new reconstruction made with a composite of 19 proxies extending back
> > to 1000AD fits the instrumental record to within a standard error of
0.149K.
> > This reconstruction gives a maximum pre-industrial temperature of 0.227K
> > relative to the 1866 to 1970AD mean. The maximum temperature from the
> > instrumental record is 0.841K, over 4 standard errors larger.
> >
> > The reconstructions evaluated in this study show considerable disagreement
> > during the 16th century. The new 19 proxy reconstruction implies 21-year
mean
> > temperatures close to 0.6K below the 1866 to 1970AD mean. As this
reconstruction
> > only used data extending back to 1000AD, there is a considerable volume of
16th century
> > data which has not been used. This will be a focus of future research.
> >
> > {\bf Acknowledgments}
> >
> > This work was funded by the Netherlands Environment Assessment Agency
(RIVM) as part of the
> > Dutch Scientific Assessment and Policy Analysis (WAB) programme.
> > Additional funding was provided as follows:
> > from the UK Natural Environment Research Council for M.N. Juckes,
> > from the Swedish Research Council for A. Moberg.
> >
> > \vfill\eject
> >
> > \def\thesection{A}
> > {\bf Appendix A: Regression methods}
> >
> > Ideally, the statistical analysis method would be determined by the
> > known characteristics of the problem. Unfortunately, the error
> > characteristics of the proxy data are not sufficiently well
> > quantified to make the choice clear.
> > This appendix describes two methods and the statistical models which can
be
> > used to motivate them.
> >
> > \subsection{Inverse regression [INVR]}
> >
> > Suppose $x_{ik}$, $i=1,N_{pr}$, $k=1,L$ is a set of $N_{pr}$
> > standardised proxy records of length $L$ and that we are trying
> > to obtain an estimate $\hat{y_k}$ of a quantity $y_k$ which is
> > known only in a calibration period ($k\in C$).
> >
> > Several ``optimal" estimates of $y_i$ can be obtained, depending on
> > the hypothesised relation between the proxies and $y$.
> >
> > Inverse regression follows from the model
> > $$
> > \beta_i y_k + {\cal N}
> > =
> > x_{ik}
> > $$
> > where $\cal N$ is a noise process, independent between proxies.
> > It follows that the optimal estimates for the coefficients $\beta_i$ are
> > $$
> > \hat{\beta_i} = {\sum_{k\in C} x_{ik} y_k \over \sum_{k\in C} y_k^2 }
> > .
> > $$
> > Given these coefficients, the optimal estimate of the $y_k$ outside
> > the calibration period is
> > $$
> > \hat{y_k} = { \sum_i \hat{\beta_i} x_{ik} \over \sum_i \hat{\beta_i}^2 }.
> > $$
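> > The two estimation steps above can be sketched in a few lines of Python
> > (NumPy; the array shapes and function names are our own, not taken from
> > the MITRIE code):

```python
import numpy as np

def invr_reconstruct(X_cal, y_cal, X_all):
    """Inverse regression (INVR) sketch.

    X_cal : (N_pr, L_cal) standardised proxies in the calibration period.
    y_cal : (L_cal,) target series in the calibration period.
    X_all : (N_pr, L) proxies over the full period to be reconstructed.
    """
    # beta_i = sum_{k in C} x_ik y_k / sum_{k in C} y_k^2
    beta = X_cal @ y_cal / np.sum(y_cal ** 2)
    # y_hat_k = sum_i beta_i x_ik / sum_i beta_i^2
    return beta @ X_all / np.sum(beta ** 2)
```

> > With noise-free proxies $x_{ik} = \beta_i y_k$ this recovers $y_k$
> > exactly; with noise, proxies with large fitted $\beta_i$ dominate the
> > estimate, which is the weighting behaviour discussed in the conclusions.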
> >
> > \subsection{Composite plus variance matching [CVM]}
> >
> > This method is rather easier. It starts out from the hypothesis that
different
> > proxies represent different parts of the globe. A proxy for the global
mean
> > is then obtained as a simple average of the proxies:
> > $$
> > \overline{x_k} = N_{pr}^{-1} \sum_i x_{ik}
> > .
> > $$
> >
> > Suppose
> > $$
> > \overline{x_k} = \beta y_k + {\cal N}
> > ,
> > $$
> > then an optimal estimate of $\beta$ is easily derived as
> > $\hat\beta = \sum_{k\in C} \overline{x_k} y_k/\sum_{k\in C} y_k^2$.
> > However, $y^*_k = \hat\beta^{-1} \overline{x_k}$ is not an optimal estimate
> > of $y_k$.
> >
> > Because of the added noise, $\overline{x_k}$ is generally an overestimate
> > of $\beta y_k$. To correct for this we should use:
> > $$
> > \beta y_k^* = \overline{x_k}
> > \sqrt{ \left( \beta^2 \sigma^2_y \over \beta^2 \sigma^2_y + \sigma_{\cal
N}^2 \right) }
> > ,
> > $$
> > where $\sigma^2_y$ and $\sigma_{\cal N}^2$ are the expected variances of $y$ and the
> > noise ${\cal N}$ respectively.
> > This leads to an estimate:
> > $$
> > y_k^* = \overline{x_k} \left( \sigma_y \over \sigma_x \right)
> > .
> > $$
> > This is known as the variance matching method because it matches the
> > variance of the reconstruction with that of observations over
> > the calibration period.
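> > The whole CVM procedure is short enough to sketch directly (Python/NumPy;
> > names are illustrative and not taken from the paper's code):

```python
import numpy as np

def cvm_reconstruct(X, y_cal, cal):
    """Composite plus variance matching (CVM) sketch.

    X     : (N_pr, L) standardised proxy anomaly records.
    y_cal : target anomalies over the calibration period.
    cal   : indices of the calibration period within 0..L-1.
    """
    comp = X.mean(axis=0)          # composite: simple mean over proxies
    # variance matching: y*_k = xbar_k (sigma_y / sigma_x), with both
    # standard deviations taken over the calibration period
    return comp * (y_cal.std() / comp[cal].std())
```

> > By construction the reconstruction has the same variance as the target
> > over the calibration period, which is the defining property of the method.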
> >
> > \def\thesection{B}
> > \setcounter{subsection}{0}
> > {\bf Appendix B: Statistical tests}
> >
> >
> > \subsection{Tests for linear relationships}
> >
> > The simplest test for a linear relationship is the anomaly correlation
> > (also known as: Pearson Correlation, Pearson's product moment correlation,
$R^2$,
> > product mean test):
> > \be
> > R = { \overline{ y^\prime x^\prime } \over
> > \sqrt{ \overline{ y^{\prime2} } \, \overline{ x^{\prime2} } } }
> > \ee
> > where the over-bar represents a mean over the data the test is being
applied to,
> > and a prime a departure from the mean
> > \citep{pearson1896}.
> >
> > The significance of an anomaly correlation can be estimated using the
> > $t$ statistic:
> > \be
> > t = {R \sqrt{n-2} \over \sqrt{1-R^2} }
> > \ee
> > where $n$ is the sample size (for independent variables).
> > Two Gaussian variables will produce a $t$ statistic which obeys the
> > Student's t-distribution with $n-2$ degrees of freedom.
> >
> > Ideally, if the noise affecting all the $x$ and $y$ values is independent,
> > $n$ is simply the number of measurements. This is unlikely to be the case,
> > so an estimate of $n$ is needed. The Monte-Carlo approach is more
> > flexible: a large sample of random sequences with specified correlation
> > structures is created, and the frequency with which the specified
> > $R$ coefficient is exceeded can then be used to estimate its significance.
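> > A minimal version of such a Monte-Carlo test, using the first order
> > Markov (AR(1)) model as the null hypothesis, might look like this
> > (illustrative Python, not the code used for the tables above):

```python
import numpy as np

def r_value(a, b):
    """Anomaly (Pearson) correlation coefficient."""
    a = a - a.mean()
    b = b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

def mc_significance(x, y, n_surr=10000, seed=0):
    """Fraction of AR(1) surrogates of x whose |R| with y stays below
    the observed |R|: a Monte-Carlo estimate of the significance level."""
    rng = np.random.default_rng(seed)
    r_obs = abs(r_value(x, y))
    rho = r_value(x[:-1], x[1:])      # lag-one autocorrelation of x
    n, count = len(x), 0
    for _ in range(n_surr):
        e = rng.standard_normal(n)
        s = np.empty(n)
        s[0] = e[0]
        for k in range(1, n):         # first order Markov surrogate
            s[k] = rho * s[k - 1] + np.sqrt(1.0 - rho ** 2) * e[k]
        if abs(r_value(s, y)) < r_obs:
            count += 1
    return count / n_surr
```

> > Replacing the AR(1) surrogate generator with one matching the full lag
> > correlation structure gives the second variant of the test described above.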
> >
> > \subsection{Lag correlations}
> >
> > Following \citet{hosking1984}, a random time series with a specified
> > lag correlation structure is obtained from the partial correlation
coefficients,
> > which are generated using Levinson-Durbin regression.
> >
> > It is, however, not possible to generate a sequence matching an
arbitrarily
> > specified correlation structure and there is no guarantee that an
> > estimate of the correlation structure obtained from a small sample will
> > be realizable. It is found that the Levinson-Durbin regression diverges
> > when run with the lag correlation functions generated from the
\citet{jones_etal1986}
> > northern hemisphere temperature record and also that from the HCA
composite.
> >
> > For the northern hemisphere temperature record, this is resolved by
truncating the regression after $n=50$.
> > The sample lag-correlation coefficients are, in any case, unreliable beyond this point.
> > Truncating the regression results in a random sequence with a lag
correlation fitting that
> > specified up to lag 50 and then decaying.
> > For the HCA composite, the sample lag-correlation, $C(n)$, is scaled by
$\exp( - 0.0001 n )$,
> > where $n$ is the lag in years.
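> > For reference, the Levinson-Durbin recursion that converts a lag
> > correlation function into partial correlation (reflection) coefficients
> > can be sketched as follows (our own Python implementation, not the
> > \citet{hosking1984} code). The divergence mentioned above shows up as a
> > reflection coefficient with modulus of one or more:

```python
import numpy as np

def levinson_durbin(r):
    """Levinson-Durbin recursion.

    r : autocorrelations r[0], r[1], ..., r[p] (r[0] = 1 for a
        standardised series).
    Returns the AR coefficients a (with a[0] = 1), the partial
    correlation (reflection) coefficients, and the final prediction
    error.  A non-realisable sample correlation sequence produces
    some |k| >= 1, driving the prediction error negative.
    """
    p = len(r) - 1
    a = np.zeros(p + 1)
    a[0] = 1.0
    e = r[0]
    pacf = np.zeros(p)
    for m in range(1, p + 1):
        acc = r[m] + a[1:m] @ r[m - 1:0:-1]
        k = -acc / e
        pacf[m - 1] = k
        a[1:m] = a[1:m] + k * a[m - 1:0:-1]
        a[m] = k
        e *= 1.0 - k * k
    return a, pacf, e
```

> > White noise filtered with the resulting AR coefficients then has,
> > approximately, the specified lag correlation, which is how the surrogate
> > series for the significance tests can be generated.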
> >
> > {\bf Appendix C: Acronyms}
> >
> > Table 4 shows a list of acronyms used in this paper.
> > \begin{table}
> > \begin{tabular}{|l|p{12cm}|}
> > \hline
> > ABD & Age Band Decomposition tree ring standardisation method \cr
> > \hline
> > CSM & Climate System Model: A coupled ocean-atmosphere climate model
produced by NCAR,
> > http://www.cgd.ucar.edu/csm/ \cr
> > \hline
> > CFR & Climate Field Reconstruction: method for reconstructing spatial
structures
> > of past climate variables using proxy data \cr
> > \hline
> > CVM & Composite plus Variance Matching reconstruction method \cr
> > \hline
> > ECHO-G & Hamburg coupled ocean-atmosphere climate model \cr
> > \hline
> > EOF & Empirical Orthogonal Function \cr
> > \hline
> > INVR & Inverse Regression reconstruction method \cr
> > \hline
> > IPCC & The Intergovernmental Panel on Climate Change, established by the
> > World Meteorological Organization (WMO) and the United Nations Environment
Programme (UNEP)
> > to assess scientific, technical and socio-economic information relevant
for the understanding of climate change, its potential impacts and options
for adaptation and mitigation. It is open to all Members of the UN and of
WMO. \cr
> > \hline
> > ITRDB & International Tree-Ring Data Bank, maintained by the NOAA
Paleoclimatology
> > Program and World Data Center for Paleoclimatology
(www.ncdc.noaa.gov/paleo) \cr
> > \hline
> > MWP & Medieval Warm Period \cr
> > \hline
> > PC & Principal Component \cr
> > \hline
> > RCS & Regional Curve Standardisation tree ring standardisation method \cr
> > \hline
> > \end{tabular}
> > \caption{Acronyms used in the text}
> > \end{table}
> >
> > \bibliographystyle{egu}%
> > \bibliography{citations,extras}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by idl/mitrie/plot_recon.pro
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f01}}
> > \caption{\label{fig:1}
> > Various reconstructions, with the mean of 1900 to 1960 removed.
> > }
> > \end{figure*}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by idl/paleo/mbh_70.pro
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f02}}
> > \caption{\label{fig:2}
> > Data blocks for PC calculation by MBH1998. Each of the 212 data series is
shown as a horizontal
> > line over the time period covered. The dashed blue rectangles indicate
some of the blocks of data
> > used by MBH1998 for their proxy principal component calculation, using
fewer series for longer time
> > periods. The red rectangle indicates the single block used by MM2003,
neglecting all data prior
> > to 1619.
> > }
> > \end{figure*}
> >
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/do_eof.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f03}}
> > \caption{\label{fig:3}
> > First Principal Component of the North American proxy record collection,
following MBH1998.
> > The black line is the MBH1998 archived version.
> > The other lines differ only in the method of standardisation of series
prior to calculation of the
> > principal components.
> > Red: calculated following the MBH1998 method, the individual series have
the mean of the calibration
> > period removed and are normalised by the variance of the detrended series
over that period;
> > Blue: with the mean of the whole series removed, and normalised with the
variance of the whole series.
> > Green: mean removed but no normalisation.
> > }
> > \end{figure*}
> >
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f13}}
> > \caption{\label{fig:4}
> > Reconstruction back to 1000, calibrated on 1856 to 1980 northern
hemisphere temperature,
> > using the MBH1999 proxy data collection.
> > The MBH1999 NH reconstruction and the Jones et al. (1986) instrumental
data are shown for comparison.
> > All data have been smoothed with a 21-year running mean.
> > }
> > \end{figure*}
> >
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f12}}
> > \caption{\label{fig:5}
> > As Fig.~4, but using the MBH1998 data collection back to 1400AD.
> > }
> > \end{figure*}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f10}}
> > \caption{\label{fig:6}
> > Reconstruction back to 1000AD, calibrated on 1856 to 1980 northern
hemisphere temperature,
> > using a composite and variance matching,
> > for a variety of different data collections.
> > The MBH1999 and HPS2000 NH reconstructions and the Jones et al. (1998)
instrumental
> > data are shown for comparison.
> > Graphs have been smoothed with a 21-year running mean and centered on 1866
to 1970.
> > The maximum of the 'Union' reconstruction in the pre-industrial period
(0.227K, 1091AD) is shown
> > by a short cyan bar, the maximum of the instrumental record (0.841K,
1998AD) is shown as a
> > short purple bar.
> > }
> > \end{figure*}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f11}}
> > \caption{\label{fig:7}
> > As Fig.~6, except using inverse regression.
> > }
> > \end{figure*}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f14}}
> > \caption{\label{fig:8}
> > Lag correlations for proxy composites and instrumental record (gray).
> > }
> > \end{figure*}
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{cpd-2006-xxxx-f09}}
> > \caption{\label{fig:9}
> > The ``Union'' reconstruction, using `composite plus variance scaling', for
the
> > calibration period. Also shown is the level of the maximum plus two
standard errors.
> > The Jones and Mann instrumental data is plotted as a dashed line.
> > }
> > \end{figure*}
> >
> > \end{document}
> >
> > \vfill\eject
> >
> > {\it\small
> > Both these questions could be answered by a detailed knowledge of the
> > climate and its forcings over the past 1000 years, but the detailed
> > instrumental record only extends back to 1856. Hence ... [[]]
> >
> > %%The motivation for the study of past climate variability is twofold:
> > Current projections of future climate change are still burdened with
> > some level of uncertainty, even within a particular scenario of future
> > greenhouse concentrations. Although all climate models simulate an
> > increase of global temperatures in this century, the warming
> > simulated by different models still covers a wide range \citep{IPCC2001}.
> > A much pursued goal is to reduce this uncertainty range.
> > A question is whether warming of magnitude similar to that observed in the
> > 19th and 20th centuries, very likely caused at least in large part by
> > anthropogenic greenhouse gases, has also occurred in the preindustrial recent past,
> > when, to a large extent, only natural forcings of the climate system
were active.
> >
> > {\small\it Reconstructions of the climate of the past millennium can help
us to
> > answer the second point by describing the magnitude of
> > global temperature fluctuations in the past and can address the first
> > point by helping to quantify the climate sensitivity: the
> > ratio of the response to the forcing.}
> > Progress in both questions can be achieved through the analysis
> > of reconstructions and simulations of the climate of the past millennium:
> > firstly,
> > we wish to know whether current high global temperatures are
> > within the range of natural variability. Secondly, we wish to
> > evaluate the skill and reliability of climate models.
> > %%The rise in global mean temperatures since then is
> > Therefore, some form of empirical reconstruction based on
early-instrumental
> > records, documentary evidence and proxy data is needed.
> > %%On the other hand,
> > %%the global warming observed in the past 2 centuries may be partly
> > %%due to the recovery from an extended
> > %%period of anomalously low temperatures which was reflected
> > %%in a large number of indirect European records.
> > %%[omit above sentence (AM)??]
> > %%[justify "recovery" (JE)??]
> > %%[was it really gradual (JE)]
> > %%"gradual deleted": jones and mann suggest that hemispheric mean cooling
trend
> > %% is "relatively steady" in contrast to more episodic cooling in Europe,
> > %% but esper etal (2002) suggests that attributing this difference to
> > %% hemisphere vs. europe is wrong, it might be whole hemisphere vs.
extra-tropical,
> > %% or it might be failure to resolve variability.
> > %[check]: copied from Gabi's email -- needs clearing up.
> > %%However, some unsolved questions will remain.
> > %%For instance, the climate sensitivity may depend on the nature of the
external
> > %%forcing (greenhouse gas, solar irradiance, etc), so that an estimation
> > %%of past climate sensitivity has still to be considered with some care.
> > %%There are indeed indications that climate sensitivity to changes in
solar
> > %%forcing is lower than to changes to greenhouse gas forcing
> > %%\citep{tett_etal2005+, joshi_etal2003}.
> > %%[ be more precise -- (i.e. in terms of $K W^-1 m^2$ ??)]]
> > %%::joshi etal show a 0-20% difference between sensitivity to solar
forcing
> > %%compared to CO2 forcing. This is much less than variability in
sensitivity
> > %%among models.
> > %%[this is not really relevant if the difference in climate sensitivity
between
> > %%forcings is much less than that between, say, models]
> >
> > A wide range of proxy
> > data sources has been exploited for this problem
> > \citep[reviewed in][]{jones_mann2004}.
> > Tree rings are a particularly important source of information
> > within the time frame of the last millennium. The precise dating which
> > is provided by the annual growth rings allows anomalous growth
> > rates to be compared reliably with historical events.
> > However, it is not straightforward to retrieve the climate variability
> > at timescales that exceed the typical life span of a tree (see Sect.~2.5
below).
> > Statistical regression against instrumental temperature data is often used
> > because the majority of proxy records cannot be directly related to
temperature
> > by deterministic models
> > (two exceptions, reconstructions obtained from borehole temperatures
> > and those based on glacial advance and retreat, are discussed below).
> > Appendix A gives mathematical details of some basic statistical measures.
> > The measures of skill used by MBH1998, MBH1999 are the
> > $R^2$ test, which measures the degree of coherence between two data
> > sets, and the ``Reduction of Error'' (RE) statistic, which measures the
> > effectiveness of one series (typically a model or prediction)
> > in explaining the total (i.e. including the mean) variance in another (the
verification data).
> >
> > The statistical tests on these measures of skill are described
> > in many text books, and their application is straightforward
> > when all sources of noise contaminating the
> > data are well characterised. The difficulty which arises
> > in many applications, including climate reconstructions, is that
> > the noise has significant but poorly characterised correlations.
> > %%[is this true for tests of skill -- probably not for analytical tests of
RE]]
> > }
> > \vfill\eject
> > \vfill \eject
> >
> > The B\"urger et al. analyses use a collection of pseudo-proxies created
from
> > pseudo observations of a climate simulation with added white noise.
> > This is a pragmatic approach -- there is little reliable information about
> > the true nature of the noise spectrum. It has been suggested that
bristlecone pines
> > in N. America have an anomalous growth trend in the 20th century which is
> > coherent among that species. The inverse regression algorithm can give
large
> > weight to individual proxies and negative weight to others: this may be
> > correct in some circumstances, but in others it could amplify the error.
> > The composite approach, on the other hand, is robust:
> > simply taking the mean of the available proxies does not rely on
> > specific assumptions about the noise spectrum.
> >
> > \vfill\eject
> > \begin{figure*}[h]
> > %% produced by pylib/plot_regc.py
> > \centering{\includegraphics[width=12cm]{figs/cpd-2006-xxxx-f04}}
> > \caption{\label{fig:1}
> > As Fig.~7, except
> > using composite and variance matching.
> > }
> > \end{figure*}
> >
> > \vfill\eject
> >
> >
> >
> >
> >
> >
> > {\bf A2: Principal Components}
> >
> > Principal component analysis is a standard technique for reducing the
> > volume of data while attempting to retain as much of the variability
> > of the original data as possible.
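> > As an illustration of the technique (our own sketch, not the MBH code),
> > the leading principal component time series of a set of proxy records can
> > be obtained from a singular value decomposition:

```python
import numpy as np

def leading_pcs(X, n_keep):
    """Reduce N_pr proxy series (rows of X, each of length L) to the
    n_keep leading principal component time series."""
    Xc = X - X.mean(axis=1, keepdims=True)   # centre each series
    # SVD of the (series x time) matrix: rows of Vt are the
    # orthonormal PC time series, s their amplitudes.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return s[:n_keep, None] * Vt[:n_keep]    # scaled PC series
```

> > The standardisation applied to each series before this step, that is,
> > over which period the mean is removed and which variance is used, is
> > precisely the centering issue raised by MM2005.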
> >
> > Stage (2) establishes an empirical link between the proxy records and
> > temperature. In MBH1998 inverse least squares regression of the
> > proxy network against the principal components of the measured temperature
field,
> > over the period 1902 to 1980, is used.
> >
> > Stage (3), the verification stage, determines how many, if any, of the
> > reconstructed time series for the principal components can be
> > considered to have some descriptive value. This is done by evaluating the
> > fit of the implied fields to the observations in the verification period,
1856 to 1901.
> > The northern hemisphere mean temperature is calculated from the reconstructed temperature field.
> > The uncertainties are calculated from the residuals to the fit in the
calibration period.
> >
> > \citet{mcintyre_mckitrick2005c} assert that the fact that omission of data
> > led to a different result demonstrates that the method is unreliable.
> > This would be true if the computation of a time series were the
> > end point of the analysis. However, the need to verify the computed series
> > was recognised by MBH1998. This is discussed further below.
> >
> > \subsubsection{Spurious metaphors}
> >
> > The term ``hockey-stick" has become widely used, particularly in the US
> > media, to refer to the temperature history implied by the MBH1999
> > temperature reconstruction. It did not originally apply to the
reconstruction
> > itself, which has a relatively minor temperature increase in the early
> > 20th century, but rather to the combination of this series with the
> > more recent observed temperature trends: the combination shows
> > a dramatic increase in the 20th century, substantially greater than
anything
> > that occurred in the past millennium.
> > The first attempt to attach any scientific meaning to the phrase
> > was the introduction of a ``hockey stick index'' by
> > \citet{mcintyre_mckitrick2005a} (hereafter MM2005).
> > This index is defined in terms of the ratio of the variance at the end of
a time series
> > to the variance over the remainder of the series.
> > MM2005 argue that the way in which
> > a principal component analysis is carried out in MBH generates an
artificial
> > bias towards a high ``hockey-stick index" and that the statistical
significance of
> > the MBH results may be lower than originally estimated.
> > %% and that this is responsible for the
> > %%shape in the MBH temperature reconstruction.
> > %%Martin: I think that what MM05 indicate is that "hockey-stick may arise
from random time series more easlily as previously thought, when using the
decentered PCs. I am not sure if they make this decentering responsible for
the final output in MBH.
> > %%
> > \subsection{Validation}
> >
> > As noted above, MM2003 have shown that removing data
> > degrades the result, as might be expected.
> > Among the adjustments which they characterize as ``corrections'' was the
> > omission of the 3 principal components mentioned above.
> > In fact, 70\% of the 90 time series extending back to
> > 1400 are omitted from their analysis.
> >
> > In principle, it would be possible to estimate the accuracy of
> > reconstructions calculated by regression from the data in the
> > calibration period. However, this calculation can easily be biased
> > by unreliable assumptions about the noise covariances within
> > the calibration period.
> > MBH1998, 1999 follow a more robust approach, using independent
> > data from a validation period (1856 to 1901) to,
> > firstly, determine whether a reconstruction has any relation to
temperature
> > and, secondly, estimate the error variance.
> >
> > MM2003, however, omitted the validation phase.
> > \citet{wahl_ammann2005} have carried out a detailed investigation
> > of the robustness of the MBH1998 technique to address this
> > and many other issues. They find that the MM2003 series fails the
> > validation tests used by MBH1998.
> >
> > As an illustration of the robustness of the reconstruction,
> > figures 5 and 6 show reconstructions made using the MBH1999 and MBH1998 data respectively.
> > Regression against northern hemispheric mean temperature is used
> > instead of regression against principal components of
> > temperature. There are differences, but key features remain.
> > [[more details in appendix and/or supplementary materials]]
> > MM2003 draw attention to the fact that one time series,
> > ``CANA036" in the ITRDB classification, contributed
> > by Gasp\'e, appears twice in the MBH1998 database.
> > This error is corrected in the red dashed curve of Fig.~5,
> > which is almost identical to the green curve, which retains the
duplication.
> >
> > With our simplification of the method it is possible
> > to use the entire instrumental record for calibration.
> > This leaves no data for validation, but the difference
> > between this and a reconstruction based on a shorter
> > period gives some idea of the robustness.
> > Figure 4b shows the result.
> >
> > Finally, MM question the calculation of uncertainty limits.
> > This depends on the number of degrees of freedom
> > assigned to the data. MM state that the standard method used
> > by MBH is wrong, and that a lower number of degrees of
> > freedom is appropriate because of long range correlations in
> > the data. MBH use the lag-one autocorrelation to estimate
> > the degrees of freedom.
> >
> > In all such tests it is necessary to remember the distinction between the
> > sample correlation, which one is forced to deal with, and
> > the actual correlation, which we cannot know exactly. For this reason
> > it is generally unwise to use methods which rely on statistics
> > which cannot be estimated robustly in a small sample.
> >
> > MM05 also confuse the auto-correlation structure of the tree-ring data,
> > which are known to have an environmental signal with correlations
> > on at least the decadal time-scale, with the auto-correlation of the
> > residuals which should be used in estimating the noise structure.
> >
> >
> >
> > \subsection{Natural variability and forcings}
> >
> > Global temperature can fluctuate through natural internal variability of
> > the climate system (as in the El Ni\~no phenomenon), through
> > variability in natural forcings (solar insolation, volcanic aerosols,
> > natural changes to greenhouse gas concentrations), and through human-induced changes.
> >
> > Analysis of the physical links between the estimated temperature changes
> > of the past millennium and estimated variations in the
> > different forcing mechanisms can improve our understanding of those
> > mechanisms and help to validate the estimated temperature changes.
> >
> > \citet{goosse_etal2005} investigate the role of natural variability using
> > an ensemble of 25 climate model simulations of the last millennium
> > and forcing estimates from \citet{crowley2000}.
> > They conclude that natural variability dominates local and regional
> > scale temperature anomalies, implying that most of the variations
> > experienced by a region such as Europe over the last millennium could
> > be caused by natural variability. On the hemispheric and global scale,
however, the
> > external forcing dominates.
> > This reinforces similar statements made by JOS1998. \citet{goosse_etal2005}
> > make the new point that noise can lead to regional temperature anomalies
> > peaking at different times from the forcing, so that disagreements in
> > timing between proxy series should not necessarily be interpreted as
> > meaning there is no common forcing.
> >
> > Analysis of natural climate forcings \citep{crowley2000}
> > shows that changes in atmospheric aerosol content due to changes
> > in volcanic activity, together with changes in solar irradiance,
> > can explain the long-term cooling through most of the millennium
> > seen in paleoclimate reconstructions,
> > as well as the observed warming in the late 19th century.
> > \citet{hegerl_etal2003} analyse the correlations between four
> > reconstructions (MBH1999, BOS2001, ECS2002, and a modified version of CL2000)
> > and estimated forcings \citep{crowley2000}.
> > They find that natural forcing, particularly by
> > volcanism, explains a substantial fraction of decadal variance, including in
> > the new high-variance reconstructions. Greenhouse gas forcing is detectable
> > at a high significance level in all the reconstructions analyzed.
> > \citet{weber2005b} carries out a similar analysis with a wider range
> > of reconstructions.
> > It is shown that the correlations between reconstructed
> > global temperatures and forcings are similar to those derived from
> > the ECBILT climate model \citep{opsteegh_etal1998}.
> > The trend component over the period 1000 to 1850 is, however, larger in the
> > reconstructions than in the forcings.
> >
> > The methods employed by
> > \citet{hegerl_etal2006+} attribute about a third, and sometimes more, of the
> > early 20th century warming in high-variance reconstructions to greenhouse
> > gas forcing.
> > These results indicate that enhanced variability in the past does not
> > make it more difficult to detect greenhouse warming, since a large
> > fraction of the variability can be attributed to external forcing.
> > Quantifying the influence of external forcing on the proxy records is
> > therefore more relevant to understanding climate variability and its
> > causes than determining if past periods were possibly as warm as the
> > 20th century.
> >
> > The dominance of volcanic forcing over solar variability found in some of
> > the above studies is consistent with recent questioning of the
> > magnitude of low-frequency solar forcing \citep{lean_etal2002, foukal_etal2004}.
> >
> > \subsection{Tests of skill in reconstructions}
> >
> > The reduction of error (RE) statistic is defined as
> >
> > \be
> > RE = 1 - { \overline{ (y- \hat y^\prime)^2 } \over
> > \overline{ y^2 } }
> > \ee
> > where $y$ is the verification data, expressed as anomalies from the
> > calibration mean, $\hat y^\prime$ is the reconstruction, and the overbars
> > denote averages over the verification period.
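> > A minimal numerical sketch of this statistic (variable names are ours;
> > $y$ is taken as anomalies from the calibration mean, so $RE = 0$
> > corresponds to a no-skill forecast of zero anomaly):

```python
import numpy as np

def reduction_of_error(y, y_hat):
    """RE = 1 - mean((y - y_hat)^2) / mean(y^2).

    With y expressed as anomalies from the calibration-period mean,
    the denominator is the error of a 'climatology' forecast of zero
    anomaly: RE = 1 is a perfect reconstruction, RE <= 0 indicates
    no skill over climatology.
    """
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    return 1.0 - np.mean((y - y_hat) ** 2) / np.mean(y ** 2)

y = np.array([0.3, -0.2, 0.1, -0.4, 0.2])
print(reduction_of_error(y, y))      # 1.0: perfect reconstruction
print(reduction_of_error(y, 0 * y))  # 0.0: climatology baseline
```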
> >
> >
>
> --
>
> Anders Moberg
> Department of Physical Geography and Quaternary Geology
> Stockholm University
> SE-106 91 Stockholm
> Sweden
>
> Phone: +46 (0)8 6747814
> Fax: +46 (0) 8 164818
> www.geo.su.se
> anders.moberg@natgeo.su.se
>
>
Attachment Converted: "c:\eudora\attach\MM.resub.pdf"