[Senate Hearing 108-359]
[From the U.S. Government Publishing Office]



                                                        S. Hrg. 108-359

CLIMATE HISTORY AND THE SCIENCE UNDERLYING FATE, TRANSPORT, AND HEALTH 
                      EFFECTS OF MERCURY EMISSIONS

=======================================================================

                                HEARING

                               BEFORE THE

                              COMMITTEE ON
                      ENVIRONMENT AND PUBLIC WORKS
                          UNITED STATES SENATE

                      ONE HUNDRED EIGHTH CONGRESS

                             FIRST SESSION

                               __________

                             JULY 29, 2003

                               __________

  Printed for the use of the Committee on Environment and Public Works


                    U.S. GOVERNMENT PRINTING OFFICE
92-381                      WASHINGTON : 2004
____________________________________________________________________________
For Sale by the Superintendent of Documents, U.S. Government Printing Office
Internet: bookstore.gpo.gov  Phone: toll free (866) 512-1800; (202) 512-1800  
Fax: (202) 512-2250 Mail: Stop SSOP, Washington, DC 20402-0001


               COMMITTEE ON ENVIRONMENT AND PUBLIC WORKS

                      one hundred eighth congress
                             first session

                  JAMES M. INHOFE, Oklahoma, Chairman
JOHN W. WARNER, Virginia             JAMES M. JEFFORDS, Vermont
CHRISTOPHER S. BOND, Missouri        MAX BAUCUS, Montana
GEORGE V. VOINOVICH, Ohio            HARRY REID, Nevada
MICHAEL D. CRAPO, Idaho              BOB GRAHAM, Florida
LINCOLN CHAFEE, Rhode Island         JOSEPH I. LIEBERMAN, Connecticut
JOHN CORNYN, Texas                   BARBARA BOXER, California
LISA MURKOWSKI, Alaska               RON WYDEN, Oregon
CRAIG THOMAS, Wyoming                THOMAS R. CARPER, Delaware
WAYNE ALLARD, Colorado               HILLARY RODHAM CLINTON, New York

                Andrew Wheeler, Majority Staff Director
                 Ken Connolly, Minority Staff Director

                                  (ii)

  
                            C O N T E N T S

                              ----------                              
                                                                   Page

                             JULY 29, 2003
                           OPENING STATEMENTS

Allard, Hon. Wayne, U.S. Senator from the State of Colorado, 
  prepared statement.............................................    11
Cornyn, Hon. John, U.S. Senator from the State of Texas, prepared 
  statement......................................................    58
Inhofe, Hon. James M., U.S. Senator from the State of Oklahoma...     1
Jeffords, Hon. James M., U.S. Senator from the State of Vermont, 
  prepared statement.............................................     7
Voinovich, Hon. George V., U.S. Senator from the State of Ohio...     3

                               WITNESSES

Legates, David R., director, Center for Climatic Research, 
  University of Delaware.........................................    12
    Prepared statement...........................................   209
Levin, Leonard, program manager, Electric Power Research 
  Institute......................................................    40
    Prepared statement...........................................   211
Mann, Michael E., assistant professor, University of Virginia, 
  Department of Environmental Sciences...........................     9
    Prepared statement...........................................   173
    Responses to additional questions from:
        Senator Inhofe...........................................   178
        Senator Jeffords.........................................   194
Myers, Gary, professor of Neurology and Pediatrics, Department of 
  Neurology, University of Rochester Medical Center..............    44
    Prepared statement...........................................   299
Rice, Deborah C., toxicologist, Bureau of Remediation and Waste 
  Management, Maine Department of Environmental Protection.......    42
    Prepared statement...........................................   283
    Responses to additional questions from Senator Jeffords......   284
Soon, Willie, astrophysicist, Harvard-Smithsonian Center for 
  Astrophysics...................................................     6
    Prepared statement...........................................    58
    Responses to additional questions from Senator Jeffords......   155

                          ADDITIONAL MATERIAL

Articles:
    Climate Research, Vol. 23:89-110, 2003, Proxy Climatic and 
      Environmental Changes of the Past 1000 years..............127-148
    Energy & Environment Vol. 14, Nos. 2 and 3, 2003, 
      Reconstructing Climatic and Environmental Changes of the 
      Past 1000 Years: A Reappraisal.............................60-126
    Geophysical Research Letters, Vol. 31, Estimation and 
      Representation of Long-term (>40 year) Trends of Northern-
      Hemisphere-gridded Surface Temperature: A Note of Caution.149-154
    Original Contributions, Effects of Prenatal and Postnatal 
      Methylmercury Exposure From Fish Consumption on 
      Neurodevelopment..........................................302-308
    Personal Health, Tip the Scale in Favor of Fish: The 
      Healthful Benefits Await...................................   321
    Risk Analysis, Vol. 23, No. 1, 2003, Methods and Rationale 
      for Derivation of a Reference Dose for Methylmercury by the 
      U.S. EPA..................................................290-298
    The Atlanta Journal-Constitution, June 6, 2003, Clear Skies 
      Mercury Curb Put in Doubt..................................   317
    The Lancet, Prenatal Methylmercury Exposure from Ocean Fish 
      Consumption in the Seychelles Child Development Study.....309-316
    The New York Times, July 29, 2003, Does Mercury Matter? 
      Experts Debate the Big Fish Question.......................   319
    The Philadelphia Inquirer, March 7, 2003, Mercury Rising.....   318
Chart, National Mean Mercury Concentration in Tissues of Selected 
  Fish Species (all sample types)................................   289
Letter, to Senator Inhofe, from John Christy.....................   323
Report, EPRI, May 2003, A Framework for Assessing the Cost-
  Effectiveness of Electric Power Sector Mercury Control Policies.217-282

 
CLIMATE HISTORY AND THE SCIENCE UNDERLYING FATE, TRANSPORT, AND HEALTH 
                      EFFECTS OF MERCURY EMISSIONS

                              ----------                              


                         TUESDAY, JULY 29, 2003

                               U.S. Senate,
         Committee on Environment and Public Works,
                                                    Washington, DC.
    The committee met, pursuant to notice, at 9 o'clock a.m. in 
room 406, Senate Dirksen Building, Hon. James M. Inhofe 
(chairman of the committee) presiding.
    Present: Senators Inhofe, Allard, Carper, Clinton, Cornyn, 
Jeffords, Thomas and Voinovich.

 OPENING STATEMENT OF HON. JAMES M. INHOFE, U.S. SENATOR FROM 
                     THE STATE OF OKLAHOMA

    Senator Inhofe. The meeting will come to order.
    We have a policy that we announced when I became chairman 
of the committee that we will start on time, whether anyone is 
here or not here, members, witnesses or others. So I appreciate 
all of you being punctual in spite of the fact that the 
Senators are not.
    One of my primary objectives as chairman of the committee 
is to improve the way in which science is used. I think that 
when I became chairman of this committee, I announced three 
very outrageous things that we were going to do in this 
committee that have not been done before. No. 1, we are going 
to try to base our decisions, things that we do, on sound 
science. No. 2, we are going to be looking at the costs of some 
of these regulations, some of these policies that we have, and 
determine what they are going to be. And No. 3, we are going to 
try to reprogram the attitudes of the bureaucracy so that they 
are here not to rule, but to serve.
    Good public policy decisions depend on what is real or 
probable, not simply on what serves our respective political 
agendas. When science is debated openly and honestly, public 
policy can be debated on firmer grounds. Scientific inquiry 
cannot be censored. Scientific debate must be open. It must be 
unbiased. It must stress facts rather than political agendas.
    Before us today, we have two researchers who have published 
what I consider to be a credible, well-documented, and 
scientifically defensible study examining the history of 
climate change. Furthermore, these are top fields of inquiry in 
the Nation's energy environment debate and really the entire 
world's energy environment debate. We can all agree that the 
implications of this science are global, not only in terms of 
the environmental impacts, but also energy impacts, global 
trade impacts, and quite frankly, no less than global 
governance impacts.
    We could also all agree that as a result of the import and 
impact of these issues, it is absolutely crucial that we get 
this science right. False or incomplete or misconstrued data 
are simply not an acceptable basis for policymaking decisions 
in which the Congress of the United States is involved. Such 
data would violate the Data Quality Act, which we passed on a 
bipartisan basis here in the Senate and which we have 
bipartisanly embraced. If we need more data to satisfy our 
standards, then so be it.
    This Administration is prepared to do so in an aggressive 
strategy that the climate change strategic plan outlines. The 
1000-year climate study that the Harvard-Smithsonian Center for 
Astrophysics has compiled is a powerful new work of science. It 
has received much attention, and rightfully so. I would add at 
this time, it did not receive much attention from some of the 
liberal media who just did not want to believe that any of the 
facts that were disclosed were accurate.
    I think the same can be said in terms of work that has 
recently received attention of the hockey stick study. In many 
important ways, the Harvard-Smithsonian Center's work shifts 
the paradigm away from the previous hockey stick study. The 
powerful new findings of this most comprehensive study shiver 
the timbers of the adrift Chicken Little crowd.
    I look forward to determining whose data is most 
comprehensive, uses the most proxies, maintains the regional 
effects, avoids losing specificity through averaging 
statistics, considers more studies, and most accurately 
reflects the realities of the Little Ice Age, reflects the 
realities of the Medieval Warming Period, and more.
    Mercury presents a different set of issues. That would be 
our second panel. It is well-established that high levels of 
exposure to methyl-mercury before birth can lead to neuro-
development problems. But what about mercury consumed through 
fish, the most common form of prenatal exposure? Mercury makes 
its way into fish in various ways, but primarily through 
deposition from air emissions, with 80 percent of emissions 
deposited either regionally or globally, not locally. Global 
mercury emissions are about 5,000 tons a year. About half of 
those are man-made emissions.
    In the United States, a little more than 100 tons are 
emitted from non-power plant sources. Industry is making great 
strides in reducing these emissions. I would like to submit for 
the record this EPA document available on their Web site which 
indicates that when rules now on the books are fully 
implemented at non-power plant sources, nationwide emissions will be 
cut by nearly 50 percent. Power plants emit about 50 tons of 
mercury annually, about 1 percent of the worldwide emissions.
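    The arithmetic behind these figures can be laid out in a short 
sketch; the tonnages below simply restate the round numbers cited 
above and are not independent estimates.

        # Illustrative arithmetic only; every figure restates a round
        # number from the statement above, not an emissions inventory.
        global_emissions_tons = 5_000              # worldwide mercury emissions per year
        man_made_tons = global_emissions_tons / 2  # about half are man-made
        us_non_power_plant_tons = 100              # U.S. non-power-plant sources
        us_power_plant_tons = 50                   # U.S. power plants

        share = us_power_plant_tons / global_emissions_tons
        print(f"U.S. power plants: {share:.0%} of worldwide emissions")  # -> 1%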
    In setting policy, key questions need to be answered, such 
as how would controls change this deposition; what portion of 
mercury exposure cannot be controlled; and what are the health 
impacts of prenatal exposure. We will hear testimony today that 
indicates any changes to mercury exposure in fish would be 
minimal under even the most stringent proposal to regulate 
mercury. Today, we will also hear testimony that the most 
recent and comprehensive study to date found no evidence that 
prenatal mercury exposure from ocean fish presents a 
neurological risk.
    So we have diverse opinions that will be discussed today, 
and that is the reason for this hearing, to wade through that 
so that those on the panel that will be making policy decisions 
will understand. I think it is no secret that we are not 
scientists up here, so we look at things logically.
    With that, I would recognize one of my colleagues here that 
I have a great deal of respect for. Senator Voinovich and I 
started out together as we were mayors of cities almost 25 
years ago. I consider him to be one of the real experts in the 
area of air. In fact, I can remember calling him in as an 
expert when he was Governor of Ohio and we were holding these 
hearings and I was chairman at that time of the Clean Air 
Subcommittee. I would recognize Senator Voinovich for any 
comments he would like to make or opening statements.

  OPENING STATEMENT OF HON. GEORGE V. VOINOVICH, U.S. SENATOR 
                     FROM THE STATE OF OHIO

    Senator Voinovich. Thank you, Mr. Chairman.
    I want to congratulate you for the very comprehensive floor 
speech that you gave yesterday on the issue of climate change.
    Senator Inhofe. I guess I should apologize. It was 12,000 
words and I know you were anxious to get some floor time, so I 
appreciate your patience.
    Senator Voinovich. Your words were much more scientifically 
based than mine.
    [Laughter.]
    Senator Voinovich. The two issues that we are going to 
explore at the hearing today, the science of mercury and the 
science of climate change, are both important and timely. I 
commend you for holding this hearing.
    I think I do not have to remind you that we have had 
hearings on climate change now during the last 4 or 5 years. I 
think I even had a couple when I was chairman of the 
Transportation and Infrastructure Subcommittee. Senator Lieberman had 
hearings over in Governmental Affairs when he was chairman of 
the committee a year or so ago. So it is not a subject that is 
brand new to this committee.
    I have stated time and time again here in the committee and 
on the floor that we must recognize that energy policy and 
environmental policy are two sides of the same coin, and the 
Senate has responsibility to harmonize these policies. We have 
an obligation here in the committee to ensure that legislation 
that we consider will protect our environment. We also have an 
obligation to ensure that any legislation we consider takes 
into account its potential impact on our economy and we have a 
moral obligation to ensure that we consider a bill's particular 
impact on the poor and the elderly who must survive on fixed 
incomes.
    When the Senate takes up consideration of climate change 
and multi-pollutant legislation, we must keep that moral 
obligation in mind. We must ensure that we do not pass 
legislation that will significantly drive up the cost of 
electricity and home heating for those who can least afford 
them.
    Several members of this committee have introduced pieces of 
legislation this year to reduce power plant emissions, 
including mercury, and address the issue of carbon emissions 
and climate change by capping carbon. Examples include the 
Jeffords-Lieberman four-P bill, the Carper four-P bill, and the 
McCain-Lieberman climate change bill, which I understand will 
likely be offered as an amendment to the energy bill that we are 
going to be considering just this week.
    These bills would establish a nationwide cap on carbon 
emissions, and their passage would force the utility sector, 
which now uses coal to generate over half of our Nation's 
electricity, to rely solely on natural gas for generation. We 
will have fuel switching--capping carbon equals fuel switching 
equals no coal--despite the fact that we have over a 250-year 
supply of domestic coal and are currently in the grips of a 
natural gas crisis in this country.
    This crisis is a result of environmental policies that have 
driven up the use of natural gas in electricity generation 
significantly, while domestic supplies of natural gas have 
fallen, partly because we cannot do the exploration that we 
need to do for natural gas.
    The result is predictable: tightening supplies of natural 
gas, higher natural gas prices, and higher electricity prices. 
Home heating prices are up dramatically, forcing folks on low 
and fixed incomes to choose between heating their homes and 
paying for other necessities such as food or medicine. The 
language that has been offered by Senators Jeffords, McCain, 
Lieberman and Carper if enacted will force our utilities to 
fuel switch to natural gas; will significantly raise energy 
prices; and will cause thousands of jobs to be lost, 
particularly in manufacturing States like my State of Ohio, 
which is already under duress in terms of manufacturing.
    During the debate last year on the Jeffords-Lieberman four-
P bill, I put together a white paper that discussed the impact 
that the bill would have if it were enacted. The numbers are 
staggering: an overall reduction in GDP of $150 billion by 
2020, the loss of over 900,000 jobs by 2020, and a decline in 
national household earnings of $550 annually.
    The cost of climate-change language such as the McCain-
Lieberman bill could come without any benefits to our air 
quality or public health. Not even the most ardent supporter, 
and I hope this comes up, of carbon regulation will claim that 
there are demonstrable health benefits from carbon regulation. 
Yet the Energy Information Administration estimates that the 
passage of the McCain-Lieberman bill, if enacted, will raise 
petroleum product prices by 31 percent, raise natural gas 
prices by 79 percent, raise electricity prices by 46 percent, 
and reduce GDP by up to $93 billion by 2025.
    Carbon caps and unrealistic mercury caps mean fuel 
switching, again. Fuel switching means the end of manufacturing 
in my State and enormous burdens on the least of our brethren. 
It means moving jobs and production overseas, where there are 
less stringent environmental programs, and, if you really think 
about it, it will actually increase global levels of pollution.
    The question we face in this committee is whether we should 
do something reasonable to improve our understanding of the 
issues surrounding carbon emissions and climate change, and 
attempt to reduce atmospheric concentration of carbon and 
mercury emissions without harming our economy, or rush into 
short-sighted policy that will cap carbon and mercury at 
unreasonable levels, shut down our economy, cut thousands of 
jobs, and move manufacturing overseas.
    In a recent column, former Secretary of Energy James 
Schlesinger commented that:

          ``In climate change, we have only a limited grasp of the 
        overall forces at work. Uncertainties have continued to abound 
        and must be reduced. In any approach to policy formation, this 
        is very important, under conditions of such uncertainty should 
        be taken only on an exploratory or a sequential basis. A 
        premature commitment to a fixed policy could only proceed with 
        fear and trembling.''

    I would like to have that column inserted in the record, 
Mr. Chairman.
    Senator Inhofe. Without objection, so ordered.
    Senator Voinovich. As I mentioned previously once or twice, 
I am working with Chairman Inhofe and the Administration on 
moving Clear Skies forward, which I intend to mark up in my 
subcommittee this fall. I am currently working with business 
and environmental groups to find a bipartisan compromise on 
dealing with carbon and global warming, with an emphasis on 
sound science, carbon sequestration, development of clean coal 
technologies, and a responsible approach that focuses more on 
consensus rather than politics.
    We need more Senators to focus on moving forward in a 
responsible way and move away from harshly ideological 
positions that advance nothing other than the agenda of some 
environmental groups that have made carbon cap a political 
litmus test.
    I thank the chairman for holding this important hearing and 
I look forward to hearing the testimony from our witnesses.
    Senator Inhofe. That is an excellent opening statement, 
Senator Voinovich. I go back to one of your first sentences 
when you talked about the number of hearings we have had. We 
have to keep in mind that each new hearing has new data. For 
example, the 1,000-year Harvard-Smithsonian study was not even 
out until March of this year. So there are new things that are 
coming along, and I see a new trend-line, which I discussed on 
the floor of the Senate yesterday. So this will be a very 
valuable hearing.
    Senator Cornyn, would you have any opening statement to 
make?
    Senator Cornyn. I would like to reserve any statement until 
later, Mr. Chairman.
    Senator Inhofe. Yes, that is fine. First, I would like to 
ask the first panel to come up. Dr. Legates, Dr. Willie Soon 
and Dr. Mann, would you three come up? First of all, we are 
honored to have here today three witnesses whom I consider very 
excellent and professional scientists. Normally, we 
restrict the opening statements to 5 minutes, but it would be 
fine if you want to go about 7 minutes because I know you have 
come a long way and what we are dealing with here is probably 
one of the most significant things facing America, facing our 
economy, facing our environment today.
    So I would introduce all three. Dr. David Legates is the 
director of the Center for Climatic Research at the University 
of Delaware. Dr. Willie Soon is an astrophysicist at the Harvard-
Smithsonian Center for Astrophysics, and Dr. Michael Mann is 
assistant professor at the University of Virginia Department of 
Environmental Sciences. I will first ask Dr. Willie Soon to 
give his opening statement.

 STATEMENT OF WILLIE SOON, ASTROPHYSICIST, HARVARD-SMITHSONIAN 
                    CENTER FOR ASTROPHYSICS

    Dr. Soon. Mr. Chairman, distinguished Senators, my fellow 
panelists, Dr. Mann and Dr. Legates, and members of the 
audience, my name is Willie Soon. About a month or two ago, I 
became a very proud and grateful U.S. citizen. I just cannot 
believe where I am sitting today.
    I am an astrophysicist with the Harvard-Smithsonian Center 
for Astrophysics in Cambridge, Massachusetts. My training is in 
atmospheric and space physics. My research interests for the 
past 10 years include changes in the sun and their possible 
impact on climate.
    I am here today to testify that the climate of the 20th 
century is neither unusual nor the most extreme. Around 1,000 
years ago, the temperature over many parts of the world was 
warm. A widespread cooling then set in for several centuries, 
followed by a recovery to 20th century warming.
    My colleague and I collected the information on climate by 
proxy. We studied environmental indicators of local climate 
change going back some 1,000 years from many locations around 
the world. Based on work of approximately 1,000 researchers and 
hundreds of peer-reviewed papers, we conclude the following 
three points about climate history of the last 1,000 years.
    On a location-by-location basis, point No. 1, there was 
warming from 800 to 1300 A.D., all about 1,000 years ago, over 
many parts of the world. This period is called the Medieval 
Warm Period. Following the warming of 1,000 years ago was a 
general cooling from about 1300 to 1900 A.D. This period is 
called the Little Ice Age.
    Point No. 2, there is no convincing evidence from local 
proxy to suggest that the 20th century had higher temperatures 
or more extreme climate than the warm period 1,000 years ago.
    Point No. 3, local and regional changes, rather than global 
average changes, are the most relevant and practical measure of 
climate change and its impacts. Much of the climate proxy 
results used in our work are new. Most papers were published in 
the scientific literature in the last 5 to 10 years. There are 
two points to 
note about our methods. First, we keep the local or regional 
information contained in each climate proxy. This is important 
for studying geographical patterns of climate, which does not 
change everywhere at the same time.
    Second, climate is more than just temperature, so we keep 
climate information like rainfall, expansion or contraction of 
forests, advances or retreats of glaciers, et cetera. 
Our approach makes use of the richness of information in 
climate proxies, which map out local environmental and climate 
properties, rather than just temperature alone.
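    A minimal sketch of this location-by-location bookkeeping, 
using invented placeholder records rather than data from the 
study, shows how each proxy keeps its own site and climate 
variable instead of being merged into a single temperature 
series.

        # Hypothetical placeholder records; the sites, variables, and
        # flags are invented for illustration only.
        proxy_records = [
            {"site": "Site A", "variable": "tree-ring width", "medieval_anomaly": True},
            {"site": "Site B", "variable": "glacier extent",  "medieval_anomaly": True},
            {"site": "Site C", "variable": "lake sediment",   "medieval_anomaly": False},
        ]

        # Tally site by site rather than averaging into one hemispheric curve.
        count = sum(rec["medieval_anomaly"] for rec in proxy_records)
        print(f"{count} of {len(proxy_records)} records show a medieval anomaly")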
    The entirety of climate proxies over the last 1,000 years 
shows that over many areas of the world, there have been and 
continue to be large local climatic changes. Those changes 
provide important tests for the computer simulations of 
climate. The full models, which explore the Earth region by 
region can be tested against the natural patterns of change 
over the last 1,000 years that are detailed by the climate 
proxies.
    Having computer simulations reproduce past patterns of 
climate, which have been influenced predominantly by natural 
factors, is key to making an accurate forecast that includes 
all potential human-made warming and cooling effects.
    In summary, based on expert conclusions from climate 
proxies in several hundred peer-reviewed papers by over 1,000 
researchers from around the world, we find the following. No. 
1, from one location to another, large natural swings in 
climate have occurred over the last 1,000 years. Those patterns 
have not always been synchronous.
    No. 2, there was widespread warmth about 1,000 years ago, 
followed by widespread cooling ending by the beginning of the 
20th century.
    No. 3, the local and regional climate proxies cannot 
confirm that the 20th century is the warmest or most extreme 
over much of the world, compared especially to the Medieval 
Warm Period approximately 1,000 years ago.
    This is all for my oral remarks and I thank you for the 
opportunity to be here.
    Senator Inhofe. Dr. Soon, we appreciate that excellent 
opening statement. You did not even take all of your time. That 
is very unusual.
    At this time, Dr. Mann if you don't mind, I would like to 
interrupt your testimony. We have been joined by the Ranking 
Minority Member, Senator Jeffords. Senator Jeffords, do you 
have an opening statement you would like to make at this time?
    Senator Jeffords. I would ask unanimous consent that it be 
made as part of the record and would prefer listening to the 
witnesses.
    [The prepared statement of Senator Jeffords follows:]

   Statement of Hon. James Jeffords, U.S. Senator from the State of 
                                Vermont

    We're here today to discuss two very important topics--climate 
change and mercury pollution. As most of you know, I am the author of 
ambitious legislation--the Clean Power Act of 2003--which addresses 
these environmental problems, as well as ozone, acid rain, and human 
health damage from fine particulate matter.
    Unfortunately, we aren't here today to talk about moving forward to 
find innovative solutions to these real world problems. Instead, 
today's hearing will largely be a mirror or the reverse of the robust 
and growing consensus in the mainstream scientific community on climate 
and mercury pollution.
    The disappointing result will be more delay. Delay on the part of 
Congress, and even worse, the ongoing backsliding on the part of the 
Administration, means that we fail to act responsibly as a society to 
protect future generations. That means increasingly greater risks of 
global warming and mercury poisoning.
    There is no doubt that the scientific process must inform 
policymakers as new information comes in. Unfortunately, there is no 
new information to be found here today that would dissuade us from 
acting quickly and responsibly to reduce greenhouse gas and mercury 
emissions. In today's discussion of a literature survey of climate 
research, the skeptics are trotting out an argument that is several 
years old and already discarded by their peers.
    It is abundantly clear that now is the time to act.
     The National Academy of Sciences has said, ``Despite the 
uncertainties, there is general agreement that the observed warming is 
real and particularly strong within the past 20 years.''

     NOAA currently says that,

          ``The climatic record over the last thousand years clearly 
        shows that global temperatures increased significantly in the 
        20th Century, and that this warming was likely to have been 
        unprecedented in the last 1200 years.''

     EPA's website says that, ``There is new and stronger 
evidence that most of the warming over the last 50 years is 
attributable to human activities.''
    One would have to be madder than a March hare to fail to see the 
need to act. Yet, the Administration's new research plan falls squarely 
into hare territory--denying the reality staring them in the face.
    I want to show you the latest odds on warming. MIT says that there 
is a one in five chance that the temperature of the earth will warm by 
approximately 4 or 5 degrees over the course of this century, assuming 
there is no action to reduce emissions.
    As my dear departed friend, Senator John Chafee, said in 1989:

          ``It is clear that we are facing a serious threat. The 
        scientists are telling us that if we continue to stroll along 
        as if everything is fine, we will transform Earth into a planet 
        that will not be able to support life as we now know it.''

    While mercury contamination does not have the same dramatic effect 
on earth's systems, it is still a dangerous global and local pollutant 
because it is bio-accumulative and toxic to human health.
    Long ago, Congress decided that toxic air emissions should be 
reduced and took very aggressive steps in 1990 to make that happen, 
especially if they fall into the Great Lakes and other great waters 
like Lake Champlain. Unfortunately, the Agency has fallen significantly 
behind in complying with the Clean Air Act's schedule. A settlement 
agreement mandates controlling toxic air pollutants from utilities by 
2008.
    In 1998, related to the controversy around EPA's late reports to 
Congress on utility air toxics, Congress directed the National Academy 
of Sciences (NAS) to recommend an appropriate reference dose for 
mercury exposure. In 2000, the NAS reported that EPA's reference dose 
was scientifically sound and adequate to protect most Americans. That 
NAS review considered all health effects studies, including the 
Seychelles study that we'll discuss today.
    We know that mercury is a potent toxin. It affects the human brain, 
spinal cord, kidneys, liver and the heart. It affects the ability to 
feel, see, taste and move. We know that mercury can affect fetal 
development, preventing the brain and nervous system from developing 
normally. Long term exposure to mercury can result in stupor, coma and 
personality changes.
    ``Mad as a Hatter'' is the phrase that was used in the 1800's to 
describe the employees of the felt hat industry whose constant exposure 
to mercury changed their behavior. Fortunately, Americans' exposure from 
commercial and recreational fish consumption is substantially less than 
that, though dozens of health warnings are posted nationwide.
    But, it's crazy for anyone to suggest that we should not reduce 
mercury emissions significantly, since we know its health effects and 
we have the technologies to control it.
    We should have a hearing on how to export those control 
technologies and Congress should urge the Administration to negotiate 
binding global reductions in mercury, as the Senate did last year in 
the Energy bill for greenhouse gas emissions.
    At a minimum, we should pass four-pollutant legislation now that 
gets reductions faster and deeper than required by the current Clean 
Air Act. I'm sad to say that there have been no negotiations on that 
front since I initiated some in early 2002. And the Administration has 
done nothing to reduce these emissions with its abundant authority in 
the Act.
    We can't afford to leave these problems to future generations to 
solve. We can't let our children and grandchildren wake up to find that 
our delays have cost them dearly in terms of health and the global and 
local environment. It's time to act responsibly.
    Finally, I ask that material from the journal EOS, the NOAA 
website, the Atlanta Journal Constitution, the National Center for 
Atmospheric Research, and the American Geophysical Union be included in 
the hearing record.

    Senator Jeffords. I might point out, we have got to do 
something about this traffic out there.
    [Laughter.]
    Senator Inhofe. Well, the name of our subcommittee is 
Transportation and Infrastructure, so maybe we can do something 
about the traffic out there.
    Senator Jeffords. I hope so.
    Senator Inhofe. Dr. Mann, you are recognized.

 STATEMENT OF MICHAEL E. MANN, ASSISTANT PROFESSOR, UNIVERSITY 
       OF VIRGINIA, DEPARTMENT OF ENVIRONMENTAL SCIENCES

    Dr. Mann. Senators, my name is Michael Mann. I am a 
professor in the Department of Environmental Sciences at the 
University of Virginia. My research involves the study of 
climate variability and its causes. I was a lead author of the 
IPCC Third Scientific Assessment report. I am the current 
organizing committee chair for the National Academy of 
Sciences' Frontiers of Science, and have served as a committee 
member or adviser for other National Academy of Sciences' 
panels.
    I have served as editor for the Journal of Climate of the 
American Meteorological Society for 3 years and I am a member 
of the advisory panel for the NOAA Climate Change Data and 
Detection Program. I am a member of numerous other 
international and U.S. scientific working groups, panels and 
steering committees. I have coauthored more than 60 peer-
reviewed publications on diverse topics within the fields of 
climatology and paleoclimatology.
    Honors I have received include selection in 2002 as one of 
the 50 leading visionaries in science and technology by 
Scientific American magazine, and the outstanding scientific 
publication award of NOAA for 2000.
    In my testimony here today, I will explain, No. 1, how 
mainstream climate researchers have come to the conclusion that 
late 20th century warmth is unprecedented in a very long-term 
context and that this warmth is likely related to the activity 
of human beings; and No. 2, why a pair of recent articles 
challenging these conclusions by astronomer Willie Soon and his 
coauthors are fundamentally unsound.
    It is the consensus of the climate research community that 
the anomalous warmth of the late 20th century cannot be 
explained by natural factors, but instead indicates significant 
anthropogenic, that is human influences. This conclusion is 
embraced by the position statement on climate change and 
greenhouse gases of the American Geophysical Union, by the 2001 
report of the IPCC, the Intergovernmental Panel on Climate 
Change, and by a National Academy of Sciences' report that was 
solicited by the Bush Administration in 2001.
    More than a dozen independent research groups have now 
reconstructed the average temperature of the northern 
hemisphere in past centuries, both by employing natural 
archives of past climate information or proxy indicators such 
as tree rings, corals, ice cores, lake sediments and historical 
documents, and through the use of climate model simulations. If 
I can have the first exhibit here, as shown in this exhibit, 
the various proxy reconstructions agree with each other, as 
well as with the model simulations, all of which are shown, 
within the estimated uncertainties. That is the gray-shaded 
region.
    The proxy reconstructions, taking into account these 
uncertainties, indicate that the warming of the northern 
hemisphere during the late 20th century, that is the northern 
hemisphere, not the globe, as I have sometimes heard my study 
incorrectly referred to, the northern hemisphere during the 
late 20th century, that is the end of the red curve, is 
unprecedented over at least the past millennium and it now 
appears based on peer-reviewed research, probably the past two 
millennia.
    The model simulations demonstrate that it is not possible 
to explain the anomalous late-20th century warmth without the 
contribution from anthropogenic influences. These are the 
consensus conclusions of the legitimate community of climate 
and paleoclimate researchers investigating such issues.
    Astronomers Soon and Baliunas have attempted to challenge 
the scientific consensus based on two recent papers, henceforth 
collectively referred to as SB, that completely misrepresent 
the past work of other legitimate climate researchers and are 
deeply flawed for the following reasons. No. 1, SB make the 
fundamental error of citing evidence of either wet or dry 
conditions as being in support of an exceptional Medieval Warm 
Period. Such an ill-defined criterion could be used to define 
any period of climate as either warm or cold. It is pure 
nonsense.
    Experienced paleoclimate researchers know that they must 
first establish the existence of a temperature signal in a 
proxy record before using it to try to reconstruct past 
temperature patterns. If I can have exhibit two, this exhibit 
shows a map of the locations of a set of records over the globe 
that have been rigorously analyzed by my colleagues and me for 
their reliability as long-term temperature indicators. I will 
refer back to that graphic shortly.
     No. 2, it is essential to distinguish between regional 
temperature changes and truly hemispheric or global changes. 
Average global or hemispheric temperature variations tend to be 
far smaller in their magnitude than those for particular 
regions. This is due to a tendency for the cancellation of 
simultaneous warm and cold conditions in different regions, 
something that anybody who follows the weather is familiar 
with, in fact.
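    The cancellation effect just described can be illustrated 
with a short sketch using arbitrary, invented anomalies rather 
than any reconstruction data: an average over regions that do 
not warm and cool together fluctuates far less than any single 
region does.

        import numpy as np

        # Synthetic anomalies; the region count and magnitudes are
        # illustrative assumptions, not values from any reconstruction.
        rng = np.random.default_rng(seed=0)
        n_regions, n_years = 20, 1000
        regional = rng.normal(loc=0.0, scale=1.0, size=(n_regions, n_years))

        hemispheric = regional.mean(axis=0)  # average over regions, year by year

        print("typical regional swing   :", round(float(regional.std()), 2))
        print("typical hemispheric swing:", round(float(hemispheric.std()), 2))
        # The averaged series varies roughly 1/sqrt(n_regions) as much,
        # unless the regions warm at the same time, as the late 20th
        # century records do.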
    As shown by exhibit three, if I can have that up here as 
well now, thank you, this exhibit plots the estimated 
temperature for various locations shown in the previously 
displayed map. As you can see, the specific periods of relative 
cold and warm, blue and red, differ greatly from region to 
region. Climatologists, of course, know this. What makes the 
late 20th century unique is the simultaneous warmth indicated 
by nearly all the long-term records. It is this simultaneous 
warmth that leads to the anomalous late-20th century warmth 
evident for northern hemisphere average temperatures.
    The approach taken by SB does not take into account whether 
warming or cooling in different regions is actually coincident, 
despite what they might try to tell you here today.
     No. 3, as it is only the past few decades during which 
northern hemisphere temperatures have exceeded the bounds of 
natural variability, any analysis such as SB that compares past 
temperatures only to early or mid-20th century conditions is 
not meaningful. You repeatedly hear Dr. Soon refer to the 20th 
century, but climatologists do not consider that a meaningful 
baseline, because there has been a dramatic warming during the 
20th century, and the early 20th century and the late 20th 
century are almost as different as the late 20th century and 
any other period during the past 1,000 years, at least. So a 
study that refers only to early or mid-20th century conditions, 
or generic 20th century conditions, and does not specifically 
address the late 20th century, cannot address the issue of 
whether or not late-20th century warmth is anomalous in a 
long-term context.
    To summarize, late-20th century warming is unprecedented in 
modern climate history at hemispheric scales. A flawed recent 
claim to the contrary by scientists lacking expertise in 
paleoclimatology is not taken seriously by the scientific 
community.
    The anomalous recent warmth is almost certainly associated 
with human activity and this is the robust consensus view of 
the legitimate climate research community.
    Thank you.
    Senator Inhofe. Thank you, Dr. Mann.
    Dr. Legates.
    First, I would ask Senator Allard, did you want to make an 
opening statement?
    Senator Allard. Mr. Chairman, I do have an opening 
statement and in deference to the panel and you I would just 
like to have it put in the record. If you would do that, then I 
would be happy.
    Senator Inhofe. Without objection.
    [The prepared statement of Senator Allard follows:]

Statement of Hon. Wayne Allard, U.S. Senator from the State of Colorado

    Mr. Chairman, I want to thank you for holding this important 
hearing today.
    As a veterinarian, I have some scientific training in my 
background. I strongly believe that we should use scientific principles 
as a guidepost when formulating any regulation. This scientific 
guidepost approach is particularly important when looking at 
regulations with the implications and magnitude of regulations on 
climate change and mercury control.
    Climate change has been an ongoing discussion for many years. 
However, during the 1970's the concerns were exactly opposite of what 
they are now. Then we were told that there was a threat of massive 
global cooling. Headlines screamed that we were in danger of entering 
another ice age. Now we are told that massive warming trends are going 
to cause overheating across the globe. We need answers, not rhetoric.
    All of the witnesses here today have a great deal of experience. 
All of the witnesses here have spent many years analyzing data related 
to the areas of their expertise. But, I am concerned that, at times, 
data may be reviewed selectively and in isolation. I am also concerned 
that emphasis may fall on a limited number of studies. In science we 
have all learned that the only way to solidly prove a theory is by 
conducting tests, studies or experiments that repeatedly arrive at the 
same result. We cannot simply ignore the studies that do not have the 
outcome we are looking for. This applies whether we are looking at 
climate change, mercury or any other issue.
    I want to spend most of my time and attention today on potential 
mercury regulations. While today's hearing is intended to focus on 
science, I would also like to touch on the impact that potential 
regulations will have on the economy of my state and the west. As many 
of you know, western coal differs from other types of coal in several 
ways. The lower chlorine content in western coal makes it more 
difficult to remove mercury when burning it. And, while western coal 
does contain mercury, when it is burned it gives off mercury in the 
elemental form. It is my understanding that this is not the type of 
mercury that deposits in the ecosystem to potentially be absorbed by 
the environment.
    The economies of Colorado, and the entire west, will be impacted by 
harsh regulations placed on their coal. Economies undoubtedly will be 
damaged by the decrease in use of coal mined in the West. In addition 
to the jobs lost through the inability to fully utilize western coal 
supplies, if power can no longer be generated using coal mined in the 
West, other less efficient coal types will have to be transported 
across long distances. This additional expenditure 
will add to the price of electricity generation, driving up electricity 
costs and further damaging an economy that will already be struggling.
    This is why it is so important to me that we be cautious when 
dealing with situations such as these and why we should place strong 
emphasis on the use of sound science. Our regulations must be 
thoughtful reflections of what we know--they should not be reflexive or 
reactive attempts to legislate a cure before we know what the disease 
is.
    Again, Mr. Chairman, thank you for holding this hearing. I look 
forward to hearing the witness testimony and discussions to come.

    Senator Inhofe. That being the case, let's dispense with 
any further opening statements.
    Dr. Legates, thank you very much for being here. You are 
recognized.

 STATEMENT OF DAVID R. LEGATES, DIRECTOR, CENTER FOR CLIMATIC 
                RESEARCH, UNIVERSITY OF DELAWARE

    Dr. Legates. Thank you. Mr. Chairman, Distinguished 
Senators, Doctors Mann and Soon, and members of the audience, I 
would like to thank the committee for inviting my commentary on 
this important topic of climate history and its implications. 
My research interests have focused on hydroclimatology, that 
is, the study of water in the atmosphere and on the land, as 
well as the application of statistical methodology in 
climatological research.
    I am familiar with the testimony presented here by Dr. 
Soon. My contributions to Dr. Soon's research stem from my 
grappling with the striking disagreement between the 
longstanding historical record and the time series recently 
presented by Dr. Mann and his colleagues. It also stems from my 
own experiences in compiling and merging global estimates of 
air temperature and precipitation from a variety of disparate 
sources.
    My Ph.D. dissertation resulted in the compilation of high-
resolution climatologies of global air temperature and 
precipitation. From that experience, I have become acutely 
aware of the issues associated with merging data from a variety 
of sources and containing various biases and uncertainties. By 
its very nature, climatological data exhibit a number of 
spatial and temporal biases that must be taken into account. 
Instrumental records exist only for the last century or so, and 
thus proxy records can only be used to glean information about 
the climate for earlier time periods. But it must be noted that 
proxy records are not observations and strong caveats must be 
considered when they are used. It must also be noted that 
observational data are not without bias either.
    Much research has described both the written and oral 
histories of the climate, as well as the proxy climate records. 
It is recognized that such records are not without their 
biases. For example, trees respond not to just air temperature 
fluctuations, but to the entire hydrologic cycle, including 
water supply, precipitation, and demand, which is only in part 
driven by air temperature.
    Nevertheless, such accounts indicate that the climate of 
the last millennium has been characterized by considerable 
variability and that extended periods of cold and warmth 
existed. It has been generally agreed that during the early 
periods of the last millennium, air temperatures were warmer 
and that temperatures became cooler toward the middle of the 
millennium. This gave rise to the terms the Medieval Warm 
Period and the Little Ice Age, respectively. However, as these 
periods were not always consistently warm or cold, nor were the 
extremes geographically commensurate in time, such terms must 
be used with care.
    In a change from its earlier reports, however, the Third 
Assessment Report of the Intergovernmental Panel on Climate 
Change, and now the U.S. National Assessment of Climate Change, 
both indicate that hemispheric and global air temperatures 
followed a curve developed by Dr. Mann and his colleagues in 
1999. This curve exhibits two notable features, and I will 
point back to Dr. Mann's exhibit one that he showed a moment 
ago. First is a relatively flat and somewhat decreasing trend 
in air temperature that extends from 1000 A.D. to about 1900 
A.D. This feature is an outlier that is in contravention to 
thousands of authors in the peer-reviewed literature.
    This is followed by an abrupt rise in the air temperature 
during the 1900's that culminates in 1998 with the highest 
temperature on the graph. Virtually no uncertainty is assigned 
to the instrumental record of the last century. This conclusion 
reached by the IPCC and the National Assessment is that the 
1990's was the warmest decade, with 1998 being the warmest year 
of the last millennium.
    Despite the large uncertainty, the surprising lack of 
significant temperature variations in the record gives the 
impression that climate remained relatively unchanged 
throughout most of the last millennium, at least until human 
influences began to cause an abrupt increase in temperatures 
during the last century. Such characterization is a scientific 
outlier. Interestingly, Mann et al replace the proxy data for 
the 1900's by the instrumental record and present it with no 
uncertainty characterization. This, too, yields the false 
impression that the instrumental record is consistent with the 
proxy data and that it is error-free. It is neither.
    The instrumental record contains numerous uncertainties, 
resulting from measurement errors, a lack of coverage over the 
world's oceans, and underrepresentation of mountainous and 
polar regions, as well as undeveloped nations and the presence 
of urbanization effects resulting from the growth of cities. As 
I stated before, the proxy records only in part reflect 
temperature. Therefore, a simultaneous presentation of the 
proxy and instrumental record is the scientific equivalent to 
calling apples and oranges the same fruit.
    Even if a modest uncertainty of plus or minus one-tenth of 
a degree Celsius were imposed on the instrumental record, the 
claim of the 1990's being the warmest decade would immediately 
become questionable, as the uncertainty window would overlap 
with the uncertainty associated with earlier time periods. 
Note, too, that if the satellite temperature record, where 
little warming has been observed over the last 20 years, had 
been inserted instead of the instrumental record, it would be 
impossible to argue that the 1990's was the warmest decade. 
Such a cavalier treatment of scientific data can create 
scientific outliers, such as the Mann et al curve.
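    The overlap argument above can be made concrete with a 
short sketch; the anomaly values below are hypothetical 
placeholders chosen only to show how a modest instrumental 
uncertainty can erase a claimed ranking, not numbers taken from 
any reconstruction.

        def intervals_overlap(a, a_err, b, b_err):
            # True if the ranges [x - err, x + err] share any values.
            return (a - a_err) <= (b + b_err) and (b - b_err) <= (a + a_err)

        # Hypothetical decade-average anomalies in degrees Celsius.
        recent, recent_err = 0.35, 0.10      # instrumental value with +/- 0.1 C imposed
        medieval, medieval_err = 0.20, 0.25  # proxy value with wider uncertainty

        if intervals_overlap(recent, recent_err, medieval, medieval_err):
            print("Ranges overlap: the 'warmest decade' ranking is not secure.")
        else:
            print("Ranges are distinct: the ranking would be defensible.")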
    So we are left to question why the Mann et al curve seems 
to be at variance with the previous historical characterization 
of climatic variability. Investigating more than several 
hundred studies that have developed proxy records, we came to 
the conclusion that nearly all of these records show 
considerable fluctuations in air temperature over the last 
millennium. Please note that we did not reanalyze the proxy 
data. The original analysis from the various experts was left 
intact, as it formed a voluminous refereed scientific 
literature. Most records show the coldest period is 
commensurate with at least a portion of what is termed the 
Little Ice Age, and the warmest conditions are concomitant with 
at least a portion of what is termed the Medieval Warm Period.
    Our conclusion is entirely consistent with conclusions 
reached by Drs. Bradley and Jones that not all locations on the 
globe experienced cold or warm conditions at the same time. 
Moreover, we chose 
not to append the instrumental record, but to compare apples 
with apples and determine if the proxy records themselves 
indeed confirm the claim of the 1990's being the warmest decade 
of the last millennium. That claim is not borne out by the 
individual proxy records.
    However, the IPCC report in the chapter with Dr. Mann as 
the lead author and his colleagues as contributing authors, 
also concludes that the research ``support the idea that the 
15th to 19th centuries were the coldest of the millennium over 
the northern hemisphere overall.'' Moreover, the IPCC report 
also concludes that the Mann and Jones research shows 
temperatures from the 11th to 14th centuries to be ``warmer 
than those from the 15th to 19th centuries.'' This again is 
entirely consistent with our findings and in contravention of 
their own error assessment.
    Where we differ with Dr. Mann and his colleagues is in the 
construction of the hemisphere average time series and their 
assertion that the 1990's was the warmest decade of the last 
millennium. Reasons why the Mann et al curve fails to retain 
the fidelity of the individual proxy records are detailed 
statistical issues into which I will not delve. But a real 
difference of opinion focuses solely on the Mann et al curve, 
and how it is an outlier compared to the balance of evidence on 
millennial climate change. In a very real sense, this is a 
fundamental issue that scientists must address before the Mann 
et al curve can be taken as fact.
    In closing, let me state that climate is simply more than 
annually averaged global air temperature. Too much focus, I 
believe, has been placed on defining air temperature time 
series and such emphasis obscures the true issue in 
understanding climate change and variability. If we are truly 
to understand climate and its impacts and driving forces, we 
must push beyond the tendency to distill climate to a single 
annual number. Proxy records which provide our only possible 
link to the past are incomplete at best. But when these 
voluminous records are carefully and individually examined, one 
reaches the inescapable conclusion that climate variability has 
been a natural occurrence and especially so over the last 
millennium.
    Given the uncertainties and biases associated with the 
proxy and instrumental records----
    Senator Inhofe. Dr. Legates, we are going to have to cut it 
off. You have exceeded your time and I am sure you will have an 
opportunity to finish your thoughts during the question and 
answer period.
    Dr. Legates. Thank you for the privilege.
    Senator Inhofe. We are going to, if it is all right, use 5 
minutes and maybe try to get a few rounds here. Is that 
acceptable? These will be 5 minute rounds for questioning. I 
will start.
    First of all, Senator Thomas joined us. Thank you for 
coming, Senator Thomas.
    I will address my first question to Dr. Legates. In my 
speech on the Senate floor yesterday, I noted your comments 
regarding--can you find that chart of those comments?--the 
comments regarding Dr. Mann's work as shown on the chart. I 
have a small copy of this. No, that is not it. It is this chart 
right here. OK.
    First of all, this is a comparison. As I mentioned in my 
opening statement, we sit up here as non-scientists so we try 
to look at these things and see what is logical, how we should 
weigh and compare diverse opinions. Now, the first thing I 
noticed was that Dr. Mann, yours I believe was in the area of 
the timeframe of 1999----
    Dr. Mann. Excuse me. That is incorrect.
    Senator Inhofe [continuing]. And Dr. Soon, you are 2003. So 
I think that the timing would mean something because I know 
that this is not a static target. This is a moving target.
    May I first ask Dr. Legates, do you stand by the statements 
that are made on this chart up here, on the contrasting methods 
that were used?
    Dr. Legates. I have not had a chance to actually look at 
the chart before now.
    Senator Inhofe. Is this the one that he had here? OK, let's 
put that up. All right, then, this statement here,

          ``Although Mann's work is now widely used as proof of 
        anthropogenic global warming, we have become concerned that 
        such analysis is in direct contradiction to most of the 
        research and written histories available. My paper shows this 
        contradiction and argues that the results of Mann are out of 
        step with the preponderance of the evidence.''

    I am not Tim Russert, but do you stand by these statements?
    Dr. Legates. I do stand by them, sir.
    Senator Inhofe. All right. I note that you are an expert in 
statistical techniques. In my speech on the Senate floor 
yesterday, I noted that even assuming all of the science used 
by the political left, come the end of 50 years hence, the 
Kyoto Protocol would have no measurable effect on temperature. 
Do you agree with that?
    Dr. Legates. Yes, generally.
    Senator Inhofe. And if the Kyoto Protocol forces harsher 
mandates, does it follow that the weaker legislative proposals 
that are out there right now before us in the Senate would have 
likewise no measurable effect?
    Dr. Legates. That is likely true.
    Senator Inhofe. All right. Let's see. Dr. Mann, since you 
have characterized your colleagues there in several different 
ways as nonsense, illegitimate, and inexperienced, let me ask 
you if you would use the same characterization of another 
person that I quoted on the floor yesterday. I would like to 
call your attention to the recent op/ed in the Washington Post 
by Dr. James Schlesinger, who was Energy Secretary under 
President Carter. In it, he wrote, ``There is an idea among the 
public that the science is settled. That remains far from the 
truth.'' He has also acknowledged the Medieval Warming Period 
and the Little Ice Age. Do you question the scientific 
integrity of Dr. Schlesinger?
    Dr. Mann. I do not think I have questioned scientific 
integrity. I have questioned scientific expertise in the case 
of Drs. Willie Soon and David Legates with regard to issues of 
paleoclimate. As far as Schlesinger is concerned, I am not 
familiar with any peer-reviewed work that he has submitted to 
the scientific literature, so I would not be able to evaluate 
his comments in a similar way. If I could clarify one----
    Senator Inhofe. OK. Well, you can't because there isn't 
time. I am going to stay within my timeframe and I want to get 
to questions so others will have plenty of opportunity to 
respond to questions I am sure.
    Dr. Soon, how many studies did you examine in total and how 
many were appropriate for the criteria you established?
    Dr. Soon. Senator, the number is roughly in the order of, 
if you speak in terms of the peer-reviewed literature, I would 
say several hundred. And the number of people involved in these 
paleoclimatic research would be at least 1,000. Of course, I 
have to emphasize I am not a paleoclimate scientist, but all of 
us are ruled by one simple goal, to understand the nature of 
how climate works. The basis to get to the goal is to figure 
out the exact expressions of the physical laws.
    The short answer is there is a huge body of literature 
that we consulted that fits the criteria. This is why we wrote 
it as a scientific paper.
    Senator Inhofe. I was trying to get to the 240 proxies that 
were used and the number used.
    Dr. Soon. Yes, we listed about 240 proxy studies in our 
papers.
    Senator Inhofe. Last, I would say, do you have more data in 
your study than Dr. Mann did in his 1999 work? And is your data 
newer?
    Dr. Soon. Yes. I would emphasize that most of the proxy 
records come from the most recent 5 years.
    Senator Inhofe. Thank you, Dr. Soon.
    Senator Jeffords.
    Senator Jeffords. Dr. Mann, would you care to respond?
    Dr. Mann. Yes, first of all I wanted to clarify a 
misstatement earlier on the part of Senator Inhofe. The results 
that I showed in my first graphic demonstrate that it is the 
clear consensus of the climate research community that a number 
of different estimates, not just ours, but at least 12 
different estimates of the history of the northern hemisphere 
average temperature for the past 1,000 years, give essentially 
the same result within the uncertainties. We published a paper 
just a month ago demonstrating that that is a robust result of 
a large number of mainstream researchers in the climate 
research community.
    Phil Jones and I also have a paper in press in Geophysical 
Research Letters, which demonstrates those 
results further. So in fact, the latest word and the word of 
the mainstream climate research community is the one that I 
have given you earlier.
    Now, as far as the issue of data, how much data was used, 
there are a number of misstatements that have been made about 
our study. One of them is with regard to how much data we used. 
We used literally hundreds of proxy records. We often 
represented those proxy records, as statistical climatologists 
often do, in what we call a state space. We represented them in 
terms of a smaller number of variables to capture the leading 
patterns of variability in the data. But we used hundreds of 
proxy indicators, more in fact than Dr. Soon referred to. In 
fact, we actually analyzed climate proxy records. Dr. Soon did 
not.
    Senator Jeffords. Dr. Soon, in a 2001 article in Capitalism 
magazine, you said that because of the pattern of frequent and 
rapid changes in climate throughout the Holocene period, we 
should not view the warming of the last 100 years as a unique 
event or as an indication of manmade emissions' effect on the 
climate.
    But according to NOAA's Web site ``upon close examination 
of these warm periods,'' including all the ones that you cited 
in your past and most recent article,

          ``It became apparent that these periods are not similar to 
        the 20th century warming for two specific reasons. One, the 
        periods of hypothesized past warming do not appear to be global 
        in extent or, two, the period of warmth can be explained by 
        known natural climate forcing conditions that are uniquely 
        different than those of the past 100 years.''

    Why didn't either of your articles make an impact on the 
state of the science or NOAA's position?
    Dr. Soon. Thank you for your question, Senator. As you may 
be aware, my paper just got published this year, January 2003 
and April 2003, so it is all fairly recent. I have just written 
up this paper very recently, so I do not know what impact it 
will have on the general community, but I do know all my work 
was done consulting the works of all major paleoclimatologists 
in the field, including Dr. Mann and his esteemed colleagues.
    As to the comments about the Capitalism magazine, I am not 
aware of that particular magazine. I do not know whether I 
submitted anything to this journal or this magazine. I do stand 
by the statement that it is important to look at the local and 
regional change before one takes global averages because 
climate tends to vary in very large swings in different parts 
of the world. That really is the essence of climate change and 
one ought to be really looking very carefully at the local and 
regional change first, and also one should not look strictly at 
only the temperature parameter, as Dr. Mann has claimed to have 
done. That I think is very important to take into account.
    Senator Jeffords. Dr. Mann, could you comment?
    Dr. Mann. Yes. Both of those statements are completely 
incorrect. If Dr. Soon had actually read any of the papers that 
we have published over the past 5 years or so, he would be 
aware of the fact that we use statistical techniques to 
reconstruct global patterns of surface temperature. We average 
those spatial patterns to estimate a northern hemisphere mean 
temperature, just as scientists today seek to estimate the 
northern hemisphere average temperature from a global network 
of thermometer measurements. We use precisely the same approach 
based on proxy reconstructions of spatial patterns of surface 
temperature.
    So what Dr. Soon has said is completely inaccurate. The 
first line on that contrasting methods table up there is also 
completely inaccurate.
    In terms of variables other than temperature, my colleagues 
and I have published several papers reconstructing continental 
drought over North America and reconstructed sea-level pressure 
patterns. We have looked at just about every variable that 
climatologists are interested in from the point of view of 
paleoclimate indicators. I think Dr. Soon needs to review my 
work more carefully.
    Senator Inhofe. Thank you, Senator Jeffords.
    Senator Allard.
    Senator Allard. Thank you, Mr. Chairman.
    In my mind, I do not think there is any question that the 
climate has shown a period of warming here. The question that I 
bring up, and where I see the debate, is what is causing it and 
whether the changes that are happening are significant or not.
    I also wonder what your thinking is on what this world might 
look like 1,000 years from now, looking at the data that we have 
now. I wondered if maybe each one of you would just give me a 
brief response as to how what we are seeing today may look 
projected out 1,000 years from now. I will start 
with Dr. Soon.
    Dr. Soon. The factors causing climate change are extremely 
complicated. As I emphasized already, I am very much interested 
to learn how the climate changes on a local or regional scale 
first before I can speak in terms of global climate. After all, 
local and regional climate are indeed the most relevant 
climatic factors that influence human activities, and that are 
influenced by them in turn.
    As to the factors of climate change, I believe that it is 
still extremely difficult to confirm that, let's say, even the 
late 20th century warming has anything to do with 
CO2. We do know that the CO2 is rising, 
but at the same time we know that climate depends on many other 
factors. It could be doing it internally all by itself because 
of ocean current movements. It could be done, for example, by 
variability imposed externally from the sun, variable outputs. 
Our sun is a variable star. That is a very well known fact.
    These are the kinds of factors one has to look very 
comprehensively at. Additional important factors of human 
activity would include land use changes. Those are very well 
known factors of which one has to keep a good record, or time 
history, to really understand the causes of the 
change.
    I don't think I should speculate about the future. It is 
always very dangerous to talk about the future of any 
climate.
    Senator Allard. Dr. Mann.
    Dr. Mann. Yes. Well, I certainly agree with your statement 
that one of the key issues is what we call the detection or the 
attribution of human influence on climate, not just how has 
climate changed over the past 100 years or past 1,000 years, 
but can we actually determine the causal agents of change.
    There has been a solid decade of research into precisely 
that question by, again, the mainstream climate research 
community in addressing the issue of the relative role of 
natural factors, as well as anthropogenic factors. That 
includes the role of the sun, the role of human land use 
changes, and the role of human greenhouse gas increases. The 
model estimates are typically consistent with what we have seen 
in the observations earlier.
    As far as the next 1,000 years, that is not a particular 
area of expertise of mine, but I am familiar with what the 
mainstream climate research community has to say about that. 
The latest model-based projections indicate a mean global 
temperature increase of anywhere between 0.6 and 2.2 degrees 
Centigrade. That is one degree to four degrees Fahrenheit 
relative to 1990 levels by the mid-21st century under most 
scenarios of future anthropogenic changes.
    While these estimates are uncertain, even the lower value 
would take us well beyond any previous levels of warmth seen 
over at least the past couple of millennia. The magnitude of 
warmth, but perhaps more importantly the unprecedented rate of 
warming, is cause for concern.
    Senator Allard. Dr. Legates.
    Dr. Legates. Yes. I agree, too, that attribution is one of 
our important concerns. As a climatologist, I am very much 
interested in trying to figure out what drives climate. We know 
that a variety of factors exist. These include solar forcing 
functions; these include carbon dioxide in the atmosphere; 
these include biases associated with observational methods; 
these also include such things as land use changes. For 
example, if we change the albedo, or reflected amount of solar 
radiation, that too will change the surface temperature.
    So it is really a difficult condition to try to balance all 
of these possible combinations and to try to take a very short 
instrumental record and discern to what extent that record is 
being driven by a variety of different combinations.
    My conclusion probably in this case to directly answer your 
question is that the temperature likely would rise slightly, 
again due to carbon dioxide, but it would be much more 
responsive to solar output. If the sun should quiet down, for 
example, I would expect we would go into a cooling period.
    Senator Allard. I guess the question that I would have, 
now, you know you have increased CO2. So how is the 
environment in the Earth going to respond to increased 
CO2? Have any of you talked to a botanist or 
anything to give you some idea of what happens when 
CO2 increases in the atmosphere? Plants take in 
CO2 and release oxygen. We inhale oxygen and exhale 
CO2. Will plants be more prosperous with more 
CO2? How does that impact the plant life? Can that 
then come back on the cycle and some century later mean more 
O2 and less CO2?
    So I am wondering if any of you have reviewed some of these 
cycles with botanists and see if they have any scientific data 
on how plants respond to CO2 when that is the sole 
factor. I am not sure I have ever seen a study. There is 
moisture and other things that affect plant growth, but just 
CO2 by itself. Have any of you seen any scientific 
studies in that regard?
    Dr. Soon. I have seen that. In fact, I have written a small 
paper that has a small section regarding that.
    Senator Allard. And what was their conclusion?
    Dr. Soon. The conclusion is that in general, of course, 
under enrichment of the CO2 in the free air, that 
yes, plant growth will be enhanced. For example, as indicated 
by your chart, the crop yield can increase by 30 percent or 
higher for a doubling of CO2, depending on the 
actual constraints in the field, like types of crops, how wet 
or how dry, etc. All of these examples are very well known and 
well verified in the field of botany.
    Senator Allard. My time has run out. Would the other two 
agree with what he said?
    Dr. Mann. Not quite.
    Senator Allard. What is your modification?
    Dr. Mann. In fact, a number of studies have been done, what 
are called ``FACE'' experiments. They are open canopy 
experiments in which CO2 is elevated in the forest 
and scientists examine the changes in the behavior of that 
forest. What scientists at Duke University are finding is that 
while there is a tendency for an uptake of CO2 by 
the plants in the near term, what happens is eventually those 
plants will die. They will rot. When that happens, this happens 
on generational time scales.
    Senator Allard. Just CO2 being the variable and 
not moisture and anything else?
    Dr. Mann. Just CO2. The CO2 will go 
back into the atmosphere because the plants that take it up----
    Senator Allard. Do they have an explanation of why the rot 
occurred?
    Dr. Mann. Well, just when things die, they will rot and 
they will give up their CO2 back to the atmosphere 
eventually.
    Senator Allard. Well, that really does not get to the point 
I was trying to make.
    Doctor.
    Dr. Legates. To follow on that, enhanced CO2 and 
dying plants would also provide the ability for more plants to 
grow in their place. In particular, one of the people 
on our study, Dr. Sherwood Idso, has done a lot of study 
with carbon dioxide enrichment, where you can control the 
amount of water and energy available to plants under 
lowered CO2 and higher CO2.
    Senator Allard. So your conclusion is that CO2 
increases plant growth?
    Dr. Legates. Yes.
    Senator Allard. OK.
    Thank you, Mr. Chairman.
    Senator Inhofe. Thank you, Senator Allard.
    Senator Carper, we were going to go by the early bird rule. 
Is it all right if Senator Thomas goes ahead of you here?
    Senator Carper. Sure.
    Senator Inhofe. Senator Thomas.
    Senator Thomas. Thank you. I am a little confused about 
where we even ask the questions. Obviously, there is a 
difference of view. We are expected to make some policy 
decisions based on what we ought to be doing with regard to 
these kinds of things, but yet there does not seem to be a 
basis for that kind of a decision. Where would you suggest we 
get the information that is the best information we could get 
to make policy decisions for the future? Would each of you like 
to comment shortly on that?
    Dr. Mann. Sure. I guess I would reiterate the comments that 
I made earlier, that in a National Academy of Sciences study 
that was commissioned by the Bush Administration in 2001, the 
National Academy of Sciences in essence stated their agreement 
with the major scientific findings of the Intergovernmental 
Panel on Climate Change, the IPCC, which is the United Nations 
panel of scientists, thousands of scientists from around the 
world who put together a report on the state of our knowledge 
about all of these things--climate change scenarios, our 
uncertainty about various attributes of the climate system. The 
conclusions that I stated earlier are the consensus conclusions 
of the IPCC.
    Senator Thomas. That is where you would go.
    Dr. Mann. That is where they have gone, yes.
    Dr. Legates. I would generally argue the IPCC is a bit of a 
political document to the extent to which it does present some 
biased science. There is a lot of good science in there, but a 
lot of the conclusions are sort of not borne out by the facts. 
Having been president of the Climate Specialty Group of the 
Association of American Geographers, which is probably the 
largest group of climatologists available, I know from talking 
to rank-and-file members that they generally--my impression is 
that most climatologists agree it takes a rather strong 
viewpoint.
    So I have real serious concerns about whether it really 
represents a consensus. In particular, when in this 
discussion we change dramatically what a lot of people 
have held true, that is, the Little Ice Age, the Medieval Warming 
and so forth, and replace it with a flat curve very quickly, I 
do not think we have given it enough time to really decide if 
in fact that is an appropriate change in paradigm.
    Dr. Soon. Although I am not able to comment on anything on 
public policies, I am certainly able to testify that the 
science is completely unsettled. There are just so many things 
that we do not know about how the climate really works and what 
factors cause it to change to justify jumping to the 
conclusion that it is all CO2.
    Senator Thomas. Thank you. That helps a lot.
    [Laughter.]
    Senator Inhofe. You still have some time remaining. Did you 
have an opportunity to see the chart up here from Dr. John 
Reilly of the MIT Joint Program on Science Policy and Global 
Change? 
On the floor yesterday, I talked at some length on this. There 
seems to be a lot of consensus that there are some very 
positive benefits.
    Senator Thomas. It is really interesting, you know, in 
Schlesinger's thing it indicates that the temperature after 
1940 dropped until 1977. So that makes you wonder what we ought 
to do. The rise in temperature during the 20th century occurred 
between 1900 and 1940. So now we are faced with making policy 
decisions where there is no real evidence that 
the greenhouse gases measured by the U.N. are the basis for 
doing these things.
    I know in science everyone has slightly different ideas, but 
I do think, Mr. Chairman, as you pointed out yesterday, that we 
are either going to have to take it a little more slowly in 
terms of policy, or find something more 
basic to base it on than we have now in order to make 
significant policy changes.
    Thank you.
    Senator Inhofe. Thank you, Senator Thomas.
    Senator Carper.
    Senator Carper. Thank you, Mr. Chairman. I want to welcome 
our witnesses this morning. Dr. Legates, it is great to have a 
fighting Blue Hen here from the University of Delaware. We are 
delighted that you are here. Dr. Mann, thanks for coming up, 
and Dr. Soon, welcome. We thank you for your time and your 
interest and your expertise on these issues, and your 
willingness to help us on some tough public policy issues that 
we face.
    Dr. Mann, I would start off if I could and direct a 
question to you. I understand we have had thermometers for less 
than 200 years, and yet we are trying to compare changes in 
temperature today, in this century and the last century, with 
those that occurred 500 or 1,000 or 2,000 years ago. I 
understand that we use proxies for thermometers, if you will, 
to gauge those kinds of changes in temperature.
    I wonder if you could help me and maybe the committee 
better understand how we compare today's temperature 
measurements to the proxies of the past. Are there potential 
risks with relying on some of those proxies?
    Dr. Mann. Absolutely. We have to use them carefully when we 
try to reconstruct the past temperature history. So when I say 
we have to use them carefully, it means some of the things that 
I discussed in my testimony earlier, that we need to actually 
verify that if we are using a proxy record to reconstruct past 
temperature patterns, that proxy record is indeed reflective of 
temperature changes. That is something that typically 
paleoclimate scientists first check to make sure that the data 
they are using are appropriate for the task at hand. Of course, 
we have done that in our work. I did not see evidence that Soon 
and colleagues have done that.
    Beyond that, we next have to synthesize the information. 
There have been some misleading statements made here earlier on 
the part of the other testifiers with regard to local versus 
regional or global climate changes. Of course, we have to 
assimilate the information from the local scale to the larger 
scales, just as we do with any global estimate of quantity. So 
we take the regional information; we piece together what the 
regional patterns of change have been, which may amount to 
warming in certain areas and cooling in other areas. Only when 
we have reconstructed the true global or hemispheric regional 
patterns of change can we actually estimate the northern 
hemisphere average, for example.
    A number of techniques have been developed in the climate 
research community for performing this kind of estimate. My 
colleagues and I have described various statistical approaches 
in the detailed climate literature. Some of the estimates are 
based on fairly sophisticated techniques. Some of them are 
based on fairly elementary techniques. Yet all of the results 
that have been published in the mainstream climate research 
community using different techniques and different assortments 
of proxy data have given, as I showed earlier in my graph, the 
same basic result within the uncertainties. That has not 
changed. An article that appeared last month in Eos, published 
by the American Geophysical Union, which is actually the largest 
professional association of climatologists, showed that indeed that is the 
consensus viewpoint of the climate research community.
    Senator Carper. Thank you.
    Dr. Legates, if I could ask a question of you, please. Have 
you or anyone of your colleagues, at the University of 
Delaware, to your knowledge studied the historical climate and 
temperature records in our part of the country, in Delaware, 
the Delmarva Peninsula, or the mid-Atlantic region?
    Dr. Legates. We do not have anybody on staff presently who 
does paleoclimatology. One of the basic understandings you 
must come to when you study climate is that you must 
understand various aspects of hydroclimatology and physical 
climatology, and that includes paleoclimate study. So you must 
be at least versed in these things even if you are not 
necessarily a paleoclimatologist.
    We do have Dr. Brian Hanson at the University of Delaware 
who has looked at glacier movements over long time periods, as 
well as Dr. Fritz Nelson who has looked at changes associated 
with permafrost locations.
    Senator Carper. If someone were to do a study for our part 
of the country, what do you think they might find?
    Dr. Legates. A study regarding?
    Senator Carper. Historical climate and temperature changes.
    Dr. Legates. Over the East Coast of the United States? Most 
of the assessments indicated that generally the East Coast has 
gone through a variety of changes over long time periods. 
Historically, in the 1960's, for example, we had conditions 
where there was much more snowfall. 
We have had a lot of variability associated with air 
temperature rising and falling under local conditions. 
Variability is usually the characteristic of climate over the 
near term as well.
    Senator Carper. OK. Dr. Soon, if I could ask you and maybe 
Dr. Legates the same question, the following question. That 
question is, do you believe that it is possible to emit 
unlimited amounts of CO2 into our atmosphere without 
having any impact on climate or temperature?
    Dr. Soon. I do not know how to precisely answer the 
question. If you fill up every single molecule of the air with 
CO2, that would be poisonous, of course. I do not 
know the answer to the question, but I would like to add something 
about the evidence available on climate change.
    Senator Carper. Before you do that, let me direct, if I 
could, the same question to Dr. Legates. I do appreciate your 
candor. It is not everyday that we find that here in this hall.
    Dr. Legates. Generally, what we have found is that as 
carbon dioxide has increased, the temperature has followed, 
where in some cases historically the temperature has gone up 
and the carbon dioxide has fallen. So generally from a purely 
physical point of view, if you do increase the carbon dioxide, 
you should wind up with some trapping of gases, and hence wind 
up with a slightly increased temperature.
    The question is, there are a lot of additional feedbacks 
associated with it. For example, warmer surface temperature 
leads to more instability, or rising air, which leads to more 
cloudiness. Clouds can warm at night, but also reflect energy 
in the daylight. So you have these odd feedbacks into the 
climate system which make it very difficult to say what will 
happen if I hold everything else constant and change one 
variable. Well, in reality, it is impossible to hold 
everything constant because it is a very intricate and 
interwoven system in which one change does have feedbacks 
across the entire spectrum.
    Senator Carper. Thanks. I think my time has expired, Mr. 
Chairman. Is that correct?
    Senator Inhofe. Yes. Thank you, Senator Carper.
    Senator Carper. Thank you.
    Senator Inhofe. We will have another round here. In fact, I 
will start off with another round. Let's start with Dr. 
Legates. Dr. Legates, was the temperature warmer 4,000 to 7,000 
years ago than it is today?
    Dr. Legates. My understanding is that about 4,000 to 7,000 
years ago, in a period referred to as the climatic optimum, 
which sort of led to enhanced agriculture and to the 
development of civilization, generally the idea is that warmer 
temperatures lead to more enhanced human activity; colder 
temperatures tend to inhibit it. Again, as we go back 4,000 to 
7,000 years ago, the error bars are getting wide as 
well. But the general consensus is that temperatures were a bit 
warmer during that time period.
    Senator Inhofe. OK. Senator Thomas had something about, he 
had alluded to 1940. Yesterday when I was giving my talk and 
doing the research for that, it was my understanding that the 
amount of CO2 emitted since the 1940's increased by 
about 80 percent. Yet that precipitated a cooling-off period 
from about 1940 to 1975. Is that correct?
    Dr. Legates. That is correct. It is sort of a perplexing 
issue in the time series record that from 1940 to 1970 
approximately, while carbon dioxide was in fact increasing, 
global temperatures appear to be decreasing.
    Senator Inhofe. Dr. Mann, you have I might say impugned the 
integrity of your colleagues and a few other people during your 
presentation today. The Wharton Econometric Forecasting 
Associates did a study as to the effect of regulating 
CO2 and what would happen. American consumers would 
face higher food, medical and housing costs; for food, an 
increase of 11 percent; medicine, an increase of 14 percent; 
and housing, an increase of 7 percent. At the same time, the 
average household of four would see its real income drop by 
$2,700 in 2010.
    Under Kyoto, the energy and electricity prices would nearly 
double and gasoline prices would go up an additional 65 cents a 
gallon. I guess I would ask at this point, what is your opinion 
of the Wharton study?
    Dr. Mann. OK. First, I would respectfully take issue with 
your statement that I have impugned the integrity of the other 
two testifiers here. I have questioned their, and I think 
rightfully, their qualifications to state the conclusions that 
they have stated. I provided some evidence of that.
    Senator Inhofe. Well, ``illegitimate, inexperienced, 
nonsense''----
    Dr. Mann. Those are words that I used. Correct.
    Senator Inhofe [continuing]. That is a matter of 
interpretation.
    Go ahead.
    Dr. Mann. I would furthermore point out that the very 
models that I have referred to track the actual instrumental 
warming and the slight cooling in the northern hemisphere. 
There was no cooling of the globe from 1940 to 1970, the 
northern hemisphere----
    Senator Inhofe. OK. The question I am asking you is about 
WEFA.
    Dr. Mann. I am not a specialist in public policy and I do 
not believe it would be useful for me to testify on that.
    Senator Inhofe. Dr. Legates, have you looked at the report 
that Wharton came out with concerning the possible effects, 
economic results of this?
    Dr. Legates. Again, I am not a public policy expert either, 
and so the economic impacts are not something which I would be 
qualified to testify on.
    Senator Inhofe. OK, Dr. Legates, do you think you have more 
data than Dr. Mann?
    Dr. Legates. I think we have looked at a large variety of 
time series. We have looked at essentially a large body of 
literature that existed both prior to Dr. Mann's analysis and 
since Dr. Mann's analysis, in attempting to figure out why his 
curve does not reflect the individual observations. One 
issue, when you put together data sets, is to make 
sure that the composite sort of resembles the individual 
components.
    Senator Inhofe. OK. The timeline, Dr. Mann, is something I 
have been concerned with, and those of us up here are listening 
to you and listening to all three of you and trying to analyze 
perhaps some of the data that you use and the conclusions you 
came to, dating back 4 or 5 years, compared to the 
Smithsonian-Harvard 1,000-year study that was just 
completed, or at least given to us in March 
of this year. I would like to have each of you look at the 
chart up here and just give us a response as to what you feel 
in terms of the data that both sides are using today.
    Dr. Mann. I guess you referred to me first?
    Senator Inhofe. That is fine. Yes.
    Dr. Mann. OK. Well, I think we have pretty much 
demonstrated that just about everything there is incorrect. In 
a peer-reviewed publication that was again published in the 
journal Eos of the American Geophysical Union about a month 
ago, an article cosigned by 12 of the leading United 
States and British climatologists and paleoclimatologists, we 
are already on record as pretty much pointing out that there is 
very little that is valid in any of the statements in that 
table. So I think I will just leave it at that.
    Senator Inhofe. Do the other two of you agree with that?
    Dr. Legates. If I may add, the Eos piece was actually not a 
refereed article. It is an Eos Forum piece, which by definition 
is an opinion piece by scientists for publication in Eos. That 
is what is contained on the AGU Web site for Eos Forum.
    Senator Inhofe. All right. Let me ask one last question 
here. Dr. James Hansen of NASA, considered the father of global 
warming theory, said that the Kyoto Protocol ``will have little 
effect on global temperatures in the 21st century.'' In a 
rather stunning followup, Hansen said it would take 30 Kyotos, 
let me repeat that, 30 Kyotos to reduce warming to an 
acceptable level. If one Kyoto devastates the American economy, 
as suggested by the findings of Wharton, what would 30 Kyotos do? 
Is Dr. Hansen one of the most respected scientists in your 
field or is he way off base?
    Dr. Mann. Dr. Hansen is certainly one of the most respected 
scientists in my field and I personally have great scientific 
respect for him. I think that his conclusions have been grossly 
taken out of context. His point is simply that Kyoto would, and 
this is his point, these are not my opinions, would do very 
little to ameliorate the warming over the next century for two 
reasons.
    No. 1, there is something that scientists call the 
commitment to warming. Once we put CO2 into the 
atmosphere, it takes many decades, on the order of decades to 
maybe centuries, for it to fully equilibrate with the ocean and 
the atmosphere. So some of that CO2 is taken up by 
the ocean. So the effect of it is delayed. So cutting back on 
CO2 now may not affect global temperatures for 50 
years, but 50 years later it is going to come back to roost.
    Senator Inhofe. All right, that was a rather long answer, 
so with the indulgence of my fellow Senators here, 
I just want to ask one last question. Yesterday I quoted Dr. 
Frederick Seitz, the past president of the National Academy of 
Sciences and professor emeritus at Rockefeller University, 
who compiled the Oregon petition, which says there is no 
convincing scientific evidence that human release of carbon 
dioxide, methane and other greenhouse gases is causing, or will 
in the foreseeable future cause catastrophic heating of the 
Earth's atmosphere and disruption of the Earth's climate.
    Moreover, there is substantial scientific evidence that 
increases in atmospheric carbon dioxide produce many beneficial 
effects upon the natural plant and animal environments of the 
Earth. Do each of the three of you agree or disagree with his 
statement?
    Dr. Soon. I agree.
    Dr. Mann. I find little in there to agree with.
    Dr. Legates. I would tend to agree.
    Senator Inhofe. All right.
    Senator Jeffords.
    Senator Jeffords. As you may know, this is to all of you, 
the editor-in-chief of the magazine Climate Research resigned 
the position yesterday over problems with Dr. Soon's paper. In 
an e-mail sent to my staff, he said,

          ``My view, which is shared by many, but not all editors and 
        review editors of Climate Research, is that the review of the 
        Soon et al paper failed to detect significant methodological 
        flaws in the paper. The critique published in the Eos journal 
        by Mann et al is valid. The paper should not have been 
        published in this forum, not because of the eventual 
        conclusion, but because of the insufficient evidence to draw 
        this conclusion.''

    What methodological flaws does he mean?
    Dr. Mann.
    Dr. Mann. Well, I have tried to outline the most severe of 
those methodological flaws. I believe it is the mainstream view 
of just about every scientist in my field that I have talked to 
that there is little that is valid in that paper. They got just 
about everything wrong. They did not select the proxies 
properly. They did not actually analyze any data. They did not 
produce a reconstruction. They did not produce uncertainties in 
a reconstruction. They did not compare to the proper baseline 
of the late-20th century in trying to make conclusions about 
modern warmth.
    So I think it is the collective view of our entire research 
community that that is one of the most flawed papers that has 
appeared in the putative peer-reviewed research in recent 
years.
    Senator Jeffords. Dr. Soon, do any scientists besides your 
coauthors support using wetness or dryness as indicators of 
past temperatures, instead of actual temperatures or proxy data 
that reflects temperatures?
    Dr. Soon. As we explain clearly in our paper, and as it has 
been highly mischaracterized by my fellow colleague here, Dr. 
Mann, we certainly agree that when we speak in terms of the Medieval 
Warm Period, temperature is one of the important parameters. As 
we emphasize and specify in our papers, climate is not 
temperature alone. One has to look in terms of the water cycle, 
in terms of even the air cycles, in terms of the vegetation 
changes. These are the kinds of details about which we did not make any 
presumptions; we simply wanted to look at the patterns of change 
geographically all over the world, see how complete the 
data are, and then begin to see how to assemble 
all such information.
    Senator Jeffords. This is for the whole panel. I would like 
to know whether the unusual melting of Greenland ice sheets 
shown in this picture over the years 2001, 2002 and 2003, has 
been matched in the long-term climate history at any other time. 
And according to NASA, by the end of the 2002 season, the 
total area of surface melt on the Greenland ice sheet had 
broken all known records. By the end of that summer ``Sea ice 
levels in the Arctic were the lowest in decades and possibly 
the lowest in several centuries.''
    NASA says this warming is happening faster and earlier than 
in previous periods. What is happening now and what is going to 
happen if this continues?
    Dr. Mann.
    Dr. Mann. Well, this is, of course, one particular region, 
one potentially isolated region, Greenland, in which there is 
evidence of mass ablation of ice. But if we look at what is 
going on the world over, mountain glaciers in the tropics 
throughout the world, glaciers in both the northern hemisphere 
and the southern hemisphere, what is seen is that glacial 
retreat during the late 20th century is unprecedented on 
similar time scales to the time scales I have spoken of before, 
the past 1,000 to 2,000 years.
    I believe Professor Lonnie Thompson of Ohio State 
University has testified in this Senate before with regard to 
the dramatic evidence of worldwide glacier retreat. So that is 
a cause for concern. It is a harbinger of the warming because 
in fact the warming that is shown in those glacier retreats is 
actually warming that we are already committed to for decades 
to come.
    Dr. Legates. Historically, it has been demonstrated in the 
refereed literature that much of this glacial retreat actually 
began in the late 1800's, before much of the carbon dioxide 
came into the atmosphere. This is very much consistent with the 
demise of the Little Ice Age and longer time-scale variations. 
Therefore, it is very difficult to say that these kind of 
events are directly attributable to human impacts on the 
climate, when they in fact pre-date human impacts on the 
climate.
    Senator Jeffords. Dr. Soon.
    Dr. Soon. My only comment regarding that kind of chart, or 
the claim that it has never happened before, is that one should 
think about the available detailed observations that we have. We do 
not really have any satellite record longer than 20 to 30 
years, so the statement that it has never happened before I 
think is dangerously inaccurate.
    Senator Jeffords. Dr. Mann.
    Dr. Mann. Yes. It is unfortunate to hear comments about the 
supposed inconsistencies of the satellite record voiced here, 
years after that has pretty much been debunked in the peer-
reviewed literature, in Nature and Science. Both journals have 
in recent years published several rigorously peer-reviewed 
articles indicating that in fact the original statement that 
the satellite record showed cooling was flawed because the 
original author, John Christy, did not take into account a 
drift in the orbit of that satellite, which actually leads to a 
bias in the temperatures from the satellite.
    Christy and colleagues have claimed to have gone back and 
fixed that problem, but just about every scientist who has 
looked at it says that their fix is not correct. If you fix it 
correctly, then the satellite record actually agrees with the 
surface record, indicating fairly dramatic rates of warming in 
the past two decades.
    Senator Jeffords. I have one last question, Dr. Mann. What 
are the implications of your peer-reviewed work for future 
manmade warming?
    Dr. Mann. As I said before, there have been a number of 
modeling simulations that have shown a fairly good match to our 
reconstruction and that of several independent research groups 
who have also produced these reconstructions of northern 
hemisphere temperature. So to the extent that the models match 
that record of the past 1,000 years when they are forced with 
various estimates of natural changes in the system, it gives us 
reason to trust what the models say about the future. As I 
testified before, the models tell us that we are likely to see 
a one degree to four degree Fahrenheit warming by the mid-21st 
century, given most predicted scenarios of continued 
anthropogenic influence on the climate.
    Dr. Legates. If I may add something, one of the things I 
have heard is that science has been debunked and, for example, 
we pointed to Dr. Christy's curve up here and said that because 
one paper has been written, that curve is now called into 
question. We have talked about--you mentioned von Storch's 
resignation from Climate Research because apparently he has 
admitted that this paper never should have been published.
    I want to point out that science debate goes on and on. In 
particular, Dr. Christy has had some very important 
contributions to indicate that his curve is not incorrect. That 
is part of scientific debate. Furthermore, I will say with 
respect to Climate Research, Otto Kinne, who is director of 
Inter-Research, the parent organization of Climate Research, 
asked Chris de Freitas, who was the editor who served on the 
Soon and Baliunas papers, and I can relay this because I am a 
review editor of Climate Research so I am familiar with what 
has been taking place.
    There were several people complaining that Chris de Freitas 
should be removed simply because he published the Soon and 
Baliunas paper. That question was brought to Otto Kinne. He 
asked for Chris de Freitas to provide him with the reviews, the 
changed manuscripts and so forth. He provided a letter in late 
June to all of us in which he said,

          ``I have reviewed the evidence and I have indicated that the 
        reviews, four for each manuscript, in fact there was a second 
        or an earlier Soon and Baliunas article on another topic that 
        was also called into question by these people leveling 
        charges.''

    Essentially what he concluded was that the reviewers 
provided good and appropriate comments; that Doctors Soon and 
Baliunas provided an appropriate addressing or incorporation of 
these concerns; and that Chris de Freitas had in fact provided 
analysis appropriately.
    Toward that end, Dr. von Storch was approached. Climate 
Research was putting out an editorial stating essentially that this 
article should never have been published. Otto Kinne was 
informed, and he asked him not to submit it because it is 
not founded, and as a result Dr. von Storch, I now understand, 
has said he would resign.
    Senator Jeffords. Dr. Mann.
    Dr. Mann. Yes, just a very short comment. It is 
unprecedented in my career as a scientist to hear of a 
publisher of a journal going in and telling the editor-in-chief 
that he cannot publish an editorial. I find that shocking and a 
bit distressing. I do not know what the circumstances are 
behind it, but it is disturbing.
    Dr. Legates. It is also unprecedented to find an editor 
being attacked, and this has also happened with the editorial 
staff of Energy and Environment, which is the other paper, to 
find an editor attacked for simply publishing an article that 
has been peer-reviewed and approved by reviewers.
    Senator Inhofe. All right. The time has expired. We are 4 
minutes over.
    Senator Jeffords. I think that my witness should have the 
last word on my question, if I could. Dr. Mann, do you have any 
response to that?
    Dr. Mann. Actually, my understanding is that Chris de 
Freitas, the individual in question, frequently publishes op/ed 
pieces in newspapers in New Zealand attacking IPCC and 
attacking Kyoto and attacking the work of mainstream 
climatologists in this area. So this is a fairly unusual editor 
that we are talking about.
    Senator Inhofe. All right, thank you.
    Senator Clinton has joined us. Senator Clinton would you 
like to have your round now?
    Senator Clinton. Thank you very much, Mr. Chairman. I thank 
you for this hearing. I understand that the questioning and the 
testimony has been somewhat lively, if not controversial and 
contested. The bottom line for me is whether we are doing what 
we need to do to ensure the best possible climatology outcome 
for future generations. I would stipulate that the Earth's 
climate has changed through the millennia. There is no doubt 
about that. I have read enough to know that we have had ice 
ages and we have had floods and we have had volcanoes. We have 
had lots of naturally occurring events which have affected our 
climate. We have El Nino and his spouse, La Nina. We have all 
of that. That is not debatable.
    The issue is whether the introduction and acceleration of 
anthropogenic activity primarily related to the burning of 
fossil fuels is putting into place conditions that will make it 
difficult, if not impossible for the Earth to regain its 
balance, that will support the conditions of life that we have 
inherited and are blessed with.
    I know these debates have political implications because 
heaven forbid that we would tell somebody in the private sector 
not to do something, or that we might have to make sacrifices 
in the quality of our life for future generations. I think that 
it is not useful to carry out this kind of argumentation when 
it is clear that by the very nature of human development and 
industrialization, we have changed what is in the atmosphere, 
what is in the earth, what is in the waters.
    That does not mean there was no change before we came 
along, and certainly in the last century that change has 
accelerated because the quality of life has improved, we have 
created chemicals that were never known in nature before. We 
have done a lot of things.
    But I think that our goal should be to try to figure out 
how to do no harm or do the least amount of harm, and to ask 
ourselves, what are we willing to perhaps sacrifice to make 
sure that we are not contributing to irreversible changes. I 
know that academia is probably the most political environment 
in America. I was once on a staff of a law school. It was more 
difficult than any politics I had ever been involved in 
beforehand. I know that people have very strong opinions and 
hold on to them.
    From my perspective, I just want to believe that I am 
making a contribution to ensuring that the quality of life for 
future generations is not demonstrably diminished. I would feel 
terrible if I participated, either as a willing actor or a 
bystander, in this potential undermining of our Earth's 
sustainability.
    So Dr. Mann let me ask you, what was the Earth's climate 
like the last time that there was atmospheric concentration of 
carbon dioxide at today's levels of 370 parts per million?
    Dr. Mann. Thank you, Senator, that is an excellent 
question. We have to go back fairly far into the past to find 
CO2 levels approaching the CO2 levels 
today. Ice core studies that have been done over the past 
decade or so have told us that today's CO2 level is 
unprecedented now in at least four glacial or inter-glacial 
cycles. That is more than 400,000 years.
    In fact, now as we look back from other evidence that is a 
bit more tentative, it appears that modern CO2 
levels probably have not been observed in 10 million to 20 
million years. So we have to go back to the time of the 
dinosaurs, probably, to find CO2 levels that we know 
were significantly higher than CO2 levels today.
    Some people will say, ``Well look that was a great time.'' 
The dinosaurs were roaming near the poles. It was warm near the 
north pole. There were palm trees near the poles. Isn't that what 
we want? Well, that was a change that occurred on time-scales 
of tens of millions of years. What we are observing right now 
is a similar change that is occurring on time-scales of 
decades.
    Senator Clinton. Thank you. Thank you, Dr. Mann.
    Senator Inhofe. Senator Clinton, if you would like to have 
some more time, since we are on the second round now, feel free 
to take another couple of minutes.
    Senator Clinton. Thank you very much, Mr. Chairman.
    I guess that is, for me, the dilemma, because I certainly 
understand the testimony of the other two witnesses, and I read 
with great interest former Secretary Schlesinger's op/ed. I 
know that there are those, who are in a minority, let's at 
least admit that, who are in a minority, but who certainly have 
a very strongly held set of beliefs, and I respect that.
    But I do believe that the compression of time in which 
these changes are occurring is extraordinarily significant. We 
can go back and look at the Earth's natural 125,000-year cycle, 
but I do not think we want to risk the enormous changes that 
could occur. I do not think we have a million or 10 million 
years or even 100,000 to experiment.
    I think that the challenge confronting us is not to put our 
heads in the sand and let the academic argument take place, but 
figure out how in a sensible, prudent manner we could 
ameliorate these changes significantly enough so that if Dr. 
Soon and Dr. Legates are right, no harm done. If Dr. Mann is 
right, we will have saved ourselves a lot of potential damage 
and difficulty.
    So I hope that we could put our heads together. I commend 
my two colleagues, both Senator Jeffords and Senator Carper, 
who have very sensible legislative answers to trying to get a 
handle on this. As I have said in this committee before, I 
stand ready to figure out ways to hold harmless our industrial 
base and others. I think it is a significant enough political, 
economic and moral challenge that if there are ways to make it 
financially possible for companies to do what needs to be done 
with respect to carbon dioxide and other atmospheric pollutants 
that have accelerated their presence in our atmosphere so 
dramatically in the last 100 years, I think we should do that.
    This is not just a private sector problem. We all have 
benefited from the increasing use of fossil fuels, for example. 
Our standard of living is dramatically better. One of our 
problems is what is going to happen if China and India get a 
standard of living anywhere comparable to ours, and then begin 
to really--and I see Dr. Soon nodding--I mean really dump into 
the atmosphere untold amounts of new pollutants of whatever 
kind, leading certainly with carbon dioxide.
    So this is a problem we need to get ahead of, and it is not 
a problem that the United States alone should be responsible 
for. It is not a problem that the private sector alone should 
be responsible for. But I believe, just as a prior generation 
of decisionmakers really put a lot of work into the law of the 
oceans and trying to figure out how we could protect our 
oceans, we need to do the same on the atmospheric level. There 
has got to be a way that we can come together on this big 
challenge.
    So Mr. Chairman, I appreciate your continuing attention to 
this. I, for one, stand ready to work with you and our other 
colleagues because I just think this is too risky a proposition 
not to act on, given the weight of opinion, even with the 
dissenters, who I think do rightly point out the incredible 
natural cycle, but we are now so influencing that natural 
cycle, I do not know if we have the time to contemplate the 
balance once again regaining itself in our wonderfully 
regenerating Earth.
    Senator Inhofe. Thank you, Senator Clinton.
    Senator Carper.
    Senator Carper. Thanks, Mr. Chairman. I just want to 
followup. Senator Clinton was kind in her comments on the 
legislation, the one that Senator Jeffords has introduced and 
second on legislation I have introduced along with Senators 
Judd Gregg, Lincoln Chafee and Lamar Alexander.
    Are any of you familiar with that legislation? Would you 
like to become familiar over the next 5 minutes?
    [Laughter.]
    Dr. Soon. No, we will stick to science. Politics is too 
complicated.
    Senator Carper. All right. That may be the best approach.
    We are trying to figure out if there is a reasonable middle 
ground on this issue. I am part of a group that Buddy MacKay, a 
former colleague of mine from Florida, calls the flaming 
moderates or flaming centrists. We can spend a whole lot of 
time discussing the impact of Kyoto caps, or we can focus on 
what steps we actually need to take.
    The approach that Senators Gregg and Chafee and Alexander 
and myself have taken covers four pollutants, unlike the 
President's proposal, which only addresses sulfur dioxide and 
nitrogen oxide and mercury and does not address 
CO2, as you know, because he 
thinks we need to study it a bit more. Our approach says that 
there ought to be caps on CO2; that they should be 
phased in; that we should use a cap and trade system; we should 
give utilities the opportunity to buy credit for levels of 
CO2 emissions that they maintain at high levels; and 
they should be able to contract with, among others, farmers and 
those who would be forced out of lands to change their planting 
patterns or change their animal feedlot operations in order to 
be able to sequester some of the CO2 that occurs on 
our planet.
    We have something called new source review. The President 
would eliminate it entirely. I think in Senator Jeffords' 
approach, it is pretty much left alone. There is a good 
argument that says that under current law, if utilities 
make some kind of minor adjustment and minor investment in 
their plant, they have to make a huge investment with 
respect to the environmental controls. As a result, it keeps 
them from making even common sense kinds of investments in 
their plants--sort of the laws of unintended consequences. That 
is sort of the approach that we have taken.
    Now that you know all about it, if you were in our shoes, 
what kind of an approach would you take? Let me just start with 
our University of Delaware colleague here, Dr. Legates.
    Dr. Legates. Generally, I favor no-regrets policies, which 
have other applications as well. But again, getting into 
the politics and the non-science aspects of what to do is out 
of my area of expertise. I may have my own beliefs, but they 
are no more important or less important than the average 
person. I would rather not testify to those here.
    Senator Carper. If you were convinced--and some of my 
colleagues have heard me talk about Dr. Thompson before; I 
don't know whether they have testified before this committee--but 
Doctors Knoll and Thompson spend their lives going around the 
world and they chart the disappearance of snow caps on some of 
the tallest mountains. I first met them here in Delaware about 
5 or 6 years ago when they came to receive an award for their 
research.
    But they tell us that the snow caps around some of the 
tallest mountains in the world, the Himalayas and others, are 
not just disappearing, they will be gone, and they will be gone 
in our lifetime. When I heard them speak and talk about their 
work and what they were charting and finding, it got my 
attention. When you hear that, Dr. Legates and Dr. Soon, how 
does it affect you?
    Dr. Soon. As a scientist, I am still questioning the actual 
evidence. The fact is that melting may be recorded for certain 
glaciers. But among the things that we know is that there are 
about 160,000 glaciers around the Earth, yet only 40 to 50 
glaciers have been measured for 10 years or longer to tell us 
how much the ice has accumulated or has ablated.
    Some of the specific melting examples, like Kilimanjaro, 
that Dr. Lonnie Thompson has looked at, or some places in Peru 
may be true. But the quality of the data records is really 
telling us that we do not have enough strong evidence to 
suggest that all the ice will disappear quickly and completely, 
or that all of it is unprecedented. Climate change is part of 
nature. As I tried to emphasize in my research by looking 
carefully into all the climate proxies, there are large local 
swings in the climatic changes.
    Senator Carper. Dr. Soon, what would it take to convince 
you that this is a problem we need to deal with?
    Dr. Soon. As to some of the glaciers disappearing now in 
some parts of the mountains, I do not consider that to be 
either a problem or strong evidence----
    Senator Carper. No, no, the big issue. What would it take 
with respect to the concerns about global warming fed by 
CO2 accumulation, what would it take to convince you 
that this is a problem we need to do something about?
    Dr. Soon. OK. Scientifically, I would go by this very 
simple test. The simple test is that the warming should be 
occurring first in the troposphere, the layer of air about four 
kilometers above us. That is the key part of the atmosphere 
where one would expect the CO2 greenhouse effect to 
appear and then work its way downward toward the surface. I 
would also urge very seriously that in all these debates about 
the science we do not lose sight of the need to sustain a 
certain level of observational effort, to keep track of the 
data so that while we are arguing about what to do, we have 
some record of any change that may occur.
    So what it would take is that the CO2 warming 
should happen first in that layer of air four kilometers up. I 
would require that it be strongly sustained for maybe 20 years 
or so. Then I would really believe that we have clear 
CO2 fingerprints somewhere.
    Senator Carper. Mr. Chairman, I know my time has expired. 
Could I just ask that same question of Dr. Legates? What would 
it take to convince you?
    Dr. Legates. Proof. Generally the problem we have seen in 
the record is that there is an awful lot of variability and 
there are things where changes occur, for example, between 1940 
and 1970 where the temperature decreased, even though carbon 
dioxide was increasing. That sort of indicates to me that 
carbon dioxide may not be the biggest player in the game. Solar 
variability is likely to be the bigger player, changes in solar 
output. After all, if the sun goes out, our temperature drops 
considerably. We know historically that as the sun fluctuates 
in terms of its output, the climate does respond.
    So there are a lot of other factors involved, and I am not 
entirely convinced, based upon the proof, that carbon dioxide 
is a driving force. It is a contributory force to a small 
degree, but not a driving one, and we could potentially wind up 
making policies that amount to trying to hold back the ocean, 
if you will. You cannot stop the waves from coming in.
    Senator Carper. Dr. Mann.
    Dr. Mann. Two quick points. First of all, it grates on me 
to hear this argument about cooling from 1940 to 1970 
continually cited here as evidence against anthropogenic 
climate change. That cooling was itself almost certainly 
anthropogenic, and there has been a decade of research 
demonstrating that it was due to anthropogenic sulphate 
aerosols, which have a cooling effect on the climate. What is 
happening now is that the much greater 
effect of increasing greenhouse gas concentrations is 
overtaking that small cooling effect of sulphate aerosols, also 
an anthropogenic influence, but not the one that is going to 
take us to doubled levels of CO2 in the next 
century.
    One quick other comment, if I could. Lonnie Thompson's 
work, which is some of the best work in our field, it is not 
like he has been looking for ice cores that are melting. He is 
actually looking for ice cores that are not melting because he 
wants to get long records. So if there is any belief that there 
might be some bias in the glaciers that he has gone to, if 
anything it is the opposite. He is looking for long records, so 
that makes it that much more impressive that they are all 
melting.
    Senator Carper. Thank you.
    Senator Inhofe. Senator Allard.
    Senator Allard. Thank you, Mr. Chairman.
    In which agency do you think we probably have the most 
expertise as far as climate change and what is happening with 
the global climate? Would that be the National Oceanic and 
Atmospheric Administration; would that probably be where we 
have most of our experts? If not, in which agency do you think 
we have most of our experts as far as the government is 
concerned? I would like to know whether any member of the panel 
concurs or not.
    Dr. Mann. Well, I think that the different agencies 
specialize in different areas of the climate change research 
question, if you will. NOAA's specialty is in looking at 
climate variability, particularly with regard to oceanic 
variability. So they emphasize that area of the research. A lot 
of the peer-reviewed research, for example Lonnie Thompson's 
work that we just spoke of, is funded by the National Science 
Foundation in large part. There are other organizations.
    Senator Allard. The Foundation, is that an agency of the 
Federal Government?
    Dr. Mann. Well, not directly.
    Senator Allard. The question is, what is an agency of the 
Federal Government? The only one that I could think of was 
NOAA, but are there other agencies?
    Dr. Legates. NASA does a lot of research, satellite-related 
efforts trying to estimate climate trends, incorporating 
satellite measurements as well.
    Dr. Mann. As well as the Department of Energy and EPA.
    Senator Allard. Yes, the Department of Energy.
    Dr. Legates. The Department of Interior as well.
    Senator Allard. OK. So each agency has its own area of 
interest, but it seems to me that we need to look at global 
warming from a total perspective, and I am trying to figure out 
if there is an agency that does that. I have talked to people 
within NOAA. There are arguments going on within that agency on 
the very topic that we are talking about here. There is 
absolutely no consensus within the agency, and I am trying to 
figure out if there is an agency out there that is taking an 
overall view. I guess really there is not. We are just going to 
have to rely on the science community somehow or other pulling 
all these views together from these various agencies. They look 
at the atmosphere; like you say, NASA looks at the stratosphere 
and higher up where your satellites are.
    Dr. Legates. On the surface, too.
    Senator Allard. We need somebody who looks at the effect on 
plant life, animal life, the total cycle--oxygen, 
CO2 and all that--before we reach conclusions. I am 
just wondering who pulls all this together so that we can come 
up with a total picture of what is happening as far as changes 
to this Earth are concerned, because it is more than just one 
science.
    Dr. Mann.
    Dr. Mann. There is a program, the U.S. Global Change 
Research Program, which seeks to coordinate the various 
agencies on issues of fundamental importance in the research of 
climate variability and climate change. So I think that is 
their role.
    Senator Allard. OK. I want to get back a little bit to the 
absorption of sunlight, for example, on the Earth's surface. It 
seems to me, and I don't know how accurate this is. I want to 
check this out because it has been suggested to me by a number 
of people, that our absorptive surface on the Earth has 
increased. We still have the same amount of surface, but for 
example you have pavement in urban areas. We know that pavement 
is absorptive. Has that had an impact on global warming?
    Dr. Mann. Most definitely.
    Senator Allard. In your view?
    Dr. Mann. Yes, your statement is correct. The main increase 
in the absorption by the Earth's surface is due to the melting 
of snow and ice. That has certainly had a very large influence 
on the warming, but it is part of the warming.
    Senator Allard. So you do not think the construction of--we 
have more pavement than we did two centuries ago or a century 
ago.
    Dr. Mann. Most models suggest that that is a cooling.
    Senator Allard. Is there enough of that? We have more 
fields, probably because of agriculture throughout the world, 
not just the United States. This is all over the world.
    Dr. Mann. Yes. Most estimates suggest that there is a small 
cooling of the Earth's surface due to those changes.
    Senator Allard. Would you all agree to that?
    Dr. Legates. The pavements are associated with the 
urbanization effect, which is part of the problem associated 
with where we have observational measurements. Generally in 
cities you have a decrease in the latent heat exchange, that 
is, the evaporation of water, because we have removed trees; 
you have darker surfaces; and you have canyon-like effects. All 
of these lead to warmer temperatures in the city. The urban 
heat island effect is well-documented, and that is where 
virtually all of our observations are located.
    But there are also changes in land surface effects by the 
fact that we are removing vegetation and replacing it with 
grasslands--for example, deforestation and de-vegetation. A lot 
of these are on very large scales, too, and they do change the 
color and character of the Earth's surface and hence its 
absorptive characteristics.
    A lot of the cryosphere, a lot of the ice and snow is 
temporally variable. We have a growing area and decreasing 
area, so that does integrate itself out over time to some 
extent.
    Senator Allard. Does the absorptive surface of the Earth's 
surface have an impact on whether we have a warmer temperature 
or not today?
    Dr. Legates. Yes, absolutely.
    Dr. Soon. Oh certainly, yes.
    Senator Allard. I am a little bit confused of what the 
final view is. Do we increase temperature or do we cool the 
temperature?
    Dr. Mann. Can I comment?
    Senator Allard. Yes. You said that it cooled.
    Dr. Mann. Yes, the effects that----
    Senator Allard. OK, now, I would like to hear from----
    Dr. Mann [continuing]. That is not the whole story. What he 
said is correct, but the effect that is dominant in models in 
about three or four different studies published in the past 2 
years on precisely this question is actually the change in 
absorption by the land surface due to deforestation and other 
agricultural changes. That leads to an overall cooling of the 
globe, even in the face of other possible effects of warming.
    Senator Allard. Would you agree with that?
    Dr. Legates. Not necessarily. In particular, you are 
changing a characteristic, but you are also changing the other 
interactions. You are changing the vegetation and you are 
changing the evaporative characteristics.
    Senator Allard. But your bottom line is that you think 
that, with increased absorptive rate on the Earth's surface, it 
has a cooling or a warming effect?
    Dr. Legates. If you increase the absorption rate on the 
Earth's surface, you will have to have a net warming effect.
    Dr. Soon. You have to have a warming.
    Senator Allard. You have a warming.
    I mean, to me this is a fairly fundamental concept, and 
here we are, we have disagreement at this table about that.
    Dr. Soon. I don't think Dr. Mann is listening to your 
question.
    Senator Allard. To me, from my practical experience, it 
seems to me that there is a warming effect. When I walk out on 
a pavement with my bare feet, they get burnt. If I walk on 
grass, my feet feel a lot cooler. I just look at it from a 
practical aspect. So Dr. Mann, would you explain to me why 
there is a difference in what you say and what I am feeling 
physically when I walk on the surface of the Earth?
    Dr. Mann. Sure. When you are walking, you are only covering 
a pretty small fraction of the surface area of the Earth. The 
effect that you are talking about, for example, the urban heat 
island effect of blacktop and its tendency to absorb heat, that 
is overwhelmed by larger-scale changes that we do not 
necessarily see because they are not where we are walking 
around. Large areas of the surface area of the Earth are being 
changed in terms of their vegetation characteristics. That has 
a net cooling. The answer on that is clear in the peer-reviewed 
research.
    Senator Allard. The reason I bring this up is that in the 
State of Colorado we have a lot of variation. We go from 3,000 
to over 14,000 feet and we have a lot of different ecological 
systems in Colorado, depending on altitude and moisture and 
everything.
    We have a weather reporting station in a rural area, in the 
plains of Colorado, and the data that I am getting from them, 
there is no indication of change as far as temperature is 
concerned. Yet as we move into the more urban areas, then we 
get weather stations that are indicating a higher temperature. 
So I am wondering, worldwide, with the urbanization of the 
world, is there a possibility that we could be dealing with 
some temperature changes that are a result of the absorptive 
surface of the Earth? You mentioned urbanization; we have a lot 
more of it than we used to have. Doesn't this have an impact on 
temperature?
    Dr. Legates. Yes, definitely. Essentially, I do not think 
Dr. Mann answered the question appropriately in that your basic 
question was, if we absorb more radiation at the surface, will 
the temperature not go up? That is correct. The temperature 
will go up. In a sense, that is physics.
    Senator Allard. Would you agree with that, Dr. Mann?
    Dr. Mann. No. He has gotten about three different things 
wrong here.
    Senator Allard. No, listen.
    Dr. Mann. His first statement is wrong.
    Senator Allard. I understand your statement. You are taking 
a broader atmospheric picture. You are taking a total picture. 
But the statement he made at this point, would you agree with 
that?
    Dr. Mann. No. It is not correct.
    Senator Allard. You would not agree?
    Dr. Mann. The statement that he made was that there is an 
urban heat bias in the estimate of the surface temperature 
changes of the Earth.
    Senator Allard. I did not hear him say that.
    Dr. Mann. He said that earlier when he talked about urban 
heat bias.
    Senator Allard. I am talking about the comment that he just 
made. Would you repeat the comment, Dr. Legates?
    Dr. Legates. I essentially said the basic physics is that 
if you make the Earth's surface darker, you will absorb more 
energy, you will reflect less energy, as a result the surface 
temperature should increase.
    Senator Allard. Would you agree with that scientific fact?
    Dr. Mann. That statement would be in the first chapter of 
most textbooks. Yes.
    Senator Allard. Dr. Soon, I did not mean to ignore you. You 
wanted to say something?
    Dr. Soon. I tried to just emphasize that that is all you 
are asking.
    Senator Allard. Yes.
    Dr. Soon. If you increase absorptivity of the surfaces by 
changing it through any means, then more heat will be retained.
    Senator Allard. I think part of the problem that we are 
running into here on the testimony is that we are not talking 
on the same terms. I think that we have to be very careful when 
we review the record and when we are listening to the witnesses 
here, Mr. Chairman, that we understand that we are all talking 
on the same terms in making the same point. I think the 
committee gets confused when we start talking from different 
terms and different perspectives.
    I am just trying to simplify this argument down. I guess 
what I am coming to is that, as I have stated earlier, it is 
easy for me to believe that there is a trend in warming. The 
bottom line is what is causing it and what is going to be the 
long-term effects with this.
    To me, the science is not entirely clear on that, and I do 
not see that that is being entirely clear on this panel because 
when I asked that question earlier, nobody gave me a specific 
on what they saw the effects were going to be. Maybe Dr. Mann 
did, and said that there was going to be warming. But most 
scientists when I talk to them just won't give me what they 
think the Earth is going to look like 1,000 years from now, or 
they will not necessarily step right out and say what are the 
causes of it because there are an awful lot of variables. I am 
not sure that scientists understand all those variables.
    Dr. Legates. I think that is the issue. It is so uncertain, 
and there are so many things that go into the mix, that it is 
very difficult to say fairly definitively what the future will 
be.
    Dr. Soon. We have to keep emphasizing that CO2 
is not the only player, the only factor. It is highly short-
sighted to look at CO2 as the sole cause of every 
change that we see or any variation that we manage to record.
    Senator Allard. Yes. And when we talk about greenhouse 
gases, I think there is a tendency for us to think just in 
terms of CO2.
    Dr. Soon. Right.
    Senator Allard. But isn't water vapor one? Water vapor is a 
big part of the greenhouse gases.
    Dr. Soon. That would be the area of expertise of Professor 
David Legates. He has studied that for almost 20 years.
    Senator Allard. I do not know that we understand all of the 
aspects of each one of those taken separately, if we were to 
pull out CO2 or pull out water vapor. What other 
gases do we have out there? Those are the main ones.
    Dr. Mann. The other two have commented. May I comment as 
well?
    Senator Allard. Let me finish my point. What are the 
greenhouse gases that we have?
    Dr. Mann. I will speak to that.
    Dr. Soon. Methane.
    Senator Allard. Oh, methane. OK. We have methane. But the 
main ones are water vapor and CO2. Water vapor being 
the largest, right?
    Dr. Soon. Yes.
    Dr. Mann. Can I comment on that?
    Senator Allard. Dr. Mann.
    Dr. Mann. Yes. There are trace gases like methane, carbon 
dioxide, chlorofluorocarbons, which we can actually control.
    Senator Allard. Well, carbon dioxide is a very small part 
of greenhouse gases? Is that what you are saying?
    Dr. Mann. No. There are several different greenhouse gases 
that we have to keep in mind, and it would be short-sighted to 
only talk about carbon dioxide. That is absolutely true.
    Senator Allard. Right.
    Dr. Mann. It is extremely misleading, however, when 
scientists cite the role of water vapor as a greenhouse gas. 
The concentration of water vapor in the atmosphere cannot be 
controlled by us directly, unlike the other trace gases. It is 
fixed by the surface temperature of the Earth itself. This is 
actually another chapter one textbook-type of result that we 
know to be true in the scientific community.
    So we cannot change that freely. We can only change the 
other trace gases. When we do change those, we warm the Earth. 
We evaporate more water vapor and that gives us what we call a 
positive feedback that actually exaggerates the problem. But 
the water vapor itself cannot be the source of the problem.
    Dr. Soon. It is really also scientifically inaccurate to 
say that we can really control CO2. The global 
carbon cycle--we do not understand it well enough to really 
match or account for the CO2 that we emitted. How 
much of it is really going into the ocean? How much of it has 
really gone into the forest? We do not have actually a full 
control of those parameters, as Dr. Mann would like to state on 
the record.
    Senator Allard. Dr. Legates, do you have any comment?
    Dr. Legates. Generally, the idea is that water vapor is the 
most important greenhouse gas. Period. That is Chapter One of 
any introductory text. The issue, then, is that if we are 
focusing on the effects of carbon dioxide and methane, which by 
the way has actually started to decrease over time, what we 
have found is that in particular we are dealing with small 
factors while the bigger issues are not controllable.
    Again, the sun is the biggest game in town and it is not 
controllable. At least I do not know that we can turn off the 
sun or control its output.
    Senator Allard. OK. Senator Carper I think has a few 
questions.
    Senator Inhofe. We have a serious problem here now, I am 
sorry to say, and that is that we are 30 minutes past our first 
panel and we are going to have to cut it off right now.
    Senator Allard. OK, Mr. Chairman.
    Senator Inhofe. I am very, very sorry. Thank you very much. 
I appreciate the fact that you are here.
    We would call our next panel up. I apologize to the next 
panel because of the length of the first panel, we will have to 
cut this one short.
    Dr. Leonard Levin is the program manager, Electric Power 
Research Institute; Dr. Gary Myers, professor of neurology and 
pediatrics, University of Rochester Medical Center; and Dr. 
Deborah Rice, toxicologist, Maine Department of Environmental 
Protection, Bureau of Remediation and Waste Management.
    I would like to ask each of you to confine your opening 
comments to 5 minutes, if you would. Your entire statement will 
be made a part of the record. We would start, Dr. Levin, with 
you.

  STATEMENT OF LEONARD LEVIN, PROGRAM MANAGER, ELECTRIC POWER 
                       RESEARCH INSTITUTE

    Dr. Levin. Thank you, Mr. Chairman, members of the 
committee.
    I am Dr. Leonard Levin. I have come to discuss recent 
findings on mercury in the human environment. I serve as 
technical leader at EPRI, which is a nonprofit collaborative 
research organization. My remarks today represent my synthesis 
of research findings and are not an official statement of EPRI 
position.
    It is a privilege to provide the committee this testimony 
on the science of mercury. I would like to address three key 
questions: sources of mercury; its deposition from the 
atmosphere to the Earth's surface; its potential accumulation 
in fish.
    Where does mercury in the U.S. environment originate? 
Mercury is clearly a global issue. Recent estimates are that 
2,340 tons of industry-related mercury are emitted globally. 
Over half of this originates from Asian sources. Of the global 
total, the United States is estimated to emit roughly 166 tons 
in total; U.S. utilities about 46 tons. In addition, it is 
estimated that another 1,300 tons of mercury emanates from 
land-based natural sources around the globe, and another 1,100 
or so tons comes from the world's oceans.
    Recent findings from the joint United States and Canadian 
METAALICUS field experiment show that a fairly small amount of 
deposited mercury, no more than 20 percent or so, re-emits to 
the atmosphere, even over a 2-year period. The implications are 
that mercury may be less mobile in the environment than we 
previously thought.
    Studies by EPRI have shown that much of the mercury 
depositing in the United States may originate on other 
continents. Model results show that for three-quarters of the 
continental U.S. land area, more than 60 percent of the mercury 
received comes from outside the country. Only 8 percent of U.S. 
territory receives two-thirds or more of its mercury from U.S. 
sources.
    To check this with data, aircraft measurements were carried 
out by EPRI and the National Center for Atmospheric Research in 
Boulder, Colorado. Mercury in winds from the Shanghai, China, 
region was tracked over the Pacific for 400 miles toward the 
United States. A second set of flights from Monterey, CA found 
that same plume from China crossing the California coast and 
entering U.S. territory. One implication is that there may be a 
management floor for U.S. mercury, a level below which the 
amount of mercury depositing to the surface cannot be reduced 
by domestic action alone.
    Second, what are the primary sources of mercury in fish in 
the environment? Global mercury emissions appear to have peaked 
in the 1980's and declined or held steady since then. Professor 
Francois Morel of Princeton University, and colleagues, 
recently analyzed specific tuna for mercury, comparing recent 
catches with those from the 1970's. Despite changes in mercury 
emissions over those 30 years, mercury levels in tuna did not 
change between the samples. One conclusion they reached is that 
the mercury in such marine fish is not coming from emission 
sources on land, but from natural submarine sources of mercury. 
Again, this implies there may be a management floor for mercury 
in marine fish, which make up most of the U.S. fish diet.
    Third, how can potential mercury reductions change mercury 
deposition? EPRI recently completed work to assess what might 
ensue in the atmosphere and in U.S. fish if further mercury 
emission reductions are carried out in the United States. The 
approach linked models of atmospheric mercury chemistry and 
physics with Federal data on mercury in fish in the U.S. diet, 
along with a model of costs that would be needed to attain a 
given reduction level. There are currently about 179 tons of 
mercury depositing each year in the United States from all 
sources, global and domestic. Current U.S. utility emissions of 
mercury are about 46 tons per year.
    EPRI examined one proposed management scenario that cut 
these utility emissions from 46 tons to 25 tons per year. The 
analysis showed that this emissions cut of 47 percent resulted 
in an average 3 percent decline in mercury deposition in the 
United States. Some isolated locations making up less than one 
one-hundredth of the U.S. land area experienced drops of up to 
30 percent. The economic model showed that costs to attain 
these lower levels would be between $2 billion and $5 billion 
per year for 12 years. This demonstrated that U.S. mercury 
deposition patterns may be relatively insensitive to the 
effects of this single category of sources.
    In addition, most of the fish consumed in the United States 
are ocean fish which would be only slightly impacted by a 
reduction of 24 tons of mercury per year solely in the United 
States, out of 2,300 tons globally. Wild freshwater fish within 
the United States might show a greater reduction in mercury 
content, but they make up a very small part of the U.S. diet, 
compared to ocean or farm-raised fish.
    These deposition changes were translated into how much less 
mercury might enter the U.S. diet via these three categories of 
fish. We found that the number of children born at risk due to 
their mothers' intake of mercury from fish consumed in the diet 
would fall by less than one-tenth of 1 percent.
    So to summarize, a drop of nearly half in utility mercury 
emissions resulted in an average drop of 3 percent in mercury 
depositing to the ground, and a drop of less than one-tenth of 
a percent in the number of children at risk. These recent 
findings are a small part of the massive international research 
effort to understand mercury and its impacts. EPRI and others, 
including U.S. EPA and the Department of Energy, are jointly 
racing to clarify the complex interactions of mercury with 
natural systems, an important part of its cycling, and its 
impacts on human health. With improved understanding, informed 
decisions can be made on the best ways to manage mercury.
    Thank you for this opportunity to deliver these comments to 
the committee.
    Senator Inhofe. Thank you, Dr. Levin.
    Dr. Rice.

     STATEMENT OF DEBORAH C. RICE, TOXICOLOGIST, BUREAU OF 
     REMEDIATION AND WASTE MANAGEMENT, MAINE DEPARTMENT OF 
                    ENVIRONMENTAL PROTECTION

    Dr. Rice. I would like to thank the committee for this 
opportunity to present information on the adverse health 
consequences of exposure to methyl-mercury in the United 
States.
    I am a neurotoxicologist who has worked on the 
neurotoxicity of methyl-mercury for over two decades and have 
published over 100 papers on the neurotoxicity of environmental 
chemicals. Until 3 months ago, I was a senior toxicologist at 
the Environmental Protection Agency. I am a coauthor of the 
document that reviewed the scientific evidence on the health 
effects of methyl-mercury for EPA. This document included the 
derivation of the acceptable daily intake level for methyl-
mercury.
    I would like to focus on four points. No. 1, there is 
unequivocal evidence that methyl-mercury harms the developing 
human brain. No. 2, EPA used analyses of three large studies in 
its derivation of an acceptable daily intake, including the 
studies in the Seychelles Islands which found no adverse 
effects. No. 3, 8 percent of women of childbearing age have 
levels of methyl-mercury in their bodies above this acceptable 
level. And No. 4, studies have documented cardiovascular 
disease in men at low levels of methyl-mercury, suggesting that 
an additional potentially large segment of the population is at 
risk.
    Studies performed around the world have documented harmful 
effects of environmental methyl-mercury exposure on children's 
mental development. Three major studies were analyzed by the 
National Research Council panel in its expert review: in the 
Faroe Islands in the North Atlantic, in the Seychelles Islands 
in the Indian Ocean, and in New Zealand. Two of these major 
studies, as well as six smaller studies, identified impairment 
associated with methyl-mercury exposure. The Seychelles Island 
study is anomalous in finding no effects. Adverse effects 
include decreased IQ and deficits in memory, language 
processing, attention and fine motor coordination.
    The NRC modeled the relationship between the amount of 
methyl-mercury in the mother's body and the performance of the 
child, and calculated the level associated with the doubling of 
the number of children that would perform in the abnormally low 
range. The NRC panel did this for each study separately and for 
all of the three studies combined, including the negative 
Seychelles study.
    EPA used the NRC analyses in deriving its acceptable daily 
intake level of methyl-mercury. EPA performed the relevant 
calculations based on each of the two positive studies, as well 
as the integrative analysis of all three studies. The 
acceptable level is the same whether it is based on the 
integrative analysis of all three studies, or on the Faroe 
Islands study alone.
    The acceptable level would be lower if only the New Zealand 
study were considered. Only if the negative Seychelles study 
alone were used, while ignoring the values calculated for the 
Faroe Islands and New Zealand studies, would the acceptable 
intake level be higher than the current value. EPA believed 
that to do so would be scientifically unsound and would provide 
insufficient protection to Americans.
    Data from a survey representing the U.S. population 
collected over the last 2 years revealed that about 8 percent 
of women of childbearing age had blood concentration of methyl-
mercury above the level that EPA believes is safe. This 
translates into over 300,000 newborns at risk for adverse 
effects on intelligence and memory, ability to pay attention, 
language skills and other abilities that are required to be 
successful in our highly technological society.
    There is an additional concern regarding the potential for 
harm as a result of environmental methyl-mercury exposure. 
Three studies found a relationship between increased methyl-
mercury levels and atherosclerosis, heart attacks and death, 
and it is unknown whether there is a level of mercury that will 
not produce harm. It is important to understand that the 
cardiovascular effects associated with methyl-mercury may put 
an additional very large portion of the population at risk.
    In summary, there are four points that I would like the 
committee to keep in mind. First, at least eight studies based 
on populations around the globe found an association between 
methyl-mercury levels and impaired neuropsychological function 
in children. The Seychelles Islands study is anomalous in 
finding no effects. Second, both the NRC and the EPA included 
the Seychelles Islands study in their analysis. The only way 
that the acceptable intake of methyl-mercury could be higher 
would be to ignore the two major positive studies, as well as 
six smaller studies and rely solely on the one study that 
showed no effects.
    Third, there is a substantial percentage of women of 
reproductive age in the United States with levels of methyl-
mercury in their bodies above what EPA considers safe. As a 
result, over 300,000 newborns each year are exposed to 
potentially harmful levels of methyl-mercury. Fourth, increased 
exposure to methyl-mercury may result in cardiovascular disease 
and even death in men from heart attack, suggesting an 
additional large segment of the population is at risk.
    Additional information has been provided to the committee. 
Thank you for your time and attention.
    Senator Inhofe. Thank you, Dr. Rice.
    Dr. Myers.

STATEMENT OF GARY MYERS, PROFESSOR OF NEUROLOGY AND PEDIATRICS, 
DEPARTMENT OF NEUROLOGY, UNIVERSITY OF ROCHESTER MEDICAL CENTER

    Dr. Myers. Thank you for the opportunity to present the 
views of our research group on the health effects of methyl-
mercury exposure. My name is Gary Myers. I am a pediatric 
neurologist and a professor at the University of Rochester in 
New York, and just one member of a large international team 
that has been studying the human health effects of methyl-
mercury for nearly 30 years. For 20 of those years, our group 
has specifically studied the effects of prenatal methyl-mercury 
exposure.
    In 1971 and 1972, there was an epidemic of methyl-mercury 
poisoning in Iraq. The source of exposure, unlike in Japan, was 
maternal consumption of seed grain coated with a methyl-mercury 
fungicide. We looked at a number of children in that study and 
measured the exposure of the fetus using the maternal hair as 
the biomarker. It is the only biomarker that has been 
correlated with brain levels. We concluded that there was a 
possibility that exposure as low as 10 parts per million in 
maternal hair might be associated with adverse effects on the 
fetus. This value is over 10 times the average in the United 
States and five times the average in Japan, but individuals 
consuming large quantities of fish can easily achieve this 
level.
    The hypothesis of our study in the Seychelles was that 
methyl-mercury from fish consumption might affect child 
development. In fact, we all thought it would. Since millions 
of people around the world consume fish as their primary source 
of protein, we thought it was only reasonable to investigate 
the question directly. We selected the Seychelles for two 
reasons. First, they eat large amounts of fish. The average 
mother eats 10 times as much as women here in the United 
States.
    Second, the fish in Seychelles has an average mercury 
content of about 0.3 parts per million, which is approximately 
the same as commercial fish here in the United States. The 
Seychelles study is a collaborative study which was begun under 
the auspices of the WHO and has been carried out by a U.S.-led 
team of international researchers from the University of 
Rochester, Cornell University and the Ministries of Health and 
Education in Seychelles. The funding has come from the National 
Institute of Environmental Health Sciences, with some minor 
funding from the Food and Drug Administration and the 
governments of Seychelles and Sweden.
    The Seychelles was chosen for a number of reasons, 
primarily because there was no overt mercury pollution and many 
of the factors that complicate epidemiological studies of low-
level exposures were simply not present. There was universal 
free and readily available health care in Seychelles. Prenatal 
care is nearly 100 percent. The birthrate is high and the 
general health of the mothers and children is very good. In 
addition, education is free, universal, and it starts at age 
3\1/2\.
    Before starting the study, we carefully controlled for a 
number of things. To minimize the possibility of bias, a number 
of decisions were made. First, no one in Seychelles, including 
any of the researchers who visit the island, would know the 
level of exposure of any child or mother unless our results 
indicated that children were indeed at risk. Second, because of 
the known problems with developmental delay in certain 
disorders, those children would be excluded from the study. 
Third, the tests administered would include all of the tests 
that have been used in other studies, plus other things that we 
thought might detect subtle changes.
    Fourth, we would do this testing at specific age windows. 
Fifth, we would adjust for multiple confounding factors, things 
that are actually known to affect child development such as 
socioeconomic status, the mother's intelligence, and birth 
weight. And sixth, we established a data analysis plan before 
the data were collected to minimize the possibility that the 
data would just be repeatedly analyzed until the anticipated 
effect was in fact determined.
    We have now carried out five evaluations of the children 
over 9 years. The study has focused on prenatal exposure. The 
exposure of both mothers and children has been in the range of 
concern, from 1 to 27 parts per million. We have done extensive 
testing with over 57 primary endpoints determined so far. The 
study has found three statistical associations with prenatal 
methyl-mercury exposure. One was adverse; one was beneficial; 
and one was indeterminate. These results might be expected to 
occur by chance and do not support the hypothesis that adverse 
developmental effects result from prenatal methyl-mercury 
exposure in the range commonly achieved by consuming large 
amounts of fish.
    The findings from our research have been published in the 
world's leading medical journals, including the Journal of the 
American Medical Association, the Lancet, and a soon-to-be-
published review in the New England Journal of Medicine. We do 
not believe that there is presently good scientific evidence 
that moderate fish consumption is harmful to the fetus. In the 
words of Dr. Lyketsos, a distinguished researcher from Johns 
Hopkins, who wrote the editorial with our Lancet articles:

          ``On balance, the evidence suggests that methyl-mercury 
        exposure from fish consumption during pregnancy of the levels 
        seen in most parts of the world does not have measurable 
        cognitive or behavioral effects in later childhood. However, 
        fish is an important source of protein in many countries and 
        large numbers of mothers around the world rely on fish for 
        proper nutrition. Good maternal nutrition is essential to the 
        baby's health.''

    Thank you.
    Senator Inhofe. Thank you, Dr. Myers.
    We are going to try to adhere to a 5-minute round of 
questioning. Let me just share with you, which I think you 
already know, you folks are looking at the medical effects of 
mercury. We also up here have to consider the economic 
effects--the problems that are out there. Right now on the 
Senate floor, they are debating the energy bill. We have an 
energy crisis in this country, and if coal-fired power should 
go out, and that could happen from either CO2 or 
mercury, it would be a very serious crisis. I think, 
anticipating that this will happen, several people have moved 
off-shore, moved to other places. So that is something that is 
really, I guess you would say, our major, or at least one of my 
major, concerns.
    Now, just for all of the witnesses, you stated that the 
U.S. utility mercury emissions are 46 tons a year. Tell us what 
happens to this mercury. Help us visualize where does it come 
from; where does it go; how much is deposited in the United 
States; how does this compare with the amount that is deposited 
in the United States from global sources.
    Would you like to start, Dr. Rice?
    Dr. Rice. That is really not my area of expertise, so I 
cannot speak to it.
    Senator Inhofe. All right.
    Dr. Myers.
    Dr. Myers. It is not my area of expertise.
    Senator Inhofe. Come on, Dr. Levin.
    Dr. Levin. All right.
    [Laughter.]
    Dr. Levin. Of the various sources of mercury, utility 
mercury is probably the best-studied category, partially 
because there are more individual sources than there are in 
many of the other categories. We believe that, on average, 
roughly half of what comes out of utilities is the divalent 
form of mercury, which is about a million times or so more 
soluble in water than the elemental form, the silvery liquid 
that you probably remember from high school chemistry. So of 
the mercury emitted from all utilities in the United States, 
roughly half is the more highly water-soluble form and the 
other half will tend to go into regional and global 
circulation.
    We calculate that about 70 percent or so of the mercury 
emitted from utilities leaves the United States, and the other 
30 percent or so deposits within the United States across the 
country. These are somewhat similar to the numbers that EPA is 
deriving as well. Some of this mercury that deposits to the 
surface will wind up in receiving waters, and a very small 
fraction of it, probably less than 1 percent, will eventually 
be turned into the organic form by bacterial action. It is that 
organic form that has the potential to reach humans through 
accumulation in some fish.
    Again this does not happen in all waterways and with all 
fish species. It tends to happen in waterways that have full 
food webs that go to high-level fish that grow quite large, and 
it is larger, older fish that tend to accumulate more mercury.
    Of the exposure in the community in the United States, 
almost all of it is through intake from fish and the mercury in 
those fish, although the levels taken in can vary from very 
little or almost none, to amounts of concern. There is almost 
no exposure by inhalation. That is a very small part of the 
exposure.
    So our concern is to follow this mercury from its sources 
through to where it winds up in fish and eventually may be 
consumed by humans. That is the trick, scientifically.
    Senator Inhofe. Thank you, Dr. Levin.
    Dr. Rice, the American Heart Association and the World 
Health Organization recommend that fish should be a part of 
everyone's diet, concluding that the benefits of eating fish 
outweigh the risks of adverse effects, which as you state in 
your testimony are potential risks. Since eating fish offers 
substantial health benefits, shouldn't the EPA's referenced 
dose be revised to take this into account, or does it?
    Dr. Rice. Well, I agree totally, and I have to say that I 
am no longer with EPA so I am not speaking as a representative 
of the agency. I need to make that clear. So some of these 
opinions will be those of the agency when I left, and some will 
be mine.
    But the scientific community at large, and the EPA, and I 
personally recognize that fish is a good source of protein. It 
also confers cardio-protective effects. There are also omega-
three fatty acids in fish that are essential when the fetus is 
building its brain. There is new evidence that eating fish also 
may be beneficial to the mental development or the mental 
function of the elderly. I suspect that it is probably 
important for all of us.
    So the dichotomy is not eat fish/don't eat fish. The 
important thing to be able to do is to come out with some 
recommendations to the community that allow people to eat fish, 
but not to eat fish that has increased levels of methyl-
mercury. So EPA thinks, and I was part of that EPA panel, and 
when I was part of it we firmly believed, that the RfD should 
not be any higher, and in light of some evidence that we were 
not able to analyze at the time, it might even need to be lower 
than it is presently.
    So it is not a question of increasing the reference dose. 
It is a question of making sure that the American public can 
eat fish that does not have undue levels of methyl-mercury in 
them.
    Senator Inhofe. Thank you very much.
    Dr. Myers, in selecting the Seychelles as a location for 
your research, what other locations did you consider other than 
the Seychelles Islands?
    Dr. Myers. We started studies on the coast of South America 
and looked also at the Maldive Islands as another possibility.
    Senator Inhofe. Yes. I kind of wanted to get to the Faroe 
Islands. Did you consider them for your research?
    Dr. Myers. We did not consider the Faroes in our research.
    Senator Inhofe. It is my understanding that, and for those 
of us who are not scientists here, that some of the problems, 
let's take the Faroe Islands and see if I have this right, that 
there is an inordinate amount of whale meat that is consumed 
there and there are PCBs in there. I do not know whether you 
can distinguish between the harm of one or the other, but is 
this a factor that should be considered?
    It is my understanding, and I won't say this right, but 
there are different levels of mercury that are found. One is 
from the primary fish, and the other is from whales that eat 
other fish, so it has a multiplying effect. Is this taken into 
consideration?
    Dr. Rice. The Faroe Islands study and the Seychelles 
Islands together have been reviewed by at least two very 
distinguished peer-review panels. That issue, the issue of the 
pattern of intake of methyl-mercury and potential co-exposure 
for PCBs has been discussed extensively by the scientific 
community.
    The Faroe Islands' population does eat whale meat. They may 
eat a large whale dinner occasionally. They also tend to dry 
the whale meat, and so they snack on it in addition to eating a 
so-called bolus dose, what we call a bolus dose. So they have a 
low level of methyl-mercury intake which may be occasionally 
punctuated with a higher intake level. The source of methyl-
mercury does not matter, whether it is through fish or through 
whale. So the fact that it is whale meat per se is not really 
relevant.
    None of the panels, including the National Research Council 
panel, could come to any kind of conclusion about the 
importance of the pattern of intake, because the data just are 
not available. There just are not scientific data that speak 
directly to that. But because this was raised as a concern, and 
because they had stored hair from their population, the Faroe 
Islands investigators were able to go back and do segmental 
analysis, in which you cut the hair up into tiny little pieces 
and look at mercury levels along the length of the hair.
    What they did was they eliminated the mothers that had the 
most variable hair levels that might suggest that there was 
this bolus exposure of these particular women and these 
particular fetuses. What they found was that the effect was 
actually stronger when they eliminated these women, which makes 
a certain amount of sense because you are decreasing 
variability when you do that.
    Senator Inhofe. Thank you, Dr. Rice.
    Senator Jeffords.
    Senator Jeffords. Thank you all for your testimony on this 
very important and timely topic.
    Some of you have seen this morning's New York Times full-
page article on mercury and its health effects. This helps to 
set a context for our discussion.
    Dr. Rice, what exactly is a reference dose level and what 
does it mean in terms of the so-called safe levels of fish 
consumption? Does EPA's reference dose level include a built-in 
tenfold safety factor?
    Dr. Rice. The reference dose is designed to be a daily 
intake level that a person could consume over the course of 
their lifetime without deleterious effects. So it is designed 
to be the amount of mercury you could eat every day in your 
life and not harm yourself.
    Now, when EPA did its calculation, it is important to 
understand that when the National Academy of Sciences modeled a 
number of endpoints for each of the studies--the Faroe Islands 
study and the New Zealand study, both of which found effects, 
as well as the Seychelles study, which did not--they identified 
not a no-effect level but a very specific effect level. That 
effect level is associated with a doubling of the number of 
children that would perform in the abnormal range, in other 
words, the lowest 5 percent of the population. So this is in no 
way a no-effect level.
    To that, the EPA applied a tenfold so-called uncertainty 
factor. The point of that was to take into account things that 
we did not know, data that we did not have, as well as the 
pharmacodynamic and the pharmacokinetic variability. Now, there 
were actually data, again modeled and reviewed by the NAS, 
showing that the pharmacokinetic variability, in other words 
the woman's ability to get rid of methyl-mercury from her body, 
differs by a factor of three. So that already takes up half of 
the uncertainty factor.
    But in addition to that, it is important to understand that 
when the Faroe Islands folks analyzed their data, they 
eliminated mothers with mercury levels above 10 ppm in their 
hair, which was really right about at the effect level that the 
NAS identified. The effects were just about as strong even 
below 10 ppms. So again, that is very strong evidence that 
there is not a factor of 10 safety.
    In addition to that, when the NAS modeled the data, it 
turned out that in both the New Zealand study and the Faroe 
Islands study, not only was there no evidence of a threshold, 
in other words a level below which there were no effects, but 
in fact the curve was actually steeper at the lower levels. The 
NAS used a straight line when they modeled the data because 
they were uncomfortable about using curves that were steeper at 
the lower end than at the higher end, but subsequent to that, 
studies have come out with regard to lead exposure, for 
example. There are now several studies where that has also been 
found for lead exposure.
    So this may in fact be a very real effect. So not only is 
there not a safety factor of 10. There might be virtually no 
safety factor at all.
    In addition to that, there is something that EPA recognized 
at the time but was not able to quantitate because we did not 
have the data; it has now been quantitated. We assumed that the 
mother's blood level of methyl-mercury and the fetus' blood 
level of methyl-mercury were the same, because of course we 
measure the body burden in the cord blood of the fetus and have 
to work back to intake by the mother. We know now that in fact 
the ratio is more like 1.7, and for some mothers it is over 3.
    So if we were to recalculate the reference dose just based 
on this new information, it would decrease from 0.1 to 0.06.
    Senator Jeffords. Dr. Rice and Dr. Myers, would you 
recommend that Members of Congress and regulatory agencies base 
their decisions on whether and how much to reduce human-made 
mercury emissions on the findings from any one study?
    Dr. Myers. Our group has been involved in the science of 
studying whether you could find effects at low levels, and we 
have not been involved in policy. There is a general scientific 
principle, I think it is important to look at multiple 
different studies. However, these studies are complicated and 
one has to look at what kind of studies you are dealing with. 
Some are simply descriptive. They take a group of people and 
describe something. It is a basic epidemiological principle 
that you cannot assign causation from a descriptive study.
    So one has to look at the studies that are larger and 
follow children over time, and control for a lot of confounding 
factors, which complicate these types of studies very much 
actually. The Seychelles study in fact is not a negative study, 
as has been stated. We did, in fact, find associations with 
things that are known to affect child development, such as 
socioeconomic status, maternal intelligence, the home 
environment and other things. What we did not find was an 
adverse association with prenatal methyl-mercury exposure in 
the Seychelles.
    Senator Jeffords. Dr. Rice.
    Dr. Rice. I agree with Dr. Myers. These studies are very 
complex. I think that that is even more reason not to rely on 
one study while eliminating other studies for consideration.
    Again, these studies have been peer-reviewed numerous 
times. The Seychelles Islands study and the Faroe Islands study 
have been reviewed now by several panels. They are both thought 
to be very high quality, very well-designed and well-executed 
studies.
    The NAS, as well as the previous panel, talked at great 
length about what might account for the differences between 
these studies. We really do not know what accounts for the 
differences between these studies. The NAS modeled three 
studies. The New Zealand study was also a positive study.
    The National Academy of Sciences concluded, and the EPA 
agreed with them, that it was not scientifically justifiable, 
for protection of the health of the American public, to rely on 
the negative study and exclude the two positive studies.
couple of times in my testimony that what the NAS did to try to 
address that was to do an integrative analysis that included 
all three studies, including the Seychelles Islands study, and 
modeled it statistically.
    When EPA then took those analyses, what we did was derive a 
series of reference doses, kind of sample reference doses, that 
were based on a number of endpoints from both the New Zealand 
study and the Faroe study, as well as on the integrative 
analysis of all three studies. The integrative analysis of all 
three studies also yields a reference dose of 0.1. So that made 
me personally very comfortable that we were doing the right 
thing scientifically in our derivation of the reference dose.
    Senator Inhofe. These are supposed to be 5-minute rounds 
and it has been 8 minutes, so we will recognize Senator Allard.
    Senator Allard. Dr. Rice and Dr. Myers, you have in your 
comments talked about methyl-mercury as being the toxic 
compound as far as human health is concerned. Are there other 
mercurial compounds that are toxic to humans?
    Dr. Rice. Yes. All forms of mercury are toxic to humans.
    Senator Allard. Including the elemental form?
    Dr. Rice. Yes.
    Senator Allard. OK.
    Dr. Rice. But in terms of environmental exposure, it is 
really the methyl-mercury form that we are worried about 
because that is the form that gets into the food chain and is 
concentrated and accumulated up the food chain. That is what 
people actually end up being exposed to.
    Senator Allard. OK. Thanks for clarifying that. I 
appreciate that. So this gets into the environment and 
consequently in the fish or food chain or whatever. Is the 
starting point always bacteria operating on the elemental form 
of mercury? Or is it these various compounds that bacteria 
operate on and then end up being assimilated into the food 
chain? How does that happen?
    Dr. Rice. In most circumstances, it is the inorganic form, 
not the elemental mercury, but the inorganic form that is 
available to be taken up by various microorganisms.
    Senator Allard. How do we get to that organic form, the 
methyl-mercury? How do we get there?
    Dr. Rice. The microorganisms actually put a methyl group on 
as part of their metabolic processes.
    Senator Allard. Do they get that from elemental mercury? 
Is that the origin, or is it various compounds of mercury?
    Dr. Rice. Yes, it is just straight mercury. Now, in the 
Japanese outbreak, it was actually methyl-mercury that was put 
into the water, but that is a relatively unusual situation.
    Senator Allard. I see. OK, so my understanding, Dr. Levin, 
is that a lot of the mercury that is introduced into the 
environment of this country does not originate within the 
borders of this country. Is that correct? The suggestion is 
that a lot of the mercury that comes across, that we may pick 
up in the soil, is actually carried over by wind and whatnot 
from the Asian countries. Is that correct?
    Dr. Levin. That is correct, Senator, as far as the modeling 
shows, and that is consistent with work that EPRI has done, EPA 
and others have also done in the modeling.
    Senator Allard. Is this the elemental mercury that is being 
brought over?
    Dr. Levin. It is elemental, or the elemental form. It is 
also the inorganic form or the form that can be combined into 
salts.
    Senator Allard. Now, the inorganic form is not processed 
into the food chain? Did I understand that correctly?
    Dr. Levin. It is the inorganic form that is processed into 
the food chain.
    Senator Allard. Yes, it is the organic form.
    Dr. Levin. The two forms that are emitted from combustion 
sources are the elemental form, the element as found on the 
periodic chart.
    Senator Allard. Right.
    Dr. Levin. And the inorganic form, which combines with, for 
example, chlorine, to form the pure chloride, or is the form 
also found in minerals. Of those two forms, when they wind up in 
the proper aquatic environments, it is the inorganic form that may 
be methylated and turned into the organic form.
    Senator Allard. Right.
    Dr. Levin. But it has to go from elemental to inorganic 
before the methylation can occur.
    Senator Allard. But my question is, is that the type of 
mercury that is being brought in from Asia, what form of 
mercury is that?
    Dr. Levin. Because of its long-range transport, it is 
primarily the elemental form, but the atmospheric chemistry of 
mercury changes that progressively into the inorganic form, 
which is the form that readily deposits.
    Senator Allard. Now, can the inorganic form be transferred 
into methyl-mercury?
    Dr. Levin. Yes, sir. That is the form.
    Senator Allard. So all those type of compounds get acted on 
by bacteria and then that is how that gets into the food chain.
    Dr. Rice. The elemental form and the inorganic form are 
converted back and forth.
    Senator Allard. I see.
    Dr. Rice. So it does not make any difference whether it 
reaches the North American shores as elemental mercury or 
inorganic mercury. Once it is deposited into the soil or the 
river, it is going to become inorganic mercury that then 
becomes available to be turned into methyl-mercury.
    Senator Allard. OK, thank you.
    Now, here is the question, and I would like to have all of 
you respond to this. In your opinion, would a decrease in U.S. 
anthropogenic mercury emissions have an effect on global 
mercury levels? And part of the rest of the question is, 
apparently there is a high percentage of mercury present in the 
United States from outside our borders, so what effects can we 
expect from a decrease in our emissions? We have a couple of 
questions there and I would like to have all of you respond to 
those if you would.
    Dr. Rice. There is no question that there is a global 
cycling of mercury. A lot of the mercury in the United States 
comes in from someplace else, comes in from the West, but some 
of it may have in fact originated in the United States 
originally. This stuff really does circle the globe. So just 
because it is coming in from the West does not mean it wasn't 
ours to start with.
    Senator Allard. We do not know how much starts here.
    Dr. Rice. No, we do not, and I am not a modeler so I really 
cannot speak to that. But what I do know is that there is local 
deposition. In other words, the mercury that is released from 
power plants in the Midwest ends up downwind. I just moved to 
Maine, and Maine is the so-called tailpipe for that local 
deposition, for that local emission. There is a percentage of 
it, and Dr. Levin can tell you what the percentage is better 
than I can, that is locally deposited. I think it is something 
like 30 percent.
    Getting rid of those local sources would certainly at least 
help the Northeastern United States. Originally, from the modeling, 
it was thought that this would take a long, long time. There 
are newer data now where small studies have actually been done 
that suggest that it might not be as grim as we originally 
thought; that these local changes can take place in a 
relatively shorter time, over the course of several years, 
rather than decades and decades as we originally may have 
feared.
    Senator Allard. Dr. Myers, do you have a comment on that?
    Dr. Myers. It is outside of my area of expertise.
    Senator Allard. Dr. Levin.
    Dr. Levin.  Dr. Rice is primarily correct on that. The 
deposition within the United States makes up about 30 percent 
of U.S. emissions. The rest of the emissions go globally. Our 
modeling considered the fate of U.S. emissions and accounted 
for the amount that basically circles the globe and comes down 
after one trip around the world.
    It is also correct that there is local deposition that in 
some cases may be significant near particular groupings of 
sources. I indicated that in my testimony, that although the 
average change in deposition for the scenario was 3 percent, 
there were some small areas where it was as much as 10 times 
that on a percentage basis.
    So it calls for more detailed studies and particularly more 
looking at the science of tracking mercury found in fish back 
to its sources scientifically, that is, figuring out where it 
came from.
    Senator Allard. Thank you, Mr. Chairman. I believe my time 
has expired.
    Senator Inhofe. Yes, thank you.
    Senator Carper.
    Senator Carper. Thank you, Mr. Chairman.
    To our witnesses, again thank you for joining us. Thank you 
for your patience in bearing with us.
    Dr. Rice, did I understand you to say you have concluded 
two decades of work at EPA?
    Dr. Rice. Well actually most of it was not at EPA. I was at 
Health Canada for 22 years. I am American, but I graduated from 
the University of Rochester, got my Ph.D. from the University 
of Rochester so I have known Dr. Myers for many years. Then I 
went up there to work at Health Canada.
    Senator Carper. I see. Thank you for your service at EPA, 
and thank you all for real interesting testimony today.
    Sometimes these are fairly technical issues. What is 
helpful for me as I listen to the comments of each of your 
testimonies and your responses to our questions is to look for 
threads of consensus; not to focus so much on where you 
disagree, but to find some areas where you agree. I would just 
ask each of you to take a minute or two and just to talk about 
some of the areas where you think you agree, and which might be 
helpful to us as we wrestle with whether to craft legislation, 
enact legislation along the lines that Senator Jeffords has 
introduced, I have introduced, or the President has proposed.
    Can you help me with that? Dr. Levin, why don't you go 
first.
    Dr. Levin. Thank you, Senator. We agree that mercury is a 
highly toxic compound. Its presence in the U.S. diet may in 
some instances cause concerns for the neurological development 
of children. We agree that there may be other health effects 
that have to be looked for.
    We also agree that the science of mercury is still 
emerging; that the linkage between health effects in particular 
areas, or for that matter in entire regions of the United 
States, and the sources of mercury is a critical question that 
would shape a wise course toward management decisionmaking. The 
work that I have been describing today is a step in doing that. 
The work that has been described by the other two witnesses 
today on health effects is a critical part of that linkage.
    Bringing this source-receptor issue together with the 
health effects on a specific geographic basis and among 
specific populations within the United States is a key part in 
answering the management questions.
    Senator Carper. Thank you.
    Dr. Myers, would you take a shot at my question please?
    Dr. Myers. I think we all agree that mercury is poisonous, 
every form. In high enough amounts, it is not only damaging to 
human health, but generally fatal. We all agree that it is 
worthwhile cleaning up the environment, I think. The question 
resolves at what level and at what cost. I think we all agree 
that these studies are extremely difficult to carry out and 
they are equally difficult to interpret because there are so 
many details to them. So it is so easy to end up with a bias 
either knowingly or unknowingly, generally I think unknowingly, 
that the interpretation of the details becomes incredibly 
important in these studies.
    Senator Carper. Thank you.
    Dr. Rice.
    Dr. Rice. I agree that we all know that methyl-mercury is 
toxic at high levels. There is absolutely no question about 
that. I agree with Dr. Myers that it is incredibly difficult to 
interpret these studies very often. They are very complex 
studies. There are a lot of variables, many of which we do not 
know. Epidemiology is an extremely blunt instrument. So that is 
why I think that it is important to look at the weight of 
evidence. There are a number of studies in humans that have 
documented effects of methyl-mercury at relatively low body 
burdens. In addition to that, there is a huge animal literature 
documenting effects and looking at the mechanisms of effects.
    We do not know why one study may be positive, whereas 
another may be negative. So we really have to go with the 
evidence as a whole.
    Senator Carper. And maybe cite your most serious area of 
disagreement among you as panelists.
    Dr. Levin. I would say disagreement probably rests in the 
question of the direction of research overall on the mercury 
issue, and how far that should continue.
    Senator Carper. Dr. Myers.
    Dr. Myers. I think the most serious area of disagreement is 
in the interpretation of the studies. We think that the Faroe 
Islands research is outstanding research. They have done a 
wonderful job. They have a great design. We are just not sure 
that they have been able to tease out from the mixture of 
chemicals present in whales a methyl-mercury component to it. 
That requires a lot of faith in their statistics and the 
details of the studies.
    In the case of the New Zealand study, most people 
discounted the New Zealand study for many years. It was only 
when it was reanalyzed in the late 1990's that people began to 
start thinking of it in other terms. So I think our biggest 
disagreement is in the interpretation of it.
    In addition, I think the weight of hundreds of small poorly 
done studies in difficult places such as the Amazon would never 
outweigh a really good study done looking at fish consumption.
    Senator Carper. Dr. Rice.
    Dr. Rice. I guess everything that Gary Myers just said is 
my biggest point of disagreement. All of the smaller studies 
are not poorly done. Some of them are well done. The Faroe 
Islands study and the Seychelles study have been extensively 
reviewed. They are both considered to be very, very good 
studies.
    The National Academy of Sciences looked at the issue of PCB 
co-exposure very, very carefully and asked the investigators to 
go back and do a number of additional analyses. Their 
conclusion was that the effects seem to be independent of each 
other. These are both neurotoxicants. Although they both had 
effects in the study, the NAS conclusion was that they were 
independent.
    Again, I think that we have to go with a preponderance of 
evidence and not on just one study, no matter how well it has 
been done.
    Senator Carper. Mr. Chairman, I think this panel has been 
especially helpful to me. We thank you very, very much for your 
contributions today. Thank you.
    Senator Inhofe. Thank you, Senator Carper.
    Senator Clinton.
    Senator Clinton. Thank you, Mr. Chairman. I, too, want to 
thank the panel and welcome Dr. Myers from the University of 
Rochester, and Dr. Rice, your connection with Rochester, we 
will claim that as well.
    I want to pick up where Dr. Rice just concluded. We have 
set up a system of evidence in our legal system that looks at 
the preponderance of evidence; that looks at a reasonable 
person standard. I share Dr. Rice's concern that we are not 
adequately responding to the evidence we already have, which I 
think the preponderance of it, certainly based on the review by 
the National Academy of Sciences, suggests that we have a 
problem with the transmission mostly in utero by mother to 
child that leads to neurological problems that in turn lead to 
poor school performance.
    The 2000 report of the National Academy of Sciences found, 
I believe, that about 60,000 children might be born in the 
United States each year with this level of exposure that could 
affect school performance, but in your testimony you claim that 
more recent results from the CDC's National Health and 
Nutrition Examination Survey translate into over 300,000 
newborns per year. Is that correct?
    Dr. Rice. Yes. When the NAS did their analysis, the NHANES 
data was not available. The NHANES just started taking mercury 
blood and hair levels a couple of years ago, so those data have 
really become available since the NAS. They state that their 
60,000 children was an estimate. Based on actual data that are 
representative of the U.S. population, it is actually about 
320,000 children who are above the EPA's reference dose.
    Senator Clinton. To me, this is truly alarming, that we 
have actual blood, hair sample, other kinds of physical 
examination which demonstrates that hundreds of thousands of 
our children are born each year potentially at risk for adverse 
effects on intelligence, memory, ability to pay attention, 
ability to use language and other skills.
    Mr. Chairman, we are facing an increasing number of 
children in our school systems with learning disabilities. 
There are not any easy answers as to why the number of 
children with such learning disabilities has increased. Senator 
Jeffords has been a champion of making sure that all children 
are given an adequate education. In New York alone, we have 
260,000 learning-disabled children. That is 50 percent of our 
special ed population. We spend $43 billion each year--$43 
billion--on special ed programs for individuals with 
developmental disabilities between the ages of three and twenty-one.
    Of course, not all special ed needs are the direct result 
of methyl-mercury exposure, but if it is demonstrably shown, as 
we now have with evidence from the CDC's annual survey, that we 
have levels of methyl-mercury in our children's bodies that are 
above what the EPA has determined to be healthy, and in fact 
some of us think the EPA standard is too low, but nevertheless 
if it meets that standard, then I would argue we have got to 
figure out how to address this environmental health challenge 
in very short order.
    I have been working with a number of colleagues to try to 
address the better data collection and environmental health 
tracking that they need in the Individuals With Disabilities 
Act, and I think similarly on the scientific side with respect 
to better research and better analysis. But it is troubling to 
me that we are looking at a problem where the preponderance of 
the evidence I think is clear, where we know that there is a 
transmission, whether it is 60,000, 150,000, 300,000-plus 
children, and it needs some more effective response.
    I wanted to ask you, Dr. Rice, now that you are in Maine, 
from the State perspective, how closely do you work with the 
State health department on environmental health issues? Do you 
exchange information with the State health department and even 
with the State education department about some of the work that 
you are doing?
    Dr. Rice. I actually knew the State toxicologist for Maine 
quite well before I went up there, so I do interact with the 
health department. The methyl-mercury issue is very important 
to Maine. Maine has a very good program for trying to get rid 
of methyl-mercury from dental amalgams, from thermometers, from 
the kinds of things that can be controlled; to not put mercury 
in landfills because Maine understands that we are at the end 
of the pipeline for methyl-mercury deposition. Maine has a 
terrible problem with fish advisories. There are a lot of 
places where fish cannot be eaten in Maine because of the 
deposition of methyl-mercury.
    So I do work closely with the folks over there, and in fact 
my way here was paid by the air office, the Maine air office 
because the State of Maine is so very concerned about this 
issue. Maine is rural and it is poor, and it cannot really 
absorb the consequences of these kinds of additional exposures 
on the health of the people of Maine.
    Senator Clinton. Similarly, new science is demonstrating 
that we need lower standards for lead, based on what we are now 
determining. A lot of that groundbreaking work was done at the 
University of Rochester about lead exposures and the impacts of 
lead exposure. We can take each of these chemicals or compounds 
piece by piece, but I think that certainly when it comes to 
mercury and lead and their impacts on children's development, 
it is not something I feel comfortable studying and waiting too 
much longer on, particularly because there are so many indirect 
costs. I know that Dr. Levin's work looked at some of the risks 
and cost-benefits, but people do not seem to factor in this 
special education population that has been growing.
    Dr. Rice. If I may make a comment, I think your analogy is 
an apt one, and I think it is a very informative one. In 1985, 
there was a report to Congress on the cost-benefits of lead, of 
keeping lead out of gasoline, in fact. The benefits based on 
not only special education and things like lower birth weight 
with respect to lead, but also just the economic consequences 
of lowering the IQ of workers amounted to billions and billions 
of dollars a year in 1985 dollars or 1994 dollars. So as this 
effort goes forward in terms of figuring out how much it is 
going to cost to reduce mercury emissions, this other side of 
the equation, how much it is going to cost not to, needs to be 
kept very, very well in mind.
    Senator Clinton. Thank you, Dr. Rice.
    Senator Inhofe. Thank you, Senator Clinton.
    I thank the panel very much for their testimony.
    Senator Jeffords. I had a couple more questions.
    Senator Inhofe. Well, all right. It has to end at 12 
o'clock. Go ahead.
    Senator Jeffords. Dr. Levin, before setting a mercury MACT 
standard, would you agree that it makes sense for EPA to 
conduct a full modeling analysis of all available technology 
options and their emissions reduction potential, including the 
most stringent options?
    Dr. Levin. Yes, Senator. I think it is important for EPA to 
carry out a parallel study as EPRI has done, and to make that 
study public, as we have as well. I am not aware yet that they 
have actually done any modeling of a MACT standard since there 
has been no official proposal of one yet.
    Senator Jeffords. Dr. Myers, I believe your testimony is 
that the fish consumed with an average mercury content of 0.3 
parts per million has about the same mercury concentration as 
commercial fish in the United States. What is the 
concentration in non-commercial fish?
    Dr. Myers. Are you talking about the United States or the 
Seychelles?
    Senator Jeffords. In the United States.
    Dr. Myers. Well, all fish has some mercury in it. Most of 
the commercial fish in the United States, I understand, has 
less than 1/2 part per million, but some of the fish, I am 
not sure what the non-commercial ones are, but it can go up to 
over two or three parts per million in some freshwater fish.
    Senator Jeffords. Dr. Rice and Dr. Myers, can you 
characterize the body burden of the pollutants like mercury in 
American children compared to the levels found in the 
Seychelles children?
    Dr. Myers. The average hair level in the mothers in 
Seychelles is 6.9 parts per million in the group we were 
studying. The average in 
the United States is less than one part per million. The 
average in Japan is somewhere around two parts per million.
    Senator Jeffords. Dr. Rice, any comment?
    Dr. Rice. No. That is correct, but I think it is important 
to understand that the NHANES data did identify some women, a 
very small percentage of women with higher hair mercury levels. 
I think it is important also to understand that the NHANES data 
are designed to be representative of the U.S. population as a 
whole, so women who may eat more fish and may be at greater 
risk for increased body burdens of methyl-mercury, such as 
immigrant populations or populations of people who are 
subsistence anglers and who eat inland fish, are not captured. 
These populations are not captured by the NHANES data, and I 
think that this needs to be kept in mind.
    Senator Jeffords. I have some further questions I would 
like to submit.
    Senator Inhofe. That would be perfectly appropriate. I 
appreciate it very much, and I appreciate the panel coming and 
also your patience from the long first session.
    We are now adjourned.
    [Whereupon, at 12 o'clock p.m. the committee was adjourned, 
to reconvene at the call of the chair.]
    [Additional statements submitted for the record follow:]
   Statement of Hon. John Cornyn, U.S. Senator from the State of Texas
    Mr. Chairman, I commend you for holding this important hearing 
examining what is known about the science of climate change, mercury 
and the potential health effects of mercury emissions from power 
plants.
    Given the timing of the energy debate on the Floor and this 
Committee's ongoing consideration of the Clear Skies Act, this is a 
very timely and important topic and I commend the Chairman for setting 
time aside to focus on the issue. I realize our focus today with regard 
to climate change is on the science, principally on temperature change. 
Two very different trains of thought are about to be presented to us 
today, and I think this is positive and encourages a good, healthy 
debate. The question that this panel has to wrestle with is whether to 
move ahead with a greenhouse gas policy that may or may not be based on 
sound science. I am concerned about the costs in moving forward when 
there is a large body of science out there that says there isn't a 
problem.
    To shift our focus just a bit, an issue of particular concern to me 
is the available technology to control greenhouse gas emissions, 
specifically CO2. I am fairly certain that some of my 
colleagues agree with the line of thought about to be outlined by Dr. 
Mann, and this could very well lead this committee to a debate imposing 
mandatory controls on CO2. If this turns out to be the case 
it is imperative that this Committee determine whether or not the 
technology is currently available to accomplish CO2 
reductions that are effective enough to solve the ``problems'' thought 
to be faced. I realize this is a topic for another hearing, but one 
that causes me concern.
    With regard to mercury, in the 1990 amendments to the Clean Air Act, 
Congress specifically requested that EPA conduct an analysis of the 
health effects of mercury emissions from power plants and report back. 
EPA did conduct that study in 1997 and concluded that there was a 
``plausible link'' between mercury emissions and potential health 
effects, but was unable to quantify the link.
    Six years have passed since EPA's 1997 study. Unfortunately, we 
still have not received any clarification from the EPA as to the 
magnitude of the health risks posed by power plant emissions, even 
though we are currently on the verge of spending billions of dollars to 
reduce those emissions.
    I suspect that one of the reasons for this lack of information is 
that we are dealing with a global problem. Many people today may find 
it surprising to learn that most of the mercury that is deposited in 
the United States originates from outside our borders. In fact, for 
most of the country, 60 to 80 percent of the mercury deposited in the 
United States comes from emission sources located in another country. 
Additionally, natural sources of mercury, such as forest fires and 
vegetation burning, account for over half of the world's mercury 
emissions.
    What this means is that we have control over only a very small 
portion of total mercury emissions. Of the 5500 tons of mercury emitted 
globally, the U.S. accounts for only about 155 tons, or 3 percent of 
global emissions. U.S. power plant emissions, which are estimated to be 
48 tons per year, represent less than 1 percent of total global 
emissions. Given how small this fraction is, it is both reasonable and 
prudent to ask what impact controls on power plants will have on actual 
public health.
    While EPA has unfortunately not provided us with any data on that 
question as of yet, Leonard Levin from the Electric Power Research 
Institute has. 
According to his very detailed analysis, control programs to reduce 
mercury emissions from power plants are likely to have less than a 1-
percent impact on public exposure in this country. In fact, he 
estimates an impact of less than 0.3 percent. I do not know if this 
number is correct, but I think his very detailed analysis deserves 
comment from EPA, especially given that this was exactly the kind of 
information Congress sought in 1990 when it amended the Act.
    I look forward to hearing Dr. Levin's testimony, as well as Dr. 
Rice's and Dr. Myers'. Your collective input is critical to this 
committee as we continue to debate the Clear Skies initiative.
    I yield back the balance of my time.
                                 ______
                                 
       Statement of Dr. Willie Soon, Harvard-Smithsonian Center 
                            for Astrophysics

    Distinguished Senators, panelists, and audience: My name is Willie 
Soon. I am an astrophysicist with the Harvard-Smithsonian Center for 
Astrophysics in Cambridge, Massachusetts. My training is in atmospheric 
and space physics and my sustained research interests for the past 10 
years include changes in the Sun and their possible impact on climate.
    This very rich area of scientific research, though still far from 
having definitive answers, has seen exciting and important progress 
from our increasing technical ability to measure, quantify, and 
interpret the changes in the Sun which could be linked to changes of 
the Earth's climate.
    Today I focus on my latest research conclusions regarding climate 
change over roughly the last 1000 years, especially the geographical 
pattern of those changes. My scientific study is only possible because 
of the careful research produced by nearly one thousand scientists 
around the world. Their expertise covers a very wide range, including 
physical, chemical, biological, and geological sciences.
    Together with several colleagues whose names are listed in the two 
scientific papers that I am submitting today for the record of this 
testimony, we have synthesized the results from several hundred studies 
of proxy records of climate, including much new work that has appeared 
in the scientific literature in the last 5 to 10 years.
    Climate proxies are indirect climate sensors based on information 
from tree rings, ice and seafloor sediment cores, corals, glaciers and 
other natural evidence. They also include important cultural and 
documentary records.
    It is important to recognize that these climate proxies are not 
temperature readings, but some proxies may be calibrated to give 
temperature changes. One example is the measurement of the flow of heat 
in boreholes drilled through rocks or ice, yielding century-scale 
temperature changes over several millennia. On the other hand, some 
proxies are sensitive to local rainfall as well as temperature, as in 
the case of annual tree growth in the southwest United States. Any 
given proxy may respond to temperature differently from other proxies, 
depending on, for instance, the type of proxy, location, or season.
    For all those reasons, it remains a big challenge to produce an 
accurate global temperature record over the past 1000 years from the 
diverse set of climate proxies.
    But within the limits and lessons learned from our research papers, 
we can offer three conclusions:
    First, local and regional, rather than ``global'', changes are the 
most relevant and practical measure of climate change and impact. This 
is because truly global averages rarely are available from the distant 
past, before modern satellite measurements, and because such averages 
can hide the significant changes that can occur over large parts of the 
Earth.
    Second, on a location by location basis, there was a widespread 
Medieval Warm Period between approximately 800 and 1300 A.D. This 
Medieval Warm Period was followed by a widespread colder period, called 
the Little Ice Age, that lasted from approximately 1300 to 1900 A.D.
    Third, there is no convincing evidence from each of the individual 
climate proxies to suggest that higher temperatures occurred in the 
20th century than in the Medieval Warm Period. Nor is there any 
convincing evidence to suggest that either the rate of increase or the 
duration of warming during the 20th century were greater than in the 
Medieval Warm Period.
    The fact that local and regional climate has been varying with 
significant swings in amplitude over many locations provides important 
challenges for computer simulation of climate. The full models that 
explore the Earth region by region can test for the natural patterns of 
change over the last 1,000 years through the use of the climate proxies 
we just discussed. In that way, the effects of human-caused climate 
change can be weighed against observed natural variability in the 
climate system. Having computer simulations reproduce past climate, 
which has been influenced predominantly by natural factors, is key to 
making an accurate forecast that includes all potential human-made 
warming and cooling effects.
    Further research could yield a deeper, quantitative improvement to 
our knowledge of local and regional climate variability during the past 
1000 years. We could be inspired by Mr. Thomas Jefferson, who 
remarked:

          ``It is a common opinion that the climates of the several 
        states of our union have undergone a sensible change since the 
        dates of their first settlements; that the degrees of both cold 
        & heat are moderated. The same opinion prevails as to Europe; 
        if facts gleaned from history give reasons to believe that, 
        since the times of Augustus Caesar, the climate of Italy, for 
        example, has changed regularly at the rate of 1 [degree] of 
        Fahrenheit's thermometer for every century. May we not hope 
        that the methods invented in latter times for measuring with 
        accuracy the degrees of heat and cold, and the observations 
        which have been & will be made and preserved, will at length 
        ascertain this curious fact in physical history?''--Marginal 
        notes from Thomas Jefferson's Monticello Weather Diary (January 
        1, 1810 to December 31, 1816).

    I strongly believe that the time for research in paleoclimatology 
to fulfill this important role is now.

[GRAPHICS: TIFF images T2381.001 through T2381.095 omitted]

   Response by Dr. Willie Soon to Additional Questions from Senator 
                                Jeffords

    Question 1. In testimony, you said that you did not know whether 
you submitted something for publication to Capitalism magazine. Here is 
the title and web address: ``Global Warming Speculation vs. Science: 
Just Ask the Experts'' by Sallie Baliunas & Willie Soon (Capitalism 
Magazine--August 22, 2002) http://capmag.com/article.asp?ID=1816. Did 
you submit or approve submission of this article for publication?
    Response. With the benefit of your reminder, I hereby confirm that 
the above-mentioned article in Capitalism Magazine was taken from the 
original article ``Just Ask the Experts'' by Baliunas and Soon, 
originally published by TechCentralStation.com at the link: http://
www.techcentralstation.com/072302B.html. I did not submit the article 
to Capitalism Magazine.

    Question 2. In your testimony you indicated that your training is 
in ``atmospherics.'' Could you please explain this term more fully, and 
indicate your formal training in paleoclimatic studies and analysis?
    Response. My PhD thesis\1\ was on collisional-radiative properties 
of high-temperature, partially ionized nitrogen, oxygen, helium and 
hydrogen plasmas at conditions relevant to the Earth's atmosphere. This 
is why I mentioned that I had formal training in ``atmospheric and 
space physics'' in my oral remarks. If necessary, please consult my 
thesis advisor, Professor Joseph Kunc at [email protected] for further 
details about my educational background.
---------------------------------------------------------------------------
    \1\ Which was awarded the 1989 nation-wide IEEE Nuclear and Plasma 
Sciences Society Graduate Scholastic Award and the 1991 Rockwell 
Dennis Hunt Scholastic Award for ``the most representative PhD thesis 
work'' at the University of Southern California.
---------------------------------------------------------------------------
    I would add that the quality of knowledge about climate science or 
any other subject of interest must be judged on its own merits, and 
does not and must not be determined by invoking the amount of formal 
schooling or consensus viewpoints adopted by particular interest 
groups.
    My research interest in and learning about paleoclimatology have 
been obtained mainly through the following individuals and sources:
    (1) Professor Eric Posmentier ([email protected]), 
who is also my colleague.
    (2) Professor David Legates ([email protected]), who is also my 
colleague.
    (3) Participation, both as a student and as lecturer, in numerous 
national and international workshops, conferences and summer schools 
including (a) the 1993 NATO Advanced Research Workshop on ``Solar 
engine and its influence on terrestrial atmosphere and climate'', (b) 
the 1994 NASA-NOAA Summer School on Processes of Global Change, (c) 
the 1996 (French) CNRS ``Chaos et Fractales dans l'activite 
Solaire'', (d) the 2000 ``1st Solar and Space Weather Euroconference: 
The Solar Cycle and Terrestrial Climate,'' and other specialized 
meetings.
    (4) Many other scientists also have been helpful in my eager 
learning of the subject: the late Professor Jean Grove (Girton College, 
Cambridge University), Professor Jim Kennett (University of California 
Santa Barbara), Professor David J. A. Evans (University of Glasgow), 
Professor Lowell Stott (University of Southern California), Professor 
Hong-Chun Li (University of Southern California), Professor Reid Bryson 
(University of Wisconsin), Professor Henri Grissino-Mayer (University 
of Kentucky), Professor Emi Ito (University of Minnesota), Dr. ShaoPeng 
Huang (University of Michigan), Dr. Zhonghui Liu (Brown University), 
Dr. Ming Tang (Institute of Geology and Geophysics, Chinese Academy of 
Sciences), Dr. Yang Bao (Cold and Arid Regions Environmental and 
Engineering Research Institute, Chinese Academy of Sciences), and 
Professor Bin Wang (University of Hawaii).

    Question 3. Do you maintain that the proxy-based temperature 
reconstructions of the Mann and colleagues do not extend into the 
latter half of the 20th century?
    Response. The proxy-based temperature reconstructions for the 
Northern Hemisphere by Mann et al. (1998, Nature, vol. 392, 779-782) 
and Mann et al. (1999, Geophysical Research Letters, vol. 26, 759-762) 
extend from 1400-1980 and 1000-1980, respectively. So it is true that 
those proxy-based temperature series did not cover the 1981-2000 
interval of the late 20th century.
    Here is what close colleagues and co-authors (Bradley and Hughes) 
of Professor Mann admitted in their independent (i.e., without Prof. 
Mann as co-author) and updated publication, ``A caveat to [our] 
conclusion [about northern hemisphere temperature change over the last 
1000 years] is that the current proxy-based reconstructions do not 
extend to the end of the 20th century, but are patched on to the 
instrumental record of the last 2-3 decades [emphasis added]. This is 
necessary because many paleo data sets were collected in the 1960's and 
1970's, and have not been up-dated [NOTE: this statement by Bradley et 
al. (2003) referred primarily to the tree-ring data base from the 
International Tree-Ring Data base.], so a direct proxy-based comparison 
of the 1990's with earlier periods is not yet possible.'' [p. 116 of 
Bradley et al., 2003, In: Alverson, K., R.S. Bradley and T.F. Pedersen 
(eds.) Paleoclimate, Global Change and the Future. Springer Verlag, 
Berlin, 105-149]
    Agreeing with discussion on p. 260-261 of Soon et al. (2003), 
Bradley et al. (2003) cautioned that ``in the case of tree rings from 
some areas in high latitudes, the decadal time-scale climatic 
relationships prevalent for most of this century appear to have changed 
in recent decades, possibly because increasing aridity &/or snowcover 
changes at high latitudes may have already transferred the ecological 
responses of trees to climate (cf. Jacoby and D'Arrigo 1995; Briffa et 
al. 1998). For example, near the northern tree limit in Siberia, this 
changing relationship can be accounted for by a century-long trend to 
greater winter snowfall. This has led to delayed snowmelt and thawing 
of the active layer in this region of extensive permafrost, resulting 
in later onset of the growing season (Vaganov et al. 1999). It is not 
yet known how widely this explanation might apply to the other regions 
where partial decoupling has been observed, but regardless of the 
cause, it raises the question as to whether there might have been 
periods in the past when the tree ring-climate response changes, and 
what impact such changes might have on paleotemperature reconstructions 
based largely on tree ring data.'' (p. 116-117).
    Bradley et al. (2003) also worried that ``Paleoclimate research has 
had a strong northern hemisphere, extra-tropical focus (but even there 
the record is poorly known in many areas before the 17th century). 
There are very few high resolution paleoclimatic records from the 
tropics, or from the extra-tropical southern hemisphere, which leaves 
many questions (such as the nature of climate in Medieval times) 
unanswered.'' (p. 141). Bradley et al. continued ``All large-scale 
paleotemperature reconstructions suffer from a lack of data at low latitudes. In 
fact, most ``northern hemisphere'' reconstructions do not include data 
from the southern half of the region (i.e. [missing comma] areas south 
of 30N). Furthermore, there are so few data sets from southern 
hemisphere that it is not yet possible to reconstruct a meaningful 
``global'' record of temperature variability beyond the period of 
instrumental records. For the northern hemisphere records, it must be 
recognized that the errors estimated for the reconstructions of Mann et 
al. (1999) and Briffa et al. (2001) are minimum estimates, based on the 
statistical uncertainties inherent in the methods used. These can be 
reduced by the use of additional data (with better spatial 
representation) that incorporate stronger temperature signals. However, 
there will always be additional uncertainties that relate to issues 
such as the constancy of the proxy-climate function over time, and the 
extent to which modern climate modes (i.e., those that occurred during 
the calibration interval) represent the full range of climate 
variability in the past [i.e., similar unresolved research questions 
had been raised in p. 239-242 and p. 258-264 of Soon et al. 2003]. 
There is evidence that in recent decades some high latitude trees no 
longer capture low frequency variability as well as in earlier decades 
of the 20th century (as discussed below in Section 6.8) which leads to 
concerns over the extent to which this may have also been true in the 
more distant past. If this was a problem (and currently we are not 
certain of that) it could result in an inaccurate representation of low 
frequency temperature changes in the past. Similarly, if former 
climates were characterized by modes of variability not seen in the 
calibration period, it is unlikely that the methods now in use would 
reconstruct those intervals accurately. It may be possible to constrain 
these uncertainties through a range of regional studies (for example, 
to examine modes of past variability) and by calibration over different 
time intervals, but not all uncertainty can be eliminated and so 
current margins of error must be considered as minimum estimates 
[meaning the actual range of error is larger than shown in Mann et al. 
1999 or the IPCC TAR's charts].'' (p. 114-115).
    It is also very important to heed warnings and cautions from other 
serious researchers about not over stating the true confidence of a 
reconstructed climatic result based on indirect proxies. Esper et al. 
(2003, Climate Dynamics, vol. 21, 699-706) modestly apprised of the 
current situation in reconstructing long-term climatic information from 
tree rings: ``Although these long-term trends agree well with ECS 
[i.e., Esper, Cook, Schweingruber in 2002, Science, vol. 295, 2250-
2253], the amplitude of the multi-centennial scale variations is, 
however, not understood. This is because (1) no single multi-centennial 
scale chronology could be built that is not systematically biased in 
the low frequency domain, and (2) no evidence exists that would support 
an estimation of the biases either in the LTM [Long-term mean 
standardization] nor in the RCS [Regional curve standardization] multi-
centennial chronologies. Consequently, we also avoided providing formal 
climate calibration and verification statistics of the chronologies. 
Note also that the climate signal of the chronologies' low frequency 
component could not be statistically verified anyway. This is because 
the high autocorrelations, when comparing lower frequency trends, 
significantly reduce the degrees of freedom valid for correlation 
analyses. We believe that a formal calibration/verification/transfer 
function approach would leave the impression that the long-term climate 
history for the Tien Shan [i.e., the location of Esper and five 
colleagues' study] is entirely understood, which is not the case. 
Further research is needed to estimate the amplitude of temperature 
variation in the Alai Range [south of Kirghizia] over the last 
millennium.'' (p. 705)

    Question 4. Do you claim that the Mann study does not reconstruct 
regional patterns of temperature change in past centuries?
    Response. In Soon et al. (2003, Energy & Environment, vol. 14, 233-
296), I and my colleagues cautioned that the regional temperature 
patterns resulting from Mann and colleagues' methodology are too 
severely restricted by the calibration. In particular, we are concerned 
that the regional (and hence larger spatial-scale average) variability 
of temperature on multidecadal and centennial time scales deduced from 
such a method will be underestimated.
    Recently, the methodology of Mann et al. (1998) has been seriously 
challenged by McIntyre and McKitrick (2003, Energy & Environment, vol. 
14, 751-771) in that ``poor data handling, obsolete data and incorrect 
calculation of principal components'' were shown as the errors and 
defects of Mann et al.'s paper. The exchange between Mann and 
colleagues and McIntyre and McKitrick is ongoing, but the use of 
obsolete data is a clear case of misrepresentation of the regional basis 
of change in Mann et al.'s work. Further problems in Mann et al. (1998) are 
outlined under Question No. 13 below. Additional documentation 
(including responses by Prof. Mann and his colleagues) and updates can 
be found in http://www.uoguelph.ca/rmckitri/research/trc.html.

    Question 5. Do you maintain that the Mann study extrapolated global 
temperature estimates from the northern hemisphere?
    Response. I have not seen any global temperature curves presented 
in the two earlier studies by Mann et al. (1998 and 1999). But please 
consider the deep concerns about the lack of proxy data especially over 
the tropics (30N to 30S) and the southern hemisphere raised by Soon et 
al. (2003) and even in the independent paper by Professor Mann's close 
colleagues and co-authors (Bradley and Hughes), i.e., in Bradley et al. 
(2003), discussed under Question No. 3 above.
    ``Global'' temperature estimates, based on indirect climate 
proxies, from 200-1980 were shown in Mann and Jones (2003, Geophysical 
Research Letters, vol. 30 (15), 1820) as Figure 2c. But I am unsure if 
the temperature series presented by Mann and Jones (2003) could 
adequately represent the variability over the whole globe since it was 
openly admitted that the proxies used covered only 8 ``distinct 
regions'' in the Northern Hemisphere and 5 for the Southern Hemisphere 
(see the coverage of proxies shown in Figure 1 of Mann and Jones, 
2003).
    More importantly, Soon et al. (2004, Geophysical Research Letters, 
vol. 31, L03209) showed that the 40-year smoothed instrumental 
temperature trend for the Northern Hemisphere shown as Figure 2a of 
Mann and Jones (2003) has a physically implausibly high value at year 
2000 (see more discussion in Question No. 6 below). We caution that the 
extremely rapid rate of warming of 1 to 2.5 °C per decade implied 
by the published results by Mann and his colleagues over the last one 
to two years [comparing Mann and Jones (2003) with both Mann (2002, 
Science, vol. 297, 1481-1482) and Mann et al. (2003, Eos, 84(27), 256-
257)], is most likely due to the artifacts of methodology and their 
procedure of trend smoothing. I am submitting the pdf file (SLB-GRL04-
NHtempTrend.pdf) of Soon et al. (2004) for the record of the committee.

    Question 6. Do you maintain that historical and instrumental 
temperature records that are available indicate colder northern 
hemisphere temperature conditions than the Mann et al northern 
hemisphere temperature reconstruction in the past centuries?
    Response. I am not sure about the meaning of this question. But 
when contrasted with borehole-based reconstruction, the Northern 
Hemisphere terrestrial temperatures produced by Mann et al. (1998, 
(1999) over the last 500 years may have been too warm by about 0.4 °C 
during the 17th-18th century (see Huang et al. 2000, Nature, vol. 403, 
756-758). Recent attempts by Mann et al. (2003, Journal of Geophysical 
Research, vol. 108. (D7), 4203) and Mann and Schmidt (2003, Geophysical 
Research Letters, vol. 30 (12), 1607) to rejustify and defend the Mann 
et al. (1998, 1999) results have been shown to be either flawed or 
invalid by Chapman et al. (2004, Geophysical Research Letters, vol. 31, 
L07205) and by Pollack and Smerdon (2003, Geophysical Research Abstract 
of EGS, vol. 6, 06345). The eventual fact will no doubt emerge with 
increased understanding, but Chapman et al. (2004) warned that ``A 
second misleading analysis made by Mann and Schmidt [2003] concerns use 
of end-points in reaching a numerical conclusion. . . . It is based on 
using end points in computing changes in an oscillating time series, 
and is just bad science.''
    With regard to instrumental thermometer data of the past 100-150 
years, it is important to note that Soon et al. (2004) has recently 
shown that the 40-year smoothed Northern Hemisphere temperature trend 
shown in Mann and Jones (2003) has a physically implausibly high value 
at the year 2000 endpoint especially when studied in context with 
previous published results by Mann et al. (2003, Eos, vol. 84 (27), 
256-257) and Mann (2002, Science, vol. 297, 1481-1482). This important 
updated information, admittedly with the benefit of hindsight, together 
with the works by Chapman et al. (2004) and McIntyre and McKitrick 
(2003), showed clearly that the Northern Hemisphere temperature trends, 
either proxy-based or instrumental, derived by Mann et al. (1998, 1999) 
and Mann and Jones (2003) are not reliable.

    Question 7. Is it your understanding that during the mid-Holocene 
optimum period (the period from 4000-7000 B.C.) that annual mean global 
temperatures were more than a degree C warmer than the present day?
    Response. Again, I am not sure if there are sufficient proxy data 
that would allow a meaningful quantitative estimate of annual mean 
global temperatures back six to nine thousand years. But in a new paper 
for the Quaternary Science Reviews, Darrell Kaufman and 29 co-authors 
(2004, Quaternary Science Reviews, vol. 23, 529-560) found that there 
is indeed clear evidence for warmer-than-present conditions during the 
Holocene at 120 out of 140 sites they compiled across the Western 
Hemisphere of the Arctic. Kaufman et al. (2004) estimated that, at the 
16 terrestrial sites where quantitative data are available, the local 
Holocene Thermal Maximum summer temperatures were about 
1.6 ± 0.8 °C higher than the average of the 20th century. The 
coarse temperature map sketched on NOAA's Paleoclimatology web 
site, http://www.ngdc.noaa.gov/paleo/globalwarming/images/polarbigb.gif, 
suggests that the summer temperatures 6000 years ago may have been 2 to 
4 °C warmer than present in the other sector (Eastern Hemisphere) of 
the Arctic.

    Question 8. As a climatologist, can you explain what kind of 
quantitative analysis it takes to determine whether or not the last 50 
years has been unusually warm compared to the last 1000 years?
    Response. The theoretical requirement is fairly simple: (a) find 
local and regional proxies that are sensitive to variations of 
temperature on decadal, multidecadal, and centennial timescales; and (b) 
have sufficient spatial coverage of these local and regional proxies. 
Then one would be able to compare the last 50 years of the 1000-year 
record with the previous 950 years.
    Soon et al. (2003) had indeed initiated an independent effort in 
this direction and concluded that a truly global or hemispherically 
averaged temperature record for the past 1000 years is not yet 
forthcoming because of the large and disparate range of responses of 
the indirect local and regional proxies to temperature, such that a 
robust ability of different proxies to capture all the necessary 
scales of variability cannot yet be confirmed. The main problem I 
foresee in having any definitive answers for now is that the 
statistical association of each proxy with climatic variables like 
temperature can itself be variable, changing depending on the location 
and time interval. But I am not sure that a sole focus on temperature 
as the measure of ``climate'' is sensible; it may be unnecessarily narrow.
    In Soon et al. (2003), we consider climate to be more than just 
temperature so we did not narrowly restrict ourselves to only 
temperature-sensitive proxies. For example, in addition to temperature, 
we are equally concerned about expansion and reduction of forested and 
desert-prone areas, tree-line growth limit, sea ice changes, balances 
of ice accumulation and ablation in mountain glaciers, and so on. When 
studying the ice balance of a glacier, it is important to insist that 
although glaciers are very important indicators of climate change over 
a rather long time-scale, they are not simply thermometers, as is often 
assumed in heated discussions pointing to them as evidence for global 
warming by carbon dioxide (see additional discussion on factors, 
especially atmospheric carbon dioxide, in determining Earth's climate 
and its change under Questions No. 19, 20, 25, 30 and 35 below). 
Examples include statements by Will Steffen, director of the 
International Geosphere-Biosphere Program, ``Tropical glaciers are a 
bellwether of human influence on the Earth system'' (quoted in the 
article ``The 
melting snows of Kilimanjaro'' by Irion, 2001, Science, vol. 291, 1690-
1691) or by Professor Lonnie Thompson, Ohio State University,

          ``We have long predicted that the first signs of changes 
        caused by global warming would appear at the few fragile, high-
        altitude ice caps and glaciers within the tropics . . . [t]hese 
        findings confirm those predictions. We need to take the first 
        steps to reduce carbon dioxide emissions. We are currently 
        doing nothing. In fact, as a result of energy crisis in 
        California--and probably in the rest of the country by this 
        summer--we will be investing even more in fuel-burning power 
        plants. That will put more power in the grid but, at the same 
        times it will add carbon dioxide to the atmosphere, amplifying 
        the problem'' (quoted in Ohio State University's press release, 
        http://www.acs.ohio-state.edu/units/research/archive/
        glacgone.htm).

    A clarification about the physical understanding of modern glacier 
retreats and climate change, especially those on Kilimanjaro, is 
necessary and has been forthcoming with important research progress. 
First, Molg et al. (2003, Journal of Geophysical Research, vol. 108 
(D23), 4731) recently concluded that their study:

        ``highlights that modern glacier retreat on Kilimanjaro is much 
        more complex than simply attributable to `global warming only', 
        a finding that conforms with the general character of glacier 
        retreat in the global tropic [Kaser, 1999]: a process driven by 
        a complex combination of changes in several different climatic 
        parameters . . . with humidity-related variables dominating 
        this combination.''

In another new paper, Kaser et al. (2004, International Journal of 
Climatology, ``Modern glacier retreat on Kilimanjaro as evidence of 
climate change: Observations and facts'', vol. 24, 329-339; available 
from http://geowww.uibk.ac.at/glacio/LITERATUR/index.html) provided 
clear evidence that neither added longwave radiation from a direct 
increase of atmospheric CO2 nor air temperature was the key 
variable behind the observed changes, as revealed in this long but 
highly informative passage:

          ``Since the scientific exploration of Kilimanjaro began in 
        1887, when Hans Meyer first ascended the mountain (not to the 
        top at this time, but to the crater rim), a central theme of 
        published research has been the drastic recession of 
        Kilimanjaro's glaciers (e.g., Meyer, 1891, 1900; Klute, 1920; 
        Gilman, 1923; Jager, 1931; Geilinger, 1936; Hunt, 1947; Spink, 
        1949; Humphries, 1959; Downie and Wilkinson, 1972; Hastenrath, 
        1984; Osmaston, 1989; Hastenrath and Greischar, 1997). Early 
        reports describe the formation of notches, splitting up and 
        disconnection of ice bodies, and measurements of glacier snout 
        retreat on single glaciers, while later books and papers 
        advance to reconstructing glacier surface areas. . . . Today, 
        as in the past, Kilimanjaro's glaciers are markedly 
        characterized by features such as penitentes, cliffs (Figure 
        3a/b) [not reproduced here], and sharp edges, all resulting 
        from strong differential ablation. These features illustrate 
        the absolute predominance [emphasis added] of incoming 
        shortwave radiation and latent heat flux in providing the 
        energy for ablation (Kraus, 1972). A positive heat flux from 
        either longwave radiation or sensible heat flux, if available, 
        would round-off and destroy the observed features within a very 
        short time ranging from hours to days. On the other hand, if 
        destroyed, the features could only be sculptured again under 
        very particular circumstances and over a long time. Thus, the 
        existence of these features indicates that the present summit 
        glaciers are not experiencing ablation due to sensible heat 
        (i.e., from positive air temperature). Additional support for 
        this is provided by the Northern Icefield air temperature 
        recorded from February 2000 to July 2002, which never exceeded 
        -1.6°C, and by the presence of permafrost at 4,700 m below 
        Arrow Glacier on the western slope . . .''

Kaser et al. (2004) continue with this ``synopsis of interpretations 
and facts'':

          ``A synopsis of (i) proxy data indicating changes in East 
        African climate since ca. 1850, (ii) 20th century instrumental 
        data (temperature and precipitation), and (iii) the 
        observations and interpretations made during two periods of 
        fieldwork (June 2001 and July 2002) strongly support the 
        following scenario. Retreat from a maximum extent of 
        Kilimanjaro's glaciers started shortly before Hans Meyer and 
        Ludwig Purtscheller visited the summit for the first time in 
        1889, caused by an abrupt climate change to markedly drier 
        conditions around 1880. Intensified dry seasons accelerated 
        ablation on the respectively illuminated vertical walls left in 
        the hole on top by Reusch Crater as a result of volcanic 
        activity [emphasis added]. The development of vertical features 
        may also have started on the outer margins of the plateau 
        glaciers before 1900, primarily as the formation of notches, as 
        explicitly reported following field research in 1898 and 1912 
        (Meyer, 1900; Klute, 1920). A current example of such a notch 
        development is the hole in the Northern Icefield (see Figure 
        2). Once started, the lateral retreat was unstoppable, 
        maintained by solar radiation despite less negative mass 
        balance conditions on horizontal glacier surfaces, and will 
        come to an end only when the glaciers on the summit plateau 
        have disappeared. This is most probable within next decades, if 
        the trend revealed in Figure 1 continues. Positive air 
        temperatures have not contributed to the recession process on 
        the summit so far. The rather independent slope glaciers have 
        retreated far above the elevation of their thermal readiness, 
        responding to dry conditions. If present precipitation regime 
        persists, these glaciers will most probably survive in 
        positions and extents not much different from today. This is 
        supported by the area determinations in Thompson's et al. 
        (2002) map, which indicate that slope glaciers retreated more 
        from 1912 to 1952 than since then. From a hydrological point of 
        view, melt water from Kibo's glaciers has been of little 
        importance to the lowland in modern times. Most glacier 
        ablation is due to sublimation, and where ice does melt it 
        immediately evaporates into the atmosphere. Absolutely no signs 
        of runoff can be found on the summit plateau, and only very 
        small rivers discharge from the slope glaciers. Rainfall 
        reaches a maximum amount at about 2,500 m a.s.l. [above sea 
        level] (Coutts, 1969), which primarily feeds the springs at low 
        elevation on the mountain; one estimate attributes 95 percent 
        of such water to a forest origin (Lambrechts et al., 2002). The 
        scenario presented offers a concept that implies climatological 
        processes other than increased air temperature [emphasis added] 
        govern glacier retreat on Kilimanjaro in a direct manner. 
        However, it does not rule out that these processes may be 
        linked to temperature variations in other tropical regions, 
        e.g., in the Indian Ocean (Latif et al., 1999; Black et al., 
        2003).''

    Lindzen (2002, Geophysical Research Letters, vol. 29, paper 
2001GL014360) further added that ``Recent papers show that deep ocean 
temperatures have increased somewhat since 1950, and that the increase 
is compatible with predictions from coupled GCMs [General Circulation 
Models]. The inference presented is that this degree of compatibility 
constitutes a significant test of the models. . . . [But] it would 
appear from the present simple model (which is similar to what the IPCC 
uses to evaluate scenarios) that the ocean temperature change largely 
reflects only the fact that surface temperature change is made to 
correspond to observations, and says almost nothing about model climate 
sensitivity. . . . It must be added that we are dealing with observed 
surface warming that has been going on for over a century. The oceanic 
temperature change [at depth of 475 m or so] over the period reflects 
earlier temperature change at the surface. How early depends on the 
rate at which surface signals penetrate the ocean.'' In other words, 
the recently noted warming of the deeper ocean is not proof of global 
surface and atmospheric warming caused by increasing CO2 in 
the air, because the parameters of climate sensitivity and the rate of 
ocean heat uptake are not sufficiently well quantified. In addition, if 
the earlier ocean-surface warming mentioned by Lindzen was indeed 
initiated sufficiently long ago, then that change could not be 
associated with man-made CO2 forcing.
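
    To illustrate Lindzen's general point about lag (this is not 
Lindzen's actual model), the following minimal one-dimensional 
diffusion sketch in Python assumes an effective vertical diffusivity 
and a simple surface warming ramp; both numbers are illustrative 
assumptions, not values taken from the cited paper.

    # A minimal sketch (not Lindzen's model): 1-D vertical heat diffusion
    # beneath a prescribed surface-temperature ramp, illustrating that the
    # anomaly at depth reflects earlier surface change. The diffusivity is
    # an assumed effective value, not a measurement.
    import numpy as np

    kappa = 1.0e-4            # effective vertical diffusivity, m^2/s (assumed)
    dz, nz = 25.0, 80         # 25 m layers down to 2000 m
    dt = 5 * 86400.0          # 5-day time step, s
    years = 150
    nsteps = int(years * 365.25 * 86400.0 / dt)

    alpha = kappa * dt / dz**2          # stability requires alpha < 0.5
    assert alpha < 0.5
    T = np.zeros(nz)                    # temperature anomaly profile, K
    k475 = int(475.0 / dz)              # layer nearest 475 m depth

    for step in range(nsteps):
        t_years = step * dt / (365.25 * 86400.0)
        T[0] = 0.005 * t_years          # surface ramp: +0.5 K per century (illustrative)
        # explicit diffusion update of the interior; bottom kept insulated
        T[1:-1] += alpha * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        T[-1] = T[-2]

    print(f"surface anomaly after {years} yr: {T[0]:.2f} K")
    print(f"anomaly near 475 m depth:         {T[k475]:.2f} K")

    The smaller, delayed anomaly at depth is the point at issue: what is 
seen at roughly 475 m today was set by surface changes some decades 
earlier, with the delay governed by the assumed rate of penetration.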

    Question 9. The IPCC has found that the late 20th century is the 
warmest period in the last 1000 years, for average temperature in the 
northern hemisphere. Does your paper provide a quantitative analysis of 
average temperatures for the northern hemisphere for this specific time 
period--that is, for the latter half of the 20th century?
    Response. It should be understood that (1) the conclusion of the 
IPCC Working Group I's Third Assessment Report (2001; TAR), (2) the 
evidence shown in Figure 1b of the Summary for Policymakers, (3) Figure 
5 of the Technical Summary, and (4) Figure 2.20 in Chapter 2 of TAR 
were all derived directly from the conclusion and Figure 3a of Mann et 
al. (1999). Therefore all comments and criticisms presented in this Q&A 
about Mann et al. (1999) apply to the IPCC TAR's conclusion as well. In 
addition, Soon et al. (2004) recently cautioned that the 40-year 
smoothed northern hemisphere temperature trend shown in Figure 2.21 of 
TAR (2001) cannot be replicated according to the methodology described 
in the caption of that figure. This failure of replication raises a 
significant concern about the quality of the scientific effort behind 
the production of Figure 2.21 in TAR (2001).
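
    To illustrate the kind of smoothing at issue, the sketch below 
applies a simple 40-year running mean to a synthetic annual series in 
Python. The series and the two end-padding choices are illustrative 
assumptions only; they are not the method described in the TAR figure 
caption, nor the procedure tested by Soon et al. (2004). The point is 
merely that the smoothed values near the end of a record depend 
strongly on how the ends are treated, which is where replication 
questions typically arise.

    # A minimal sketch of a 40-year running-mean smoother (illustrative only).
    import numpy as np

    def running_mean(series, window=40, pad="reflect"):
        # Centered running mean; the end-padding choice changes the smoothed
        # values near the ends of the record.
        half = window // 2
        padded = np.pad(series, half, mode=pad)
        kernel = np.ones(window) / window
        smoothed = np.convolve(padded, kernel, mode="same")
        return smoothed[half:-half]

    rng = np.random.default_rng(0)
    years = np.arange(1000, 2001)
    annual = (0.2 * np.sin(2 * np.pi * (years - 1000) / 250.0)
              + 0.1 * rng.standard_normal(years.size))

    for mode in ("reflect", "edge"):
        sm = running_mean(annual, 40, pad=mode)
        print(mode, round(float(sm[-1]), 3))   # end value differs with padding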
    The answer to the second part of your direct question is no. Here 
are the related reasons why a confident estimate of the averaged 
northern hemisphere temperature for the full 1000 years (including the 
full 20th century) is not yet possible, despite what had been claimed 
by Mann et al. (1999). First, several authors, including those detailed 
in section 5.1 of Soon et al. (2003) and those pointed out under 
Question No. 6, have shown that the mathematically derived 1000-year 
temperature series of Mann et al. (1999) significantly underestimates 
multidecadal- and centennial-scale changes. Second, the focus of Soon 
et al. (2003) is to derive an understanding of climatic change on local 
and regional spatial scales, rather than over the whole northern 
hemisphere per se, because those are, in a practical sense, the most 
relevant measures of change. In addition, we provided a first-order 
attempt to collect all available climate proxies relevant to local and 
regional climatic changes, not restricted to temperature alone. More 
pertinent to your question, however, is the fact discussed in Soon et 
al. (2003) that different proxies respond with differing sensitivities 
to different climatic variables, seasons, and spatial and temporal 
scales, so that a convenient derivation of a self-consistent northern 
hemisphere annual mean temperature for the full 1000 years, desirable 
as that result may be, is not yet possible.

    Question 10. Does your paper provide any quantitative analysis of 
temperature records specifically for the last 50 years of the 20th 
century?
    Response. Soon et al. (2003) considered all available proxy records 
with no particular prejudice. If the individual proxy record covers up 
to the last 50 years of the 20th century, then quantitative comparisons 
are performed, mostly according to the statements from the original 
authors. Please consider some of the detailed quantitative discussion 
in section 4 of Soon et al. (2003) and the qualitative results compiled 
in Table 1 of that paper.

    Question 11. In an article in the Atlanta Journal Constitution 
(June 1, 2003), you were quoted as acknowledging during a question 
period at a previous Senate luncheon that your research does not 
provide a comprehensive picture of the Earth's temperature record and 
that you questioned whether that is even possible, and that you did 
not, ``. . . see how Mann and the others could `calibrate' the various 
proxy records for comparison.'' How then does your analysis provide a 
comprehensive picture of Earth's temperature record or have any bearing 
on the finding by the IPCC, that the late 20th century is the warmest 
in the last 1000 years?
    Response. Thank you for referencing the article. I must first state 
for the record that, contrary to the claim in the Atlanta Journal 
Constitution (June 1, 2003) article (http://www.ajc.com/business/
content/business/0603/01warming.html), the writer never conducted a 
telephone interview with me, as he claimed. No such conversation took 
place, and I am rather shocked by this false claim. The error has gone 
uncorrected until now.
    The strengths and weaknesses of my research works are fully 
discussed in Soon et al. (2003). The paper documented detailed local 
and regional changes in several climatic variables to try to obtain a 
broader understanding of climate variability. We concluded that:

          ``Because the nature of the various proxy climate indicators 
        are so different, the results cannot be combined into a simple 
        hemispheric or global quantitative composite. However, 
        considered as an ensemble of individual observations, an 
        assemblage of the local representations of climate establishes 
        the reality of both the Little Ice Age and the Medieval Warm 
        Period as climatic anomalies with worldwide imprints, extending 
        earlier results by Bryson et al. (1963), Lamb (1965), and 
        numerous other research efforts. Furthermore, these individual 
        proxies are used to determine whether the 20th century is the 
        warmest century of the 2nd Millennium at a variety of globally 
        dispersed locations. Many records reveal that the 20th century 
        is likely not the warmest nor a uniquely extreme climatic 
        period of the last millennium, although it is clear that human 
        activity has significantly impacted some local environments.''

    The difficult problem of calibrating proxies of differing types and 
sensitivities against climatic variables is discussed in Soon et al. 
(2003), and some criticisms of the weaknesses of the Mann et al. (1999) 
reconstruction, and of the related IPCC TAR conclusion, are listed 
especially under Questions No. 6 and 9.

    Question 12. Do you believe that appropriate statistical methods do 
not exist for calibrating statistical predictors, including climate 
proxy records, against a target variable, such as the modern 
instrumental temperature record?
    Response. True progress in the field of paleoclimatology will 
certainly require better and more robust means of interpreting and 
quantifying the variations and changes seen in each high-resolution 
proxy record. The issue is not merely a problem awaiting solution 
through appropriate statistical methods such as the EOF methodology 
adopted by Mann et al. (1998, 1999). On pp. 241-242 of Soon et al. 
(2003), we briefly outlined our straightforward approach and contrasted 
it with the one used by Mann and colleagues, which does not necessarily 
lead to results with physical meaning.
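
    For illustration of the generic calibration idea referred to in the 
question, the sketch below fits one synthetic proxy to a synthetic 
instrumental series by ordinary least squares over an overlap period. 
It is neither the EOF approach of Mann et al. (1998, 1999) nor the 
approach of Soon et al. (2003); all series and parameters are invented 
for the example, and any real calibration would also have to test 
whether the proxy-temperature relation holds outside the calibration 
window.

    # A generic single-proxy calibration sketch (synthetic data only).
    import numpy as np

    rng = np.random.default_rng(1)
    n_total, n_overlap = 1000, 120      # proxy years; instrumental years at the end

    true_temp = np.cumsum(0.02 * rng.standard_normal(n_total))    # synthetic "climate"
    proxy = 1.5 * true_temp + 0.3 * rng.standard_normal(n_total)  # imperfect proxy response
    instrumental = true_temp[-n_overlap:] + 0.05 * rng.standard_normal(n_overlap)

    # least-squares fit of instrumental temperature on the proxy over the overlap
    slope, intercept = np.polyfit(proxy[-n_overlap:], instrumental, 1)
    reconstruction = slope * proxy + intercept

    # a simple verification statistic over the overlap period
    r = np.corrcoef(reconstruction[-n_overlap:], instrumental)[0, 1]
    print(f"calibration slope = {slope:.2f}, overlap correlation r = {r:.2f}")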

    Question 13. In determining whether the temperature of the 
``Medieval Warm Period'' was warmer than the 20th century, does your 
study analyze whether a 50-year period is either warmer or wetter or 
drier than the 20th century? If so, why is it appropriate to use 
indicators of drought and precipitation directly to draw inferences of 
past temperatures? Please list peer-reviewed works that specifically 
support the use of these indicators for inferring past temperature.
    Response. The detailed discussion behind our usage of the terms 
``Medieval Warm Period'' and ``Little Ice Age'' is given in Soon et al. 
(2003). We are mindful that the two terms should certainly include 
physical criteria and evidence from the thermal field. But we emphasize 
that great bias would result if those thermal anomalies were 
dissociated from hydrological, cryospheric, chemical, and biological 
factors of change. So indeed our description of a Medieval Climatic 
Anomaly (see a similar sentiment later reported by Bradley et al., 
2003, Science, vol. 302, 404-405) in Soon et al. (2003) includes a 
warmer period that contains either drought or flooding conditions, 
depending on location.
    With regard to the last part of your question, I would answer by 
detailing only one example--Mann et al. (1998). This influential study, 
which was indeed relied upon by the IPCC TAR (2001), used both direct 
precipitation measurements and precipitation proxies as temperature 
indicators. These include historical precipitation measurements in 11 
grid cells, two coral proxies (reported in Mann et al. [1998] as 
precipitation proxies; see http://www.ngdc.noaa.gov/paleo/ei/data--
supp.html for this and following references), two ice core proxies, 
three reconstructions of spring precipitation in the southeastern 
United States by Stahle and Cleaveland from tree ring data, 12 
principal component series for tree rings in the southwestern United 
States and Mexico reported as precipitation proxies by Stahle and 
Cleaveland (and Mann et al., 1998), and one tree ring series in Java--
making a total of 31 precipitation series used as proxies in the 
temperature reconstruction of Mann et al. (1998). In this peer-reviewed 
article, for the 
precipitation data in a grid cell in New England, the researchers 
apparently used historical data from Paris, France (please see Figure 2 
of McIntyre and McKitrick, 2003 and their discussion on pp. 758-759). 
For a grid cell near Washington DC, the researchers used historical 
data from Toulouse, France. For a grid cell in Spain, the researchers 
used precipitation data from Marseilles, France. Of the 11 
precipitation series used in Mann et al. (1998), only one series 
(Madras, India) is correctly located. The precipitation data used by 
these authors cannot be identified in the source cited in Mann et al. 
(1998). While precipitation data and precipitation-related proxies can 
be instructive in providing information on the past distribution of 
moisture and circulation patterns (and thus temperature), it is 
important to identify the series used correctly, and not to use data 
from the wrong continent, in historical reconstructions.

    Question 14. Do you maintain that any two 50-year periods that 
occur within a multi-century interval can be considered 'coincident' 
from a climatic point of view?
    Response. The question raised here--whether any two 50-year periods 
in any two regions can be considered related from a climatic point of 
view--is both important and interesting. But the answer will depend 
strongly on the nature of the forcings and feedbacks involved. If 
longer-term cryospheric or oceanic processes are involved, then the 
answer would be yes.

    Question 15. Do your two recent studies employ an analysis (that 
is, a statistical or analytical operation performed upon numerical 
data) of a single proxy climate record?
    Response. The meaning of this question is not entirely clear to me. 
But, in the context of what is asked, I would say yes.

    Question 16. Has your study produced a quantitative reconstruction 
of past temperature patterns? Do you have a measure of uncertainty or 
verification in your description of past temperatures?
    Response. The results and conclusions of Soon et al. (2003) are 
best judged from the paper itself. Quantitative assessments of local 
and regional changes through the climatic proxies are discussed in 
section 4 of that paper, and a qualitative picture is described in 
Figures 1, 2 and 3. Again, Soon et al. (2003) did not try to distill 
all the collected proxies down to a strict temperature-only result, 
since we are interested in a broader understanding of climate 
variability. Parts of the answers given under Questions No. 9 and 11 
help elaborate what was done by Soon et al. (2003). I would also direct 
your attention to the two warnings, listed under Question No. 3, by 
Bradley et al. (2003) and Esper et al. (2003) concerning any undue 
overconfidence in promoting quantitative certainty in reconstructions 
of past temperature through the highly imprecise black boxes of 
indirect proxies.

    Question 17. Your study indicates that you have compiled the 
results of hundreds of previous paleo-climate studies. Have you 
verified your interpretation of the hundreds of studies with any of the 
authors/scientists involved in those studies? If so, how many?
    Response. Specific authors and scientists that provided help in our 
work were listed in the acknowledgement section (p. 272) of Soon et al. 
(2003). We have also received generous help and comments from several 
scientists who are certainly highly qualified in terms of paleoclimatic 
studies. But the ultimate quality and soundness of our research shall 
always be our own responsibility.
    In the September 5, 2003 Chronicle of Higher Education article (by 
Richard Monastersky), there were indeed two very serious accusations 
suggesting that Soon et al. (2003) had misrepresented or abused the 
conclusions of two original authors whose work we had cited. Our 
corrections and explanations in response to these unfortunately false 
claims can be studied in the documentation listed at http://cfa-
www.harvard.edu/wsoon/ChronicleHigherEducation03-d (read especially 
Sep12-lettoCHE3.doc and Sep12-lettoCHE4.doc).

    Question 18. What was earth's climate like the last time that 
atmospheric concentrations of carbon dioxide were at today's levels or 
about 370 parts per million (ppm), and what were conditions like when 
concentrations were at 500 ppm, which will occur around 2060 or so?
    Response. A combined answer to this question is given under 
Question No. 19 below.

    Question 19. Please describe any known geologic precedent for large 
increases of atmospheric CO2 without simultaneous changes in 
other components of the carbon cycle and the climate system.
    Response. My July 29, 2003 testimony was about the climate history 
of the past 1000 years detailed in Soon et al. (2003), rather than any 
potential (causal or otherwise) relationship between atmospheric carbon 
dioxide and climate. The fact remains that the inner workings of the 
global carbon cycle and the course of future energy use are not 
sufficiently understood or known to warrant any confident prediction of 
the atmospheric CO2 concentration in the year 2060. Please 
also consider the related answer under Question No. 25 below.
    However, it is abundantly obvious that atmospheric CO2 
is not necessarily an important driver of climate change. It is indeed 
a puzzle that, despite the relatively low level of atmospheric CO2 
of no more than 300 ppm in the past 320-420 thousand years (Kawamura et 
al., 2003, Tellus, vol. 55B, 126-137), compared with the high levels of 
330-370 ppm since the 1960's, there is a clear suggestion of 
significantly warmer temperatures at both Vostok and Dome Fuji, East 
Antarctica, during the interglacials at stage 9.3 (about 330 thousand 
years before present; warmer by about 6°C) and stage 5.5 (about 135 
thousand years before present; warmer by about 4.5°C) than during the 
most recent 1000 years (see Watanabe et al., 2003, Nature, vol. 422, 
509-512; further detailed discussion of environmental changes in 
Antarctica over the past 1000 years or so, including the most recent 50 
years, can be found in section 4.3.4, pp. 256-257, of Soon et al. 
2003).
    But there are important concerns about the retrieval of information 
on atmospheric CO2 levels from ice cores. Jaworowski and 
colleagues (1992, The Science of the Total Environment, vol. 114, 227-
284) explained that:

          ``Ice is not a rigid material suitable for preserving the 
        original chemical and isotopic composition of atmospheric gas 
        inclusion. Carbon dioxide in ice is trapped mechanically and by 
        dissolution in liquid water. A host of physico-chemical 
        processes redistribute CO2 and other air gases 
        between gaseous, liquid and solid phases, in the ice sheets in 
        situ, and during drilling, transport and storage of the ice 
        cores. This leads to changes in the isotopic and molecular 
        composition of trapped air. The presence of liquid water in ice 
        at low temperatures [`even below -70°C'] is probably the most 
        important factor in the physico-chemical changes. The permeable 
        ice sheet with its capillary liquid network acts as a giant 
        sieve which redistributes elements, isotopes and micro-
        particles. Carbon dioxide in glaciers is contained: (1) in 
        interstitial air in firn; (2) in air bubbles in ice; (3) in 
        clathrates; (4) as a solid solution in ice crystals; (5) 
        dissolved in intercrystalline veins and films of liquid brine; 
        and (6) in dissolved and particulate carbonates. Most of the 
        CO2 is contained in ice crystals and liquids, and 
        less in air bubbles. In the ice cores it is also present in the 
        secondary gas cavities, cracks, and in the traces of drilling 
        fluids.
          The concentration of CO2 in air recovered from the 
        whole ice is usually much higher than that in atmospheric air. 
        This is due to the higher solubility of this gas in cold water, 
        which is 73.5- and 35-times higher than that of nitrogen and 
        oxygen, respectively. The composition of other atmospheric 
        gases (N2, O2, Ar) is also different in 
        ice and in air inclusions than in the atmosphere. Argon-39 and 
        85Kr data indicate that 36-100 percent of air recovered from 
        deep Antarctic ice cores is contaminated by recent atmospheric 
        air during field and laboratory processing. Until about 1985, 
        CO2 concentrations in gas recovered from primary air 
        bubbles and from secondary gas cavities in pre-industrial and 
        ancient ice were often reported to be much higher than in the 
        present atmosphere. After 1985, only concentrations below the 
        current atmospheric level were published. Our conclusion is 
        that both these high and low CO2 values do not 
        represent real atmospheric content of CO2.
          Recently reported concentrations of CO2 in primary 
        and secondary gas inclusions from deep cores, covering about 
        the last 160,000 years, are much below the current atmospheric 
        level, although several times during this period the surface 
        temperature was 2-4.5°C higher than now. If these low 
        concentrations of CO2 represented real atmospheric 
        levels, this would mean (1) that CO2 had not 
        influenced past climatic changes, and (2) that climatic changes 
        did not influence atmospheric CO2 levels.'' (p. 272-
        273)

    Additional historical evidence reveals that large, abrupt natural 
climatic changes are not uncommon, and that they have occurred without 
any known causal ties to large changes in radiative forcing. Phase 
differences between atmospheric CO2 and proxy temperature in 
historical records are often not fully resolved; but atmospheric 
CO2 has shown the tendency to follow rather than lead 
temperature and biosphere changes (see e.g., Dettinger and Ghil, 1998, 
Tellus, vol. 50B, 1-24; Fischer et al., 1999, Science, vol. 283, 1712-
1714; Indermuhle et al., 1999, Nature, vol. 398, 121-126).
    In addition, there have been geological times of global cooling 
with rising CO2 (during the middle Miocene about 12.5-14 
million years before present [Myr BP], for example, with a rapid 
expansion of the East Antarctic Ice Sheet and with a reduction in 
chemical weathering rates), while there have been times of global 
warming with low levels of atmospheric CO2 (such as during 
the Miocene Climate Optimum about 14.5-17 Myr BP, as noted by Pagani et 
al., 1999, Paleoceanography, vol. 14, 273-292). A new study of 
atmospheric carbon dioxide over the last 500 million years (Rothman, 
2002, Proceedings of the (US) National Academy of Sciences, vol. 99, 
4167-4171) concluded that, ``CO2 levels have mostly 
decreased for the last 175 Myr. Prior to that point [CO2 
levels] appear to have fluctuated from about two to four times modern 
levels with a dominant period of about 100 Myr. . . . The resulting 
signal exhibits no systematic correspondence with geologic record of 
climatic variations at tectonic time scales.''

    Question 20. According to a study published in Science magazine, 
[B. D. Santer, M. F. Wehner, T. M. L. Wigley, R. Sausen, G. A. Meehl, 
K. E. Taylor, C. Amman, W. M. Washington, J. S. Boyle, and W. 
Bruggemann Science 2003 July 25; 301: 479-483], manmade emissions are 
partly to blame for pushing outward the boundary between the lower 
atmosphere and the upper atmosphere. How does that fit with the long-
term climate history and what are the implications?
    Response. It should first be noted that Pielke and Chase (2004, 
Science, vol. 303, 1771b; and see p. 1771c by Santer et al. and 
additional counter-reply by Pielke and Chase, with input from John 
Christy and Anthony Reale, available as paper 278b at http://
blue.atmos.colostate.edu/publications/reviewedpublications.shtml) had 
criticized and challenged Santer et al.'s claim and conclusion that,

        ``[o]ur results are relevant to the issue of whether the `real-
        world' troposphere has warmed during the satellite era. . . . 
        The direct evidence is that in the ALL experiment [i.e., 
        climate model results that included changes in well-mixed 
        greenhouse gases, direct scattering effects of sulfate 
        aerosols, tropospheric and stratospheric ozone, solar total 
        irradiance and volcanic aerosols; see more discussion below], 
        the troposphere warms by 0.07°C/decade over 1979-1999. This 
        warming is predominantly due to increases in well-mixed 
        greenhouse gases. . . . Over 1979-1999, roughly 30 percent of 
        the increase in tropopause height in ALL is explained by 
        greenhouse gas-induced warming of the troposphere. 
        Anthropogenically driven tropospheric warming is therefore an 
        important factor in explaining modeled changes in tropopause 
        height.''

    In contrast, Pielke and Chase (2004) offered the observed evidence 
and concluded that

        ``[g]lobally averaged tropospheric temperature trends are 
        statistically indistinguishable from zero. Thus, the elevation 
        of the globally averaged tropopause report in [Santer et al., 
        2003] cannot be attributed to any detectable tropospheric 
        warming over this period.'' In addition, ``the climate system 
        is much more complex than defined by tropospheric temperature 
        and tropopause changes. Linear trend analysis [in Santer et 
        al., 2003] is of limited significance. Changes in global heat 
        storage provide a more appropriate metric to monitor global 
        warming than temperature alone.''

    Soon and Baliunas (2003, Progress in Physical Geography, vol. 27, 
448-455) had also previously outlined the incorrect fingerprint of 
CO2 forcing observed in even the best and most sophisticated 
versions of climate models thus far. A more general and comprehensive 
discussion of the fundamental difficulties in modeling the effects of 
carbon dioxide with the current generation of climate models is given 
in Soon et al. (2001, Climate Research, vol. 18, 259-271). Thus, the 
new paper by Santer et al. (2003) does not supersede or overcome the 
difficulties with respect to General Circulation Models raised in Soon 
and Baliunas (2003).
    Both the meaning and strength of the model-dependent results shown 
in Santer et al. (2003) remain doubtful and weak for several additional 
reasons.
    First, Figure 2 of Santer et al. (2003) itself confirmed that the 
modeled changes in tropopause height are caused mainly by large 
stratospheric cooling related to changes in stratospheric ozone (they 
admitted so even though their note No. 35 indicates that their 
numerical experiments did not separate tropospheric and stratospheric 
ozone changes) rather than by the well-mixed greenhouse gases that are 
supposed to be the subject of concern. Second, the model experiments of 
Santer et al. (2003) did not include changes in stratospheric water 
vapor, which is known to be a significant factor in the observed 
stratospheric cooling (see e.g., Forster and Shine, 1999, Geophysical 
Research Letters, vol. 26, 3309-3312). Third, the neglect of 
stratospheric water vapor is at odds with the documented significant 
increases of stratospheric water vapor over the past half-century from 
a variety of instruments (e.g., Smith et al., 2000, Geophysical 
Research Letters, vol. 27, 1687-1690; Rosenlof et al., 2001, 
Geophysical Research Letters, vol. 28, 1195-1198; though Randel et al. 
[2004, Journal of the Atmospheric Sciences, submitted] recently noted 
that unusually low water vapor has been observed in the lower 
stratosphere for 2001-2003). Fourth, the model experiments by Santer et 
al. (2003) had clearly neglected (see note No. 18 of that paper) the 
role of the Sun's ultraviolet radiation that is not only known to be 
variable (e.g., Fontenla et al. 1999, The Astrophysical Journal, vol. 
518, 480-499; White et al., 2000, Space Science Reviews, vol. 94, 67-
74) but also known to exert important influence on both the chemistry 
and thermal properties in the stratosphere and troposphere (e.g., 
Larkin et al., 2000, Space Science Reviews, vol. 94, 199-214).
    Finally, the physical representation of aerosol forcing (which 
should not be restricted to sulfate alone) in Santer et al. (2003) is 
clearly not comprehensive and at best highly selective. Early on, 
Russell et al. (2000, Journal of Geophysical Research, vol. 105, 14891-
14898) cautioned that

        ``[o]ne danger of adding aerosols of unknown strength and 
        location is that they can be tuned to give more accurate 
        comparisons with current observations but cover up model 
        deficiencies.''

Anderson et al. (2003, Science, vol. 300, 1103-1104 and see also 
exchanges in Crutzen et al., 2003, vol. 303, 1679-1681) recently 
cautioned that:

        ``we argue that the magnitude and uncertainty of aerosol 
        forcing may affect the magnitude and uncertainty of total 
        forcing [i.e., `the global mean sum of all industrial-era 
        forcings'] to a degree that has not been adequately considered 
        in climate studies to date. Inferences about the causes of 
        surface warming over the industrial period and about climate 
        sensitivity may therefore be in error. . . . Unfortunately, 
        virtually all climate model studies that have included 
        anthropogenic aerosol forcing as a driver of climate change 
        (diagnosis, attribution, and projection studies; denoted 
        `applications' in the figure) have used only aerosol forcing 
        values that are consistent with the inverse approach. If such 
        studies were conducted with the larger range of aerosol 
        forcings determined from the forward calculations, the results 
        would differ greatly. The forward calculations raise the 
        possibility that total forcing from preindustrial times to the 
        present . . . has been small or even negative. If this is 
        correct, it would imply that climate sensitivity and/or natural 
        variability (that is, variability not forced by anthropogenic 
        emissions) is much larger than climate models currently 
        indicate. . . . In addressing the critical question of how the 
        climate system will respond to this [anthropogenic greenhouse 
        gases'] positive forcing, researchers must seek to resolve the 
        present disparity between forward and inverse calculations. 
        Until this is achieved, the possibility that most of the 
        warming to date is due to natural variability, as well as the 
        possibility of high climate sensitivity, must be kept open. 
        [emphasis added]''

    To further understand the complexity of calculating aerosol 
forcing, Jacobson (2001, Journal of Geophysical Research, vol. 106, 
1551-1568) had to account for a total of 47 species ``containing 
natural and/or anthropogenic sulfate, nitrate, chloride, carbonate, 
ammonium, sodium, calcium, magnesium, potassium, black carbon, organic 
matter, silica, ferrous oxide, and aluminium oxide'' in his recent 
estimate of only the global direct radiative forcing by aerosols. 
(Jacobson [2001] found that the global direct radiative forcing by 
anthropogenic aerosols is only -0.12 W/m\2\ while the forcing by 
combined natural and anthropogenic sources is -1.4 W/m\2\.) There are 
also the indirect aerosol effects. Temperature or temperature change is 
clearly not the only practical measure of effects by aerosols. Haywood 
and Boucher (2000, Reviews of Geophysics, vol. 38, 513-543) stressed 
the fact that the indirect radiative forcing effect of the modification 
of cloud albedo by aerosols could range from -0.3 to -1.8 W/m\2\, while 
the additional aerosol influences on cloud liquid water content (hence, 
precipitation efficiency), cloud thickness and cloud lifetime are still 
highly uncertain and difficult to quantify (see e.g., Rotstayn and Liu, 
2003, Journal of Climate, vol. 16, 3476-3481). This is why one can 
easily appreciate the difficulties faced by Santer et al. (2003): 
climate forcing by aerosols is known only within a wide range of 
uncertainty and is, to a large degree, simply unknown.
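
    A simple interval sum illustrates the point quoted from Anderson et 
al. (2003). The +2.4 W/m\2\ greenhouse-gas figure below is the commonly 
cited IPCC TAR central value, and the aerosol ranges are illustrative 
numbers loosely based on those quoted above; none of these values comes 
from Santer et al. (2003).

    # Illustrative interval arithmetic on industrial-era forcings (assumed values).
    ghg_forcing = 2.4                  # W/m^2, well-mixed greenhouse gases (TAR central value)
    direct_aerosol = (-1.4, -0.1)      # W/m^2, illustrative direct-effect range
    indirect_aerosol = (-1.8, -0.3)    # W/m^2, cloud-albedo range (Haywood and Boucher, 2000)

    total_low = ghg_forcing + direct_aerosol[0] + indirect_aerosol[0]
    total_high = ghg_forcing + direct_aerosol[1] + indirect_aerosol[1]
    print(f"total forcing roughly {total_low:+.1f} to {total_high:+.1f} W/m^2")
    # -> roughly -0.8 to +2.0 W/m^2: even the sign of the total is not well constrained

    With the wider ``forward'' aerosol estimates, the total industrial-
era forcing can come out small or even negative, which is precisely why 
Anderson et al. (2003) argue that the possibility of high climate 
sensitivity or large natural variability must be kept open.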
    Therefore, I conclude that, in addition to the fundamental issues 
related to the representation of physical processes in climate models, 
papers like Santer et al. (2003) have also failed the basic requirement 
of internal consistency in accounting for potentially relevant climatic 
forcing factors and feedbacks. This is why I cannot comment on the 
implications of this particular study or its meaning for long-term 
climate history.

    Question 21. In your testimony, you discussed there being 
``warming'' and ``cooling'' for different periods. If you did not 
construct an integral across the hemisphere or a real timeline, don't 
your findings really just say there were some warm periods and cool 
periods, and therefore cannot speak to the issue of the rate of warming 
or cooling?
    Response. I am not sure about the meaning of this question and the 
quotes. My oral remark was merely referring to ``making an accurate 
forecast that includes all potential human-made warming and cooling 
effects.'' The detailed discussion about the climatic and environmental 
changes for the past 1000 years as deduced from the collection of 
proxies I had studied was given in Soon et al. (2003). I can certainly 
speak to the rate of warming or cooling at any given location or region 
when the available proxy, with sufficient temporal resolution, is known 
or proven to be temperature sensitive.

    Question 22. Is there any indication that regional climate 
variations are any larger or smaller at present than over the last 1000 
years (with 2003, for example, perhaps being a case with large regional 
variations from the normal)?
    Response. I would not recommend taking the pattern of change from a 
single year, i.e., 2003, and calling it climate change. But the fact is 
that in Soon et al. (2003) we carefully studied individual proxy 
records from various locations and regions. As an example, the 2000-
year bottom-sediment record from Moon Lake, North Dakota, shows that 
there is perhaps a distinct shift in the mode of hydrologic variability 
in the Northern Great Plains region starting around 1200 AD, with the 
more recent period being more variable than the earlier one. But, as 
indicated in the chart below, the author of that paper also noted that 
the severe droughts of the 1890's and 1930's around this area are 
``eclipsed by more extensive droughts before the beginning of the 
instrumental period.''

[GRAPHIC] [TIFF OMITTED] T2381.096

    Question 23. In your oral presentation, you talked about ``[h]aving 
computer simulation.'' Could you please explain the computer simulation 
or modeling to which you are referring, and, (a) Has this model gone 
through the appropriate set of model intercomparison studies like the 
various other global models? (b) What 
forcings have been used to drive it? (c) How does it develop regional 
climate variations, and are these comparable to observations? and, (d) 
How does it perform over the 20th century, for example?
    Response. I apologize for any potential confusion.
    In my oral remark, I said,

          ``The entirety of climate proxies over the last 1,000 years 
        shows that over many areas of the world there has been, and 
        continues to be, large climate changes. Those changes provide 
        challenges for the computer simulations of climate. The full 
        models, which explore the Earth region by region, can be tested 
        against the natural patterns of change over the last 1,000 
        years that are detailed by the climate proxies. Having computer 
        simulations reproduce past patterns of climate, which has been 
        influenced predominantly by natural factors, is key to making 
        an accurate forecast that includes all potential human-made 
        warming and cooling factors.''

    So, in the context of what I said, this question is clearly 
misdirected by someone who did not understand my remark. I was speaking 
of the potential application of works like Soon et al. (2003) for 
improving our ability to calculate with confidence the potential 
effects of man-made factors, by first and foremost having a climate 
model that can at least reproduce some of the observed local and 
regional changes of the past.
    Personally, I am also conducting my research with the help of 
several climate models (both simple and complex) appropriate for my 
interests, and I would certainly apply what I found in Soon et al. 
(2003) to my own future studies using climate models. Any additional 
comments would be beyond the simple context of my oral testimony. But 
it may be useful to take note of the comments by Green (2002, Weather, 
vol. 57, 431-439):

          ``It has always worried me that simple models of climate do 
        not seem to work very well. Experts on numerical models say 
        that this is because the atmosphere is very complicated, and 
        that large numerical models and computers are needed to 
        understand it. I worry because I do not know what they have 
        hidden in those models and the programs they use. I wonder what 
        I can compare their models with. Not with each other because 
        they belong to a sort of club, where to have a model that 
        disagrees with everyone else's puts you outside. That is not a 
        bad system, unless of course they are all wrong. Another 
        curiosity of complicated models is that their findings are 
        rarely used to improve the model that preceded them. I would 
        have expected that the more complex model would show where the 
        simpler one had got it wrong, and allow it to be corrected for 
        that misrepresentation.''

    Question 24. Based on the various comments of your scientific 
colleagues regarding your paper, including the methodological flaws 
pointed out in that paper by the former editor-in-chief of Climate 
Research, are you planning any reworking of your study or any further 
studies in the paleoclimatic area?
    Response. The use of a phrase like ``methodological flaws'' is a 
very convenient attempt to dismiss the weight of scientific evidence 
presented in Soon et al. (2003), but unfortunately one without any 
clear or confirmable basis. Thus far, the only formal criticism of Soon 
et al. (2003) was by Mann et al. (2003, Eos, vol. 84(27), 256-257), and 
we provided our response to that criticism in Soon et al. (2003b, Eos, 
vol. 84(44), 473-476). My research interest and work to discern fully 
and describe quantitatively the local and regional patterns of climate 
variability over the past 1000 years or so will certainly continue 
despite this mischaracterization.
    It should not, however, go unnoticed that several very serious 
problems in Mann et al. (1998, 1999), Mann and Schmidt (2003) and Mann 
and Jones (2003) have been found recently. Those unresolved anomalies 
are outlined in my answers to your Questions No. 3, 4, 5, 6, 9 and 13. 
A careful reworking, with fully open access to all data as well as full 
transparency about the actual methodologies and their detailed 
application, will be the next important step for paleoclimate 
reconstruction research.

    Question 25. You indicated that there would likely be relatively 
small climatic response to even substantial increases in the CO2 
concentration. Do you disagree with the radiation calculations that 
have been done and the trapped energy that they calculate, as per the 
peer-reviewed literature? If so, please explain.
    Response. First, please consider the above discussion on climate 
forcing factors and climate response sensitivities under Question No. 
20 as part of the answers to this question.
    Second, I do not believe that I made any strong claim, one way or 
another, about CO2 forcing and the potential response in any 
specific quantitative terms during my testimony (since factually no one 
can). I do want to comment, as in my response under Question No. 19, 
that CO2, as a minor greenhouse gas, is not the determinant 
of Earth's climate and is therefore not an obvious driver of its 
change. Most calculations in the peer-reviewed literature (or 
otherwise) that focus on the CO2 factor would have us believe 
that CO2, especially within the framework of radiative 
forcing, is the predominant factor driving anomalous climate responses, 
while the unavoidable and very difficult core subject of the actual 
dynamical state of Earth's ``mean'' climate is ignored.
    Third, some 10 years ago, Lindzen (1994, Annual Review of Fluid 
Mechanics, vol. 26, 353-378) pointed out a rather serious internal 
inconsistency regarding the role of water vapor and clouds in the way 
the physics of the greenhouse effect is normally presented, even by 
expert scientists and expert sources of information (see, e.g., the 
comment ``without [the greenhouse effect], the planet would be 65 
degrees colder'' by Jerry Mahlman in the February 2004 issue of Crisis 
Magazine, http://www.crisismagazine.com/february2004/feature1.htm, and 
the description of the Greenhouse Effect on the EPA's ``global warming 
for kids'' webpage: http://www.epa.gov/globalwarming/kids/
greenhouse.html). Lindzen notes the ``artificial inevitability'' of the 
predominance of CO2 radiative forcing as a climatic factor in 
the following passage.

          ``In most popular depictions of the greenhouse effect, it is 
        noted that in the absence of greenhouse gases, the Earth's mean 
        temperature would be 255 K [about 0+ F], and that the presence 
        of infrared absorbing gases elevates this to 288 K [59+ F]. In 
        order to illustrate this, only radiative heat transfer is 
        included in the schematic illustrations of the effect (Houghton 
        et al. 1990, 1992) [IPCC reports]; this lends an artificial 
        inevitability to the picture. Several points should be made 
        concerning this picture: 1. The most important greenhouse gas 
        is water vapor, and the next most important greenhouse 
        substance consists in clouds; CO2 is a distant third 
        (Goody & Yung 1989). 2. In considering an atmosphere without 
        greenhouse substances (in order to get 255 K), clouds are 
        retained for their visible reflectivity while ignored for their 
        infrared properties. More logically, one might assume that the 
        elimination of water would also lead to the absence of clouds, 
        leading to a temperature of about 274 K [or 278 K depending on 
        what value of the solar irradiation factor is used] rather than 
        255 K. 3. Pure radiative heat transfer leads to a surface 
        temperature of about 350 K rather than 288 K. The latter 
        temperature is only achieved by including a convective 
        adjustment that consists simply in adjusting vertical 
        temperature gradient so as to avoid convective instability 
        while maintaining a consistent radiative heat flux. . . . `` 
        (p. 359-361)\2\
---------------------------------------------------------------------------
    \2\ A more pedagogical discussion of the greenhouse effect is given 
by Lindzen and Emanuel (2002) in Encyclopedia of Global Change, 
Environmental Change and Human Society, Volume 1, Andrew S. Goudie, 
editor in chief, p. 562-566, Oxford University Press, New York, 710 pp.
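
    For reference, the 255 K figure in the passage above follows from a 
simple radiative balance, T = [S(1 - A)/(4*sigma)]^(1/4); the sketch 
below uses nominal values of the solar constant and planetary albedo, 
which are illustrative assumptions and not numbers taken from Lindzen 
(1994).

    # Effective (no-greenhouse) emission temperature from radiative balance.
    # S0 and A are nominal textbook values (assumed for illustration).
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1368.0        # solar constant, W m^-2

    def emission_temperature(s0, albedo):
        return (s0 * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

    print(round(emission_temperature(S0, 0.30)))  # ~255 K with clouds retained
    print(round(emission_temperature(S0, 0.10)))  # ~271 K with a lower, cloud-free
                                                  # albedo, approaching the 274-278 K
                                                  # range mentioned in the passage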

    Hu et al. (2000, Geophysical Research Letters, vol. 27, 3513-3516) 
added that, as the sophistication of the parameterization of 
atmospheric convection increases, there is a tendency for climate model 
sensitivity to variations in atmospheric CO2 concentration to 
decrease considerably. In Hu et al. (2000)'s study, the averaged 
tropical warming for a doubling of CO2 decreased from 3.3 to 
1.6°C, primarily in association with a corresponding decrease in the 
calculated increase of total atmospheric column water vapor from 29 
percent to 14 percent.

    Question 26. If you accept those radiation calculations as valid, 
please explain why you seem to believe that the energy trapped by the 
greenhouse gases will have a small effect whereas you seem to believe 
that small changes in solar energy will have very large climatic 
effects?
    Response. In addition to my answers under Questions No. 19, 20 and 
25 above, I would like to point out that the Sun's radiation is not 
only variable, but varies in the ultraviolet part of the 
electromagnetic spectrum often by factors of 10 or more. The question 
of the relative effects of anthropogenic greenhouse gases and the Sun's 
radiation in terms of radiative forcing is certainly of interest, but 
it does not add much to my current research quest to understand the 
Earth's mean climatic state and its nonlinear manifestations.

    Question 27. Please explain why you think the physically based 
climate models seem to quite satisfactorily represent the seasonal 
cycles of the climate at various latitudes based on the varying 
distributions of solar and infrared energy, but then would be so far 
off in calculating the climatic response for much smaller perturbations 
to solar radiation and greenhouse gases?
    Response. As indicated below, the premise of this question--that 
computer climate models satisfactorily represent the seasonal cycles of 
climate--is not an assured statement of fact. This is why the follow-up 
question cannot be answered logically.
    For example, E. K. Schneider (2002, Journal of Climate, vol. 15, 
449-469) noted that:

        ``[a]t this writing, physically consistent and even flux-
        corrected coupled atmosphere-ocean general circulation models 
        (CGCMs) have difficulty in producing a realistic simulation of 
        the equatorial Pacific SST [sea surface temperature], including 
        annual mean, annual cycle, and interannual variability. Not 
        only do the CGCM simulations have significant errors, but also 
        there is little agreement among models.''

In a systematic comparison of the performance of 23 dynamical ocean-
atmosphere models, Davey et al. (2002, Climate Dynamics, vol. 18, 403-
420) found that ``no single model is consistent with the observed 
behavior in the tropical ocean regions . . . as the model biases are 
large and gross errors are readily apparent.'' Without flux adjustment, 
most models produced annual mean equatorial sea surface temperatures in 
the central Pacific that are too cold by 2-3°C. All GCMs except one 
simulated the wrong sign of the east-west SST gradient in the 
equatorial Atlantic. The GCMs also incorrectly simulated the seasonal 
climatology in all ocean sections, and its interannual variability in 
the Pacific Ocean.

    Question 28. In regard to your answers to the previous questions, 
to what extent is your indication of a larger climate sensitivity for 
solar than greenhouse gases due to quantitative analysis of the physics 
and to what extent due to your analysis of statistical correlations? Is 
this greater responsiveness for solar evident in the baseline climate 
system, or just for perturbations, and could you please explain?
    Response. Please see my answers to Questions No. 26 above and 30 
below.

    Question 29. Please explain why you seem to accept that solar 
variations, volcanic eruptions, land cover change, and perhaps other 
forcings can have a significant climatic influence, but changes in 
CO2 do not or cannot have a comparable influence?
    Response. Please see my answers to Question No. 30.

    Question 30. Could you please clarify why it is that you think the 
best way to get an indication of how much the climate will change due 
to global-scale changes in greenhouse gases or in solar radiation is to 
look at the regional level rather than the global scale? How would you 
propose to distinguish a natural variation from a climate change at the 
local to regional level?
    Response. Questions No. 28, 29 and 30 seem to be based on the 
unreasonable presumption that some special insight about the effects of 
solar irradiation, land cover changes, or even volcanic eruptions must 
be invoked in order to challenge the role of carbon dioxide forcing in 
the climate system. That presumption is illogical. My basic view and 
research interest regarding carbon dioxide, and the ongoing search for 
the right tools for modeling aspects of the Earth's climate system, are 
briefly summarized in my answers to Questions No. 19, 25, 26, 27 and 
perhaps 20.
    As to your specific question on distinguishing a natural variation 
(either internally generated or externally introduced by solar 
variation or volcanic eruption) from a climate change caused by 
anthropogenic factors like land cover changes or carbon dioxide at the 
local to regional level, there is possibly a somewhat surprising 
answer. If one wishes to single out the potential effects of man-made 
carbon dioxide from other natural and anthropogenic factors, as hinted 
by your question, then the answer is clear--the CO2 effect is 
expected to be small, in the sense that its potential signals will 
likely be overwhelmed by the expected effects of other factors. It is a 
scientific fact that the signal of CO2 on the climate may be 
expected to emerge only over a very long time baseline and over a 
rather large areal extent. For example, Zhao and Dirmeyer (2003, COLA 
Technical Report No. 150; available at http://grads.iges.org/pubs/
tech.html), in their modeling experiments that attempt to account for 
the realistic effects of land cover changes and sea surface temperature 
changes and for the role of added atmospheric CO2, found 
that

        ``[w]hen observed CO2 concentrations are specified 
        in the model across the 18-year period, . . . we do not find a 
        substantially larger warming trend than in CTL [with no change 
        in CO2 concentration], although some small increase 
        is found. The weak impact of atmospheric CO2 changes 
        may be due to the small changes in specified CO2 
        during the model simulation compared to the doubling CO2 
        simulation, or the short length of the integrations. It is 
        clear that the relatively strong SST [sea surface temperature] 
        influence in this climate model is the driver of the [observed] 
        warming.''

Please also consider the point made by Lindzen (2002) under Question 
No. 8 above concerning the difficulties in linking the observed warming 
trend of the deep ocean (without challenging the quality and error of 
those deep ocean temperature data) to anthropogenic CO2 
forcing. Finally, I wish to note that Mickley et al. (2004, Journal of 
Geophysical Research, vol. 109, D05106) used climate model simulation 
results to demonstrate ``the limitations in the use of 
radiative forcing as a measure of relative importance of greenhouse 
gases to climate change. . . . While on a global scale CO2 
appears to be a more effective `global warmer' than tropospheric ozone 
per unit forcing, regional sensitivities to increase ozone may lead to 
strong climate responses on a regional scale.''

    Question 31. How does your recent article relate to your 
assignments at the Harvard Smithsonian Observatory? Is paleoclimate 
part of the task of this observatory?
    Response. The publications of Soon et al. (2003) or Soon et al. 
(2004) are possible because of research grants that I and my 
collaborators obtained through competitive proposals to several 
research funding sources. I am a trust-fund employee at the Harvard-
Smithsonian Center for Astrophysics and the support of my position and 
research work here is mainly through my own research initiative and 
proposal application. The scientific learning about paleoclimatic 
reconstruction presented in Soon et al. (2003) is related to my 
research interest in the mechanisms of sun-climate relation, especially 
for relevant physical pathways and processes on multidecadal and 
centennial time scales. Additional fruit of my independent research and 
labor in the area of sun-climate physics, funded or unfunded, is 
exemplified by the March 2004 book ``The Maunder Minimum and The 
Variable Sun-Earth Connection'' (see http://www.wspc.com/books/physics/
5199.html) by W. Soon and S. Yaskell (published by World Scientific 
Publishing Company). It might also be instructive to note that 
paleoclimate researchers have been speculating about long-term 
variability of the sun as the cause of centennial- to millennial-scale 
variability seen in their proxy records.

    Question 32. In your testimony, you said that ``climate change is 
part of nature.'' Please describe what you meant, since obviously, 
climate changes have occurred due, in part, to changes in various 
forcings, such as solar, continental drift, atmospheric composition, 
asteroid impacts, etc., rather than being just completely random events. 
Could you provide estimates of how large you consider future forcings 
might be and how big the climate change they might cause could be?
    Response. On this occasion, I was referring to the fact that 
change or variability in climate is most likely the rule, rather than the 
exception, of the climate system. But I was not speaking about or 
trying to imply the factors of change, either naturally produced or 
man-made. I apologize for any potential confusion. It is certainly 
reasonable to suggest that those climatic changes may arise from 
``forcings'' but it would be unwise to rule out internally generated 
manifestations of climatic variables that could be purely stochastic in 
origin. I would strongly recommend the pedagogical discussion by 
Professor Carl Wunsch of MIT in Wunsch (1992, Oceanography, vol. 5, 99-
106) and Wunsch (2004, ``Quantitative estimate of the Milankovitch-
forced contribution to observed Quaternary climate change'', working 
manuscript downloadable from http://puddle.mit.edu/cwunsch/).
    I cannot speculate on future climate forcings and resultant 
climatic changes because I found no basis for doing so.

    Question 33. Please provide a comparable estimate, with some 
supporting examples from the past, of how big you think the decadal (or 
50-year if you prefer) change in the hemispheric/global climate could 
be due to natural variability? If you prefer to focus on the regional 
scale change, could you provide an indication of any expected change in 
the degree of regional variability about the hemispheric and global 
values, and what the mechanism for this might be?
    Response. This is a related question that tries to get at a 
quantitative comparison of how large natural climate variability on 
regional or hemispheric scales can be under the shadow of expected 
future changes. Again, with no intention to devalue this interesting 
question, I do not have sufficient knowledge or ability to venture 
such an estimate. In fact, I would go so far as to say that if the 
estimates of variability for both the past and the future are known within 
a reasonable range of uncertainties, then the actual scientific 
research program to address questions about the role of added carbon 
dioxide no longer requires further funding or execution, since we have 
obtained all the relevant answers. But you may have judged from my 
answers given throughout this Q&A that much remains to be quantified 
and understood and the hard scientific research must continue.

    Question 34. Please explain the scientific basis for your testimony 
that ``one should expect the CO2 greenhouse effect to work 
its way downward toward the surface.''
    Response. A combined answer to this question is given under Question No. 
35.

    Question 35. Do you believe that there is greater greenhouse 
trapping of energy in the troposphere than at the surface and that the 
atmosphere has a low heat capacity? If so, how big is this temperature 
difference?
    Response. It is broadly agreed and assumed that carbon dioxide, 
when released into the air, tends to mix quickly and 
so is distributed widely throughout the whole column of the 
atmosphere. The air near the surface is already dense and moist, so 
the addition of more carbon dioxide will introduce very little imbalance in 
the radiation energy budget there. In contrast, adding more carbon dioxide 
to the thinner and drier air of the troposphere will cause a chain of 
noticeable effects. First, the presence of more carbon dioxide in the 
uppermost part of the atmosphere will cause more infrared radiation to 
escape into space because there are more carbon dioxide molecules to 
channel this infrared radiation upward and outward unhindered. Part of 
that infrared radiation is also being emitted downward to the lower 
parts of the atmosphere and the surface where it is reabsorbed by 
carbon dioxide and the thicker air there. The layer of air in the lower 
and middle troposphere, being more directly in contact with this down-
welling radiation, is expected to warm more than the air near the surface. 
Thus, adding more carbon dioxide to the atmosphere should cause more 
warming of the air around the height of two to seven kilometers. 
(Please consider for example the discussion by Kaser et al. (2004) 
under Question No. 8 about the ineffectiveness of an added longwave 
radiation from a direct addition of atmospheric CO2 or 
atmospheric temperature change in explaining the modern retreat of 
glaciers at Kilimanjaro.) In other words, the clearest impact of the 
carbon dioxide greenhouse effect should manifest itself in the lower- 
and mid-troposphere rather than near the earth's surface. Here, I am 
mostly speaking on the basis of expectation from pure radiative forcing 
considerations.
    Such a qualitative description is not complete, even though that is 
roughly what was modeled in the most sophisticated general circulation 
models (see e.g., Chase et al., 2004, Climate Research, vol. 25, 185-
190), because it misses the key roles of atmospheric convection and 
waves as well as all the important hydrologic processes (please see 
e.g., Neelin et al., 2003, Geophysical Research Letters, vol. 30 (no. 
24), 2275 and consider additional remarks about water vapor and 
atmospheric convection under Question No. 25 as well as discussion on 
climate forcing factors and climate response sensitivities under 
Question No. 20). Some theoretical proposals expect a warming of the 
surface relative to the low- and mid-troposphere because of nonlinear 
climate dynamics (Corti et al., 1999, Nature, 398, 799-802). That 
expectation is because of the differential surface response with the 
pattern of Cold Ocean and Warm Land (COWL) that becomes increasingly 
unimportant with distance away from the surface (rather than just the 
difference in heat capacity mentioned in your question) [see Soon et 
al., 2001 for additional discussion]. Nevertheless, no GCM has yet 
incorporated such an idea into an operationally robust simulation of 
the climate system response to greenhouse effects from added 
CO2. In the latest ``global warming'' work, Neelin et al. 
(2003), for example, still distinctly differentiate between mechanisms 
for tropical precipitation that are initiated through CO2 
warming of the troposphere and through El Nino warming rooted in 
oceanic surface temperature and subsurface thermocline dynamics. 
(Further note that their model experiments [see Figure 2b+2c and 
10b+10c of Chou and Neelin, 2004, ``Mechanisms of global warming 
impacts on regional tropical precipitation'' in preparation for Journal 
of Climate; available at http://www.atmos.ucla.edu/?csi/REF/] also 
clearly show that the troposphere warmed significantly more than the 
surface with the doubling of atmospheric CO2, as discussed by 
Chase et al. (2004) below.)
    But it is worth noting that current global observations show 
that, at least over the 1979-2003 interval, the lower tropospheric 
temperatures are not warming as fast as the surface temperatures (see 
Christy et al. 2003, Journal of Atmospheric and Oceanic Technology, 
vol. 20, 613-629; for additional confidence in the results derived by 
the University of Alabama-Huntsville group, please see Christy and 
Norris, 2004, Geophysical Research Letters, vol. 31, L06211). This 
observed fact contradicts the accelerated warming of the mid 
and upper troposphere relative to the surface simulated in current models 
(Chase et al. 2004). Chase et al. (2004) arrive at the following 
conclusions, upon examining results from four climate models in both 
unforced scenarios and scenarios forced with increased atmospheric 
greenhouse gases and the direct aerosol effect\3\:
---------------------------------------------------------------------------
    \3\ Such a study should also be consistently challenged by the 
discussion under Question No. 20 about the adequacy of studying 
responses from a combination of incomplete forcings--though my primary 
purpose here is to illustrate the theoretical expectation of CO2 
forcing derived from state-of-the-art climate models.

           ``Model simulations of the period representative of 
        the greenhouse-gas and aerosol forcing for 1979-2000 generally 
        show a greatly accelerated and detectable warming at 500 mb 
        relative to the surface (a 0.06° C decade^-1 increase).
           Considering all possible simulated 22 yr trends 
        under anthropogenic forcing, a strong surface warming was 
        highly likely to be accompanied by accelerated warming at 500 
        mb [i.e., 987 out of 1018 periods or 97 percent of the cases 
        had a larger warming at 500 mb than at the surface] with no 
        change in likelihood as forcings increased over time.
           In simulated periods where the surface warmed more 
        quickly than 500 mb, there was never a case [emphasis added] in 
        which the 500 mb temperature did not also warm at a large 
        fraction of the surface warming. A 30 percent acceleration at 
        the surface was the maximum simulated as compared with an 
        observed acceleration factor of at least 400 percent the mid-
        troposphere trend.
           In cases where there was a strong surface warming 
        and the surface warmed more quickly than at 500 mb in the 
        forced experiments, there was never a case in which the 500 mb-
        level temperatures did not register a statistically significant 
        (p< 0.1) trend (i.e., a trend detectable with a simple linear 
        regression model). The minimum p value of approximately 0.08 
        occurred in the single case in which the significance was not 
        greater than 99 percent.
           It was more likely that surface warmed relative to 
        the mid-troposphere under control simulations than under forced 
        simulations.
           At no time, in any model realization, forced or 
        unforced, did any model simulate the presently observed 
        situation of a large and highly significant surface warming 
        accompanied with no warming whatsoever aloft.'' (p. 189)
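
    As a minimal illustrative sketch of the kind of trend test quoted 
above (a trend detectable with a simple linear regression), the Python 
fragment below fits a linear trend to a synthetic 22-year temperature 
series and reports its significance. The data are random and purely 
hypothetical; this is not output from Chase et al. (2004).

    import numpy as np
    from scipy import stats

    # Synthetic, purely illustrative series standing in for a 22-yr record.
    rng = np.random.default_rng(0)
    years = np.arange(1979, 2001)
    temps = 0.02 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

    # Simple linear regression: slope (trend) and its p value.
    result = stats.linregress(years, temps)
    print(f"trend: {result.slope * 10:.3f} deg C per decade, p = {result.pvalue:.3f}")
    print("detectable at p < 0.1" if result.pvalue < 0.1 else "not detectable at p < 0.1")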

    Question 36. The grants that are described as supporting your 
analysis seem to have much more to do with the sun or unrelated pattern 
recognition than with climate history (Air Force Office of Scientific 
Research-Grant AF49620-02-1-0194; American Petroleum Institute-Grants 
01-0000-4579 and 2002-100413; NASA-Grant NAG-7635; and NOAA-Grant 
NA96GP0448). Could you please describe how much funding you received 
and used in support of this study, all of the sources and the duration 
of that funding, and the relevance of those grant topics to the 
article?
    Response. All sources of funding for my and my colleagues' research 
efforts that resulted in the publication of Soon and Baliunas (2003) 
and Soon et al. (2003) were openly acknowledged. In other words, all 
sources of funding were disclosed in the manuscripts when they were 
submitted for publication; all sources of funding were also disclosed 
to readers in the printed journal articles. I am not the principal 
investigator for some of the grants we received (e.g., the NOAA grant 
was awarded to Professor David Legates), so I am not in a privileged 
position to provide exact quantitative numbers. But throughout the 
2001-2003 research interval in which our work was carried out, the 
funding we received from the American Petroleum Institute was a small 
fraction of the funding we received from governmental research grants.
    The primary theme of my research interest is the physical mechanisms 
of the sun-climate relationship. This is why research into the 
detailed patterns of local and regional climate variability, as 
published in Soon et al. (2003), is directly relevant to that goal. 
Please also consider my research position listed under Question No. 31 
above.

    Question 37. Have you been hired by or employed by or received 
grants from organizations that have taken advocacy positions with 
respect to the Kyoto Protocol, the U.N. Framework Convention on Climate 
Change, or legislation before the U.S. Congress that would affect 
greenhouse gas emissions? If so, please identify those organizations.
    Response. I have not knowingly been hired by, nor employed by, nor 
received grants from any such organizations described in this question.

    Question 38. Please describe the peer review process that took 
place with respect to your nearly identical articles published both in 
Climate Research and in Energy and Environment, including the number of 
reviewers and the general content of the reviewers' suggested edits, 
criticisms or improvements.
    Response. The Climate Research paper (Soon and Baliunas, 2003, 
Climate Research, vol. 23, 89-110) was submitted for publication and 
went through a routine peer-review process and was eventually approved 
for publication. The main content of the review was to propose: (a) 
reorganizing the material, including elimination of discussions on ENSO 
and GCMs; (b) removing ``tone'' problems by eliminating criticisms of 
previous EOF and superposition analyses; (c) reducing quotes, especially 
those by Hubert Horace Lamb, to improve readability; and (d) reviewing 
changes in each region with the same thoroughness. The July 3, 2003, email 
(as Attachment I below) from Otto Kinne, the director of Inter-Research, 
which publishes Climate Research, is enclosed below to confirm that 
the review process was fairly rigorous and that all parties involved had 
properly carried out their roles and duties in this time-honored 
system.
    The extended and more complete paper by Soon et al. (2003, Energy & 
Environment, vol. 14, 233-296) was submitted to Energy & Environment 
for consideration together with the accepted Climate Research 
manuscript. Energy & Environment's editorial decision was to send our 
manuscript for review, and after acceptance, include in its editorial 
in Energy & Environment, volume 14, issues 2&3, a footnote referring to 
the Climate Research paper.
    Finally, we wish to correct the false impression, introduced by 
Professor Mann both during the testimony and in the public media, that his 
attack on the papers by Soon and Baliunas (2003) and Soon et al. 
(2003), in a FORUM article in the American Geophysical Union Eos 
newspaper (Mann et al., 2003, Eos, vol. 84, 256-258), was either 
rigorously peer-reviewed or represented a widespread view of the 
community. Contrary to Professor Mann's public statements, a FORUM 
article in Eos is said to be only stating ``a personal point of view'' 
(http://www.agu.org/pubs/eos--guidelines.html#authors). Whatever peer-
reviewing that was done did not include soliciting comments from the 
authors of the papers being criticized. We first learned of this FORUM 
article from the AGU's press release No. 03-19 ``Leading Climate 
Scientists Reaffirm View that Late 20th Century Warming Was Unusual and 
Resulted From Human Activity'' (http://www.agu.org/sci--soc/prrl/
prrl0319.html). See Soon et al. (2003b, Eos, vol. 84 (44), 473-476) for 
our own response to the Mann et al. FORUM article.

                               __________

  Statement of Professor Michael E. Mann, Department of Environmental 
                    Sciences, University of Virginia

    My name is Michael Mann. I am a professor in the Department of 
Environmental Sciences at the University of Virginia. My research 
involves the use of climate models, the analysis of empirical climate 
data, and statistical methods for comparing observations and model 
predictions. One area of active current research of mine involves the 
analysis of climate ``proxy'' records (that is, natural archives of 
information which record past climate conditions by their biological, 
physical, or chemical nature). These data are used to reconstruct 
patterns of climate variability prior to the period of the past century 
or so during which widespread instrumental climate records are 
available. A primary focus of this research is deducing the long-term 
behavior of the climate system and the roles of various potential 
agents of climate change, both natural and human.
    I was a Lead Author of the ``Observed Climate Variability and 
Change'' chapter of the Intergovernmental Panel on Climate Change 
(IPCC) Third Scientific Assessment Report and a scientific contributor 
for several other chapters of the report. I am the current organizing 
committee chair for the National Academy of Sciences `Frontiers of 
Science' and have served as a committee member or advisor for other 
National Academy of Sciences panels related to climate change. I have 
served as editor for the `Journal of Climate' of the American 
Meteorological Society. I'm a member of the advisory panel for the 
National Oceanic and Atmospheric Administration's Climate Change 
Data and Detection Program, and a member of numerous other 
international and U.S. scientific working groups, panels and steering 
committees. I have co-authored more than 60 peer-reviewed articles and 
book chapters on diverse topics within the fields of climatology and 
paleoclimatology. Honors I have received include selection in 2002 as 
one of the 50 leading visionaries in Science and Technology by 
Scientific American magazine, the outstanding scientific publication 
award for 2000 from the National Oceanic and Atmospheric 
Administration, and citation by the Institute for Scientific 
Information (ISI) for notable recognition of my peer-reviewed research 
by fellow scientists.
    In my testimony here today, I will explain: (1) How mainstream 
climate researchers have come to the conclusion that late-20th century 
warmth is unprecedented in a very long-term context, and that this 
warmth is likely related to the activity of human beings. (2) Why a 
pair of recent articles challenging these conclusions by astronomer 
Willie Soon and his co-authors are fundamentally unsound.

                  CLIMATE HISTORY AND ITS IMPLICATIONS

    Evidence from paleoclimatic sources overwhelmingly supports the 
conclusion that late-20th century hemispheric-scale warmth was 
unprecedented over at least the past millennium and probably the past 
two millennia or longer.
    Modeling and statistical studies indicate that such anomalous 
warmth cannot be explained by natural factors but, instead, requires 
significant anthropogenic (that is, `human') influences during the 20th 
century. Such a conclusion is the indisputable consensus of the 
community of scientists actively involved in the research of climate 
variability and its causes. This conclusion is embraced by the position 
statement on ``Climate Change and Greenhouse Gases'' of the American 
Geophysical Union (AGU) which states that there is a compelling basis 
for concern over future climate changes, including increases in global-
mean surface temperatures, due to increased concentrations of 
greenhouse gases, primarily from fossil-fuel burning. This is also the 
conclusion of the 2001 report of the Intergovernmental Panel on 
Climate Change (IPCC), affirmed by a National Academy of Sciences 
report solicited by the Bush administration in 2001 which stated, ``The 
IPCC's conclusion that most of the observed warming of the last 50 
years is likely to have been due to the increase in greenhouse gas 
concentrations accurately reflects the current thinking of the 
scientific community on this issue.''

                  THE MAINSTREAM SCIENTIFIC VIEWPOINT

    Human beings have influenced modern climate through changes in 
greenhouse gas concentrations, the production of industrial aerosols, 
and altered patterns of land-use. By studying both the record of 
ancient climate variability and the factors that may have influenced 
it, we can establish how and why the climate system varied naturally, 
prior to any large-scale anthropogenic impacts. Large changes in 
climate certainly occurred in the distant past. If we look 60 million 
years back in time, dinosaurs were roaming the polar regions of the 
earth, and the globe was almost certainly warmer than today. Carbon 
dioxide levels were probably about double their current level, and had 
slowly attained such high levels due to changes in the arrangements of 
the continents (`plate tectonics') which influence the outgassing of 
carbon dioxide from the solid earth and thus, atmospheric greenhouse 
gas concentrations. These changes occur on timescales of tens of 
millions of years. 10,000 years ago, large ice sheets existed over 
North America due to natural changes that occur in the earth's orbit on 
timescales of tens of thousands of years. Trying to study distant past 
climates for insights into modern natural climate variability is 
hampered by the fact that the basic external constraints on the system 
(the continental arrangement, the geometry of the earth's astronomical 
orbit, the presence of continental ice sheets--what we call the 
`boundary conditions') were significantly different from today. 
Focusing on the evolution of climate in the centuries leading up to the 
20th century provides a perspective on the natural variability of the 
climate prior to the period during which large-scale human influence is 
likely to have occurred, yet modern enough that the basic boundary 
conditions on the climate system were otherwise the same. This provides 
us, in essence, a `control' for diagnosing whether or not recent 
climate changes are indeed unusual.
    Instrumental data for use in computing global mean surface 
temperatures are only available for about the past 150 years. Estimates 
of surface temperature changes prior to the 20th century must make use 
of historical documents and natural archives or ``proxy'' indicators, 
such as tree rings, corals, ice cores and lake sediments, to 
reconstruct the patterns of past climate change. Due to the paucity of 
data in the Southern Hemisphere, recent studies have emphasized the 
reconstruction of Northern Hemisphere rather than global mean 
temperatures. A number of independent reconstructions of the average 
temperature of the Northern Hemisphere support the conclusion that the 
hemispheric warmth of the late 20th century (i.e., the past few 
decades) is likely unprecedented over at least the past millennium. 
Preliminary evidence suggests that such a conclusion may well hold for 
at least the past two millennia, though more work, requiring the 
development of a more complete set of reliable proxy records spanning 
the past few millennia, is necessary to further decrease the 
uncertainties. Climate model simulations employing estimates of natural 
and anthropogenic radiative forcing changes agree well with the proxy-
based reconstructions (Figure 1). The simulations, moreover, show that 
it is not possible to explain the anomalous late 20th century warmth 
without the contribution from anthropogenic influences. Such consensus 
findings are expressed in the recently published article co-authored by 
myself and 12 other leading climate scientists from the United States 
and Britain that appeared recently in the journal `Eos', the official 
transactions of the American Geophysical Union, the largest 
professional society in the field.

        FLAWS IN A RECENT STUDY DISPUTING THE SCIENTIFIC CONSENSUS

    Two deeply flawed (and nearly identical) recent papers by 
astronomers Soon and Baliunas (one of them with some additional co-
authors--both henceforth referred to as `SB') have been used to 
challenge the scientific consensus. I outline the 3 most basic problems 
with their papers here:

[GRAPHIC] [TIFF OMITTED] T2381.097

    (1) In drawing conclusions regarding past regional temperature 
changes from proxy records, it is essential to make sure that 
the proxy data are indicators of temperature and not of precipitation or 
drought. SB make this fundamental error when they cite evidence of 
either `warm', `wet', or `dry' regional conditions as being in support 
of an exceptional `Medieval Warm Period' or `MWP'. Their criterion, ad 
absurdum, could be used to define any period of climate as `warm' or 
`cold'. Experienced paleoclimate researchers know that they must first 
establish the existence of a temperature signal in a proxy record 
before using it to evaluate past temperature changes (Figure A1).
    (2) It is essential to distinguish between regional temperature 
changes and truly hemispheric or global changes. SB do not make this 
essential distinction. The wavelike character of weather (i.e., the 
day-to-day wiggles of the Jet Stream) ensures that certain regions tend 
to warm when other regions cool. This past winter is a case in point. 
January was about 2° C below normal on the east coast of the U.S., but 
about 4° C above normal over much of the west. Utah, Nevada, and parts 
of California and Alaska had the warmest January on record (the change 
in location of the Iditarod dog sled race was a casualty of the Alaskan 
winter warmth!). The average temperature over the entire U.S. was about 
1° C above normal, much less warm than the western U.S. and of the 
opposite sign to the eastern U.S.
    In a similar manner, average global or hemispheric temperature 
variations on longer timescales tend to be much smaller in magnitude 
than those for particular regions, due to the tendency for a 
cancellation of warm and cold conditions in different regions. While 
relative warmth during the 10th-12th centuries, and cool conditions 
during the 15th-early 20th centuries are evident from reconstructions 
and model simulations of the average temperature of the Northern 
Hemisphere (Figure 1), the specific periods of cold and warmth 
naturally differ from region to region (Figure A2). The notion of an 
unusually cold 17th century `Little Ice Age', for example, arose in a 
European historical context. What makes the late 20th century unique is 
the simultaneous warmth indicated by nearly all long-term records 
(Figure A2), leading to the anomalous warmth evident during this period 
in Northern Hemisphere average temperatures (Figure 1).
    (3) It is essential, in forming a climate reconstruction, to 
carefully define a base period for modern conditions against which past 
conditions may be quantitatively compared. The consensus conclusion 
that late-20th century mean warmth likely exceeds that of any time 
during the past millennium for the Northern Hemisphere, is based on a 
careful comparison of temperatures during the most recent decades with 
reconstructions of past temperatures, taking into account the 
uncertainties in those reconstructions. As it is only the past few 
decades during which Northern Hemisphere temperatures have exceeded the 
bounds of natural variability, any analysis such as SB that compares 
past temperatures only to early or mid-20th century conditions, or 
interprets past temperatures using proxy information not capable of 
resolving decadal trends cannot address the issue of whether or not 
late-20th century warmth is anomalous in a long-term context.

                              CONCLUSIONS

    The concentration of greenhouse gases in the atmosphere is higher 
than at any time in at least the last 400,000 years, and, it 
increasingly now appears, probably many millions of years. This 
increase is undeniably due to the activity of human beings through 
fossil fuel burning. Late 20th century warming is unprecedented in 
modern climate history at hemispheric scales. This is almost certainly 
a result of the dramatic increase in greenhouse gas concentrations due 
human activity. The latest model-based projections indicate a global 
mean temperature increase of 0.6 to 2.2° C (1° to 4° F) relative to 
1990 levels by the mid-21st century. While these estimates are 
uncertain, even the lower value would take us well beyond any previous 
levels of warmth seen over at least the past couple millennia. The 
magnitude of warmth, but perhaps more importantly, the unprecedented 
rate of this warming, is cause for concern.

[GRAPHIC] [TIFF OMITTED] T2381.098

   Response by Michael Mann to Additional Questions by Senator Inhofe
    Question 1. You have used the term ``climate scientist'' to 
distinguish certain individuals. What, in your view, does it take for 
one to earn the title ``climate scientist''? What specific credentials, 
or the lack thereof, would lead you to refuse to recognize someone as a 
``climate scientist''?
    Response. The term ``climate scientist'' is used, in my experience, 
to describe an individual with specific training in oceanographic, 
atmospheric, and coupled ocean-atmosphere processes relevant to 
understanding climate variability and the behavior of the climate 
system. An individual might obtain this training through either an 
advanced degree in those areas of study, or through years of research 
in those areas associated with numerous publications in the peer-
reviewed climate literature such as ``Journal of Geophysical Research--
Atmospheres'', ``Journal of Geophysical Research-Oceans'', ``Climate 
Dynamics'', ``The Holocene'', ``Geophysical Research Letters'', 
``Paleoceanography'' (or publication of climate papers in leading 
international science journals such as ``Nature'' and ``Science''). I 
would not, for example, consider scientists with advanced degrees in 
Astronomy, Astrophysics, or Physics who have published primarily in 
those areas, as ``climate scientists''--nor do I believe would most of 
my colleagues in the climate research community. In addition to 
training and publishing in a field, leading scientists would normally 
be expected to be actively interacting and collaborating in studies 
with colleagues and ensuring their understanding of cutting edge 
science through attendance and active participation in meetings 
convened by the leading professional societies and organizations.

    Question 2. Your work and testimony contends that the Little Ice 
Age was not global, but restricted to only portions of Europe. A 
forthcoming article by Shindell et al. (Shindell, D.T. et al., 2003: 
Volcanic and solar forcing of climate change during the pre-industrial 
era. Journal of Climate, in press), however, indicates the Little Ice 
Age could have resulted from a combination of solar and volcanic 
forcing. Do you agree with these conclusions from Shindell et al.? If 
so, how can solar and volcanic forcings generate climatic effects that 
are not observed across the entire hemisphere?
    Response. The statement is incorrect. I never testified that the 
``Little Ice Age was . . . restricted to only portions of Europe''.
    It should first be noted that many paleoclimatologists have 
questioned the utility of terms such as ``Little Ice Age'' and 
``Medieval Warm Period'' which provide misleading descriptions of past 
climate changes in many regions. There is a complex pattern of climate 
variability in past centuries, and a lack of evidence for synchronous 
temperature variations worldwide [e.g. Bradley, R.S., 
and P.D. Jones, ``Little Ice Age'' summer temperature variations: their 
nature and relevance to recent global warming trends, The Holocene, 3, 
367-376, 1993; Hughes, M.K., and H.F. Diaz, Was there a `medieval warm 
period', and if so, where and when, Climatic Change, 26, 109-142, 
1994]. The cited paper by Shindell et al (2003), of which I am a co-
author, is fully consistent with such findings. The paper, rather than 
demonstrating globally uniform patterns of warming or cooling in past 
centuries, shows that surface temperature changes were dominated by 
regional overprints associated with the response of the ``North 
Atlantic Oscillation'' atmospheric circulation pattern to radiative 
forcing. This response leads to a pattern of cooling during the 17th/
18th centuries in certain regions (not just Europe, but many regions 
throughout the Northern Hemisphere extratropics) and warming in other 
regions. The paper shows that this pattern of warming and cooling 
closely resembles the pattern of surface temperature change during that 
interval reconstructed by Mann and colleagues (MBH98). It is worth 
noting, moreover, that the tropical Pacific seems to have been in a 
warmer, rather than a ``colder'' state, during the conventionally 
defined ``Little Ice Age'' [Cobb, K.M., Charles, C.D., Edwards, R.L., 
Cheng, H., & Kastner, M. El Nino-Southern Oscillation and tropical 
Pacific climate during the last millennium, Nature 424, 271-276 (2003)]. 
Climate dynamists understand the importance of such phenomena in 
understanding the highly variable pattern of surface temperature 
changes in past centuries, and rarely, if ever, argue for the existence 
of globally uniform or synchronous temperature change in past 
centuries. The response of the climate to solar and volcanic radiative 
forcing is known to involve dynamical responses associated with 
regionally differentiated temperature trends that overprint far smaller 
global mean responses. This contrasts strongly with the response of the 
climate to anthropogenic climate forcing, for which the integrated 
global mean radiative forcing is considerably greater, and the 
associated large-scale warming typically rises above the regional 
variability.

    Question 3. That same paper finds ``long-term regional response to 
solar forcing [that] greatly exceeds unforced variability . . . and 
produces climate anomalies similar to those seen during the Little Ice 
Age. Thus, long-term regional changes during the pre-industrial [era] 
appear to have been dominated by solar forcing.'' You further state that 
``For the few centuries prior to the industrial era, however, 
externally driven climate change is thought to have been forced 
primarily by only two factors: variation in solar output and volcanic 
eruptions . . . These forcings likely played a large role in the so-
called Medieval Warm Period (MWP) and Little Ice Age (LIA) epochs of 
the last millennium, which saw significant climate changes on at least 
regional scales . . .'' You then define ``regional'' ``to mean 
continental in scale . . .'' Do you claim that total solar irradiance 
change is the only solar forcing mechanism that has any significant 
climate effect? List your formal training in, plus courses you have 
taught, in solar physics. Do you agree with the paper's claim that the 
MWP and LIA exist on regional scales, in accordance with climate 
experts like R. Bryson and H.H. Lamb, starting with their work in the 
1960's, and recently updated in summary in Soon et al. (2003)?
    Response. Expertise in ``solar physics'' is not the expertise 
required to evaluate what is happening to the Earth's climate--what 
matters are the changes in solar radiation at the top of the atmosphere 
and then down through it. As is made clear in our paper, we are 
considering not only ``total solar irradiance'' but also its spectral 
distribution. Indeed, because much of the change in solar radiation 
occurs in UV wavelengths, induced changes in stratospheric ozone can 
lead to significant changes in atmospheric circulation in the 
troposphere. The model simulations indicate that such atmospheric 
circulation changes can, acting with other factors, lead to regional 
variations in the climate such as were observed over the last 
millennium. As a co-author of this paper, I of course, agree with its 
findings. However, the inference that this paper confirms the work of 
Soon et al. (2003) is very mistaken.
    With respect to my training and teaching, I would encourage that my 
Curriculum Vitae, which I have provided separately, be included in the 
record to be compared to those of the other witnesses with respect to 
relevant expertise and standing in the climate research community.

    Question 4. In your testimony, you stated that you hold the 
``mainstream'' view with respect to climate theory of air temperature 
trends over the past two millennia. Provide supporting citations in the 
refereed scientific literature that are not authored or co-authored by 
you or your colleagues, collaborators, students or former students, or 
associates (i.e., Phil Jones, Ray Bradley, Malcolm Hughes), where 
others hold this ``mainstream'' view.
    Response. The statement once again mischaracterizes my comments. As 
there is only one reconstruction of Northern Hemisphere annual mean 
temperature over the past two millennia, published only recently by 
Phil Jones and myself, it is hardly meaningful to discuss whether 
``other studies'' support this finding. The peer-review and publication 
process typically unfolds on timescales of a year or longer, not 
months. Any careful reading of my comments would reveal that I was not 
referring to this one specific reconstruction in the comments I made 
characterizing what I believe to be the mainstream viewpoint of the 
climate research community. This view, as discussed in my testimony, 
refers rather to the widespread evidence that late 20th century warmth 
is unprecedented in a long-term context, anywhere from the past several 
centuries to nearly the past two millennia, depending on the timeframe 
of the particular study.
    Second, the description of ``collaborators, associates, former 
students'', depending on how interpreted, is so broad a category as to 
include just about every leading scientist in the field. The following 
publications all come to the same conclusion that late 20th century 
Northern Hemisphere warmth is anomalous in a long-term context:
     Bauer, E., Claussen, M., Brovkin, V., Assessing climate 
forcings of the earth system for the past millennium, Geophys. Res. 
Lett., 30 (6), 1276, doi: 10.1029/2002GL016639, 2003.
     Bertrand C., Loutre M.F., Crucifix M., Berger A., Climate 
of the Last millennium: a sensitivity study. Tellus, 54(A), 221-244, 
2002.
     Bradley, R.S., and P.D. Jones, ``Little Ice Age'' summer 
temperature variations: their nature and relevance to recent global 
warming trends, The Holocene, 3 (4), 367-376, 1993.
     Bradley, R.S., Briffa, K.R., Crowley, T.J., Hughes, M.K., 
Jones, P.D, Mann, M.E., Scope of Medieval Warming, Science, 292, 2011-
2012, 2001.
      Bradley, R.S., M.K. Hughes and H.F. Diaz, Climate in 
Medieval Time. Science, 302, 404-405, 2003.
     Briffa, K.R., T.J. Osborn, F.H. Schweingruber, I.C. 
Harris, P.D. Jones, S.G. Shiyatov, S.G. and E.A. Vaganov, Low-frequency 
temperature variations from a northern tree-ring density network. J. 
Geophys. Res., 106, 2929-2941, 2001.
     Crowley, T.J., Causes of Climate Change Over the Past 1000 
Years, Science, 289, 270-277, 2000.
     Crowley, T.J., and T. Lowery, How Warm Was the Medieval 
Warm Period, Ambio, 29, 51-54, 2000.
     Gerber, S., F. Joos, P. Brugger, T. F. Stocker, M. E. 
Mann, S. Sitch, and M. Scholze, Constraining temperature variations 
over the last millennium by comparing simulated and observed 
atmospheric CO2, Climate Dynamics, 20, 281-299, 2003.
     Hegerl, G.C., T.J. Crowley, S.K. Baum, K-Y. Kim, and W. T. 
Hyde, Detection of volcanic, solar and greenhouse gas signals in paleo-
reconstructions of Northern Hemispheric temperature. Geophys. Res. 
Lett., 30 (5), doi: 10.1029/2002GL016635, 2003.
     Huang, S., H. N.Pollack and P.-Y. Shen, Temperature Trends 
Over the Past Five Centuries Reconstructed from Borehole Temperature, 
Nature 403, 756-758, 2000.
     Jones, P.D., M. New, D.E. Parker, S. Martin, and I.G. 
Rigor, 1999: Surface air temperature and its changes over the past 150 
years. Reviews of Geophysics 37, 173-199.
     Jones, P.D., T.J. Osborn, and K.B. Briffa, The Evolution 
of Climate Over the Last Millennium, Science, 292, 662-667, 2001.
     Overpeck, J., K. Hughen, D. Hardy, R. Bradley, R. Case, M. 
Douglas, B. Finney, K. Gajewski, G. Jacoby, A. Jennings, S. Lamoureux, 
A. Lasca, G.M.J. Moore, M. Retelle, S. Smith, A. Wolfe, and G. 
Zielinski, Arctic Environmental Change of the Last Four Centuries, 
Science, 278, 1251-1256, 1997.
     Pollack, H.N., S. Huang, and P.-Y. Shen, Climate Change 
Record in Subsurface Temperatures: A Global Perspective, Science, 282, 
279-281, 1998.

    Question 5. Your work has been characterized as ``global'' in 
several venues, including the National Assessment. Is that a fair 
characterization, or are those sources confused by your use of Northern 
and Southern Hemisphere proxies in your Northern Hemisphere 
reconstruction? Can you explain why the National Assessment did not 
include error bars on your temperature reconstruction?
    Response. The proxy records on which our work is based represent 
conditions over much of the Northern Hemisphere and a small fraction of 
the Southern Hemisphere. While in any given year there can be some 
difference in the anomalies in the two hemispheres, the instrumental 
record indicates that over periods of a few decades or more, the 
anomalies in the two hemispheres are quite similar because of the 
thermodynamic and dynamic coupling between them. Thus, the major 
features of the temperature record, and in particular the unusual 20th 
century warming, are similar in the two hemispheres and thus global 
features. It was this aspect of the record to which the text of the 
National Assessment report refers in presenting the overall 
significance of our study, and the report is correct in suggesting that 
the 20th century warming is global in nature. The caption for Figure 2 
of Chapter 1 in the Foundation report (page 22 and page 544) states 
that ``Although this record comes mostly from the Northern Hemisphere, 
it is likely to be a good approximation to the global anomaly based on 
comparisons of recent patterns of temperature fluctuations.'' This 
accurately reflects the situation. In the Overview report (page 13), 
although the figure title says ``Global CO2 and Temperature 
Change,'' the caption next to the figure says ``Records of Northern 
Hemisphere surface temperatures, CO2 concentrations, and 
carbon emissions show a close correlation. Temperature change: 
reconstruction of annual-average Northern Hemisphere surface air 
temperatures derived from historical records, tree rings, and corals 
(blue), and air temperatures directly measured (purple).'' This quite 
clearly makes the point this is mainly a Northern Hemisphere 
temperature record.
    With respect to the question about the presentation of the figure, 
it is misleading to imply that the term ``error bars'' indicates that 
the central line is off by this amount--rather the limits mean that 
there is only 1 chance in 20 that the actual value is outside this 
range. That is, what we are showing is the likely range within which 
the anomaly lies, with there being a 95 percent chance the value is 
within this range. The line that we present in many of our figures, and 
that was presented in the National Assessment report, is the most 
likely value within this range (a rather natural choice to display in 
explaining a complex issue to the public). In looking at the National 
Assessment report, the caption for Figure 2 of Chapter 1 in the 
Foundation report (page 22 and page 544) states that ``The error bars 
for the estimate of the annual-average anomaly increase somewhat going 
back in time, with one standard deviation being about 0.25° F (0.15° 
C).'' Quite clearly, the reader interested in investigating the 
accuracy of the records would follow up by reading the original 
reference, which is cited in the text.
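
    As a small illustrative aside (assuming an approximately normal 
error distribution, an assumption made only for this sketch), the 
relation between the quoted standard deviation and the 95 percent 
range can be checked with a few lines of Python:

    from scipy import stats

    sigma_c = 0.15                                 # one standard deviation, in deg C, as quoted above
    half_width = stats.norm.ppf(0.975) * sigma_c   # about 1.96 standard deviations
    print(f"95 percent range: +/- {half_width:.2f} deg C about the most likely value")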
    In that I was not involved in the National Assessment report, the 
questioner should consult the authors of that report for any further 
information or questions.

    Question 6. You testified that the late 20th century warming is 
likely caused by man-made CO2 forcing on climate; what is 
your scientific proof for that claim? Please detail how you removed the 
potential effects from other factors, including those of sulfate 
aerosols, tropospheric and stratospheric ozone, volcanic dust veiling, 
black soot, solar particle and wavelength-dependent variability, sea 
ice, land use, vegetation, and other greenhouse gases?
    Response. The question inconsistently equates my statement of a 
``likely'' causal relationship with the standard of ``scientific 
proof''. Scientists do not speak in terms of ``proof''. We speak in 
terms of likelihoods and the strength of evidence in support of a 
particular hypothesis.
    A large number of peer-reviewed scientific studies have been 
published in the leading scientific journals such as Nature and Science 
in the past two decades elucidating the role of natural and 
anthropogenic factors in observed climate changes. Physically based 
models have been developed and validated against observations, and 
these models reproduce complex climate phenomena such as El Nino. These 
same models have been driven with the primary ``external'' factors that 
are believed to govern climate variations on timescales of decades and 
centuries. These external factors include natural factors, such as the 
modest estimated variations in radiative output of the Sun, which 
varies by a fraction of a percent over time, variations in the 
frequency and intensity of explosive volcanic eruptions, which have a 
several-year cooling effect on the climate through the injection of 
reflective volcanic aerosols into the stratosphere, and very small 
changes in the Earth's orbit relative to the Sun that occur on multi-
century timescales. These external factors also include the 
``anthropogenic'' influences of increased greenhouse gas concentrations 
due to fossil fuel burning, changes in the reflective properties of the 
land surface due to human land use alterations, and the regional 
cooling effect of anthropogenic sulphate aerosols in certain industrial 
regions. When driven with these factors, these climate models have 
demonstrated a striking ability to reproduce observed global and 
hemispheric temperature trends during the 20th century, as well as 
longer-term trends in past centuries as reconstructed from proxy data. 
Such results have been demonstrated in the following peer-reviewed 
scientific articles:
     Wigley, T.M.L., R.L. Smith, and B.D. Santer, Anthropogenic 
Influence on the Autocorrelation Structure of Hemispheric-Mean 
Temperatures, Science, 282, 1676-1680, 1998.
      Tett, S.F.B., P.A. Stott, M.R. Allen, W.J. Ingram, and 
J.F.B. Mitchell, Causes of Twentieth-Century Temperature Change Near 
the Earth's Surface, Nature, 399, 569-572, 1999.
      Hegerl, G.C., P.A. Stott, M.R. Allen, J.F.B. Mitchell, 
S.F.B. Tett, and U. Cubasch, Optimal detection and attribution of 
climate change: sensitivity of results to climate model differences, 
Climate Dynamics, 16, 737-754, 2000.
     Crowley, T.J., Causes of Climate Change Over the Past 1000 
Years, Science, 289, 270-277, 2000.
     Stott, P.A., S.F.B. Tett, G.S. Jones, M.R. Allen, J.F.B. 
Mitchell, and G.J. Jenkins, External Control of 20th Century 
Temperature by Natural and Anthropogenic Forcings, Science, 290, 2133-
2137, 2001.
     Stott, P.A., S.F.B. Tett, G.S. Jones, M.R. Allen, W.J. 
Ingram, and J.F.B. Mitchell, Attribution of twentieth century 
temperature change to natural and anthropogenic causes, Climate 
Dynamics, 17, 11-21, 2001.
    These conclusions, furthermore, were endorsed by the 2001 IPCC 
scientific working group report (Chapter 12), and the followup National 
Academy of Sciences report that endorsed most of the key IPCC 
conclusions.

    Question 7. A number of expert studies have produced individual 
proxy records that show the existence of a local Medieval Warm Period 
or Little Ice Age. Such studies cover a large portion of the globe. How 
do you reconcile your hemispheric reconstruction with these individual 
proxy records?
    Response. It is unclear to me what precisely the questioner means 
by ``a number of expert studies'' or how he defines the ``existence'' 
of a ``Medieval Warm Period'' or ``Little Ice Age''. As discussed in my 
response to question 2, the regionally and temporally variable nature 
of climate changes in past centuries makes such descriptors of past 
climate change naive and often useless as a characterization of past 
changes. A sampling of some of the longest, highest-quality long-term 
proxy temperature estimates over the globe was provided in Figure 2 of 
the article: Mann, M.E., Ammann, C.M., Bradley, R.S., Briffa, K.R., 
Crowley, T.J., Hughes, M.K., Jones, P.D., Oppenheimer, M., Osborn, 
T.J., Overpeck, J. T., Rutherford, S., Trenberth, K.E., Wigley, T.M.L., 
On Past Temperatures and Anomalous Late 20th Century Warmth, Eos, 84, 
256-258, 2003. This figure demonstrates the lack of evidence for any 
periods in earlier centuries that are comparable in terms of evidence 
for synchronous warmth to the late 20th century. This same conclusion 
was also demonstrated by the recent article in Science: Bradley, R.S., 
M.K. Hughes and H.F. Diaz., Climate in Medieval Time. Science, 302, 
404-405, 2003.

    Question 8. Do you claim 22 proxies to be a sufficient sample of 
observations for reconstructing a Northern Hemisphere temperature? If 
not, why did you consider it sufficient for the 1400-1450 interval in 
your 1998 Nature paper? If you do, there are 29 proxies that continue 
to 1984 in the data base you used for your 1998 paper. Why then did you 
terminate your temperature reconstruction at 1980? What efforts have 
you made to extend the proxy re-constructions up to the present?
    Response. The question is wrongly premised. The Mann et al (1998) 
study made use of almost 100 proxy series over the interval AD 1400-
1450. The question appears to confuse the number of proxy series that 
was used, with the number of statistical indicators that were used to 
represent these proxy data. For example, the 70 series from the 
North American International Tree Ring Data Bank that date back to 1400 
were represented in terms of their leading patterns of variance through 
a procedure known as ``Principal Component Analysis''. These patterns 
represented, however, a much larger number of underlying data. Most of 
the proxy records used in that analysis ended by 1980, limiting the 
upper end of the usable calibration period. A more recent paper 
(in press) extends proxy-based hemispheric temperature reconstructions 
through the mid-1990's, demonstrating the ability of the reconstruction 
to capture the accelerated warming evident in the instrumental record 
since 1980.
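
    For readers unfamiliar with the distinction drawn above between 
proxy series and the statistical indicators that represent them, the 
following minimal sketch (using synthetic random numbers in place of 
real tree-ring data) illustrates how Principal Component Analysis 
summarizes many series by a few leading patterns of variance; it is an 
illustration only, not the MBH98 procedure itself.

    import numpy as np

    # Synthetic stand-in for a network of 70 annually resolved proxy series.
    rng = np.random.default_rng(1)
    n_years, n_series = 581, 70
    proxies = rng.normal(size=(n_years, n_series))

    # Principal Component Analysis via the singular value decomposition.
    anomalies = proxies - proxies.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = (s ** 2) / np.sum(s ** 2)

    # A handful of leading patterns (indicators) represents the full network.
    n_keep = 3
    leading_pcs = u[:, :n_keep] * s[:n_keep]
    print(f"variance captured by {n_keep} patterns: {explained[:n_keep].sum():.1%}")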

    Question 9. What are the patterns of temperature change in all 
proxies after 1980?
    Response. It is unclear what is meant by the question. Every proxy 
series which extends past 1980 exhibits its own particular pattern. A 
recent paper (in press), as referred to in Question No. 8, demonstrates 
that a composite of proxy temperature indicators with reliable low-
frequency variability that are available through the mid-1990's captures 
the accelerated warming after 1980.
    All of the data used in our study have been available since July 
2002 on the public ftp site: ftp://holocene.evsc.virginia.edu/pub/
MBH98/.

    Question 10. Do you have any external (not derived by you) method 
or data to provide verification of your temperature reconstruction? 
Please explain.
    Response. We used the method of cross-validation to independently 
demonstrate the statistical reliability of our reconstructions. This is 
detailed in MBH98 and MBH99. We did not derive the method of cross-
validation--it is a well established statistical procedure, detailed in 
many introductory level statistics textbooks.
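
    A minimal sketch of the idea, using synthetic data and a single 
proxy for simplicity, is given below: a statistical model is calibrated 
on one part of the instrumental period and its skill is then measured 
on data withheld from the calibration. This illustrates cross-validation 
in general; it is not a reproduction of the MBH98/MBH99 procedure.

    import numpy as np

    # Synthetic "instrumental" temperature series and a noisy proxy of it.
    rng = np.random.default_rng(2)
    n_years = 120
    temperature = np.cumsum(rng.normal(0.0, 0.1, n_years))
    proxy = temperature + rng.normal(0.0, 0.3, n_years)

    # Calibrate on the first 80 years, verify on the withheld 40 years.
    calib, verif = slice(0, 80), slice(80, n_years)
    slope, intercept = np.polyfit(proxy[calib], temperature[calib], 1)
    estimate = slope * proxy[verif] + intercept

    # Reduction-of-error (RE) skill score over the verification interval.
    resid = temperature[verif] - estimate
    re_score = 1.0 - np.sum(resid ** 2) / np.sum(
        (temperature[verif] - temperature[calib].mean()) ** 2)
    print(f"verification RE: {re_score:.2f} (positive values indicate skill)")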

    Question 11. Are you aware of any errors in your data compilation 
for MBH98 or MBH99? If so, what are they?
    Response. We are not aware of any errors. We are, however, aware of 
recent spurious claims of such errors by the authors of an article 
published in the social science journal ``Energy and Environment''. 
These claims have already been widely discredited by a cursory analysis 
of the paper, and a manuscript detailing the numerous fundamental 
errors made in the Energy and Environment paper has been submitted to 
the peer-reviewed literature. We would be happy to provide a copy of 
the paper to be made part of the official Senate record once it is 
published.

    Question 12. Are you aware of any errors in any calculations that 
you made in MBH98 or MBH99? If so, what are the errors?
    Response. See response to question 11.

    Question 13. Vegetation grows as a result of a number of factors, 
including energy input, moisture supply, fire frequencies, and species 
competition. Do you claim it is possible to accurately remove the 
effects of these factors from your tree ring proxy datasets to produce 
a resulting time series that represents fluctuations in air 
temperature only? What is the magnitude of the error introduced in 
developing a procedure to remove these other effects? Please detail the 
analyses and list peer-reviewed works that specifically outline 
techniques to remove the effect of these other indicators for inferring 
past temperatures.
    Response. One of the co-authors of MBH98 (Malcolm Hughes) is among 
the world's foremost experts in dendroclimatology, so the team of MBH98 
hardly needs to be informed of the processes that influence tree 
growth. The method of MBH98 does not ``remove'' various factors from 
tree ring proxy information (which would be a most unwise approach!) 
but, rather, uses multivariate statistical methods similar to those 
commonly used in climate and paleoclimate field reconstruction [see 
e.g. Cook, E.R., K.R. Briffa, and P.D. Jones, Spatial Regression 
Methods in Dendroclimatology: A Review and Comparison of Two 
Techniques, International Journal of Climatology, 14, 379-402, 1994; 
Smith, T.M., R.W. Reynolds, R.E. Livezey, and D.C. Stokes, 
Reconstruction of Historical Sea Surface Temperatures Using Empirical 
Orthogonal Functions, Journal of Climate, 9, 1403-1420, 1996; Kaplan, 
A., Y. Kushnir, M.A. Cane, and M.B. Blumenthal, Reduced space optimal 
analysis for historical data sets: 136 years of Atlantic sea surface 
temperatures, Journal of Geophysical Research, 102, 27835-27860, 1997] 
to separate the information in the data that can meaningfully be 
related to surface temperature variations from that related to other 
influences.
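
    As an illustration (under the stated assumptions of synthetic data 
and a simple least-squares calibration, rather than the specific 
multivariate method of MBH98), the sketch below shows how several noisy 
proxy series can be related statistically to a temperature index, so 
that the temperature-related part of their variance is extracted 
without explicitly ``removing'' individual growth factors:

    import numpy as np

    # Synthetic temperature index and proxies that mix it with other influences.
    rng = np.random.default_rng(3)
    n_years, n_proxies = 100, 5
    temperature = np.cumsum(rng.normal(0.0, 0.1, n_years))
    weights = rng.uniform(0.5, 1.5, n_proxies)
    proxies = temperature[:, None] * weights + rng.normal(0.0, 0.5, (n_years, n_proxies))

    # Multivariate least-squares calibration against the temperature index.
    design = np.column_stack([proxies, np.ones(n_years)])
    coeffs, *_ = np.linalg.lstsq(design, temperature, rcond=None)
    reconstruction = design @ coeffs

    corr = np.corrcoef(reconstruction, temperature)[0, 1]
    print(f"calibration correlation with the target index: {corr:.2f}")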

    Question 14. Define the difference between variability and error in 
a statistical analysis. In EOF analyses, is the variation of the first 
principal component indicative of the uncertainty associated with the 
data? Why or why not?
    Response. Variability in an estimated quantity can be thought of as 
representing both `signal' (the physical quantity one is interested in) 
and `noise' (everything else). The definition of noise and signal 
depends on a number of assumptions regarding the nature of the process 
that generated the times series of interest and the specification of 
the statistical model for the data in question. Uncertainty, which is 
associated with the partitioning of data variance into ``noise and 
signal'', as defined above, depends on such detailed considerations. 
There are no general statistical principles that I am familiar with 
that relate uncertainty, thusly defined, to the first, or any other, 
principal component of a dataset containing both signal and noise 
contributions. Uncertainty is typically diagnosed by the analysis of 
residual variance from a statistical model based on a combined 
calibration/cross-validation procedure. Introductory textbooks such as 
``Statistical Methods in the Atmospheric Sciences'' (D. Wilks, Academic 
Press) deal with this topic in detail.
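    A minimal illustrative sketch of such a combined calibration/cross-
validation diagnosis, using hypothetical data and a deliberately simple 
linear model rather than the actual MBH98/MBH99 procedure, is the 
following (Python):

# Illustrative sketch only: diagnosing uncertainty from calibration and
# cross-validation (verification) residuals with a simple linear model.
# The data, split year, and model are hypothetical; this is not the
# MBH98/MBH99 procedure.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1902, 1981)
signal = 0.005 * (years - years.mean())            # hypothetical "signal"
proxy = signal + rng.normal(0.0, 0.1, years.size)  # proxy = signal + noise

calib = years < 1960                               # calibration subset
verif = ~calib                                     # withheld verification subset

slope, intercept = np.polyfit(proxy[calib], signal[calib], 1)
estimate = slope * proxy + intercept

# Uncertainty is diagnosed from residual variance -- ideally from the
# independent verification sample -- not from the variance of any
# principal component of the input data.
calib_resid_var = np.var(signal[calib] - estimate[calib])
verif_resid_var = np.var(signal[verif] - estimate[verif])
print("calibration residual variance :", calib_resid_var)
print("verification residual variance:", verif_resid_var)
print("approx. 2-sigma uncertainty   :", 2 * np.sqrt(verif_resid_var))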

    Question 15. Specifically, how do you construct regional patterns 
of temperature changes in past centuries when data are limited, either 
spatially, temporally, or both?
    Response. Our methods are described in detail in the following 
peer-reviewed scientific publications, which I would like to have made 
part of the official Senate record:
     Mann, M.E., Jones, P.D., Global Surface Temperatures over 
the Past two Millennia, Geophysical Research Letters, 30 (15), 1820, 
10.1029/2003GL017814, 2003.
     D'Arrigo, R.D., Cook, E.R., Mann, M.E., Jacoby, G.C., 
Tree-ring reconstructions of temperature and sea level pressure 
variability associated with the warm-season Arctic Oscillation since AD 
1650, Geophysical Research Letters, 30 (11), 1549, doi: 10.1029/
2003GL017250, 2003.
     Mann, M.E., Rutherford, S., Bradley, R.S., Hughes, M.K., 
Keimig, F.T., Optimal Surface Temperature Reconstructions Using 
Terrestrial Borehole Data, Journal of Geophysical Research, 108 (D7), 
4203, doi: 10.1029/2002JD002532, 2003.
     Rutherford, S., Mann, M.E., Delworth, T.L., Stouffer, R., 
Climate Field Reconstruction Under Stationary and Nonstationary 
Forcing, Journal of Climate, 16, 462-479, 2003.
     Mann, M.E., Large-scale climate variability and 
connections with the Middle East in past centuries, Climatic Change, 
55, 287-314, 2002.
     Cook, E.R., D'Arrigo, R.D., Mann, M.E., A Well-Verified, 
Multi-Proxy Reconstruction of the Winter North Atlantic Oscillation 
Since AD 1400, Journal of Climate, 15, 1754-1764, 2002.
     Mann, M.E., Rutherford, S., Climate Reconstruction Using 
`Pseudoproxies', Geophysical Research Letters, 29 (10), 1501, doi: 
10.1029/2001GL014554, 2002.
     Mann, M.E., Large-scale Temperature Patterns in Past 
Centuries: Implications for North American Climate Change, Human and 
Ecological Risk Assessment, 7, 1247-1254, 2001.
     Mann, M.E., Climate During the Past Millennium, Weather 
(invited contribution), 56, 91-101, 2001.
     Cullen, H., D'Arrigo, R., Cook, E., Mann, M.E., 
Multiproxy-based reconstructions of the North Atlantic Oscillation over 
the past three centuries, Paleoceanography, 15, 27-39, 2001.
     Mann, M.E., Gille, E., Bradley, R.S., Hughes, M.K., 
Overpeck, J.T., Keimig, F.T., Gross, W., Global Temperature Patterns in 
Past Centuries: An interactive presentation, Earth Interactions, 4-4, 
1-29, 2000.
     Delworth, T.L., Mann, M.E., Observed and Simulated 
Multidecadal Variability in the Northern Hemisphere, Climate Dynamics 
16, 661-676, 2000.
     Mann, M.E., Bradley, R.S. and Hughes, M.K., Northern 
Hemisphere Temperatures During the Past Millennium: Inferences, 
Uncertainties, and Limitations, Geophysical Research Letters, 26, 759-
762, 1999.
     Mann, M.E., Bradley, R.S., and Hughes, M.K., Global-Scale 
Temperature Patterns and Climate Forcing Over the Past Six Centuries, 
Nature, 392, 779-787, 1998.

    Question 16. Do you claim that the instrumental temperature record 
is known without error? If not, what error and uncertainty would you 
associate with the annual Northern Hemisphere averaged air temperature 
for 1900? For 1950? For 2000? How were these estimates incorporated 
into your analysis?
    Response. The claim made by Dr. Legates in his testimony that we 
present the instrumental record without uncertainty is incorrect. If 
Legates, for example, were familiar with studies of the instrumental 
surface temperature record, he would understand that the uncertainties 
in this record during the 20th century are small compared to the 
uncertainties shown for our reconstruction [see e.g. Figure 2.1b in 
Folland, C.K., Karl, T.R., Christy, J.R., Clarke, R. A., Gruza, G.V., 
Jouzel, J., Mann, M.E., Oerlemans, J., Salinger, M.J., Wang, S.-W., 
Observed Climate Variability and Change, in Climate Change 2001: The 
Scientific Basis, Houghton, J.T., et al. (eds.), Cambridge Univ. Press, 
Cambridge, 99-181, 2001.]. Furthermore, all scientists with a proper 
training in statistics know that uncertainties add ``in quadrature''. 
In other words, you have to square them before adding them. This means 
that the relatively small uncertainty in the instrumental record makes 
a relatively small contribution to the total uncertainty. Legates 
claimed in his testimony that including the uncertainty in the 
instrumental record, which he estimates as 0.1 °C, would change the 
conclusions expressed by us and other mainstream climate scientists 
that the 1990's are the warmest decade in at least the past 1000 years 
within estimated uncertainties. This claim is very misleading for 
several reasons. First, the standard error in Northern Hemisphere mean 
annual temperatures during the 1990's is far smaller than the amount 
cited by Legates [see again Folland et al., 2001, cited above]. Even more 
problematic, however, Legates' claim indicates a fundamental 
misunderstanding of the statistical concepts of standard error and 
uncertainty. The shaded region shown along with the Mann et al 
reconstruction (and other similar plots shown in recent articles such 
as the aforementioned ``Eos'' article, and the IPCC report) indicates 
two standard error intervals. The decade of the 1990's is roughly two 
standard errors warmer (i.e., about 0.4 °C) than any decade prior to 
the 20th century in the reconstruction. Based on a one-sided test for 
anomalous warmth, this translates to a roughly 97.5 percent level of 
significance. Modifying the uncertainties to include the small 
additional contribution due to uncertainties in the instrumental record 
itself would modify this only slightly, and would not lower the 
significance level below the 95 percent level. Though there is no such 
thing as an absolute estimate of uncertainty, despite Legates' 
implications to the contrary, a 95 percent confidence level is often adopted 
as an appropriate criterion for significance. Legates' statement that 
including instrumental contributions to the uncertainty would change 
the conclusions is thus clearly false.
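    For concreteness, adding two hypothetical, purely illustrative 
uncertainties in quadrature (the 0.20 °C and 0.05 °C values below are 
chosen only to show the arithmetic, and are not the published 
uncertainty estimates):

    \sigma_{total} = \sqrt{\sigma_{recon}^{2} + \sigma_{instr}^{2}}
                   = \sqrt{(0.20)^{2} + (0.05)^{2}}
                   \approx 0.206\ ^{\circ}\mathrm{C},

i.e., an instrumental uncertainty one quarter the size of the 
reconstruction uncertainty enlarges the combined uncertainty by only 
about 3 percent.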

    Question 17. Assuming a proxy record extended back to 1000 A.D., 
what specifically would be required to disqualify this proxy record 
from your analyses? Provide supporting evidence where others have 
disqualified such records from temperature analyses on these criteria.
    Response. It is unclear what type of ``analyses'' are being 
referred to here. I have used a variety of different statistical 
methods and data in various published studies describing paleoclimate 
reconstruction, so the question as worded is ambiguous. In any 
case, our approaches do not ``disqualify'' proxy data. They use 
objective statistical criteria to evaluate the strength of the signal 
available for reconstruction of the particular climate field or index 
(be it related to surface temperature, atmospheric circulation, 
drought, or other variables). Such statistical 
approaches, and the related approaches used by other climate 
researchers, are described in the various publications listed above in 
the response to Questions No. 13 and 15.

    Question 18. Have you made available via FTP the coefficients 
developed to relate proxies to principal components? If not, would you 
make those coefficients available at NGDC/paleo?
    Response. The question is based on two false premises. The first 
involves a naive view of what is required and expected of scientific 
researchers. It is unprecedented in my experience for any scientist to 
post in the public domain every single computational aspect of a 
complicated analysis. The methods of our study were adequately 
described in our paper and supplementary information, and the data used 
were made available in the public domain. Indeed, we made far more of 
our results, data, and methodological details available in the public 
domain than is provided in most similar scientific studies. The 
scientific funding agencies (DOE, NSF, and NOAA) would have informed us 
if we had not followed the appropriate protocols in the provision of 
data and results.
    The second false premise is a technical one. A proper understanding 
of the methodology employed by MBH98 would reveal that there is no one 
fixed set of ``coefficients'' that relate a particular proxy record to 
a particular principal component. The relationship is determined based 
on a time-dependent inverse problem for which the weights on different 
records are not fixed over time, as described in our published 
articles.

    Question 19. Have you made available via FTP any specialized 
computer studies, such as Matlab scripts, in connection with your 
temperature reconstruction? If not, would you make any such scripts 
used in developing the temperature reconstructions in MBH98 and MBH99 
available through NGDC/paleo?
    Response. The methodologies have been described, and other climate 
researchers have independently, successfully implemented the 
methodology, e.g.: Zorita, E., F. Gonzalez-Rouco, and S. Legutke, 
Testing the Mann et al. (1998) Approach to Paleoclimate Reconstructions 
in the Context of a 1000-Yr Control Simulation with the ECHO-G Coupled 
Climate Model, Journal of Climate, 16, 1378-1390, 2003.

    Question 20. Do you claim your method of reconstructing past 
temperature from proxies is the only correct one? If not, please submit 
some published papers that use methods you consider to be correct as 
well. If you do consider yours the only correct method, can you provide 
a list of names of scientists whom you have contacted to tell them they 
are using the wrong methods in their work?
    Response. The question is based on the false premise that my 
colleagues and I use any one particular ``method'' of reconstructing 
past temperatures from proxy data. In fact, I have published on the 
application of at least five fundamentally independent methods for 
using proxy data to reconstruct past climate patterns in the peer-
reviewed literature. Examples of the applications of different methods 
can be found in the following peer-reviewed scientific publications:
     Zhang, Z., Mann, M.E., Cook, E.R., Alternative Methods of 
Proxy-Based Climate Field Reconstruction: Application to the 
Reconstruction of Summer Drought Over the Conterminous United States 
back to 1700 From Drought-Sensitive Tree Ring Data, Holocene, in press, 
2003.
     Mann, M.E., Jones, P.D., Global Surface Temperatures over 
the Past two Millennia, Geophysical Research Letters, 30 (15), 1820, 
10.1029/2003GL017814, 2003.
     Mann, M.E., Rutherford, S., Bradley, R.S., Hughes, M.K., 
Keimig, F.T., Optimal Surface Temperature Reconstructions Using 
Terrestrial Borehole Data, Journal of Geophysical Research, 108 (D7), 
4203, doi: 10.1029/2002JD002532, 2003.
     D'Arrigo, R.D., Cook, E.R., Mann, M.E., Jacoby, G.C., 
Tree-ring reconstructions of temperature and sea level pressure 
variability associated with the warm-season Arctic Oscillation since AD 
1650, Geophysical Research Letters, 30 (11), 1549, doi: 10.1029/
2003GL017250, 2003.
     Mann, M.E., Bradley, R.S., and Hughes, M.K., Global-Scale 
Temperature Patterns and Climate Forcing Over the Past Six Centuries, 
Nature, 392, 779-787, 1998.
    On occasion, there are approaches used that are not adequate. For 
example, the approach of simply counting papers and not properly 
defining what constitutes an anomaly, as was the case for the paper by 
Soon et al. (2003), is most decidedly not adequate. Also, the analysis 
approach used by McIntyre and McKitrick (2003), in which the authors 
attempted to reproduce the results of the previous study of MBH98 based 
on an analysis which used neither the same data (the authors eliminated 
the majority of data used by MBH98 for the first two centuries of the 
reconstruction) nor the same method as the original authors, was woefully 
inadequate. In fact, this latter study was described as ``seriously 
flawed'' and ``silly'' in a recent article in USA Today (``Global 
Warming Debate Heats Up Capitol Hill'', 11/19/03). When deeply flawed 
studies such as this are published, I am interested in determining what 
errors have been made and, if necessary as in this latter case, 
promptly submitting a rebuttal to the peer-reviewed scientific 
literature to ensure that the scientific community is not misled by the 
use of inadequate approaches. To my knowledge, I am not considered to 
be shy in offering criticism where criticism is due.

    Question 21. If there are other acceptable methods, did you try any 
of them on your data set prior to its publication to see what the 
results would be? If so would you please submit the results. If not, 
have you done so since? Why do you claim your multi-proxy results 
represent a ``robust consensus,'' as you said in your Eos publication, 
if you have not verified that its results would also be obtained using 
other acceptable methods?
    Response. As demonstrated both in the Eos article, and the various 
references provided in my response to Question 4, about a dozen 
different recent estimates based on a variety of data and approaches, 
published by different groups, yield statistically indistinguishable 
histories of Northern Hemisphere mean temperature changes in past 
centuries. I define such a result as characterizing a ``consensus''.

    Question 22. Did you at any time prior to publication compute the 
analysis up to 1984 or later? What were the results? If you did not, 
even though you had sufficient data, why not? If you did but you did 
not use those results, explain why. If the results were different, 
where did you publish a discussion of those differences? If they were 
the same, why did you delete them? Why, in other words, did you throw 
out data for the period of maximum interest?
    Response. Most of the proxy records used in MBH98 and MBH99 ended 
by 1980, limiting the upper end of the usable calibration period. A 
more recent paper (in press) extends proxy-based hemispheric 
temperature reconstructions through the mid-1990's, demonstrating the 
ability of the reconstruction to capture the accelerated warming 
evident in the instrumental record since 1980. We would be happy to 
provide a copy of this paper to be made part of the official Senate 
record when it is formally published.

    Question 23. On your web site http://www.ngdc.noaa.gov/paleo/ei/
data--supp.html where you explain the assembling of the data base for 
your 1980 paper you say: ``Small gaps have been interpolated. If 
records terminate slightly before the end of the 1902-1980 training 
interval, they are extended by persistence to 1980.'' Does this mean 
you made up some observations to fill in blank spots in the data 
records? Have you ever provided a complete public listing of all the 
data you made up? Please provide such a listing now. Of the 112 
proxies, in how many of them did you fill gaps? Why in some of them but 
not others? What is the longest interval of time over which you filled 
in missing observations?
    Response. Extension of missing values by `persistence' of the final 
available value is a typical statistical approach to estimating small 
amounts of unavailable data at the end of a time series (see e.g. the 
textbook by Wilks, referred to in the response to Question No. 14). The 
fact that this approach was used to infill a modest number of missing 
observations between 1972 and 1980 was described in the Nature 
supplementary information. All of the data used in our study have been 
available since July 2002 on the public ftp site: ftp://
holocene.evsc.virginia.edu/pub/MBH98/.
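    A minimal sketch of what `persistence' infilling means in practice 
(the record, dates, and gap below are hypothetical and are not actual 
MBH98 proxy data):

# Illustrative sketch of `persistence' infilling: a short gap at the end
# of a record is filled with the last available value.  The record and
# dates below are hypothetical, not actual MBH98 proxy data.
import numpy as np

years = np.arange(1902, 1981)                   # 1902-1980 training interval
proxy = np.full(years.size, np.nan)
rng = np.random.default_rng(1)
proxy[:-6] = rng.normal(size=years.size - 6)    # record ends in 1974 here

last = np.flatnonzero(~np.isnan(proxy))[-1]     # index of final available value
proxy[last + 1:] = proxy[last]                  # extend by persistence to 1980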

    Question 24. What was the effect on your results of filling in the 
missing data? Did you run your analysis without it? Please submit the 
results when the filled-in data are dropped from the analysis. If it 
changes your results, where is that discussed? If it makes no 
difference, why did you do it?
    Response. The use of infilled data has essentially no effect on the 
reconstruction, as demonstrated by the fact that the same result is 
achieved if a 1902-1971 calibration period (which predates the use of 
any infilled proxy data) is used instead of a 1902-1980 calibration 
period. It is advisable to use the full 1902-1980 calibration interval, 
however, because the increased statistical constraint provided by the 
lengthening of the calibration period more than offsets the impact of 
the use of a modest amount of in-filled data in a small number of 
series.

    Question 25.  Do you agree that statistical methods based on linear 
extrapolation from data representing the far extreme of the line are 
associated with an added error/uncertainty? If so, how was this 
incorporated into the assessment of the error/uncertainty in your 
temperature reconstructions? Please provide citations from your 
publications. If not, please explain why the uncertainty envelope of a 
linear regression grows larger as a function of the distance from the 
mean of the data used to fit the parameters and why this was not 
included in your research.
    Response. The so-called ``leverage effect,'' which the question 
appears to refer to, is taken into account through consideration of the 
spectrum of the calibration residuals, allowing for resolution of any 
enhancement of uncertainty as a function of frequency (see MBH99). 
Alternatively, the uncertainties can be evaluated from an independent 
sample (i.e., cross-validation, rather than calibration, residuals) 
that eliminates any influence of calibration period leverage in the 
estimation of uncertainties. Both approaches give similar results [e.g. 
Rutherford, S., Mann, M.E., Osborn, T.J., Bradley, R.S., Briffa, K.R., 
Hughes, M.K., Jones, P.D., Proxy-based Northern Hemisphere Surface 
Temperature Reconstructions: Sensitivity to Methodology, Predictor 
Network, Target Season and Target Domain, Journal of Climate, 
submitted, 2003].
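    For reference, the standard textbook expression for the uncertainty 
of a simple linear regression prediction at a point x_0, given n 
calibration points x_i with mean \bar{x} and residual standard 
deviation s, is

    SE[\hat{y}(x_0)] = s \sqrt{1 + \frac{1}{n} +
        \frac{(x_0 - \bar{x})^{2}}{\sum_{i=1}^{n}(x_i - \bar{x})^{2}}},

which widens as x_0 moves away from the mean of the calibration data; 
evaluating residuals on an independent cross-validation sample, as 
described above, captures this effect empirically rather than through 
this formula.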

    Question 26. Please describe the peer review process that took 
place with respect to your Forum article that appeared in EOS on July 
8, 2003. If, according to the AGU, the EOS Forum contains articles 
stating a personal point of view on a topic related to geophysical 
research or the relationship of the geophysical sciences to society, 
how can you claim that your article is peer reviewed?
    Response. The article was independently reviewed and evaluated for 
suitability for publication by an editor who has expertise in the 
particular subject area. The associated process is correctly described 
as ``peer review''. Appropriate to the relatively short and non-
technical nature of Eos ``Forum'' pieces, the associated peer review 
process is not as extensive as that employed for articles in the more 
technical literature such as Geophysical Research Letters, or Journal 
of Geophysical Research. I would suggest that the questioner contact 
representatives at AGU for more details on the peer-review process 
employed for their different journals and paper categories.

    Question 27. Do you claim that producing estimates of past climate 
states is an exact science? If so, explain why different authors can get 
such significantly different results when investigating and 
reconstructing past temperature, and detail the errors that other 
authors must have made. If not, explain how there can be, as you put it 
in your EOS article, a ``robust consensus'' regarding the correct 
estimate of the climate state of the past millennium.
    Response. The term ``exact science'' is generally not used, or 
considered meaningful or appropriate by scientists, as science almost 
always involves the testing of hypotheses based on the use of 
intrinsically uncertain data or observations. Consistent with this 
fundamental aspect of nearly all scientific endeavors, my colleagues 
and I, and other researchers in the paleoclimate community, typically 
interpret the results of paleoclimate reconstructions within the 
context of sometimes substantial associated uncertainties. When a large 
number of estimates agree with each other within estimated 
uncertainties, and those uncertainties are modest enough to still allow 
for non-trivial conclusions (for example, that late 20th century warmth 
is anomalous in a long-term context), those conclusions can be 
considered as both ``robust'' and a ``consensus''.

    Question 28. Please describe the peer review process that took 
place with respect to your 1999 Geophysical Research Letters paper. 
What were the criticisms or improvements suggested by the referees? Why 
was no reference made to the anomalous global warming caused by the 
very strong El Nino event of 1997-98 in your paper? Is this 1999 paper 
continuation of your 1998 paper in Nature where you stopped your 
reconstruction at AD 1400?
    Response. The comments of reviewers on a manuscript are considered 
a confidential matter, involving the editor, reviewers, and authors. 
Providing these comments for public record would be ethically 
questionable, and probably violates the confidentiality policies of the 
associated journals. Minor suggestions were made by the reviewers and 
editor, and addressed to their satisfaction prior to the acceptance and 
publication of the paper.

    Question 29.  In Mann and Jones 2003 Geophysical Research Letters, 
did you change your methodology in the reconstruction of the 
hemispheric or global scale temperature from your prior publications? 
If so, why did you, and what is the rationale for the change of 
approach?
    Response. The question is wrongly premised, as it presumes, through 
the use of the language ``change your methodology,'' that scientists 
have only one particular methodological approach that can be applied to 
a problem at hand. As discussed in my answer to question 20, my 
research has involved the use of a variety of different methods for 
reconstructing past climate patterns from proxy data. The paper by Mann 
and Jones (2003), for example, uses a coarser resolution proxy dataset 
than MBH98/MBH99 and a compositing methodology that allows for the 
reconstruction of decadal, but not annual, changes, and the 
reconstruction of hemispheric mean, but not spatially resolved, 
patterns of temperature in past centuries. In doing so, the study was 
able to make use of a more restricted set of temperature records 
available over a longer timeframe than those used in previous high-
resolution proxy reconstructions of hemispheric temperature change.

    Question 30. Did IPCC carry out any independent programs to verify 
the calculations that you made in MBH98 or MBH99? If so, please provide 
copies of the reports resulting from such studies.
    Response. It is distinctly against the mission of the IPCC to 
``carry out independent programs'', so the premise of the question is 
false. However, the IPCC's author team did engage in lively 
interchanges about the quality and overall consistency of all of the 
papers as the chapter was drafted and revised in the course of review.

    Question 31. Did IPCC carry out any independent quality control on 
the data that you used in MBH98 and MBH99? If so, please provide copies 
of the reports resulting from such studies.
    Response. The IPCC doesn't ``carry out studies'', so the premise of 
the question is false. The IPCC instead relies on the normal 
scientific peer-review process, especially when conducted by a leading 
journal, to ensure an acceptable level of quality. In addition, the 
IPCC does check to see whether any criticisms have been raised post-
review in comments and responses to the journal articles.

    Question 32. Did IPCC carry out any studies to validate the 
statistical procedures and methodologies used in MBH98 and MBH99? If 
so, please provide copies of the reports resulting from such studies.
    Response. The IPCC doesn't ``carry out studies'', so the premise of 
the question is false. Instead, as indicated above, the IPCC relies on 
earlier stages of review to cover such matters.

    Question 33. Has any organization other than IPCC or your 
associates carried out any independent programs to verify the 
calculations that you made in MBH98 or MBH99? If so, please provide 
copies of the reports resulting from such studies.
    Response. I know of no ``organizations'' that carry out 
``independent programs'' to verify calculations of individual co-
authors. If the question is, have other scientists reproduced the basic 
results of MBH98 and MBH99, the answer is yes. Numerous other groups 
(see the dozen or so independent estimates of various groups shown in 
Figure 1 of: Mann, M.E., Ammann, C.M., Bradley, R.S., Briffa, K.R., 
Crowley, T.J., Hughes, M.K., Jones, P.D., Oppenheimer, M., Osborn, 
T.J., Overpeck, J.T., Rutherford, S., Trenberth, K.E., Wigley, T.M.L., 
On Past Temperatures and Anomalous Late 20th Century Warmth, Eos, 84, 
256-258, 2003) have produced reconstructions that are remarkably 
similar to those of MBH98 based on a variety of data and methods. Refer 
back to my answer to question 4 for further details. I would like to 
see each of these papers made an official part of the Senate record.

    Question 34. Has any organization other than IPCC conducted 
independent quality control on the data that you used in MBH98 and 
MBH99? If so, please provide copies of the reports resulting from such 
studies.
    Response. The IPCC doesn't ``carry out studies'', so the premise of 
the question is false. The data used by MBH98 (and MBH99) were produced 
by other researchers, not Mann and colleagues. It is thus not clear 
what kind of ``independent quality control'' is being referred to here. 
However, it is fair to say that each of these papers has been subject 
to rigorous peer review in a leading scientific journal, which is 
considered by scientists to be an independent quality control process. 
We are aware of no criticisms of the datasets in the peer-reviewed 
scientific literature.

    Question 35. Has any organization other than IPCC carried out any 
studies to validate the statistical procedures and methodologies used 
in MBH98 and MBH99? If so, please provide copies of the reports 
resulting from such studies.
    Response. The IPCC doesn't ``carry out studies'', so the premise of 
the question is false. If the question were asked: Have other 
independent groups tested the methodology of Mann et al (1998) in a 
publication in the peer-reviewed climate literature, the answer would 
be ``yes''. I would refer the questioner to the following paper: 
Zorita, E., F. Gonzalez-Rouco, and S. Legutke, Testing the Mann et al. 
(1998) Approach to Paleoclimate Reconstructions in the Context of a 
1000-Yr Control Simulation with the ECHO-G Coupled Climate Model, 
Journal of Climate, 16, 1378-1390, 2003.
    The paper arrives at the conclusion that the methodology of MBH98 
performs well with networks of data comparable to those used by MBH98.
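    The general logic of such a test can be sketched as follows; the 
numbers and the deliberately simple compositing step below are 
hypothetical stand-ins, not the ECHO-G simulation or the actual MBH98 
procedure:

# Sketch of a pseudoproxy test: take a long "model" temperature history
# whose true hemispheric mean is known, degrade it into noisy
# pseudoproxies, apply a simple reconstruction calibrated only on the
# recent period, and check how well the known earlier history is
# recovered.  All numbers are hypothetical; this is not the
# ECHO-G/MBH98 experiment itself.
import numpy as np

rng = np.random.default_rng(2)
n_years, n_proxies = 1000, 15
truth = np.cumsum(rng.normal(0.0, 0.02, n_years))            # "model" NH mean
pseudo = truth[:, None] + rng.normal(0.0, 0.25, (n_years, n_proxies))

calib = np.arange(n_years - 100, n_years)                     # last 100 "years"
composite = pseudo.mean(axis=1)
slope, intercept = np.polyfit(composite[calib], truth[calib], 1)
recon = slope * composite + intercept

verif = np.arange(n_years - 100)                              # pre-calibration period
print("correlation of reconstruction with known truth:",
      np.corrcoef(recon[verif], truth[verif])[0, 1])

In a test of this kind, skill is judged by how well the known pre-
calibration history of the model is recovered.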

    Question 36. Have you ever received any communications that 
suggested that there might be computational errors in MBH98 or MBH99? 
Please provide such communications together with any responses.
    Response. I receive many emails, often from list-serves of self-
professed ``climate skeptics'' making numerous spurious claims against 
my work and that of many of my colleagues. I have received no 
correspondence providing credible evidence of any errors in our work. 
Nor has any such credible evidence been published in the peer-reviewed 
scientific literature.

    Question 37. Did the peer reviewers for Nature in MBH98 carry out 
any independent quality control or validation studies? If so, please 
provide copies of such reports.
    Response. Neither I, nor authors of peer-reviewed journal articles 
in general, are made privy to the detailed analyses that peer reviewers 
may or may not have performed in the process of reviewing a manuscript. 
Authors only receive the comments that were selected to be made 
available to them by the reviewer and editor. This question is thus 
impossible to answer. Numerous other groups (see the dozen or so 
independent estimates of various groups shown in Figure 1 of: Mann, 
M.E., Ammann, C.M., Bradley, R.S., Briffa, K.R., Crowley, T.J., Hughes, 
M.K., Jones, P.D., Oppenheimer, M., Osborn, T.J., Overpeck, J.T., 
Rutherford, S., Trenberth, K.E., Wigley, T.M.L., On Past Temperatures 
and Anomalous Late 20th Century Warmth, Eos, 84, 256-258, 2003) have 
produced reconstructions that are remarkably similar to those of MBH98 
based on a variety of data and methods. See my answer to Question No. 
4.

    Question 38. Did the peer reviewers for Geophysical Research 
Letters in MBH99 carry out any independent quality control or 
validation studies? If so, please provide copies of such reports.
    Response. See response to Question No. 37.

    Question 39. How many people have requested the underlying digital 
information in MBH98? Please provide dates of such requests and dates 
of your reply.
    Response. My collaborators and I have not kept a specific record. 
The data have been provided to any scientific groups that have requested 
them, and have been made available on an open access basis through a 
public ftp site: ftp://holocene.evsc.virginia.edu/pub/MBH98/, since 
July 2002.

    Question 40. Were you one of the primary or lead authors of IPCC/
TAR chapter 2?
    Response. The convening lead authors of chapter 2 of the IPCC TAR 
were Dr. Chris Folland and Thomas Karl. I was one of eight additional 
co-authors contributing to chapter 2.

    Question 41. In your capacity as IPCC/TAR author, did you prepare 
any drafts that referred to your own papers? Please provide all drafts 
that you prepared for IPCC.
    Response. I contributed to numerous sections of the chapter and 
provided contributions that referenced the work of the leading 
paleoclimatologists, which includes me and many of my colleagues. Those 
interested in drafts of IPCC chapters should inquire of the appropriate 
IPCC working group. I am not in possession of such drafts, and even if 
I were, I would not be at liberty to distribute the various drafts of 
the chapters of the report.

    Question 42. Was any language from your drafts referring to your 
own reports ultimately used by IPCC/TAR? Please provide highlighted 
versions from IPCC.
    Response. The wording of the question is unclear. If the question 
is, did I, in my contributions to the chapter, provide summaries that 
included references to my own work as well as that of other scientists, 
the answer is of course yes. Since each of the authors was asked to 
contribute sections related to their particular areas of expertise, and 
since the IPCC authors were chosen from among the leading scientists in 
the world, it would be distinctly odd if it were not the case that most 
authors referred to their work, as well as that of others, in their 
contributions.

    Question 43. Did IPCC/TAR have any policies governing how lead 
authors used their own work? Did IPCC/TAR have any quality control 
procedures in the event that a lead author used his own work? Please 
provide a short summary of your understanding of such procedures.
    Response. I am not a spokesperson for the IPCC. However, it is my 
understanding that the IPCC carries out a process for developing its 
summarization of the understanding of science that leads to one of the 
most rigorously peer-reviewed scientific documents in existence. 
Individual technical chapters are prepared by expert scientific teams 
that consider the full range of published papers in a subject area. 
This expert author team then solicits an initial peer review from a 
large number of other scientists in the field, drawing on those 
representing the full range of expert science. The reports next go 
through a much wider review that is open to literally thousands of 
scientists around the world. Finally, countries, NGO's, and professional 
groups (such as business groups) are provided the opportunity to send 
in review comments (in the case of the U.S. government review, an 
invitation to submit comments for forwarding to the IPCC is published 
in the Federal Register, enabling all to participate in this review). 
With the comments available at each stage of the 
review process, the authors consider each comment and document their 
response. The meticulousness and fairness of the revision process by 
the authors in response to reviewer comments is evaluated by an 
independent pair of ``review editors'' who are themselves top 
international climate scientists who are not authors of the report 
itself. The National Academy of Sciences, at President George W. Bush's 
request, and other national academies around the world have 
independently reviewed the process and the validity of the scientific 
findings of the IPCC and endorsed them.

    Question 44. Did MBH98 and MBH99 use any proxy series, which were 
either unpublished or which resulted from unpublished calculations, 
which you carried out? If so, please identify, and detail how you 
verified those unpublished results.
    Response. MBH98 and MBH99, like many studies, made use of newly 
available data that had not yet been published by the original authors 
providing those data; these data were thus provided to Mann and 
colleagues on the provisional basis that they not be released until the 
original authors had a chance to publish the records themselves. After all of the data used 
had been published, the full dataset used by MBH98 and MBH99 was made 
available in the public domain on the public website: ftp://
holocene.evsc.virginia.edu/pub/MBH98/.

    Question 45. Despite solar variability over the last two millennia, 
your analysis concludes the Northern Hemisphere average temperature has 
remained virtually constant. What mechanism or mechanisms are 
responsible for negating the influence of the sun? Do climate models 
(GCMs) exhibit the same lack of response to solar forcing that your 
analysis implies? If not, why are model simulations at variance with 
your conclusions and how does that limit their applicability for future 
climate scenario assessments?
    Response. The question is falsely premised on several levels. No 
reasonable description of the reconstructions that we or others have 
produced of temperature variations in past centuries would characterize 
them as ``virtually constant''. The reconstructions performed by my 
group and others indicate an amplitude of variability that is consistent 
with expectations from models driven with estimates of past radiative 
forcing, including solar and volcanic forcing, and allowing for the 
added role of internal unforced variability [see e.g. Crowley, T.J., 
Causes of Climate Change Over the Past 1000 Years, Science, 289, 270-
277, 2000]. Indeed, it has been shown that the model-predicted pattern 
of surface temperature response to solar forcing in past centuries 
closely resembles that estimated from the temperature reconstructions 
that my colleagues and I have performed [Shindell, D.T., Schmidt, G.A., 
Mann, M.E., Rind, D., Waple, A., Solar forcing of regional climate 
change during the Maunder Minimum, Science, 294, 2149-2152, 2001; 
Waple, A., Mann, M.E., Bradley, R.S., Long-term Patterns of Solar 
Irradiance Forcing in Model Experiments and Proxy-based Surface 
Temperature Reconstructions, Climate Dynamics, 18, 563-578, 2002; 
Shindell, D.T., Schmidt, G.A., Miller, R., Mann, M.E., Volcanic and 
Solar forcing of Climate Change During the Pre-Industrial era, Journal 
of Climate, in press, 2003].

    Question 46. How did the temperatures of the mid-Holocene Optimum 
Period (6000 to 9000 BP) compare with those observed today? Was it a 
global or a local phenomenon? What was or were the cause or causes of 
any temperature anomalies in that period? What is the cause of the 10\4\ 
to 10\5\ year timescale changes in deuterium, oxygen isotope, etc., 
concentrations in ice core records? Are such changes global or local?
    Response. Paleoclimate experts have established that mid-Holocene 
warmth centered roughly 5000 years ago was restricted to high latitudes 
and certain seasons (summer in the Northern Hemisphere and winter in 
the southern hemisphere). Because much of the early paleoclimate 
evidence that was available (for example, fossil pollen assemblages) 
came from the Northern Hemisphere extratropics, and is largely 
reflective of summer conditions, decades ago some scientists believed 
that this was a time of globally warmer conditions. It is now well 
known that this is not the case. More abundant evidence now 
demonstrates, for example, that the tropical regions were cooler over 
much of the year. All of these changes are consistent with the expected 
response of surface temperatures to the known changes in the Earth's 
orbital geometry relative to the Sun during that time period and 
associated climate feedbacks, as detailed in peer-reviewed scientific 
publications [e.g., Hewitt, C.D., A Fully Coupled GCM Simulation of the 
Climate of the Mid-Holocene, Geophysical Research Letters, 25 (3), 361-
364, 1998; Ganopolski, A., C. Kubatzki, M. Claussen, V. Brovkin, and V. 
Petoukhov, The Influence of Vegetation-Atmosphere-Ocean Interaction on 
Climate During the Mid-Holocene, Science, 280, 1916-1919, 1998].
    Climate model simulations indicate quite good agreement with 
paleoclimate evidence now available. These models calculate that global 
annual average temperatures were probably about the same or a few 
tenths of a degree C cooler than today (the late 20th century) during 
this time period [Ganopolski, A., C. Kubatzki, M. Claussen, V. Brovkin, 
and V. Petoukhov, The Influence of Vegetation-Atmosphere-Ocean 
Interaction on Climate During the Mid-Holocene, Science, 280, 1916-
1919, 1998; Kitoh, A., and S. Murakami, Tropical Pacific Climate at the 
mid-Holocene and the Last Glacial Maximum simulated by a coupled 
ocean-atmosphere general circulation model, Paleoceanography, 17 (3), 
(19)1-13, 2002.]. That's a far cry from the very out-of-date claim made 
by Dr. Legates in his testimony. Dr. Legates' comments regarding 
climate changes over the past 1000 years reflect a similar lack of 
familiarity with a whole body of paleoclimate research, especially with 
the new insights gained through the augmented research program during 
the past decade.

    Question 47. It has been observed that in the past, carbon dioxide 
concentrations have sometimes lagged air temperature trends; that is, 
changes in air temperature have subsequently sometimes resulted in 
changes in carbon dioxide concentrations. Do you agree with those 
results from expert researchers? Why or why not?
    Response. The question mis-characterizes the evidence that has been 
provided by paleoclimate researchers. The studies that the questioner 
appears to be alluding to, demonstrate a phase relationship between ice 
core CO2 estimates and *local* temperature variations at the 
site of the ice core. Furthermore these local temperature estimates are 
indirectly inferred from oxygen isotopes, based on quite uncertain 
assumptions regarding oxygen isotope paleothermometry and neglecting 
possible biases due to the variable seasonality of local accumulation. 
As local temperature variations at the site of the ice core have an 
unknown relationship with global mean temperature variations (which are 
far more dominated by lower latitudes which occupy the majority of the 
Earth's surface area), the phase relationships between past CO2 
and global mean temperature variations are not known. In spite of these 
qualifications, it is not at all implausible that the geologic record 
indicates that at some times the CO2 increase may lag the 
initial temperature increase; such a situation would be expected, for 
example, if the change in climate was initiated by a change in the 
orbital geometry that affected the distribution of solar radiation, and 
then the slow warming drove CO2 from the warming ocean into 
the atmosphere. It is because of the many possibilities for how 
different processes can interact that it is essential to not simply 
base a conclusion on an apparent correlation without evaluating the 
underlying physical mechanisms for that particular period.

    Question 48. Are there any time periods for which atmospheric 
CO2 content has changed without a concomitant change in 
global air temperature? Are there periods when the atmospheric CO2 
content was relatively high but global air temperatures relatively low?
    Response. In his testimony, Dr. Legates indicated that there were 
historical cases where the temperature has gone up, but that CO2 
has fallen. It may well be the case that this has happened in the past. 
However, it is hardly surprising, and certainly not inconsistent with 
our established understanding of the various factors that influence 
surface temperatures. The warming response to increased greenhouse gas 
concentrations lags the actual increase in greenhouse gas 
concentrations in the atmosphere potentially by several decades, due to 
the sluggish response of the oceans, which have an enormous thermal 
capacity compared to the atmosphere, to increased surface radiative 
forcing. So warming is not expected to be contemporaneous with changes 
in CO2, but instead, to lag it by several decades. However, 
greenhouse gases are certainly not the only factor affecting the 
average surface temperature of the Earth. There are other anthropogenic 
factors, such as increased sulphate aerosols, which can have a cooling 
effect on the climate, and natural factors, such as volcanic activity, 
modest natural variations in solar output, and internal dynamics 
associated with climate events such as El Nino, which also influence 
the average surface temperature of the globe. At any particular time, 
these other factors may outweigh the warming effect due to increased 
greenhouse gases. For example, the relative lack of warming during the 
period 1940-1970 appears to be related to a combination of such 
factors, as discussed in my response to an earlier question. But while 
these other factors tend to cancel over time, the increased greenhouse 
gases lead to a systematic warming that will not cancel out over a very 
long time period. It is for precisely this reason that late 20th 
century warming now appears to have risen above the range of the 
natural variability of past centuries.

    Question 49. Two independent and nearly direct measures of surface 
temperature (deep borehole reconstructions) over the past several 
millennia have been published for Greenland (Dahl-Jensen et al 1998) 
and the Middle Urals (Demeshko and Shchapov 2001). The local surface 
temperature at these locations is highly correlated with global 
temperature on 10-year time scales and longer (r\2\ > 50 percent for 
10-yr averages, with agreement increasing for longer averaging periods). 
Both reconstructions independently show their local surface temperatures 
were at least 1 °C warmer for century-scale mean temperatures around 
A.D. 900 than the latter half of the 20th century, translating into a 
global anomaly of at least +0.2 °C relative to today.
    This further implies that even higher global temperature anomalies 
for shorter periods, such as half-century or decadal periods, were 
observed about 1000 years ago. Why do these two robust measures of 
local and global approximations differ greatly from Mann et al. 1999?
    Response. The question is wrongly premised on multiple levels. 
First, the correlations cited are completely wrong. No citation to the 
peer-reviewed literature is provided, so it is difficult to determine 
how these numbers were arrived at. I therefore proceeded to analyze 
the appropriate surface air temperature gridpoint data from the 
Climatic Research Unit of the University of East Anglia myself. I found 
that only after 1922 is there adequate coverage (>50 percent areal 
coverage) to estimate a meaningful Greenland areal-mean temperature. 
For the period back to 1922, the linear correlation between the 
Greenland and Northern Hemisphere mean temperature is r = -0.06 
(negative!), and there is no significant correlation at decadal or 
longer timescales. In fact, the trends in the two series during the 
latter 20th century are of opposite sign. So the numbers cited are 
completely spurious.
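    The general form of this check can be sketched as follows; the 
gridded values below are random stand-ins rather than the Climatic 
Research Unit data actually analyzed:

# Sketch of the comparison described above: form a regional (e.g.
# Greenland-box) mean only where grid coverage exceeds 50 percent, then
# correlate it with the hemispheric mean, annually and on decadal
# averages.  The arrays below are random stand-ins, not the CRU data.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1922, 2001)
n_boxes = 40
anom = rng.normal(0.0, 0.5, (years.size, n_boxes))   # grid-box anomalies (stand-in)
mask = rng.random((years.size, n_boxes)) > 0.3       # True where a box has data

coverage = mask.mean(axis=1)                         # fraction of boxes with data
regional = np.nanmean(np.where(mask, anom, np.nan), axis=1)
regional[coverage <= 0.5] = np.nan                   # require >50 percent coverage

nh_mean = rng.normal(0.0, 0.2, years.size)           # stand-in hemispheric mean

ok = ~np.isnan(regional)
print("annual correlation :", np.corrcoef(regional[ok], nh_mean[ok])[0, 1])

nblocks = years.size // 10                           # non-overlapping decades
dec_reg = np.nanmean(regional[:nblocks * 10].reshape(nblocks, 10), axis=1)
dec_nh = nh_mean[:nblocks * 10].reshape(nblocks, 10).mean(axis=1)
print("decadal correlation:", np.corrcoef(dec_reg, dec_nh)[0, 1])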
    It is in fact well known by the climate community that there are 
fundamental physical reasons why temperatures in Greenland are, in 
general, poorly correlated with Northern Hemisphere mean temperature. 
Owing to the strong influence of processes such as the North Atlantic 
Oscillation, and of changes in coupled ocean-atmosphere processes in the 
North Atlantic, which impart a large regional overprint on temperature 
variations in this region, both negative and positive correlations with 
Northern Hemisphere mean temperature can be found, depending on the 
time period and region of Greenland analyzed.
    The Dahl-Jensen et al. Greenland borehole data may indeed be useful 
temperature proxy data for the regions they represent, and they have 
been used in reconstructions of Northern Hemisphere mean temperature, 
with caveats due to their extremely low temporal resolution (see Mann 
and Jones, 2003). While the two Greenland borehole records show 
significantly different histories over the past 1000 years (which is 
expected since temperature trends vary markedly depending on the region 
of Greenland in question), one of the two records does correlate well 
with the instrumental Greenland record over the period of mutual 
overlap. It shows the mid-20th century warm peak, followed by the 
latter 20th century cooling, just as the instrumental Greenland 
annual mean temperature record does. However, instrumental Northern 
Hemisphere mean temperature has, in contrast, warmed markedly during 
the latter 20th century.
    The Greenland borehole temperature reconstruction may tell us 
something about temperatures in Greenland over the past few millennia 
even though the two different Greenland borehole records show some 
differences between them. But these results are unlikely to tell us 
much, if anything, about Northern Hemisphere mean temperature trends. 
Indeed, Dahl-Jensen et al have never, to my knowledge, claimed in their 
studies that temperature variations in the two regions of 
Greenland reconstructed (which themselves show significantly different 
histories over the past 1000 years) are representative of Northern 
Hemisphere mean temperatures, and I would be surprised if the authors 
were comfortable in having their data represented as such.

    Question 50. In your view what should the Federal Government do in 
response to rising concentrations of CO2? What would be the 
climate impact of this effort?
    Response. In my view, the Congress and the Federal Government 
should be taking the scientific findings of the mainstream research 
community very seriously and should stop focusing so much attention on 
the poorly conducted and distracting nitpicking of the various 
contrarian scientists. The IPCC assessments represent the most 
authoritative reviews of the science and have been unanimously endorsed 
by all of the participating nations of the world--it is time to pay 
attention to their findings. Exactly what steps should be taken and how 
fast this should be done are policy questions that members of this body 
should be responsibly and thoughtfully addressing. The long residence 
timescales of anthropogenic greenhouse gases, and the lags in the 
response of the climate system (e.g. sea level rise) to already 
realized increases in greenhouse gas concentrations dictate, however, 
that there are potentially significant costs to delayed action.

    Question 51. Approximately what percentage of the temperature 
increase in the observational record over the last 100 years would you 
attribute to anthropogenic causes? What percentage would you attribute 
to increased urbanization? What percentage would you attribute to non-
urbanized land use changes? What percentage would you attribute to 
natural (solar, volcanic, etc.) variability? What percentage would you 
attribute to ``internal'' climate variability? What percentage would 
you say results from other or unexplained sources? Give estimates for 
the years 1900, 1940, 1980, and 2000.
    Response. A cursory review of the available evidence (see e.g. 
Figure 2.1 of chapter 2 of the 2001 IPCC Scientific Working Group 
report) indicates the following approximate features in the observed 
record of global mean temperature changes over the past 100 years: a 
warming of approximately 0.3 °C to 1940, a statistically insignificant 
change (given the uncertainties) from 1940 to the mid-1970's, and then 
an additional warming of approximately 0.5 °C from 1970 to 2000. This 
pattern of behavior is reproduced closely by models driven with 
estimates of both natural and anthropogenic forcing of the climate 
during the 20th century. The period of relative stasis in global mean 
temperatures from 1940 to 1970, in these model simulations, appears to 
result from the cooling impact of anthropogenic aerosols (for which 
there was a large increase during that time period) as well as a 
potential cooling contribution from explosive volcanic eruptions that 
occurred during that period, which tended to offset the warming 
influence of increased greenhouse gas concentrations during that time 
period (e.g. the 1957 eruption). However, much of the overall warming 
of the globe during the 20th century (which is between 0.6 °C and 1.0 
°C depending on the precise instrumental data set used, and the precise 
endpoints of the interval examined) is clearly a result of increased 
greenhouse gas concentrations, as established in these simulations, 
consistent with the conclusions of the IPCC Third Assessment Report 
that most of the warming of the past 50 years is attributable to human 
influences.

    Question 52. What was the earth's climate like the last time the 
atmospheric concentration of carbon dioxide was near today's level of 
about 370 parts per million (ppm) and what were past conditions like 
when concentrations were at 550 ppm? Detail the factors that cause the 
global carbon cycle to produce these high levels of atmospheric carbon 
dioxide.
    Response. It is not precisely known what the ``earth's climate'' 
was like the last time carbon dioxide levels were near 370 ppm (let 
alone 550 ppm) because the paleoclimate evidence available from 
this long ago is quite uncertain and incomplete. That having been 
said, it is believed, based on the available proxy information and 
faunal/floral evidence, that global temperatures were probably several 
degrees higher than they are today when CO2 concentrations 
neared 550 ppm, roughly consistent with model simulation results. One 
probably has to go back roughly 40-50 million years ago (see chapter 3 
of the 2001 IPCC working group 1 report) to find a time when CO2 
concentrations were in the range of 550 ppm (i.e., roughly double their 
pre-industrial concentration) and approximately 80 million years ago 
(i.e., the mid-Cretaceous period when dinosaurs roamed the polar 
regions) to find a time when CO2 levels were in excess of 
1200 ppm (a level that will be reached, at current rates of CO2 
increase, within 1 to 1\1/2\ centuries). Proxy evidence available for 
this period, tenuous though it is, suggests deep ocean temperatures 8-
12 °C warmer than present. State-of-the-art climate model simulations 
performed by Bette Otto-Bliesner and colleagues using the National 
Center for Atmospheric Research (NCAR) global climate model, which 
incorporate such CO2 levels (and the continental 
configuration corresponding to the mid-Cretaceous period), indicate 
significantly warmer sea surface temperatures, with tropical sea 
surface temperatures approximately 4 °C warmer and polar sea surface 
temperatures approximately 6-14 °C warmer than present. The simulations 
indicate an absence of perennial sea ice at even the most polar 
latitudes.

    Question 53. In your vitae, you indicate that you serve on the 
panel for NOAA's Climate Change Data and Detection (CCDD) program, 
while at the same time, you also have received large grants from this 
program. Please explain your role on the panel, how grant submissions 
are evaluated, and why there is no conflict of interest or impropriety 
associated with members of a panel receiving large grants from the 
program for which they serve.
    Response. I am not a spokesperson for NOAA, and would suggest that 
the questioner contact the appropriate NOAA agency officials for 
further information on their conflict of interest and disclosure 
policies. That notwithstanding, however, I would note the following 
points. Government funding agencies seek to draw upon the leading 
experts of the field in their panels. Inevitably, this means that 
specific science programs within NSF and NOAA invite to their review 
panels scientists who typically submit proposals themselves to those 
panels. Scientists are also asked to disclose any conflicts of interest 
they might have in reviewing a proposal, and are asked to recuse 
themselves from any participation in discussions related to proposals 
that they might have even peripheral involvement with. In my 
involvement in both NSF and NOAA panel reviews, I have on many 
occasions recused myself from reviewing or discussing a proposal based 
on such considerations.

    Question 54. Do you receive any income or reimbursement (travel, 
speaking fees, etc.) from any sources, which have taken advocacy 
positions with respect to the Kyoto Protocol, the U.N. Framework 
Convention on Climate Change, or legislation before the U.S. Congress 
that would affect greenhouse gas emissions? If so, please identify 
those sources and the approximate amount of money that they represent.
    Response. All income or travel expense reimbursement funds that I 
have received to my recollection have come from academic institutions, 
government funding agencies such as NSF, NOAA, NASA, DOE, and 
scientific organizations such as the American Geophysical Union and 
University Corporation for Atmospheric Research (UCAR). I am not 
familiar with any advocacy positions that have been taken by any of 
these institutions or organizations regarding the U.N. Framework 
Convention on Climate change, or legislation before the U.S. Congress 
that would affect greenhouse gas emissions.

                               __________

Responses by Michael Mann to Additional Questions from Senator Jeffords

    Question 1. Is it your understanding that during the mid-Holocene 
optimum period (the period from 4000-7000 B.C.) that annual mean global 
temperatures were more than a degree C warmer than the present day?
    Response. This is an oft-repeated but patently false claim. Dr. 
Legates, who has no established expertise in the relevant field of 
paleoclimatology, indeed asserts that temperatures were warmer at this 
time. In fact, not only is that not the consensus of the paleoclimate 
research community, but just the opposite is believed to be true of 
global annual mean temperatures at this time. Paleoclimate experts know 
that the mid-Holocene warmth centered roughly 5000 years ago was 
restricted to high latitudes and certain seasons (summer in the 
Northern Hemisphere and winter in the southern hemisphere). Because 
much of the early paleoclimate evidence that was available (for 
example, fossil pollen assemblages) came from the Northern Hemisphere 
extratropics, and is largely reflective of summer conditions, decades 
ago some scientists believed that this was a time of globally warmer 
conditions. It is now well known that this is not the case. More 
abundant evidence now demonstrates, for example, that the tropical 
regions were cooler over much of the year. All of these changes are 
consistent with the expected response of surface temperatures to the 
known changes in the Earth's orbital geometry relative to the Sun 
during that time period and associated climate feedbacks, as detailed 
in peer-reviewed scientific publications [e.g., Hewitt, C.D., A Fully 
Coupled GCM Simulation of the Climate of the Mid-Holocene, Geophysical 
Research Letters, 25 (3), 361-364, 1998; Ganopolski, A., C. Kubatzki, 
M. Claussen, V. Brovkin, and V. Petoukhov, The Influence of Vegetation-
Atmosphere-Ocean Interaction on Climate During the Mid-Holocene, 
Science, 280, 1916-1919, 1998].
    Climate model simulations indicate quite good agreement with 
paleoclimate evidence now available. These models calculate that global 
annual average temperatures were probably a few tenths of a degree C 
cooler than today during this time period [Kitoh, A., and S. Murakami, 
Tropical Pacific Climate at the mid-Holocene and the Last Glacial 
Maximum simulated by a coupled ocean-atmosphere general circulation 
model, Paleoceanography, 17 (3), (19)1-13, 2002]. That's a far cry 
from the very out-of-date claim made by Legates. Legates' comments 
regarding climate changes over the past 1000 years reflect a similar 
lack of familiarity with a whole body of paleoclimate research, 
especially the new insights gained during the past decade through an 
augmented research program.

    Question 2. Why only focus on the past 1000 or 2000 years and not 
further back?
    Response. Large changes in climate certainly occurred in the 
distant past. If we look tens of millions of years back in time, 
dinosaurs were roaming the polar regions of the Earth, and the globe 
was several degrees warmer than today. Carbon dioxide levels were probably several
times their current level, slowly having attained such high levels due 
to changes in the arrangements of the continents (`plate tectonics') 
which influence the volcanic outgassing of carbon dioxide from the 
solid Earth. These changes occurred on timescales of tens of millions 
of years. Going back 10,000 years ago, large ice sheets existed over 
North America due to natural changes that occur in the Earth's orbit 
around the Sun on timescales of tens of thousands of years. Trying to 
study distant past climates for insights into modern natural climate 
variability is hampered by the fact that the basic external constraints 
on the system (the continental arrangement, the geometry of the Earth's 
astronomical orbit, the presence of continental ice sheets--what we 
call the `boundary conditions') were significantly different from 
today. Focusing on the evolution of climate in the centuries leading up 
to the 20th century (i.e., the past 1000 to 2000 years) provides a 
perspective on the natural variability of the climate prior to the 
period during which large-scale human influence is likely to have 
occurred, yet modern enough that the basic boundary conditions on the 
climate system were otherwise the same. This provides us, in essence, 
a `control' for diagnosing whether or not recent changes in climate are 
indeed unusual. Moreover, only during the past 1000-2000 years do we 
have adequate networks of proxy climate data with the required (annual) 
resolution in time to compare and validate against modern instrumental 
records. Reliable quantitative reconstructions of large-scale surface 
temperature patterns further back in time are thus not, at present, 
possible.

    Question 3. One of the Northern Hemisphere temperature 
reconstructions in your Figure 1 (the green curve from a paper by Esper 
and colleagues from Science in 2001) shows larger swings in past 
centuries, marginally outside the uncertainty bounds of the other 
reconstructions and model simulations. Does this indicate internal 
inconsistency in our knowledge?
    Response. There is no inconsistency. Esper et al noted, in their 
paper, that their estimate, unlike that of my colleagues and mine (the 
Mann/Bradley/Hughes or ``MBH'' reconstruction), was not representative 
of the entire Northern Hemisphere. They explicitly noted this in their 
paper, where they pointed out the likely reason for differences is that 
the MBH reconstruction represents the full Northern Hemisphere 
(tropics, subtropics, and extratropics) while the Esper et al 
reconstruction only represents the restricted extratropical continents. 
In fact, in a Science article that appeared in the same issue as the 
Esper et al paper [Briffa, K.R. and T.J. Osborn, Science, Blowing Hot 
and Cold, 295, 2227-2228, 2002] Briffa and Osborn noted that much of 
the difference was due to an arguably inappropriate scaling that Esper 
et al used, and an inappropriate comparison of summer vs. annual 
temperatures: ``when we regressed the record of Esper et al. against 
non-smoothed data (see the figure), this difference (with MBH) was 
reduced to about 0.4 degrees C. Recalibrating both curves against year-by-year 
warm season temperatures reduces this difference further to about 0.35 
degrees C.''
    As shown in the ``Eos'' article discussed in my testimony, which 
represents a consensus of the leading researchers in the field [Mann, 
M.E., Ammann, C.M., Bradley, R.S., Briffa, K.R., Crowley, T.J., Hughes, 
M.K., Jones, P.D., Oppenheimer, M., Osborn, T.J., Overpeck, J.T., 
Rutherford, S., Trenberth, K.E., Wigley, T.M.L., On Past Temperatures 
and Anomalous Late 20th Century Warmth, Eos, 84, 256-258, 2003], a 
proper scaling of the Esper et al record, prior to comparison with other 
estimates, shows it to be only marginally outside the error estimates of the 
MBH reconstruction and the many other estimates that are in agreement 
with it. As noted in two articles in Science that shortly followed the 
Esper et al paper [Mann, M.E., Hughes, M.K., Tree-Ring Chronologies and 
Climate Variability, Science, 296, 848, 2002; Mann, M.E., The Value of 
Multiple Proxies, Science, 297, 1481-1482, 2002], it is likely that the 
emphasis of the Esper et al reconstruction on only the summer season 
and the extratropical continental regions provides a biased estimate 
of the true pattern of annual, hemisphere-wide temperatures in past 
centuries, explaining the small differences between this estimate and 
other estimates. This conclusion has been verified in recent modeling 
studies [Shindell, D.T., Schmidt, G.A., Miller, R., Mann, M.E., 
Volcanic and Solar forcing of Climate Change During the Pre-Industrial 
era, Journal of Climate, in press, 2003]. It is thus clearly 
disingenuous when contrarians make the argument that the Esper et al 
result is in conflict with the mainstream conclusions of the climate 
research community with regard to the history of Northern Hemisphere 
mean annual temperature variations over the past millennium as 
embodied, for example, in the Eos article.

    Question 4. As a climatologist, can you explain what kind of 
quantitative analysis it takes to determine whether or not the last 50 
years has been unusually warm compared to the last 1000 years?
    Response. Well, such an analysis requires the careful use of proxy 
data, because we don't have widespread instrumental temperature records 
prior to the mid-19th century. By `careful use' I mean that one must 
first establish that the records actually resolve the changes of the 
past 50 years. This typically requires annually resolved proxy records 
or the very circumspect use of records with decadal resolution. One 
must not, as in the Soon and Baliunas studies, use records that do not 
resolve the trends of the past few decades. One must also establish the 
existence of an actual temperature signal in the available proxy data 
before using them to reconstruct past temperature patterns, and one 
must properly synthesize regional data, which typically all show 
different tendencies at any given time, into an estimate of the average 
temperature over the entire hemisphere or globe. There are a number of 
ways of performing such a synthesis, from the sophisticated pattern 
reconstruction approaches my colleagues and I have described in the 
technical literature, to the relatively straightforward compositing 
approach that many other paleoclimatologists (including myself) have 
also used. In all cases, the estimates based on the proxy data must be 
calibrated against modern instrumental temperature measurements, to 
allow for a quantitative estimate of past temperatures. The estimate 
must then be independently verified, or what we call `cross-
validated,' by showing that it independently reproduces earlier 
instrumental data that were not used to calibrate the estimates. 
Finally, uncertainties must be diagnosed based on how well the 
reconstruction describes actual available instrumental measurements. 
Once such steps have been taken, it is possible to compare the recent 
instrumental record to the reconstruction within the context of the 
uncertainties of the reconstruction. This latter comparison allows us 
to gauge whether or not late 20th century temperatures are anomalous 
in a long-term context. The conclusion from such legitimate studies 
that late 20th century warmth is indeed anomalous in a millennial or 
longer-term context has been shown to be quite robust with respect to 
the details of the data set used, or the methodology used (as shown in 
exhibit 1 in my testimony, the first figure of the ``Eos'' piece). It 
is noteworthy that the Soon and Baliunas paper satisfies none of the 
required standards for a `careful use of proxy data' specified above.
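
    The sequence of steps described above--calibration against modern 
instrumental data, cross-validation against withheld earlier 
instrumental data, and diagnosis of uncertainties--can be sketched in 
schematic form. The following Python fragment is a minimal 
illustration only, using synthetic data and a simple least-squares 
calibration; the published reconstructions employ far more elaborate 
multivariate methods, and none of the numbers here are drawn from the 
actual studies.

      import random

      random.seed(0)

      def ols(x, y):
          # Ordinary least-squares slope and intercept.
          mx, my = sum(x) / len(x), sum(y) / len(y)
          slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
                   / sum((a - mx) ** 2 for a in x))
          return slope, my - slope * mx

      # Synthetic instrumental temperature anomalies, 1856-1980 (degrees C),
      # and a synthetic proxy composite that tracks them with added noise.
      years = list(range(1856, 1981))
      temp = [0.004 * (y - 1856) + random.gauss(0, 0.10) for y in years]
      proxy = [1.5 * t + random.gauss(0, 0.15) for t in temp]

      # Calibrate the proxy against temperature over a late sub-interval...
      cal = [(p, t) for y, p, t in zip(years, proxy, temp) if y >= 1902]
      slope, intercept = ols([p for p, _ in cal], [t for _, t in cal])

      # ...then cross-validate against the withheld early instrumental data.
      ver = [(p, t) for y, p, t in zip(years, proxy, temp) if y < 1902]
      resid = [t - (intercept + slope * p) for p, t in ver]
      rmse = (sum(r * r for r in resid) / len(resid)) ** 0.5
      print("verification RMSE (degrees C): %.3f" % rmse)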

    Question 5. Do you claim that appropriate statistical methods do 
not exist for calibrating statistical predictors, including climate 
proxy records, against a target variable, such as the modern 
instrumental temperature record?
    Response. No. The statement ignores a centuries-old field of 
statistics known as ``multivariate linear regression'' in which a set 
of candidate ``predictors'' (such as proxy data) are statistically 
related to a target variable or ``predictand'' (such as the 
instrumental temperature record) during a common interval of overlap 
(e.g., the 20th century). If done properly, this statistical method 
isolates the temperature information that is contained within the proxy 
data, and uses that information to reconstruct past temperature 
patterns from the proxy data. It is also well known to those properly 
trained in statistics that the ``regression model'' must be 
independently ``validated'' by showing that it successfully reproduces 
independent data (e.g. longer-term instrumental temperature records) 
that were not used in constructing the statistical model itself. The 
estimates by Mann and colleagues embrace each of these fundamental 
statistical principles.
    Legates in his testimony seemed to claim that a composite estimate 
(e.g. of Northern Hemisphere mean temperature) should somehow resemble 
each of the individual predictors (e.g. the various regional 
temperature estimates). Such a result is, in general, a statistical 
impossibility. There is a basic theorem in statistics known as the 
``central limit theorem'' which indicates a general tendency for a 
composite (i.e. average) of a large number of different individual 
estimates to cancel out in terms of the pattern variation and amplitude 
of variability evident in the individual estimates. If Legates were 
indeed (as he claims to be) familiar with the instrumental surface 
temperature record, he would know, for example, that individual 
instrumental thermometer records available for particular locations 
over the globe during the 20th century show very little in common with 
the `composite' series constructed by averaging all of the individual 
records into a hemispheric or global estimate. Because season-to-
season and year-to-year fluctuations in the climate at regional scales 
often result from shifts in the atmospheric circulation, not every 
location experiences the same variation in any given year; for example, 
this summer, the western United States and Europe are anomalously warm, 
while the East Coast is anomalously cool. The average series reflects a 
tendency for a cancellation of the various ``ups'' and ``downs'' in the 
different individual series that often occur at different times. This 
is simply a statement about the instrumental record itself, and 
requires no use of proxy data at all. It is discouraging that Legates 
and colleagues haven't performed even such simple analyses with 
available instrumental data that would expose the fundamental flaws in 
their supposed `statistical' reasoning.
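
    The cancellation effect described in the preceding paragraph can 
be illustrated with a short numerical sketch. The regional series 
below are synthetic (a small common trend plus large independent 
regional noise, with purely illustrative magnitudes), but they show 
why a composite average need not resemble any individual record.

      import random
      import statistics

      random.seed(1)
      n_regions, n_years = 50, 100

      # Small common trend shared by all regions, plus large
      # region-specific year-to-year fluctuations.
      common = [0.005 * t for t in range(n_years)]
      regions = [[c + random.gauss(0, 0.5) for c in common]
                 for _ in range(n_regions)]

      # Composite (average over all regions) for each year.
      composite = [sum(r[t] for r in regions) / n_regions
                   for t in range(n_years)]

      print("typical regional std dev: %.2f" % statistics.pstdev(regions[0]))
      print("composite std dev:        %.2f" % statistics.pstdev(composite))
      # The independent regional fluctuations largely cancel in the average
      # (roughly by a factor of the square root of the number of regions).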
    In this context, it should be noted that Legates' testimony 
seriously misrepresents the statistical analyses used by Mann and co-
workers. In his testimony, Legates claimed that my collaborators and I 
replaced the proxy data for the 1900's with the instrumental record. The 
assertion is simply factually incorrect. Any reader of our published 
work knows full well that our proxy-based temperature reconstruction 
extends well into the late 20th century, through 1980 (the vast 
majority of published high-resolution climate proxy data are not 
available at later times than this). It is shown in our work that the 
reconstruction independently reproduces estimates from the instrumental 
surface temperature data record back through the mid-19th century and, 
in certain regions, back through the mid-18th century (i.e., our 
regression model is ``validated'' in the manner discussed above). Our 
Northern Hemisphere average temperature series reconstructed from proxy 
data is shown to agree with the instrumental Northern Hemisphere 
average record over the entire interval available for comparison (1856-
1980). This successful validation of the reconstruction, furthermore, 
allows us to compare the proxy-based reconstruction to the entire 
instrumental record (available through 1999 at the time of our 
publication), taking into account the uncertainties in the 
reconstruction.
    Legates' further claim that we present the instrumental record 
without uncertainty is disingenuous. If Legates, for example, were 
familiar with studies of the instrumental surface temperature record, 
he would understand that the uncertainties in this record during the 
20th century are minimal compared to the uncertainties shown for our 
reconstruction [see e.g. Figure 2.1b in Folland, C.K., Karl, T.R., 
Christy, J.R., Clarke, R.A., Gruza, G.V., Jouzel, J., Mann, M.E., 
Oerlemans, J., Salinger, M.J., Wang, S.-W., Observed Climate 
Variability and Change, in Climate Change 2001: The Scientific Basis, 
Houghton, J.T., et al. (eds.), Cambridge Univ. Press, Cambridge, 99-
181, 2001]. Furthermore, all scientists with a proper training in 
statistics know that uncertainties add ``in quadrature''. In other 
words, you have to square them before adding them. This means that the 
relatively small uncertainty in the instrumental record makes a 
relatively small contribution to the total uncertainty. Legates claimed 
in his testimony that including the uncertainty in the instrumental 
record, which he estimates as 0.1 degrees C, would change the conclusions 
expressed by us and other mainstream climate scientists that the 1990's 
are the warmest decade in at least the past 1000 years within estimated 
uncertainties. This claim is very misleading for several reasons. 
First, the standard error in Northern Hemisphere mean annual 
temperatures during the 1990's is far smaller than the amount cited by 
Legates [see again Folland et al, 2001 cited above]. Even more 
problematic, however, is that Legates' claim indicates a fundamental 
misunderstanding of the statistical concepts of standard error and 
uncertainty. The shaded region shown along with the Mann et al 
reconstruction (and other similar plots shown in recent articles such 
as the aforementioned ``Eos'' article, and the IPCC report) indicates 
a two standard error interval. The decade of the 1990's is roughly two 
standard errors warmer (i.e., about 0.4 degrees C) than any decade prior to 
the 20th century in the reconstruction. Based on a one-sided test for 
anomalous warmth, this translates to a roughly 97.5 percent level of 
significance. Modifying the uncertainties to include the small 
additional contribution due to uncertainties in the instrumental record 
itself would modify this only slightly, and would not lower the 
significance level below the 95 percent level. Though there is no such 
thing as an absolute estimate of uncertainty, despite Legates' 
implications to the contrary, a 95 percent confidence level is often adopted 
as an appropriate criterion for significance. Legates' statement that 
including instrumental contributions to the uncertainty would change 
the conclusions is thus clearly false.
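
    The two statistical points made above--that independent 
uncertainties combine in quadrature, and that a two-standard-error 
exceedance corresponds to roughly 97.5 percent significance in a 
one-sided test--can be checked with a short calculation. The standard 
errors below are illustrative placeholders, not the published 
uncertainty estimates.

      from math import sqrt
      from statistics import NormalDist

      sigma_recon = 0.20   # illustrative reconstruction standard error (deg C)
      sigma_instr = 0.05   # illustrative instrumental standard error (deg C)

      # Independent uncertainties add in quadrature, so the small
      # instrumental term barely changes the combined uncertainty.
      sigma_total = sqrt(sigma_recon ** 2 + sigma_instr ** 2)
      print("combined standard error: %.3f" % sigma_total)

      # A one-sided test for a value two standard errors above the rest:
      p = NormalDist().cdf(2.0)
      print("one-sided significance: %.1f percent" % (100 * p))
      # about 97.7 percent, commonly quoted as roughly 97.5 percent
      # (the exact two-standard-error convention varies slightly).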

    Question 6. In determining whether the temperature of the 
``Medieval Warm Period'' was warmer than the 20th century, does your 
work analyze whether a 50-year period is either warmer or wetter or 
drier than the 20th century? Is it appropriate to use indicators of 
drought and precipitation directly to draw inferences of past 
temperatures?
    Response. No, the work of my colleagues and me does not follow the 
flawed approach used by Soon and Baliunas. It is fundamentally unsound 
to infer past temperature changes directly from records of drought or 
precipitation. The analysis methods used by my various collaborators 
and me (e.g. the 13 authors of the recent article in Eos) employ 
standard statistical methods for identifying the surface temperature 
signal contained in proxy climate records, and using only that 
temperature signal in reconstructing past temperature patterns. By 
contrast, Soon and Baliunas simply infer evidence for warm ``Medieval 
Warm Period'' or cold ``Little Ice Age'' conditions from the relative 
changes in proxy records often reflective of changes in precipitation 
or drought, rather than temperature. It is difficult to imagine a more 
basic mistake than misinterpreting hydrological evidence in terms of 
temperature evidence. This fundamental shortcoming in their approach is 
identified in the Eos article referred to in my testimony that was 
written by 13 leading climate and paleoclimate scientists. 
Incidentally, the Eos piece was peer-reviewed, despite the claim by 
Legates otherwise in his testimony--an associate editor with training 
in the particular field corresponding to the submitted piece 
(``Atmospheric Science'' in this case) reviews the content of Eos 
forum pieces prior to acceptance.

    Question 7. Can you compare the quantitative analysis that supports 
your conclusion that the climate is warming faster in recent years than 
at any time in the recent past with the analysis done in the Soon 
literature review?
    Response. Technically speaking, there is no actual ``analysis'' in 
the Soon and Baliunas review, in that they don't appear to have 
performed a single numerical or statistical operation upon a single 
time series at all. They do not provide any quantitative estimates of 
temperature changes, let alone any estimate of uncertainties. Instead, 
they claim to interpret the results of past studies mainly by counting 
the number of studies coming to some conclusion, no matter how they got 
there or whether there have been later reinterpretations. Science 
requires analysis--not just counting studies. Climate scientists whose 
records they analyzed have gone on record as indicating that Soon and 
Baliunas misinterpreted their studies (e.g., the article by David 
Appell in the August 2003 issue of Scientific American) and numerous 
climate scientists have indicated (see same article) that Soon and 
Baliunas misinterpreted evidence of drought or precipitation as 
evidence of temperature changes, did not use records that resolve the 
climate changes of the late 20th century, and did not take into account 
whether or not variations in different regions were coincident. 
Soon and Baliunas also neglect undeniable evidence of substantial 
warming of hemispheric and global surface temperatures during the past 
few decades. So the Soon and Baliunas analysis fails just about every 
meaningful criterion that might be applied to determining the validity 
of an analysis that purports to evaluate current warming in the context 
of past temperature trends. This deeply flawed study thus contrasts 
sharply with other rigorous quantitative studies (as discussed in the 
Eos article) performed by numerous other scientists with appropriate 
training in the fields of climatology and paleoclimatology, which use 
proper statistical methods for inferring past temperature changes from 
proxy data, provide uncertainty estimates, and employ appropriate 
comparisons of current and past trends.

    Question 8. What was the earth's climate like the last time that 
atmospheric concentrations of carbon dioxide were at today's levels or 
about 370 parts per million (ppm) and what were conditions like when 
concentrations were at 550 ppm, which will occur around 2060 or so?
    Response. We have to go back far into the past to find carbon 
dioxide levels approaching today's levels. Ice core studies indicate 
that modern carbon dioxide levels are unprecedented for at least four 
glacial/inter-glacial cycles: in other words, for more than 400,000 
years. Other evidence suggests that carbon dioxide levels are now 
higher than they have been for at least 10 million years. One probably 
has to go back roughly 40-50 million years ago (see chapter 3 of the 
2001 IPCC working group 1 report) to find CO2 concentrations 
in the range of 550 ppm (i.e., roughly double their preindustrial 
concentration) and approximately 80 million years ago (i.e., the mid-
Cretaceous period when dinosaurs roamed the polar regions) to find 
CO2 levels in excess of 1200 ppm (a level that will be 
reached, at current rates of CO2 increase, within one to 
one-and-a-half centuries). Proxy evidence available for this period, 
tenuous though it is, suggests deep ocean temperatures 8-12 degrees C warmer 
than present. State-of-the-art climate model simulations performed by 
Bette Otto-Bliesner and colleagues using the National Center for 
Atmospheric Research (NCAR) global climate model, which incorporate 
such CO2 levels (and the continental configuration 
corresponding to the mid-Cretaceous period), indicate significantly 
warmer sea surface temperatures, with tropical sea surface temperatures 
approximately 4 degrees C warmer and polar sea surface temperatures 
approximately 6-14 degrees C warmer than present. The simulations indicate an 
absence of perennial sea ice at even the most polar latitudes.

    Question 9. Are you aware of any scientists beside the authors of 
the Soon et al article who support using ``wetness'' or ``dryness'' as 
indicators of past temperatures, instead of actual temperatures or 
proxy data that reflects temperatures?
    Response. I am not aware of any other scientist who has made the 
mistake of interpreting paleoclimatic information in this way. As 
discussed above, trained paleoclimatologists typically use statistical 
methods to identify the strength of the temperature signal in a proxy 
record prior to using it in reconstructing past temperature patterns.

    Question 10. Is there any known geologic precedent for large 
increases of atmospheric CO2 without simultaneous changes in 
other components of the carbon cycle and the climate system?
    Response. There is not, to my knowledge, such an example. As 
discussed above, the geological record shows a clear relationship 
between periods of high CO2 and relatively high global mean 
temperatures. The study of the relationship between changes in CO2 
and climate in the paleoclimate record is sometimes complicated by the 
fact that these relationships can be relatively complex during rapid 
transitions between glacial and interglacial climates such as those 
that occurred with the coming and going of ice ages on a 
roughly 100-thousand-year timescale over the past nearly one million 
years. However, one can turn to periods of time when the climate and 
CO2 were not varying rapidly, and thus the climate was 
approximately in an ``equilibrium'' state, for insights into the 
relationship between CO2 and climate. A perfect such example 
is the height of the last ice age, the so-called ``Last Glacial 
Maximum'' or ``LGM'' centered roughly 25 thousand years ago. At this 
time, CO2 was substantially lower than today (just below 200 
ppm) and global mean surface temperatures were several degrees (about 
4 degrees C or so) cooler than today. Such relationships between past CO2 
changes and global temperature changes typically suggest a 
``sensitivity'' of the climate system to enhanced CO2 of 
1.5 to 4.5 degrees C warming for each doubling of CO2 concentrations 
from their pre-industrial levels, similar to the range of sensitivities 
found in various climate models.
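
    The quoted sensitivity range can be translated into illustrative 
equilibrium-warming figures using the standard logarithmic dependence 
of warming on CO2 concentration, taking the pre-industrial 
concentration as roughly 280 ppm. The short calculation below is a 
simplification (it ignores ocean thermal lag and all non-CO2 forcings) 
and is offered only to show how the 1.5 to 4.5 degrees C per-doubling 
range scales to other concentrations.

      from math import log2

      def equilibrium_warming(co2_ppm, sensitivity, co2_preindustrial=280.0):
          # Warming scales with the number of CO2 doublings relative to the
          # pre-industrial concentration (simplified equilibrium relation).
          return sensitivity * log2(co2_ppm / co2_preindustrial)

      for co2 in (370.0, 550.0):
          low = equilibrium_warming(co2, 1.5)
          high = equilibrium_warming(co2, 4.5)
          print("CO2 = %.0f ppm: %.1f to %.1f degrees C above pre-industrial"
                % (co2, low, high))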

    Question 11. According to a study published in Science magazine 
recently [B.D. Santer, M.F. Wehner, T.M.L. Wigley, R. Sausen, G.A. 
Meehl, K.E. Taylor, C. Ammann, J. Arblaster, W.M. Washington, J.S. 
Boyle, and W. Bruggemann, Science, 301, 479-483, July 25, 2003], manmade 
emissions are partly to blame for pushing outward the boundary between 
the lower atmosphere and the upper atmosphere. How does that fit with 
the long-term climate history and what are the implications?
    Response. This is yet another independent piece of evidence 
confirming a detectable anthropogenic influence on climate during the 
late 20th century. This evidence is consistent with evidence for 
unprecedented surface warming during the past few decades--warming that 
indeed appears unprecedented for as long as we have records (i.e., for 
at least a thousand, and probably two thousand years). These changes, 
moreover, are consistent with predictions from climate models driven by 
known anthropogenic (human) forcing of the climate.

    Question 12. At this hearing, there were a number of calls for 
``sound science.'' Could you please explain what it is about the IPCC 
process that justifies respecting the IPCC results as the very soundest 
representation of the science of climate change?
    Response. The IPCC carries out a process for developing its 
summarization of the understanding of science that leads to one of the 
most rigorously peer-reviewed scientific documents in existence. 
Individual technical chapters are prepared by expert scientific teams 
that consider the full range of published papers in a subject area. 
This expert author team then solicits an initial peer review from a 
large number of other scientists in the field, drawing on those with 
the full range of views. The reports next go through a much wider 
review that is open to literally thousands of scientists around the 
world. Finally, countries, NGOs, and professional groups (such as 
business groups) are provided the opportunity to send in review 
comments. At each stage, authors consider each comment and document 
their response. The meticulousness and fairness of the revision process 
by the authors in response to reviewer comments is evaluated by an 
independent pair of ``review editors'' who are themselves top 
international climate scientists who are not authors of the report 
itself. The National Academy of Sciences, at President George W. Bush's 
request, and other national academies around the world have 
independently reviewed the process and the validity of the scientific 
findings of the IPCC and endorsed them. To question the IPCC and the 
IPCC process, as Dr. Legates does, thus does a disservice not only to 
thousands of the world's top scientists, but to the exceptional care 
and rigor of the process that has led to the unanimous adoption of all 
of the IPCC's assessments by representatives of the over 150 nations 
that participate in the IPCC process. The documents are very finely 
honed and carefully phrased. The scientific studies of those such as 
Dr. Legates are considered, as are their review comments, and it is 
terribly disingenuous, not to mention totally unacceptable to the 
international community, after all of the care and consideration put 
into these efforts, to try to so cavalierly dismiss them.
    In his testimony Legates alleges that the IPCC report 
misrepresents what is known about climate change in past centuries and 
that it somehow replaces conventional wisdom with dramatically new 
conclusions. One must conclude that Legates either did not read the 
report, or if he did, he did not understand what he read; for if he had 
he would certainly have to recognize the factually incorrect nature of his 
comments. The IPCC chapter dealing with paleoclimatic evidence 
discussed the full range of regional evidence described in the peer-
reviewed literature as well as evidence from hemispheric composites that average 
the information from different regions. The paper by Soon and Baliunas 
is a dramatic throwback to the state of our knowledge many decades ago, 
while the IPCC report provides a far more up-to-date assessment of all 
of the available knowledge regarding past climate change. The Soon and 
Baliunas papers provide a glaring example of the very ``unsound'' 
science that Senator Inhofe claims to be concerned about, as numerous 
mainstream climate researchers have now opined in the media, and in the 
peer-reviewed scientific literature.

    Question 13. In your opinion, how do the processes used by the IPCC 
and the National Academy of Sciences compare to the process used in the 
publication of the Soon and Baliunas paper and other papers by so-
called ``contrarians''? In the next IPCC assessment, would you expect 
that the Soon and Baliunas paper will be considered and cited, putting 
it into the context of other papers and findings and explaining why it 
has differences or similarities?
    Response. As discussed above, the IPCC assessment is one of the most rigorously 
peer-reviewed scientific documents in existence. By contrast, the 
contrarians rarely publish in the peer-reviewed literature and when 
they do, it is not uncommon to discover, as in the case of the Soon and 
Baliunas paper, irregularities in the peer-review process. Publisher 
Otto Kinne indeed indicated that the review process at Climate Research 
``failed to detect methodological flaws'' in the Soon and Baliunas 
paper. I would indeed expect that in the next IPCC assessment, the Soon 
and Baliunas paper will be discussed and evaluated in the context of 
other available evidence and in the context of how it is faring in the 
literature when the IPCC review takes place (e.g., if there are as many 
criticisms of their work as at present, and they have not been 
seriously addressed by more careful followup studies by the authors), I 
would think their conclusions will be rejected as scientifically 
unsound. I would not presume to know in detail what the result of the 
assessment will be, but I believe it fair to assume that the rigorous 
review provided by the IPCC assessment process will not fail to 
identify the methodological flaws that appear to have slipped through 
the cracks in their publication in the journal ``Climate Research''.

    Question 14. Could you explain the IPCC's lexicon for indicating 
relative levels of confidence and how you would suggest this relates to 
the information being ``real'' and ``probable''? When IPCC says 
something is ``very likely,'' just what do they mean?
    Response. To avoid the type of misunderstanding that often results 
when scientists seek to convey scientific results to a non-
technical audience, the IPCC specifically sought to employ a lexicon in 
which terms such as ``likely'' or ``probable'' or ``very likely'' had 
specific statistical meanings attached to them. A fairly conservative 
standard was typically employed in this process. Consider the 
conclusion in the IPCC report that the 1990's are the warmest decade in 
at least the past 1000 years for the Northern Hemisphere average 
temperature. This conclusion is based on the fact that the average 
warmth of the 1990's exceeds that for any reconstructed decade in the 
reconstructed Northern Hemisphere series. To be more specific, the 
1990's warmth exceeds any past decade by two standard errors. This 
corresponds to a roughly 97.5 percent probability based on standard 
statistical assumptions. Probabilities of 90 percent-99 percent are 
termed ``very likely'' in the lexicon typically adopted by the IPCC report. 
However, this conclusion was offered as only ``likely'' (corresponding 
to a 66 percent-90 percent level of probability) rather than the more 
stringent ``very likely'' because it was based on only a small number 
of independent studies at the time. Since that time, of course, several 
more studies have affirmed this conclusion, and one might imagine that 
a more stringent conclusion will be offered in the future. This example 
nonetheless illustrates the manner by which IPCC adopted conservative 
standards in their use of terms such as ``likely'' or ``very likely''. 
It is instructive to contrast that standard with the one taken in the 
Soon and Baliunas paper. The Soon and Baliunas paper does not provide a 
quantitative estimate of any quantity (such as average Northern 
Hemisphere temperature), or any assessment of uncertainty. It is thus 
not possible for the authors to attach any meaningful statement of 
likelihood or probability to any of their conclusions. They thus 
provide no basis for judging the validity of any of the claims made in 
their paper, in striking contrast to the rigorous standards adopted by 
IPCC, and by the work of my collaborators and me.
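
    The correspondence described above--between a computed probability 
and the IPCC's likelihood terms--can be written as a simple lookup. 
Only the two ranges quoted in this response are included in the sketch 
below; the full IPCC lexicon defines additional categories not listed 
here.

      from statistics import NormalDist

      def ipcc_likelihood(probability):
          # Map a probability to the two likelihood terms quoted above.
          if 0.90 <= probability <= 0.99:
              return "very likely"
          if 0.66 <= probability < 0.90:
              return "likely"
          return "outside the two quoted ranges"

      # A two-standard-error exceedance in a one-sided test:
      p = NormalDist().cdf(2.0)
      print("%.3f -> %s" % (p, ipcc_likelihood(p)))   # about 0.977 -> very likely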

    Question 15. In his questioning, Senator Inhofe cited results 
regarding the potential costs of implementing the Kyoto Protocol from 
the Wharton Econometric Forecasting Association (WEFA). I realize that 
you are not an economist, but would you please comment as a scientist 
on the following two points:
    (a) Senator Inhofe cited economic projections (e.g., 14 percent 
increase in medical costs; real income drop of $2,700 per household) 
going out a decade or so into the future and without any indication of 
uncertainty on these estimates implying an accuracy of two-significant 
figures. Senator Voinovich cited numbers to similar claimed accuracy 
going out 20-25 years; again without any indication of uncertainty. 
Could you please comment on what you think are the relative strengths 
and weaknesses of making climate projections based on use of physical 
laws versus economic extrapolations and what sorts of relative 
uncertainty should likely be associated with each type of estimate so 
that they can be interpreted in a comparative way by decisionmakers?
    Response. Indeed, it is somewhat remarkable that politicians who 
reject the validity, for example, of climate model simulations, which 
are based on solution of the laws of physics, can so uncritically 
accept precise economic projections based 
on untestable and unverifiable assumptions governing human 
decisionmaking, and on speculative future scenarios that depend on the 
unfolding of the political process. Specific numbers of ``14 percent'' 
or ``$2,700'' are of course entirely dependent on the assumptions that 
go into such forecasts. Because assumptions about future economic 
growth and about future government policies are necessarily uncertain, 
estimates of changes in costs or income from such models must also be 
quite uncertain, and so should have large uncertainties (or ranges) 
associated with them. The faith expressed in such poorly constrained 
economic estimates by the same individuals who express strong 
skepticism of results from far more physically based and testable 
climate simulation models strikes me as a remarkable inconsistency.
    (b) Because you may be familiar with the 1997 study by the World 
Resources Institute entitled ``The costs of climate protection: A guide 
for the perplexed,'' which explains the important role of assumptions 
in leading to very different cost estimates from even one economic 
model, much less among different models, could you explain in a 
comparative fashion how robust the findings regarding the ``hockey 
stick'' behavior of the climate in the various studies carried out by 
you and fellow investigators may be to variations in the assumptions 
made. It's clear that varying the assumptions among reasonable 
possibilities in the economic models can change what is calculated to 
be a few percent impact on the economy to a small gain; would changes 
in the assumptions you are making change the indication of strong 
anthropogenic warming to an indication of human-induced cooling?
    Response. The millennial temperature reconstruction (or ``hockey 
stick'' as it was termed by former GFDL head Dr. Jerry Mahlman) is 
based on a rigorously validated statistical model with demonstrated 
predictive skill based on comparisons with independent data. The 
primary conclusions drawn from the reconstruction (e.g. the anomalous 
nature of late 20th century warmth) are based on a conservative 
appraisal of the uncertainties in the reconstruction, and are not 
strongly dependent on the assumptions made. The same conclusions have 
now been affirmed by several other independent empirical and model-
based estimates. As discussed above, this contrasts with economic 
predictions which are necessarily far more sensitive to the assumptions 
that go into them.

    Question 16. In his opening statement, Senator Inhofe concluded 
that the Soon and Baliunas paper is ``credible, well-documented, and 
scientifically defensible.'' By contrast, your testimony indicated that 
the experts in the field do not consider this to be the case. Does one 
have to be an expert in the field to understand the apparent problems 
with this paper? If not, could you summarize in terms for non-
scientists what the key problems with the paper are?
    Response. The mainstream scientific research community has indeed 
rejected the approach, interpretation, and conclusions advanced by Soon 
and Baliunas as fundamentally unsound. The major flaws in the analysis, 
as described in earlier comments in more detail, are basic enough that 
they can be understood by a non-specialist. In short, the analysis by 
Soon and Baliunas is unsound because (a) they inappropriately 
interpreted indicators of past precipitation as evidence of past 
temperature changes, (b) they did not use an approach which takes into 
account the simultaneity and lack of simultaneity of variations in 
different regions, and (c) they did not employ a proper standard for 
evaluating recent changes (i.e., changes during the past few decades) 
in the context of past variations. Indeed, they also misinterpreted 
past published work, and did not provide any quantitative estimates, 
let alone estimates of uncertainty. It is difficult to find anything of 
scientific merit at all in their published work.

    Question 17. As I understand it, the data that both you and Soon 
and Baliunas draw from is the same, and it is not a question of the 
data being the problem. Instead, it is apparently the processing of the 
data that you are indicating has been done in a substandard way by Soon 
and Baliunas. Is this correct?
    Response. Keep in mind that Soon and Baliunas, unlike my 
collaborators and me, don't actually analyze any data at all. They 
simply claim to have `interpreted' past studies (often incorrectly). 
Soon and Baliunas refer to a number of proxy data studies that describe 
data which we employed in our analysis. There are also many proxy data 
that we used in our study that Soon and Baliunas do not discuss. 
However, as alluded to in the question, the real issue isn't what data 
were used. Numerous independent now-published studies employing widely 
different assemblages of proxy climate data sets have demonstrated 
(e.g., as in the Eos article discussed earlier) a similar pattern of 
past variations in hemispheric mean temperature. The real issue with 
Soon and Baliunas is indeed not the set of studies that they claim to 
have interpreted, but rather the approach that they took to 
interpreting those studies. As indicated in previous comments, Soon and 
Baliunas, unlike mainstream climate researchers, did not employ a 
method for isolating the actual temperature signal in the proxy records 
before using the records to draw conclusions regarding past temperature 
changes. Unlike mainstream researchers, they did not aggregate 
information in a way that addresses whether or not variations in 
different regions are simultaneous. Unlike mainstream researchers, they 
did not analyze the actual modern (late 20th century) warmth in the 
context of past variations. And finally, they did not even produce a 
quantitative estimate of past temperature variations, let alone an 
estimate of uncertainty.

    Question 18. The Soon et al literature review has been described as 
shifting the paradigm away from the ``hockey stick'' description of 
global warming. It seems that that review simply attempted to revive an 
older theory of climate change that has been discarded by NOAA, the 
USGCRP, the NAS, the IPCC, etc. Please comment.
    Response. The mainstream scientific community has clearly and 
decisively rejected the Soon and Baliunas papers as scientifically 
unsound, which undermines any claims made by industry-funded special 
interest groups or politicians that the papers provide any valid 
scientific conclusions, let alone the basis for a shift of the 
``paradigm''. Indeed, the Soon and Baliunas papers simply promote a 
number of long-discredited myths which have been replaced, in recent 
decades, by far more rigorous and quantitative analyses as described by 
the IPCC, USGCRP, NAS, and other mainstream scientific organizations 
and funding agencies. In short, the Soon & Baliunas papers simply 
repackage myths that were discredited more than a decade ago.

    Question 19. In an opinion-editorial by former Secretary of Energy 
James Schlesinger, he suggested that ``we have only a limited grasp of 
the overall forces at work, . . .'' in terms of global climate change. 
Could you please summarize for us what the scientific community 
considers the key forces at work over the past 1000 years, how well 
these estimates are understood, and whether there is a general 
consistency or inconsistency between the various forcings and the 
climate estimates that you and colleagues have developed for the last 
1000 years?
    Response. Mr. Schlesinger's assertions entirely mischaracterize the 
nature of our scientific knowledge, which has advanced tremendously 
during the past several decades. In fact, a very large number of peer-
reviewed scientific studies have been published in the leading 
scientific journals such as Nature and Science in the past two decades 
elucidating the role of natural and anthropogenic factors in observed 
climate changes. Physically based models have been developed and 
validated against observations, and these models reproduce complex 
climate phenomena such as El Nino. These same models have been driven 
with the primary ``external'' factors that are believed to govern 
climate variations on timescales of decades and centuries. These 
external factors include natural factors, such as the modest estimated 
variations in radiative output of the Sun, which varies by a fraction 
of a percent over time, variations in the frequency and intensity of 
explosive volcanic eruptions, which have a several-year cooling effect 
on the climate through the injection of reflective volcanic aerosols 
into the stratosphere, and very small changes in the Earth's orbit 
relative to the Sun that occur on multi-century timescales. These 
external factors also include the ``anthropogenic'' influences of 
increased greenhouse gas concentrations due to fossil fuel burning, 
changes in the reflective properties of the land surface due to human 
land use alterations, and the regional cooling effect of anthropogenic 
sulphate aerosols in certain industrial regions. When driven with these 
factors, these climate models have demonstrated a striking ability to 
reproduce observed global and hemispheric temperature trends during the 
20th century, as well as longer-term trends in past centuries as 
reconstructed from proxy data. Such results are nicely summarized in 
the 2001 IPCC scientific working group report. Mr. Schlesinger would 
have benefited from a reading of this report, or the follow-up National 
Academy of Sciences report that endorsed the key IPCC conclusions, 
prior to writing his op-ed piece, which reflects a decades-old 
understanding of the state of the science.

    Question 20. In Dr. Soon's testimony, he speaks about there being 
``warming'' and ``cooling'' for different periods. If he did not 
construct an integral across the hemisphere or a real timeline, can he 
say anything other than that there were some warm periods and cool 
periods?
    Response. Aside from not adequately distinguishing temperature 
changes from hydrological changes, Dr. Soon and his collaborators 
indeed did not even attempt to estimate contemporaneous patterns of 
past temperature change, let alone an integral across the hemispheric 
domain to assess hemispheric mean temperature changes. It is unclear 
what, if any, meaningful conclusions can be drawn from the Soon and 
Baliunas study.

    Question 21. Dr. Soon indicates that ``local and regional, rather 
than global average changes are the most relevant and practical measure 
of climate change and its impact.'' Could you please comment on this, 
including the relative likelihood of identifying a signal of climate 
change amidst the local fluctuations? In what sense might local changes 
be the most practical measure? In that the primary forcings of the 
climate are global in scale, does it not make most sense to first 
determine how the large-scale rather than the regional climate might be 
affected?
    Response. Dr. Soon's comments are truly misguided. Firstly, the 
surface temperature reconstructions published by my colleagues and I 
explicitly resolve regional patterns of surface temperature, so it is 
entirely unclear why Dr. Soon believes that we don't address regional 
climate changes. Had Dr. Soon understood our papers, he would be aware 
that we do. However, unlike the study that Dr. Soon published, our 
reconstructions explicitly take into account the issue of the relative 
timing and simultaneity of surface temperature changes in different 
regions. Only by doing this is it possible to form an integrated 
measure of temperature changes such as hemispheric mean temperature. 
Scientists with training in climatology, statistics, and other areas of 
research required in the study of paleoclimate reconstruction know that 
the signal-to-noise ratio of any surface response to global radiative 
climate forcing increases as the scale of spatial averaging increases. 
In discussions of climate change it is thus the integral of the surface 
temperature field over an entire hemisphere or globe that constitutes 
the most useful single variable for detecting, and attributing causal 
factors to, observed changes. The spatial signature of the surface 
temperature signal (both with respect to position on the surface of the 
Earth, and altitude in the atmosphere) can nonetheless help to 
distinguish one source of climate forcing (e.g. solar) from another 
(e.g. enhanced greenhouse gases). My colleagues and I have indeed used 
the spatial patterns of surface temperature changes in past centuries 
to identify the role of natural external forcing of climate [Shindell, 
D.T., Schmidt, G.A., Mann, M.E., Rind, D., Waple, A., Solar forcing of 
regional climate change during the Maunder Minimum, Science, 294, 2149-
2152, 2001; Waple, A., Mann, M.E., Bradley, R.S., Long-term Patterns of 
Solar Irradiance Forcing in Model Experiments and Proxy-based Surface 
Temperature Reconstructions, Climate Dynamics, 18, 563-578, 2002; 
Shindell, D.T., Schmidt, G.A., Miller, R., Mann, M.E., Volcanic and 
Solar forcing of Climate Change During the Pre-Industrial era, Journal 
of Climate, in press, 2003]. Both Dr. Soon and Dr. Legates advocate in 
their testimony a primary role of solar forcing in recent climate 
change, though they provide no quantitative justification for this 
assertion at all. In fact, nearly a dozen detailed ``detection'' and 
``attribution'' studies published during the past decade by leading 
climate researchers in the premier international scientific peer-
reviewed journals such as Science and Nature, have shown that the 
vertical and horizontal pattern of observed warming is inconsistent 
with the response of the climate to solar forcing, but is consistent 
with the response of the climate to anthropogenic forcing. Thus a 
prudent use of spatial information, as described in various studies by 
leading climatologists, including my collaborators and me, can 
potentially help elucidate the roles of natural and anthropogenic 
factors. However, Dr. Soon's studies are deficient in their use of any 
such information, and provide no insights into the factors governing 
past climate change.

    Question 22. This year, the western United States is anomalously 
hot and dry. The eastern United States is wetter than it has been since 
approximately 1891 and cool. Europe is hotter and drier than it has 
been in about 150 years. If we assume for the moment that these types 
of anomalies would persist for 50 years, are these the types of 
anomalies that Soon and Baliunas would consider as being indicative of 
there being an equivalent to the Medieval Warming in the western United 
States and Europe while at the same time there is the equivalent of the 
Little Ice Age in eastern North America? How would your type of 
approach vary in its analysis of the year 2003 compared to the 
apparently contradictory results that Soon and Baliunas would have?
    Response. Indeed, as my colleagues and I discussed in our peer-
reviewed articles in ``Eos'' and more recently ``Geophysical Research 
Letters'' [Mann, M.E., Jones, P.D., Global Surface Temperatures over 
the Past two Millennia, Geophysical Research Letters, 30 (15), 1820, 
doi: 10.1029/2003GL017814, 2003] the Soon and Baliunas approach is 
indeed internally contradictory in that it would separately identify 
anomalies for even a given year, such as 2003, as simultaneously 
supportive of conditions they would classify as associated with a 
``Little Ice Age'' and a ``Medieval Warm Period'' anomaly. As outlined 
in the question, this year's pattern of climate anomalies is a perfect 
example. Trained climatologists and paleoclimatologists know that one 
must independently evaluate precipitation or drought information from 
temperature information in reconstructing past climate patterns. For 
example, colleagues of mine and I have developed reconstructions of 
patterns of drought over the continental U.S. in past centuries from 
drought-sensitive tree-ring data [Cook, E.R., D.M. Meko, D.W. Stahle, 
and M.K. Cleaveland, Drought Reconstructions for the Continental United 
States, Journal of Climate, 12, 1145-1162, 1999; Zhang, Z., Mann, M.E., 
Cook, E.R., Alternative Methods of Proxy-Based Climate Field 
Reconstruction: Application to the Reconstruction of Summer Drought 
Over the Conterminous United States back to 1700 From Drought-Sensitive 
Tree Ring Data, Holocene, in press, 2003]. The drought reconstructions 
display a quite different pattern of behavior over time from 
reconstructions of Northern Hemisphere mean temperatures, just as 
patterns of drought over the continental U.S. during the 20th century 
as recorded from instrumental data show relatively little in common 
with instrumental Northern Hemisphere mean temperature estimates (for 
example, the most prominent drought episode was the `dust bowl' of the 
1930's, while the most prominent anomaly in the Northern Hemisphere 
temperature record is the late 20th century warming). Drought and 
temperature are essentially independent climate variables. The papers 
by Soon and Baliunas seem not to recognize this fundamental fact. 
Finally, there is an irony in the testimonies of Soon and Legates in 
that they seem to be criticizing my colleagues and me for supposedly 
only focusing on the reconstruction of temperature patterns, when in 
fact we, and not they, have published work reconstructing past patterns 
of drought, precipitation, and atmospheric circulation from proxy 
climate data. However, we have made careful use of the information 
contained in proxy data in independently reconstructing patterns of 
temperature and patterns of drought. By contrast, Soon and colleagues 
hopelessly convolute such information in their interpretations of past 
climate trends.

    Question 23. Could you provide a more detailed explanation for the 
apparent Northern Hemisphere cooling from the 1940's to 1970's? What 
is the general expectation of what would have happened to the climate 
in the absence of any human influences, so just continuing on from the 
trend for the last 1000 years prior to human intervention?
    Response. In fact, this issue has been studied by quite a number of 
climate scientists for well over a decade. As I mentioned in my 
testimony, a statistically significant cooling trend from the 1940's to the 
1970's is not evident for the globe, but only for the Northern 
Hemisphere. Dr. Legates' testimony on this matter is incorrect in that 
regard. The observed record of global-mean temperature changes over the 
past 100 years indicates a warming to about 1940, little change from 
1940 to the mid-1970's, and then further warming. Legates implies in 
his comments that these changes are inconsistent with our current 
understanding of the factors governing climate change. This is also 
incorrect. In order to understand these observed changes it is 
necessary to consider all likely causal factors, both anthropogenic and 
natural. Anthropogenic factors include the warming effects of 
greenhouse gases and the cooling effects of sulfate aerosols. Natural 
factors include changes in the output of the Sun and the effects of 
explosive volcanic eruptions (such as the El Chichon eruption in 1982 
and the Mt. Pinatubo eruption in 1991), and internal variability 
associated with natural climate oscillations in the ocean circulation 
and various modes of coupled ocean-atmosphere variability (such as El 
Nino). When all of these factors are considered, models give an 
expected pattern of 20th century temperature changes that is in 
remarkable agreement with the observations--and the models clearly show 
the three phases noted above. In particular, the leveling off of the 
warming trend over 1940-1975 turns out to be explained largely by the 
relatively rapid increase in cooling effects of sulfate aerosols as the 
world emerged from the Depression and WWII (and perhaps a small 
contribution from natural, internal variations in ocean currents). This 
cooling temporarily offset the warming due to increasing concentrations 
of greenhouse gases. This was first pointed out in a paper by Dr. Tom 
Wigley of the National Center for Atmospheric Research (NCAR) in Nature 
in 1989 and has been verified by numerous additional studies since. 
This agreement between models and observations shows quite clearly that 
human factors have been the dominant cause of global-scale climate 
change over the past 50 years, contrary to the repeated assertions by 
Soon and Legates that they are a manifestation of natural climate 
variability. In the absence of anthropogenic factors, model simulations 
indicate that natural factors alone would have led to a slight cooling 
trend of global temperatures over the 20th century [Crowley, T.J., 
Causes of Climate Change Over the Past 1000 Years, Science, 289, 270-
277, 2000], in stark contrast to the dramatic warming that has been 
observed.

    Question 24. It was suggested at the hearing that increased 
CO2 could enhance plant life, and that since plants produce 
oxygen, this could lead to more O2 and less CO2. 
Could you please comment on the likelihood of this and how large the 
percentage changes could possibly be, recognizing that as the CO2 
decreased, this would presumably mean the plants would do less well and 
conditions would cool?
    Response. Those suggestions (for example, Legates' testimony with 
regard to the role of the `CO2 fertilization' effect) 
represent a misunderstanding of the factors governing carbon cycle 
dynamics and their interaction with climate. In fact, careful studies 
have been performed with coupled climate/terrestrial carbon cycle 
models that take into account the internal coupled interactions between 
climate and carbon dioxide, accounting for multiple potential factors 
such as (a) the potential `CO2 fertilization' effect in 
which productivity of plants increases in a higher CO2 
environment, (b) the impact of climate on productivity in which higher 
surface temperatures favor enhanced plant growth, and (c) the feedback 
of CO2 back on surface temperature alluded to in the 
question [see chapter 3 of the 2001 IPCC working group 1 report]. Such 
studies show that changes in surface temperature, through their impact 
on biological productivity, have led to, at most, changes of 5 to 10 
ppm in CO2--levels over the past 1000 years [see Gerber, S, 
Joos, F., Bruegger, P.P., Stocker, T.F., Mann, M.E.; Sitch, S., 
Constraining Temperature Variations over the last Millennium by 
Comparing Simulated and Observed Atmospheric CO2; Climate 
Dynamics, 20, 281-299,2003]. Such changes are minimal in comparison 
with the dramatic increases in CO2 concentrations of more 
than 80 ppm associated with human activity, suggesting that the 
`CO2 fertilization' effect advanced by Legates in his 
testimony in reality has a minimal role, at best, in the modern changes 
taking place in CO2 concentrations and climate.

    Question 25. Could you please clarify your remarks regarding the 
FACE experiments? When you say that increased CO2 leads to 
more uptake and that they will rot, do you mean that all plants will 
grow and eventually die and decay, and that increased CO2 
really only ties up a bit more carbon in the process?
    Response. The sequence of questions and time allotted did not allow 
me to adequately explain this basic, but important point. The point I 
was making in my testimony is that the supposed increase in the 
terrestrial carbon reservoir due to enhanced plant growth that is 
argued to occur in a higher CO2 concentration atmosphere 
(the so-called `CO2 fertilization' effect) is not a long-
term, sustained effect. It is a short-term effect that lasts only 
over the generational timescales of forest stands. Any depletion of the 
atmospheric carbon reservoir due to enhanced growth or productivity of 
plants argued to arise from higher CO2 concentrations is 
short lived, because the plant or tree eventually dies and gives its 
carbon back to the atmosphere either through microbial activity 
(rotting) or burning. In other words, when plants, along with any 
potential additional organic carbon storage that might arise from 
enhanced biological activity, eventually die, they do not simply pile up in place 
with their carbon reservoir intact (which is what is implicitly assumed 
by those who argue that `CO2 fertilization' represents a 
potential long-term offset to anthropogenic CO2 increases).
    Instead, this carbon is acted upon by biological, chemical, or 
physical processes which serve to add the carbon back to the 
atmosphere. Thus, the so-called ``CO2 fertilization'' 
effect cannot serve as a permanent offset to anthropogenic increases 
in the atmospheric carbon budget (i.e., atmospheric CO2 
concentrations), as implied by Legates in his testimony. It may simply 
act to slow, slightly, the rate of CO2 increase in the 
atmosphere by slightly increasing the storage rate (but not the 
residence time) of carbon in the terrestrial biosphere.
    Another way to estimate the potential influence is by considering 
the total amounts of carbon stored in vegetation. At present, 
about 600 billion tons of carbon are tied up in the aboveground 
vegetation. About 2-3 times this much is tied up in roots and below 
ground carbon, which is a more difficult carbon pool to augment. By 
comparison, scenarios for fossil fuel emissions for the 21st century 
range from about 600 billion tons (if we can keep total global 
emissions at current levels, which implies controls well beyond what the 
Kyoto Protocol calls for) to over 2500 billion tons if the world 
increases its reliance on combustion of coal as economic growth and 
population increase dramatically. These numbers clearly indicate that 
sequestering a significant fraction of projected emissions in 
vegetation is likely to be very difficult, especially as forests are 
cleared to make way for agriculture and communities. While there are 
possibilities of storage in wells and deep in the ocean, stabilizing 
the atmospheric CO2 concentration would require gathering up 
the equivalent of 1 to 2 times the world's existing above ground 
vegetation and putting it down abandoned oil wells or deep in the 
ocean. While CO2 fertilization will help to increase above 
ground vegetation a bit, storing more than a few tens of percent of the 
existing carbon would be quite surprising, and this is likely to amount 
to only a few percent of the global carbon emissions projected for the 
21st century.
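    The rough magnitudes involved in this comparison can be illustrated 
with a short calculation. The following sketch (in Python) uses only the 
round figures quoted above; the 10 and 30 percent storage gains are 
hypothetical illustrations, not estimates drawn from the studies cited.

    # Back-of-the-envelope comparison of added vegetation carbon storage with
    # projected 21st-century fossil-fuel emissions (billions of tons of carbon).
    aboveground_carbon = 600                    # carbon in aboveground vegetation
    emissions_low, emissions_high = 600, 2500   # 21st-century emission scenarios

    for boost in (0.10, 0.30):                  # hypothetical 10% and 30% storage gains
        extra_storage = boost * aboveground_carbon
        print(boost, extra_storage / emissions_low, extra_storage / emissions_high)
    # Even these optimistic gains are a modest fraction of projected cumulative
    # emissions, particularly under the higher-emission scenarios.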

    Question 26. Senator Thomas stated that ``[t]he rise in temperature 
during the 20th century occurred between 1900 and 1940.'' Could you 
please provide an indication of how much change occurred during this 
period based on internationally accepted observations, and compare this 
to the total change during the 20th century? Also please comment on 
whether it is scientifically representative to calculate a change 
starting with a cold period due to volcanic eruptions and end it during 
a period devoid of volcanic eruptions and then compare it to the 
century long period, which had major volcanic eruptions in both the 
first and last decades of the century.
    Response. A cursory review of the actual evidence (see e.g. Figure 
2.1 of chapter 2 of the 2001 IPCC Scientific Working Group report) 
indicates the following approximate attributes in the observed record 
of global-mean temperature changes over the past 100 years: a warming 
of approximately 0.3 °C to 1940, a statistically insignificant change 
(given the uncertainties) from 1940 to the mid-1970's, and then an 
additional warming of approximately 0.5 °C from 1970 to 2000. Senator 
Thomas' claim is thus clearly mistaken. As discussed in my answer to an 
earlier question, this pattern of behavior is reproduced closely by 
models driven with estimates of both natural and anthropogenic forcing 
of the climate during the 20th century. The period of relative stasis 
in global mean temperatures from 1940 to 1970, in these model 
simulations, appears to result from the cooling impact of anthropogenic 
aerosols (for which there was a large increase during that time period) 
as well as a cooling contribution from explosive volcanic eruptions 
that occurred during that period, which tended to offset the warming 
influence of increased greenhouse gas concentrations during that time 
period. However, much of the overall warming of the globe during the 
20th century (which is between 0.6 °C and 1.0 °C depending on the 
precise instrumental data set used, and the precise endpoints of the 
interval examined) is clearly a result of increased greenhouse gas 
concentrations, as established in these simulations.

    Question 27. Senator Thomas stated that ``there is no real 
evidence'' that the greenhouse gases are affecting the climate. Could 
you please summarize the available evidence explaining their probable 
effect? Please include in your answer a specific example of a proxy 
indicator such as tree rings and explain the various subtleties in 
deriving a temperature.
    Response. As discussed in my answers to previous questions, the 
fact that increased greenhouse gas concentrations have a role in 20th 
century warming is no longer considered to be in doubt by mainstream 
researchers. Even noted contrarians such as Patrick Michaels of the 
Cato Institute now agree with this conclusion. The only room for 
legitimate scientific debate concerns the relative role of greenhouse 
gas concentrations vs. other factors, and the rate of future warming 
that may occur. Evidence establishing the role of anthropogenic 
greenhouse gas increases in 20th century warming includes the 
agreement of the full spatial (horizontal and vertical) pattern of 
warming with predictions from model simulations, and the fact that only 
model simulations which include anthropogenic forcing can match the 
observations, as discussed earlier. Evidence for an anthropogenic 
influence on climate also comes from evidence of the anomalous nature 
of late 20th century warmth in a very long-term context (i.e., in at 
least the past millennium, and potentially the past several millennia 
or longer). One such source of evidence for this conclusion comes from 
proxy climate records (such as tree rings, corals, and ice cores) that 
can be used to reconstruct long-term temperature patterns based on a 
careful consideration of the temperature signal in those data, as 
discussed in my response to earlier questions. But other evidence of 
anomalous late 20th century warmth comes from indications of 
unprecedented melting of mountain glaciers the world over (including 
meltback in the Alps so dramatic that it recently revealed the now-
famous ``Ice Man'' who had been trapped in ice for more than 5000 
years), and evidence of unusual phenological changes (e.g. the timing 
of flowering of plants) during the late 20th century.

    Question 28. Senator Carper asked the other two witnesses if they 
thought it ``possible to emit unlimited amounts of CO2 into 
our atmosphere without having any impact on climate or temperature?'' 
What is your expectation of what would occur? That is, how much change 
in the CO2 concentration would cause how much of a response?
    Response. The response of global mean surface temperature to 
increased CO2 varies roughly as the logarithm of the 
CO2 concentration (meaning that increments in temperature 
scale with the percentage change in CO2 rather than the 
change in amount itself). This is a very well-known and long-
established result that follows from basic theoretical 
considerations of radiative transfer theory and is embodied in 
experiments using global climate models with varying levels of CO2 
concentrations. The statistical relationship between estimated 
concentrations of CO2 and the admittedly crude estimates of 
global mean temperatures at various periods in the geological past or 
during past glacial intervals, conforms relatively well to this 
theoretical relationship within estimated uncertainties [see e.g. the 
textbook ``Earth's Climate Past and Future'' by W.F. Ruddiman (W.H. 
Freeman and Co.), 2001]. I was extremely surprised when Dr. Soon 
indicated that he did not know how to answer Senator Carper's question, 
suggesting that he is not familiar with this fairly basic scientific 
knowledge.
    This result implies, in the absence of any other factors, a linear 
increase in temperature over time in response to an exponential 
increase in CO2 (which is not a bad description of the 
character of the CO2 trend associated with exponentially 
increasing anthropogenic activity over the past two centuries). Climate 
models tell us that the ``slope'' of that linear increase is between 
1.5 °C and 4.5 °C for each doubling of the CO2 concentration. 
In this context, the testimony of Dr. Legates that an arbitrary increase 
in greenhouse gases would lead only to a ``slight'' increase in 
temperature seems especially disingenuous. Dr. Legates seems to be 
suggesting that the warming would be small despite the magnitude of the 
CO2 increase. Yet, both model-based studies and analyses of 
how climate changes in the past may have been influenced by changes in 
atmospheric composition suggest that a 1.5 °C to 4.5 °C increase in 
temperature is likely for each doubling of the CO2 
concentration. Thus, a quadrupling of the CO2 concentration, 
which is plausible if the world chooses to derive most of its future 
energy from coal, would be expected to be associated with a roughly 
3 °C to 9 °C increase in global mean temperature. Does Dr. Legates 
consider this a ``slight'' increase in temperature?
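    The logarithmic scaling and the quadrupling arithmetic described 
above can be checked with a short calculation. This is a minimal sketch 
in Python, assuming only the 1.5 °C to 4.5 °C per-doubling sensitivity 
range quoted in this response; the function and variable names are ours.

    import math

    def warming(concentration_ratio, sensitivity_per_doubling):
        # Approximate warming for a CO2 increase by the given factor,
        # using the logarithmic scaling described above.
        return sensitivity_per_doubling * math.log2(concentration_ratio)

    for sensitivity in (1.5, 4.5):
        print(warming(2, sensitivity))   # one doubling: 1.5 to 4.5 degrees C
        print(warming(4, sensitivity))   # quadrupling (two doublings): 3 to 9 degrees C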

    Question 29. In his testimony, Dr. Legates indicated that there 
were historical cases where the temperature has gone up but the 
CO2 has fallen. Do you agree there were such periods and how 
would you explain this?
    Response. It is certainly the case that this has happened in the 
past. However, it is hardly surprising, and certainly not inconsistent 
with our established understanding of the various factors that 
influence surface temperatures. The warming response to increased 
greenhouse gas concentrations lags the actual increase in greenhouse 
gas concentrations in the atmosphere potentially by several decades, 
due to the sluggish response of the oceans, which have an enormous 
thermal capacity compared to the atmosphere, to increased surface 
radiative forcing. So warming is not expected to be contemporaneous 
with changes in CO2, but instead, to lag it by several 
decades. In addition, greenhouse gases are certainly not the only 
factor affecting the average surface temperature of the Earth. There 
are other anthropogenic factors, such as increased sulphate aerosols, 
which can have a cooling effect on the climate, and natural factors, 
such as volcanic activity, modest natural variations in solar output, 
and internal dynamics associated with climate events such as El Nino, 
which also influence the average surface temperature of the globe. At 
any particular time, these other factors may outweigh the warming 
effect due to increased greenhouse gases. For example, the relative 
lack of warming during the period 1940-1970 appears to be related to a 
combination of such factors, as discussed in my response to an earlier 
question. But while these other factors tend to cancel over time, the 
increased greenhouse gases lead to a systematic warming that will not 
cancel out over time. It is for precisely this reason that late 20th 
century warming now appears to have risen above the range of the 
natural variability of past centuries.
    There are two myths commonly perpetuated by climate change 
contrarians with regard to the relationship between historical CO2 
and temperature variations that are worth addressing in particular:
    (1) Contrarians sometimes argue that the seasonal 
cycle in atmospheric CO2, which is opposite to the seasonal 
cycle in temperature in the Northern Hemisphere (maximum atmospheric 
CO2 levels over the course of the year occur during the 
Northern Hemisphere winter), implies a negative feedback of temperature 
on CO2 concentration. Such an argument is based on a most 
profound misunderstanding of the basic principles governing atmospheric 
chemistry. Properly trained atmospheric chemists know that the seasonal 
cycle in global atmospheric CO2 concentration is governed by 
the breathing of the terrestrial biosphere, which exhibits a 
hemispheric (and thus seasonal) asymmetry: there is a net uptake of 
atmospheric CO2 (and thus a drawdown of atmospheric CO2 
concentrations) by terrestrial plants during the Northern Hemisphere 
summer growing season, owing to the vastly greater proportion of land 
in the Northern Hemisphere. This simple fact, and nothing else, 
dictates the relationship between Northern Hemisphere surface 
temperatures and CO2 concentrations on seasonal timescales.
    (2) Contrarians sometimes argue that the relationship between 
atmospheric CO2 concentrations and temperature variations 
associated with glacial/interglacial variations over the past several 
hundred thousand years, as deduced from ice core measurements, shows 
that CO2 is an effect, rather than cause, of climate 
variability. This reasoning is unsound for at least two fundamental 
reasons:
    (a) Detailed measurements show that global atmospheric CO2 
concentrations lead estimated polar temperature variations (as deduced 
from ice core oxygen isotope ratios) during the long phase of increased 
glaciation, consistent with greenhouse gas forcing of the atmosphere. 
There is some evidence that CO2 concentrations, however, lag 
estimated polar temperature variations during the rapid phase of 
deglaciation (melting of the terrestrial ice sheets at the termination 
of an ice age). This observation is the basis of the flawed argument 
summarized below. During this more rapid `deglacial' phase, the climate 
system is far from being in an equilibrium state, and the dynamics of 
the climate system must be considered as representative of a 
coupled interaction between surface temperature, atmospheric 
CO2, ocean circulation, and glacial mass. It is well known 
by glaciologists who study this problem that the relationship between 
CO2 and temperatures in such a scenario cannot be 
interpreted in terms of a simple cause-effect relationship.
    (b) Even during the rapid deglaciation, the oxygen isotopes from 
the ice cores only provide an estimate of surface temperature 
variations in the proximity of the ice core (and a very imperfect one, 
owing to possible seasonal deposition biases and non-temperature 
influences on isotope fractionation). They certainly do not provide an 
estimate of hemispheric, let alone global, temperature variations. 
Thus, a comparison of ice core estimates of CO2 and oxygen-18 
isotope ratios cannot be used to confidently infer the relationship 
between CO2 concentrations and global mean temperatures.

    Question 30. During the hearing, there was some contention over the 
issue of the effect of surface cover changes and urban influences on 
the climate. Could you please restate your position on the likely sign 
and magnitude of the influence of both factors?
    Response. Unfortunately, misleading comments by Soon and Legates, 
and the complexity of the issues involved, made it difficult for me to 
convey, in the brief time allotted, the established science dealing 
with the various influences on Earth's surface radiation balance and 
changes therein in recent decades. Legates in his testimony confused 
and misstated the nature of both natural and anthropogenic influences 
on the Earth's surface energy budget and on the measurement of surface 
temperatures from surface-based stations. There are several different 
issues involved here, which I will attempt to clarify one at a time 
below:
    (1) The claim made by Legates that the location of thermometer 
measurements in urban centers biases estimates of global mean 
temperature from the available meteorological observations would be 
correct were this effect not already carefully accounted for. In 
particular, possible urban heat island effects on global temperature 
estimates have been studiously accounted for in estimates that have 
been produced for more than a decade; see, e.g., the 2001 IPCC report. 
This is unrelated to the issue of the influence of land-use changes on 
the surface radiation budget, though Legates' testimony blurs the 
distinction between the two issues.
    (2) The implication by Legates that land-use changes (such as 
urbanization) are the dominant influence on changes in the absorptive 
properties of the Earth as a whole in recent decades is completely 
wrong for at least two reasons:
    (a) The primary factor impacting changes in the absorption of solar 
insolation by the Earth's surface in modern decades is the decrease in 
reflective snow and ice cover due to the warming of the Earth's 
surface. This represents a well-known positive feedback (the `ice-
albedo' feedback) associated with global warming in which warming leads 
to melting of snow and ice, which decreases the reflective properties 
of the surface, increasing surface absorption of radiation and thus 
increasing the surface temperatures themselves. This crucial positive 
feedback, which enhances the impact of greenhouse gas concentrations on 
the warming of the surface, is fully accounted for in the climate model 
simulations that I have referred to above and in my testimony.
    (b) While urbanization, as suggested by Legates, may lead to 
increased absorption of solar insolation in some urban areas, this is 
the more minor of the human land use changes impacting climate. There 
are far more extensive regions of the Earth where other changes in land 
use, such as conversion of forested land to agricultural land, have, 
instead, increased the reflective properties of the Earth's surface 
[Ramankutty, N., and J.A. Foley, Estimating historical changes in 
global land cover: croplands from 1700 to 1992, Global Biogeochemical 
Cycles, 13, 997-1027, 1999], tending to cool the surface, as I 
explained in my testimony. Scientists who have studied the influences 
of these effects have found that the latter cooling effect is the 
dominant one of these two anthropogenic land-use influences on the Earth's 
surface properties. Thus, climate model simulations investigating the 
influence of land-use changes on hemispheric or global mean 
temperatures indicate that they have imposed a modest cooling influence 
[Govindasamy, B., P.B. Duffy, and K. Caldeira, Land use changes and 
Northern Hemisphere cooling, Geophysical Research Letters, 28, 291-294, 
2001; Bauer, E., M. Claussen, and V. Brovkin, Assessing climate forcings 
of the Earth system for the past millennium, Geophys. Res. Lett., 30, 
doi: 10.1029/2002GL016639, 2003] that partially offsets even greater 
warming that would have been realized during the 20th century due to 
anthropogenic greenhouse gas influences. The evidence, therefore, does not 
support the case, as argued by Legates, that the full range of human 
land-use changes has had a net warming effect on the climate. They 
have had a modest cooling influence on the climate.

    Question 31. Do you receive any income from any sources which have 
taken advocacy positions with respect to the Kyoto Protocol, the U.N. 
Framework Convention on Climate Change, or legislation before the U.S. 
Congress that would affect greenhouse gas emissions? If so, please 
identify those sources and the relevant advocacy position taken.
    Response. I do not, nor have I ever, received any such income.
                               __________

Statement of David R. Legates, Director, Center for Climatic Research, 
                         University of Delaware

    Distinguished Senators, panelists, and members of the audience: I 
would like to thank the Committee for inviting my commentary on this 
important topic of climate history and its implications. My name is 
David R. Legates and I am an Associate Professor and Director of the 
Center for Climatic Research at the University of Delaware in Newark, 
Delaware. My research interests have focused on hydroclimatology--the 
study of water in the atmosphere and on the land--and on the 
application of statistical methods in climatological research.
    I am familiar with the testimony presented here by Dr. Soon. I 
agree with his statements and I will not reiterate his arguments. My 
contributions to Dr. Soon's research stemmed from my grappling with the 
apparent discrepancy between the long-standing historical record and the 
time-series recently presented by Dr. Mann and his colleagues. It also 
stems from my own experiences in compiling and merging global estimates 
of air temperature and precipitation from a variety of disparate 
sources.
    My Ph.D. dissertation resulted in the compilation of high-
resolution climatologies of global air temperature and precipitation. 
From that experience, I have become acutely aware of the issues 
associated with merging data from a variety of sources and containing 
various biases and uncertainties. By its very nature, climatological 
data exhibit a number of spatial and temporal biases that must be taken 
into account. Instrumental records exist only for the last century or 
so and thus proxy records can only be used to glean information about 
the climate for earlier time periods. But it must be noted that proxy 
records are not observations and strong caveats must be considered when 
they are used. It too must be noted that observational data are not 
without bias either.

              THE HISTORICAL RECORD OF THE LAST MILLENNIUM

    Much research has described both the written and oral histories of 
the climate as well as the proxy climate records (e.g., ice cores, tree 
rings, and sediments) that have been derived for the last 
millennium. It is recognized that such records are not without their 
biases--for example, historical accounts often are tainted with the 
preconceived beliefs and limited experiences of explorers and 
historians while trees and vegetation respond not just to air 
temperature fluctuations, but to the entire hydrologic cycle of water 
supply (precipitation) and demand (which is, in part, driven by air 
temperature). Nevertheless, such accounts indicate that the climate of 
the last millennium has been characterized by considerable variability 
and that extended periods of cold and warmth existed. It has been 
generally agreed that during the early periods of the last millennium, 
air temperatures were warmer and that temperatures became cooler toward 
the middle of the millennium. This gave rise to the terms the 
``Medieval Warm Period'' and the ``Little Ice Age'', respectively. 
However, as these periods were not always consistently warm or cold nor 
were the extremes geographically commensurate in time, such terms must 
be used with care.

     A BIASED RECORD PRESENTED BY THE IPCC AND NATIONAL ASSESSMENT

    In a change from its earlier report, however, the Third Assessment 
Report of the Intergovernmental Panel on Climate Change (IPCC), and now 
the U.S. National Assessment of Climate Change, both indicate that 
hemispheric or global air temperatures followed a curve developed by 
Dr. Mann and his colleagues in 1999. This curve exhibits two notable 
features. First is a relatively flat and somewhat decreasing trend in 
air temperature that extends from 1000 AD to about 1900 AD and is 
associated with a relatively high degree of uncertainty. This is 
followed by an abrupt rise in air temperature during the 1900's that 
culminates in 1998 with the highest temperature on the graph. Virtually 
no uncertainty is shown for the data of the last century. The 
conclusion reached by the IPCC and the National Assessment is that the 
1990's are the warmest decade with 1998 being the warmest year of the 
last millennium.
    Despite the large uncertainty, the surprising lack of variability 
in the record gives the impression that climate remained relatively 
unchanged through most of the last millennium--at least until human 
influences began to cause an abrupt increase in temperatures during the 
last century. Interestingly, Mann et al. replace the proxy data for the 
1900's with the instrumental record, and no uncertainty characterization 
is provided. This too yields a false impression that the instrumental 
record is consistent with the proxy data and that it is `error free'. 
It is neither. The instrumental record contains numerous uncertainties, 
resulting from a lack of coverage over the world's oceans, an under-
representation of mountainous and polar regions as well as under-
developed nations, and the presence of urbanization effects resulting 
from the growth of cities. Even if a modest uncertainty of 
0.1 °C were imposed on the instrumental record, the claim 
of the 1990's being the warmest decade would immediately become 
questionable, as the uncertainty window would overlap with the 
uncertainty associated with earlier time periods. Note that if the 
satellite temperature record--where little warming has been observed 
over the last 20 years--had been inserted instead of the instrumental 
record, it would be impossible to argue that the 1990's are the warmest 
decade.

              RATIONALE FOR THE SOON ET AL. INVESTIGATION

    So we were left to question why the Mann et al. curve seems to be 
at variance with the previous historical characterization of climatic 
variability. Investigating several hundred studies that have 
developed proxy records, we came to the conclusion that nearly all of 
these records show considerable fluctuations in air temperature over 
the last millennium. Please note that we did not reanalyze the proxy 
data--the original analysis from the various researchers was left 
intact. Most records show the coldest period is commensurate with at 
least a portion of what is termed the ``Little Ice Age'' and the 
warmest conditions are concomitant with at least a portion of what is 
termed the ``Medieval Warm Period''.
    But our conclusion is entirely consistent with conclusions reached 
by Drs. Bradley and Jones that not all locations on the globe 
experienced cold or warm conditions simultaneously. Moreover, we chose 
not to append the instrumental record, but to compare apples-with-
apples and determine if the proxy records themselves indeed confirm the 
claim of the 1990's being the warmest decade of the last millennium. 
That claim is not borne out by the individual proxy records.
    However, the IPCC report, in the chapter with Dr. Mann as a lead 
author and his colleagues as contributing authors, also concludes that 
research by Drs. Mann, Jones, and their colleagues ``support the idea 
that the 15th to 19th centuries were the coldest of the millennium over 
the Northern Hemisphere overall.'' Moreover, the IPCC report also 
concludes that the Mann and Jones research ``show[s] temperatures from 
the 11th to 14th centuries to be about 0.2 °C warmer than those from 
the 15th to 19th centuries.'' This again is entirely consistent with 
our findings. Where we differ with Dr. Mann and his colleagues is in 
their construction of the hemispheric averaged time-series, their 
assertion that the 1990's are the warmest decade of the last 
millennium, and that human influences appear to be the only significant 
factor on globally averaged air temperature. Reasons why the Mann et 
al. curve fails to retain the fidelity of the individual proxy records 
are detailed statistical issues into which I will not delve. But our 
real difference of opinion focuses solely on the Mann et al. curve and 
how we have concluded it misrepresents the individual proxy records. In 
a very real sense, this is an important issue that scientists must 
address before the Mann et al. curve is taken as fact.
    Our work has been met with much consternation from a variety of 
sources and we welcome healthy scientific debate. After all, it is 
disagreements among scientists that often lead to new theories and 
discoveries. However, I am aware that the editors of the two journals 
that published the Soon et al. articles have been vilified and the 
discussion has even gone so far as to suggest that Drs. Soon and 
Baliunas be barred from publishing in the journal Climate Research. 
Such tactics have no place in scientific debate and they inhibit the 
free exchange of ideas that is the hallmark of scientific inquiry.

            CLIMATE IS MORE THAN MEAN GLOBAL AIR TEMPERATURE

    In closing, let me state that climate is more than simply annually-
averaged global air temperature. Too much focus has been placed on 
divining air temperature time-series and such emphasis obscures the 
true issue in understanding climate change and variability. If we are 
truly to understand climate and its impacts and driving forces, we must 
push beyond the tendency to distill it to a single annual number. Proxy 
records, which provide our only possible link to the past, are 
incomplete at best. But when these records are carefully and 
individually examined, one reaches the conclusion that climate 
variability has been a natural occurrence, and especially so over the 
last millennium. And given the uncertainties in the proxy and 
instrumental records, an assertion of any decade as being the warmest 
in the last millennium is premature.
    I'm sorry that a discussion that is best conducted among scientists 
has made its way to a U.S. Senate committee. But hopefully a healthy 
scientific debate will not be compromised and we can push on toward a 
better understanding of climate change.
    I again thank you for the privilege of speaking before you today.

                               __________

  Prepared Statement of Leonard Levin, Ph.D., Technical Leader, EPRI, 
                         Palo Alto, California

    I am Leonard Levin, technical leader at EPRI, which is a non-
profit, collaborative organization conducting energy-related R&D in the 
public interest. Our members are public and private organizations in 
the electricity and energy fields, and we now serve more than 1000 
energy and governmental organizations in more than 40 countries. These 
remarks constitute a synthesis of current research on environmental 
mercury, and are not a representation of an official EPRI position.

                              INTRODUCTION

    Mercury is a global pollutant, and its impact on the human 
environment is a critical issue that EPRI and the scientific community 
have been examining for many years. As the scientific understanding of 
where mercury originates nationally and globally, combined with the new 
health data, continues to be refined, it can help inform decisions 
regarding its management. I would like to address three key questions 
where new findings have emerged. First, where does mercury found in the 
U.S. environment originate? Second, how much has mercury in fish 
changed in the last few decades? And third, how do potential mercury 
management steps change the amount of mercury depositing to the earth's 
surface in the U.S.?

         WHERE DOES MERCURY IN THE U.S. ENVIRONMENT ORIGINATE?

    Mercury is clearly a global issue. Recent estimates are that, in 
1998, some 2340 tons of mercury were emitted globally through 
industrial activity; of these, more than half, or 1230 tons, came from 
Asian countries, primarily China\1\. These findings are similar to 
those of other global inventories\2\. In addition, it is estimated that 
another 1300 tons of mercury emanates from land-based natural sources 
globally, including abandoned mining sites and exposed geological 
formations. Another 1100 tons or so issues from the world's oceans, 
representing both new mercury emitted by undersea vents and volcanoes, 
and mercury cycled through the ocean from the atmosphere previously. 
Recent findings from the large United States-Canadian METAALICUS field 
study in Ontario, Canada, showed that a fairly small amount of deposited 
mercury, no more than 20 percent or so, re-emits to the atmosphere, 
even over a 2-year period. The implications of this are profound: 
mercury may be less mobile in the environment than we previously 
thought; once it is removed from the atmosphere, it may play less of a 
role in the so-called ``grasshopper effect\3\ '' where persistent 
global pollutants are believed to successively deposit and re-emit for 
many years and over thousands of miles.
---------------------------------------------------------------------------
    \1\ Seigneur, C., K. Vijayaraghavan, K. Lohman, P. 
Karamchandani, C. Scott, Global Source Attribution for Mercury 
Deposition in the United States, submitted to Environ. Sci. Technol., 
2003.
    \2\ Jozef M. Pacyna, Elisabeth G. Pacyna, Frits Steenhuisen and 
Simon Wilson, Mapping 1995 global anthropogenic emissions of mercury, 
Atmospheric Environment 37 (S1) (2003) pp. 109-117.
    \3\ Environment Canada, The Grasshopper Effect and Tracking 
Hazardous Air Pollutants, The Science and the Environment Bulletin, 
May/June 1998.
---------------------------------------------------------------------------
    Recent studies by EPRI have shown that the mercury depositing into 
the U.S. from the atmosphere may originate at very distant points. 
Model assessments show that, for \3/4\ of the area of the continental 
United States, more than 60 percent of the mercury received originates 
outside U.S. borders, from other countries or even other continents. 
Only 8 percent of U.S. territory receives \2/3\ or more of its mercury 
from U.S. domestic sources, and less than 1 percent of U.S. territory 
gets 80 percent or more of its mercury from sources within the U.S. One 
implication of this dichotomy between mercury sources and the U.S. 
areas impacted is that there may be a ``management floor'' for U.S. 
mercury, a level below which the amount of mercury depositing to the 
surface cannot be reduced.
    Additional evidence for the external origins of much of the mercury 
in the U.S. environment was gathered over the last 2 years by aircraft 
experiments carried out by EPRI, the National Center for Atmospheric 
Research, and a number of U.S., Asian, and Australian investigators. 
One set of flights measured significant mercury in winds entering the 
Pacific Ocean from Shanghai, China; researchers tracked the Chinese 
mercury plume over the Pacific for 400 miles toward America. A second 
set of flights from Monterey, California, found that same plume from 
China crossing the California coast, and a second, higher plume of 
enriched mercury originating in Central Asia also moving into the U.S. 
The global nature of mercury in the U.S. has been clearly demonstrated.

  WHAT ARE THE PRIMARY SOURCES OF MERCURY IN FISH AND THE ENVIRONMENT?

    For much of the twentieth century, mercury was an essential part of 
industrial products, such as batteries and switches, or a key 
ingredient in such other products as house paints. These industrial 
uses of the element declined significantly in the latter half of the 
century, and are now less than 10 percent of their use of fifty years 
ago.\4\ Professor Francois Morel of Princeton University and colleagues 
recently analyzed newly caught Pacific tuna for mercury\5\, and 
compared those results to the mercury content of similar tuna caught in 
the 1970's. Despite changes in mercury emissions to the atmosphere in 
those thirty years\6\, and a matching increase in the mercury 
depositing from the atmosphere to rivers and oceans below, Prof. Morel 
found that mercury levels in tuna have not changed over that time. One 
conclusion is that the mercury taken up by such marine fish as tuna is 
not coming from sources on land, such as utility power plants, but from 
natural submarine sources of mercury, including deep sea volcanoes and 
ocean floor vents. The implications are that changes in mercury sources 
on the continents will not affect the mercury levels found in open 
ocean foodfish like tuna.
---------------------------------------------------------------------------
    \4\ Engstrom, D.R., E.B. Swain, Recent Declines in Atmospheric 
Mercury Deposition in the Upper Midwest, Environ. Sci. Technol. 1997, 
31, 960-967.
    \5\ Kraepiel, A.M.L., K. Keller, H.B. Chin, E.G. Malcolm, F.M.M. 
Morel, Sources and Variations of Mercury in Tuna, Meeting of American 
Society for Limnology and Oceanography, Salt Lake City, Utah: January 
2003.
    \6\ Slemr, F., E-G. Brunke, R. Ebinghaus, C. Temme, J. Munthe, I. 
Wangberg, W. Schroeder, A. Steffen, T. Berg, Worldwide trend of 
atmospheric mercury since 1977, Geophys. Res. Lett., 30 (10), 23-1 to 
23-4, 2003.
---------------------------------------------------------------------------
    A 2001 assessment by scientists of the Geological Survey of 
Canada and others\7\ estimated that geological emissions of mercury, as 
well as emissions from inactive industrial sites on land, are five to 
seven times as large as had been estimated earlier. Recent measurements 
in the stratosphere by EPRI researchers show a rapid removal of mercury 
in the upper atmosphere, allowing for additional sources at the surface 
while still maintaining the measured rates of deposition and removal 
needed for a global balance of sources and sinks. As a result, it is 
now possible to attribute a greater fraction of the mercury entering 
U.S. waters to background natural sources rather than industrial 
emissions from the U.S. or elsewhere globally.
---------------------------------------------------------------------------
    \7\ Richardson G. M., R. Garrett, I. Mitchell, M. Mah-Paulson, T. 
Hackbarth, Critical Review On Natural Global And Regional Emissions Of 
Six Trace Metals To The Atmosphere, International Lead Zinc Research 
Organization, International Copper Association, Nickel Producers 
Environmental Research Association.
---------------------------------------------------------------------------
   HOW COULD POTENTIAL MERCURY REDUCTIONS CHANGE MERCURY DEPOSITION?

    EPRI recently completed work to assess the consequences of further 
mercury emissions reductions for mercury in the atmosphere, U.S. 
waterways, and fish\8\. The approach used linked models of atmospheric 
mercury chemistry and physics with analyses of Federal data on mercury 
in fish in the U.S. diet, along with a model of costs needed to attain 
a given reduction level.
---------------------------------------------------------------------------
    \8\ EPRI Technical Report 1005224, ``A Framework for Assessing the 
Cost-Effectiveness of Electric Power Sector Mercury Control Policies,'' 
EPRI, Palo Alto, CA, May 2003.
---------------------------------------------------------------------------
    Current U.S. utility emissions of mercury are about 46 tons per 
year. At the same time, a total of about 179 tons of mercury deposit 
each year in the U.S., from all sources global and domestic. One 
proposed management scenario examined cutting these utility emissions 
by 47 percent, to 24 tons per year. The analysis showed that this cut 
results in an average 3 percent drop in mercury deposition into the 
U.S. Some isolated areas totaling about 1 percent of U.S. land area 
experience drops of up to 30 percent in mercury deposited. The cost 
model used in association with these calculations showed utility costs 
to reach these emission control levels would amount to between $2 
billion and $5 billion per year over 12 years. This demonstrated that 
U.S. mercury patterns are relatively insensitive to the effects of this 
single category of sources.
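    The relative insensitivity noted above can be illustrated with the 
figures just quoted. The following short sketch (in Python) simply 
restates that arithmetic; the closing comment is our rough reading of 
the model result, offered only as an interpretation.

    # Utility-emission scenario versus total U.S. mercury deposition (tons/year).
    us_utility_emissions = 46
    scenario_emissions = 24
    total_us_deposition = 179

    cut = us_utility_emissions - scenario_emissions   # 22 tons/year removed at the source
    naive_share = cut / total_us_deposition            # ~12%, if every ton removed had
                                                        # previously deposited within the U.S.
    modeled_average_drop = 0.03                         # average drop produced by the model

    print(naive_share, modeled_average_drop)
    # The modeled 3 percent drop, versus the naive 12 percent, is consistent with
    # much of the U.S. utility emission being transported and deposited elsewhere.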
    In addition, most of the fish consumed in the U.S. comes from ocean 
sources, which would be only marginally affected by a global reduction 
of 24 tons of mercury per year due solely to U.S. controls. Wild fresh 
water fish in the U.S. would be expected to show a greater reduction in 
mercury content, but are a relatively small part of the U.S. diet 
compared to ocean or farmed fish. When these changes were translated 
into how much less mercury enters the U.S. diet, we found that 0.064 
percent fewer children would be born ``at risk'' due to their mothers 
taking in less mercury from consumed fish. These results were based on 
the Federal dietary fish consumption data. So, a drop of nearly half in 
utility mercury emissions results in a drop of 3 percent (on average) 
in mercury depositing to the ground, and a drop of less than one-tenth 
of a percent in the number of children ``at risk.''

                    DECISIONMAKING UNDER UNCERTAINTY

    These recent findings on mercury sources, dynamics, and management 
are a small part of the massive international research effort to 
understand mercury and its impacts on the human environment. EPRI and 
others, including the U.S. Environmental Protection Agency and the U.S. 
Department of Energy, are racing to clarify the complex interactions of 
mercury with geochemical and biological systems, vital to understanding 
mercury's route to human exposure and potential health effects. With 
this improved understanding, informed decisions can be made on the best 
ways to manage mercury.
    Thank you for the opportunity to deliver these remarks to the 
Committee.

[GRAPHICS OMITTED: TIFF images T2381.099 through T2381.167]

       Statement of Deborah C. Rice, Ph.D., Maine Department of 
                Environmental Protection, Augusta, Maine

     I would like to thank the Committee for this opportunity to 
present information on the adverse health consequences of exposure to 
methylmercury in the United States. Until 3 months ago, I was a senior 
toxicologist in the National Center for Environmental Assessment in the 
Office of Research and Development at the Environmental Protection 
Agency. I am a co-author of the document that reviewed the scientific 
evidence on the health effects of methylmercury for EPA, and which 
included the derivation of the acceptable daily intake level for 
methylmercury.
     I would like to focus my presentation on four points that are key 
to understanding the health-related consequences of environmental 
mercury exposure. One: there is unequivocal evidence that methylmercury 
harms the developing human brain. Two: the Environmental Protection 
Agency used analyses of three large studies in its derivation of an 
acceptable daily intake level, including the study in the Seychelles 
Islands which found no adverse effects. Three: 8 percent of women of 
child-bearing age in the United States have levels of methylmercury in 
their bodies above this acceptable level. And four: cardiovascular 
disease in men related to low levels of methylmercury has been 
documented, suggesting that a potentially large segment of the 
population may be at risk for adverse health effects.
     The adverse health consequences to the nervous system of 
methylmercury exposure in humans were recognized in the 1950's with the 
tragic episode of poisoning in Minamata Bay in Japan, in which it also 
became clear that the fetus was more sensitive to the neurotoxic 
effects of methylmercury than was the adult. A similar pattern of 
damage was apparent in subsequent episodes of poisoning in Japan and 
Iraq. These observations focused the research community on the question 
of whether exposure to concentrations of methylmercury present in the 
environment might be producing neurotoxic effects that were not 
clinically apparent.
     As a result, over half a dozen studies have been performed around 
the world to explore the effects of environmental methylmercury intake 
on the development of the child. Studies in the Philippines (Ramirez et 
al., 2003), the Canadian Arctic (McKeown-Eyssen et al., 1983), Ecuador 
(Counter et al., 1998), Brazil (Grandjean et al., 1999), French Guiana 
(Cordier et al., 1999) and Madeira (Murata et al., 1999) all found 
adverse effects related to the methylmercury levels in the children's 
bodies. These included auditory and visual effects, memory deficits, 
deficits in visuospatial ability, and changes in motor function.
     In addition to the above studies, there have been three major 
longitudinal studies on the effects of exposure to the mother on the 
neuropsychological function of the child: in the Faroe Islands in the 
North Atlantic (Grandjean et al., 1997), in the Seychelles Islands in 
the Indian Ocean (Myers et al., 1995), and in New Zealand (Kjellstrom 
et al., 1989). Two of these studies identified adverse effects 
associated with methylmercury exposure, whereas the Seychelles Islands 
study did not. Impairment included decreased IQ and deficits in memory, 
language processing, attention, and fine motor coordination. A National 
Research Council (NRC) National Academy of Sciences panel evaluated all 
three studies in their expert review, concluding that all three studies 
were well designed and executed (NRC, 2000). They modeled the 
relationship between the amount of methylmercury in the mother's body 
and the performance of the child on a number of neuropsychological 
tests. From this analysis, they calculated a defined adverse effect 
level from several types of behavior in each of the three studies. 
These adverse effect levels represent a doubling of the number of 
children that would perform in the abnormally low range of function. 
The National Academy of Sciences panel also calculated an overall 
adverse effect level of methylmercury in the mother's body for all 
three of the studies combined, including the negative Seychelles study. 
Thus the results of all three studies were included in a quantitative 
manner by the NRC.
     The Environmental Protection Agency (EPA) used the analyses of the 
NRC in the derivation of the reference dose, or RfD, for methylmercury. 
The RfD is a daily intake level designed to be without deleterious 
effects over a lifetime. The EPA divided the defined deleterious effect 
levels calculated by the NRC by a factor of 10 in its RfD derivation. 
There are two points that need to be made in this regard. First, the 
factor of 10 does not represent a safety factor of 10, since the 
starting point was a level that doubled the number of low-performing 
children. Second, the EPA performed the relevant calculations for a 
number of measurements for each of the two studies that found 
deleterious effects as well as the integrative analysis that included 
all three studies modeled by the NRC, including the negative Seychelles 
study. The RfD is 0.10 μg/kg/day based on the Faroe Islands study alone 
or the integrative analysis of all three studies. The RfD would be 
lower than 0.10 μg/kg/day if only the New Zealand study had been 
considered. Only if the negative Seychelles Islands study were used 
exclusively for the derivation of the RfD, while ignoring the values 
calculated for the Faroe Islands and New Zealand studies, would the RfD 
be higher than the current value of 0.10 μg/kg/day. EPA believes that 
to do so would be scientifically unsound, and would provide 
insufficient protection to the U.S. population.
     A substantial portion of U.S. women of reproductive age have 
methylmercury in their bodies that is above the level that corresponds 
to the EPA's RfD. Data collected over the last 2 years as part of the 
National Health and Nutritional Examination Survey (NHANES 99+) 
designed to represent the U.S. population (CDC, Web) revealed that 
about 8 percent of women of child-bearing age had blood levels of 
methylmercury above the level that the U.S. EPA believes is ``safe'' 
(Schober et al., 2003). This translates into over 300,000 newborns per 
year potentially at risk for adverse effects on intelligence and 
memory, ability to pay attention, ability to use language, and other 
skills that are important for success in our highly technological 
society.
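     A rough consistency check of the newborn figure can be made as 
follows. This is a minimal sketch in Python; the assumption of roughly 
4 million U.S. births per year is ours and is not taken from the survey 
data cited above.

    # Approximate number of newborns potentially at risk each year.
    us_births_per_year = 4_000_000   # assumed round figure, circa 2000
    fraction_above_rfd = 0.08        # ~8% of women of child-bearing age (NHANES 99+)

    print(us_births_per_year * fraction_above_rfd)   # about 320,000 newborns per year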
     I would like to further comment here on the use of a factor of 10 
by EPA to derive the allowable daily intake level (RfD) for 
methylmercury from the defined effect levels calculated by the National 
Research Council. The RfD corresponds to roughly 1 part per million 
(ppm) of methylmercury in maternal hair, from the defined effect level 
of about 11 ppm calculated by the NRC. But we know that there is no 
evidence of a threshold below which there are no adverse effects down 
to about 2-3 ppm in hair, the lowest levels in the Faroe Islands study. 
In fact, there is evidence from both the Faroe Islands (Budtz-Jorgensen 
et al., 2000) and New Zealand (Louise Ryan, Harvard University, 
personal communication) studies that the change in adverse effect in 
the child as a function of maternal methylmercury level may be greater 
at lower maternal methylmercury levels than at higher ones. Therefore, 
the so-called safety factor almost certainly is less than 10, and may 
be closer to non-existent. Babies born to women above the RfD may be at 
actual risk, and not exposed to a level 10 times below a risk level.
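     The relationship between the NRC benchmark and the RfD-equivalent 
hair level described above can be restated as a short calculation. This 
sketch (in Python) uses only the round numbers quoted in this paragraph; 
it is an illustration of the argument, not an additional analysis.

    # Defined adverse-effect level versus the RfD-equivalent hair concentration (ppm).
    benchmark_hair_ppm = 11            # adverse-effect level in maternal hair (NRC)
    uncertainty_factor = 10
    rfd_equivalent_hair_ppm = benchmark_hair_ppm / uncertainty_factor   # ~1.1 ppm

    lowest_observed_effect_ppm = 2.5   # effects reported down to roughly 2-3 ppm in hair
    print(lowest_observed_effect_ppm / rfd_equivalent_hair_ppm)
    # The margin between the RfD-equivalent level and the lowest levels at which
    # effects have been reported is on the order of 2-3 fold rather than 10 fold.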
     There is an additional concern regarding the potential for adverse 
health consequences as a result of environmental exposure to 
methylmercury. Several years ago, a study in Finnish men who ate fish 
found an association between increased methylmercury levels in hair and 
atherosclerosis, heart attacks, and death (Salonen et al., 1995, 2000). 
Two new studies in the U.S. and Europe found similar associations 
between increased methylmercury levels in the bodies of men and 
cardiovascular disease (Guallar et al., 2002; Yoshizawa et al., 2002). 
Effects have been identified at hair mercury levels below 3 ppm. It is 
not known whether there is a level of methylmercury exposure that will 
not cause adverse effects. It is important to understand that the 
cardiovascular effects associated with methylmercury may put an 
additional, very large proportion of the population at risk for adverse 
health consequences as a result of exposure to methylmercury from 
environmental sources.
     In summary, there are four points that I would like the Committee 
to keep in mind. First, at least eight studies have found an 
association between methylmercury levels and impaired 
neuropsychological performance in the child. The Seychelles Islands 
study is anomalous in not finding associations between methylmercury 
exposure and adverse effects. Second, both the National Research 
Council and the Environmental Protection Agency included the Seychelles 
Islands study in their analyses. The only way the acceptable level of 
methylmercury could be higher would be to ignore the two major positive 
studies that were modeled by the NRC, as well as six smaller studies, 
and rely solely on the single study showing no negative effects of 
methylmercury. Third, there is a substantial percentage of women of 
reproductive age in the United States with levels of methylmercury in 
their bodies above what EPA considers a safe level. As a result of 
this, over 300,000 newborns each year are exposed to methylmercury 
above levels U.S. EPA believes to be ``safe''. Fourth, increased 
exposure to methylmercury may result in atherosclerosis, heart attack, 
and even death from heart attack in men, suggesting that an additional 
large segment of the population may be at risk as a result of 
environmental methylmercury exposure.
     Thank you for your time and attention.

                                 ______
                                 
Responses by Deborah Rice to Additional Questions from Senator Jeffords

    Question 1. In testimony, you indicated that ``there might be 
virtually no safety factor at all'' with respect to the effect level 
for mercury exposure. Does that mean that the reference dose should be 
lowered further? If so, what would be a safer and more protective 
reference dose?
    Response. The current reference dose (RfD) is based on a cord blood 
mercury concentration associated with a defined risk: a doubling of the 
number of children performing in the abnormally low range. A total 
uncertainty factor of 10 was applied to account for inter-individual 
variability. There are several decisions made by EPA that, if 
different, would have resulted in a lower RfD.
    (a) It was assumed that the ratio of cord-to-maternal blood mercury 
was one. Subsequent analyses of 10 studies revealed that cord blood has 
more mercury than maternal blood. The average ratio is 1.7:1.0, with 
the upper 5 percent of women having a ratio of 3.3:1.0. Based on just 
the average ratio, if no other decisions were changed, the RfD would be 
reduced from 0.1 µg/kg/day to approximately 0.06 µg/kg/day.
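    For illustration only, the arithmetic in (a) can be sketched in a 
few lines of Python; the RfD and the two ratios are the values 
discussed above, while the helper function and its name are 
hypothetical.

        # Illustrative sketch of the cord:maternal ratio adjustment in (a).
        # The ratios are the values cited above; the helper name is hypothetical.
        def adjust_rfd(rfd_ug_per_kg_day, cord_to_maternal_ratio):
            """Rescale an RfD that was derived assuming a 1:1 cord:maternal ratio."""
            return rfd_ug_per_kg_day / cord_to_maternal_ratio

        current_rfd = 0.1                       # EPA RfD, ug/kg/day
        print(adjust_rfd(current_rfd, 1.7))     # average ratio   -> ~0.06
        print(adjust_rfd(current_rfd, 3.3))     # upper 5 percent -> ~0.03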
    (b) As was recommended by the NAS expert committee, EPA assumed 
that there was a linear relationship between adverse effects on a 
number of neuropsychological endpoints and the level of mercury in cord 
blood or maternal hair. In fact, the data from the Faroe Islands study 
best fit a supra-linear model: i.e., the slope was actually greater at
lower body burdens (see figure below). It turns out that this was also 
true for the New Zealand study. Recently, a study was published 
reporting a supra-linear shape to the relationship between adverse 
behavioral performance and blood lead levels in children. So this 
phenomenon, while somewhat counter-intuitive, may be real. Using the 
``best fit'' model rather than forcing a linear relationship would 
result in a lower estimate of the dose associated with the defined 
adverse effect (a doubling of the number of children performing in the 
abnormally low range), and thereby a lower RfD.
      
    [GRAPHIC] [TIFF OMITTED] T2381.168
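    For illustration, a minimal numeric sketch of why the shape of the 
model matters: the square-root curve below is an arbitrary stand-in for 
a supra-linear shape, and every number is invented for the example 
rather than taken from the Faroe Islands or New Zealand data.

        # Why a supra-linear (steeper-at-low-dose) curve implies a lower
        # benchmark dose than a linear curve anchored to the same high-dose point.
        bmr = 0.10                            # benchmark response (illustrative)
        high_dose, high_resp = 100.0, 0.50    # shared high-dose anchor (illustrative)

        # Linear model through the origin and the anchor: resp = k * dose
        k = high_resp / high_dose
        bmd_linear = bmr / k                  # 20.0

        # Supra-linear model through the same points: resp = a * sqrt(dose)
        a = high_resp / high_dose ** 0.5
        bmd_supralinear = (bmr / a) ** 2      # 4.0

        print(bmd_linear, bmd_supralinear)    # the supra-linear fit gives the lower dose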
    
      
    (c) EPA used a total uncertainty factor (UF) of 10 to derive the 
RfD, which is designed to provide a margin of safety against adverse 
effects. EPA typically applies an UF of 10 for inter-individual 
variability if the starting point is a no-observable-adverse-effect-
level (NOAEL). If the starting point is the lowest level that has been 
demonstrated to produce an effect, with a NOAEL not identified, the EPA 
applies an additional UF, usually 10. In the case of methylmercury, 
even though the starting point was a level associated with an effect, 
only a total factor of 10 was applied, rather than the more typical 
100. In addition, the UF of 10 for inter-individual variability is 
presumed to account for differences in both metabolism and response of 
the target organ (sensitivity) between individuals. The variability in 
metabolism of methylmercury between women has been demonstrated to be 
about 3. The variation in cord-maternal blood levels between women may 
be also about 3. These would be multiplied together to equal about 10. 
That allows no room for any variation in response of the fetal brain to 
methylmercury, which is undoubtedly not the case. Therefore a total UF 
of 10 is almost certainly inadequate to protect the most sensitive 
portion of the population.
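    As a quick arithmetic check on the figures above (illustrative 
only), the two roughly 3-fold sources of toxicokinetic variability 
already consume essentially the entire factor of 10:

        # Values taken from the discussion above; purely an arithmetic check.
        kinetic_variability = 3.0   # ~3-fold variation in methylmercury metabolism
        ratio_variability = 3.0     # ~3-fold variation in cord:maternal blood ratio
        total_uf = 10.0             # total uncertainty factor EPA applied

        consumed = kinetic_variability * ratio_variability   # ~9-fold
        print(total_uf / consumed)  # ~1.1x left for differences in brain sensitivity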
    The issue of whether the reference dose should be lowered, and if 
so, the appropriate value, requires thorough evaluation by a group of 
expert risk assessors and other scientists. Any new evaluation of the 
RfD should also include evaluation of the levels of methylmercury that 
produce adverse cardiovascular effects documented in several studies of 
adult males. It is currently unknown whether these effects occur at 
lower or higher levels than those that produce developmental 
neurotoxicity.

    Question 2. What is a reasonable estimate of the approximate 
average mercury concentrations in non-commercial fish in the U.S.?
    Response. EPA keeps an extensive data base of fish tissue 
contaminant levels from inland water bodies compiled by individual 
states (http://www.epa.gov/ost/fish/mercurydata.html). Data for average 
levels of mercury for 1987-2000 are in the attached figure. Average 
tissue levels vary significantly depending on species, such that 
deriving an ``average'' for all species is not particularly 
informative. Averages for different species range from 0.1 ppm for 
herring and whitefish to 0.9 ppm for bowfin. As can be seen from the 
figure, the average level for many species is below the 0.3 ppm level 
recommended by EPA (Water Quality Criterion for the Protection of Human 
Health: Methylmercury, OST, Office of Water, 2001, EPA-823-R-01-001). 
Approximately one third of species have average concentrations above 
this. Even for species with averages below 0.3 ppm, some samples will 
exceed this level. For species with averages of about 0.5 ppm, more 
than half the samples will exceed the EPA recommended limit, and about 
half will exceed the 0.5 ppm action limit set by many European 
countries and Canada. Ocean fish and sharks can have levels that are 
considerably higher. For example, blue marlin average 3.08 ppm, with 
the highest level for an individual at 6.8 ppm (Florida Marine Research 
Institute, Mercury Levels in Marine and Estuarine Fishes of Florida, 
1989-2001, FMRI Technical Report TR-9, Second Edition, Revised, 2003). 
Sharks such as the white shark averaged over 5 ppm, with the highest 
value for an individual shark at 10 ppm (ibid.). These are
non-commercial sport-caught species.
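    For illustration, the species averages quoted above can be checked 
against the two thresholds in a few lines of Python; the list below 
contains only the values cited in this response, not the full EPA data 
base.

        # Species averages (ppm) cited above; an illustrative subset only.
        species_avg_ppm = {
            "herring": 0.1,
            "whitefish": 0.1,
            "bowfin": 0.9,
            "blue marlin": 3.08,    # Florida sport-caught data
            "white shark": 5.0,     # cited above as averaging over 5 ppm
        }
        EPA_CRITERION = 0.3   # ppm, EPA Water Quality Criterion
        ACTION_LIMIT = 0.5    # ppm, Canadian/European action limit

        for name, avg in sorted(species_avg_ppm.items(), key=lambda kv: kv[1]):
            print(f"{name:12s} {avg:5.2f} ppm  "
                  f"over 0.3 ppm: {avg > EPA_CRITERION}  "
                  f"over 0.5 ppm: {avg > ACTION_LIMIT}")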

    Question 3. You indicated that the NHANES data does not adequately 
capture the individuals or subpopulations that are likely to be the 
most exposed to non-commercial fish mercury concentrations above the 
reference dose. Are you aware of any work underway to collect this kind 
of data and hopefully protect these people from overexposure?
    Response. There have been a number of relatively small studies 
focusing on fish intake by groups that consume large amounts of fish, 
specifically sports fishers and subsistence fishing communities. Most 
of these efforts have been by individual states or tribes. EPA is 
developing a data base of these studies, most of which are unpublished 
and not in the public domain, a project which I managed before leaving 
the agency. The data base currently includes about 70 studies (contact 
project officer Cheryl Itkin, EPA/ORD/National Center for Environmental 
Assessment, Washington, D.C. at [email protected]).
    There are also several published studies: Bellanger, T.M., Caesar, 
E.M., Trachtman, L. 2000. Blood mercury levels and fish consumption. J. 
La. Med. Soc, 152:64-73; Burge, P., Evans, S. 1994. Mercury 
contamination in Arkansas gamefish. A public health perspective. J. 
Ark. Med. Soc. 90:542-544; Hightower, J.M., Moore, D. 2003. Mercury 
levels in high-end consumers of fish. Environ. Health Perspect. 
111:604-608; and Knobeloch, L.M., Ziarnik, M., Anderson, H.A., Dodson, 
V.N.
1995. Imported seabass as a source of mercury exposure: A Wisconsin 
case study. Environ. Health Perspect. 103:604-606.
    Protecting individuals who may be at greater risk from over-
exposure to methylmercury presents significant challenges. Forty states 
have fish advisories for inland waters, based largely on levels of 
mercury in fish. Some states have advisories that are specific to 
particular water bodies; others have statewide advisories for all water 
bodies. Advisories typically are set by species of fish, designating 
them as, e.g., ``no restriction'', ``eat no more than once a week'', or 
``eat no more than once a month''. A person who eats a fish from one 
restricted category is not supposed to eat fish from another restricted 
category during that same period. Signs are posted by some
states at specific water bodies, and most if not all states distribute 
literature related to fish advisories with fishing licenses. Some 
tribes have also performed significant outreach related to issues of 
contaminants in wild foods. Immigrant communities are often the most 
difficult to inform, as a result of language and cultural barriers. 
Minnesota, for example, has made a substantial effort to work with 
immigrant communities, publishing appropriate information in relevant 
languages, as well as performing extensive outreach activities. A few 
other states have made efforts in this regard as well. Some communities 
rely on fish as a significant protein source for both cultural and 
economic reasons. It is unfortunate indeed that these communities are 
risking adverse health outcomes by consuming what should be a very 
healthful food.

    Question 4. Please describe the purposes and intended uses of the 
various Federal agencies' exposure limits for methyl mercury.
    Response. EPA, FDA, and ATSDR have set exposure limits for 
methylmercury. The reference dose (RfD) set by EPA is designed to 
represent an ``estimate of a daily exposure to the human population 
(including sensitive subgroups) that is likely to be without 
appreciable risk of deleterious [non-cancer] effects during a 
lifetime'' (http://www.epa.gov/iris/index.html).
    The minimal risk level (MRL) of ATSDR is ``an estimate of the daily 
human exposure to a hazardous substance that is likely to be without 
appreciable risk of adverse noncancer health effects over a specified 
duration of exposure''. MRLs may be derived for acute (1-14 days), 
intermediate (15-364 days) or chronic durations (over 364 days). ATSDR 
states that ``[t]hese substance-specific estimates, which are intended 
to serve as screening levels, are used by ATSDR health assessors and 
other responders to identify health effects that may be of concern at 
hazardous waste sites. It is important to note that MRLs are not 
intended to define clean-up or action levels for ATSDR or other 
Agencies.'' [bold original] (http://www.atsdr.cdc.gov/mrls.html) It is 
critical to understand that ATSDR is involved in clean-up activities. 
The MRLs are designed to identify chemicals that are important for 
clean-up decisions. They are not intended as health-protective levels 
for the general population, or for a lifetime.
    The FDA acceptable daily intake (ADI) is ``the amount of a
substance that can be consumed daily over a long period of time without 
appreciable risk'' (http://www.fda.gov; http://www.cfsan.fda.gov/-
acrobat/hgstud16.pdf). For contaminants in food, FDA uses the ADI to 
derive an Action Level, ``which defines the maximum allowable 
concentration of the contaminant in commercial food.'' In other words, 
the Action Level is supposed to be health-based.
    The RfD and the ADI are designed to protect the general population 
from adverse effects from contaminants in food over a lifetime of 
exposure, including protection of sensitive populations. In contrast, 
the MRL is designed for a different purpose: identifying contaminants 
that may be important in making decisions regarding clean-up of 
contaminated sites.
    The exposure limits from U.S. agencies are as follows:

        EPA RfD: 0.1 µg/kg/day
        ATSDR MRL: 0.3 µg/kg/day
        FDA ADI: 0.4 µg/kg/day
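    For illustration, these per-kilogram limits can be translated into 
an allowable amount of fish. The sketch below assumes a 60 kg body 
weight and fish at the 0.3 ppm (0.3 µg/g) criterion discussed earlier; 
neither assumption is part of the agency limits themselves.

        # Convert a daily limit (ug methylmercury per kg body weight per day)
        # into an allowable weekly amount of fish. Body weight and fish
        # concentration are illustrative assumptions, not agency figures.
        BODY_WEIGHT_KG = 60.0   # assumed adult body weight
        FISH_UG_PER_G = 0.3     # fish at the 0.3 ppm criterion (0.3 ug/g)

        limits_ug_per_kg_day = {"EPA RfD": 0.1, "ATSDR MRL": 0.3, "FDA ADI": 0.4}

        for name, limit in limits_ug_per_kg_day.items():
            ug_per_day = limit * BODY_WEIGHT_KG
            grams_fish_per_week = ug_per_day / FISH_UG_PER_G * 7
            print(f"{name:9s}: {ug_per_day:4.1f} ug/day = about "
                  f"{grams_fish_per_week:4.0f} g of 0.3 ppm fish per week")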

    Question 5. What is the preferred measurement methodology for most 
reliably determining and predicting the effect on children's 
developmental health of methyl mercury exposure?
    Response. There has been considerable discussion within the 
academic and regulatory communities regarding what might be a ``best'' 
test or test battery for determining adverse neuropsychological 
function in children exposed to methylmercury. There are two basic 
strategies that have been used to assess methylmercury neurotoxicity. 
The first is the use of standard clinical instruments such as measures 
of IQ. These have the advantage of being standardized for the 
population, as well as assessing a wide range of functional domains. 
However, because they may be measuring a number of functions that are 
not affected in addition to those that are, the results can be 
``diluted'', and therefore these tests may be less sensitive than a 
more focused approach. The second approach is to choose domain-specific 
tests based on the known effects of higher levels of the toxic 
chemical, if such effects are known. This strategy has the advantage of 
being potentially more sensitive than using broad-based clinical 
instruments. On the other hand, using domain-specific tasks runs the 
risk of looking at the wrong functions.
    The investigators of the Faroe Islands study used a number of 
domain-specific tasks, based on the effects of high-level methylmercury 
exposure as well as the pathological changes in specific brain areas 
produced by methylmercury. The Faroe Islands study found deficits in 
these tasks. The investigators of the Seychelles study used standard 
clinical instruments that assessed many functions, each in limited 
depth, and that were standardized for a U.S. population rather than the 
Seychellois population. They found no effect of methylmercury. In
contrast, the investigators of the New Zealand study, also using 
standard clinical instruments, did identify mercury-related deficits.
    The consensus of the research community seems to be that a 
combination of both approaches should be used. The standard clinical 
instruments (e.g., full-scale IQ) are composed of subscales (e.g.
verbal, visuospatial) that can be used to explore more specific 
functional domains. Researchers should also use what is known about the 
behavioral and neuropathological effects of methylmercury to design 
domain-specific tests, with the hope that these will be maximally 
sensitive. To date, memory, language processing, visuospatial ability, 
motor function, and attention have been identified as adversely 
affected by in utero methylmercury exposure. Hearing may also be 
adversely affected. New studies, or continued testing of current 
cohorts, should build on this knowledge to home in even further on 
specific behavioral functions.

    Question 6. In 1974, the FDA established a mercury action limit of 
.5 parts per million in fish. This was changed in 1979 to 1 part per 
million. What was the basis for this change?
    Response. FDA set an action level of 0.5 ppm for mercury in fish 
in 1969, in response to the recognition of the devastating consequences 
of fetal exposure to methylmercury in the poisoning episodes in 
Minamata and Niigata, Japan. This level was reaffirmed in 1974, citing 
concerns about damage to the fetus at lower exposures than are harmful
to the adult. The level was changed in 1979 as a result of a lawsuit by 
the fishing industry that resulted in a court ruling based on 
socioeconomic impacts presented by the National Marine Fisheries 
Service (NMFS). NMFS argued that raising the action level would expand 
the number of fisheries available for exploitation and increase the 
profits of the fishing industry (Fed. Reg. 3990, 3992, 1979). The 
notice withdrew the proposed rulemaking and terminated the procedure to 
codify the then-existing 0.5 ppm action level limiting the amount of 
unavoidable mercury residue permitted in fish and shellfish. The FR 
notice also indicates that ``[t]he
Food and Drug Administration will continue to monitor mercury levels in 
fish so that if there is any change in mercury residue levels as a 
result of raising the action level, or if there is any other change in 
the information regarding mercury in fish, the action level can be 
revised accordingly.'' Thus, the action limit is not health-based, but 
was established for economic considerations.

    Question 7. What, if anything, should consumers of fish in the 
Great Lakes region and other areas that are downwind of major mercury 
emission sources such as coal-fired power plants, chlor-alkali 
manufacturing facilities, and waste incinerators, be advised to do with
respect to limiting their methyl mercury exposure?
    Response. Unfortunately, the majority of inland lakes and rivers 
are contaminated with mercury. Methylmercury is created from mercury by 
microorganisms in the water. Methylmercury is bioconcentrated as it is 
passed up the food chain, with older and larger fish at the top of the 
food chain containing more methylmercury than smaller fish or fish that 
are lower on the food chain. Methylmercury exposure in humans is 
exclusively from eating contaminated fish. Forty states have explicit 
fish advisories as a result of mercury contamination for consumption of 
fish based on species, size, and in some states specific water bodies. 
There were 2,242 advisories in 2000, up 8 percent from 1999 and up
149 percent from 1993. By far the greatest number of fish advisories 
for mercury are around the Great Lakes and in the Northeastern states. 
Consumers are advised to carefully follow State fishing advisories for 
inland fish. There is an increasing recognition that commercial and/or 
ocean fish may represent a significant source of methylmercury 
exposure. Currently, FDA advises pregnant women, nursing mothers and 
young children against eating any shark, swordfish, tilefish, or king 
mackerel. Recent data indicate that canned white (albacore) tuna may 
have substantial levels of methylmercury, and so should be consumed 
only occasionally, especially by children. Other species such as fresh 
tuna and halibut may also have significant levels of methylmercury. 
Individuals should count purchased fish that are potentially high in 
methylmercury when determining a safe fish intake over a given time 
period. In other words, consumers need detailed information on fish 
species from both commercial and non-commercial sources to keep track 
of their potential methylmercury intake.
    This is an unsatisfactory solution, since fish should be a very 
healthful food. Moreover, sport fishing is an important economic 
resource in many areas, and some individuals rely on fishing for a 
substantial portion of their protein, particularly in certain immigrant 
communities. The ultimate solution is of course to decrease 
environmental deposition of mercury.

[GRAPHIC] [TIFF OMITTED] T2381.169

[GRAPHIC] [TIFF OMITTED] T2381.170

[GRAPHIC] [TIFF OMITTED] T2381.171

[GRAPHIC] [TIFF OMITTED] T2381.172

[GRAPHIC] [TIFF OMITTED] T2381.173

[GRAPHIC] [TIFF OMITTED] T2381.174

[GRAPHIC] [TIFF OMITTED] T2381.175

[GRAPHIC] [TIFF OMITTED] T2381.176

[GRAPHIC] [TIFF OMITTED] T2381.177

[GRAPHIC] [TIFF OMITTED] T2381.178

          Statement of Dr. Gary Myers, Pediatric Neurologist 
                 and Professor, University of Rochester

    Thank you for the opportunity to present the views of our research 
group on the health effects of methylmercury (MeHg) exposure. My name 
is Gary Myers. I am a pediatric neurologist and professor at the 
University of Rochester in Rochester, New York and one member of a 
large team that has been studying the human health effects of MeHg for 
nearly 30 years. For nearly 20 years our group has specifically studied 
the effects of prenatal MeHg exposure from fish consumption on child 
development.

                           MERCURY POISONINGS

    In the 1950's, massive industrial pollution for over two decades in 
Japan resulted in high levels of MeHg in ocean fish. Several thousand 
cases of human poisoning from consuming the contaminated fish were 
reported. The precise level of human exposure causing these health 
problems was never determined, but was thought to be high. During that 
epidemic pregnant women who themselves had minimal or no clinical 
symptoms of MeHg poisoning delivered babies with severe brain damage 
manifested by cerebral palsy, seizures and severe mental retardation. 
This suggested that MeHg crosses the placenta from the mother to the 
fetus and that the developing nervous system is especially sensitive to 
its toxic effects.
    In 1971-1972 there was an epidemic of MeHg poisoning in Iraq. 
Unlike the Japanese poisonings, the source of exposure in Iraq was 
maternal consumption of seed grain coated with a MeHg fungicide. Our 
research team studied the children of about 80 women who were pregnant 
during this outbreak. We measured mercury exposure to the fetus using 
maternal hair, the biomarker that best corresponds to MeHg brain level, 
and examined the children. We concluded that there was a possibility 
that exposure as low as 10 ppm in maternal hair might be associated 
with adverse effects on the fetus, although there was considerable 
uncertainty in this estimate. This value is over 10 times the average 
in the United States, but individuals consuming large quantities of 
fish can achieve this level.

               MERCURY FOUND NATURALLY IN THE ENVIRONMENT

    Mercury is a natural element in the earth's crust. In aquatic 
environments, bacteria can convert inorganic mercury to MeHg. Once MeHg 
enters the food chain, it is bioaccumulated and bioconcentrated. All
fish contain small amounts, and predatory fish or mammals such as 
whales have larger amounts. Most commercial oceanic fish in the United 
States has < 0.5 ppm MeHg in the muscle, but some freshwater fish have 
> 1 ppm. In comparison, contaminated fish in Japan that caused 
poisoning had up to 40 ppm.
    Everyone who consumes fish is exposed to MeHg, and regular fish 
consumption can lead to hair mercury levels as high as 10 ppm or more. 
The average hair mercury level in the United States is < 1 ppm. If MeHg 
does affect the developing brain at such low levels, mothers who 
consume large amounts of fish would be exposing their babies to this 
risk.
    The hypothesis of our study in the Seychelles was that prenatal 
MeHg from fish consumption might affect child development. Since 
millions of people around the world consume fish as their primary 
source of protein, we decided to investigate the question directly. We 
initiated the Seychelles Child Development Study in 1983 and began 
enrolling subjects in a pilot study in 1987. We selected the Seychelles 
as a sentinel population for the United States for two reasons. First, 
they consume large amounts of fish. The average mother in our main 
cohort ate fish at 12 meals per week, more than 10 times the rate of 
U.S. women. Second, the fish consumed in Seychelles (average mercury 
content
0.3 ppm) has approximately the same mercury concentration as commercial 
fish in the United States.

             THE SEYCHELLES CHILD DEVELOPMENT STUDY (SCDS)

    The SCDS is a collaborative study carried on by researchers at the 
University of Rochester Medical Center in Rochester, NY and the 
Ministries of Health and Education in the Republic of the Seychelles. 
Funding has come from the National Institute of Environmental Health 
Sciences, the Food and Drug Administration, and the governments of 
Seychelles and Sweden. The Republic of the Seychelles is an island 
nation in the Indian Ocean off the east coast of Africa.
    Our original hypothesis was that prenatal MeHg exposure at levels 
achieved by regular maternal consumption of fish would be associated 
with adverse effects on child development that could be detected by 
clinical examination, or by the use of developmental tests that have 
previously been used to study the effects of environmental exposures on 
child development.
    The Seychelles was chosen partly because there is no mercury 
pollution and many factors that complicate epidemiological studies of 
low-level exposure are not present. Health care is free, universal and 
readily available. Prenatal care is nearly 100 percent, the birth rate 
is high, and the general health of mothers and children is good. 
Education is free, universal, and starts at age 3.5 years. There is 
limited emigration and both the people and the government were 
cooperative and supportive.
    Before starting a carefully controlled main study, we carried out a 
pilot study. We expected to find only subtle effects on children at 
these levels of exposure. Consequently, it was important to minimize 
any possibility of bias, so a number of decisions were made before the 
study began. First, no one in Seychelles including researchers visiting 
the island would know the exposure level of any child or mother, unless 
our results indicated that children were at risk from prenatal mercury 
exposure. Second, children with a known cause of developmental delay 
(meningitis, very low birth weight, or brain trauma) would not be 
studied. Third, the tests administered would include tests previously 
reported to show associations with MeHg exposure, tests used with other 
toxic exposures, and other tests that might detect subtle changes. 
Fourth, all testing would be performed within specific age windows to 
minimize the effect of age on test interpretation. Fifth, results would 
be adjusted for multiple confounding factors (covariates), including 
things like socioeconomic status, maternal intelligence and birth 
weight, which are known to have independent effects on child 
development and if not accounted for, could bias the results. Sixth, 
the data analysis plan would be determined before the data were 
collected to minimize the possibility that the data would be repeatedly 
analyzed until the anticipated effect was eventually found.
    In 1989-90, we enrolled over 700 mothers and children in the SCDS 
main study. These children were evaluated on five occasions (6, 19, 29, 
66 and 107 months of age) during the past 9 years. When the children 
were about 4 years old their homes were visited and evaluated. The 
study focused on prenatal exposure. This was measured in the mothers' 
hair growing during pregnancy. Postnatal exposure was also periodically 
measured in the children's hair. The exposure of both mothers and 
children ranged from 1 to 27 ppm, the range of concern. The testing was 
extensive with over 57 endpoints being evaluated to date.
    Through 107 months (9 years) and over 57 primary endpoints, the 
study has found only three statistical associations with prenatal MeHg 
exposure. One of these associations was adverse, one was beneficial and 
one was indeterminate. These results might be expected to occur by 
chance and do not support the hypothesis that adverse developmental 
effects result from prenatal MeHg exposure in the range commonly 
achieved by consuming large amounts of fish. The test results do show 
associations with factors known to affect child development, such as 
maternal IQ and home environment, so there is evidence that the tests 
are functioning well.
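    As a rough check on the statement that a few associations among 57 
endpoints might be expected by chance, the expectation can be computed 
directly, assuming independent tests at the conventional 0.05 
significance level (an illustrative simplification not stated above):

        # Chance findings expected across 57 endpoints at alpha = 0.05,
        # assuming independent tests (an illustrative simplification).
        from math import comb

        n, alpha = 57, 0.05
        expected = n * alpha                                  # ~2.85
        p_at_least_3 = sum(comb(n, k) * alpha**k * (1 - alpha)**(n - k)
                           for k in range(3, n + 1))          # ~0.55
        print(expected, round(p_at_least_3, 2))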

                   OUR INTERPRETATION OF THE FINDINGS

    We do not believe that there is presently good scientific evidence 
that moderate fish consumption is harmful to the fetus. However, fish 
is an important source of protein in many countries and large numbers 
of mothers around the world rely on fish for proper nutrition. Good 
maternal nutrition is essential to the baby's health. Additionally, 
there is increasing evidence that the nutrients in fish are important 
for brain development and perhaps for cardiac and brain function in 
older individuals.
    The SCDS is ongoing and we will continue to report our results. 
Presently we are examining a new cohort to determine specific nutrients 
that might influence the effects of MeHg.
    Appendix--Not read before the committee, but included in the 
handout.
    Because of the public health importance of the question being 
studied by the SCDS, the potential exists for differing opinions of 
scientific findings to become highly politicized. The SCDS has received 
only one published criticism (JAMA, 280:737, 1998), but other points 
have been raised at conferences. These questions are addressed here 
individually.
     Why did the SCDS measure mercury in the hair rather than 
in the cord blood? Hair mercury was used because it is the standard 
measure used in nearly all other studies of this question. Mercury is 
thought to enter the hair and brain in a similar fashion. Hair was also 
chosen because hair has been shown to follow blood concentrations 
longitudinally, and samples of hair can recapitulate the entire period 
of exposure, in this case the period of gestation. As part of our 
research we have shown that hair levels reflect levels in the target 
tissue, brain. Measuring mercury in blood requires correction for the 
red blood cell volume (hematocrit) since the mercury is primarily in 
red blood cells and reflects only very recent exposure. It can also 
vary if recent meals with high mercury content are consumed.
     Did the SCDS use subjects whose mercury values were too 
low to detect an association? No, the study's goal was to see if the 
children of women who consume fish regularly were at risk for adverse 
developmental effects from MeHg. Women in Seychelles eat fish daily and 
represent a sentinel population with MeHg levels 10 times higher than 
U.S. women. Because of higher levels of exposure, their children should 
be more likely to show adverse effects if they are present. These 
children show no adverse effects through 9 years of age suggesting that 
eating ocean fish, when there is no local pollution, is safe. However, 
we cannot rule out an adverse effect above 12-15 ppm since we had too 
few cases to substantiate a statistical association if one really 
existed.
     Did the SCDS use the best tests available to detect 
developmental problems? Yes, the SCDS used many of the same 
neurodevelopmental and neuropsychological tests used in other 
developmental studies. These tests are deemed to be excellent measures 
for determining development at the ages studied. The tests examined 
specific domains of children's learning and were increasingly 
sophisticated as the children became older.
     Did the SCDS find expected associations between 
development and birth weight, socioeconomic factors, and other 
covariates? Yes, expected relationships with many covariates such as 
maternal IQ, family socioeconomic status and the home environment were 
found, indicating that our tests were sensitive to developmental 
differences.
     Did the removal of statistical outliers in the analysis 
bias the study? No. It is standard practice among statisticians to 
remove statistical outliers. Outliers are values that are inconsistent 
with the statistical model employed to analyze the data. Every 
statistical analysis depends on a model, and every statistical model 
makes assumptions about the statistical (distributional) properties of 
the data that must be satisfied if the results of the analysis are to 
be interpreted correctly. Sound statistical practice requires that the 
necessary assumptions be checked as part of the statistical analysis. 
Examination of outliers constitutes one of these checks. Statistical 
outliers are defined by the difference between the actual test score 
for a child and the value predicted by the statistical model. Small 
numbers of such outliers occurred in test scores for children with 
widely varying MeHg exposures. The results of all analyses were 
examined both before and after the removal of outliers. For
analyses in the main study the removal of statistical outliers did not 
change the conclusions.
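    For illustration, the practice described here amounts to a 
standard regression diagnostic: fit the model, flag observations whose 
residuals are extreme relative to the model's assumptions, and confirm 
that the conclusions do not change when they are removed. The sketch 
below uses synthetic data and a simple one-covariate model; it is not 
the SCDS analysis itself.

        # Generic residual-based outlier screening on synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        exposure = rng.uniform(1, 27, 200)                      # ppm, range noted above
        score = 100 + 0.0 * exposure + rng.normal(0, 10, 200)   # no true exposure effect
        score[:3] += rng.choice([-40.0, 40.0], 3)               # a few aberrant scores

        def fit_and_flag(x, y, cutoff=3.0):
            slope, intercept = np.polyfit(x, y, 1)              # simple linear model
            residuals = y - (slope * x + intercept)
            outliers = np.abs(residuals) > cutoff * residuals.std()
            return slope, outliers

        slope_all, flags = fit_and_flag(exposure, score)
        slope_trimmed, _ = fit_and_flag(exposure[~flags], score[~flags])
        print(slope_all, int(flags.sum()), slope_trimmed)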
     What about the Faroe Islands study where prenatal MeHg 
exposure was reported to adversely affect developmental outcomes? There 
are substantial differences between the Faroe Islands and Seychelles 
studies. The exposure in the Faroe Islands is from consuming whale meat 
and there is also concomitant exposure to PCBs and other neurotoxins. 
There are also differences in the measurement of exposure and the 
approach to statistical analysis. The Faroe Islands study reported 
associations between cord blood mercury levels and several tests. After 
statistical analysis they attributed the associations to prenatal MeHg 
exposure. Scientific studies are frequently open to different 
interpretations and some scientists do not agree with the researchers' 
interpretation. We believe the Seychelles study of individuals 
consuming fish more closely approximates the U.S. situation.
     Are the children in Seychelles too developmentally robust 
to find the effects of MeHg if they are present? No, the children in 
Seychelles tested similar to U.S. children on nearly all measures apart 
from motor skills where they were more advanced. There is no reason to 
think that they are too robust to show the effects of prenatal MeHg 
exposure if any are present.
     Are children in Seychelles exposed to PCBs or other food-borne 
toxins that might have confounded the results? No, sea mammals are
not consumed in Seychelles and measured PCBs in the children's blood 
were low.
     Should data from the Seychelles be considered interim? 
Maybe. Among developmental studies, a 9-year followup is considered 
very long and should be adequate to identify associations with most 
toxic exposures. However, very subtle effects can be more readily 
tested in older individuals and there is evidence from experimental 
animals that some effects of early mercury exposure may not appear 
until the animal ages.

[GRAPHIC] [TIFF OMITTED] T2381.179

[GRAPHIC] [TIFF OMITTED] T2381.180

[GRAPHIC] [TIFF OMITTED] T2381.181

[GRAPHIC] [TIFF OMITTED] T2381.182

[GRAPHIC] [TIFF OMITTED] T2381.183

[GRAPHIC] [TIFF OMITTED] T2381.184

[GRAPHIC] [TIFF OMITTED] T2381.185

[GRAPHIC] [TIFF OMITTED] T2381.186

[GRAPHIC] [TIFF OMITTED] T2381.187

[GRAPHIC] [TIFF OMITTED] T2381.188

[GRAPHIC] [TIFF OMITTED] T2381.189

[GRAPHIC] [TIFF OMITTED] T2381.190

[GRAPHIC] [TIFF OMITTED] T2381.191

[GRAPHIC] [TIFF OMITTED] T2381.192

[GRAPHIC] [TIFF OMITTED] T2381.193

[GRAPHIC] [TIFF OMITTED] T2381.194

[GRAPHIC] [TIFF OMITTED] T2381.195

[GRAPHIC] [TIFF OMITTED] T2381.196

[GRAPHIC] [TIFF OMITTED] T2381.197

[GRAPHIC] [TIFF OMITTED] T2381.198

[GRAPHIC] [TIFF OMITTED] T2381.199

                                                   August 18, 2003.
Hon. James M. Inhofe, Chairman,
Committee on Environment and Public Works,
U.S. Senate,
Washington, DC.
    Dear Mr. Chairman: Thank you for offering me the opportunity to 
respond to certain comments that were made in the EPW committee hearing 
on Tuesday, July 29, of this year. I hope I can clear up any confusion 
that might have been caused by incomplete, misleading or erroneous 
testimony that day.
    The testimony in question by Dr. Michael Mann stated:

          ``It's unfortunate to hear comments about the supposed 
        inconsistencies of the satellite record voiced here years after 
        that has been pretty much debunked in the peer-reviewed 
        literature in Nature and Science. Both journals have, in recent 
        years, published . . . articles indicating that in fact, the 
        original statement that the satellite record showed cooling was 
        flawed because . . . the original author, John Christy, did not 
        take into account a drift in the orbit of that satellite, which 
        actually leads to a bias in the temperatures . . . Christy and 
        colleagues have claimed to have gone back and fixed that 
        problem. But just about every scientist who has looked at it 
        says that this fix isn't correct and that if you fix it 
        correctly then the satellite record actually agrees with the 
        surface record, indicating fairly dramatic rates of warming in 
        the past two decades.''

    Virtually all of this testimony is misleading or incorrect. I will 
touch on the major problems, point-by-point, and I will try to be 
brief.
    1. Certainly no one has ``debunked'' the accuracy of the global 
climate dataset that we built at The University of Alabama in 
Huntsville (UAH) using readings taken by microwave sensors aboard NOAA 
satellites. This dataset has been thoroughly and rigorously evaluated, 
and has been published in a series of peer-reviewed papers beginning in 
Science (March 1990). The most recent version of the dataset was 
published in May 2003 in the Journal of Atmospheric and Oceanic 
Technology after undergoing a strenuous peer review process.
    2. We, and others, are constantly scrutinizing our techniques to 
find ways to better analyze the data. In every case except one we 
discovered needed improvements ourselves, developed a method for 
correcting the error, and published both the error and the correction 
in peer-reviewed journals. When Wentz, et al. (1998) published their 
research on the effects of orbital decay (the one exception) they 
explained an effect we immediately recognized, but which was partially 
counterbalanced by other factors we ourselves discovered. Since that 
time we have applied the corrections for both orbital decay and other 
factors, and have published the corrected data in peer-reviewed 
journals.
    3. The UAH satellite record does not show cooling in the lower 
troposphere and hasn't shown a long-term cooling trend since the period 
ending in January 1998. I cannot say where this chronic cooling 
misconception originated. Our long-term data show a relatively modest 
warming in the troposphere at the rate of 0.133 degrees Fahrenheit per 
decade (or 1.33 degrees Fahrenheit per century) for the period of 
November 1978 to July 2003.
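    For reference, a per-decade rate of this kind is simply the least-
squares slope of the monthly anomaly series, scaled to ten years. The 
short sketch below uses synthetic numbers, not the UAH record.

        # Least-squares trend, per decade, from a monthly anomaly series.
        # Synthetic data with a built-in trend; illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        n_months = 297                           # roughly November 1978 through July 2003
        t_years = np.arange(n_months) / 12.0
        anomalies = 0.0133 * t_years + rng.normal(0, 0.2, n_months)

        slope_per_year = np.polyfit(t_years, anomalies, 1)[0]
        print(round(slope_per_year * 10, 3), "degrees per decade")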
    4. There is no credible version of the satellite dataset that 
``actually agrees'' with the surface temperature record for the past 25 
years, nor one that shows ``fairly dramatic rates of warming.'' The as-
yet-unexplained differences between the surface and satellite data are 
at the heart of the controversy over the accuracy of the satellite 
data.
    While much of the surface data remains uncalibrated and 
uncorroborated, we have evaluated our UAH satellite data against 
independent, globally-distributed atmospheric data from the U.S. and 
the U.K. (Hadley Centre) as shown in the figure (enclosure 1). We 
published the results of those comparisons in numerous peer-reviewed 
studies (enclosure 2). In each case we found excellent consistency 
between the satellite data and the atmospheric data. One should note 
that such independent corroboration has not been performed on the other 
satellite temperature datasets alluded to in the quoted testimony.
    This consistency between two independent datasets gathered using 
very different techniques gives us a high level of confidence that the 
UAH satellite dataset provides a reliable measure of global atmospheric 
temperatures over more than 90 percent of the globe. (By comparison, 
one of the most often quoted surface temperature datasets achieves 
partial-global coverage only by claiming that certain isolated 
thermometer sites provide representative temperatures for an area 
roughly equaling two-thirds of the contiguous 48 states, an area that 
would reach from about Brownsville, Texas, to Grand Forks, North 
Dakota.)
    5. A final point relates to numerous comments elsewhere in the 
testimony in which an appeal to a nebulous ``mainstream climate 
community'' was made to support what was stated. First, the notion that 
``thousands'' of climate scientists agreed on the IPCC 2001 text is an 
illusion. I was a lead author of IPCC 2001, as was Dr. Mann. There were 
841 lead authors and contributors, the majority of whom were not 
climatologists and who each provided input, in their own area of 
expertise, only on their tiny portion of the 800+ page document. These 
841 were not asked to approve what was finally published, nor were they 
given the opportunity to do so.
    Although I might be outside the ``mainstream,'' according to Dr. 
Mann's perspective, I have never thought a scientist's goal was to 
achieve membership in the ``mainstream.'' My goal is to produce the 
most reliable climate datasets for use in scientific research. Whether 
they show warming or cooling is less important to me than their 
reliability and accuracy. That these datasets have been published in 
numerous peer-reviewed venues is testimony to accomplishing this goal 
and, by inference, would place me inside the mainstream climate 
community. In addition to being an IPCC lead author, significant 
achievement awards from NASA and the American Meteorological Society 
along with my recent election as a Fellow of the AMS are evidence of my 
impact on the community of scientists.
    I hope this clears up any confusion you or your committee members 
might have had about the UAH global temperature data. If you or any of 
your committee members have any questions, I will be delighted to 
answer them to the best of my ability.
    Thank you again for offering me this opportunity. I remain,
            Sincerely,
                                John Christy, Ph.D.