"It seems to me that there is a huge inconsistency in your positions
on NEOs and global warming. As with climate modeling, there is no
"proof" that an NEO is going to hit the Earth at any given time. And
we know that some impacts are "good" for life on Earth, just as some
greenhouse warming may possibly be "good". Therefore, the consistent
position would be to ridicule predictions of doom by asteroid or comet
impacts, extol the virtues of impacts, and argue to derail all global
attempts to deal with the problem as long as our information is still
incomplete."
--David Grinspoon, Southwest Research Institute

"Let me set David's mind at rest. Over the years, I have indeed not
taken 'a consistent position to ridicule predictions of doom by asteroid
or comet impacts.' Instead, I have emphasised a sanguine philosophy
that the impact hazard can be cracked progressively over the next 100
to 200 years. In this way - and at the same time - we will be developing new
technologies and space industries as well as improving living standards -
rather than restraining economic progress and retreating in cultural
cynicism and technological distrust."
--Benny Peiser, 10 July 2002

    CO2 Science Magazine, 10 July 2002

    National Post, 5 July 2002

    CO2 Science Magazine, 10 July 2002

    CO2 Science Magazine, 10 July 2002

    World Climate Report, 8 July 2002

    Time Magazine, 8 July 2002

    Tech Central Station, 9 July 2002

    David Grinspoon <>

    Benny Peiser <>

    Timo Niroma <>

     Number Watch, July 2002


From CO2 Science Magazine, 10 July 2002

Davey, M.K., Huddleston, M., Sperber, K.R., Braconnot, P., Bryan, F., Chen,
D., Colman, R.A., Cooper, C., Cubasch, U., Delecluse, P., DeWitt, D.,
Fairhead, L., Flato, G., Gordon, C., Hogan, T., Ji, M., Kimoto, M., Kitoh,
A., Knutson, T.R., Latif, M., Le Treut, H., Li, T., Manabe, S., Mechoso,
C.R., Meehl, G.A., Power, S.B., Roeckner, E., Terray, L., Vintzileos, A.,
Voss, R., Wang, B., Washington, W.M., Yoshikawa, I., Yu, J.-Y., Yukimoto,
S., and Zebiak, S.E. 2002.  STOIC: a study of coupled model climatology and
variability in tropical ocean regions. Climate Dynamics 18: 403-420.

What was done
The authors tested the ability of 23 dynamical ocean-atmosphere models to
correctly simulate fields of tropical sea surface temperature (SST), surface
wind stress, and upper ocean vertically-averaged temperature (VAT) in terms
of annual mean, seasonal cycle, and interannual variability characteristics.
Of the models tested, 21 were coupled GCMs, of which 13 used no form of flux
adjustment.

What was learned
In the words of the 36 climate modelers who performed the tests and did the
evaluations, "in most models without flux adjustment, the annual mean
equatorial SST in the central Pacific is too cool and the Atlantic zonal SST
gradient has the wrong sign."  They also note that "annual mean wind stress
is often too weak in the central Pacific and in the Atlantic, but too strong
in the west Pacific." In addition, they say that "few models have an upper
ocean VAT seasonal cycle like that observed in the equatorial Pacific." On
top of these deficiencies, they report that "interannual variability is
commonly too weak in the models: in particular, wind stress variability is
low in the equatorial Pacific." Likewise, they say that "most models have
difficulty in reproducing the observed Pacific 'horseshoe' pattern of
negative SST correlations with interannual Niño3 SST anomalies, or the
observed Indian-Pacific lag correlations."

What it means
The authors rightly conclude, from their results and the results of Latif et
al. (2001), that "the equatorial Pacific is still a major problem area for
coupled models." Indeed, they report that "overall, no single model is
consistent with observed behavior in the tropical ocean regions," concluding
that "understanding and overcoming these shortcomings must be a high
priority for the coupled modeling community."

We think that the attainment of these high-priority climate modeling goals
should be a high-priority prerequisite for formulating Kyoto-type energy
policy; for how can one rationally base such important regulatory decisions
on climate models that have such important proven deficiencies in
replicating current real-world climate, to say nothing of their ability to
accurately predict its future? Most sensible people would laugh at the
thought, were it not so serious and sad a matter. Which leads us to ask,
Where in the world has reason fled?

Latif, M., Sperber, K., Arblaster, J., Braconnot, P., Chen, D., Colman, A.,
Cubasch, U., Cooper, C., Delecluse, P., DeWitt, D., Fairhead, L., Flato, G.,
Hogan, T., Ji, M., Kimoto, M., Kitoh, A., Knutson, T., Le Treut, H., Li, T.,
Manabe, S., Marti, O., Mechoso, C., Meehl, G., Power, S., Roeckner, E.,
Sirven, J., Terray, L., Vintzileos, A., Voss, R., Wang, B., Washington, W.,
Yoshikawa, I., Yu, J. and Zebiak, S.  2001.  ENSIP: the El Niño simulation
intercomparison project.  Climate Dynamics 18: 255-276.
Copyright © 2002.  Center for the Study of Carbon Dioxide and Global Change


From National Post, 5 July 2002

Les Perreaux 

WINNIPEG - Your own best guess might be as reliable as long-term forecasts
from Environment Canada or the Weather Network, a Manitoba study suggests.

John Hanesiak, a University of Manitoba professor, and a group of his
students found that Environment Canada and the Weather Network failed to
predict the weather beyond two days in a study of forecasts for Winnipeg.

Environment Canada accurately predicted the temperature about 75% of the
time for the first day of most five-day forecasts. The forecasts were also
60% accurate for the second day. However, accuracy plummeted to 25%, 22% and
15% on days three, four and five of the forecast.

For each forecast the Weather Network lagged behind, ranging from 59%
accuracy on day one to about 7% on the fifth day.

The study allowed a five-degree spread for accuracy and sampled forecasts
for Winnipeg from Jan. 15 to Feb. 15. Similar rates of accuracy were found
for wind and precipitation predictions.
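The study's scoring rule is simple to reproduce: a forecast counts as accurate when it lands within a five-degree spread of the observed value. Here is a minimal sketch of that rule, reading the spread as plus or minus 2.5 degrees; the data below are invented for illustration, not the study's own.

```python
# Hypothetical example of the study's scoring rule: a forecast counts as
# "accurate" when it falls within a five-degree spread (read here as
# +/-2.5 degrees) of the observed temperature. The data are invented.
def hit_rate(forecasts, observations, spread=5.0):
    """Fraction of forecasts within +/- spread/2 of the observed value."""
    hits = sum(1 for f, o in zip(forecasts, observations)
               if abs(f - o) <= spread / 2)
    return hits / len(forecasts)

# Invented day-3 forecast vs. observed daily highs (deg C):
forecast = [-5, -2, 0, -8, -12, -3, -1, -6]
observed = [-9, -1, -6, -2, -11, -10, -4, -5]
print(hit_rate(forecast, observed))  # 0.375, i.e. 3 of 8 within the band
```

With real verification data, the same function could be run per lead day to reproduce the study's day-one-through-day-five accuracy curve.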

Mr. Hanesiak, a specialist in weather and climate at the university's
geography department, said Environment Canada's decision to rely on
computers for the last three days of the forecast is probably the cause for
the sudden drop.

"They've had to do it because of internal cuts. They've decided to
essentially omit the human factor, and it has led to a significant drop in
accuracy," he said.

Environment Canada's own long-term statistics show the national average for
the fifth day of a five-day forecast is about 55% accurate, according to Jim
Abraham, the director of weather research. Weather Network officials could
not be reached for comment.

Mr. Hanesiak acknowledged Winnipeg's weather during the sample period was
unusually mild, contributing to the especially low accuracy.

However, he said computer models tend to have more difficulty with unusual
factors, making human expertise even more important for weather prediction.

Last month, the union representing federal weather forecasters complained
about the elimination of the human factor in long-term forecasting, after an
unexpected deluge led to localized flooding and dangerously high water
levels in areas south of Winnipeg.

However, computerized weather forecasting has greatly improved accuracy
overall, Mr. Abraham stressed. "We're as accurate on the third day of a
forecast as we were on day one 20 years ago," he said. "Of all the things in
this bizarre world, we remain one of the few areas that can even come close
to predicting the future. Listen, if my five-day forecast was perfect, I'd
only have to work one day a week."

Mr. Abraham said the forecasts are also limited by the methods used to
communicate them. The predicted weather for each day must be boiled down to
one line for mass consumption, and often covers vast areas of Canada.

He pointed out that aviation forecasts offer hour-by-hour forecasts for 24
hours. They are often quite accurate, but probably not practical for mass
consumption.

"The atmosphere is a complicated beast. To describe the evolution in such
short time and space is very awkward," he said.

He added that most people find the forecasts useful, even if they are
accurate half the time, or less.

"For some people, the answer, 'we don't know' would be better for uncertain
predictions five days from now. Some people want a decision from us because
it is better than nothing. Besides, it's a major component of conversation
for Canadians. It's some indication, if nothing else," he said.

© Copyright  2002 National Post


From CO2 Science Magazine, 10 July 2002

One of the many dangers of global warming, according to climate alarmists,
is the propensity for rising temperatures to produce longer and more severe
droughts.  What does history have to tell us about this claim?  Have
droughts become more frequent or severe as the earth has recovered from the
relative coolness of the Little Ice Age?  In this brief summary, we review
some recent papers that have explored this subject in Africa.

Verschuren et al. (2000) developed a decadal-scale history of rainfall and
drought in equatorial east Africa for the past thousand years, based on lake
level and salinity fluctuations of a small crater-lake basin in Kenya, as
reconstructed from diatom and midge assemblages retrieved from sediments.
Although they did indeed find that the Little Ice Age was wetter than the
current Modern Warm Period, they identified three Little Ice Age intervals
of prolonged dryness (1390-1420, 1560-1625, and 1760-1840); and these
"episodes of persistent aridity," as they refer to them, were determined to
be "more severe than any recorded drought of the twentieth century."

The significance of this observation is perhaps best appreciated when it is
realized that since the late 1960s, as noted by Holmes et al. (1997), the
African Sahel has experienced "one of the most persistent droughts recorded
by the entire global meteorological record."  Nevertheless, in a
high-resolution study of a sediment sequence extracted from an oasis in the
Manga Grasslands of northeast Nigeria, they too determined that "the present
drought is not unique and that drought has recurred on a centennial to
interdecadal timescale during the last 1500 years."

Finally, in a review of climatic change in Africa over the past two
centuries, Nicholson (2001) reports there has been "a long-term reduction in
rainfall in the semi-arid regions of West Africa," which has been "on the
order of 20 to 40% in parts of the Sahel."  She describes the phenomenon as
"three decades of protracted aridity," noting that "nearly all of Africa has
been affected ... particularly since the 1980s." Nevertheless, Nicholson
also says that "the rainfall conditions over Africa during the last 2 to 3
decades are not unprecedented," and that "a similar dry episode prevailed
during most of the first half of the 19th century," when much of the planet
was still in the grip of the Little Ice Age.

In considering the preponderance of real-world evidence from the continent
of Africa, it thus appears that the global warming of the past century or so
has not led to either a greater frequency or severity of drought in that
part of the world. Indeed, even the worst drought in recorded meteorological
history does not seem to be any greater than what has periodically occurred
during much colder times. Hence, we see no reason to put any credence
whatsoever in the climate-alarmist claim that global warming leads to more
frequent or severe droughts. It doesn't!

Holmes, J.A., Street-Perrott, F.A., Allen, M.J., Fothergill, P.A., Harkness,
D.D., Droon, D. and Perrott, R.A. 1997. Holocene palaeolimnology of
Kajemarum Oasis, Northern Nigeria: An isotopic study of ostracodes, bulk
carbonate and organic carbon.  Journal of the Geological Society, London
154: 311-319.

Nicholson, S.E. 2001. Climatic and environmental change in Africa during the
last two centuries. Climate Research 17: 123-144.

Verschuren, D., Laird, K.R. and Cumming, B.F. 2000. Rainfall and drought in
equatorial east Africa during the past 1,100 years. Nature 403: 410-414.
Copyright © 2002. Center for the Study of Carbon Dioxide and Global Change 


From CO2 Science Magazine, 10 July 2002

The Kyoto Protocol is based on the premise that the ongoing rise in the
air's CO2 content must be slowed as soon as possible, and ultimately stopped
altogether, in order to avoid an increase in mean global air temperature of
sufficient magnitude to inflict serious damage on the biosphere.
Consequently, those who believe in the conceptual foundation of the Protocol
- as well as some who don't (but who promote its adoption for political or
philosophical reasons) - would like to see its provisions implemented as
soon as is practicable, in order to prevent the presumed deleterious
consequences (or, alternatively, to foist their political philosophy upon
the world).

Within this context, it is important to know what the proponents of the
Protocol consider a dangerous climate impact worthy of immediate action.
Taking the Intergovernmental Panel on Climate Change as their guide, O'Neill
and Oppenheimer (2002) say it is an impact that either imposes a risk upon
unique and threatened ecosystems or engenders a risk of some large-scale
discontinuity in earth's climate system.  On this basis, they claim there
are three warming-related risks that are serious enough to implement the
Kyoto Protocol with all due haste.  These risks are the potentials for (1)
the infliction of extreme damage to earth's coral reefs, (2) the
disintegration of the West Antarctic Ice Sheet, and (3) the virtual shutdown
of the marine thermohaline circulation.

With respect to the third of these risks - we dealt with the first and
second ones in our Editorials of 26 June 2002 and 3 July 2002 - O'Neill and
Oppenheimer (hereafter, O & O) claim there is strong evidence that the
thermohaline circulation or THC has shut down many times in the past "in
association with abrupt regional and perhaps global climate changes," citing
Broecker (1997).  They also say that "most coupled atmosphere-ocean model
experiments show weakening of the THC during this century in response to
increasing concentrations of greenhouse gases, with some projecting a
shutdown if the trends continue."  Hence, they conclude that "to avert
shutdown of the THC, we define a limit at 3°C warming over 100 years, based
on Stocker and Schmittner (1997)."

With respect to O & O's claim that the THC abruptly shut down several times
in the past - with which we have no argument - there is an important
auxiliary fact they fail to mention; and that is that these shutdowns
typically occurred during very cold glacial or transition periods, but
rarely, if ever, during warm interglacials. Part of their failure to report
this fact may be related to their reliance on somewhat outdated
publications, such as Broecker (1997), who referenced a minor (and readily
explained) exception to this rule that occurred approximately 8200 years
ago, and Stocker and Schmittner (1997), who have recently reported much
different modeling results than they did half a decade ago.

In the case of the dramatic regional cooling of 1.5-3°C that is known to
have occurred at marine and terrestrial sites around the northeastern North
Atlantic some 8200 years ago, Barber et al. (1999) have made a strong case
for the likelihood that it was caused by the catastrophic release of a huge
amount of freshwater into the Labrador Sea as a result of the final outburst
drainage of glacial lakes Agassiz and Ojibway, which significantly reduced
the strength of the THC and retarded the transport of heat to the northeast
North Atlantic. Consequently, this event could validly be classified as a
"holdover" phenomenon from the prior glacial-to-interglacial transition.

In a model study of a more gradual increase in freshwater input through the
St. Lawrence River system "similar to that associated with freshening due to
the [predicted] warming climate of the next century," Rind et al. (2001)
found that "North Atlantic Deep Water production decreases linearly with the
volume of fresh water added through the St. Lawrence" and that it does so
"without any obvious threshold effects." Under such circumstances it would
be expected that the gradual slowing of the THC would gradually reduce the
northward transport of heat in the North Atlantic, leading to a gradual
reduction in freshwater input to the North Atlantic through the St. Lawrence
and other rivers that would ultimately lead to a gradual intensification of
the THC, and so on, thereby producing a climatic oscillation of much more
modest amplitude than the abrupt changes about which O & O are so concerned.
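The self-limiting adjustment described above can be illustrated with a toy feedback model. This is our own minimal sketch, not Rind et al.'s GCM; the coefficients, units, and initial anomalies are arbitrary assumptions chosen only to show the qualitative behavior.

```python
# Toy negative-feedback loop (illustrative only, not Rind et al.'s model):
# freshwater input weakens the THC; a weaker THC carries less heat north,
# which in turn reduces freshwater input. With linear (threshold-free)
# coupling plus weak damping, the result is a damped oscillation rather
# than an abrupt shutdown.
def simulate(steps=200, dt=0.1, a=1.0, b=1.0, damping=0.1):
    thc, fresh = 1.0, 0.5  # anomalies about equilibrium (arbitrary units)
    history = []
    for _ in range(steps):
        d_thc = (-a * fresh - damping * thc) * dt    # freshwater slows the THC
        d_fresh = (b * thc - damping * fresh) * dt   # stronger THC -> more freshwater
        thc += d_thc
        fresh += d_fresh
        history.append(thc)
    return history

h = simulate()
# Amplitude shrinks over time: the oscillation is damped, not runaway.
print(max(abs(x) for x in h[-20:]) < max(abs(x) for x in h[:20]))  # True
```

Replacing the linear coupling with a threshold response is what would be needed to produce the abrupt shutdowns O & O worry about; the point of Rind et al.'s result is that no such threshold appeared.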

Similar conclusions have been reached by a number of other investigators as
well. The modeling work of Ganopolski and Rahmstorf (2001, 2002) and Alley
and Rahmstorf (2002), for example, suggests that the North Atlantic branch
of the THC possesses two potential modes of operation during glacial times.
It oscillates between them in response to weak (and probably solar-induced)
forcings that produce small cyclical variations in freshwater input to high
northern latitudes; amplified by stochastic resonance, these forcings
produce rapid warmings, followed by slower coolings, that are a full order
of magnitude greater than similar oscillations occurring during interglacials.
Throughout these latter much warmer periods, such as the current Holocene,
however, Ganopolski and Rahmstorf (2002) report that climate "is not
susceptible to regime switches by stochastic resonance with plausible
parameter choices and even unrealistically large noise amplitudes, and
neither is it in conceptual models."  Furthermore, they correctly report
that real-world observations reveal "there is no evidence for regime
switches during the Holocene," as previously noted by Stocker (2000) and
suggested by the modeling work of Latif et al. (2000) and Gent (2001).

Schmittner et al. (2002) come to pretty much the same conclusion, i.e., that
the stability of the THC during glacial periods is much reduced from what it
is during interglacials; and in a review of the current status of our
knowledge of abrupt climate change and its relationship to changes in the
THC, Clark et al. (2002) conclude that essentially all of the rapid warmings
and associated slower coolings of which we have record "were characteristic
of the last glaciation," as opposed to the Holocene.  They thus state that
"the palaeoclimate data and the model results also indicate that the
stability of the thermohaline circulation depends on the mean climate
state."  And in this regard, cold is incredibly robust, producing large and
rapid changes in climate, while warm is much more subdued, producing more
gentle variations of which periodic swings between Medieval Warm Period and
Little Ice Age conditions are typical.

These several observations thus suggest that, if anything, additional warmth
may actually provide an insurance policy against radical reorganizations of
the THC, as well as the abrupt climate changes that accompany them. Hence,
this last of the three most dangerous impacts of global warming identified
by O & O - like its predecessors - turns out to be pretty much of a

Dr. Sherwood B. Idso, President 
Dr. Keith E. Idso, Vice President 

Alley, R.B. and Rahmstorf, S. 2002. Stochastic resonance in glacial
climate. EOS, Transactions, American Geophysical Union 83: 129, 135.

Barber, D.C., Dyke, A., Hillaire-Marcel, C., Jennings, A.E., Andrews, J.T.,
Kerwin, M.W., Bilodeau, G., McNeely, R., Southon, J., Morehead, M.D. and
Gagnon, J.-M. 1999. Forcing of the cold event of 8,200 years ago by
catastrophic drainage of Laurentide lakes. Nature 400: 344-348.

Broecker, W.S. 1997. Thermohaline circulation, the Achilles heel of our
climate system: Will man-made CO2 upset the current balance? Science 278:

Clark, P.U., Pisias, N.G., Stocker, T.F. and Weaver, A.J. 2002. The role of
the thermohaline circulation in abrupt climate change. Nature 415: 863-869.

Ganopolski A. and Rahmstorf, S. 2001. Rapid changes of glacial climate
simulated in a coupled climate model. Nature 409: 153-158.

Ganopolski, A. and Rahmstorf, S. 2002. Abrupt glacial climate changes due to
stochastic resonance. Physical Review Letters 88: 038501.

Gent, P.R. 2001. Will the North Atlantic Ocean thermohaline circulation
weaken during the 21st century?  Geophysical Research Letters 28: 1023-1026.

Latif, M., Roeckner, E., Mikolajewicz, U. and Voss, R. 2000. Tropical
stabilization of the thermohaline circulation in a greenhouse warming
simulation. Journal of Climate 13: 1809-1813.

O'Neill, B.C. and Oppenheimer, M. 2002. Dangerous climate impacts and the
Kyoto Protocol.  Science 296: 1971-1972.

Rind, D., deMenocal, P., Russell, G., Sheth, S., Collins, D., Schmidt, G.
and Teller, J.  2001.  Effects of glacial meltwater in the GISS coupled
atmosphere-ocean model. I. North Atlantic Deep Water response. Journal of
Geophysical Research 106: 27,335-27,353.

Schmittner, A., Yoshimori, M. and Weaver, A.J. 2002. Instability of glacial
climate in a model of the ocean-atmosphere-cryosphere system. Science 295:

Stocker, T.F. 2000. Past and future reorganizations in the climate system.
Quaternary Science Reviews 19: 301-319.

Stocker, T.F. and Schmittner, A. 1997. Influence of CO2 emission rates on
the stability of the thermohaline circulation. Nature 388: 862-865.

Copyright © 2002. Center for the Study of Carbon Dioxide and Global Change 


From World Climate Report, 8 July 2002

We've been waiting, and it hasn't been long, for someone to say that our
recent wildfire outbreak is caused by global warming and its attendant
climate changes, and doggone it if Bob Herbert of the New York Times didn't
oblige us on June 24:

   ...Enormous wildfires have been raging in bone-dry regions of the West
   and Southwest... In Colorado, which is enduring its worst drought in
   decades... the long drought and continued hot weather provided the
   conditions that enabled this [fire] to explode into an unprecedented
   conflagration... "Can you say global warming?"

Sure. We'll even help Bob along with a little quantitative analysis,
something that appeared to be missing from his column.

A gander at Figure 1, which shows 10-year averages for acreage burnt in the
United States, is a good start. In the warm 1990s, we averaged around 5
million acres per year going up in smoke. In the cool 1960s, it was also
around 5 million.

Figure 1. Average acreage burned per year in the U.S., by decade.

But before Bambi, the cartoon that has probably spawned more ecological
mismanagement than any other film, we used to just let things burn. So when
we look at, say, the 1930s, about 38 million acres went kerblooey each year;
25 million went up on the average in the 1920s.

So let's stop the nonsense about more acreage going up than ever.

But, lest we be accused of setting up a straw tree, let's have a look-see at
what has happened since Bambi, or in the current era of irrational fire
suppression.

Figure 2 shows summer (June-September) temperature since 1960, and Figure 3
shows decadal average precipitation over the United States. Yesiree, Bob,
there is a warming trend, with a rise of about 0.9°F in the period. And,
something that anyone who wanted to get the whole story out would have
noted, there's also an increase in precipitation. In the 1960s, we averaged
about 28.3 inches of rain per year. By the 1990s that moved up to 30.5, or a
rise of 2.3 inches. Our friend Kevin Trenberth of the U.S. National Center
for Atmospheric Research, and our very few friends on the United Nations'
Intergovernmental Panel on Climate Change all like to conflate this with
global warming, and they think they're always right (that's why they call
themselves the Consensus of Scientists), so we'll agree for the sake of

Figure 2. U.S. temperatures since 1960.
Figure 3. Average precipitation per year in the United States, by decade.

We naively believe that heat accelerates forest fires and rain puts them
out. And to test that hypothesis, we'll actually build a computer model:

Acreage burned = X (temperature) -Y (rainfall).

This little equation explains a bit under half of the total year-to-year
variation in acreage burned in the U.S. Our computer model is displayed
graphically in Figure 4.

Figure 4. Computer-modeled and observed acreage burnt, using temperature and
rainfall.

X turns out to equal approximately 700,000, which means, on the average,
that a year that is one degree warmer than normal will have 700,000 more
burned acres. Calm down, Bob! Y is equal to 400,000. So every inch of rain
above normal reduces the annual burn by 400,000 acres.
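With those two coefficients, the model can be written out directly. This is a sketch using the article's stated values for X and Y; the regression itself was fitted to data not reproduced here, and the small discrepancies with the article's round numbers are just rounding.

```python
# The article's two-coefficient model, using its stated values for X and Y.
# Inputs are departures from normal; output is the change in acres burned.
X = 700_000   # extra acres burned per degree F above normal
Y = 400_000   # acres spared per inch of rainfall above normal

def burn_anomaly(temp_anom_f, rain_anom_in):
    return X * temp_anom_f - Y * rain_anom_in

# The 1960s-to-1990s changes cited in the text:
warming_only = burn_anomaly(0.9, 0.0)   # about +630,000 acres
rainfall_only = burn_anomaly(0.0, 2.3)  # about -920,000 acres (text rounds to 930,000)
net = burn_anomaly(0.9, 2.3)            # about -290,000 acres (text: minus 300,000)
print(round(warming_only), round(rainfall_only), round(net))
```

The point the article goes on to make is that this net figure is far smaller than the roughly 1,600,000-acre year-to-year scatter, so the model's climate signal is lost in the noise.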

Since 1960, the 0.9 degree rise in temperature means that we are burning
630,000 more acres per year. But, as shown in Figure 3, the decadal
2.3-inch rise in rainfall means that we're burning 930,000 fewer due to
increased moisture. In other words, the total "change" in the
climate-related signal according to our model is minus 300,000 acres.

Perhaps readers now expect us to claim global warming is reducing U.S.
fires, but we won't. That's because we're a bit more sophisticated than a
mere columnist for the Times. It turns out you can't find the needle of
300,000 acres in this 6,000,000-acre haystack. That's because the normal
variation in acreage burnt from year to year is about 1,600,000 acres. As we
scientists say, you can't tell the climate signal from the random noise. If
we were columnists, we would have written "Can you say random numbers?"

Seeing the Forest Through the Trees

There's even more here than meets the eye, namely a lot of smoke.

A look back at Figure 1 shows that the average burn in the pre-Bambi era
was about 30 million acres, or six times the current figure. These fires
were natural. They're what you got if you just let lightning do its thing
every summer. And they point up what may be one of the biggest myths in our
modern ecological pantheon: that our current air is unusually "polluted"
compared to the natural background.

The average pre-Bambi burn turns out to have put about 1.5 teragrams (a
teragram is a thousand billion grams) of carbon dioxide into the atmosphere
each year, which is about 20 percent of the 7.0 teragrams that we currently
combust every year in the form of fossil fuels. Note that a great deal of
the current combustion is
"clean," with smog-forming nitrogen oxides (NOX) catalyzed away, and
power-plant effluent much less obNOXious (get it? har har har) than it used
to be.

The trees of the pre-Bambi era oxidized at will, with no great catalytic
converter sucking their exhaust through the skies, and they put out plenty
of nitrogen oxides (that's what you get, among other things, when you burn
things like proteins, and trees, like everything else alive, contain
proteins).

So it's a pretty good bet that the summertime air over much of the United
States was pretty unhealthy before that darned cartoon. Further, the sleaze
wasn't just confined to our cities but was pretty much everywhere.

In other words, for all our good intentions, our "Clean Air Act" may in fact
render the atmosphere unnaturally clear, compared with what it used to be
when we just let it burn.


From Time Magazine, 8 July 2002

Yearning for spiritual leadership, Japan has spawned a rash of apocalyptic
religions and ominously popular sects 


During a thundershower of biblical proportions, hundreds of people dash from
a train station in the Tokyo suburb of Kawaguchi, across a bricked plaza and
into a modern civic center. Inside, there are English lessons on the 11th
floor, "welcome to kabuki world" on the first floor, a seminar on working at
home on the sixth floor. It's a busy night for the self-improvement crowd.
But the main attraction on this Tuesday night in late June is holding forth
in the auditorium. There, more than 2,000 people have gathered to see a man
whom they believe possesses unsurpassed wisdom and power. In their eyes,
Shoei Asai, the 70-year-old leader of a religious sect called Nichiren
Kenshokai, is a healer and a prophet who envisions a looming calamity for
Japan that he alone can avert. "Asai sensei understands," says Kazuhito
Suzuki, a disillusioned young construction worker who professes nothing but
disdain for Japan's establishment and despair for the future. "He has the

Everybody else in this comfortably wealthy country seems lost. Things that
matter - family, safety, community, aesthetics, money - appear to be slipping
away, and the sense of alienation and desperation deepens with every
headline decrying the nation's unemployment rate or senseless schoolyard
violence. The collective yearning for someone - anyone - to check the country's
spiritual drift is palpable. "People are seeking mental healing during this
time of continuous bad news," says Nobutaka Inoue, a professor of religious
studies at Tokyo's Kokugakuin University.

In spiritual matters, as in economics, supply rises to meet demand. Into the
country's leadership vacuum has stepped an eccentric mix of charismatic
characters professing to have a hot line to God or a shortcut to
enlightenment. According to religious scholars, there are some 2,000 "new
religions" in Japan. They include Ho No Hana Sanpogyo, which urges the
worship of feet, and Life Space, a group whose leader claims that he doesn't
need to eat, bathe or sleep because of his superhuman powers. He is now
serving a 15-year prison sentence for murdering a follower and keeping the
mummified corpse in a hotel room.

Some of the groups may be little more than bizarre clubs or innocuous
doctrinal offshoots of traditional Buddhism and Shintoism, Japan's most
common faiths. But religious scholars and the police are nonetheless alarmed
by what they see as the proliferation of doomsday cults. Mystics consumed
with signs of the apocalypse have a tendency to bring their visions
horrifically to life. Japanese need no reminder of Aum Shinrikyo, the cult
that staged a deadly chemical gas attack on commuters in Tokyo's subway
system seven years ago, allegedly masterminded by Aum's Shoko Asahara. Last
week, one of Asahara's top henchmen, Tomomitsu Niimi, became the eighth Aum
member to be sentenced to death in connection with the chemical attack.

Aum still exists, but another movement has eclipsed it. Asai's Nichiren
Kenshokai sect, which drew throngs to the Kawaguchi civic center, claims to
have 881,865 followers. "Kenshokai is the biggest of the new religions,"
says Taro Takimoto, a lawyer who helped in 1995 to organize a group
comprising family members trying to rescue relatives from cults. "There are
many high school students quitting school, people quitting their jobs, to
join Kenshokai." Kenshokai's nationalistic appeal is particularly popular
among young men, including members of Japan's Self-Defense Force. The cult
claims to have attracted 11,000 new adherents in June alone.

Little has been written about this religion in the mainstream media, and its
leader, Asai, has never talked with the press. His eldest son, 39-year-old
Katsue Asai, serves as a kind of general manager for the group and does
agree to meet - the first time, he says, he has given an interview to a
journalist. A serious man in a business suit, he explains how the movement
was started by his grandfather in 1957, when he and his acolytes splintered
from the centuries-old Nichiren sect of Buddhism. Kenshokai differs from
other Nichiren sects - especially the politically powerful Soka Gakkai - in
that its practitioners see it as destined to become the national religion of
its practitioners see it as destined to become the national religion of
Japan. "We still believe that," says Katsue Asai.

Indeed, the leader's son talks of ultimately attracting every living
Japanese soul - all 130 million of them - to the fold. "I'm sure it will
happen," says Katsue Asai, matter-of-factly. How long will it take? "A bit
more than 10 years. At the most, 20 years. This might sound strange," he
says, "but we think this is not only about Japan, but the whole universe. A
huge power is coming, sometime soon. Society is getting really confused
these days. There are problems with education, all the political scandals.
Then there will be a big natural disaster, like an earthquake. Then China
will come to invade us, to take advantage of our problems. When that
happens, people will feel that whatever it was they believed in is
inadequate. That's when they'll come to us."

Using a similar appeal, Aum Shinrikyo drew tens of thousands of followers
during the 1990s, and even ran political candidates in national elections.
Aum's godhead was its founder, Asahara, an intelligent misfit who claimed he
could levitate and who appeared regularly on TV talk shows. Asahara, whose
real name is Chizuo Matsumoto, preached distorted versions of Buddhism and
Hinduism steeped in apocalyptic theology.

For a dozen Tokyo commuters, dire prophecy came true. On a sunny March
morning in 1995, Aum members, in an apparent attempt to create mayhem and
distract a police investigation into their operations, used the tips of
umbrellas to puncture plastic bags filled with liquid sarin, which they left
behind on five subway trains. A poisonous, invisible cloud spread through
the carriages and stations. Thousands of people were made sick, and 12 died.

Asahara, now 47, has spent the past seven years in a Tokyo jail cell. In
court one day recently, facing murder charges in connection with, among
other crimes, the gas attack, he bobs his head up and down, looking tired
and confused. His hair, once wild and frizzy, is now cut short, his
Rasputin-like beard trimmed respectably. Every move he makes is closely
watched by his remaining disciples - wide-eyed men and women who flock to the
courtroom to bask in the aura of the man they still consider their spiritual
father. "He never did what you expected him to," says one of them, Hiroki
Araki. "But we are grateful for what he has given us."

Remaining members, who number in the hundreds, changed the group's name two
years ago from Aum Shinrikyo (Supreme Truth) to Aleph (the first letter of
the Hebrew alphabet). Japan's Public Security Investigation Agency assigns
about 50 agents to keep tabs on them. The cult has seven main facilities
throughout Japan and 20 smaller branches where followers can practice
meditation; it also organizes yoga classes, computer seminars and student
clubs on university campuses. These activities attract recruits like Ai
Ozaki, 25. A shy, thoughtful woman, Ozaki (her cult name) joined Aum after
the sarin attack, drawn in part by its promise of life after death in a
reincarnated form. "I was afraid of dying," she says, "so I liked their
creed." She knows in her heart, she says, that Asahara must have had
something to do with the subway murders, "but there is a part of me that
still hopes he can save me. I still want to believe in him."

So many people flock to see Shoei Asai in Kawaguchi that dozens of
latecomers must wait outside in the lobby because there isn't room for them
in the auditorium. Black-suited men with walkie-talkies and earplugs roam
through the crowd, reminding people to turn off their cell phones. Two of
these guards, surprised to see a foreign visitor, stop me from entering
before another explains that I was invited. It's the first time a journalist
has been allowed to witness a Nichiren Kenshokai meeting.

On the stage, there are 11 rows of chairs, with 24 people per row, each
person sitting straight-backed, hands neatly placed on their legs. The men
all wear black suits with white shirts and ties. They all sit silently,
until Asai himself appears, walking briskly across the platform to his
chair, front and center. He says not a word and sits down. Moments later,
one of the men onstage stands up, removes his jacket and walks forward. His
legs positioned in a kendo stance, he whisks out a golden fan emblazoned
with the red circle of the Japanese flag. He briskly waves the fan in
deliberate downward strokes as a militaristic march plays in the background.
The audience claps along, one solid clap every three seconds, while they
sing in unison:

The sound of footsteps roar the earth
A grand marching of the missionaries
In the midst of evil and eternal damnation
Buddha's army rises to save the suffering

Over the next two-and-a-quarter hours, a dozen or so members of this
Buddha's army rise to deliver emotional testimonials to the power of their
religion and their leader. A young woman describes a litany of health
woes - debilitating skin disease, a broken leg suffered while snowboarding,
stomach problems - that all miraculously disappeared one month after joining
Kenshokai. Another woman speaks of how her son was born with a hole in his
heart but was cured by the powers of Kenshokai. She adds exultantly: "We
must all help Asai for the rest of our lives."

Finally, it is Asai sensei's time to talk. He wears a gray business suit and
glasses and has thinning gray hair. He doesn't quote scriptures or recite
Buddhist chants. His short speech is delivered in warm, avuncular, soothing
tones. Yet his words conjure up pictures of doom as he talks about last
year's terrorist attacks on the U.S., about al-Qaeda and the threat of dirty
bombs. "A big country like America couldn't even crush al-Qaeda completely,"
he says. "Because of this technology, these dirty bombs, even a small group
can ruin America."

It is somehow proof, he suggests, that salvation can be found only in the
teachings of his religion. "The most important thing is to teach all
Japanese people, seriously and strongly. Even if they don't believe in the
beginning, it is important that they know." After Asai leaves the stage, the
crowd disperses. The black-suited guards swoop down on me as I try to
introduce myself to members. Outside, a middle-aged woman clutching a tape
recorder offers to explain her beliefs. "There will be a big disaster in
Japan, and Asai sensei will become the leader," she says. "You never know
from one year to the next who will be the Prime Minister," she adds. "It is
always uncertain. But Asai sensei is always with us. He is the only one who
talks about Buddhism for the nation. He is the only one who can save us."

Copyright 2002, Time Magazine


>From Tech Central Station, 9 July 2002

By Howard Fienberg 07/09/2002 

A World Wildlife Fund (WWF) study, to be released today, warns that
over-consumption will force human colonization of other planets within fifty
years unless it is curtailed immediately. The WWF report warns that the seas
will become emptied of fish, all forests will be destroyed and supplies of
drinking water will become polluted or disappear.

However, according to reliable data, the Day of Judgment is not just around
the corner. In fact, economic and scientific advances have made it possible
for the developed world to be more efficient in its natural resource usage,
to find or make new resources, and maintain or revive endangered species.

The WWF follows in the footsteps of the Reverend Thomas Malthus, who
famously predicted in his 1798 tract "An Essay on the Principle of
Population" that exponential human population growth would inevitably
outstrip food supplies, leading to starvation, war and mass deaths. The WWF
is not alone, as Paul Ehrlich reworked Malthus' mathematically-based
theories for the modern era, predicting large numbers of fatalities from
famine in the 1970's in his 1968 book "The Population Bomb." Others like the
Club of Rome have adhered to essentially the same arguments. But the
predicted mass starvation never arrived, and both humanity and the Earth
survive to this day. Will 'present trends' continue, forcing us to begin
terraforming Mars for a mid-century move? The data say no.

Will deforestation leave us with no trees? Data from the UN's Food and
Agriculture Organization show a relatively consistent trend line from 1948
to the mid-nineties, with about 30 percent of the planet's land surface
covered by forest. If these trends continue, nothing bad will happen. It is
likely that forest coverage will increase rather than decrease. Ongoing
improvement in modern agriculture means that less and less space is needed
to produce more and better food, decreasing the need for clear-cutting. The
developed world is already increasing conservation and replacement efforts,
which might off-set any decrease in cover caused by excessive consumption in
the developing world. Indeed, commercial growers are making use of
faster-growing pine stocks -- a good example of market adaptation at work.

Will the seas be empty of fish? Not likely. The WWF claims that stocks of
cod in the North Atlantic have declined from an estimated spawning stock of
264,000 tons in 1970, to less than sixty thousand in 1995. Yet while
over-fishing of particular stocks and particular regions has yielded a
decline in many wild catches, scientific aquaculture and genetically
modified fish development mean that farmed fish production is increasing
dramatically every year. Already, it accounts for about twenty percent of
seafood production. Increased reliance on farmed fish implies lower reliance
on wild fish, perhaps giving their stocks a chance to recover and even grow.

Will we run out of drinking water? No way. While certain regions appear
destined for shortages (if they don't have them already), the planet as a
whole has an abundance of drinking water. The WWF might make a credible case
for new policies in water management and pricing, so that these regions may
benefit from other regions' abundance. But drinkable water is not
disappearing. Besides, over seventy percent of the Earth's surface is water
of some kind (if the worst-case-scenario climate change models are correct,
that percentage will grow even further). That water can recycle through the
climate, eventually into drinking water, or we can desalinate it if need be.

In short, while many major events are bound to befall us in the next
fifty-odd years, the complete eradication of natural resources isn't one of
them.

This study comes hard on the heels of other doom-saying research, like
Mathis Wackernagel's recent Proceedings of the National Academy of Sciences
article. Wackernagel and his co-authors claimed to have measured humanity's
"footprint" on the Earth. By their calculations, humanity's footprint had
already outgrown the planet's "capacity" by 25 percent in 1999. However, as
pointed out by Reason magazine's Ron Bailey and Spiked's Jennie Bristow, the
researchers' model fails to account for technological innovation. While
charting the need for farmland, the researchers do not control for
agricultural advancements, which will dramatically reduce the requisite
amount of land. Efficiency gains in all kinds of resource consumption
combined with resource restoration make Wackernagel's claims seem
outlandish. Similarly, the UN Environment Program made headlines last month
when it declared that a quarter of the world's mammal species could face
extinction within the next 30 years. But all they actually documented were
species whose habitats were "threatened."

Why was the WWF study released at this particular moment? At the end of
August, the United Nations will hold an Earth Summit. Both the Wackernagel
and WWF studies seem to single out the United States for environmental
transgressions. It is possible they hope to shame the U.S. into concessions
at the conference. As the Observer reported on July 7, "WWF wants world
leaders to use its findings to agree on specific actions to curb the
population's impact on the planet." One would hope that, whatever dramatic
proposals are considered at the Earth Summit, U.S. policy-makers will at
least consult freely available scientific data before making any rash
decisions.

© 2002 Tech Central Station



>From David Grinspoon <>

Dear Benny,

I enjoy many of the articles on CCNet, but in my view your "Climate change
and climate scares" postings only discredit CCNet and indirectly harm the
credibility of the movement for NEO awareness and planetary protection.

As a climate modeler myself, I am well aware that it is an inexact science,
at best.  I don't believe that our predictions are very good. However, it is
perfectly clear that we are altering the atmosphere in ways that are almost
surely changing the climate, and to just stick our heads
in the sand and continue this flatulence unabated does not seem like an
intelligent approach. We should not wait for "proof" of global warming to
change our behavior, not if we want to act like an intelligent species with
some control over our behavior and some concern for our global home.

It seems to me that there is a huge inconsistency in your positions on NEOs
and global warming.  As with climate modeling, there is no "proof" that an
NEO is going to hit the Earth at any given time. And we know that some
impacts are "good" for life on Earth, just as some greenhouse warming may
possibly be "good". Therefore, the consistent position would be to ridicule
predictions of doom by asteroid or comet impacts, extol the virtues of
impacts, and argue to derail all global attempts to deal with the problem as
long as our information is still incomplete.

Your readers should also know that "CO2 Science Magazine" from which the
lion's share of your climate missives seems to stem, is closely related to
the Western Fuels Association, a cooperative of coal-dependent utilities in
the western United States that works to discredit climate change science and
to prevent regulations that might hurt coal-related industries.

The fundamental difference between science and pseudoscience, it seems to
me, is a different approach to evidence. Science ideally reaches conclusions
based on evidence. Pseudoscience reaches conclusions and then sifts through
and spins evidence to support these predetermined beliefs.

By this definition, your climate postings seem to me like pseudoscience.
The danger is that people will, by association or analogy, be led to
conclude the same thing about the NEO threat.

With best regards,

David Grinspoon
Southwest Research Institute                        


>From Benny Peiser <>

David is not the first subscriber who has been concerned about my skepticism
regarding environmental and religious doomsday prophecies. He is also right
to note that I treat the impact hazard somewhat more generously than the
risks associated with global warming. This different approach is due to the
fact that we know a little more about past impacts and their catastrophic
effects, which have been studied in situ, while the global warming scenarios
and their alleged disastrous consequences are purely hypothetical. Moreover,
they are vigorously contested by a number of eminent researchers who neither
accept the reliability of climate modeling and long-term prediction, nor the
underlying assumption that the current warming trend is primarily due to
human action.

Within the NEO community, in contrast, skeptical colleagues are always at
hand when they feel that I (or others) over-estimate the NEO hazard or call
for unreasonable action/funding etc. You only need to compare the small
amount of money spent worldwide on NEO research and detection with the huge
sums spent on climate change to realise that the NEO community has a more
realistic understanding of the inherent NEO risk. In this way, we have
achieved a healthy balance while maintaining an overall consensus that the
impact hazard can, over time, be tackled effectively (and cost-effectively).

Unfortunately, this cannot be said about the wider climate change community.
I believe the most important aim should be to apply normal risk assessment
tools to the problem and stick to empirical data and evidence in order to
decide whether or not global action (a la Kyoto Protocol) should be taken
that will unquestionably cost billions of dollars and might cost millions of
jobs. I do not believe that the current empirical evidence warrants such
far-reaching action. After all, there is no compelling evidence that the
insignificant and often erratic warming trend during the last century is
man-made rather than, say, the result of our continuing emergence from the
cold spell of the Little Ice Age.

While David is concerned that posting climate skepticism on CCNet may harm
the integrity of the NEO community, I firmly believe that CCNet's general
anti-alarmism actually enhances our reputation as a trustworthy scientific
community that will not use scare tactics in order to obtain more funding -
as many other scientific communities regrettably and habitually do!

Let me set David's mind at rest. Over the years, I have indeed not taken 'a
consistent position to ridicule predictions of doom by asteroid or comet
impacts.' Instead, I have emphasised a sanguine philosophy that the impact
hazard can be cracked progressively over the next 100 to 200 years. In this
way - and at the same time - we will be developing new technologies and
space industries as well as improving living standards - rather than
restraining economic progress and retreating in cultural cynicism and
technological distrust.

Benny Peiser


>From Timo Niroma <>

The IPCC has taken too rigid a view of the climate. I think this is now
widely seen. Whether we take as our time perspective 1,000 million years, 1
million years, thousands of years or 100 years, we see oscillation. Into the
first category fall the pre-Cambrian total freeze, the dinosaur-era tropical
warmth, and the cold of the last 15 million years. Into the second fall the
100,000+-year ice ages; into the third the Roman Warm Period, the Dark Ages
cold, the Medieval Warm Period, the Little Ice Age with the Spörer and
Maunder minima, and today's Modern Warm Period. Between these lies the
climate after about 2200 BC, estimated to be 2 degrees colder, which divides
the Holocene (excluding its beginning) into two periods.

In the following I describe temperatures in Finland in 1900-2001 to show
the minor oscillations on the climb towards the Modern Warm Period, which is
still about 1 degree below what reigned around 1000-1100 AD, some 900 years
earlier.

Based on statistics from the Finnish Meteorological Institute, I listed the
four warmest and four coldest temperatures for eight stations: three in
southern, three in central and two in northern Finland. Listing only the
peak years, I got the following results:

1. cold period 1900-1923:

1902 whole country
1915 whole country

1. warm period 1930-1939:

1934 whole country, esp. summers
1938 whole country, esp. summers

2. cold period 1940-1943:

1941 south and middle, esp. winters
1942 south and middle, esp. winters

A smooth period 1944-1984, with minor oscillation 1965-1972.
The only exception: Lapland 1974, very warm.

3. cold period 1985-1987

1985 whole country, esp. winters
1987 south and middle, esp. winters

2. warm period 1988-

1989 whole country, esp. winters
2000 whole country

Phase shifts:

1920s: oscillations, last remnants of the Little Ice Age; Modern Warm Period begins
1939-1940: global warming interrupted for nearly 50 years
1988: global warming continues; the Modern Warm Period resumes


Based on my solar formulas, I predict a second interruption in global
warming beginning around 2010 and at its deepest in 2020-2030. After that the
Modern Warm Period continues, reaching the Medieval value, which was 1 degree
above today's, but not exceeding it. Today's data give no clear sign of man's
influence, unless it is the 50-year interruption of 1940-1987, easily
explained by half a Gleissberg cycle but perhaps assisted by a dirty
atmosphere caused by cars, agriculture and industrial smoke (not, in any
case, by CO2). A cleaner atmosphere would assist the natural warming. (So the
Kyoto Protocol, or rather cleaner emissions, would assist the warming.) The
amount of CO2 in the atmosphere increased from about 300 ppmv to 350 ppmv
during 1940-1987 without any effect on temperature.

In this situation it is difficult to avoid the anomalous year 1867. May of
that year and the first part of June were the coldest in the entire 174
years of observations in Helsinki. The frost returned as early as September,
and during the winter of 1867-68 over 100,000 Finnish people died of famine.
May 1867 was truly astounding. In the next table I have summarised all the
May mean temperatures in Helsinki from 1829 to 2002. The coldest year, 1867,
had a mean temperature of 1.8 degrees, compared with 4.2 degrees for the
second coldest. The whole table of the 174 years is listed here:

Mean May temperature (degrees)   Number of years

 2    1
 3    -
 4    1
 5    4
 6   15
 7   29
 8   39
 9   29
10   26
11   19
12    7
13    4

It seems to have been a northern and central European phenomenon. If we take
the decadal average as the norm, May 1867 was more than 6 degrees colder
than normal in southern Finland, Estonia and north-west Russia; St
Petersburg suffered a cold of 6.6 degrees below normal. The area where the
temperature was more than 4 degrees below normal extends over the whole of
Finland, Sweden apart from Skåne, the whole Baltic region and European
Russia almost to the Urals. Enlarging to the area where the temperature was
more than 2 degrees below normal, we can include Norway, Denmark, southern
Sweden, northern Germany and eastwards through Ukraine. Normal temperatures
were reached only south of a line running from England to the Black Sea.

Jean Grove's "The Little Ice Age" lists "a notable frost event" in the USA
in the preceding year, 1866.

If we accept the great part that small atmospheric particles have played in
causing cold periods after volcanic outbreaks, one readily accepts as an
explanation here the Leonids of 1865-66, which together with a weak sun
allowed greater cosmic radiation.

This is to say that there are two types of oscillation in temperature: one
based on cycles of very different periods, the other sudden drops due to
some external cause.

Timo Niroma


>From Number Watch, July 2002

It seemed too good to be true when the New York Times decided to publish an
article on the importance of negative results [CCNet 9 July 2002], but of
course, they missed the main point. They seem to think that the main reason
for publishing negative results is to save others from repeating the
experiments. In fact, it is the almost universal non-publishing of negative
results that sustains most of the epidemiological fictions that arise from
apparent positive results.

Let us demonstrate with some simple modelling. Assume that there is a
politically correct surmise that passive drinking causes toe-nail cancer.
Assume also that there is actually no such correlation and that the
probability of any member of the populace contracting the disease in a year
is 0.0001. Eager to get on the bandwagon, no fewer than one hundred
professors worldwide each detail a research student to do a study. Each
research student is especially diligent and finds 100,000 people who have
been exposed to passive drinking. After a year they look to see how many
have contracted the disease. It is obvious from the figures that we would
expect about ten. The 100 results, however, are scattered about this figure
and we can model this process by, for example, the MathCad expression:

v := rbinom (100, 100000, 0.0001)

The results of the 100 studies are then as in the graph below.

This is where the publication bias comes in. Only 46 of the 100 results are
positive in that they produce more than the expected number. The rest are
discarded and the professors go on to some other topic. So the distribution
of published results is then as in the second graph. The average of the
whole 100 studies is 10.11, about what we expected, but the average of the
published studies is now 12.975.
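The same thought experiment can be reproduced in a few lines of Python - a
minimal sketch, with NumPy's `binomial` standing in for MathCad's `rbinom`
and an arbitrary seed, so the exact numbers will differ from those quoted
above:

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed for reproducibility

# 100 studies, each following 100,000 exposed people with a true
# per-person annual risk of 0.0001 -- about 10 cases expected per study.
studies = rng.binomial(n=100_000, p=0.0001, size=100)
expected = 100_000 * 0.0001  # 10

# Publication bias: only "positive" studies (more cases than expected)
# ever see print; the rest are quietly discarded.
published = studies[studies > expected]

print("mean of all 100 studies:  ", studies.mean())
print("mean of published studies:", published.mean())
print("implied relative risk:    ", published.mean() / expected)
```

With no real effect at all, the published subset alone still implies a
relative risk comfortably above 1, which is the whole point of the exercise.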

At this stage along comes a nameless US Government agency to do a
meta-study, combining all the results. The next you hear is that all the
media are carrying headlines bellowing that Passive drinking causes a thirty
percent increase in toenail cancer. This all shows how easy it is to get a
Relative Risk of 1.3 on the basis of no effect at all.

Did you notice, by the way, that one of the studies produced an RR of
greater than 2? Which will perhaps help to answer one of the most frequent
questions posed to Number Watch - why are real scientists not prepared to
accept risk ratios of less than 2?

In case you think that there is any exaggeration in the above, the effect is
actually understated. The random numbers would normally be divided by
another set of random numbers, the estimates for the non-exposed population,
which further increases the scatter. Furthermore, epidemiologists vary in
their squeamishness about publishing low RRs, which moves the average even
higher and nicely rounds the distribution, so that it looks more genuine.
Also, we did a relatively large number of studies in order to be able to
produce histograms, but the argument applies however many or few studies
are done.

It does not matter much which sort of study is done. All that matters is the
expected number (10 in this case, quite a common sort of number). You might,
for example, have done a retrospective study. If you have a thousand toenail
cancer patients and the probability of any individual being exposed to
passive drinking is one percent, you again have an expected number of 10.

According to friend Poisson, the resulting scatter (standard deviation) as a
fraction of the mean for an expected value of N is the reciprocal of the
square root of N, so for N = 10 it is about 30%, which is what we got.
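That Poisson scaling is easy to check numerically - an illustrative sketch,
with the sample size chosen arbitrarily:

```python
import math
import numpy as np

N = 10  # expected number of cases per study
rng = np.random.default_rng(0)

# Draw many Poisson counts with mean N and compare the relative scatter
# (standard deviation as a fraction of the mean) against 1/sqrt(N).
samples = rng.poisson(lam=N, size=200_000)
relative_scatter = samples.std() / samples.mean()

print(relative_scatter)   # roughly 0.316 for N = 10
print(1 / math.sqrt(N))
```

The same relation gives the roughly 30% scatter seen in the simulated
studies, whether the expected count of 10 comes from a prospective or a
retrospective design.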

Funnily enough, about 1.3 is a very common RR in published epidemiological
studies. Some of the really tacky ones, such as the EPA study on ETS, even
go down to less than 1.2.

It's a funny old world.

CCNet is a scholarly electronic network. To subscribe/unsubscribe, please
contact the moderator Benny J Peiser < >. Information
circulated on this network is for scholarly and educational use only. The
attached information may not be copied or reproduced for any other purposes
without prior permission of the copyright holders. The fully indexed archive
of the CCNet, from February 1997 on, can be found at DISCLAIMER: The opinions,
beliefs and viewpoints expressed in the articles and texts and in other
CCNet contributions do not necessarily reflect the opinions, beliefs and
viewpoints of the moderator of this network.

The content and opinions expressed on this Web page do not necessarily reflect the views of nor are they endorsed by the University of Georgia or the University System of Georgia.