Black is the New Green

SUBHEAD: Biochar can address three or four critical crises at once.

By Fiona Harvey on 27 February 2009 in The Financial Times
Image above: Detail of a page of portraits of South American natives.
In Brazil’s Amazon basin, farmers have long sought out a special form of fertiliser – a locally sourced compost-like substance prized for its amazing qualities of reviving poor or exhausted soils. They buy it in sacks or dig it out of the earth from patches that are sometimes as much as 6ft deep. Spread on fields, it retains its fertile qualities for long periods.

They call it the terra preta do indio – literally, “the dark earth of the Indians”. Dense, rich and loamy, this earth forms a stark contrast with the thin, poor soils of the region. (It seems a paradox, but rainforest soils have low fertility. This is why farmers who cut down the forest for agriculture have to keep on felling – after a few years of cropping, yields collapse and they have to move on.) Patches of terra preta extend for many hectares in some places but until recently, no one really knew what the mysterious dark earth was. Some guessed it was volcanic, or the sediment of old lakes, or the residue of some long-rotted vegetation. Few imagined that it was man-made.

Terra preta, modern analysis has proved, is one of the last remaining traces of pre-Columbian agriculture in the Amazon basin. It was made more than 2,500 – and perhaps as long as 6,000 – years ago by people living by the river. These cultures survived and supported complex agriculture, despite poor soil, by making their own earth. They used dung, fish, animal bones and plant waste – the usual suspects. But the key ingredient in terra preta, and what gives it its dark colour, is charcoal.

“It’s wonderful stuff,” says Simon Shackley, a social science lecturer at the University of Edinburgh. “We started to get to know about it when Dutch scientists began to look at it in the 1960s. They found these dark soils in this area of very poor soil, where it was being put on fields like compost. It’s really the product of slash-and-burn agriculture, and other organic waste, incorporated into the soils over hundreds or even thousands of years – and it does appear to be fertile indefinitely, which is really a very odd thing.”

This ancient product of the Amazon is now the subject of intense scrutiny by climate change scientists. The tenacity of the charcoal of terra preta – retaining its fertilising properties over centuries – has given them an idea. Charcoal is a form of carbon, the burnt remains of plant and animal material. If it can stay intact in the earth for so long, without being released as carbon dioxide gas, why not lock up more carbon in the earth in this manner?

Scientists have begun to refer to the charcoal made from plants for the purpose of storing carbon as “biochar”. The theory is that biomass – any plant or animal material – can be turned into charcoal by heating it in the absence of oxygen. By taking CO2 out of the atmosphere in this way, biochar could have a huge impact on climate change. Soils naturally contain large quantities of carbon, from decayed vegetation. But this carbon is relatively unstable, in climate terms – soils give off CO2 when they are disturbed, by ploughing for example, making them as much a carbon source as a carbon sink. So the idea of trying to lock up carbon in soils has found little favour among climate scientists – indeed, it has even gained a bad name, as farmers have sought to cash in by claiming their fields should qualify for the carbon credits intended to provide financial support to projects such as wind farms or solar power plants.

What is different about biochar is that the stability of the charcoal should make it possible to lock away the carbon it contains for hundreds of years. The carbon is mineralised, so it’s very resistant to breaking down. What’s more, the ancillary benefits – not just its soil-improving characteristics, but certain byproducts of its manufacture – should be enough to make it economically attractive.

When it’s made, about a third of the biomass is turned to char, a third is turned to syngas that can be burned to generate electricity, and a third into a crude oil substitute that could be very useful in making plastics, though it would be hard to use as a transport fuel. Tim Flannery, the eminent Australian explorer and naturalist, argues that these properties of biochar “allow us to address three or four critical crises at once: the climate change crisis, the energy crisis, and the food and water crises”, because putting biochar in the soil not only fertilises the soil, but also helps it to retain water.

Just how much could biochar do to change the world’s carbon balance? There is little doubt of the enormous amount required. Every year, human activities – burning fossil fuels, cutting down forests, converting grassland to crops and so on – contribute eight to 10 billion tonnes of carbon to the atmosphere. Most of that carbon does not go on to damage the climate – the world has a natural carbon cycle, by which carbon dioxide in the atmosphere is absorbed and re-emitted by “carbon sinks” of vegetation, soils, the seas and other natural processes. But these processes are being severely overloaded, so the carbon content of the atmosphere is rising. At present, it stands at about 387 parts per million, certainly higher than at any time in the last 650,000 years and probably in the last 20 million. According to the Global Carbon Project, between 2000 and 2007, the land and ocean carbon sinks – such as forests, and plankton in the ocean – removed about 54 per cent, or 4.8 billion tonnes a year, of the carbon that humans pumped into the atmosphere. That leaves a carbon surplus of about 4 billion tonnes per year, which we need to find ways to reduce or absorb. Moreover, the amount absorbed by natural sinks is declining as land and oceans warm, meaning every year we must either work even harder to remove carbon from the air, or stop emitting it.

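The budget arithmetic behind these figures is simple enough to check by hand. A minimal sketch using the article's round numbers (the 8.9 mid-point is my choice for illustration, not a Global Carbon Project figure):

```python
# Rough carbon-budget arithmetic using the article's round numbers.
emissions = 8.9        # billion tonnes of carbon (GtC) added per year, mid-range of 8-10
sink_fraction = 0.54   # share removed by land and ocean sinks (Global Carbon Project)

absorbed = emissions * sink_fraction   # taken up by natural sinks
surplus = emissions - absorbed         # left accumulating in the atmosphere

print(f"absorbed: {absorbed:.1f} GtC/yr")   # ~4.8
print(f"surplus:  {surplus:.1f} GtC/yr")    # ~4.1
```

The surplus line is the roughly 4 billion tonnes a year that any mix of biochar, forests and emissions cuts would have to soak up.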
Even as governments talk of a “low-carbon economy”, global greenhouse gas emissions are rising fast. According to the Intergovernmental Panel on Climate Change, the world authority on climate science, emissions must peak in the period 2015 to 2020 if we are to avoid the most catastrophic effects of climate change. On present projections, that will be impossible – unless a way can be found to make available cheap, easy methods of removing carbon dioxide from the atmosphere, and of generating clean electricity in ways that can be adopted around the world much more quickly than current renewable technologies.

According to some early estimates of biochar’s potential, this wonder substance alone could achieve all the carbon reductions necessary to prevent further global warming. Johannes Lehmann of Cornell University and others calculated that biochar could remove between 5.5 and 9.5 billion tonnes of carbon from the air each year. But those estimates relied on heroic assumptions about the ability to make biochar easily around the world, says Shackley. “There has been a tendency to withdraw from some of the very large figures lately,” he notes. “Now, I would say people are talking more about something in the range of one to two billion tonnes a year.”

This may seem disappointing in comparison with previous grandiose claims, but it still represents an impressive potential contribution from a single method, Shackley says. “It’s certainly not trivial,” agrees Tim Lenton, professor of earth systems science at the University of East Anglia. “It might be a good-sized slice of what we need, and it has sizeable side benefits – it’s win-win.” If other carbon-reducing techniques – such as preserving and regrowing forests, increasing the share of energy from renewables, and the push for energy efficiency – were pursued simultaneously, the world could make the cuts needed in our “carbon budget” to stave off climate disaster.

This potential, and the unique and sometimes mysterious qualities of biochar, are making it one of the most exciting new areas of climate change research. The idea of sequestering carbon through biochar has gained some heavy-hitting scientific backers, such as James Lovelock, the maverick scientist whose Gaia hypothesis has come back into vogue. Scientists at Cornell University, led by Lehmann, are working on ways to sequestrate carbon in biochar-enriched soil.

In the UK, a biochar research centre has been set up at Edinburgh university; other European countries are following suit, and research projects are under way in countries from Canada to Australia. A few companies are in the early stages of trying to find ways to commercialise biochar production. Biochar even has its own song – “The Biochar Blues” – written by members of the International Biochar Initiative.

Charcoal is, of course, nothing new. People have been making it for millennia, chiefly for fuel. The process is simple: take wood, or straw or the waste from crops, and heat it in the absence of oxygen. Traditionally, this was done by heaping earth on top of the lit biomass so that it smouldered for a long time. Modern kilns can make the process more efficient, but the principle remains the same.

There is much about biochar that remains a puzzle, however. Take the soil-fertility effects. What is it about biochar that improves soil so much? “The simple answer is that we don’t know exactly,” says Shackley. “It’s probably a combination of several factors. Charcoal is very porous, so it acts like a sponge in retaining water, and the nutrients dissolved in water, which is something poor soils aren’t very good at. And [its porous nature] also means it provides a good material for the growth of lots of important bacteria.”

Another factor in its favour is that using biochar as a fertiliser can displace artificial nitrogen fertilisers, which give off nitrous oxide, a greenhouse gas 300 times more powerful than carbon dioxide. And biochar is not toxic, adds Lenton – “no one has yet said there is some great hidden danger associated with it”.

But Saran Sohi, a lecturer in soil science, warns that anyone hoping that biochar alone will solve fertility problems is probably deluded – biochar is not enough by itself to make the difference that terra preta does to thin Brazilian soils. “Terra preta soils also contain other nutrients, from the other substances they contain – things like bones, which are rich in phosphorus [essential for healthy plant growth],” he says. The biochar undoubtedly plays a role in holding these nutrients together, ensuring they remain available to plant roots, but the nutrients must be provided by other means. “No one has yet succeeded in recreating terra preta,” Shackley adds.

To produce biochar on an industrial scale, traditional methods of charcoal production would be impractical. Instead, researchers are looking to pyrolysis – a form of controlled thermal decomposition of organic material in the absence of oxygen, at temperatures reaching 500 to 600°C.

Using pyrolysis also allows the capture of the syngas and the tarry liquid byproducts, both of which can be used as fuel to generate electricity or for the heating process.

The amount of biochar produced depends on the speed of the pyrolysis process: fast methods produce 20 per cent biochar and 20 per cent syngas, with 60 per cent bio-oil, while slow methods produce about 50 per cent char and far smaller quantities of oil. “It’s much easier to do slow pyrolysis as well,” notes Adrian Higson of the UK’s National Non-Food Crops Centre. “And cheaper.” As modern pyrolysis plants can be run entirely from the syngas, the output is between three and nine times the energy input required, according to the Institute for Governance and Sustainable Development.
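The fast/slow trade-off amounts to a simple mass balance. A sketch using the article's fractions (the slow-mode split between syngas and oil is not given, so those two numbers are assumptions for illustration):

```python
# Approximate product fractions, by mass, for the two pyrolysis modes
# described in the article. The slow-mode syngas/oil split is assumed;
# real yields vary with feedstock, temperature and residence time.
YIELDS = {
    "fast": {"char": 0.20, "syngas": 0.20, "bio_oil": 0.60},
    "slow": {"char": 0.50, "syngas": 0.35, "bio_oil": 0.15},
}

def products(biomass_tonnes, mode):
    """Split a mass of dry biomass into char, syngas and bio-oil."""
    return {name: biomass_tonnes * frac for name, frac in YIELDS[mode].items()}

# 100 tonnes of crop waste, cooked slowly, yields about 50 tonnes of char.
print(products(100.0, "slow"))
```

A plant chasing carbon sequestration would run slow; one chasing liquid fuel, as in the Carbon Trust's competition discussed below, would run fast and treat the char as a byproduct.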

What to use to make the char? Tearing down forests to turn into charcoal would be insane in climate change terms. But there is plenty of other material. Agriculture produces large amounts of plant and animal waste – straw, husks, dung. Even human waste – sewage sludge, or some forms of household rubbish – could be used.

And using waste products creates a double carbon saving: if left to rot, they produce methane, a greenhouse gas 20 times more powerful than carbon dioxide. But the difficulty is in gathering the waste – and making it economic to do so. Farmers will require some persuasion that the trouble of conserving their waste and cooking it into charcoal makes financial sense, and they may need new machinery to do so. At a municipal waste level, the problem will be sorting the organic waste, which can be turned to char, from the rest of the rubbish – and proving that this is cheaper and more beneficial than merely burying it.

The IGSD suggests a way of marrying small-scale and industrial methods for producing the char that, if refined, could enable the economically viable production of biochar in urban, rural and even poor regions. It suggests three possible systems. The first is a centralised scheme, whereby all waste biomass in a given region would be brought to a central plant for processing; the second is a decentralised system in which each farmer or a small group of farmers would have their own fairly low-tech pyrolysis kiln.

The third system proposes a mobile alternative, in which a vehicle equipped with a pyrolyser, powered using syngas, would visit small farms, returning the biochar to the farmers to use, while collecting the bio-oil to be transported to a refinery and turned into liquid biofuel for vehicles. As an example, the IGSD cites Brazil’s sugar cane industry, in which the tops of the canes, normally burned in the field, and the bagasse – the residue from sugar production – could be turned efficiently into biochar. It estimates that of the 460 megatonne annual sugar cane harvest, as much as 230 megatonnes could be available for pyrolysis. A clutch of companies is now working on these problems, and seeking to commercialise biochar as a medicine for both climate and soil, and as an energy source.
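Combining the IGSD's sugar-cane figures with the 50 per cent slow-pyrolysis char yield quoted earlier gives a back-of-envelope sense of scale (my combination of the article's numbers, not a calculation the IGSD publishes):

```python
harvest = 460.0           # megatonnes of sugar cane harvested per year (IGSD)
available = harvest / 2   # tops and bagasse available for pyrolysis: ~230 Mt
char_yield = 0.5          # slow-pyrolysis char fraction quoted earlier

char = available * char_yield
print(f"~{char:.0f} Mt of char per year")  # ~115 Mt
```

That is from one crop in one country, which is why the billion-tonne-a-year global estimates, while heroic, are not absurd.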

Mike Mason, founder of the carbon-offsetting company Climate Care (since bought by JP Morgan), ruefully notes that he had planned by now to be spending most of his time charging round Africa looking at elephants (he was born in the UK, but raised in east Africa). Instead, he decided that climate change was too great a problem to leave alone, and with his new company, Biojoule, he has been investigating ways to turn biochar into a viable business. In Ontario, Canada, Dynamotive is making biochar and up to 130 tonnes a day of bio-oil at a wood-products mill. Crucible Carbon, based in Australia, predicts that its technology will allow carbon sequestration from biochar at a cost of about A$20 (£9) a tonne.

Yet even without the logistical problems, others are less sure of the absolute benefits of the product. Robert Trezona, head of research and development at the Carbon Trust, a UK government-funded body that helps businesses cut their greenhouse gas emissions, worries that seeing biochar as the main output from cooking biomass might be to miss the point. The Carbon Trust is running a competition to develop pyrolysis plants, but with the aim of manufacturing liquid transport fuels from biomass, using fast pyrolysis techniques, for which biochar is merely a byproduct of questionable usefulness.

“Producing liquid biofuels for transport is going to be very important in cutting emissions. We don’t know the same about biochar,” he says. In fact, encouraging small farmers to produce biochar by traditional, low-tech methods may actually result in more greenhouse gas emissions than simply burning the plants for fuel or discarding them, he says.

“This is very much unproven,” he objects. “You want to be able to show that it stays in the soil for hundreds of years, and to prove that is difficult.”

The Carbon Trust is not allowing companies applying to it for funding to count the biochar byproduct of pyrolysis as part of the carbon savings they produce. “We are a long way from having enough technical evidence to create a proper case for biochar,” says Trezona. “Even the soil-improvement benefit is a new unexpected finding.”

Flannery disagrees. “At least half of the carbon in charcoal is still sequestered 500 years later. This has been known for a long time, from radio carbon dating from charcoal by paleontologists,” he says.

Even if biochar does not fulfil all of the potential claimed for it, it could still make an important contribution. Al Gore, the former US vice-president and environmental campaigner, likes to point out that the search for a “silver bullet” to solve the problem of climate change has been a distraction. Instead, he argues, though there may be no silver bullet, “there is silver buckshot”.

Only by bringing many different methods of cutting emissions or absorbing carbon to bear can we reduce atmospheric levels of carbon to within the limits of safety. And of those possible methods, few are as simple and cheap as biochar. Johannes Lehmann of Cornell makes the point that “biochar sequestration does not require a fundamental scientific advance and the underlying production technology is robust and simple, making it appropriate for many regions of the world”.

But no one should doubt that rolling out this technology will be a mammoth task. The problem is twofold: developing a model for biochar production that reliably reduces greenhouse gases but is easily replicable in small farms in poor countries; and in the developed world, changing the business model of large farms so that collecting and cooking their waste is a better option than not. The huge US agribusinesses may be easy to reach, and good candidates to start using their waste for char, but they are likely to need financial incentives before they begin to see the point. The poor farmers of the developing world might be glad of the husbandry advice and techniques that would help them revitalise their own soils with biochar, but how to reach them all? That may prove impossible.

These problems of economics and communication will be the real hurdles at which biochar may fall, just as they have been the reasons why we have failed to capitalise on other ways of cutting carbon, from the very simple – small alterations to wood-fired cooking stoves in Africa and India can reduce the indoor air pollution from cooking fires that kills millions, yet hardly any homes have them – to the complex challenges, such as adopting renewable energy. A massive effort will be required to overcome the inertia that has been the downfall of other great climate ideas.

Yellow Is the New Green

SUBHEAD: Less energy-intensive sanitation may be starting to make sense.

By Rose George on 27 February 2009 in The New York Times
Image above: The urinals at the 150-year-old McSorley's Ale House on East 7th Street in NYC.

In the far reaches of Shaanxi Province in northern China, in an apple-producing village named Ganquanfang, I recently visited a house belonging to two cheery primary-school teachers, Zhang Min Shu and his wife, Wu Zhaoxian. Their house wasn’t exceptional — a spacious yard, several rooms — except for the bathroom. There, up a few steps on a tiled platform, sat a toilet unlike any I’d seen. Its pan was divided in two: solid waste went in the back, and the front compartment collected urine. The liquids and solids can, after a decent period of storage and composting, be applied to the fields as pathogen-free, expense-free fertilizer. From being unsure of wanting a toilet near the house in the first place — which is why the bathroom is at the far end of their courtyard — the couple had become so delighted with it that they regretted not putting it next to the kitchen after all.

What does this have to do with you? Mr. Zhang and Ms. Wu’s weird toilet — known as a “urine diversion,” or NoMix (after a Swedish brand), toilet — may have things to teach us all. In the industrialized world, most of us (except those who have septic tanks) rely on wastewater-treatment plants to remove our excrement from the drinking-water supply, in great volumes. (Toilets can use up to 30 percent of a household’s water supply.) This paradigm is rarely questioned, and I understand why: flush toilets, sewers and wastewater-treatment plants do a fine job of separating us from our potentially toxic waste, and eliminating cholera and other waterborne diseases. Without them, cities wouldn’t work.

But the paradigm is flawed. For a start, cleaning sewage guzzles energy. Sewage treatment in Britain uses a quarter of the energy generated by the country’s largest coal-fired power station. Then there is the nutrient problem: human excrement is rich in nitrogen, phosphorus and potassium, which is why it was used as a fertilizer for millenniums, and until surprisingly recently. (A 19th-century “sewage farm” in Pasadena, Calif., was renowned for its tasty walnuts.) But when sewage is dumped in the seas in great quantity, these nutrients can unbalance and sometimes suffocate life, contributing to dead zones (405 worldwide and counting, according to a recent study). Sewage, according to the United Nations Environment Program, is the biggest marine pollutant there is. Wastewater-treatment plants work to extract the nutrients before discharging sewage into water courses, but they can’t remove them all.

And there’s also the urine problem. Urine, like any liquid, is a headache for wastewater managers, because most sewer systems take water from street drains along with the toilet, shower and kitchen kind. Population growth is already taxing sewers. (London’s great network was built in the late 19th century with 25 percent extra capacity, but a system designed for three million people must now serve more than twice as many.) When a rainstorm suddenly sends millions of gallons of water into an already overloaded system, the extra must be stored or — if storage is lacking — discharged, untreated, into the nearest river or harbor. Each week, New York City sends about 800 Olympic-size swimming pools’ worth of sewage-polluted water into nearby waters because there’s nowhere else for it to go. This probably won’t kill us, but it’s not ideal. Environmental scientists in California have calculated that sewage discharged near 28 Southern California beaches has contributed to up to 1.5 million excess gastrointestinal illnesses, costing as much as $51 million in health care. We can do better. Urine might be one way forward.

Before engineers scoff into their breakfast, consider that at least 135,000 urine-diversion toilets are in use in Sweden, and that a Swiss aquatic institute did a six-year study of urine separation that found in its favor. In Sweden, some of the collected urine — which contains 80 percent of the nutrients in excrement — is given to farmers, with little objection. “If they can use urine and it’s cheap, they’ll use it,” said Petter Jenssen, a professor at the Agricultural University of Norway. The price of phosphorus fertilizers rose 50 percent in the past year in some parts of the world, as phosphate reserves, the largest of which are in Morocco and China, dwindle. (The gloomiest predictions suggest they’ll be gone in 100 years.) Although half of sewage sludge in the United States is already turned into cheap fertilizer known as “biosolids,” urine contains hardly any of the pathogens or heavy metals that critics of biosolids claim remain in mixed sewage, despite treatment.

The rest of Sweden’s collected urine goes to municipal wastewater plants, but in much smaller volume, so it’s easier to deal with. Research by Jac Wilsenach, now a civil engineer in South Africa, found that removing even half of the nutrient-rich urine enables the bacteria in the aeration tanks to munch all the nitrogen and phosphate matter in solid waste in a single day rather than the usual 30. Urine diversion also makes for richer sludge and produces more methane, which can be turned into gas or electricity, Mr. Wilsenach said. In short, separating urine turns a guzzler of energy into a net producer.

Putting urine to use is not new. A friend’s grandmother remembers the man coming round for the buckets 60 years ago in Yorkshire, which were then sold to the tanning industry. The flush toilet ended that, and no one — my friend’s nan included — wants outside privies again. “Any innovation in the toilet that increases owner responsibility is probably seen as downwardly mobile,” said Carol Steinfeld, of New Bedford, Mass., who imports NoMix toilets into the United States. Then there’s the sitting problem: in most urine-diversion toilets, a man must empty his bladder sitting down. This wouldn’t be a problem in some countries — Germany recently introduced a toilet-seat alarm that admonishes standers to sit — but it has been in others. Professor Jenssen was flummoxed by one participant at a training workshop in Cuba who said firmly, “If a man sits, he is homosexual.”

For now, “ecological sanitation” — or more sustainable sewage disposal — thrives mostly in fast-industrializing countries like China and India, which have money to invest in alternatives but few sewers. A subculture of composting toilets exists in the United States, but only a few hundred urine-diversion toilets have been imported, Ms. Steinfeld said. Necessity — whether occasioned by fertilizer prices, carbon footprints or crippling capital investments — could bring change. At a recent wastewater conference, I watched in astonishment as dour engineers rushed to question a speaker who had been talking about stabilization ponds, which clean sewage using water, flow control, bacteria and light. Normally, such things would be cast into the box of hippie-ish ecological sanitation. But to managers struggling with energy quotas and budget limitations, more sustainable, less energy-intensive sanitation may be starting to make sense.

As Mr. Zhang told me with a smile: “For me, whatever the toilet is, I use it. For example, here we eat wheat. When we go to the south of China, we eat rice. Otherwise we starve.”

It’s been more than 100 years since Teddy Roosevelt wondered aloud whether “civilized people ought to know how to dispose of the sewage in some other way than putting it into the drinking water.” The Zhang family toilet is not the perfect answer to Roosevelt, as it still uses some water, though 80 percent less than a regular flush toilet uses. But at least it’s the result of someone asking the right questions.

• Rose George is the author of “The Big Necessity: The Unmentionable World of Human Waste and Why It Matters.”

Monsanto Aftermath

SUBHEAD: Dialog on Monsanto plans to pull current GMO development from Kauai.
By Jeri DiPietro on 26 February 2009
Image above: Monsanto headquarters in Hanapepe along the river next to the 1911 bridge. Note the Monsanto shuttle bus parked, blocking the public-access boat ramp. Photo by Juan Wilson 11/7/07
[Note from Jeri: Diane Leone at The Honolulu Advertiser, Kauai Bureau and environment reporting wrote me the following.]
You've probably heard the news: Monsanto is removing its 30-person workforce from Kauai by May. This "technology" group worked on early-phase development of seed corn, about 70 percent of it GMO. Those jobs are moving to other Monsanto Hawaii operations on Maui, Molokai and/or Oahu. The people have the option to transfer or take severance.

Meanwhile, Monsanto is considering having another of its groups move to Kauai, but hasn't made the final decision. If they come, they will be planting larger quantities of seed corn at a later stage of the development process. This could mean more acreage and fewer people. They might also decide to leave Kauai entirely.

I'd like to put this move by Monsanto into context with the seed-company business on Kauai and in the state. So your observations as part of GMO-Free Hawaii are welcome.

Aloha, Diane
[Note from Jeri: My response follows.]
Aloha Diane,
Thank you so much for the opportunity to comment on the latest news of Monsanto leaving Kaua`i and moving their operations to Oahu, Molokai and Maui. While we feel great regret for these sister islands and our ohana, we are very thankful for this turn of events.

While Monsanto's time on Kaua`i has been relatively short, and limited to areas in Hanapepe, Poipu and Puhi (to the best of our knowledge), we are extremely concerned about the pollution left behind. Very few studies have been done on how to mitigate contamination of soil and bacteria where genetically engineered crops have been grown. We are assembling a group of concerned local farming advocates to develop a protocol for how these lands could be cleaned and cleared of genetically modified organisms and the heavy applications of multiple herbicides and fossil-fuel-based fertilizers. We are discussing the possibility of seeking leases for local growers for the production of island-grown food. It actually provides an opportunity for conversations with the landowners about local food production.

The impact of GMOs on soil organisms is not commonly studied, especially for Bt crops, which produce pesticides. Soil fertility, and the organisms which maintain it, are a vital aspect of the environment, especially in the context of food and agricultural production. Kaua`i farmers and KCC students are finding many new ways to restore micro-organisms to the soil. With healthy soil, crops do better and need fewer chemicals. One such method practiced by growers here and across the world is the use of cover crops, or green manures. Before industrialized agriculture and oil-based fertilizers, crop rotation and letting fields go fallow were necessary steps to keep soil healthy and nutrient-rich. This is a vital step in growing nutrient-rich food. Studies have shown that much of the grocery-store produce does not carry the rich array of vitamins that it once did. This is why many people take vitamin supplements; healthy food starts with good soil.

Our hope is that we can create new jobs on Kaua`i growing seed for cover crops and organic seed production. These jobs would be much less toxic for the field workers and would ensure that traditional crops continue to provide seed to the farmers who grow our food.

While on Kaua`i, the Monsanto fields in Puhi had guards on site 24/7, yet our county government had no disclosure as to the true nature of these experiments or what they were hiding. During the heavy rains at the end of last year, neighborhoods in Hanapepe were evacuated when a cargo container at the Monsanto test site was lifted into the water of the swollen river. The container floated down the river to the ocean, where its contents of agricultural chemicals dumped into the sea. While this was on the front page of the Garden Island newspaper, no one was ever held responsible for the pollution that went into our reef and ecosystems.

Economically speaking, these business models of sustainability and food sovereignty are a solid base on which to grow our tourist industry and ensure Hawai`i's reputation as a place to experience a clean environment, delicious locally grown food and the aloha of healthy residents. We should not be the epicenter for these controversial open-air field tests. Currently these biotech companies are receiving huge state and federal subsidies and tax incentives to come here. Most of the seed-company profits are realized out of state; we experience little benefit and receive much environmental degradation that will be left for us to mitigate.

It is irresponsible for us to ignore the need for food security. The patenting of GMO seed and corporate control of our food supply is the opposite of what Hawai`i needs. Our isolation and year-round growing climate are the assets that should allow us to feed ourselves and export to others. Feeding our people depends on clean seed, revitalized soil, healthy bees, access to water and a living wage for farmers. Let's subsidize the farmers, not the commodity crops, by putting something down on our future and the culture of the Hawaiian people.

Contact info: Jeri at (808) 651-1332 or MiKey at GMO Free Kauai at (808) 651-9603
See also:

Surviving the coming century

SUBHEAD: The juggernaut of global warming will be unstoppable.
By Gaia Vince on 25 February 2009 in New Scientist.
Image above: "Urban Sunset" by *Azmys.
Alligators basking off the English coast; a vast Brazilian desert; the mythical lost cities of Saigon, New Orleans, Venice and Mumbai; and 90 per cent of humanity vanished. Welcome to the world warmed by 4 °C.
Clearly this is a vision of the future that no one wants, but it might happen. Fearing that the best efforts to curb greenhouse gas emissions may fail, or that planetary climate feedback mechanisms will accelerate warming, some scientists and economists are considering not only what this world of the future might be like, but how it could sustain a growing human population. They argue that surviving in the kinds of numbers that exist today, or even more, will be possible, but only if we use our uniquely human ingenuity to cooperate as a species to radically reorganise our world.
The good news is that the survival of humankind itself is not at stake: the species could continue if only a couple of hundred individuals remained. But maintaining the current global population of nearly 7 billion, or more, is going to require serious planning.
Four degrees may not sound like much - after all, it is less than a typical temperature change between night and day. It might sound quite pleasant, like moving to Florida from Boston, say, or retiring from the UK to southern Spain. An average warming of the entire globe by 4 °C is a very different matter, however, and would render the planet unrecognisable from anything humans have ever experienced. Indeed, human activity has had and will have such a great impact that some have proposed describing the time from the 18th century onward as a new geological era, marked by human activity. "It can be considered the Anthropocene," says Nobel prizewinning atmospheric chemist Paul Crutzen of the Max Planck Institute for Chemistry in Mainz, Germany.
A 4 °C rise could easily occur. The 2007 report of the Intergovernmental Panel on Climate Change, whose conclusions are generally accepted as conservative, predicted a rise of anywhere between 2 °C and 6.4 °C this century. And in August 2008, Bob Watson, former chair of the IPCC, warned that the world should work on mitigation and adaptation strategies to "prepare for 4 °C of warming".
A key factor in how well we deal with a warmer world is how much time we have to adapt. When, and if, we get this hot depends not only on how much greenhouse gas we pump into the atmosphere and how quickly, but on how sensitive the world's climate is to these gases. It also depends on whether "tipping points" are reached, in which climate feedback mechanisms rapidly speed warming. According to models, we could cook the planet by 4 °C by 2100. Some scientists fear that we may get there as soon as 2050.
If this happens, the ramifications for life on Earth are so terrifying that many scientists contacted for this article preferred not to contemplate them, saying only that we should concentrate on reducing emissions to a level where such a rise is known only in nightmares.
"Climatologists tend to fall into two camps: there are the cautious ones who say we need to cut emissions and won't even think about high global temperatures; and there are the ones who tell us to run for the hills because we're all doomed," says Peter Cox, who studies the dynamics of climate systems at the University of Exeter, UK. "I prefer a middle ground. We have to accept that changes are inevitable and start to adapt now."
Bearing in mind that a generation alive today might experience the scary side of these climate predictions, let us head bravely into this hotter world and consider whether and how we could survive it with most of our population intact. What might this future hold?
The last time the world experienced temperature rises of this magnitude was 55 million years ago, during the so-called Palaeocene-Eocene Thermal Maximum event. Then, the culprits were clathrates - large areas of frozen, chemically caged methane - which were released from the deep ocean in explosive belches that filled the atmosphere with around 5 gigatonnes of carbon. The already warm planet rocketed by 5 or 6 °C, tropical forests sprang up in ice-free polar regions, and the oceans turned so acidic from dissolved carbon dioxide that there was a vast die-off of sea life. Sea levels rose to 100 metres higher than today's and desert stretched from southern Africa into Europe.
While the exact changes would depend on how quickly the temperature rose and how much polar ice melted, we can expect similar scenarios to unfold this time around. The first problem would be that many of the places where people live and grow food would no longer be suitable for either. Rising sea levels - from thermal expansion of the oceans, melting glaciers and storm surges - would drown today's coastal regions in up to 2 metres of water initially, and possibly much more if the Greenland ice sheet and parts of Antarctica were to melt. "It's hard to see west Antarctica's ice sheets surviving the century, meaning a sea-level rise of at least 1 or 2 metres," says climatologist James Hansen, who heads NASA's Goddard Institute for Space Studies in New York. "CO2 concentrations of 550 parts per million [compared with about 385 ppm now] would be disastrous," he adds, "certainly leading to an ice-free planet, with sea level about 80 metres higher... and the trip getting there would be horrendous."
Half of the world's surface lies in the tropics, between 30° north and 30° south latitude, and these areas are particularly vulnerable to climate change. India, Bangladesh and Pakistan, for example, will feel the force of a shorter but fiercer Asian monsoon, which will probably cause even more devastating floods than the area suffers now. Yet because the land will be hotter, this water will evaporate faster, leaving drought across Asia. Bangladesh stands to lose a third of its land area - including its main breadbasket.
The African monsoon, although less well understood, is expected to become more intense, possibly leading to a greening of the semi-arid Sahel region, which stretches across the continent south of the Sahara desert. Other models, however, predict a worsening of drought all over Africa. A lack of fresh water will be felt elsewhere in the world, too, with warmer temperatures reducing soil moisture across China, the south-west US, Central America, most of South America and Australia. All of the world's major deserts are predicted to expand, with the Sahara reaching right into central Europe.
Glacial retreat will dry Europe's rivers from the Danube to the Rhine, with similar effects in mountainous regions including the Peruvian Andes, and the Himalayan and Karakoram ranges, which as a result will no longer supply water to Afghanistan, Pakistan, China, Bhutan, India and Vietnam.
Along with the exhaustion of aquifers, all this will lead to two latitudinal dry belts where human habitation will be impossible, say Syukuro Manabe of Tokyo University, Japan, and his colleagues. One will stretch across Central America, southern Europe and north Africa, south Asia and Japan; while the other will cover Madagascar, southern Africa, the Pacific Islands, and most of Australia and Chile (Climatic Change, vol 64, p 59).
The high life

The only places we will be guaranteed enough water will be in the high latitudes. "Everything in that region will be growing like mad. That's where all the life will be," says former NASA scientist James Lovelock, who developed the "Gaia" theory, which describes the Earth as a self-regulating entity. "The rest of the world will be largely desert with a few oases."
So if only a fraction of the planet will be habitable, how will our vast population survive? Some, like Lovelock, are less than optimistic. "Humans are in a pretty difficult position and I don't think they are clever enough to handle what's ahead. I think they'll survive as a species all right, but the cull during this century is going to be huge," he says. "The number remaining at the end of the century will probably be a billion or less."
John Schellnhuber of the Potsdam Institute for Climate Impacts Research in Germany is more hopeful. The 4 °C warmer world would be a huge challenge, he says, but one we could rise to. "Would we be able to live within our resources, in this world? I think it could work with a new division of land and production."
In order to survive, humans may need to do something radical: rethink our society not along geopolitical lines but in terms of resource distribution. "We are locked into a mindset that each country has to be self-sustaining in food, water and energy," Cox says. "We need to look at the world afresh and see it in terms of where the resources are, and then plan the population, food and energy production around that. If aliens came to Earth they'd think it was crazy that some of the driest parts of the world, such as Pakistan and Egypt, grow some of the thirstiest crops for export, like rice."
Taking politics out of the equation may seem unrealistic: conflict over resources will likely increase significantly as the climate changes, and political leaders are not going to give up their power just like that. Nevertheless, overcoming political hurdles may be our only chance. "It's too late for us," says President Anote Tong of Kiribati, a submerging island state in Micronesia, which has a programme of gradual migration to Australia and New Zealand. "We need to do something drastic to remove national boundaries."
Cox agrees: "If it turns out that the only thing preventing our survival was national barriers then we would need to address this - our survival is too important," he says.
Imagine, for the purposes of this thought experiment, that we have 9 billion people to save - 2 billion more than live on the planet today. A wholesale relocation of the world's population according to the geography of resources means abandoning huge tracts of the globe and moving people to where the water is. Most climate models agree that the far north and south of the planet will see an increase in precipitation. In the northern hemisphere this includes Canada, Siberia, Scandinavia and newly ice-free parts of Greenland; in the southern hemisphere, Patagonia, Tasmania and the far north of Australia, New Zealand and perhaps newly ice-free parts of the western Antarctic coast.
If we allow 20 square metres of space per person - more than double the minimum habitable space allowed per person under English planning regulations - 9 billion people would need 180,000 square kilometres of land to live on. The area of Canada alone is 9.1 million square kilometres and, combined with all the other high-latitude areas, such as Alaska, Britain, Russia and Scandinavia, there should be plenty of room for everyone, even with the effects of sea-level rise.
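The arithmetic in that thought experiment is easy to check. A minimal sketch in Python, using the figures given above (20 square metres per person, 9 billion people, Canada at 9.1 million square kilometres):

```python
# Rough check of the habitable-area arithmetic, using the article's figures.
SPACE_PER_PERSON_M2 = 20            # living space allowed per person
POPULATION = 9_000_000_000          # the 9 billion of the thought experiment
M2_PER_KM2 = 1_000_000              # square metres in a square kilometre

land_needed_km2 = SPACE_PER_PERSON_M2 * POPULATION / M2_PER_KM2
canada_area_km2 = 9_100_000         # area of Canada, per the article

print(land_needed_km2)                          # 180000.0 km^2
print(land_needed_km2 / canada_area_km2 * 100)  # about 2% of Canada's area
```

Housing alone, in other words, would take up only a small fraction of the high-latitude land; the binding constraint would be the food-growing area around the cities.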
These precious lands with access to water would be valuable food-growing areas, as well as the last oases for many species, so people would need to be housed in compact, high-rise cities. Living this closely together will bring problems of its own. Disease could easily spread through the crowded population, so early warning systems will be needed to monitor any outbreaks.
It may also get very hot. Cities can produce 2 °C of additional localised warming because of energy use and things like poor reflectivity of buildings and lower rates of evaporation from concrete surfaces, says Mark McCarthy, an urban climate modeller at the UK Met Office's Hadley Centre. "The roofs could be painted a light, reflective colour and planted with vegetation," McCarthy suggests.
Since water will be scarce, food production will need to be far more efficient. Hot growing seasons will be more common, meaning that livestock will become increasingly stressed, and crop growing seasons will shorten, according to David Battisti of the University of Washington in Seattle and his colleagues (Science, vol 323, p 240). We will need heat and drought-tolerant crop varieties, they suggest. Rice may have to give way to less thirsty staples such as potatoes.
Vegetarian dystopia

This will probably be a mostly vegetarian world: the warming, acidic seas will be largely devoid of fish, thanks to a crash in plankton that use calcium carbonate to build shells. Molluscs, also unable to grow their carbonate shells, will become extinct. Poultry may be viable on the edges of farmland but there will simply be no room to graze cattle. Livestock may be restricted to hardy animals such as goats, which can survive on desert scrub. One consequence of the lack of cattle will be a need for alternative fertilisers - processed human waste is a possibility. Synthetic meats and other foods could meet some of the demand. Cultivation of algal mats, and crops grown on floating platforms and in marshland could also contribute.
Supplying energy to our cities will also require some adventurous thinking. Much of it could be covered by a giant solar belt, a vast array of solar collectors that would run across north Africa, the Middle East and the southern US. Last December, David Wheeler and Kevin Ummel of the Center for Global Development in Washington DC calculated that a 110,000-square-kilometre area of solar panels across Jordan, Libya and Morocco would be "sufficient to meet 50 to 70 per cent of worldwide electricity production, or about three times [today's] power consumption in Europe". High-voltage direct current transmission lines could relay this power to the cities, or the solar energy could be used to split water, with the resulting hydrogen stored, transported and converted back into electricity in fuel cells.
If the comparatively modest level of solar installation that Wheeler and Ummel propose were to begin in 2010, the total power delivery by 2020 could be 55 terawatt hours per year - enough to meet the household electricity demand of 35 million people. This is clearly not enough to provide power for our future 9 billion, but improving efficiency would reduce energy consumption. And a global solar belt would be far larger than the one Wheeler and Ummel visualise.
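Wheeler and Ummel's delivery estimate can be put on a per-person footing with a quick sketch; the per-capita figure it implies is my own back-of-envelope reading, not a number from the article:

```python
# 55 TWh per year spread across 35 million people, per the estimate above.
WH_PER_TWH = 1e12                   # watt-hours in a terawatt-hour
delivery_wh_per_year = 55 * WH_PER_TWH
people_served = 35_000_000

kwh_per_person_year = delivery_wh_per_year / people_served / 1000
print(round(kwh_per_person_year))   # about 1571 kWh per person per year
```

Roughly 1,600 kWh of household electricity per person per year is in the range of present-day European usage, which suggests the 35-million-person figure is internally consistent.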
Nuclear, wind and hydropower could supplement output, with additional power from geothermal and offshore wind sources. Each high-rise community housing block could also have its own combined heat and power generator, running on sustainable sources, to supply most household energy.
If we use land, energy, food and water efficiently, our population has a chance of surviving - provided we have the time and willingness to adapt. "I'm optimistic that we can reduce catastrophic loss of life and reduce the most severe impacts," says Peter Falloon, a climate impacts specialist at the Hadley Centre. "I think there's enough knowledge now, and if it's used sensibly we could adapt to the climate change that we're already committed to for the next 30 or 40 years."
This really would be survival, though, in a world in which few would choose to live. Large chunks of Earth's biodiversity would vanish because species won't be able to adapt quickly enough to higher temperatures, lack of water, loss of ecosystems, or because starving humans had eaten them. "You can forget lions and tigers: if it moves we'll have eaten it," says Lovelock. "People will be desperate."
Still, if we should find ourselves in such a state you can bet we'd be working our hardest to get that green and pleasant world back, and to prevent matters getting even worse. This would involve trying to limit the effects of climate feedback mechanisms and restoring natural carbon sequestration by reinstating tropical forest. "Our survival would very much depend on how well we were able to draw down CO2 to 280 parts per million," Schellnhuber says. Many scientists think replanting the forests would be impossible above a certain temperature, but it may be possible to reforest areas known as "land-atmosphere hotspots", where even small numbers of trees can change the local climate enough to increase rainfall and allow forests to grow.
Ascension Island, a remote outpost buffeted by trade winds in the mid-Atlantic, may be a blueprint for this type of bioengineering. Until people arrived in the 17th century, vegetation was limited to just 25 scrubby species. But plantings by British servicemen posted there produced a verdant cloud forest. "It shows that if you have rainfall, forest can grow within a century," says ecologist David Wilkinson of Liverpool John Moores University in the UK, who studied the phenomenon.
Even so, the most terrifying prospect of a world warmed by 4 °C is that it may be impossible to return to anything resembling today's varied and abundant Earth. Worse still, most models agree that once there is a 4 °C rise, the juggernaut of warming will be unstoppable, and humanity's fate more uncertain than ever.
"I would like to be optimistic that we'll survive, but I've got no good reason to be," says Crutzen. "In order to be safe, we would have to reduce our carbon emissions by 70 per cent by 2015. We are currently putting in 3 per cent more each year."

What the Doctor Ordered

SUBHEAD: The Onion News Network knows what's going on out there.

[IB Publisher's note: As of 10/21/14 we noticed these two videos have been buggy. The embed code must be a bit off. I suggest you select the fullscreen option at the bottom right corner of each video before trying the play button.]

By Juan Wilson on 26 February 2009 for Island Breath.

This material was gleaned from ONN News. These people have been churning out great satire of newspaper and television voodoo news for decades. They know what the media is doing to us and have a plan. Enjoy!

Video above: "FDA approves depressant drug for the annoyingly cheerful"
From (,14310/)

Unless you are ready for a really dark (and funny) view of the state of things, don't watch the next video or you may need an anti-depressant to get through tomorrow.

Video above: "Are violent video games preparing children for the Apocalypse?"
From (,14314/).

See also:
Island Breath: Goodbye Blockbuster 6/26/08
Island Breath: World of World of Warcraft 6/19/08
Island Breath: War for the Whitehouse 9/28/08


The Investment Delusion

SUBHEAD: A wise investor understands the confusion between money and wealth.
By John Michael Greer on 25 February 2009 in The Archdruid Report
Image above: A still from the 1951 movie "Quo Vadis" (Latin for "Where are you going?").
You might think that my position as head of a contemporary Druid order, with the colorful title and funny hat that go along with it, would keep me safely out of touch with the mainstream of American opinion. Still, it's been my experience that when I talk about peak oil to a pagan audience, I get the same reactions and questions I can expect from the most mainstream listeners.
I had a reminder of that the weekend before last, when I spoke on the future of industrial society at Pantheacon, one of the largest pagan conventions in America these days. Yes, pagans have conventions; this one happens annually on President's Day weekend at the Doubletree Inn in San Jose, California; it's an endless source of amusement, at least to me, that conference rooms more often used for corporate sales meetings spend one weekend a year hosting something so different.
Pantheacon is always a learning experience. (Mind you, one lesson I learned this year was that it's wise to avoid the Doubletree's pet steak house, Spencer's, unless you fancy undistinguished food and glacially slow service at a jaw-dropping price.) Still, I also gained a useful reminder of the way that certain misguided ideas pervade every corner of contemporary society, and it came – as such insights usually do – during the question and answer session that followed my talk on peak oil and the coming deindustrial age.
Finally, though, there's always somebody who wants to know what investment strategies I recommend. These days, the person who asks that question is usually silver-haired, nicely dressed, and visibly worried. I wish I had a crisp reply for that question, or for that matter, some good advice to offer. I don't, because the question itself embodies a series of fatally flawed assumptions that reach right down to the nature of wealth itself. On its own terms, it's as unanswerable as a question about how to build a working perpetual motion machine.
Yes, someone at my Pantheacon talk asked about investment strategies, and yes, she was silver-haired, nicely dressed, and visibly worried. I fumbled through an answer, but the question deserves more than that, if only because it's on so many minds these days. Thus this week's post. I should caution those of my readers who have investments that they won't like what follows.
Let's start with fundamentals: the nature of wealth. Ask ten people on the street today for a definition of wealth, and dollars will get you doughnuts every one of them will tell you that wealth consists of the possession of plenty of money. That's what nearly everyone thinks, but they're quite wrong, and it's easy enough to show the fallacy.
Imagine that a private jet full of politicians makes an emergency landing on an uninhabited island in the Pacific. Each of the politicians is carrying a briefcase containing $1 million – we'll be polite and say it's from campaign contributions. The island has a water supply and enough natural foodstuffs that the politicians don't have to worry about starving to death. Will the politicians on the island have a standard of living corresponding to their net worth of $1 million each? Of course not; their actual prosperity will be measured by the breadfruit they harvest, the fish they catch, the huts they make, and so on.
Money, in other words, is not wealth. It's a social mechanism for distributing wealth. It means nothing unless there's real wealth – actual, nonfinancial goods and services – to back it up. In a healthy market economy, there's a rough balance between the amount of money in circulation and the amount of real wealth produced annually, and so the confusion between money and wealth can slip by unnoticed. When money and wealth get out of sync with one another, problems sprout.
The economic history of the 19th century offers a good example. The rising industrial economy of the time drove a massive increase in the production of real wealth. Most industrial nations, though, inherited money systems backed by gold reserves that offered few options for expanding the money supply to match the supply of real wealth. The result was a deflationary spiral that brought major economic depressions every couple of decades for most of the century. In response, in the 20th century, nation after nation abandoned the gold standard's straitjacket and retooled their money systems to meet the needs of an expanding economy.
That's the context of the present crisis because, in terms of real wealth, we no longer have an expanding economy. The production of real wealth in the world's industrial nations has been in decline now for decades. Some of the deficit has been made up by importing real wealth from overseas, but not all; compare the lifestyle available to a single salary working class American family in 1969 to the lifestyle available to a similar family today and it's possible to get a glimpse of just how much impoverishment has taken place over the last forty years.
This impoverishment went unnoticed by most people because the money supply didn't follow suit. Until the economy came unglued in the second half of 2008, money had never been so abundant or readily available. Some of it got spent on real wealth, which is why real estate and other commodities soared to giddy heights, but most of it was diverted instead into various forms of abstract pseudo-wealth related to money in much the way that money relates to real wealth. Yes, I'm talking about your investments.
The confusion between money and wealth and the biases imposed by the long economic expansion of industrialism have made it almost impossible to talk sensibly about investments these days. It seems normal to most people that they should be able to invest their money and, as a matter of course, get back more than they put in. This reflects the dynamics of an expanding economy; if the production of real wealth is increasing, investments on average will increase in value over time to match the growth in real wealth, and the payback on investments reflects this. Outside of the special conditions of a growth economy, though, that logic no longer applies.
The long economic expansion of the industrial age has fostered the massive growth of what old-fashioned Marxists used to call a rentier class – a class whose money makes money for them. Even among people who work for a living, the idea of joining the rentier class on retirement, and living comfortably off investments, has become very popular in recent years. The problem, of course, is that the age of industrial expansion is over; it was made possible in the first place only by exponentially increasing the use of fossil fuels and other natural resources; like all exponential growth curves, it faced an inevitable collision with the limits of its environment – and that collision is happening around us right now.
We are thus entering a period of prolonged economic contraction – not a recession, or even a depression, but a change in the fundamental dynamic of the economy. Over the centuries just past, a rising tide of economic growth was interrupted by occasional periods of contraction; over the centuries ahead, the long decline of the industrial economy will doubtless be interrupted by occasional periods of relative prosperity. Just as a rising tide lifts all boats, a falling tide lowers them all, and if the tide goes out far enough, a great many boats will end up high and dry.
The desperate attempt by full-time and part-time members of the rentier class to avoid dealing with this unwelcome reality has had the ironic result of making the situation much worse than it had to be. As actual investments in productive economic activities stopped yielding a noticeable profit, more and more investors sought to make money via a menagerie of exotic financial livestock notable for their complete disconnection from the economy of goods and services. The result was a series of classic speculative bubbles, culminating in the crash of 2008 and the crisis still unfolding around us. In the process, eager investors who might have lost their money slowly over a period of years have, instead, lost it all at once.
Still, in a contracting economy, on average, all investments lose money. This is the hard reality with which all of us will have to deal. This is why, in the twilight years of the Roman world, a complex money economy that made heavy use of credit and investment gave way to purely local economies of barter and customary exchange, in which money played a very minor role and credit was unheard of. It is also why the two great religious movements that rose out of Rome's ruins, Christianity and Islam, both considered lending at interest a mortal sin – though Christianity managed to talk itself out of that useful teaching some centuries ago.
Thus the only investment advice I can offer is to get out of investments altogether, and put your money into something that will actually be useful: training in practical skills that will make you employable in a deindustrializing economy, for example, or extra insulation so you can keep your home livable with less energy. At this point in history, the belief that it's possible to have your money make your living for you is basically a delusion; it's likely to be a fairly persistent one, but those who can shake themselves free of it and adjust to life in a radically different economic reality are likely to do better than those who keep on chasing the prospects of an age that is ending around us.

The Post-Oil Community

SUBHEAD: A widespread return to tribal living for survival will spread in this century.

By Peter Goodchild on 25 February 2009.
Image above: Detail from "The Venerable Bede", circa 735 AD.
One thing that will hit Americans rather hard in the future is the problem of “individualism” vs. “collectivism.” Americans are loners. If you put a group of Asians on a desert island, they will get together and build a boat. If you put a group of Americans on a desert island, they will start arguing about property rights. The weakness of individualism could be seen during the Great Depression of the 1930s: in those days, the average person was isolated, lost, and afraid. It was a “shame” to be poor, so one could not even discuss the problem with one’s neighbors. The news media and the government largely denied that the Depression existed, so there was little help from them.

Closely related to the problem of individualism is that of the lack of ideological unity. The basic premises of any major discussion seem to be absent. In a typical crowd of Americans, half will deny that any of the dozen aspects of systemic collapse even exist, and most of the other half will say, “Well, I believe . . .” and proceed to spout whatever Pollyanna nonsense their illiterate brains have been filled with.

Of course, if politicians never say a word about overpopulation, resource consumption, or any other real issue, then how can the average American be blamed for mental laziness? Well, perhaps there’s something to be said for intellectual responsibility. Certainly no one can say that informative books aren’t available — a good collection can be put together, at a dollar apiece, by roaming the second-hand stores, since nobody seems interested in reading books these days.

The individualist mentality has always been typical of Americans. There is a sort of frontier mentality that still pervades much of American life. In certain ways, this has been beneficial: freedom from the obligations of the “old country” has provided much of the motivation for those who came to what was called the “New World.” The beneficial side of individualism is self-sufficiency, which made it possible for pioneers to survive in the isolation of the wilderness. But individualism will not be as useful a response in the future as it was in the days of the pioneers. In fact, individualism might just be more beneficial in good times than in bad, in times of prosperity rather than in times of hardship.

The most obvious negative effect of individualism can be seen in today’s false democracy: political leaders can tell the most remarkable lies, and the response is silent obedience. It’s hard to understand such a thing happening in “the land of the brave,” until we realize that most Americans have little means of behaving otherwise. They are probably somewhat lacking in family or friends with whom they can share information or compare ideas, and they are therefore entirely dependent on the news media for their comprehension of human society. A solitary evening in front of a television set is not likely to promote healthy social relationships.

So how will people form viable groups in the future? To answer that question properly, we must first realize that the ideal political system is not a “political” matter at all, but a psychological one. I mean, it is not a conscious, cerebral decision; it is a matter of the hard-wiring of our nervous system. And I say that as one who does not believe in evolutionary psychology or sociobiology, or any other of those ant-like portrayals of human mentality. Humans and their ancestors spent over a million years living in small groups, hunting and gathering. To judge from primitive societies that still exist, those groups had neither perfect dictatorship nor pure democracy, but something in the middle, a sort of semi-anarchic but functional process of majority rule. Chiefs who didn’t perform well got the cold shoulder.

The group was small enough so that each person knew every other person, and that rather clumsy democracy could work because both the “voters” and the “politicians” were visible. It has only been in a tiny fraction of the lifespan of humanity — the period called “civilization” — that political units have been created that are far too large for people to know one another except as abstractions. Small groups have their problems, but in terms of providing happiness for the average person, the band or village has always been more efficient than the empire.

The maximum practical size for human association may be Robin Dunbar’s number of 150, but we might need to be rather flexible about that — perhaps somewhere between about 20 and 200. Roman soldiers, for example, were organized into “centuries,” and modern Hutterite communities have between 60 and 160 members. In “A History of the Ancient World,” Chester G. Starr tells us that “whereas Paleolithic packs numbered perhaps 20 or 30, Neolithic farmers either lived in family homesteads, in villages of 150 persons (as at Jarmo), or in even larger towns (as at Jericho).” I once started to collect examples of present and past communities in which those numbers appeared, but I had to give up because of the immensity of the data.

But a close look at half a dozen types of human groups is all that is necessary to get a good intuitive grasp of the sorts of numbers that are workable. I like the word “tribe” for the principal group because for me it is a useful catch-all word, even if it gets a scowl from an expert in social anthropology. But in any case, I am not thinking of any particular group; I am simply referring to the basic unit of social organization that is found in any of the older cultures.

Groups larger than that of the band, the small tribe, or the village simply don’t do as well in providing for the happiness of their individual members. A social group of a million or a billion may have military advantages but is more likely to operate as a tyranny than as a democracy — China is the obvious case. Larger groups are not necessarily unworkable, but they involve a greater risk of the loss of social cohesion.

One cannot throw a “tribe” together simply by sitting down and having a community chat in the course of one afternoon in a suburban living room. (The fact that we don’t instantly recognize something so obvious is in itself proof of our inability to form a “tribe.”) Primitive cultures may be organized into any of a number of social groupings, and those groupings in turn are often parts of a larger group — there is a pyramidal structure, so to speak.

But there are two characteristics that are found in these primitive cultures. In the first place, the “tribe” is always quite ancient; any group of that sort has been forming and reforming for generations, and one might say that the group is as old as humanity.

Secondly, any genuine “tribe” (or whatever you want to call it) consists of members who are all tied by the bonds of either blood or marriage. Everybody is everybody else’s cousin, so to speak. We may laugh at rural communities for what we regard as their “incestuous” behavior, but sometimes having close ties is precisely what keeps people alive.

The “tribe,” then, is characterized both by its antiquity and by its kinship patterns. Such patterns would certainly not be characteristic of a group of suburbanite refugees lost in the wilderness and suffering from shock and fatigue. It would be an understatement to say that such an ad hoc clustering of humans would face psychological challenges unlike those of people who had been living deep in the jungle since time immemorial.

These new tribalists will also be living on a planet that has lost its familiar borders. Long before the twenty-first century reaches its end, what we now think of as the geopolitical face of the world will have been considerably transformed. The “booming economies,” relatively speaking, will be those with an adequate ratio of population to arable land — in Canada, in parts of Africa, in parts of Latin America, and in a somewhat amorphous area that stretches from the Baltic across to Mongolia (not Chinese Inner Mongolia).

The tables will often be turned in the social and political strife that now affects much of the world, so that both the land and the government will be returned to the peasants. The great irony, in other words, is that many countries that have suffered politically and economically have ended up with good population-to-arable ratios, and these ratios will be a great blessing in the agriculture-oriented future.

Of course, there won’t be as many people anyway. The world’s overall population density right now is about 5 people per hectare of arable land — far too many people. Even if we were all largely vegetarian, and if land were distributed fairly, we could not keep everyone alive in the post-oil world if the population were so large.

The number is now 8 times greater than the absolute maximum that is possible without fossil fuels. In a hunting-and-gathering society, each family might need at least 25 square kilometers of land — and if that sounds like too much, just ask serious deer-hunters how much land they cover merely to get one animal per year. In a society of subsistence agriculture, the numbers are greater, but they still come nowhere near that of the world’s present population.
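As a rough sanity check on the 25-square-kilometer figure, one can estimate a global ceiling for hunting-and-gathering populations. This is only a back-of-envelope sketch: the land area, the usable fraction, and the family size below are my assumptions, not figures from the text.

```python
# Rough global ceiling for a foraging population.
# Assumptions (mine, not the author's): total land surface ~149 million km2,
# of which perhaps a third is usable for foraging (excluding ice, desert,
# and high mountains); a family of ~5 people needs ~25 km2, as the text says.

LAND_AREA_KM2 = 149e6      # total land surface of the Earth
USABLE_FRACTION = 1 / 3    # guess: foraging-friendly share of that land
KM2_PER_FAMILY = 25        # from the text
PEOPLE_PER_FAMILY = 5      # assumption

families = LAND_AREA_KM2 * USABLE_FRACTION / KM2_PER_FAMILY
people = families * PEOPLE_PER_FAMILY
print(f"Hunter-gatherer world population: ~{people / 1e6:.0f} million")  # → ~10 million
```

A ceiling on the order of 10 million people is in line with standard estimates of pre-agricultural world population, and it underlines the essay's point: foraging alone supports only a minute fraction of the present 7 billion.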

The African countries with good population-to-arable ratios are not restricted to a particular part of the continent. To put it mildly, however, these African ratios cannot be regarded at the moment as indicators of well-being. The low density can partly be explained by saying that the countries have always been largely “pre-industrial.” But war, disease, famine, misuse of land, misuse of money, misuse of human resources, and lack of infrastructure have all had their effects.

There are 4 Latin-American countries — Argentina, Paraguay, Uruguay, and Guyana — that combine low population density with good soil and good climate. A great deal of the land, however, has been bought up by large international companies that practice monoculture and have no interest in gearing agriculture to the needs of the inhabitants. The implementation of land-reform policies will do wonders for the general population of those countries.

The countries with the worst proportions of population to arable land are mainly those of eastern Asia and western Europe. In Europe the problem can be better described in terms of too many people rather than too little farmland. The worst crowding of all can be seen in eastern Asia and the Arabian Peninsula.

One delusion that should be discarded is that of the “tropical island paradise.” Nearly all of the small Pacific islands are very densely populated, or are lacking in arable land. At the moment, they also have strict immigration policies. Coral islands — most of Micronesia, for example — lack both arable land and fresh water.

I have sometimes seen what might be called a “sour grapes” theory of the population-to-arable ratio: one could argue that countries that now have better ratios are merely indicating poor conditions of some other sort. To some extent this is true, but there are many important exceptions. The UK and the Republic of Ireland, for example, are very similar in almost all geographic respects, but the UK has 3 times the population-to-arable ratio; from the standpoint of subsistence farming, Ireland would be a far better place to live.

Food will become quite an obsession. If we look at “peak oil” in terms of its daily effect on the average person, we get a simple equation: “peak oil” equals “peak food.” Oil made it possible for us to keep 7 billion people alive — well, only barely alive, of course, since half of them don’t get a very good diet. When the oil is gone, most of that population will also have to go. But when I say they will “have to go,” I don’t mean that they will float up into the sky. And I don’t mean that we will invent spaceships to take them to Mars.

To put it rather bluntly, there will be some truly astonishing famines in the next few decades. The decline in oil production will be swift and ruthless, because without all the fertilizer and tractors and trucks, there will not be enough food for more than a small number of people. If we look at the oil-to-population ratios of previous years and project those same ratios onto the right-hand side of the Hubbert curve of oil production, it’s fairly easy to see that about 50 million people will be starving to death every year as a result of global oil depletion. One way or another, the population will have to return to about 1 billion rather than 7.
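The 50-million-a-year figure can be reproduced with simple division, though the length of the downslope is my assumption, not the author's:

```python
# Back-of-envelope check of the "50 million per year" figure.
# Assumption (mine): population falls from ~7 billion to ~1 billion over
# roughly the same span as the post-peak downslope of oil production,
# taken here as about 120 years.

START_POP = 7e9
END_POP = 1e9
DECLINE_YEARS = 120        # assumed length of the downslope

excess_deaths_per_year = (START_POP - END_POP) / DECLINE_YEARS
print(f"~{excess_deaths_per_year / 1e6:.0f} million per year")  # → ~50 million per year
```

The result is sensitive to the assumed timescale: a 100-year decline gives 60 million a year, a 150-year decline gives 40 million. The order of magnitude, not the exact figure, is the point.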

Even that 1 billion is rather optimistic, because by then there will have been so many side-effects from the entire spectrum of systemic collapse — ranging from resource depletion to governmental collapse — that it is unlikely that the planet will be able to keep as many as a billion people alive. I’ve gone through the calculations a hundred times, adding in all the factors of war, epidemics, and so on, and my best guess is that the world’s population will eventually drop to about 1 percent of its present level.

To look at the future, then, we must start by looking at a world in which the human population has been dramatically reduced. The most basic principle is that each person will have to start thinking in terms of a smaller radius of activity. The globalized economy will have to be replaced by the localized economy.

Most food will have to be produced at a local level, and probably each family will have to produce its own food. The catch to growing food, however, is that most of the world’s surface is unsuitable for growing food, no matter what techniques of farming are employed. This is not the fault of mankind, it is merely a consequence of the nature of the planet. On many parts of the globe, the climate is too hot, too cold, too wet, or too dry. In other cases, the land is too barren to support anything but a sparse growth of wild plants, which in any case are simply growing and then dying and replacing their own material.

A small human population could nevertheless survive on agriculture, at least if it learned how to revert to some ancient methods, particularly as described by F.H. King in “Farmers of Forty Centuries.” One technique of some Asian cultures was to bring grass or other wild plant material from the mountains, for example, and turn it into compost, thereby making use of the nitrogen, phosphorus, potassium, etc. of the wilderness, as well as the basic humus (carbonaceous plant material). Many other cultures used wood ashes.

What it amounts to is that a large area of wild land was scoured to provide growth materials for the cultivated vegetable. The nutrient “source” of the wilderness, in other words, fed the nutrient “sink” of the farmland. This process of taking from the “source” and giving to the “sink” is one of the basic principles behind all “organic gardening,” although few practitioners would admit it or even know it. The process also raises some enormous doubts about our concepts of “sustainability,” but I’ll sidestep that issue.

A second technique used by Asian cultures was to recycle all sorts of materials, and to do so as intensively as possible. Among the most important materials were human and animal feces. (Let us conveniently ignore the backbreaking labor that went into all this.) Of course, the process of recycling could never be stretched to eternity. One cannot create a perpetual-motion machine: every time those materials are recycled, a certain amount of N-P-K is lost to leaching and evaporation.

A third agricultural technique, found in Asia as well as in other parts of the world, was to grow legumes or other plants that absorb nitrogen from the air. Unfortunately there are no similar tricks for phosphorus or potassium; plants with very deep roots can draw some of these elements from far underground, but not enough to turn barren land into farmland.

If we go further back in time, or further down the ladder of cultural evolution, we find an even simpler method of maintaining a sort of temporary sustainability — if such a term is not a self-contradiction. All over the world, many primitive cultures simply grew crops in one area for a few years and then abandoned that plot, cut and burned another patch of forest or jungle, and started a new garden. Such a practice is hard on the environment, but for a sparsely inhabited region the technique is feasible. In any case, sheer necessity will make this a common practice in future ages.

David Pimentel, in his excellent analyses of food and energy resources, points out that if one is living mainly on cultivated plants, at least a quarter of a hectare per person would be needed, in the absence of synthetic fertilizers or mechanized irrigation. For example, one could live — barely — on about 400 kilograms of dried non-sweet corn (maize) per year. The yield per hectare of corn, however, is not likely to be over 1,500 kilograms.
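Pimentel's quarter-hectare figure follows directly from the two numbers quoted above; a one-line check using only the text's own figures:

```python
# Land needed per person for a bare-subsistence corn diet,
# using the consumption and yield figures quoted in the text.

KG_CORN_PER_PERSON_PER_YEAR = 400   # bare survival ration, from the text
KG_YIELD_PER_HECTARE = 1500         # yield without fertilizer or irrigation, from the text

hectares_per_person = KG_CORN_PER_PERSON_PER_YEAR / KG_YIELD_PER_HECTARE
print(f"{hectares_per_person:.2f} ha per person")  # → 0.27 ha per person
```

About 0.27 hectares — consistent with the "at least a quarter of a hectare" that Pimentel gives, with no margin for crop failure, seed corn, or fallow years.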

It might be worthwhile to take a closer look at the overused and misused word “organic.” “Organic fertilizers” can certainly do the trick, but in a post-petroleum world where are these going to come from, and how are they going to be transported? Powdered dolomite, for example, will supply calcium and magnesium, but that’s very heavy stuff. If farmers are living in an environment where the soil is naturally barren, and if they have no access to petroleum-based manufacturing and delivery, then that dolomite might as well be sitting on the moon.

I had a 1-acre vegetable garden in central Ontario for 7 years. The soil was quite barren. The native vegetation was fragile and sparse. Without a motorized vehicle, there would have been little access to anything that would promote the growth of vegetables. That part of Ontario simply had very little N, P, K, Ca, Mg etc. in the soil.

Yes, I could pay somebody to send me some cow manure, or I could arduously create an acre’s worth of humus using grasses or whatever, but humus per se is just dead plant matter with C, H, O, and often — for example, in my part of the world — not enough of the other 13 elements. Humus is useful, but by itself you can still end up with your vegetables turning purple from phosphorus starvation. I think many people fail to understand that any plant life on the planet Earth contains those 16 elements. They are not necessarily synthetic in nature. For that matter, our own bodies are made up of such elements.

“Organic gardening” should be treated as a scientific hypothesis — and, indeed, it is worthy of consideration. Instead it is treated as a cult. One either “believes” or does not “believe,” and any request for precise observation or measurement is treated with scorn. That attitude does no one any good. No genuine information is offered, only a quasi-religious trance and a simplistic dogma. I have often wondered what a Freudian would say about the bodily fixations of those who are mesmerized by what is ultimately just cow dung.

The most useful crops will be those that are high in carbohydrates and protein. Crops that are susceptible to diseases, pests, bad soil, or bad weather should be avoided. In most of North America, the most important crops will be corn and beans. Of course, those would have to be open-pollinated types, because hybrid varieties do not breed true from saved seed — you have to buy the seeds every year from big companies that produce them, and those big companies will not be around in the future (which is perhaps a great blessing). In other parts of the world, other grains will be more suitable.

Good farmland will of course be scarce, but many people will become aware of one of the curious side-effects of the urbanization that has characterized so many countries since the Industrial Revolution: the abandonment of good land. Over the last few centuries, as people moved from the countryside to the city, the result for some of those rural areas was a considerable decline in population. The same process is still underway.

Even in highly developed countries, although the cities may be crowded there are rural areas that are steadily losing population. Such depopulation will present opportunities for those with a pioneering spirit. Admittedly a lot of these abandoned lands are what the encyclopedias dismiss as “marginal uplands” — as opposed to the lands along the valley bottoms, where rivers and rains have carried the good soil — but the better farmers will know how to deal with these more-fragile environments.

It would be quite an understatement to say that, without gasoline and diesel fuel, transportation will be limited. Not only will the fuel be lacking, but even the roads to drive on will become less common. Anyone who has driven past a construction site should suspect that a modern road is not as durable as a Roman aqueduct. Asphalt is made from oil. As oil becomes scarce, so will asphalt, and paved roads will therefore go unrepaired. As social chaos intensifies and municipal governments watch their budgets disappearing, the maintenance of paved roads will be further reduced.

When those roads are not repaired, it will take little time for them to become cracked and unusable, and they will often be blocked by smashed and abandoned cars whose owners have lost the ability — or the sheer willpower — to keep them running. In any case, the main roads will generally be going in the wrong directions: from one city to another, exactly where most people will not want to go. Any clever human being would stay away from the cities, and instead go up into the hills, well away from populated areas, to greener pastures.

There will be only 3 methods of travel: on foot, in a non-motorized boat, or on the back of a horse, a donkey, or some other animal. One’s speed by any of these 3 methods will be about the same: 40 km per day, if one is in excellent shape. For short distances, one means of transport may be quicker than another, but the longer the distance we take into consideration, the less it seems that walking is to be despised. Certainly the history of bicycles is not likely to go on for much longer: even where paved roads are usable, bicycles will be hard to repair without the industrial infrastructure to provide the spare parts and the servicing.

It should be obvious that those who live in the country will be better prepared than those who live in the city. A city is a place that consumes a great deal and produces little, at least in terms of essentials. A city without incoming food or water collapses rapidly, whereas a small community closely tied to the natural environment can more easily adjust to technological and economic troubles. Even out in the country, however, the present housing patterns often resemble the gasoline-induced sprawl of the suburbs.

Paradoxically, many “rural” areas have become “urbanized,” in the sense that they are doing their best to imitate the worst aspects of large cities. More useful would be something resembling a traditional village, with the houses at the focus and the fields radiating from that point — we can read Thomas Hardy’s novels to see how this used to be.

“Something resembling a traditional village” is, of course, different from the real thing. In a genuine “traditional village,” people have known one another for generations, and a bunch of pale-skinned visitors is not likely to be received with open arms. If these urban refugees show up flashing their useless credit cards all over the place, and demanding assistance, but they have no practical skills and don’t even have the muscles for basic manual labor, it is unlikely that they will be welcomed in any long-settled community. These refugees will have to develop their own communities, and they will have to overcome the problem of their inadequate social skills. But they will learn. In spite of themselves, they will learn.

I have great hopes for the future, when the hard times are over. By the end of the present century, the human population will be much smaller than it now is. The 200-odd nations of the present day will be only a dim memory, and the major languages will have broken up into local dialects, to such an extent that anyone who lives over the next hill will be a linguistic outsider. Grass will be growing everywhere, and the long miles of cracked highways will be merely a curiosity.

Yet those days will not be the Dark Ages: on the contrary, starlight will once again appear over the cities at night. Humans were not designed to live in groups of such immense size as we see today, nor were they given the physiological equipment to deal with the over-stimulation of crowded living-spaces. It is also true, for various reasons, that the sight of green trees is more pleasing than that of gray machines. It is not just a platitude to say that we are out of touch with Nature.

We can compare the coming age to a world of many centuries ago. In the year 731, the Venerable Bede wrote his “Ecclesiastical History,” in which he describes the world of the Seven Kingdoms into which Britain was then divided. Bede begins his story by telling us how Rome was sacked by the Goths in 410, as a result of which the Romans no longer ruled in Britain, and the people of that land had to develop a culture separate from that of the Romans.

In Bede’s time the empire was still in the process of turning to rubble and dust, but England’s “Dark Ages” were filled with light, as the monks scratched away in their scriptoria. Bede himself almost single-handedly invented the writing of true history, that is to say, history based on a fixed and accurate dating scheme. Thirteen centuries after Bede, we might be proud to accomplish as much.

I think even in our dreams we imagine a less crowded planet, a less noisy one, a less busy one. Such dreams tell us the obvious truth that daily life should not be a fast-paced interminable struggle of each person against every other. Surely we imagine standing in a doorway and watching golden fields of grain rippling in the wind like the waves of the sea. Surely in our dreams we imagine the song of the scythe, and the whir and thud of the loom. Who knows? One day anything might happen. Perhaps we could even have a world where people can live with nobility, dignity, and grace.

See also:
Island Breath: After the Techno-Fix 12/25/07