Academic paper: Indigenous African soil enrichment as a climate-smart sustainable agriculture alternative by Dawit Solomon et al., Frontiers in Ecology and the Environment, 2016; 14(2): 71–76, doi:10.1002/fee.1226
Like many people interested in sustainable agriculture, I'm always on the lookout for existing techniques that might be applied more generally. I've become particularly interested in anthropogenic soils where these relate to unique habitats, such as the machair of the Scottish Western Isles and parts of the mainland west coast. This technique appears to produce similar results to the terra preta soils found in parts of South America and involves, among other things, the application of pyrogenic carbon (commonly known around here as biochar), which might help us to develop best practice in its use. These soils sequester disproportionate quantities of carbon, have abnormally high cation exchange capacity, and make a large contribution towards food security and the incomes of those families practising the technique, allowing the creation of productive agroecosystems in difficult conditions. It sounds too good to be true.
That is, if the claims of sustainability stack up.
It's clear the authors are aware of issues surrounding soil degradation, climate disruption and food insecurity, and this seems to have been a motivating factor behind the study. It's unclear whether this is a technique that crossed the Atlantic during the Columbian Exchange (the soils in South America seem to originate much earlier than those in Africa) or if the same or similar techniques with comparable results were developed independently in several locations but, for our purposes, it probably doesn't matter.
I have three questions:
1) Is this a well-designed study and a high-quality paper?
2) Is the technique genuinely sustainable?
3) If so, can the technique be more generally applied?
The study examined a total of 14 sites in three areas in Ghana and Liberia, out of a total of 177 candidate sites. I'd consider this a small sample size, but it's respectable given the constraints of studies in agronomy: I've read other studies in the discipline that use much smaller samples, sometimes comparing single sites in a way that verges on the scientifically bankrupt. The study is probably just about big enough to generalise from. Recognised anthropological techniques were used to understand the means by which the dark earth soils are maintained. The main problem lies in the source of comparison: the dark earth soils were compared with nearby soils as a means of ascertaining the key differences, which were measured using standard soil sampling techniques.
The problem is that the nutrients in the dark earth soils are concentrated from surrounding areas, and a large area of land with low productivity is needed in order to create and maintain the higher-productivity dark earth soils. This presents us with two issues.
1) The surrounding soils may be poorer in nutrients than they would otherwise have been had they not been robbed of nutrients for the formation of the dark earth soils; and
2) This seems to bring the sustainability of the whole enterprise into question.
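The concentration argument can be made concrete with a rough mass balance. The figures below are entirely illustrative, not drawn from the paper; the point is only to show how the required catchment area scales with the enriched plot's nutrient demand:

```python
# Illustrative nutrient mass balance for "concentration" soil enrichment.
# All numbers are hypothetical, chosen only to show the shape of the argument.

def catchment_ratio(plot_demand_kg_ha: float,
                    offsite_yield_kg_ha: float) -> float:
    """Hectares of surrounding land needed per hectare of enriched plot,
    if the plot's annual nutrient input must come entirely from residues
    gathered off the surrounding area."""
    return plot_demand_kg_ha / offsite_yield_kg_ha

# Suppose an enriched plot needs 60 kg/ha/yr of nitrogen delivered in
# residues, and residue collection sustainably removes 5 kg N/ha/yr
# from the surrounding land.
ratio = catchment_ratio(60, 5)
print(f"{ratio:.0f} ha of catchment per ha of dark earth")  # 12 ha
```

Whatever the real numbers are, the ratio is necessarily greater than one: the enriched hectare exists at the expense of a larger donor area, which is the crux of the sustainability question.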
The techniques used successfully transform weathered, infertile oxisols and ultisols into fertile, enriched soils that make a disproportionate contribution to household diets and incomes. Those plants that grow poorly on the surrounding soils are much more successful on the dark earths. Biological availability of nitrogen, calcium and phosphorus is much higher in the dark earth soils, and aluminium toxicity is much lower, in part because of lower acidity. Critically, for our purposes, this supports the kind of multi-storey home garden that we have seen in other discussions is key to the high levels of carbon sequestration involved in the most effective forms of carbon farming. This applies particularly to the degraded red and yellow earths common in much of sub-Saharan Africa, and may apply elsewhere. Many of the specific organic residues used to maintain these soils are unlikely to be found outside the region, but the same principles should apply. The pyrogenic carbon is probably stable over the long term.
The problem with the pyrogenic carbon is similar to the sustainability issues surrounding what we usually call biochar: it involves moving nutrients from one place to another where they can be concentrated. This makes it inherently unsustainable, not because of the carbon but because of the other nutrients that go with it. The same applies to many, if not most, of the other residues used to maintain the soils and the agroecosystems they support.
Where does this leave us? I think this is a well-conducted study with exaggerated implications. The authors themselves acknowledge that the broader applicability of the techniques is unknown. These techniques are not as sustainable as the authors assert, in effect stripping nutrients from one place to concentrate them in another. As a cultural phenomenon it's extremely interesting, especially given the loss of indigenous knowledge that was a consequence of the European invasion of South America. It may provide pointers to the reasons for mixed results in biochar trials. In our present situation, where we need to maintain high yields over the long term on our existing agricultural land in order to support growing populations, it's probably not a generalisable solution.