Below the fold is the 3rd in a series of posts providing analysis (and follow-up) on the difficulties of maintaining our current energy paradigm with renewable energy (generally, 'the fake fire brigade'). The main authors are Hannes Kunz, President of the Institute for Integrated Economic Research (IIER), and Stephen Balogh, a PhD student at SUNY-ESF and Senior Research Associate at IIER. IIER is a non-profit organization that integrates research from the financial/economic system, energy and natural resources, and human behavior, with the objective of developing and initiating strategies that result in more benign trajectories after global growth ends. The authors wrote a follow-up of over 20,000 words to the questions raised in the original posting, which we've broken into 4 pieces for readability - the 2nd installment, with a focus on large-scale biofuels, follows. In this post, the authors show that even if we can double current extraction of biomass, this only represents approximately 6% of total U.S. primary energy consumption, and will likely never be enough to serve as a meaningful stabilizer in future energy systems.
The previous essays in this series were 'The Fake Fire Brigade - How We Cheat Ourselves about Our Energy Future' and 'Revisiting the Fake Fire Brigade Part 1 - General Issues'.
In many resource discussions, biomass emerges as a solution that allows us to continue many activities currently powered by fossil fuels: First, to move cars, trucks, machinery and planes when oil runs out or becomes too expensive. Second, to provide flexibility in electric power generation, i.e. when other sources are stochastic and inflexible, biomass would provide the necessary gap-filling power. Third, to heat our homes, after natural gas production declines. There are many estimates of future uses for biomass, and many new technologies that are taking their first baby steps, quite a few of them promising and worth trying.
When we add up what biomass is projected to accomplish in the future, we quickly arrive at large numbers, which don’t take into account the fact that extracting energy from solar flows is not something that can be easily scaled up at our discretion. Below, we look at some key biomass technologies, and also spend some time attempting to put claims and reality into perspective.
Ultimately, there is no doubt we can produce some transportation biofuels, and little stands against burning wood or waste to generate heat and electricity, as many countries already do. However, what we need to bear in mind is that these solutions will replace only a fraction of today’s fossil fuels, and with far less flexibility. Below, we try to present an overview of biomass as an energy source and its significant limitations in the bigger picture.
To do so, we look at a model of the total theoretical biomass potential for energy in the United States, and then compare a number of potential uses. The U.S. is one of the world’s least densely populated areas with a moderate, habitable climate, which provides significant biomass potential on a per-capita basis. Europe, for example, has 3.6 times as many people per square kilometer as the U.S., so any biomass calculation for the “old world” looks significantly worse, even when factoring in that an average Western European country only consumes about 50% of the primary energy per capita compared to the U.S. (EIA). While we will look at the “best case” in the developed world today, Europe has its own history of biomass limits, which we now briefly explore.
At the end of the 13th century, Europe’s forest coverage, as historic data sources show and isotopic analyses of sediments confirm, was as low as approximately 10% of its total land mass. Today, 38% of the EU countries’ surface is covered with forests, up from 25% only 50 years ago. At the same time, today’s population in the EU-27 countries stands at close to 500 million, whereas at the peak in medieval times (between the years 1300 and 1350), Europe was inhabited by 70-100 million people.
Back then, the continent was reaching its biophysical limits for supporting humans in a biomass-based economy, and after the last marginal spots of land were put to use for farming, there was little that could provide any more sources for food, fire and construction wood for a thriving and quickly growing society. The discovery of fossil fuels and the New World were still things of the future. A few poor harvests led to famines in the 14th century all across Europe, and when an already undernourished and thus weakened population was exposed to the black plague, Europe’s population shrank by 35-50% (depending on sources) within less than a century. Things only improved once untapped resources in America were discovered, and new trade routes opened. So clearly, back then, there wasn’t enough sustainably growing biomass for a fraction of the people living today.
Crop yields a few centuries ago were likewise only a fraction of today’s, because much of what is known and available to us now wasn’t then. However, current yields – a result of ongoing research and innovation – are not just a product of science and better technology, but also come at the cost of additional energy inputs from (mostly fossil-fuel-driven) fertilizers, minerals, pesticides, irrigation, equipment and other technology. Thus, while human labor requirements in agriculture have fallen, net output increases in terms of “energy content in food” are much smaller after subtracting the fuels and embodied energy in the equipment used in modern agriculture. Some authors even outright claim that when factoring all this in, agricultural outputs are energy negative, i.e. require more (non-solar) inputs than they generate in outputs (Steinhart and Steinhart, 1974).
One of the key reasons why we were able to dramatically increase yields is the fact that we have been able to better understand how plants grow. By comprehending this process and managing it properly, we are able to emulate nature’s cycles while still extracting significant amounts of biomass for human purposes. We do so by adding fertilizer and nutrients, providing enough water, by eliminating competition through herbicides and pesticides, and by focusing on high yields through breeding, hybridization and genetic modification. To understand the scalability of biomass, it helps to understand some of these basics.
“Primary production” is the term that describes the creation of new organic biomass from inorganic molecules, namely water and atmospheric (or aquatic) carbon dioxide. In green plants, this is accomplished primarily through photosynthesis, and requires the presence of certain macro- and micro-nutrients. We calculate net primary production by subtracting from gross primary production the energy lost to cellular respiration and maintenance metabolism. Net primary production (NPP), then, is the rate at which plants in an ecosystem transform diffuse solar energy into useful energy. Some of this energy is used by the plant to grow and reproduce; some energy is transferred to herbivores and omnivores in the ecosystem, when they eat plant biomass.
Terrestrial primary production varies based on many factors, chiefly insolation, temperature and precipitation. There are 16 identified chemical elements essential to plant growth. In addition to carbon, hydrogen and oxygen, available in air and water, macronutrients from the soil include nitrogen, phosphorus, potassium, calcium, magnesium, and sulfur. Other nutrients are also required, though in much smaller amounts. These micronutrients include silicon, manganese, boron, copper, iron, chlorine, molybdenum, and zinc.
Figure 1: simplified nutrient cycle (IIER)
In adequate conditions, plants grow to maturity, produce offspring, and then die. Under most natural conditions, approximately 0.1% of incoming solar energy is converted into biomass by plants, and 10% of that is assimilated by higher level consumers. The remainder of the plant biomass is shed as litter or returned to the ground when the plant dies. Consumers that eat plants also return their nutrients to the ecosystem by excreting wastes and by their bodies decomposing after their death. Decomposers in the soil (bacteria, fungi, etc.) consume this waste from plants and consumers and make these nutrients available for uptake again (Figure 1).
While all of the nutrients listed above are required for plant growth and respiration, plant growth is a function of the least available nutrient. Liebig’s Law of the Minimum states that if one of the nutritive elements required for plant growth is deficient or lacking, growth will be poor even when all the other factors are abundant. In an ecosystem with few losses to the environment outside its boundaries, nutrients are cycled efficiently through the system. In an ecosystem where humans extract large amounts of biomass, however, nutrients can be drained from the soil at a rate greater than natural replenishment. Over time, the quality of the soil degrades, and net primary production falls. Below is an example of a more detailed nutrient cycle for nitrogen.
Figure 2 - Nitrogen Cycle (Source: Wikimedia Commons)
It is therefore important to note that over time, land under agricultural production degrades because of permanent extraction from growing crops, erosion, and other factors. A highly productive piece of farmland requires increasing fertilizer and mineral inputs to maintain high yields. Similarly, yields can be raised even on lower-quality land by increasing the amount of fertilizers and minerals used. However, this relationship is non-linear, and at high levels of fertilizer use the response becomes asymptotic. Since farmlands degrade over time due to nutrient depletion and erosion, any disruption in inputs would reduce yields to a level below the original yields obtained on the site without inputs (Figure 3). In addition, the application of herbicides and pesticides can render the soil sterile, and falling humus levels in the soil result in decreased soil moisture and contribute to soil erosion.
Figure 3 - Isopleths of Agricultural Yield. Source: Hall et al. (2000)
It is important to understand that each human extraction of biomass, if not compensated by the return of an equal amount of nutrients, weakens the future potential and capacity of the land. Ultimately, only ecosystems without human inputs and withdrawals arrive at relatively stable situations, where they operate at or close to a 100% regeneration rate.
So simply put, if we plan to extract significant amounts of biomass, this will ultimately deteriorate the soil and affect the future fertility of the land, unless we can replenish it. There are multiple ways of replenishment, but all either incur energy expended elsewhere or slow the rate of extraction and make harvesting more complex (fallow periods, frequent rotation of crops, multi-plant environments). A good source for more detail is provided here: UBC Soil Web.
The current extraction of biomass in the United States might provide some guidance regarding future potential. A current practical maximum of sunlight conversion into biomass is approximately 1 W/m2 on average per annum. This translates to a maximum of 8.76 kWh (1 watt x 365 days x 24 hours) per square meter per year. However, this is accomplished on prime land that is flat, at low altitude, with sufficient insolation and water, with specific crops, and with at least some fertilizer and technology use. Therefore, assuming even half (0.5 W/m2) as an average is probably generous for any larger country, as many locations don’t meet those ideal criteria due to elevation, weather, latitude, shade, absence of sufficient water, lower-quality soils, etc. A look at many sources, including a recent FAO documentation, supports this assessment.
For the United States, this assumption of 0.5 W/m2 translates to a total energy capture of 29,765 TWh (or 101.6 quadrillion BTU*, short: quads) for the continental U.S. across all cropland, forests and pasture areas. For comparison, total oil consumption in the United States amounted to 41.4 quads in 2008, and the country’s total energy use was 99.4 quads. In 2009, total energy consumption in the U.S. was down to 94.6 quads, mostly related to the recession. So ultimately, what can theoretically be produced annually as biomass from U.S. land almost exactly equals today’s total energy consumption.
*The assumption of 0.5 W/m2 and a theoretical maximum production of 100 quads is significantly higher than the figures found in other scientific publications.
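As a check on this arithmetic, the 0.5 W/m2 assumption can be turned into TWh and quads directly. The land area used below (~6.8 million km2 of cropland, forest and pasture) is our back-calculation from the figures given, not a number stated in the text:

```python
# Back-of-envelope check of the theoretical U.S. biomass ceiling.
avg_capture_w_per_m2 = 0.5     # assumed average sunlight-to-biomass capture
land_area_m2 = 6.8e12          # ~6.8 million km^2 (back-calculated assumption)

hours_per_year = 365 * 24      # 8760 h
kwh_per_m2_year = avg_capture_w_per_m2 / 1000 * hours_per_year  # 4.38 kWh/m^2/yr

total_twh = kwh_per_m2_year * land_area_m2 / 1e9    # TWh per year
total_quads = total_twh * 3.412e12 / 1e15           # 1 kWh ~= 3,412 BTU

print(f"{total_twh:,.0f} TWh  ~= {total_quads:.1f} quads")
```

This lands on roughly 29,800 TWh, or about 101-102 quads, matching the figure in the text.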
When looking at the above theoretical maximum of slightly above 100 quads, there are quite a number of uses for this biomass:
In the future, others have proposed to use more biomass for the following purposes:
To understand the feasibility of this objective, and the scale at which this is possible, it is useful to look at aggregate numbers from an ecosystems perspective.
In 2008, total agricultural biomass outputs (including animal feed and feedstock for fuel production, excluding forests) equaled approximately 6.6 quadrillion BTU (quads). This calculation is based on the energy content of harvested crops, whose quantities can be found in the FAO’s agricultural database. Of those, approximately 20% (1.4 quads, mostly from corn) went into biofuels, producing fuel with an energy content of about 640 trillion BTU in the form of ethanol and biodiesel (using additional energy inputs from fossil fuels).
Additionally, in a meat-intensive society like the U.S., a significant portion of agricultural biomass (approximately 2.9 quads or 44%) is used as animal fodder. On top of that, animals feed from pasture land, which – at a relatively low conversion efficiency, but with very limited inputs – indirectly adds another 1.8 quads to the human food chain. Together, these 4.7 quads produce about 570 trillion BTU (0.57 quads) of annual animal products (meat, milk, eggs, cowhides, etc.). It is difficult to estimate the total quantities extracted from pasture land through grazing and haying, but as the above calculation translates into an 8:1 conversion ratio between fodder input and animal protein output, it seems reasonable. It is important to note that a significant part of the pasture land isn’t suitable for other uses, due to soil quality, insufficient water availability, topography, or size.
On top of that, about 0.1 quad goes towards other uses (tobacco, fabrics), so ultimately, from all arable land, about 2.2 quads are processed for human food directly, supplemented with another 0.6 quads from animals, and about 0.02 quads (20 trillion BTU) of biomass extracted from water bodies (fresh- and saltwater), which we will ignore due to its insignificance.
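The fodder-to-animal-product conversion ratio quoted above is straightforward division:

```python
# Conversion of feed energy into animal products, using the figures above.
fodder_quads = 2.9           # crops fed to animals
pasture_quads = 1.8          # indirect addition from grazing and haying
animal_output_quads = 0.57   # meat, milk, eggs, etc.

ratio = (fodder_quads + pasture_quads) / animal_output_quads
print(f"feed-to-product ratio: {ratio:.1f}:1")   # roughly 8:1
```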
From U.S. forests, about 300 million dry metric tons of wood and byproducts, with an energy content of approximately 5 quads, were removed in 2008 (derived from FAO data). Of these, a significant portion of approximately 2 quads (EIA) went into energy production, mostly in the form of waste from saw mills, paper production and other processing, which already today gets mostly converted into electricity or process heat – in most cases providing the energy to process the wood and its by-products.
Table 1: Biomass extraction in the U.S.
So in summary, Americans extract approximately 13% of the gross biomass potential of the country, of which we ultimately use about 9%, after converting a significant portion to animal protein. Other sources say that this extraction is closer to 25% of total land potential on a global scale, so we might be too optimistic about the initial potential.
Figure 4: U.S. biomass outputs (2008)
Agriculture and forest use have been areas of high innovation during the past decades. If not for the “green revolution”, which included the optimized use of fertilizers, pesticides and hybrid or genetically modified “designer” crops, we wouldn’t be able to feed almost 7 billion humans on Planet Earth the way we do today.
Together, those human biomass extractions amount to between 9% (net) and 13% (gross) of the already generously calculated maximum of 100 quads of total annual biomass production in the U.S. All this is accomplished after a few thousand years of trial and error, with most of the gains in the past century coming from the application of fossil-fuel-based inputs. Note that the above numbers do not include the significant energy inputs needed to extract those 13.4 quads of gross output.
Should we have to provide external energy inputs mostly from biomass, this would further reduce net outputs. We have only limited knowledge about the exact ratio, but there are some indications. Pimentel (2007) calculates an EROI of 4:1 for corn (the crop itself, not the ethanol product). A 2004 Congressional Research Service study estimated that 1.7 quads of inputs are required to produce the agricultural part of the biomass (6.6 quads gross, or 4.3 quads net after conversion to animal protein). The latter results in a similar net energy return ratio (EROI) of 3.9:1 (gross, generously ignoring pasture) or 2.5:1 (net, after conversion of fodder to animal protein).
All energy we use can be initially measured as primary energy. To do so, we count the raw energy content of each source’s contribution to our energy systems. The concept of primary energy allows us to compare what would otherwise be “apples and oranges”. For example, 1 kWh of directly usable electricity produced from hydropower is counted exactly the same as 3,413 BTU (equivalent to 1 kWh) of theoretical heat value in coal, which has to be converted to something useful like process heat or electricity at a loss. To continue with this example, if we had to replace 1 kWh of hydro-powered electricity with coal, we would have to use 3 kWh of embodied energy in coal, converting it to electricity in a power plant with a typical 33% efficiency. So in order to serve the same purpose (in this case electricity), we would need three times as much coal to make the same final amount of electricity available to society.
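The hydro-versus-coal comparison is simple to express in code:

```python
# Replacing 1 kWh of hydro electricity with coal-fired electricity
# at a typical 33% power-plant efficiency.
BTU_PER_KWH = 3413

electricity_needed_kwh = 1.0
plant_efficiency = 0.33

coal_primary_kwh = electricity_needed_kwh / plant_efficiency   # ~3 kWh of coal
coal_primary_btu = coal_primary_kwh * BTU_PER_KWH

print(f"{coal_primary_kwh:.1f} kWh of coal ({coal_primary_btu:,.0f} BTU) "
      f"per kWh of electricity delivered")
```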
This brings us to one of the key problems of biomass, which doesn’t just apply to uses in agriculture: when it comes to primary energy content, we need more primary energy per unit of usable energy from biomass than from equivalent fossil-fuel-based energy. The key reason is that biomass undergoes a lower conversion efficiency when turned into the energy forms we want to use (electricity, transportation fuels, etc.), due to its low energy density and its state (solid, not liquid or gaseous) – both of which lower the effectiveness of converting the primary energy into usable fuels.
Table 2: Energy density and conversion efficiency of multiple materials
This translates to a different energy delivery profile (lower heat, less flexibility), and thus lower overall efficiency. The only place where biomass achieves a relatively good result is in heating, where relatively low temperatures are required. However, bringing biomass into individual buildings would require major refitting, which would again reduce overall efficiency.
To arrive at useful energy, these original biomass inputs undergo a conversion, which creates significant losses. On average, many of those biomass-to-useful-energy conversion processes operate at efficiencies between 30 and 50% to arrive at the desired energy quality on par with comparable fossil fuels (coal, oil, natural gas). Below, we present a few approximate conversion efficiencies for multiple technologies and compare them with traditional pathways for producing the same final output. All examples are gross conversion efficiencies, not taking into account the energy embodied in the equipment, which is mostly more complex for biomass.
Table 3: Penalties for desired biomass solutions (IIER)
In order to understand possible consequences, we applied the lower efficiencies of biomass conversion to gross consumption of fossil fuels in the U.S. in 2008, based on end use. Approximately 143 quads of biomass are required to replace the 83.34 quads of primary energy consumption from fossil fuels – coal, natural gas and crude oil (Table 4). This handicap is not due to lack of appropriate technology, but simply related to the chemical properties of the materials involved, e.g. state (solid, liquid, gaseous) and energy density of the different sources. With significant shifts to better pathways, for example by heating more homes with wooden biomass directly, we might improve the situation slightly, but never fully, and always at the cost of significant technology investment.
Table 4: Estimation of gross biomass needed to replace the useful energy from fossil fuels
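The logic behind this estimate can be sketched as follows. The end-use split and the efficiency pairs below are our illustrative assumptions, not the exact values from the article's tables, but they reproduce the right order of magnitude:

```python
# Illustrative sketch of the Table 4 logic: gross biomass needed to deliver
# the same useful energy as fossil fuels. The three end-use categories and
# their efficiency pairs are stand-in assumptions for demonstration only.
end_uses = {
    # name: (fossil primary quads, fossil pathway eff., biomass pathway eff.)
    "electricity": (40.0, 0.40, 0.28),
    "transport":   (28.0, 0.85, 0.38),   # refining vs. biofuel conversion
    "heat":        (15.3, 0.80, 0.55),
}

biomass_needed = 0.0
for name, (fossil_quads, f_eff, b_eff) in end_uses.items():
    useful = fossil_quads * f_eff        # useful energy delivered today
    biomass_needed += useful / b_eff     # gross biomass for the same service

total_fossil = sum(v[0] for v in end_uses.values())
print(f"{total_fossil:.1f} fossil quads -> ~{biomass_needed:.0f} biomass quads")
```

With these assumed efficiencies, about 83 quads of fossil primary energy translate into roughly 140+ quads of gross biomass, in line with the ~143 quads cited above.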
If we take the more generous EROI ratio for biomass (3.9:1) and assume that the ratio remains unchanged when scaling to higher levels of biomass extraction, we can estimate how much biomass would be required to produce the same (or more) outputs in a self-sustained system where farms produce their own energy. In order to use biomass to provide the power to grow and harvest additional biomass, we also need to convert it into a usable form: to run a tractor, we need biodiesel or ethanol; to produce fertilizer, we would require methane and/or hydrogen; to mine and process nutrients, we would need fuels and electricity; and so on.
Thus, when looking at the 6.6 quads of total biomass from crops, the additional 1.7 quads of energy inputs would – at a 50% conversion rate – actually require 3.4 quads of agricultural biomass as primary input. So in that gross input model, for each unit of gross energy extracted, inputs worth about 0.51 units would be required.
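In code, the self-supply arithmetic for cropland looks like this:

```python
# Self-supplied agriculture: 1.7 quads of useful inputs delivered at 50%
# conversion from raw biomass, against 6.6 quads of gross crop output.
gross_output = 6.6    # quads of crop biomass
input_needed = 1.7    # quads of useful input energy
conversion = 0.5      # raw biomass -> useful input energy

raw_input_biomass = input_needed / conversion    # 3.4 quads of raw biomass
ratio = raw_input_biomass / gross_output         # ~0.51 units per output unit
print(f"{raw_input_biomass:.1f} quads raw input; "
      f"about half a unit of input per unit of gross output")
```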
Table 5 - gross energy requirements for agricultural output when energy is self-provided
Figure 5: U.S. cropland biomass outputs and energy input requirements
It is probably fair to assume that forest-related extraction is slightly less energy-intensive, perhaps operating at 5:1 for the gross energy retrieved. Inputs to forestry include management expenditures, planting new trees, harvesting, and transporting removed biomass. In this case, assuming a 40% efficiency of converting cellulosic biomass into useful input energy (which is very ambitious), we would need 0.5 energy units for each final unit of energy extracted (the 0.2 units of fossil-fuel inputs per unit of final wood biomass would require 0.5 raw units of biomass before conversion to useful energy).
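The forestry self-supply arithmetic follows the same pattern:

```python
# Self-supplied forestry: a 5:1 gross EROI means 0.2 units of useful input
# per unit of wood extracted; at 40% cellulosic conversion efficiency,
# that input must itself come from 0.5 raw units of biomass.
eroi = 5.0
conversion = 0.4

useful_input_per_unit = 1 / eroi                          # 0.2
raw_biomass_per_unit = useful_input_per_unit / conversion  # 0.5
print(f"{raw_biomass_per_unit:.1f} raw biomass units per unit extracted")
```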
These examples show that a return to a biomass-based society would likely consume a significant share of the biomass itself in order to steadily reproduce the same quantities.
One indication of how much biomass could be extracted comes from an otherwise very optimistic renewable energy study conducted in 2000 by Pimentel et al., where the authors assume that total biomass extraction for energy could be grown to 5 quadrillion BTU by 2050, about 50% above today’s. However, other scientists and energy planners believe that today’s achievement of diverting about 10% of the annually generated biomass to human use can easily be doubled, tripled or quadrupled within a few decades. One of the most ambitious estimates was published in a combined study by the U.S. Department of Energy and the U.S. Department of Agriculture in 2005. This study assumes that the U.S. would be able to sustainably produce a gross 1,366 million tons of dry biomass per annum for energy purposes by 2030. This quantity would amount to 23.5 quadrillion BTU, or about a quarter of current U.S. energy needs and 23% of the theoretical maximum net biomass production. To our knowledge, this is the most ambitious estimate available; most others suggest various fractions of it. When adding current uses for food and industry, total extraction would amount to almost 37 quads, or 35% of total maximum potential. This would mean tripling our biomass extraction from nature within a few decades.
Below, using these estimates, we build what we consider an optimistic model for the United States, where biomass is produced at a generous output-to-input ratio of 5:1 (not counting solar radiation), assuming future technology improvements over today’s levels. The part of the output that goes back into the production of biomass then needs to be refined and converted to its final use (such as methane for fertilizer production, fuels for machinery, feedstock for electricity), which we assume will happen at an average conversion rate of 50%.
Table 6: Theoretical need for biomass at EROI of 5:1 and conversion efficiency of 50% for applicable energy
The above calculation shows that in order to net 32.8 quads worth of food and energy within those two decades we would have to extract almost 50% of the total annual U.S. biomass production for human use, compared to approximately 10% today. This is not something we consider feasible.
Figure 6: Energy output from biomass in 2030 DOE/USDA scenario
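Putting the DOE/USDA scenario's numbers together: the per-ton energy content below is back-calculated from the study's own figures, and the grossing-up for self-supplied inputs mirrors the Table 6 logic (a 5:1 EROI with inputs delivered at 50% conversion), which is our reading rather than a figure stated directly:

```python
# DOE/USDA 2030 scenario arithmetic, using figures quoted in the text.
dry_tons = 1.366e9       # metric tons of dry biomass per year
energy_quads = 23.5      # quads of energy, as stated in the study

# Implied energy content: ~17.2 million BTU per dry ton
btu_per_ton = energy_quads * 1e15 / dry_tons

# Grossing up 32.8 net quads for self-supplied inputs: 5:1 EROI
# (0.2 useful input units per unit) at 50% conversion means
# 0.4 raw biomass units of input per net unit delivered.
net_quads = 32.8
gross_quads = net_quads * (1 + (1 / 5.0) / 0.5)   # ~45.9 quads

theoretical_max = 101.6  # quads, from the 0.5 W/m2 estimate above
print(f"~{btu_per_ton / 1e6:.1f} MMBTU/ton; gross need {gross_quads:.1f} quads "
      f"= {gross_quads / theoretical_max:.0%} of theoretical maximum")
```

This comes out to roughly 45-46 quads of gross extraction, i.e. approaching half of the theoretical annual maximum, consistent with the text above.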
We could go into further detail regarding those projections, for example analyzing the assumption made in the above DOE/USDA paper that we can safely remove about 2.6% of all biomass from forests every year without adversely impacting the ecosystem (current removal rates are around 1.5% in the U.S.). Knowing that even the fastest-growing pine plantations have a cycle time of 30-35 years, and other trees roughly twice that, makes this objective of extracting an average forest’s biomass every 38 years ambitious. These projections also ignore events like forest fires, pests, droughts, and other causes of bad crop years.
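The 38-year figure is simply the rotation time implied by the removal-rate assumption:

```python
# Annual removal rate vs. implied forest rotation time.
removal_rate = 0.026                # 2.6% of forest biomass removed per year
rotation_years = 1 / removal_rate   # ~38 years to cycle an average forest
print(f"implied rotation: {rotation_years:.0f} years")
```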
All in all, it seems highly unlikely that we will be able to triple our output and divert almost 50% of total annual potential U.S. biomass production towards humans. Particularly since we know that this system is built on a cycle, which in the long run expects almost the same input as it produces.
Phosphorus is one of the key macronutrients required for plant growth. Unlike other limiting nutrients, phosphorus may be the most difficult to replace. It cannot, irrespective of crop, be fixed locally, and it is relatively rare. To maintain or increase yields, we must replenish the soil with phosphates whenever we extract biomass. Phosphates are required by almost all species – not just plants – to build DNA, and if they are absent, yields shrink relatively quickly.
Unfortunately, of all the ingredients needed for plant growth, phosphate rock is the one with the smallest known reserves globally, and these are geographically concentrated in just a few places on earth. Currently, phosphorus, which is mined in the form of phosphate-bearing rock, has known reserves of about 16 billion metric tons worldwide (USGS), which represents about 100 years of current use. However, the “half-empty glass” problem is already becoming increasingly visible, as the phosphorus content of mined phosphate rock is decreasing rapidly, and prices are going up accordingly. Some people are talking about “peak phosphorus” within less than 30 years, but again, this is a moving target, depending on exploration and effort. What is clear is that phosphorus, once it becomes scarce, will become more, and potentially very, expensive to extract. 2008 provided a brief glimpse of what a world of limited phosphorus could look like. Shortly before the economic crisis hit, phosphate rock prices rose from $30-40 to $400 per metric ton within less than one year. With the crisis, they came back down to about $70-80, still twice as much as before the increase.
If we plan on extracting two or three times as much biomass from our soils, one would expect that we will have to increase phosphorus use. So in order to produce those 32.8 quadrillion BTU, we likely would need three times as much phosphorus as today, which will pull the end of easily accessible phosphorus closer, maybe as close as 10 or 20 years.
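The reserve-lifetime arithmetic is as follows; the 160-million-ton annual figure is implied by "about 100 years of current use", not stated directly, and the tighter 10-20-year estimates in the text reflect the easily accessible fraction rather than total reserves:

```python
# Phosphate reserve lifetime at current vs. tripled extraction rates.
reserves_mt = 16_000    # million metric tons of phosphate rock (USGS)
current_use_mt = 160    # million t/yr, implied by "~100 years of current use"

years_current = reserves_mt / current_use_mt         # ~100 years
years_tripled = reserves_mt / (current_use_mt * 3)   # ~33 years
print(f"{years_current:.0f} years at current use, "
      f"{years_tripled:.0f} years if use triples")
```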
This leads us to the conclusion that our current agricultural systems aren’t as “renewable” as they may seem.
After covering the entire system’s potential, we see only limited opportunity for any significant biomass output growth. However, the current plants used for ethanol production (in the U.S., the vast majority is corn) aren’t ideally optimized for inputs, which has led to discussion of replacing them with other plants with better growth profiles. Among them are a number of perennial grasses, like Miscanthus or switchgrass, which both produce cellulosic biomass. Sugar cane, the key energy crop used in Brazil, which produces high shares of carbohydrates, unfortunately doesn’t grow in most U.S. regions due to its need for high moisture levels and temperatures. Importantly, these grasses (like sugarcane) are capable of fixing nitrogen directly (via bacteria associated with their root systems), which greatly reduces nitrogen fertilizer needs. In addition, they produce – under ideal circumstances – biomass output close to the aforementioned 1 watt per square meter. The downside is that the produced biomass is cellulosic, which requires a much more difficult-to-manage fermentation (or gasification) process to turn it into a useful combustion fuel like ethanol or biodiesel. As yet, no meaningful and scalable processes with decent conversion efficiency have emerged. What works is burning cellulose for the generation of heat and electricity, which is the predominant use of wood residuals, but even there, conversion efficiency is low, particularly for electricity.
Within biomass for energy, we basically look at three types with different characteristics: lipids (oils), carbohydrates (sugars, starch, etc.) and cellulosic materials (wood). They differ in their chemical structure, and come from different parts of plants. As humans can’t digest cellulosic biomass (unlike cattle), our food mostly consists of lipids, carbohydrates and proteins; we utilize grazing animals to convert cellulose into usable biomass in the form of milk and meat. To understand the potential of biomass further, it is worthwhile looking at the potential of each plant category. We won’t go into too much detail, but do provide an overview. In the table below we show the characteristics of each group and their usefulness for energy crops.
Table 7 - Characteristics and feasibility of various crops.
The fact that cellulose is so hard to “digest” into something useful as a liquid fuel has led to a focus on biofuels that come either from lipids (biodiesel) or from carbohydrates (ethanol). The generation of cellulosic ethanol proves to be much more complex and energy-intensive than fermenting ethanol from carbohydrates, which is the key route taken today in the U.S. (mostly from corn) and Brazil (mostly from sugarcane, where the bagasse also provides the process heat from its cellulosic material).
There are several reasons that algae are seen as a promising future source of renewable energy, especially as a source of biofuels. First, algae grow at a high conversion efficiency (theoretically up to 4 W/m2), with a large share (up to 60% of total mass) of easy-to-process lipids and proteins. Unlike corn, sugar cane, and any energy crop that requires higher-quality arable land to grow, algae can minimize or avoid competition with food crops. Many plans involve using desert locations to maximize sunlight use. Also, algae can use wastewater and saline water inputs and can be fed with stationary sources of CO2, turning sources of waste (such as coal- or natural gas-fueled power plants) into energy and recycling carbon emissions (DOE 2010). The problem with algal fuels is that their production has never moved beyond small-scale trials (relative to consumption), so scalability remains an unresolved issue.
One might think that this is normal for a brand-new technology; however, it seems important to note that – like most other things discussed as alternatives today – the concept of turning algae into fuel is not a new one. Research began as far back as the 1950s, but did not gather true momentum until the first oil shock of the 1970s, when the U.S. government funded the largest research program related to algal biomass, which ran from 1978 to 1996 (the Aquatic Species Program). The results were disappointing, with open-pond systems often failing due to the starting strain of microalgae being overgrown by other, less energy-productive species that contaminated the pond. Open ponds, particularly in places with high insolation, also require massive amounts of water. A hectare-sized cultivation system can require 2 million liters of water (530,000 gallons) to fill and around 50,000 liters a day to replace evaporative losses (DOE 2010). Since evaporation removes only pure water (H2O), the pond cannot be endlessly replenished with wastewater, because this would slowly increase the concentration of toxins and other materials in the pond. Providing freshwater, on the other hand, is energy-intensive, particularly in the locations where insolation is high. These are some of the reasons why things haven’t worked out so far. None of the companies suggesting open-pond systems seem to have answers to this problem, and to us, it looks like one that doesn’t yet have a solution.
Closed-system approaches, where algae are grown in a sealed tank supplied with carbon dioxide and nutrients in a controlled fashion, have produced promising results in the lab, but have equally faced difficulties scaling up to a meaningful size. The key problems continue to be stability and long-term returns, plus the high cost of creating a clean and closed environment – comparable to a gigantic petri dish with a lid. The central issue with closed systems is that the only input we don't have to provide from outside the system is sunlight; everything else must be produced, supplied and managed directly. Producers must use inexpensive sources of nutrients, especially N, P, and K, to keep algal biofuels cost-competitive, yet higher-quality nutrient sources may compete directly with inputs into our agricultural system. Closed systems also require constant monitoring and precise application of nutrients: too little of an additive and the crop will fail to produce; too much may render the growing medium toxic – or lead to issues with wastewater disposal. Recycling the growth medium and wastewater from algal plants is possible, but expensive and energy-intensive. Usually, this doesn't bode well for EROIs, despite potentially higher yields and lower environmental impact.
These are the two approaches tried for algal photosynthesis. As with any yet unproven technology, promises are high, but so are the risks – and most seem insurmountable to us, or surmountable only at the cost of large energy inputs to maintain stability in a scaled-up facility. One other use of algae is not the creation of new biomass (autotrophic), but a heterotrophic route in which algae do not need sunlight and instead are fed carbohydrates (from plants), converting them into lipids and thus creating an almost fuel-grade input. While this might actually increase the conversion efficiency of – say – corn to a biodiesel product, it does not change the primary requirement for growing more biomass on land. It is important not to confuse the two approaches, as the heterotrophic one does not overcome the limits of growing biomass on land, and it will suffer from similar problems with scalability.
IIER’s European offices are in Switzerland, a country that has abandoned most landfills and instead burns more than 95% of its non-recyclable flammable waste (including outputs from wastewater treatment) to generate energy or process heat. Overall process efficiency for combined heat and electricity generation (for which about 70% of total waste is used) is approximately 42% (source in German), which includes the energy for internal use by the waste power plants.
Using Switzerland as an example: of the original energy content, 35% can be extracted as energy for external use (24% as heat, 11% as electricity), and about 6% is used for the operation of processing facilities. Technology feasibility studies show that net electricity production might be increased slightly, to perhaps 15% of total energy content. What is important to know is that these waste plants work best and most efficiently when producing base load at a high load factor of close to 90-100%. They are – like other plants burning biomass – not really capable of modulating easily below a 60% level, as the fuel is highly heterogeneous and needs a constant level of heat to burn properly.
Applying today’s Swiss technology to the U.S. and converting 232 million metric tons of waste to energy at the above rates would deliver approximately 119 TWh (0.4 quads) of base-load electricity (at 15% efficiency) – no more than 3% of today’s electricity consumption – and about 0.65 quads of usable heat energy (at 25% efficiency).
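The arithmetic behind these figures can be reproduced as follows. This is a sketch; the conversion factor (1 quad = 293.07 TWh) and the implied heating-value sanity check are our additions, not figures from the text:

```python
# Reproducing the U.S. waste-to-energy numbers from the stated assumptions.
QUAD_TWH = 293.07               # 1 quad expressed in TWh (standard conversion)
WASTE_TONNES = 232e6            # U.S. waste stream, metric tons per year
ELEC_SHARE, HEAT_SHARE = 0.15, 0.25  # shares of the waste's total energy content

elec_twh = 119.0                          # electricity output from the text
total_energy_twh = elec_twh / ELEC_SHARE  # implied energy content of the waste
elec_quads = elec_twh / QUAD_TWH                       # ~0.41 quads electricity
heat_quads = total_energy_twh * HEAT_SHARE / QUAD_TWH  # ~0.68 quads heat

# Sanity check: implied heating value of the waste, in MJ/kg (1 TWh = 3.6e9 MJ)
mj_per_kg = total_energy_twh * 3.6e9 / (WASTE_TONNES * 1000)

print(f"Electricity: {elec_quads:.2f} quads, heat: {heat_quads:.2f} quads")
print(f"Implied heating value: {mj_per_kg:.1f} MJ/kg")
```

The heat figure comes out slightly above the ~0.65 quads quoted, presumably due to rounding in the intermediate values; the implied heating value of roughly 12 MJ/kg is a plausible number for municipal solid waste.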
Based on all the above assumptions, we have tried to establish our own best case for future biomass extraction. We assume a significant realignment of land toward energy crops (or algae), a steady doubling of outputs from some of these switches, and a much more efficient energy conversion process compared to today’s best-in-class technologies. Our model operates with the following assumptions, some of which would require hard choices by people, e.g. a reduction in the consumption of animal protein: - Reassignment of all cropland currently used for biofuels (mostly corn) to a better biomass-producing crop, doubling the output per acre
Table 8: gross and net increases in biomass production
With these rather radical steps, we estimate today’s gross biomass energy extraction could be increased from 3.4 quads to 9.8 quads. However, as this energy is in “low-quality” form, converting it to useful inputs (like combustion fuels or methane) will likely yield only about 40% of that gross amount, i.e. 3.9 quads in this scenario.
The above 9.8 quads (gross) or 3.9 quads (net) per annum is the most we can expect as biomass output. This does not take into account the energy and material inputs required for production. In a future scenario where farms are self-sustaining, these inputs would have to come from biomass once again, reducing net available energy even further; the 9.8 or 3.9 quads are only available as long as fossil fuels remain the main input into agricultural production.
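The gross-to-net step, set against total U.S. primary energy consumption, can be summarized in a few lines. The ~98 quads/year figure for U.S. primary energy is our added reference value (roughly the level around the time of writing), not a number from the text:

```python
gross_best_case = 9.8    # quads/year, gross extraction in the scenario above
conversion_eff = 0.40    # low-quality biomass -> useful fuels (text's estimate)
us_primary = 98.0        # quads/year, U.S. primary energy (our assumption)

net_best_case = gross_best_case * conversion_eff   # ~3.9 quads
print(f"Net useful energy: {net_best_case:.1f} quads "
      f"({net_best_case / us_primary:.0%} of primary consumption)")
```

Even before subtracting the agricultural inputs discussed above, the best-case net yield amounts to only about 4% of U.S. primary energy consumption.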
Biomass (and waste) has been repeatedly cited as a source of flexible electricity generation. As described in the section about waste, directly burned biomass is mostly capable of providing base-load power only, as the flexibility of burning wood at relatively low temperatures is very limited. In addition, a storage problem makes it unrealistic to keep biomass as a flexible source for (long-term) bridging of gaps in other renewable production, due to the low energy density of cellulosic biomass compared to its fossil competitors.
One feasible alternative is the conversion of biomass to methane. However, this two-step process (fermentation first, then combustion) has a low overall conversion efficiency, turning an already small amount of energy in biomass into an even smaller amount of electricity.
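To illustrate why chaining the two steps matters, here is a sketch with purely illustrative stage efficiencies – the text gives no figures, so both numbers below are our rough assumptions:

```python
fermentation_eff = 0.55  # share of biomass energy captured as methane (assumed)
generation_eff = 0.35    # methane -> electricity in an engine/turbine (assumed)

# Losses multiply: each stage keeps only a fraction of the previous one
overall = fermentation_eff * generation_eff
print(f"Overall biomass-to-electricity efficiency: {overall:.0%}")
```

Under these assumptions, roughly four-fifths of the original biomass energy is lost before any electricity reaches the grid – the point being that multiplying two modest efficiencies always yields a much smaller one.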
In this part of our series, we have revisited the potential of biomass as a contributor to future human energy systems. Until now, we have only used biomass as a by-product of other uses and/or in subsidized ways (both by money and by fossil fuels), at very small scales compared with total energy consumption. Increasing biomass production will be problematic in many ways – not just in the conversion from biomass to fuel, as with cellulose, but even more so in generating enough biomass in the first place, where we think it far-fetched to produce much more than we do today. Overall, we come to the following conclusions:
So even if we can double current extraction of biomass, this would represent only approximately 6% of total U.S. primary energy consumption, and will likely never be enough to serve as a meaningful stabilizer in future energy systems.