Sunday, July 28, 2013

An Architect Explains Diminishing Returns

The concept of diminishing returns is both useful and fundamental to any discussion of the predicament of modern civilization. The difficulty is that it is an abstract concept, and thus hard for a lot of people to grasp. I'm going to try to rectify that by using two examples from architecture - building insulation and window areas.

One of my colleagues emailed an article with the following title: Diminishing returns on investment in R-Values. Obviously, the mention of diminishing returns made my ears perk up. The article gives a good, concrete example of diminishing returns to an activity - in this case, adding more insulation to a roof. There is a reason why you see 6 to 8 inches of batt or 2 to 3 inches of rigid insulation on modern buildings rather than 12 or 18 inches of either. Surely more is always better, isn't it?
Note how much benefit the leftmost dot on the chart represents, then how little is added by the rightmost dot. Project this trend into the future and you have diminishing returns. Eventually the curve flatlines.

After a certain point, adding more insulation yields less and less insulating value for the cost. And there are other tradeoffs: the cost of the additional insulation is the most obvious, but also the thickness of the wall, the complexity of construction, thicker foundations, and so on.

Because the heat flow rate (U-value) is the reciprocal of the R-value, going further out on the curve nets you less and less benefit. After a certain point, the increasingly meager gains are offset by the tradeoffs, and can even enter negative territory. From the article:
The R-value (or R-factor) is a measure of a material’s resistance to heat loss. It is the reciprocal of the U-value (or U-factor), which measures the rate of heat flow through a building material. Most often, design teams are concerned with the heat flow rate through a material or building assembly (i.e., the U-value). In this regard, the use of R-value can be misleading because of the reciprocal function.

R-values have become to the building owner what MPGs have become to the car owner: A largely understood metric that perpetuates a skewed understanding of energy economy. Mike Allen made this observation with MPGs a few years ago (October 2009) in Popular Mechanics, and he suggested that the car industry use “gallons per mile,” the reciprocal of MPGs (miles per gallon). Allen recognized that differences in MPG ratings do not reflect the actual differences in fuel consumption the way that differences in gallons per mile ratings would. The same is true for R-values as compared to U-values. However, the U-value is not as intuitive as the R-value. We understand that greater R-value correlates to better insulation. Conversely, a U-value of U-0.050 (that is, R-20) is difficult to put in perspective.
Diminishing Return on Investment for Increasing R-Values (Environmental Design and Construction)

As a result of the reciprocal function between R-value and heat flow rate, every additional dollar invested in increasing a construction assembly’s R-value becomes less effective after a certain point.
We can see from the above that adding more and more insulation to our building gets us less and less benefit. This is critical! Obviously an insulated building performs better than an uninsulated one, and we should insulate buildings. Superinsulated buildings, which use larger amounts of insulation and detailing to minimize cold bridging, are also a good idea, as are Passive Houses, which attempt to maintain a stable indoor temperature with a minimum of mechanical intervention. After a certain point, however, more is not better. If it were, every inch of insulation would add an equal benefit, and we could just add 10, 12, or 14 inches of insulation and get double the benefit of 5, 6, or 7 inches. But it doesn't work that way. And the cost and constructability tradeoffs of such walls make adding more insulation infeasible after a certain point. That's why you don't see such walls, and you never will.

You get the most benefit from the first inch of insulation! From there, each additional inch gets you less and less:
Contrary to intuition, doubling the insulation investment to R-40 does not double the effective reduction in heat flow rate. Rather, for twice the money ($376,000), eight inches of XPS insulation (R-40) will only reduce the heat flow by an additional 50 percent compared to the R-20 code minimum. In fact, as the accompanying chart illustrates, every additional “doubling” of the insulation investment will only yield a 50 percent additional reduction in heat flow rate.
This is not to say that we should not strive for higher R-values...
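The arithmetic behind this is easy to check. Here is a minimal sketch of the reciprocal relationship; the R-value-per-inch figure is an assumption (roughly that of rigid foam insulation), used purely for illustration:

```python
# Why R-value gains diminish: the heat flow rate (U) is the reciprocal
# of the thermal resistance (R), so each added increment of R removes
# a smaller slice of an already-reduced heat flow.
R_PER_INCH = 5.0  # assumed R-value per inch of rigid insulation

def u_value(inches):
    """Heat flow rate for a given insulation thickness: U = 1/R."""
    return 1.0 / (R_PER_INCH * inches)

# The marginal benefit of each additional inch keeps shrinking:
marginal = [u_value(i) - u_value(i + 1) for i in range(1, 8)]

# Doubling R-20 (4 in) to R-40 (8 in) only halves the heat flow --
# the "50 percent additional reduction" per doubling described above:
ratio = u_value(8) / u_value(4)  # = 0.5
```

Each doubling of R costs roughly twice as much material but removes only half of the remaining (already small) heat flow - which is the whole story of the chart above in three lines of arithmetic.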
A similar case is that of window area. Obviously, we want windows on our buildings to let in light and air. But glass is nowhere near as good an insulator as an opaque wall insulated as described above. Glazing systems have little ability to control heat flow and radiation across the building assembly. So how much glass is too much? How much glazing is the optimal amount to maximize daylighting while minimizing unwanted heat gain or loss?

In the past, windows were not very thermally efficient. A series of technical innovations, from insulated (multi-pane) glass to thermally-broken frames, has dramatically increased the R-value of window assemblies. This has permitted architects to clad entire buildings in glass, which is hardly a good idea in most cases! In fact, many so-called "green" buildings are clad in glass with R-values as low as 2. And although there are glazing systems with performance as high as R-12, these are much more complex and expensive than even a relatively simple opaque wall with insulation.

John Straube of the University of Waterloo and Building Science has done extensive research in this regard. He writes:
Normal thermally-broken commercial aluminum windows and curtainwalls typically have U-values of about 0.5, corresponding to an R-value of two. By using the best available low-e coated, argon-filled double-glazed units (with a center-of-glass R-value of four) in thermally-broken aluminum frames, overall window R-values of three can be approached. If the goal is low-energy buildings, why cover large portions of any building with such a low R-value system, particularly in cold climates?...It is difficult to understand how such a choice of exterior wall could be considered a “green” system, when a simple low-cost wall with only an inch of continuous rigid insulation will provide significantly better control of heat flow.
BSI-006: Can Highly Glazed Building Façades Be Green? (Building Science)

So the ideal ratio of glass to wall is certainly lower than 100 percent! The additional glass conveys no real benefit, but introduces a substantial thermal penalty, which must be made up by extensive and complex mechanical systems like air conditioning and heating. Straube continues:
But the thermal qualities are only part of the story. Glazing lets light in. That, after all, is the primary reason we use glazing. The solar heat gain that results is the reason many buildings require air-conditioning. The size of a building’s air-conditioning plant is almost always defined by the glazed area: more glazing means more chillers, ducts, coils, and fans. ... It is a testament to the miracles of modern glazing (which uses low-e, low emissivity coatings to selectively allow more visible light than infrared heat radiation), that many buildings can have such large window areas and remain comfortable in the summer. Nevertheless, even very good commercial clear glazing still allows about one-third of the sun’s heat to enter.
Furthermore:
It is true that daylight can offset the need for electric lighting and provide a psychologically healthy connection to the outdoors, but one doesn’t need floor-to-ceiling glass for that. In most occupancies and building types, there is no benefit to vision glass installed at floor levels (unless the occupants spend much of their time lying on the floor near the window), but there is a substantial energy penalty. Good daylighting design can reap all the benefits of glazing using vision glass covering less than half the enclosure. Numerous studies have shown there are no daylighting or energy benefits with window-to-wall ratios over 60 percent, and in most cases an area of between 25 and 40 percent is optimum (that is, lowest energy consumption). Even at these ratios, windows in a low-energy building should generally be high performance (triple-glazed in cold climates), with large thermal breaks (over 1/2" thick) and some form of exterior shading (preferably operable).
So from the above, we can see that glass much above sixty percent of the wall area is probably wasted, requiring more complex and energy-intensive mechanical systems to keep the building comfortable, with no real payoff in terms of daylighting. So why do we see so many all-glass buildings? Every building is unique, and these ratios should be calculated on a case-by-case basis, but it's clear that adding glazing beyond this window-to-wall ratio brings diminishing returns - no further benefits are gained from the glazing, but significant losses in performance are accrued. It's worth noting that many older buildings, which lacked the sophisticated and energy-intensive mechanical systems of today, had window-to-wall ratios in this range.

In fact, the history of window glazing itself provides another object lesson. In the past, glass was typically very thin - about 1/4 inch - and could only be made in small panes, which limited its use. Then a series of innovations made it more thermally efficient. The first was double-paned glass: two sheets of glass separated by a dead air space. Then came thermally-broken frames, in which the frames holding the glass were separated into inner and outer parts connected by a low-conductivity material such as neoprene. Later, the spacers for double-paned glass followed this technique as well. Then came a series of sophisticated low-emissivity (low-e) coatings that allowed visible light to pass through while blocking the wavelengths that add heat (sort of a reverse greenhouse effect). Later developments included adding additional panes of glass, and filling the space between the panes with an inert gas like argon instead of air.

U-values for glass (COG = center of glazing). Each additional development yields less and less improvement over the original.
However, if you were to plot all these innovations out, you would see something interesting. The very first development - double-paned glass - provided the biggest boost to thermal performance of all. Additional developments yielded better performance, true, but none got you as big a "bang for the buck" as insulated glass - not low-e coatings, not thermally-broken frames, not the addition of another pane of glass. In fact, plotted over time, every new development yields less and less improvement over the one before it! The cumulative effect of all these developments gets you a window performing around 90 percent better than our initial 1/4" pane of glass. That's great, but does anyone expect another 90 percent improvement? Now we're spending massive amounts on research and development just to get a few percentage points of increase. And adding additional panes of glass won't do it - each pane gets you less and less benefit, as described by the reciprocal function above, while adding cost for the glass, the larger frame, and so on. This is why you don't see more than triple-paned glass, and never will.
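The shrinking payoff per pane can be sketched with a simple series-resistance model. The resistance values here are illustrative assumptions, not measured data - the point is the shape of the curve, not the numbers:

```python
# Toy model: treat each added pane + air gap as adding a roughly
# constant thermal resistance. Total R grows linearly, so U = 1/R
# improves by less and less with each added layer.
R_SINGLE = 1.0     # assumed R of a single pane with its air films
R_PER_LAYER = 1.0  # assumed R added by each extra pane + gap

def window_u(panes):
    """U-value of a window with the given number of panes."""
    return 1.0 / (R_SINGLE + R_PER_LAYER * (panes - 1))

# Improvement in U from adding one more pane, starting at 1, 2, 3 panes:
gains = [window_u(p) - window_u(p + 1) for p in (1, 2, 3)]
# Going from 1 to 2 panes saves the most; 3 to 4 saves only a sliver.
```

Under these assumptions the second pane cuts U by 0.5, the third by about 0.17, and the fourth by about 0.08 - which is why the marginal pane stops paying for its glass and frame somewhere around triple glazing.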

The very first steam engines were very inefficient - probably ten percent or less of the coal's energy went into doing useful work; the rest was lost to waste heat, noise and vibration. Thus, in the early days of a discovery there is lots of room for improvement. But the laws of thermodynamics set an upper limit on efficiency. The closer we get to that limit, the harder it gets to squeeze out further improvements. That doesn't mean further improvements are not possible or desirable; they are. It does mean, however, that big "breakthroughs" become less and less likely. Thus, over time, innovation doesn't speed up; it slows down. And it does so naturally, because the "big" developments are done first - the low-hanging fruit, so to speak. So when those engines become, say, ninety percent efficient, we can expect smaller incremental gains than we did decades earlier, when such engines were very inefficient. We can't double efficiency forever; it's physically impossible.
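The thermodynamic ceiling makes this concrete. As a toy model (an assumption for illustration, not actual engine history), suppose each engine generation manages to close half of the remaining gap to the Carnot limit:

```python
# Carnot efficiency sets a hard ceiling: eta_max = 1 - T_cold / T_hot.
# However clever the engineering, gains must shrink as the actual
# efficiency approaches that ceiling.
T_HOT, T_COLD = 800.0, 300.0     # kelvin; illustrative temperatures
carnot = 1.0 - T_COLD / T_HOT    # = 0.625, the upper limit

eff = 0.10                       # early engines: roughly 10 percent
gains = []
for generation in range(5):
    step = 0.5 * (carnot - eff)  # assume half the remaining gap closes
    gains.append(step)
    eff += step
# Each generation's gain is half the previous one; eff approaches
# but can never reach the Carnot limit.
```

The first "generation" in this sketch gains over 26 percentage points; by the fifth, the gain is under 2 - the big developments come first by the very nature of the limit.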

Anyone who has tried to get fit has experienced this first-hand. When you first start to exercise after being sedentary and out of shape, the weight flies off relatively quickly and you add muscle relatively easily. After a certain point, however, you hit a plateau and must work ever harder to gain muscle and lose fat. Eventually, you are spending hours in the gym just to lose a few pounds of the kind that flew off when you first started, or working intensely just to gain a pound of muscle. Bodybuilders and elite athletes spend hours in the gym for even incremental improvements at their level of fitness, and many even turn to drugs to gain additional muscle or lose extra fat, with negative health consequences.

One could say the same with regard to population. A certain amount and density of population is arguably beneficial for innovation and economic growth. But after a certain point, simply adding more and more people delivers no benefit, yet causes all sorts of serious problems - much like adding ever more glass to buildings. Many of the most culturally rich and innovative cities in history had fewer than a hundred thousand people, sometimes fewer than ten thousand.

Note this does not mean that there are no new innovations or discoveries to be made, or that we should not engage in research to find them. That is the straw man often trotted out whenever this point is made. What it does mean, however, is that we should stop anticipating some new "innovation" that will fix all our problems, create vast new industries, employ massive numbers of people, completely transform our economy, or the like. That is wishful thinking based on past history. There is an upper limit on what we can accomplish.

The first amazing breakthrough experiments of the eighteenth and nineteenth centuries were done by amateurs in drawing rooms, with equipment that could be found in the average secondary-school science lab. Many twentieth-century companies were famously started in garages. Now, multi-million-dollar research labs are required to find even the smallest new breakthroughs, and corporations must invest millions of dollars in research and development. As John Michael Greer pointed out:
The fusion research community, in effect, is being held hostage by the myth of progress. I’ve come to think that a great deal of contemporary science is caught in the same bind.  By and large, the research programs that get funding and prestige are those that carry forward existing agendas, and the law of diminishing returns—which applies to scientific research as it does to all other human activities—means that the longer an existing agenda has been pursued, the fewer useful discoveries are likely to be made by pursuing it further.  Yet the myth of progress has no place for the law of diminishing returns; in terms of the myth, every step along the forward march of progress must lead to another step, and that to another still. 
Held Hostage by Progress (The Archdruid Report)

Plus, earlier research focused on the really big problems. Once those are solved, the new "problems" are often little more than inconveniences - such as the refrigerator that cools room-temperature drinks in five minutes. The benefits of this feature are negligible while the costs are substantial; such a refrigerator retails for many times the price of a conventional one. As Robert Gordon pointed out, indoor plumbing, sanitation, refrigeration, electrification, instantaneous communication and universal personal mobility were much bigger and more transformative than online photo albums. And things like universal education and women entering the workforce can only be done once.

So this is the critical takeaway: diminishing returns apply to everything, including technology, innovation, society, and economic growth. There is a point at which more is not better, yet we seem unable to recognize it. As certain forward-thinking economists have pointed out, economic growth has consequences, and those consequences - pollution, inequality, fragility, diminished quality of life - can potentially outweigh the benefits. Many innovations are simply attempts to solve problems caused by earlier innovations: cancer treatments, large-scale geoengineering projects, and so on. Others are desperate attempts to maintain the status quo, such as the above-mentioned fusion power, electric cars, vertical farms and the like. The true net benefit of further innovation is increasingly questionable. Innovation is less about solving problems than about generating the required profits for holders of capital, who expect ever-increasing returns on their investment. But as we see from the above, this is not possible forever.

So, armed with this understanding, we know that the future cannot be like the past. Despite what the mainstream media preaches, we cannot wait, Beckett-like, for some innovation to fix our escalating problems. And we cannot expect the high-growth society we've come to expect and rely upon - and have built all our institutions around - to continue. That time has passed.

3 comments:

  1. The point of diminishing returns is an important one, but like your examples show, it mostly deals with new, better iterations of existing technology. It doesn't necessarily mean that we won't see dramatic technological change in the future, just that the dramatic changes are likely to come from new, under-explored areas of technology, not from evolutionary improvements in old technology. For example, new areas of technology, like synthetic biology, might not have yet gotten to that first, key, "1 inch of insulation" or "first steam engine" step that's the biggest single improvement.

    ReplyDelete
  2. That's true. I think biology is one of the last untapped frontiers - DNA is still a relatively recent discovery. I also think there's a lot more to do with digital technology. These are probably closer to the left-hand side of the graph.

    But here's the key point - these alone won't provide the growth rates we're counting on to "rescue" us from all our mounting social problems. And they don't employ a whole lot of people. In the past, entire new industries absorbed displaced workers. Digital technology, for one, promises to do the opposite - eliminate the need for workers across all job categories simultaneously. And biotech promises to be very dangerous in the wrong hands. From the one percent engineering their offspring for perceived "superiority" to disgruntled elements engineering a supervirus, it's innovation we may not survive.

    What that means, in turn, is we have to start thinking about some very basic ways our society is organized if we want to survive. Instead, we're engaging in mass denial and make-believe.

    ReplyDelete
  3. I certainly don't think that we should assume that technology is going to just solve all of our problems without any need for social reform.

    That said, I think you're talking about two different scenarios here. It's possible that innovation will slow down because all the low-hanging fruit has been picked (or for some other reason, like poor resource allocation on our part), and that this will lead to a dramatically slower rate of economic growth and to stagnation. Some economists are worried about that scenario. It's also possible that we will have rapid innovation in automation and computer technology that will lead to mass unemployment as digital technology replaces workers, and that advances in biotech will pose serious threats if misused. Those scenarios seem mutually exclusive, though; even if innovation in other areas isn't very significant, that rate of change in computers and biotech would lead to rapid and seriously disruptive economic growth and change, not to stagnation. Either one of those technologies on its own would likely be as disruptive as electrification.

    Either one of those two scenarios is possible, and they're both worth worrying about, but they can't both happen; I think you can't get both stagnation and massive disruptions in employment from innovation in digital technology and biotechnology at the same time.

    ReplyDelete