5 Comments

Great read!

Wondering how one could apply this cost-reduction driver framework to electrodialysis-based ocean capture CDR?

Thanks, Sanat. For a direct ocean capture scheme like the one you're describing, there could be exogenous cost decreases via falling costs of electrolyzers/precious metals, renewable energy, labor, and general plant construction. Economies of scale could be achieved by scaling processing equipment, such as pumps, as much as possible to take advantage of sublinear scaling factors, purchasing feedstocks and equipment in bulk, and distributing fixed plant labor costs over a larger production volume.
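
To make the sublinear-scaling point a bit more concrete, here is a minimal sketch of the standard power-law scaling rule for process equipment; the base cost, capacities, and 0.6 exponent below are purely illustrative assumptions rather than numbers for any real ocean capture plant.

```python
def scaled_equipment_cost(base_cost, base_capacity, new_capacity, exponent=0.6):
    """Power-law equipment cost scaling: cost grows sublinearly with capacity."""
    return base_cost * (new_capacity / base_capacity) ** exponent

# Illustrative numbers only: a $1M pump system scaled from 100 to 400 m3/h
# with the classic ~0.6 exponent costs ~$2.3M rather than $4M.
print(scaled_equipment_cost(1_000_000, 100, 400))
```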

LBR could occur through finding new catalysts, plant designs, additives, or other methods that allow for more carbon removal with the same or cheaper materials and equipment. Learning-by-deployment could happen in many ways, such as finding better plant designs or locations through trial and error, or interacting with desalination plants to learn best practices for saltwater processing.

Got it. Loved your paper on adapting TLCs to CCU, and how a system-element approach (a more granular way of calculating TLCs) gives a greater reduction in plant CAPEX.

As I understand it, this could be immensely useful for investors, corporate buyers & policy-makers looking to optimize their funding & grant portfolios.

Quick question: is it advisable to use this TLC approach to prioritize certain CDR methods over others in the short term (next 3-5 years)?

Technology learning curves are often of most interest in the academic literature and are most accurate when looking at cost declines retrospectively. Calculating these curves and their corresponding learning rates enables comparisons of different technologies, helping us tease out the factors that drive costs down as cumulative production increases. Some of the general factors are discussed in this post in the LBD section. While this analysis can help us make some very rough projections and even come up with some high-level likelihoods (such as "modular approaches will feature faster cost declines"), I'm not quite sure that we should put too much emphasis on them alone when allocating significant amounts of funding.
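
For anyone curious what the retrospective calculation looks like, here is a minimal sketch of fitting a single-factor learning curve (Wright's law) and reporting the implied learning rate; the cost and cumulative-production series are made up for illustration, and a single-factor fit deliberately lumps all of the drivers discussed in the post together.

```python
import numpy as np

# Single-factor learning curve (Wright's law): cost = c0 * cumulative_production**b.
# Fit b from historical data, then report the learning rate (cost drop per doubling).
cumulative = np.array([1, 2, 4, 8, 16, 32])   # illustrative cumulative units produced
cost = np.array([100, 88, 77, 68, 60, 53])    # illustrative unit costs

b, log_c0 = np.polyfit(np.log(cumulative), np.log(cost), 1)
learning_rate = 1 - 2 ** b                    # b is negative for a declining cost curve
print(f"learning rate ~ {learning_rate:.0%} per doubling of cumulative production")
```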

In my opinion, it is ideal if you can build a very detailed, bottom-up cost analysis of an emerging technology and then work with a team of skilled and experienced engineers and others to figure out specific ways in which that technology could potentially come down in cost with increasing production. The effects of projected learning, in conjunction with exogenous changes, anticipated research breakthroughs, and scale effects, can then be analyzed using the model. While this can't be perfect—we can't predict what we will learn a priori as we don't know what we don't know yet—and some of these drivers are interrelated, this is a higher-resolution approach that can offer more specificity in terms of what exactly we should be funding.
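
As a very rough illustration of that kind of driver-by-driver, bottom-up projection, here is a toy sketch; every line item, driver assignment, and decline rate is a placeholder assumption, and a real model would break the system into far more granular elements.

```python
# Toy bottom-up projection: each cost line item carries its own decline assumption,
# so exogenous trends, learning, and scale effects can be varied independently.
line_items = {
    # name: (current $/tCO2, assumed annual fractional decline from its dominant driver)
    "electricity":      (45.0, 0.05),  # exogenous: cheaper renewables
    "electrode stacks": (30.0, 0.08),  # learning-by-research / deployment
    "pumps & BoP":      (20.0, 0.03),  # economies of scale
    "fixed labor":      (15.0, 0.04),  # spread over a larger production volume
}

def projected_cost(items, years):
    return sum(cost * (1 - rate) ** years for cost, rate in items.values())

print(f"today: ${projected_cost(line_items, 0):.0f}/tCO2, "
      f"after 10 years: ${projected_cost(line_items, 10):.0f}/tCO2")
```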

Thanks for the explanation, Grant!

Very helpful.
