Harold Urey’s paleothermometer and the nature of proxy measurement — Extinct



In addition, proxy methods are developed in ways that piggyback on significant dependencies in nature. There is no in-principle reason why measurements must rely on significant dependencies between the target phenomenon and a measurement output. One could use global pCO₂ averages over time as a measure of how much CO₂ a particular country is emitting, just so long as the contributions of all other countries could somehow be tightly constrained. But this would be difficult and bound to introduce uncertainties; so proxy methods are grounded in what are thought to be significant causal relationships. Urey’s insight regarding the oxygen paleothermometer involved noticing that temperature is a significant influence on oxygen isotope fractionation, such that the variability of oxygen isotope fractionation in a given context is largely a function of temperature (relative to known confounds). (See Wilson and Boudinot (2022) for a more technical discussion of causal significance.)
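Urey’s qualitative insight was later turned into empirical calibrations relating carbonate oxygen isotope composition to growth temperature. A minimal sketch in the spirit of the classic quadratic calibrations follows; the coefficients are those commonly attributed to the Epstein et al. form, but published calibrations differ, and the seawater term is precisely where confounds like ice volume enter:

```python
def paleotemperature_c(delta_carbonate, delta_seawater=0.0):
    """Estimate calcification temperature (deg C) from carbonate d18O.

    Quadratic calibration in the style of the classic Epstein et al.
    carbonate paleotemperature scale; the coefficients here are
    illustrative, and published calibrations vary.

    delta_carbonate: d18O of the shell carbonate (per mil)
    delta_seawater:  d18O of the water the shell grew in (per mil);
                     uncertainty in this term is where the ice-volume
                     confound enters the measurement model.
    """
    d = delta_carbonate - delta_seawater
    return 16.5 - 4.3 * d + 0.14 * d ** 2

# A shell enriched in 18O relative to seawater implies colder water:
print(paleotemperature_c(1.0))  # cooler than the baseline below
print(paleotemperature_c(0.0))  # baseline: 16.5 deg C
```

Note how the second argument makes the confound explicit: two shells with identical isotopic ratios imply different temperatures if they grew in isotopically different seawater.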

All proxy methods carry a certain cost stemming from the inability to directly control confounding factors. Still, the reliability of a given proxy doesn’t ultimately depend on the method of control. Measurement reliability, proxy or otherwise, is a matter of how well confounds are controlled. As such, proxy measurements are not in principle less reliable than non-proxy measures, even if they may require more sophisticated and varied strategies for controlling confounds. Indeed, the oxygen paleothermometer has continued to be refined over the years, long after the initial worry over ice volume, with novel confounds being discovered and incorporated into the measurement model (e.g., isotopic variation with shell size, foraminiferal lifecycle, and ocean pH levels). As a result, the oxygen isotope paleothermometer has become one of the most reliable and commonly used proxy measures in paleoclimatology.

* * *

I will conclude by considering one last limitation on the use of historical proxy measures. Some of the most concerning challenges for interpreting proxy measures emerge from their temporal resolution. Aja Watkins, on this very blog, grapples with the philosophical problem of how to settle on rates derived from proxy measurements, how to compare them with rates derived from modern instruments, and whether there are even such things as “real” rates.

I will restrict the rest of my discussion to one specific rate-related issue: the rate at which proxy records accumulate, and thus the temporal resolution of the target measure. This is often very low relative to climate phenomena we observe in the present (the Vostok Core averaged ~1.4 cm/year as it accumulated, while Hays’ continuous ocean sediment cores averaged ~3 cm/kyr). So, even if we could decide what rate(s) to ascribe to existing proxy records, many of these records would underdetermine the climatologically significant processes that we know to occur on shorter timescales. Individual historical proxy records can, at best, average the known climatological variance occurring over shorter timescales. (To be clear, non-proxy measures experience the same kind of temporal underdetermination: a standard mercury thermometer requires some number of seconds to respond to local temperature changes, and so cannot tell us about variance on the nanosecond scale. For climate purposes, however, it turns out that the behavior of the global climate over time can for most purposes be adequately represented in terms of seconds or longer time units. This is not the case for the timescales captured by historical proxy measures.)
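The arithmetic behind these resolution limits is straightforward: a measured slice of fixed physical thickness averages more years of climate signal the slower the archive accumulates. A rough sketch, where the 1 cm sample thickness is an illustrative assumption rather than a figure from any particular study:

```python
def years_per_sample(accumulation_cm_per_kyr, sample_thickness_cm):
    """Years of climate signal averaged into one physical sample.

    accumulation_cm_per_kyr: how fast the archive accumulated (cm per 1000 yr)
    sample_thickness_cm:     thickness of the slice actually measured
    """
    return sample_thickness_cm / accumulation_cm_per_kyr * 1000.0

# Vostok ice (~1.4 cm/yr, i.e. 1400 cm/kyr) vs. a slow deep-sea
# sediment core (~3 cm/kyr), each sampled as a 1 cm slice:
print(years_per_sample(1400, 1.0))  # well under a year per sample
print(years_per_sample(3, 1.0))     # roughly 333 years per sample
```

On these illustrative numbers, a single centimeter of slow ocean sediment blends several centuries of temperature signal into one data point, which is the underdetermination at issue.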

The problem is a general one. Tree rings capture seasonal temperature variation in the growth patterns of their rings, but fail to capture temperature variance occurring at daily or weekly intervals. Ice cores like Vostok can exhibit sufficient resolution for annual temperature averages but obscure intra-annual seasonality in deeper sections. Our oldest continuous ocean sediment cores are resolved closer to millennial timescales, and so average together several hundred years of temperature signal. What empirical constraints a proxy measure is capable of providing will be a function not only of the amount of time represented in the record and our ability to vicariously control confounds, but also the temporal resolution of the record.

There are a couple of things we can say about how historical proxy users work with such constraints. First, proxy measures of differing resolutions will be particularly suited to assessing hypotheses at differing timescales. Deep-sea sediment cores accumulate slowly, which makes them more suitable for assessing variance on the order of 10–100 kyr, like the periodicity of Earth’s orbit around the sun. On the other hand, tree rings grow and coastal sediments accumulate relatively quickly, making them suitable for tracking more recent variation in the El Niño–Southern Oscillation (on the order of 4–10-year cycles). In fact, studying climate change rarely requires anything more fine-grained than annual temporal resolution, so we shouldn’t worry about the lack of an hourly paleothermometer. Underdetermination need not be a problem so long as proxies are used for temporally appropriate purposes.
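This matching of proxy resolution to hypothesis timescale can be put crudely in sampling-theoretic terms: a periodic signal needs at least two samples per cycle to register in a record at all. A minimal Nyquist-style sketch, with illustrative numbers not drawn from any particular core:

```python
def can_resolve(sample_spacing_yr, cycle_period_yr):
    """Nyquist-style check: a cycle is only detectable in a record
    if its period spans at least two consecutive samples."""
    return cycle_period_yr >= 2 * sample_spacing_yr

# A sediment core resolved at ~333 yr per sample, tested against two
# target periodicities: the ~41 kyr orbital obliquity cycle and an
# ENSO-scale cycle of a few years.
print(can_resolve(333, 41_000))  # orbital timescale: detectable
print(can_resolve(333, 5))       # ENSO timescale: averaged away
```

The check is deliberately crude (real spectral analysis of proxy records also contends with uneven sampling and dating uncertainty), but it captures why the same core can constrain orbital hypotheses while remaining silent on interannual ones.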

Second, individual proxy records best contribute to our understanding of the more complex earth system in the context of other proxy measures and independent background theory. Climate simulation models, for example, can provide a useful venue for the integration of empirical observations and relevant dynamic principles into a more complex and coherent vision of the past. Wendy Parker (2017) argues that climate simulation models can even play an important role in facilitating measurement practices. In this more interdependent empirical context, proxy measures of differing temporal resolution can provide distinct empirical constraints on the model’s behavior. Thus, while a single proxy method will rarely provide a richly detailed image of the past on its own, it provides crucial empirical constraints, which work alongside our other epistemic considerations to produce a more richly detailed understanding of the past.

So while it may be common to speak of traces in the historical record as providing a kind of “snapshot” of the past, it would be a mistake to import the temporal precision of a typical photograph into the analogy. Instead, it would be better to understand the analogical photograph as the product of a longer exposure, no longer depicting so “snappy” a moment in time. The lines and shapes of the photograph may thus blend and blur, capturing the motion within the frame better than the boundaries of the subjects themselves. Yet the trained eye may still be capable of interpreting the patterns. In developing such a long-exposure tool, Urey and colleagues provided an important way to interpret these motions of the past.


