I read a paper today (actually, more like an essay) by Peter Wangersky, a longtime chemical oceanographer. Titled “Methods of sampling and analysis and our concepts of ocean dynamics,” it is essentially a personable ramble through six decades of marine science, reflecting on how technical capabilities and sampling methods changed over time, and how they shaped the assumptions that were made and the questions that were posed—in essence, the working mental picture we have of the ocean. Things have indeed changed a lot since he began during World War II. The paper is full of quotable nuggets:
Adding machines did exist, and some could even be made to multiply, after a fashion. They were strictly mechanical, however, and the wear and tear on various joints, machine and human, coupled with the high frequency of random inputs, discouraged us from the use of any but the most necessary statistical tools. Perhaps this is why most analytical chemists of this vintage believe that if one does the analyses right, there’s no need for statistics. This is a self-reinforcing fallacy; if you don’t do the statistics, you never discover your limitations or the limitations of your methods and the universe you are sampling.
He also talks about the analysis of water samples for salinity, in the days before CTDs could measure it in situ via electrical conductivity:
A group of female technicians, the salinity girls, ran the Mohr-Knudsen silver nitrate titrations to a chromate endpoint for eight hours a day. Needless to say, there was a considerable turnover in this group. There was always a shortage of salinity girls, and the sample bottles kept coming in from the ships and stacking up in the hallways.
Those were different times…
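For context on what those titrations actually produced: the Mohr-Knudsen method measures chlorinity, which was then converted to salinity using Knudsen's empirical relation (originally S‰ = 0.030 + 1.8050 Cl‰, later redefined as S = 1.80655 Cl). A minimal sketch of that conversion, with an illustrative chlorinity value (not a number from the paper):

```python
# Sketch: converting a Mohr-Knudsen titration result to salinity.
# The titration yields chlorinity (grams of chloride-equivalent halides per
# kilogram of seawater); Knudsen's relation converts it to salinity.

def salinity_from_chlorinity(cl, relation="1969"):
    """Salinity (per mil) from chlorinity (per mil)."""
    if relation == "1902":     # Knudsen's original empirical equation
        return 0.030 + 1.8050 * cl
    elif relation == "1969":   # later redefinition: S = 1.80655 * Cl
        return 1.80655 * cl
    raise ValueError(f"unknown relation: {relation}")

cl = 19.374  # chlorinity of standard seawater, per mil
print(round(salinity_from_chlorinity(cl), 3))  # -> 35.0
```

By the 1969 definition, a chlorinity of 19.374‰ corresponds to a salinity of exactly 35, which is why that value anchors "standard" seawater.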
The most interesting aspect of the paper is his discussion of the shift in emphasis starting to occur from ship-based sampling to automated sampling with collections of stationary instruments. My thesis research uses a stationary echosounder, so I’ve done a fair amount of thinking about the differences between the traditional ship-based approach and the more recent ocean observatory approach. I appreciated hearing about this shift from the perspective of someone who has seen the entire evolution of modern oceanography firsthand. He doesn’t discuss it in the same vocabulary I might, but he comes to essentially similar conclusions.
Wangersky’s point is that our perception of the ocean and its dynamics is hugely dependent on the tools we have available to study it. It will be a very interesting time in the coming years, as more types of sensors and instruments are deployed long-term at more locations around the world. Just by virtue of observing the ocean at a new spatio-temporal scale, we’re likely to find stuff going on that we missed before. In the old days, oceanographers generally assumed that the ocean was in a steady state. When hydrographic samples were few and far between, and you had to bring them back to shore for the “salinity girls” to analyze before you knew what they meant, this assumption was kind of necessary. These assumptions were also kind of wrong, which is the point that Stommel made in his famous 1963 paper (remember Stommel?).
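A toy illustration of that point (mine, not Wangersky's): a property with a strong daily cycle looks completely steady if you only sample it once a day, because every sample lands at the same phase of the cycle. The function and numbers below are hypothetical.

```python
import math

def signal(t_days):
    """Hypothetical tracer: mean of 10, daily cycle of amplitude 2."""
    return 10.0 + 2.0 * math.sin(2 * math.pi * t_days)  # period = 1 day

# Sample the same ten days at two rates.
dense  = [signal(t / 24) for t in range(24 * 10)]  # hourly
sparse = [signal(float(t)) for t in range(10)]     # once per day

# Hourly sampling sees the full range of variability; daily sampling
# hits the same phase every time and sees essentially none.
print(round(max(dense) - min(dense), 2))   # -> 4.0
print(round(max(sparse) - min(sparse), 2)) # -> 0.0
```

The sparse record isn't wrong, exactly; it just can't distinguish a steady ocean from a variable one, which is the trap the old steady-state assumption fell into.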
As instruments, methods, and computing power have advanced, these types of oversimplifications have retreated, to the point where we now embrace variability and dynamic changes, and try to understand them in and of themselves, rather than as unwanted noise on top of some supposed equilibrium or steady state that isn’t really there. Every advance in instrumentation and every expansion of the scope of our observations has yielded a new perspective on the oceans, adding to an understanding that is gradually becoming more complete.
Peter J. Wangersky (2005). Methods of sampling and analysis and our concepts of ocean dynamics. Scientia Marina, 69(S1), 75–84. doi:10.3989/scimar.2005.69s175