Even when I go boating in my home waters, where I think I know every stand-alone rock and every channel boundary, and will only be out for a few hours, I check the weather and carry a navigation chart.
Mariners have done this for as long as there have been boats, but the technology for creating forecasts and charts has evolved steadily over time. It’s making big leaps now through changes in the way satellites collect data about the atmosphere and ocean, and in how those data are analyzed.
The first U.S. satellite was launched to counter the Soviet Union in the space race, but it was clear from the beginning that satellites would be of great value in studying the earth. On April 1, 1960, NASA launched TIROS-1, the first meteorological satellite. It took some 23,000 cloud-cover pictures that proved useful in weather analysis.
The earliest dedicated weather satellites carried sensors and cameras to collect atmospheric data but couldn’t “see” through the clouds. Starting with Nimbus 3 in 1969, satellites could gather data in vertical columns rather than just from the region above the clouds; they could take “soundings” of the atmosphere. This allowed the atmosphere to be modeled in 3-D, a major advantage over the high-altitude weather balloons that previously did the work but had to be filled and released in a laborious process and were of little use for observations over the ocean.
Landsat 1, launched in 1972 to survey the earth’s surface, led to the field of Satellite-Derived Bathymetry, or SDB; bathymetry is the measurement of depth from the sea surface to the sea floor. SDB works by capturing light reflected off the earth’s surface, breaking it down into individual wavelengths, and applying the result to a well-established theoretical framework that describes how sunlight is reflected by the atmosphere, the sea and the seabed before reaching a satellite. It doesn’t work in the deep ocean, only in shallower water where the bottom is visible enough to show colors, generally 30 meters or less. That happens to be exactly where bottom measurements are crucial for safe navigation by ships and smaller boats, both commercial and recreational.
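One widely used technique within that framework is the band-ratio method, which exploits the fact that water absorbs some wavelengths faster than others: blue light penetrates deeper than green, so the ratio of the two shifts predictably with depth. The Python sketch below is a minimal illustration, not a production algorithm; the reflectance values are hypothetical, and the coefficients m1 and m0 are placeholders that in practice must be calibrated against known depths.

```python
import numpy as np

def band_ratio_depth(blue, green, m1=20.0, m0=-15.0, n=1000.0):
    """Estimate depth (m) from blue/green reflectance using the
    log-ratio method (Stumpf et al., 2003). Green light attenuates
    faster than blue, so the ratio grows with depth.

    m1, m0: linear calibration coefficients, normally fitted
            against depths measured by sonar or lidar.
    n:      fixed scaling constant that keeps both logs positive.
    """
    ratio = np.log(n * blue) / np.log(n * green)
    return m1 * ratio + m0

# Hypothetical per-pixel reflectances from two spectrometer bands.
blue  = np.array([0.062, 0.058, 0.041])
green = np.array([0.070, 0.061, 0.038])
print(band_ratio_depth(blue, green))  # estimated depths in meters
```

The appeal of this formulation is that the ratio largely cancels out differences in overall brightness, leaving depth as the dominant signal, at least in clear water.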
Whether a satellite focuses on the atmosphere or the ocean, it is measuring radiated energy across the electromagnetic spectrum. The main instrument for this work is a spectrometer, which separates incoming radiation into individual wavelengths. The spectrum spans visible light (the part useful for SDB), radio waves, microwaves, infrared light, ultraviolet light, X-rays and gamma rays.
The information being fed into the spectrometer can come from a camera or a scanner. These are familiar devices, and each has its purpose. Landsat 1 researchers started out with both but quickly realized that scanners were more efficient: they gathered data continually as they moved, rather than in a series of pictures that had to be stitched together to show large-scale phenomena. The multispectral scanner these satellites carried could record data in multiple wavelengths simultaneously, including bands for visible light and infrared.
Multispectral scanners also became the instrument of choice for researchers using satellites to forecast weather, allowing them to collect data on temperature, moisture, clouds and storms.
To make satellite-derived bathymetry useful for mariners, the data have to be translated into visual charts. For this purpose, researchers must decide how to assign the colors picked up by the spectrometer to specific depths (cruisers to the Bahamas know how important reading water color is for navigating the shallow banks). But light reflects differently depending on water turbidity and the composition of the seabed, and getting consistent performance across different coastal environments has proven challenging. SDB mirrors sonar readings closely only in clear water.
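One way to picture the calibration step is to fit the band-ratio values from the earlier sketch to depths already measured by sonar at a handful of locations, then apply the fitted line across the whole scene. The sketch below is a minimal illustration of that idea, with hypothetical numbers throughout:

```python
import numpy as np

# Hypothetical calibration points: band ratios at pixels where
# sonar depths are already known.
ratios      = np.array([1.08, 1.12, 1.19, 1.25, 1.31])
sonar_depth = np.array([ 3.0,  5.5,  9.0, 13.5, 18.0])

# Fit the linear mapping depth = m1 * ratio + m0.
m1, m0 = np.polyfit(ratios, sonar_depth, 1)

# Apply the fitted coefficients to every other pixel in the scene.
scene_ratios = np.array([1.10, 1.22, 1.28])
print(m1 * scene_ratios + m0)  # chartable depth estimates in meters
```

The weakness is built into the method: coefficients fitted in one bay, with its particular turbidity and bottom type, may not transfer to the next.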
A team of Korean scientists recently took up the challenge of closing the accuracy gap. They chose bathymetry data from three areas around the Korean peninsula, each with a different set of ocean conditions: clear water, turbid water, and a seabed containing various types of sediments. Each region already had accurate charts derived from sonar. The team then “taught” their computers to be smarter: they fed in the satellite data, considered what might be causing readings from the turbid and sediment-laden regions to be inaccurate, and added variables the computers might be missing, until the satellite estimates more closely matched the existing charts. The resulting model can be useful in future applications of SDB, especially where taking sonar readings is not feasible.
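The article doesn’t name the team’s exact model, so the sketch below only illustrates the general pattern: give an off-the-shelf regressor extra explanatory variables, such as a turbidity proxy and a sediment class, and let it learn how those conditions bend the simple color-to-depth relationship. All feature names and values here are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training samples: per-pixel band ratio, a turbidity
# proxy, and a sediment-class code, labeled with sonar-chart depths.
X = np.array([
    [1.10, 0.2, 0],   # clear water, sandy bottom
    [1.18, 0.6, 1],   # turbid water, muddy bottom
    [1.25, 0.4, 2],   # moderate turbidity, mixed sediments
    [1.31, 0.1, 0],
    [1.15, 0.7, 1],
])
y = np.array([4.0, 7.5, 12.0, 17.0, 6.0])  # sonar depths (m)

# Let the model learn how turbidity and seabed type distort the
# basic ratio-to-depth relationship, using the sonar charts as truth.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

print(model.predict([[1.20, 0.5, 2]]))  # depth estimate for a new pixel
```

The design choice that matters is the training target: because the sonar-derived charts serve as ground truth, the model can only be as good as the regions where such charts already exist, which is why its real value lies in extrapolating to places where sonar surveys are impractical.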
A surge in extreme weather over the last few years has pushed forecasters to improve their ability to predict extreme events like tornadoes and hurricanes and to understand longer-term weather and climate phenomena like El Niño and La Niña. In 2027, the JPSS-4 satellite, jointly operated by NOAA and NASA, will carry a sounder called the Cross-track Infrared Sounder, or CrIS. The CrIS is hyperspectral, meaning it can break down the radiation emitted by the atmosphere into more than 2,000 “channels,” or bands of wavelengths, where earlier sounders managed 19. The result is much greater resolution and smoother transitions in the images created from the data, much as a higher pixel count in a camera’s image sensor lets you zoom in and see finer detail.
The challenge with these new methods of gathering and analyzing data will be to understand what the data are telling us and to apply it accurately. Without that understanding, you just have more raw data.