Exploring the world’s oceans
Throughout history, the world’s oceans have played vital roles in human life, serving as routes of travel and arteries of trade and business. Given that, it is surprising how little we know about these great expanses of water: only around 5% of the ocean has ever been explored. Even with the advancements of modern technology, the remaining 95% of the oceans’ depths remains unknown to us.

It is only when we compare ocean exploration to space exploration that the figure sinks in. Although the universe is vast and generally thought of as almost entirely unknown, humanity has already discovered and mapped roughly 4% of the visible universe. It is almost as if scientists have a better sense of seemingly infinite space than of what lies within the limits of our own planet.
The HMS Challenger Expedition

In 1872, HMS Challenger set out on the first dedicated oceanographic voyage in history. Led by Sir C. Wyville Thomson, the Challenger Expedition produced surprisingly precise data for the first deep-sea voyage ever undertaken. Much of that data, covering ocean temperatures, currents, and the contours of the great underwater basins, remains reliable enough for scientists to keep using it, almost completely unaltered, in the 21st century. The scope and thoroughness of the expedition earned HMS Challenger its place in the history of undersea exploration.
This expedition was just the start of humanity’s long, continuing journey to the bottom of the world’s oceans.
Ocean floors at a glance
All over the world, people scuba dive and snorkel for pleasure, spending their vacations at exotic coastal locations to see the vibrant coral reefs beneath the surface of the ocean. Researchers also dive beneath the surface with the same gear, not for the stunning views, but to study the health of the reefs and the underwater ecosystems around them.

But a person can only see so much of the ocean in a dive. What if you wanted to assess the ocean floor over an entire region or see how these underwater ecosystems were faring on a global scale?
NASA’s Explorations

This is where NASA comes in. Most people know the National Aeronautics and Space Administration only for its decades of research into the worlds beyond our atmosphere. What they may not know is that NASA also explores extreme environments on our own planet, which scientists believe are not as different from other worlds in our Solar System as we’d like to believe. NASA’s underwater explorations are led by a project called SUBSEA, short for Systematic Underwater Biogeochemical Science and Exploration Analog. Led by Darlene Lim of NASA’s Ames Research Center in California’s Silicon Valley, SUBSEA develops technologies that aid scientific underwater exploration worldwide, while assessing the best ways to conduct remote science missions in order to streamline future space exploration.
The General Fluid Lensing Algorithm
Within SUBSEA’s team at Ames Research Center, researcher Ved Chirayath developed new hardware based on his software technique, Fluid Lensing, which can see clearly through moving water to image reefs and other underwater ecosystems. Getting accurate depth measurements and clear images is the hard part of underwater photography, because light is absorbed by the water column and both distorted and intermittently magnified by its surface. By running complex calculations, his algorithm, the General Fluid Lensing Algorithm, is largely able to eliminate these troublesome effects, and it sits at the heart of this underwater exploration technology.
Surface-wave distortion and the optical absorption of light pose a significant challenge for the remote sensing and recording of underwater environments, one that even amateur underwater photographers constantly face. For accurate scientific exploration of deep-sea ecosystems, however, humanity had to develop technology that counters both effects.
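To get a feel for the absorption half of the problem, the Beer–Lambert law says that light intensity decays exponentially with depth, and that red light fades far faster than blue. The short Python sketch below illustrates that decay; the attenuation coefficients are invented round numbers for illustration, not measured values, and real coefficients vary with wavelength, turbidity, and dissolved matter.

```python
import math

# Illustrative attenuation coefficients in 1/metre. These are invented,
# round numbers, not measurements from any real instrument or survey.
ATTENUATION_PER_M = {"red": 0.30, "green": 0.07, "blue": 0.04}

def surviving_fraction(band: str, depth_m: float) -> float:
    """Fraction of surface light remaining at a depth (Beer-Lambert law)."""
    return math.exp(-ATTENUATION_PER_M[band] * depth_m)

for band in ATTENUATION_PER_M:
    print(band, [round(surviving_fraction(band, d), 2) for d in (1, 5, 10)])
```

Running this shows red light dropping to about 5% of its surface intensity by 10 metres while blue keeps roughly two-thirds, which is why underwater photos look blue-green and why brightness must be corrected band by band.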
Solving the problem of distortion

Ved Chirayath developed the General Fluid Lensing Algorithm to solve this problem, so that NASA’s underwater explorations could continue free of this distortion. The algorithm not only enables robust imaging of underwater objects through the refractive distortions of surface waves, down to sub-centimetre scales, but also exploits the waves themselves as magnifying optical lenses, enhancing the resolution and signal-to-noise properties of remote-sensing instruments. Based on this software, he developed two technologies to image through the ocean surface: the FluidCam system, a CubeSat-based computational imaging system, and MiDAR, the Multispectral Imaging, Detection, and Active Reflectance instrument.
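The full algorithm is not reproduced here, but one ingredient of fluid lensing, favouring the moments when the moving surface happens to render a patch of the scene sharply, can be sketched in miniature. Everything below (the function names, the gradient-variance sharpness metric, the synthetic blurred burst) is an illustrative assumption, not NASA’s code, which also models the wave shape itself and exploits its magnification.

```python
import numpy as np

def sharpness(patch: np.ndarray) -> float:
    """Variance of the gradient magnitude: a crude sharpness proxy."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.var(np.hypot(gx, gy)))

def lucky_mosaic(frames: np.ndarray, patch: int = 16) -> np.ndarray:
    """For each tile of the scene, keep the sample from whichever frame in
    the burst rendered that tile most sharply (a toy 'lucky imaging' step)."""
    n, h, w = frames.shape
    out = np.zeros((h, w))
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            tiles = frames[:, y:y + patch, x:x + patch]
            best = max(range(n), key=lambda i: sharpness(tiles[i]))
            out[y:y + patch, x:x + patch] = tiles[best]
    return out

def box_blur(img: np.ndarray, passes: int) -> np.ndarray:
    """Cheap stand-in for wave defocus: repeated 5-point mean blur."""
    out = img.astype(float).copy()
    for _ in range(passes):
        out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)
                   + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 5.0
    return out

# Demo: 30 frames of one 64x64 scene, each blurred by a random amount,
# standing in for the varying distortion of a moving water surface.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
burst = np.stack([box_blur(scene, int(rng.integers(0, 8))) for _ in range(30)])
print(lucky_mosaic(burst).shape)  # -> (64, 64)
```

The design intuition is that a wavy surface is not uniformly bad: for any given patch, some frame in a long enough burst will have caught it through a nearly flat or favourably curved stretch of water, and those are the samples worth keeping.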
FluidCam

The FluidCam instrument is essentially a high-performance digital camera. It is mounted on a small satellite, or CubeSat, and sent into orbit around the Earth. Once images of the sea floor are captured, the fluid lensing software undoes the distortion created by the waves on the ocean surface. Chirayath’s algorithm also accounts for the way an object can appear magnified or smaller than usual depending on the shape of the wave passing over it, and it adjusts the brightness of the object, which on ordinary cameras is skewed by how strongly the water absorbs different wavelengths of light.
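As a rough illustration of the brightness step alone, one can invert the exponential decay described earlier, given an estimate of the water depth at each pixel. The coefficient, the depth map, and the factor of two for the down-and-back light path below are all invented for illustration; FluidCam’s actual processing is considerably more sophisticated.

```python
import numpy as np

K_PER_M = 0.07  # invented attenuation coefficient (1/m) for one colour band

def restore_brightness(image: np.ndarray, depth_m: np.ndarray) -> np.ndarray:
    """Undo Beer-Lambert attenuation: divide each pixel by the fraction of
    light surviving a round trip through depth_m metres of water."""
    survival = np.exp(-2.0 * K_PER_M * depth_m)  # down to the floor and back
    return np.clip(image / survival, 0.0, 1.0)

# Toy example: a uniformly dim image over a floor sloping from 2 m to 10 m.
image = np.full((4, 4), 0.2)
depths = np.linspace(2.0, 10.0, 16).reshape(4, 4)
print(restore_brightness(image, depths).round(2))
```

In the toy output, pixels over deeper water are brightened more aggressively, which is the sense in which a uniform-looking raw image can hide a sea floor of uniform reflectance at varying depths.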
MiDAR
After FluidCam, the second technology, MiDAR, takes the algorithm a step further. It collects data from the ocean floor by transmitting light and measuring what bounces back to the instrument, much as radar does with radio waves. It also operates across a wider spectrum of light, meaning it can detect features invisible to the human eye and even collect data in darkness. This is vital for gathering data on the ocean floor from miles away in space; after all, the instrument and its software will be orbiting the Earth beyond the atmosphere.
With these capabilities, MiDAR is also able to see deeper into the ocean than would be possible with traditional cameras near the surface. It turns the magnifying effect of the water’s surface to its advantage, producing images of far higher resolution than was previously thought possible. It could even allow a satellite to survey a coral reef at a centimetre scale.
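In spirit, active reflectance measurement is bookkeeping: compare the light you transmitted with the light that came back, band by band, after subtracting whatever ambient light was already there. The band names and numbers below are invented for illustration and are not MiDAR’s real channels or processing chain.

```python
# All values are invented, normalised energies per spectral band.
transmitted = {"uv": 1.00, "blue": 1.00, "green": 1.00, "near_ir": 1.00}
received    = {"uv": 0.08, "blue": 0.31, "green": 0.42, "near_ir": 0.05}
ambient     = {"uv": 0.01, "blue": 0.04, "green": 0.06, "near_ir": 0.02}

# Reflectance = (received - ambient) / transmitted. Because the instrument
# supplies its own light, the measurement still works in total darkness,
# where the ambient terms simply drop to zero.
reflectance = {band: (received[band] - ambient[band]) / transmitted[band]
               for band in transmitted}

for band, value in reflectance.items():
    print(f"{band:>7}: reflectance ~ {value:.2f}")
```

This is also why an active instrument can record bands well outside human vision: it chooses what to transmit, rather than depending on whatever sunlight survives the water column.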

Both of Chirayath’s technologies bring us closer to mapping the ocean floor with a level of detail previously possible only by sending teams of divers down to take photographs. NASA has given the scientific community a way to observe the oceans at that same level of detail across the globe.
Saving the oceans

The last question left to ask is why mapping the ocean matters. Beyond NASA’s point that the ocean floor may closely resemble the environments of many as-yet-unnamed worlds we are sure to discover in the future, the ocean is the Earth’s largest ecosystem. Because of global warming, climate change, and numerous other human-made environmental disasters, this underwater ecosystem is dying. Without techniques like fluid lensing, it is virtually impossible to track these ecosystems and the organisms living in the depths of the ocean. Tracking them gives researchers the data they need to finally find solutions to the rapid deterioration of life underwater.