Ocean Eye: AI's Venture into Marine Depths
Although more than two-thirds of the Earth's surface is covered by water, most of its depths remain unexplored. To date, only 9% of all marine biodiversity has been studied. Scientists from the Technical University of Denmark (DTU) and Aarhus University aim to make a breakthrough in ocean exploration using AI drones.
Until now, ocean research has relied on divers, drones, and satellite imagery for new discoveries. The Danish researchers are introducing a new set of tools: hyperspectral cameras, lasers, and the power of artificial intelligence.
Traditional drones are usually equipped with cameras that capture the environment in three primary colors: red, green, and blue. In contrast, a hyperspectral camera can distinguish up to 30 different color bands, capturing 30 separate images at once, one per band. The laser system in the Ocean Eye is akin to those used in self-driving cars, where it is employed for distance measurement. Here, the system is designed to focus on various ocean depths in search of marine life.
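The difference in information density can be sketched with a toy example. The array shapes below are illustrative only, not the Ocean Eye camera's actual specification:

```python
import numpy as np

# Illustrative frame sizes (hypothetical, not the project's real resolution).
height, width = 480, 640

# A conventional camera records 3 spectral channels per pixel...
rgb_frame = np.zeros((height, width, 3))            # red, green, blue

# ...while a hyperspectral camera records ~30 narrow bands per pixel,
# effectively 30 grayscale images of the same scene at once.
hyperspectral_cube = np.zeros((height, width, 30))

# Each pixel in the cube carries a full spectrum, which is what lets
# materials with similar RGB appearance be told apart.
spectrum = hyperspectral_cube[100, 200, :]  # 30 values for one pixel

print(rgb_frame.shape[-1], hyperspectral_cube.shape[-1])  # 3 30
```

A species that looks uniformly green in an RGB image may still have a distinctive reflectance spectrum across those 30 bands, which is the property the classification step exploits.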
The equipment is mounted on an autonomous ship, tasked with collecting data from ocean depths beyond the reach of satellite technology. These depths could be home to various marine species, such as certain kinds of algae, starfish, and corals.
“A diver can only be under water for a limited time, and it’s hard to cover a large area. But with our method, we expect to be able to quite accurately say that, for example, 37 per cent of the seabed in this area is covered by eelgrass and 12 per cent by red algae,” explains Professor Christian Pedersen from DTU.
The hyperspectral cameras and lasers of the Ocean Eye project are expected to collect so much information that analyzing and organizing it manually would be impractical. A key task for the project team is therefore to develop a sophisticated language model that can identify and interpret the relevant scientific data and draw conclusions from it.
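Ocean Eye's actual analysis pipeline is not described in detail, but the coverage figures Pedersen mentions ("37 per cent eelgrass, 12 per cent red algae") can be sketched as simple pixel counting over a classified image. In this hypothetical example, a model has already assigned each seabed pixel a label:

```python
import numpy as np

# Hypothetical per-pixel labels produced by some upstream classifier:
# 0 = other seabed, 1 = eelgrass, 2 = red algae.
# Here we simulate a label map whose class mix mirrors the quoted figures.
rng = np.random.default_rng(0)
label_map = rng.choice([0, 1, 2], size=(100, 100), p=[0.51, 0.37, 0.12])

# Coverage percentage = labeled pixels / total pixels.
total = label_map.size
eelgrass_pct = 100 * np.count_nonzero(label_map == 1) / total
red_algae_pct = 100 * np.count_nonzero(label_map == 2) / total

print(f"eelgrass: {eelgrass_pct:.1f}%, red algae: {red_algae_pct:.1f}%")
```

The hard part, of course, is the upstream classifier that produces the label map; the counting itself is trivial once every pixel has a species label.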
“The three technologies complement each other, so when you take all the data and get an AI to analyse it, it can give a pretty accurate answer to whether what’s in the picture is a clam or red algae,” says Christian Pedersen.
Professor Pedersen is the driving force and leader of the Ocean Eye project. Spanning from 2023 to 2026, the project has already secured a grant of DKK 6 million from the charitable foundation Villum Fonden.