A global bathymetric database
David Mearns, marine scientist, oceanographer and explorer, surveys the urgent need to discover more about the largest part of our planet
Whenever I am asked about the influences on my career as a shipwreck hunter I credit the 1985 discovery of Titanic by Bob Ballard for ushering in a new age of deep ocean exploration that paved the way for other famous shipwrecks to be found. In the eyes of most people, including me, Titanic still occupies the top spot as the most famous of all shipwrecks. However, long before Titanic was found, a group of scientists and engineers from the US Naval Research Laboratory (NRL) quietly set about conducting a mission of national importance that truly opened the doors to the deep ocean for the first time. Sadly, it was another catastrophic loss that instigated NRL’s ground-breaking mission.
On 10 April 1963, the US nuclear submarine Thresher (SSN 593) sank in the North Atlantic while conducting deep diving tests. Thresher was the first, and worst, loss of a nuclear submarine at sea, with all 129 of the boat’s crew killed. The magnitude of the loss was such a shock to the US Navy that a plan to find the Thresher was literally developed overnight. Before then, the deepest shipwreck ever found lay at less than 450 metres, and the search methods were comparatively primitive, based on lowering a man in a sealed observation chamber fitted with viewports. Thresher’s depth of 2560 metres necessitated an entirely new towed, multi-sensor approach to searching – and from this tragedy the United States’ deep ocean search capability was born. Because most of NRL’s missions using the search ship Mizar were classified, the world knows very little about what they achieved and the record depths they reached.
NRL’s pioneering development of deep tow technology, photographic techniques and search methodology in the 1960s and early 1970s provided the foundation for Ballard’s success in the 1980s. Ballard continued to rely on the same type of photographic search systems pioneered by NRL, and the experience he gained from his own surveys of the Thresher and Scorpion (another US nuclear submarine found by NRL in 1968) wreck sites helped him refine the search for Titanic. Although Ballard’s method of finding Titanic was not very different from how NRL located the Thresher 22 years earlier, the worldwide interest generated by Titanic’s name created a game-changing moment in history. Suddenly, the entire world knew that the deep ocean was accessible, without ever realising that the greatest ocean depths had been conquered more than two decades earlier.
No longer hidden and unreachable
The discovery of Titanic was a watershed moment without which my own discoveries of Lucona, Derbyshire and Hood might not have happened. The companies I worked with to find those wrecks employed more modern and more efficient sonar-based search systems, but the breakthrough from NRL’s early age of deep ocean shipwreck hunting had less to do with technology and was more about an awakening and a realisation of what was possible. After Titanic, the deep ocean was no longer the hidden and unreachable realm that most people assumed. It didn’t matter that NRL had found and filmed a wreck at 5010 metres in 1970 (the LeBaron Russell Briggs) or that a top-secret CIA project recovered parts of a Soviet ballistic missile submarine (K-129) from a depth of 5030 metres in 1974. Because these record-setting achievements were shrouded in government secrecy, it was as if they had never happened – and thus their impact beyond the super-secret world of the US Navy was virtually non-existent.
I’ve been incredibly fortunate that the arc of my career has coincided with a period in history when technology has caught up with mankind’s ambition to explore the deep ocean with greater precision and capability. Having cut my teeth on one of the earliest commercially available side-scan sonar systems, when image interpretation was considered an art more than a science, I am grateful to have endured long enough to see my industry grow by leaps and bounds into the mature state it is in today. New technologies that were at the very early prototype stage when I started out, such as autonomous underwater vehicles (AUVs) and multi-beam echo-sounding sonars (MBES), are already proving their potential to completely revolutionise the gathering of data at sea.
There is, however, a much broader need for these technologies beyond shipwreck hunting that, with further advancement and expansion, will have an impact on everyone’s lives. It is often said (because it is true) that we know more about the surface of the moon than we do about the surface of our own planet. That’s because 71 per cent of the earth is covered by the oceans, and the seabed that is hidden below this watery curtain has barely been surveyed or explored. Current estimates are that only 10–15 per cent of the ocean floor is mapped to a resolution of 100 metres. Compare that to 98 per cent of Venus, 100 per cent of the moon (at higher resolution), and 100 per cent of Mars (with much of it at 20-metre resolution), and you start to get the full picture. In a future where the world’s population will be increasingly dependent on the oceans – the water and the seabed and everything in between – we can no longer afford to live in ignorance about the largest part of our planet.
There are many aspects of the oceans that need to be measured and understood on a global scale, but experts agree that mapping the ocean floor should be a priority. In the summer of 2016 a remarkable meeting took place in the Principality of Monaco, at which 150 senior representatives, scientists, scholars and business associates endorsed the objective of comprehensively mapping the ocean floor by the year 2030. It is a bold but achievable objective (called Seabed 2030), and a necessity if we are to continue to exploit the oceans at the current rate without damaging them any more than we already have. GEBCO (the General Bathymetric Chart of the Oceans), the world’s only international organisation with a mandate to map the ocean floor, will be responsible for compiling the data into a global bathymetric database, with the ultimate objective of sharing the data in order to ‘make the seafloor public’. AUVs and multi-beam sonars will undoubtedly be at the forefront of that data collection.
A challenge for ocean exploration
At that same meeting in Monaco another bold challenge was laid down. This was in the form of a cash prize – a $7,000,000 XPRIZE, to be exact. The prize, sponsored primarily by Shell, with a $1,000,000 bonus prize offered by the US National Oceanic and Atmospheric Administration (NOAA), challenges competing teams to develop innovative new technologies to autonomously map the deep ocean floor in high resolution at rates faster than currently possible. The competition runs for two years, with up to 20 teams competing in two rounds of at-sea testing. It is hoped that the end result will be to accelerate the innovation needed to achieve the goal set by Seabed 2030. As a marine scientist and explorer, I am honoured to have been selected by the XPRIZE scientific advisory board to serve as one of the judges for a competition that is so close to my heart.
Some people choose their careers, but I like to think that my career chose me. It happened the moment I switched on an ancient EG&G 259-4 side-scan sonar recorder for the first time and watched the geology of the West Florida shelf being revealed as if the ocean had been drained of water. Thirty-five years later I still have that same heightened sense of expectation whenever I switch on a side-scan sonar, as my scientific training and curiosity draw me forward to the screen in the hope that I’m about to see some long-lost object or an interesting geological feature. Creating the global bathymetric database – the base map for 71 per cent of our planet – is the closest we will all come to that experience of draining the oceans of water and revealing what lies below.