How scientists and supercomputers could make oceans drinkable
Removing salt from seawater is an enormous challenge. Researchers may have the answer – but it will require a whole lot of processing power.
Aleksandr Noy has big plans for a very small tool. A senior research scientist at Lawrence Livermore National Laboratory, Noy has devoted a significant part of his career to perfecting the liquid alchemy known as desalination – removing salt from seawater. His stock-in-trade is the carbon nanotube. In 2006, Noy had the audacity to embrace a radical theory: maybe nanotubes – cylinders so tiny that they can be seen only with an electron microscope – could act as desalination filters. It depended on just how wide the tubes were. The opening needed to be big enough to let water molecules flow through but small enough to block the larger salt particles that make seawater undrinkable. Put enough carbon nanotubes together and you potentially have the world’s most efficient machine for making clean water.
Just how tiny are carbon nanotubes?
Most of his colleagues at the lab dismissed the idea as sci-fi. 'It was hard to imagine water going through such very small tubes,' says Noy. But if the nanotube theory was correct, the benefit would be incalculable. Many of the world’s regions are currently in the midst of a potable water shortage, with 1.2 billion people – about one-sixth of the global population – living in areas of water scarcity. Desalination can help, but the infrastructure in place today requires massive amounts of energy (and therefore money) to heat seawater or to force it through complex filters. If nanotube filters worked, they could greatly reduce the world’s water woes.
Noy’s team set up a simple filtration experiment and let it run overnight. In the morning, two assistants noticed a puddle on the lab floor; water had slipped through the nanotubes so rapidly that the small reservoir meant to catch the liquid had overflowed. Researchers would later confirm that the flow rate of water through carbon nanotubes is six times higher than it is through the filters used in today’s desalination plants.
That puddle may have been small, but it was one of the biggest discoveries of Noy’s career. 'The experiment was exciting,' he recalls, 'because nobody knew what to expect.' Now that everyone does, a huge challenge remains – one that might be possible to surmount with enough computing power.
Luckily, scientists are on the verge of a breakthrough called exascale computing, which in Google’s case is likely to come from a throng of machines connected in the cloud. Exascale will dwarf today’s most powerful supercomputers. This kind of extreme processing power will be a huge asset to researchers working out how to turn nanotubes into large-scale water filters. These tubes – and the billions of molecules that flow through them – are far too small to study in detail, and physically testing different variations is difficult and time-consuming. Exascale computer modelling will put those tiny tubes into sharper focus, which will dramatically speed up nanotube desalination research. In fact, the technology will help tackle a number of today’s thorniest environmental problems.
The Promise of Exascale Power
Vastly increased speed could help surmount once-impossible challenges and lead to big breakthroughs.
For those not versed in the jargon of Silicon Valley, exascale refers to the horsepower offered by the next generation of supercomputers. An exascale machine will have the ability to crunch a quintillion (a billion billion) calculations per second. That’s nearly 11 times more powerful than China’s Sunway TaihuLight, the fastest computer in use today. Think of exascale as the processing power of roughly 50 million tethered laptops.
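Those comparisons boil down to simple arithmetic. The sketch below makes them explicit; the TaihuLight and laptop figures are rough, assumed values rather than precise benchmarks.

```python
# Back-of-envelope arithmetic for the comparisons above (assumed figures).
EXAFLOP = 1e18        # one quintillion floating-point operations per second
TAIHULIGHT = 93e15    # Sunway TaihuLight's measured Linpack speed, roughly 93 petaflops
LAPTOP = 20e9         # a rough ~20 gigaflops for a typical laptop

print(f"Exascale vs TaihuLight: {EXAFLOP / TAIHULIGHT:.1f}x")           # ~10.8, i.e. 'nearly 11 times'
print(f"Laptops needed to equal one exaflop: {EXAFLOP / LAPTOP:,.0f}")   # ~50,000,000
```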
A worldwide race is on to build the first exascale machine, which will enable scientists to revisit everything from theoretical physics to long-term weather forecasts. But projects like Noy’s quest to understand nanotubes are likely to be among the first to realise the advantages of the increased computing capability.
'A jump in computation power will be a huge benefit to materials science, drug discovery and chemistry', says George Dahl, a research scientist on the Google Brain team. All of these areas of research, Dahl explains, require building computer models of molecules – an activity that demands a lot of processing power. 'These are very slow computations', says Dahl, 'for each and every molecule or material we want to analyse.'
But there’s more, he adds: 'If you apply machine learning – which also benefits from advances in computing power – to molecular simulations, you get a double whammy of increased power. You can use machine learning in conjunction with materials science to find all-new materials.'
These are exactly the type of advances that will lead to a better, less expensive salt water filter. And that’s not the only way exascale-level computing might help the planet’s water challenges.
Exascale computing will also excel at processing vast quantities of data, so it could help with projects such as the work being done, in part, by Google engineers Noel Gorelick, co-founder of the Earth Engine platform, and Tyler Erickson, a senior developer advocate focusing on the platform’s water-related analyses. The cloud-based platform analyses environmental data on a global scale. A recent ambitious effort, led by Gorelick and the European Commission’s Joint Research Centre, sought to create high-resolution maps of surface water around the world. By analysing more than 30 years of satellite imagery in Earth Engine, the team mapped (and measured) the evolution of the Earth’s bodies of water over the decades, revealing vanished lakes and dried-up rivers as well as the formation of new water bodies. Simply downloading the necessary data would have taken three years if it had been done all at once. That’s quite an archive, Erickson says, but exascale will allow the team to collect even more information – at a vastly quicker speed – to produce even more accurate maps.
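For a feel of what such an analysis looks like in practice, here is a minimal sketch using Earth Engine’s Python API. The dataset ID, band name and sample region are illustrative assumptions, not details taken from the project itself.

```python
import ee

ee.Initialize()  # requires a Google Earth Engine account and prior authentication

# The JRC Global Surface Water dataset; the asset ID and the 'occurrence' band
# (the share of observations since 1984 in which a pixel was water) are assumed here.
gsw = ee.Image('JRC/GSW1_0/GlobalSurfaceWater')
occurrence = gsw.select('occurrence')

# An illustrative region over the Aral Sea, one of the vanished lakes such mapping can reveal.
region = ee.Geometry.Rectangle([58.0, 44.0, 61.0, 46.5])

# Summarise water occurrence over the region; the computation runs on Google's
# servers, which is what makes a 30-plus-year global archive tractable at all.
stats = occurrence.reduceRegion(
    reducer=ee.Reducer.mean(),
    geometry=region,
    scale=300,        # metres per pixel for this coarse summary
    maxPixels=1e9,
)
print(stats.getInfo())
```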
'There are other data sources that we could be looking at if we had more processing power', Erickson says. An exascale machine, he points out, has the potential to tap into the world’s most undervalued resource: citizen scientists. Imagine if the water-mapping project were opened to, say, anyone flying a drone that shoots HD video. 'That would be a pretty spectacular amount of data', he says. Secondary school students piloting DJI Phantoms over rivers and estuaries might upload video to the Google Cloud, where, thanks to exascale power, it could be filed, geo-referenced against Google’s base map of the world, analysed and distilled into digital cartography. This kind of democratised science could aid agricultural planning, prepare regions for disasters or even help monitor ecological changes. To spur similar projects at other organisations, Google announced in 2014 that it would donate a petabyte of cloud storage for climate data as well as 50 million hours of computing on the Google Earth Engine platform.
Dahl, for his part, is quick to add that leaps in processing power won’t solve every computing challenge. But, he says, the biggest benefits may come from uses we have yet to imagine. He makes an analogy to the invention of the microscope – a device that led to life-saving new discoveries. 'Maybe there will be something that we’ve never considered doing that will suddenly become practical', he says. 'Maybe it will allow us to build something like the microscope – a completely new tool that, in turn, enables completely new discoveries.'
Only 3 percent of the water on Earth is fresh and drinkable
And we can only access a tiny fraction of that.
High-performance computing is measured in FLOPS – floating-point operations per second. The metric can be applied to any machine, from a laptop to the world’s fastest supercomputer. More FLOPS equal more speed; more speed equals higher resolution, or the ability to see things in finer detail; higher resolution equals more accurate computer simulations and predictions. This is especially valuable to agencies such as the National Oceanic and Atmospheric Administration (NOAA), which uses computers to predict weather patterns, changes in climate and disruptions in the oceans and along the coasts.
Exaflop systems can perform 10¹⁸ (a billion billion) calculations per second.
NOAA expects to use exascale systems in the 2020s. 'It will give us the ability to provide more accurate warnings of severe weather at finer scales and longer lead times that will provide much better protection of lives and property,' says Brian D. Gross, deputy director of high-performance computing and communications at the agency. Scientists could help build up resilience in anticipation of extreme climate events, such as a devastating hurricane, enabling an entire region to limit the damage and death toll.
To convey the scale of that computing power, Gross explains that the agency used teraflop systems (a trillion calculations per second) in the 2000s, which could accurately track large weather features roughly the size of a state; today’s petaflop systems (a quadrillion calculations per second) can accurately track weather features the size of a county. Exascale computing will allow NOAA to zoom in much closer for more detail – for instance, accurately mapping thunderstorms as small as a city. This resolution provides more information, which reveals a lot more about how storms of all sizes will behave and evolve. 'Higher-resolution models more accurately depict larger-scale weather systems like hurricanes, improving the prediction of rainfall and storm tracks,' says Gross. To put it another way: a few years from now, weather forecasters will have little excuse if they get the five-day forecast wrong. And we’ll know more about exactly where and when the next superstorm will hit.
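A back-of-envelope sketch shows why each thousand-fold jump in FLOPS buys roughly a ten-fold finer grid, under the common simplification that model cost grows with the cube of horizontal resolution; the kilometre figures are illustrative, not NOAA’s.

```python
def finer_grid_factor(compute_gain: float, exponent: float = 3.0) -> float:
    """How much finer the model grid can get for a given jump in compute,
    assuming cost grows roughly with resolution**exponent."""
    return compute_gain ** (1.0 / exponent)

scale_km = 500.0  # illustrative 'state-sized' feature scale in the teraflop era
for era in ("teraflop", "petaflop", "exaflop"):
    print(f"{era:>8s}: tracks features down to ~{scale_km:.0f} km")
    scale_km /= finer_grid_factor(1000.0)  # each era brings ~1000x more FLOPS
```

On those assumed numbers, the state-to-county-to-city progression Gross describes falls out naturally.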
Exascale computing can help solve fresh water shortages
Faster supercomputers will aid researchers who are studying desalination and depollution filters to boost the amount of drinkable water in the world.
Fresh water access is a challenge around the world. From the depleted aquifers beneath Saudi Arabia to the baked soil of Brazil to America’s breadbasket, where drought has spread across the Great Plains like cracks in a pavement, mass dehydration is looming. A 2012 US intelligence report concluded that fresh water shortages will even impact national security. The demand for fresh water is expected to be 40 percent higher than the global supply by 2030.
Rising temperatures, less rainfall, more people, pollution and poverty – the challenges underlying the demand can seem, on the surface, insurmountable. But Aleksandr Noy remains convinced that an exascale machine will help him create a nanotube membrane that filters water and saves lives. 'With so much computer power, we can run a quick simulation before we go to the lab', he says. 'That’s really helpful because it will allow us to focus our energy on the experiments that make sense.' And there is still a lot to work out: the precise nanotube dimensions required for water transport have yet to be established, and no one knows the best membrane material in which to embed a bunch of nanotubes, or how they should be arranged. 'In many of the nanotube modelling studies using simulations, there is still discrepancy in the numbers', says Ramya Tunuguntla, a postdoctoral researcher working with Noy. 'That’s a challenge we must overcome.' Like Noy, she thinks a more robust supercomputer will take their research to the next level: 'With exascale, we could run longer simulations to collect more data.'
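To see why those simulations strain today’s machines, consider a hedged back-of-envelope estimate. Classical molecular dynamics typically advances in roughly two-femtosecond steps; the throughput figure below is an assumed placeholder, not a benchmark of Noy’s codes or Livermore’s computers.

```python
FEMTOSECOND = 1e-15               # seconds
TIMESTEP = 2 * FEMTOSECOND        # a typical classical molecular-dynamics step
TARGET = 1e-6                     # one microsecond of simulated water flow
steps_needed = TARGET / TIMESTEP  # 5e8 integration steps

ASSUMED_STEPS_PER_SECOND = 500    # placeholder throughput for a single run
wall_clock_days = steps_needed / ASSUMED_STEPS_PER_SECOND / 86400
print(f"Integration steps for one microsecond: {steps_needed:.0e}")
print(f"Days per run at the assumed throughput: {wall_clock_days:.0f}")  # ~12
```

Repeat that for many tube diameters, membrane materials and arrangements – and for the multiple runs needed to pin down the scatter Tunuguntla describes – and the appeal of an exascale machine becomes obvious.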
In 2018, a new computer will be installed at Livermore Lab. With four to six times the number-crunching power of the current system, this machine, dubbed Sierra, is likely to be the last step before exascale lets researchers ogle all of those gorgeous high-def images that come with a quintillion FLOPS. In fact, exascale may have already arrived elsewhere by then. One top researcher at Livermore says that while the first exascale machines will start appearing in the US around 2020, China – the prohibitive favourite in this race – claims it will deliver a prototype of what some have called the 'super-supercomputer' either later this year or early next year.
Costas Bekas, a two-time Gordon Bell Prize winner and an exascale expert at the IBM Research lab in Zurich, points out that exascale isn’t the end – computing power will continue to grow. He foresees a day when computer modelling allows us to examine the universe at not just the molecular level but at the atomic level.
'Exascale means that we can finally crack – in an acceptable amount of time and energy spent – things that are very complex, such as how carbon nanotubes work', Bekas says. 'Exaflops will not save the planet. We have too many problems. However, it will definitely make the Earth a much better place to live.'
Back at Lawrence Livermore, Aleksandr Noy and Ramya Tunuguntla load another nanotube membrane into a test cell, flip a switch and collect more data. Soon they – along with exascale computing – may change the lives of billions.
RENE CHUN is a writer based in New York. His work has appeared in publications ranging from The New York Times and The Atlantic to Wired and Esquire.
Animations by Justin Poulsen
Illustrations by Matthew Hollister