The biggest uncertainty in end-of-century sea level rise comes from the Antarctic Ice Sheet (AIS), the miles-thick, continent-sized polar ice mass that covers the South Pole. However, current Earth system models struggle to account for events unfolding in the Antarctic region: the coupling between the evolving Earth system and the ice sheet is complex and difficult to implement fully in models.
To address this, a team of scientists from Berkeley Lab, Swansea University, and the University of Bristol collaborated to create an ice sheet modeling tool that uses high performance computing resources at the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab to systematically examine where the AIS is vulnerable and the resulting potential for large contributions to sea level rise.
The BISICLES tool investigates the millennial-scale vulnerability of the Antarctic Ice Sheet due solely to the loss of its ice shelves. Starting at the present day, the ice sheet evolves for 1000 years, exposing the floating ice shelves to an extreme thinning rate, which results in their complete collapse. These visualizations show the first 500 years of the simulation — after that point, the ice sheet continues to evolve and retreat, but less dramatically.
The modeling tool, the BISICLES ice sheet model, has enabled the first fully resolved, systematic study of millennial-scale ice sheet response to regional ice shelf collapse ("Millennial-scale Vulnerability of the Antarctic Ice Sheet to Regional Ice Shelf Collapse"). While high-resolution projections have been performed for localized Antarctic regions, full-continent simulations have until now been limited to low-resolution models. Key to BISICLES' ability to accurately quantify the vulnerability of the entire present-day AIS is adaptive mesh refinement (AMR), which dynamically places high resolution where the ice sheet is changing most rapidly. Deploying fine resolution locally lets researchers focus on the small regions that control the overall evolution of the AIS: fast-moving ice streams, retreating edges, and grounding lines, the points at which ice sheets transition from grounded ice to floating ice shelves. These features migrate over continental scales.
“To get the right answer, we need to resolve the areas where there’s the most activity at a very high (sub-kilometer) resolution, but we can’t resolve all of Antarctica at that level of resolution because of the huge computational expense that would require,” said Dan Martin, a computational scientist and group leader of the Applied Numerical Algorithms Group in the Lab’s Computational Research Division and a co-developer of BISICLES. “With AMR, we can deploy high resolution only where we need it, so as the ice sheet evolves, you can automatically change where that resolution goes.” AMR is a technique that has been developed at Berkeley Lab over the last 25 years and used to enable efficient and accurate simulations across a wide range of applications. BISICLES is implemented in Chombo, one of the resulting software frameworks.
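The core idea behind AMR can be illustrated with a toy sketch: flag only the cells where the solution is changing sharply, and spend fine resolution there. The example below is a simplified illustration in Python, not BISICLES or Chombo code; the function name, the gradient-based tagging criterion, and the threshold value are all assumptions chosen for demonstration.

```python
import numpy as np

def tag_cells_for_refinement(velocity, threshold):
    """Return a boolean mask of cells whose velocity-gradient
    magnitude exceeds `threshold`; these would be refined."""
    gx, gy = np.gradient(velocity)
    grad_mag = np.hypot(gx, gy)
    return grad_mag > threshold

# Toy ice-speed field with an abrupt jump, loosely mimicking the
# sharp transition at a grounding line or ice-stream margin.
v = np.zeros((64, 64))
v[:, 32:] = 100.0  # fast-flowing region on the right half

tags = tag_cells_for_refinement(v, threshold=10.0)

# Only the narrow band around the jump is flagged, so fine
# resolution is spent on a small fraction of the domain.
print(f"fraction of cells tagged: {tags.mean():.3f}")
```

In a real AMR code the tagged cells are grouped into rectangular patches that are re-meshed at finer resolution, and the tagging is repeated as the solution evolves so the fine grids follow migrating features.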
BISICLES has been in development since 2009 and is currently part of the DOE SciDAC-funded ProSPect application partnership, which aims to improve sea-level projections by bringing a wide range of DOE expertise to bear. Beyond the DOE, researchers all over the world are using BISICLES in their modeling efforts.
“What allowed us to accomplish this work, which entailed an unprecedented 35,000 years of high-resolution full-continent simulations, is the combination of AMR and access to NERSC,” said Martin. “While each of our NERSC runs is not that big in supercomputing terms, each simulation would still have taken 10 years on a desktop computer—we’ve used more than a million CPU hours on NERSC’s Edison supercomputer.”
Source: LBNL