Let’s Talk Exascale: Forecasting Water Resources and Severe Weather with Greater Confidence
In this episode of ECP’s podcast, Let’s Talk Exascale, Mark Taylor of Sandia National Laboratories, principal investigator of the E3SM-MMF project, talks about using exascale supercomputers for severe weather and water resource forecasting. E3SM-MMF, a sub-project within the US Department of Energy’s (DOE’s) Exascale Computing Project (ECP), is working to improve the ability to simulate the water cycle and the processes around precipitation.
Deep Learning for Predicting Severe Weather
Researchers from Rice University have introduced a data-driven framework that formulates extreme weather prediction as a pattern recognition problem, employing state-of-the-art deep learning techniques. As the authors put it: “In this paper, we show that with deep learning you can do analog forecasting with very complicated weather data — there’s a lot of promise in this approach.”
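The core of analog forecasting is simple to state: find past states of the atmosphere that resemble the current one, and use what happened next as the forecast. The sketch below illustrates that idea with plain numpy on synthetic data; it is a hedged illustration only, and the Rice team’s actual contribution is replacing this kind of raw-distance similarity search with learned deep-network representations of weather maps.

```python
# Minimal analog-forecasting sketch (illustrative only; the Rice group's
# actual method uses deep convolutional networks on weather fields).
import numpy as np

rng = np.random.default_rng(0)

# Toy "historical archive": 1000 past atmospheric states (flattened fields)
# and the state observed 24 hours after each one.
archive_states = rng.standard_normal((1000, 64))
archive_next = rng.standard_normal((1000, 64))

def analog_forecast(current_state, k=5):
    """Forecast by averaging the outcomes of the k most similar past states."""
    dists = np.linalg.norm(archive_states - current_state, axis=1)
    nearest = np.argsort(dists)[:k]
    return archive_next[nearest].mean(axis=0)

forecast = analog_forecast(rng.standard_normal(64))
print(forecast.shape)  # (64,)
```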
Computational Evaluation of Cloud HPC with a Global Atmospheric Model
Daniel Arevalo from DeVine Consulting gave this talk at the HPC User Forum. “Prompted by DoD priorities for modernization, cost savings, and redundancy, this project compared the performance of the NAVGEM on an in-house Cray system against the following cloud offerings: AWS c4.8xlarge, Penguin B30 queue, Azure H16r, and AWS c5n.18xlarge.”
Accelerating High-Resolution Weather Models with Deep-Learning Hardware
Sam Hatfield from the University of Oxford gave this talk at the PASC19 conference. “In this paper, we investigate the use of mixed-precision hardware that supports floating-point operations at double-, single- and half-precision. In particular, we investigate the potential use of the NVIDIA Tensor Core, a mixed-precision matrix-matrix multiplier mainly developed for use in deep learning, to accelerate the calculation of the Legendre transforms in the Integrated Forecasting System (IFS), one of the leading global weather forecast models.”
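The key observation is that a spectral Legendre transform is a dense matrix product, which is exactly the operation Tensor Cores accelerate. The numpy sketch below mimics the precision setup (half-precision inputs, single-precision accumulation) to show the idea and the kind of rounding error involved; it is not the IFS implementation, and any rescaling tricks the paper applies are omitted.

```python
# Sketch: a Legendre synthesis (spectral -> grid) is a dense matrix-vector
# product, so it can be cast onto low-precision matrix-multiply hardware.
# float16 stands in here for Tensor Core half precision.
import numpy as np

n = 256
x = np.cos(np.pi * (np.arange(n) + 0.5) / n)    # sample points in (-1, 1)
L = np.polynomial.legendre.legvander(x, n - 1)  # Legendre basis matrix

# Synthetic spectral coefficients with a decaying spectrum.
coeffs = np.random.default_rng(1).standard_normal(n) / (1 + np.arange(n))

# Reference transform in double precision.
grid_fp64 = L @ coeffs

# Same product with inputs rounded to half precision and the product
# accumulated in float32, mimicking a Tensor Core GEMM's
# fp16-in / fp32-accumulate behaviour.
grid_fp16 = (L.astype(np.float16).astype(np.float32)
             @ coeffs.astype(np.float16).astype(np.float32))

err = np.abs(grid_fp64 - grid_fp16).max() / np.abs(grid_fp64).max()
print(f"relative error of half-precision transform: {err:.2e}")
```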
ClimateNet Looks to Machine Learning for Global Climate Science
Pattern recognition tasks such as classification, localization, object detection and segmentation have remained challenging problems in the weather and climate sciences. Now, a team at the Lawrence Berkeley National Laboratory is developing ClimateNet, a project that will bring the power of deep learning methods to identify important weather and climate patterns via expert-labeled, community-sourced open datasets and architectures.
Video: Big Data Assimilation Revolutionizing Weather Prediction
In this video from the HPC User Forum in Tucson, Takemasa Miyoshi from RIKEN presents: Big Data Assimilation Revolutionizing Weather Prediction. “A new project harnessing data from a Japanese satellite called Himawari-8 could improve weather forecasting and allow officials to issue life-saving warnings before natural disasters. The breakthrough is the result of pairing data collected by Japan’s Himawari-8 weather satellite with a program run on a supercomputer at the RIKEN science institute.”
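Data assimilation blends a model forecast with incoming observations, weighted by their respective uncertainties; the challenge Miyoshi describes is doing this with torrents of satellite data on very short update cycles. Below is a toy stochastic ensemble Kalman filter update in numpy, a minimal sketch of one assimilation step rather than RIKEN’s production system (which runs a localized ensemble transform Kalman filter at far larger scale).

```python
# Toy stochastic ensemble Kalman filter update: one assimilation step
# fusing a forecast ensemble with new observations. All values synthetic.
import numpy as np

rng = np.random.default_rng(2)

n_state, n_obs, n_ens = 40, 10, 20
H = np.zeros((n_obs, n_state))             # observation operator:
H[np.arange(n_obs), np.arange(0, n_state, 4)] = 1.0  # observe every 4th variable

ensemble = rng.standard_normal((n_state, n_ens))  # forecast ensemble
obs = rng.standard_normal(n_obs)                  # incoming observations
R = 0.5 * np.eye(n_obs)                           # observation-error covariance

# Sample forecast covariance from ensemble anomalies.
X = ensemble - ensemble.mean(axis=1, keepdims=True)
Pf = X @ X.T / (n_ens - 1)

# Kalman gain weighs forecast against observation uncertainty.
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)

# Perturbed-observation update applied to each ensemble member.
perturbed = obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
analysis = ensemble + K @ (perturbed - H @ ensemble)
print(analysis.shape)  # (40, 20)
```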
PSSC Labs offers HPC Solutions for Weather Modeling and Wildfire Analysis
PSSC Labs is offering HPC cluster solutions for weather modeling and wildfire analysis in collaboration with Atmospheric Data Solutions (ADS). “The solutions ADS is developing use a comprehensive blend of predictors that are important to large wildfire potential, including dead fuel and live fuel moisture, near-surface atmospheric moisture and temperature, and wind speed and gusts.”
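As a purely hypothetical illustration of blending such predictors, the sketch below combines normalized versions of the named inputs into a single fire-potential score. The feature scalings, weights, and functional form are invented for the example; ADS’s actual models are not public.

```python
# Hypothetical blend of the wildfire predictors named above into one
# risk score in [0, 1]. Weights and normalizations are illustrative only.
import numpy as np

def fire_potential(dead_fuel_moisture, live_fuel_moisture,
                   rel_humidity, temp_c, wind_ms, gust_ms,
                   weights=(0.25, 0.20, 0.15, 0.15, 0.15, 0.10)):
    """Weighted blend of normalized predictors; higher means more risk."""
    drivers = np.array([
        1.0 - dead_fuel_moisture / 30.0,   # drier dead fuels -> more risk
        1.0 - live_fuel_moisture / 200.0,  # drier live fuels -> more risk
        1.0 - rel_humidity / 100.0,        # drier air -> more risk
        temp_c / 45.0,                     # hotter -> more risk
        wind_ms / 25.0,                    # windier -> more risk
        gust_ms / 35.0,                    # gustier -> more risk
    ])
    return float(np.clip(drivers, 0.0, 1.0) @ np.array(weights))

print(fire_potential(5.0, 80.0, 15.0, 38.0, 12.0, 20.0))  # hot, dry, windy day
```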
Building a GPU-enabled and Performance-portable Global Cloud-resolving Atmospheric Model
Richard Loft from NCAR gave this talk at the NVIDIA booth at SC17. “The objectives of NCAR’s exploration of accelerator architectures for high performance computing in recent years have been to 1) speed up the rate of code optimization and porting and 2) understand how to achieve performance portability on codes in the most economical and affordable way.”
Video: Scientel Runs Record-Breaking Calculation on Owens Cluster at OSC
In this video, Norman Kutemperor from Scientel describes how his company ran a record-setting big data problem on the Owens supercomputer at OSC.
“The Ohio Supercomputer Center recently displayed the power of its new Owens Cluster by running the single-largest scale calculation in the Center’s history. Scientel IT Corp used 16,800 cores of the Owens Cluster on May 24 to test database software optimized to run on supercomputer systems. The seamless run created 1.25 Terabytes of synthetic data.”
Chinese Research Team Wins Gordon Bell Prize using #1 Sunway TaihuLight Supercomputer
A weather science team from China has won the 2016 ACM Gordon Bell Prize for their research project, “10M-Core Scalable Fully-Implicit Solver for Nonhydrostatic Atmospheric Dynamics.” The winning team presented a method for calculating atmospheric dynamics on the world’s fastest computer, the 93-petaflops Sunway TaihuLight system. “On the road to the seamless weather-climate prediction, a major obstacle is the difficulty of dealing with various spatial and temporal scales. The atmosphere contains time-dependent multi-scale dynamics that support a variety of wave motions.”
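The reason a fully implicit solver is worth the effort: explicit schemes must resolve the fastest waves in the system, while an implicit scheme stays stable at much larger time steps, at the cost of solving a nonlinear system each step. The toy backward-Euler/Newton loop below, on a stiff scalar ODE, illustrates that trade-off; it is a sketch of the general technique, not the winning team’s solver.

```python
# Backward Euler with Newton iteration on a stiff scalar ODE.
# The implicit step stays stable at a dt roughly 50x larger than the
# explicit (forward Euler) stability limit of 2/|lam| = 0.002.
import numpy as np

lam = -1000.0                          # stiff time scale
f = lambda u: lam * u + np.sin(u)      # nonlinear right-hand side
fprime = lambda u: lam + np.cos(u)     # its derivative

def backward_euler_step(u_old, dt, iters=8):
    """Solve g(u) = u - u_old - dt*f(u) = 0 with Newton's method."""
    u = u_old
    for _ in range(iters):
        g = u - u_old - dt * f(u)
        u -= g / (1.0 - dt * fprime(u))  # Newton update with g'(u)
    return u

u, dt = 1.0, 0.1
for _ in range(50):
    u = backward_euler_step(u, dt)
print(f"stable solution after 50 large steps: {u:.6f}")
```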