As of Tuesday, the National Weather Service bulked up its forecasting muscle, completely switching over to a pair of high-performance computers, Dogwood and Cactus, that rank 49th and 50th among the fastest supercomputers in the world.
The pair triples the computing power NWS had under its previous supercomputing system. More on the names later.
Dogwood and Cactus are a “game changer for NOAA” with their increased speed and redundancy, NWS Director Ken Graham told a group of reporters Monday.
General Dynamics IT built both computers under a task order as part of the Weather and Climate Operational Supercomputing System II contract won in 2020. The fixed-price task order has a value of $150 million, while the entire contract has a ceiling of $505 million.
GDIT sees the win and roll-out as cementing its role as a leading supercomputer provider to the federal government, said Kevin Connell, GDIT’s vice president and general manager for science and engineering.
The company also operates supercomputing environments for the National Oceanic and Atmospheric Administration's research and development organization, the Environmental Protection Agency and the Defense Department.
Cray, Cloud HQ and Redline partnered with GDIT to build the new NWS supercomputers.
The system is able to ingest at least 35 terabytes of data a day, said Harper Pryor, GDIT's high-performance computing program manager for NOAA. Each supercomputer operates at a speed of 12.1 petaflops.
Pryor said Dogwood and Cactus take in data that is collected from a multitude of sensors on satellites, weather stations, weather balloons, ocean buoys and even the wing tips of aircraft.
But the power of the new supercomputers isn’t just about the amount of data they can take in. There is also the higher speed and number of times they can run the models that produce the weather forecasts and other products.
“The more you run the models, the more it increases the confidence in the forecast,” she said.
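The idea behind running the models repeatedly is ensemble forecasting: the same model is run many times with slightly perturbed inputs, and the spread of the outcomes indicates how confident forecasters can be. The toy sketch below is purely illustrative, not an NWS model; the model function, drift value and perturbation size are all invented for the example.

```python
import random
import statistics

def toy_model(initial_temp_c, perturbation):
    """A stand-in 'model': tomorrow's temperature drifts from today's."""
    drift = -1.5  # assumed cooling trend, purely for illustration
    return initial_temp_c + drift + perturbation

def run_ensemble(initial_temp_c, members, spread_c, seed=42):
    """Run the model `members` times with random input perturbations."""
    rng = random.Random(seed)
    return [toy_model(initial_temp_c, rng.gauss(0, spread_c))
            for _ in range(members)]

forecasts = run_ensemble(initial_temp_c=20.0, members=100, spread_c=0.8)
mean = statistics.fmean(forecasts)
stdev = statistics.stdev(forecasts)
# A tight spread (small standard deviation) means the ensemble members
# agree, which translates into higher confidence in the forecast.
print(f"ensemble mean: {mean:.1f} C, spread (std dev): {stdev:.2f} C")
```

More ensemble members give a more stable estimate of both the expected outcome and its uncertainty, which is why extra model runs translate into added confidence.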
Confidence is critical when considering who relies on the forecasts produced by the National Weather Service. Those forecasts are about more than knowing whether it will be a nice day at the beach.
Every sector of the economy needs weather information as do cities, states, the federal government, first responders and emergency management organizations.
The contract also stands out in the way NWS competed it. GDIT and the incumbent IBM went head-to-head for the project.
In a traditional competition, even for a fixed-price contract, the government issues requirements and companies respond with a solution and a price.
But in this case, NOAA's solicitation stated the amount it wanted to spend, and the agency wanted to spend every penny.
GDIT and IBM each had to respond with a solution that would use all of NWS' available budget.
“We had to balance hardware and software costs and the power. We had to optimize how we maxed out the budget,” Connell said.
It was a unique approach for NWS, to say the least. Bidders had to weigh power utilization, staffing, new capabilities and space considerations, to name a few factors.
As a fixed-price contract, it puts all of the risk on GDIT. The company is responsible for any increases in the salaries of hard-to-hire supercomputing experts and in the cost of computing power.
“The variables are all on us,” Connell said.
For the government, the cost remains at $150 million over 10 years.
GDIT had to consider energy costs and connectivity, which is how the Dogwood supercomputer site ended up in Virginia and Cactus came to be in Arizona. Hence the names.
The sites are on opposite sides of the Continental Divide and on separate power grids. Day-to-day forecast work happens at Cactus, while Dogwood serves as a real-time backup site and the place where NWS will develop new products.
GDIT packed Dogwood and Cactus with newer technologies and designed them to be “future-proof.”
“It’s not just the system they’ll be using tomorrow,” Pryor said Monday. “There is a framework for the future.”