By Larrisa Beth Turner
With access to the Downtown Data Center (DDC) tightly monitored and controlled, open only to authorized staff members, it’s no surprise that outside interest remains high in the UIT facility nicknamed “one of the U’s better-kept secrets.”
So when UIT’s Center for High Performance Computing (CHPC) recently announced an opportunity for university affiliates to explore the data center, the reservation list quickly reached its limit. The facility, operated by Glen Cameron and his team, houses a significant portion of the University of Utah's storage and server capabilities.
Kelly Peterson said he signed up for the April 5 tour — his second one — because he likes to learn about the various research taking place at the university, as well as the role CHPC and the DDC play in supporting those projects.
“I like to see how they are facilitating all of that,” said Peterson, who is a University of Utah Health employee but works on projects for Veterans Affairs, “and as someone who enjoys building machines and maintaining machines, it's really interesting to see this happen at scale.”
Peterson joined about a dozen other people on the tour led by CHPC Director Tom Cheatham, Assistant Director for Research Consulting and Faculty Engagement Anita Orendt, and Senior IT Architect Brian Haymore. Nearly all the attendees, Orendt pointed out, use CHPC already, so it’s natural they’d be curious about the data center and its systems.
“Some people just want to see the machines they run on — they have no idea what it is,” Cheatham said.
That was the case for Lexie Wilson and Chris Pennell of the Utah Division of Air Quality (UDAQ), which began its collaboration with CHPC about five years ago to conduct air quality modeling. Their photochemical model simulates the chemistry, physics, and meteorology needed to demonstrate whether the rules and regulations that UDAQ employs and proposes will make a difference in the future, Pennell said.
“We really can’t do what we do on just regular desktop-class computers,” he added, noting that UDAQ once had a cluster of four computers duct-taped together in a closet that wasn’t as capable or reliable as CHPC’s advanced computing resources. “… [The partnership has] worked out well for us, and I think the EPA and the state of Utah have benefited a lot.”
About halfway through the tour, Wilson and Pennell finally got what they came for. Haymore escorted them to the Kings Peak cluster, where he singled out just a few UDAQ nodes among dozens of others in the rack. Wilson bent down and snapped a few photos.
Later, when asked about their impressions of the facility, Pennell and Wilson said the data center wasn’t quite what they expected. Pennell referenced a picture of wires and neon lights glowing among darkened clusters, an image featured on the CHPC website and one Cheatham had shown to visitors earlier in the day. In reality, the DDC is fairly well lit.
“You imagine everything's really dark and there's neon lights everywhere and Edward Snowden is walking around,” Wilson said, laughing. “I also imagined that we would have more physical computer stuff — like we have three nodes, four nodes, and it took up, just a loaf of bread worth of space. And that was a little surprising. I don't know why we would have more space; if anything, it's cool that we have less space.”
Peterson, who toured the facility with his colleagues, said the supercomputers surpassed their childhood expectations.
“They look different than what we thought,” Peterson said, “but it's visually impressive, to be able to see the racks and racks of everything.”
Cheatham, Orendt, and Haymore also dazzled guests with insider information — from the inner workings of the physical and IT infrastructure to various cost-saving measures. Visitors, of course, had some follow-up questions.
“How much water does this facility use?”
“Is this powered from a substation?”
“So this was meant to be a data center before you even bought it?”
“Do you have any FPGAs?”
“Are there other data centers?”
The last question, Cheatham said, doesn’t come with a simple answer, as it depends on how one defines “data center” — some buildings on campus have network access points, network closets, and machine rooms.
Still, none compares to the DDC.
Pennell, a former U atmospheric sciences graduate student, noted the difference between the systems housed on the fourth floor of the Intermountain Network Scientific Computation Center (INSCC) on campus, where he studied, and the Downtown Data Center.
“This is just a vision,” he said. “I was just really impressed by the level of resources and the level of room to grow.”
Peterson also spoke highly of the data center, noting the attention to detail and organization. He said he enjoyed meeting the CHPC and DDC staff, seeing what they do, “and being able to say thanks.”
“They do a really great job,” Peterson added. “[The data center is] a great thing, and we're lucky to have this here at the university.”