
Center for High Performance Computing to help create SLATE science platform

$4M National Science Foundation grant fosters collaboration

A SLATE edge platform within a campus Science DMZ hosts trusted services operated by a central team, which may operate a network of such services across several campuses. Science “app” developers interact with the SLATE platform service factory to define and launch elements of a science gateway, data cache, or local workflow service. [Photo credit: University of Chicago]

The National Science Foundation (NSF) has awarded the universities of Utah, Chicago, and Michigan a $4 million, four-year grant to produce SLATE, a new online platform that stitches together large amounts of data from multiple institutions and reduces the friction common in multifaceted collaborations. UIT's Center for High Performance Computing (CHPC) is the grant partner representing the University of Utah, with Senior IT Architect Joe Breen as co-PI.

SLATE, which stands for Services Layer At The Edge, will be a local software and hardware platform that connects and interacts with cloud resources. The platform will reduce the technical expertise, physical resources such as servers, and staff time demanded of individual IT departments, especially at smaller universities that lack the resources of larger institutions and computing centers.

From the cosmic radiation measurements by the South Pole Telescope to the particle physics of CERN, multi-institutional research collaborations require computing environments that connect instruments, data, and storage servers. Because of the complexity of the science and the scale of the data, these resources are often distributed among university research computing centers, national high-performance computing centers, or commercial cloud providers. This dispersion of resources causes scientists to spend more time on the technical aspects of computation than on discoveries and knowledge creation.

“Software will be updated just like an app by experts from the platform operations and research teams,” said Breen. “The software will need little to no assistance from local IT personnel. The SLATE platform is designed to work in any data center environment and will utilize advanced network capabilities, if available.”

CHPC will contribute to reference architecture, advanced networking aspects, core design, implementation, and outreach to other principal investigators from science disciplines and partner universities.

Breen has built in funding for four part-time undergraduate students, a part-time system administrator, and a part-time programmer. CHPC staff members currently collaborating with Breen on the grant are Scientific Consultant Brett Milash and Senior IT Architect Brian Haymore. 

“CHPC will also be leveraging the grant to explore different mechanisms for better supporting the research community here at the University and at sites that the U helps to support, i.e., Utah State University, Utah Valley University, and Southern Utah University," Breen added. "Most NSF grants have a broader outreach section which covers how to support the larger community. This grant specifically targets not only the three main universities but also collaborators at New Mexico State University, Clemson, and Florida International University, as well as the broader community.”  

The platform

Once SLATE is installed, central research teams will be able to connect with far-flung research groups and automate the exchange of data, software, and computing tasks among institutions, without burdening local system administrators with the installation and operation of highly customized scientific computing services. By stitching together these resources, SLATE will also expand the reach of domain-specific “science gateways.”

SLATE works by implementing “cyberinfrastructure as code,” augmenting high-bandwidth science networks with a programmable “underlayment” edge platform. This platform hosts advanced services needed for higher-level capabilities such as data and software delivery, workflow services, and science gateway components.
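As a rough illustration of the “cyberinfrastructure as code” idea, the minimal Python sketch below describes a service as data and has a central operations team push the same definition to several campus edge sites. The EdgeService class, the deploy function, and the site names are hypothetical and do not represent the actual SLATE interface.

# Hypothetical sketch of "cyberinfrastructure as code": a trusted service is
# described declaratively, then deployed by a central team to edge sites.
# EdgeService, deploy, and the site names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class EdgeService:
    """Declarative description of a trusted service hosted at a campus edge site."""
    name: str          # e.g., a data cache or a science gateway component
    image: str         # container image providing the service
    version: str       # pinned version, updated centrally "like an app"
    ports: list = field(default_factory=list)

def deploy(service: EdgeService, sites: list) -> None:
    """Pretend to push one service definition to several campus edge sites."""
    for site in sites:
        print(f"[{site}] deploying {service.name} {service.version} from {service.image}")

if __name__ == "__main__":
    cache = EdgeService(name="data-cache",
                        image="example.org/data-cache",
                        version="1.2.0",
                        ports=[1094])
    deploy(cache, ["utah-dmz", "uchicago-dmz", "umich-dmz"])

Because the service is plain data rather than a hand-built installation, the central team can version it, review it, and roll it out to every participating campus in the same way.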

“A central goal of SLATE is to lower the threshold for campuses and researchers to create research platforms within the national cyberinfrastructure,” said University of Chicago senior fellow Robert Gardner.

Practical applications

Today’s most ambitious scientific investigations are too large for a single university or laboratory to tackle alone. Dozens of international collaborations composed of scientific groups and institutions must coordinate the collection and analysis of immense data streams in pursuits such as dark matter searches, the detection of new particles at the high-energy frontier, and the precise measurement of radiation from the early universe. The data can come from telescopes, particle accelerators, and other advanced instruments.

Today, many universities and research laboratories use a “Science DMZ” architecture to balance the need for security with the ability to rapidly move large amounts of data in and out of the local network. As sciences from physics to biology to astronomy become more data-heavy, the complexity and need for these subnetworks grows rapidly, placing additional strain on local IT teams.

Since 2003, a team of Computation Institute and Enrico Fermi Institute scientists led by Gardner has partnered with global projects to create the advanced cyberinfrastructure necessary for rapidly sharing data, computer cycles, and software between partner institutions.

User benefits

“Science, ultimately, is a collective endeavor,” said Shawn McKee, director of the Center for Network and Storage-Enabled Collaborative Computational Science at the University of Michigan. “Most scientists don’t work in a vacuum; they work in collaboration with their peers at other institutions. They often need to share not only data, but systems that allow execution of workflows across multiple institutions. Today, it is a very labor-intensive, manual process to stitch together data centers into platforms that provide the research computing environment required by forefront scientific discoveries.”

With SLATE, local research groups will be able to fully participate in multi-institutional collaborations and contribute resources to their collective platforms with minimal hands-on effort from their local IT team. When joining a project, the researchers and admins can select a package of software from a cloud-based service — a kind of app store — that allows them to connect and work with the other partners.
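A minimal sketch of that “app store” workflow, again in hypothetical Python, appears below: a local administrator lists the packages a collaboration has published and installs one at a campus edge site. The CATALOG contents and the list_packages and install names are invented for illustration and do not reflect the real SLATE tooling.

# Hypothetical "app store" flow: browse a central catalog of packaged science
# services and install one at a local edge site. Names are illustrative only.
CATALOG = {
    "science-gateway-frontend": "2.4.1",
    "regional-data-cache": "1.2.0",
    "workflow-submit-node": "0.9.3",
}

def list_packages(catalog: dict) -> None:
    """Show the packages a collaboration has published to the catalog."""
    for name, version in sorted(catalog.items()):
        print(f"{name:28s} {version}")

def install(package: str, site: str, catalog: dict) -> None:
    """Pretend to install a catalog package at a campus edge site."""
    version = catalog[package]
    print(f"Installing {package} {version} at {site} ...")

if __name__ == "__main__":
    list_packages(CATALOG)
    install("regional-data-cache", site="utah-dmz", catalog=CATALOG)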

By reducing the technical expertise and time demands for participating in multi-institution collaborations, SLATE will be especially helpful to smaller universities. The SLATE functionality can also support the development of “science gateways” that make it easier for individual researchers to connect to HPC resources such as the Open Science Grid and XSEDE.
