
Supercomputer Commissioned To Visualize Larger Data Sets

The National Science Foundation (NSF) has awarded a $7 million, three-tier grant to the Texas Advanced Computing Center (TACC) at The University of Texas at Austin to run a supercomputer system that will carry out large-scale scientific visualization and data analysis for U.S. researchers and educators.

When operational, the new system, nicknamed Longhorn, will be capable of 20.7 trillion floating-point operations per second (20.7 teraflops), with additional computational muscle provided by a set of graphics processing units (GPUs). For visualization duties, the system will be able to render up to 154 billion triangles per second. It will have a distributed working memory of 13.5 terabytes and be supported by a 210 TB storage system. Researchers will be able to access the system from their own offices.

"TACC and its partners will enable the analysis of data at the largest scales and will bring desktop capabilities to the user via a remote resource," said Kelly Gaither, principal investigator and director of data and information analysis at TACC, in a statement.

Gaither noted that the system was designed to better exploit the parallel-processing architecture inherent in low-cost commodity clusters, which have exploded in popularity in the high-performance computing space in the past several years.
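The article does not describe TACC's software design, but the data-parallel pattern such commodity clusters are built to exploit can be sketched in a few lines of Python, assuming the mpi4py library: each processor core works on its own slice of a large dataset, and the partial results are combined at the end.

    # Illustrative sketch only (not TACC code): the data-parallel pattern
    # commodity clusters exploit, assuming mpi4py. One MPI rank runs per
    # core; each analyzes its own slice of a large dataset, then the
    # partial results are reduced to a single answer.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Stand-in for one rank's share of a much larger dataset.
    local_chunk = np.random.rand(1_000_000)

    # Each rank computes its partial result independently ...
    local_sum = local_chunk.sum()

    # ... and the partial results are combined across the cluster.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"Global sum over {size} ranks: {total:.3f}")

A script like this is typically launched across the cluster with a command such as mpirun -np <cores> python analyze.py, so capacity grows simply by adding more nodes.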

"The capabilities of [visualization and data-analysis] resources have not kept pace with the explosive rate of data production leading to a critical juncture in computational science," Gaither said. "Interactive visualization, data analysis and timely data assimilation are necessary for exploring important and challenging problems throughout science, engineering, medicine, national security and safety, to name a few important areas."

In terms of hardware, the system will comprise 256 Dell R610 and R710 servers. Each server will have two 2.53 GHz quad-core Intel Xeon 5500 processors, giving the system a total of 2,048 processor cores. The system will also have at its computational disposal 128 Nvidia Quadroplex 2200 S4 units, each holding four Quadro FX 5800 GPUs, for a system-wide total of 512 GPUs and 122,880 CUDA processor cores.
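Those headline numbers can be checked roughly from the component counts above. The following back-of-the-envelope sketch assumes 4 double-precision floating-point operations per cycle for each Xeon 5500 core and 240 CUDA cores per Quadro FX 5800 GPU, figures that are not stated in the article.

    # Back-of-the-envelope check of Longhorn's published figures.
    # Assumptions not stated in the article: 4 double-precision flops per
    # cycle per Xeon 5500 core, and 240 CUDA cores per Quadro FX 5800 GPU.
    servers = 256
    cores_per_server = 2 * 4              # two quad-core Xeon 5500s per server
    clock_ghz = 2.53
    flops_per_cycle = 4                   # assumed per-core issue rate

    cpu_cores = servers * cores_per_server
    peak_tflops = cpu_cores * clock_ghz * flops_per_cycle / 1000.0
    print(cpu_cores)                      # 2048 processor cores
    print(round(peak_tflops, 1))          # ~20.7 teraflops

    quadroplex_units = 128
    gpus_per_unit = 4
    cuda_cores_per_gpu = 240              # assumed for the Quadro FX 5800
    print(quadroplex_units * gpus_per_unit * cuda_cores_per_gpu)  # 122880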

On the software side, TACC plans to offer a collection of open source and commercial visualization and data-analysis tools, as well as a framework for easily incorporating such software into end-user projects. Several institutions will provide advanced support services for the software: the Scientific Computing and Imaging Institute at the University of Utah, the Purdue Visual Analytics Center, the Data Analysis Services Group at the National Center for Atmospheric Research, and the University of California, Davis.
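The article does not name the specific packages, but as a rough illustration of the kind of scripted, remotely driven visualization such a toolset supports, here is a minimal sketch using ParaView's Python interface, one widely used open source parallel visualization tool; the server address and dataset path are hypothetical placeholders.

    # Minimal sketch of a scripted remote-visualization session, assuming
    # ParaView's Python interface (paraview.simple). The host, port, and
    # dataset path below are hypothetical placeholders.
    from paraview.simple import Connect, OpenDataFile, Show, Render, SaveScreenshot

    # Attach to a ParaView server running on the remote visualization cluster.
    Connect("vis.example.edu", 11111)

    # Load a large remote dataset; the heavy rendering happens server-side.
    reader = OpenDataFile("/scratch/simulation_output.vtu")
    Show(reader)
    Render()

    # Only the rendered image travels back to the researcher's desktop.
    SaveScreenshot("snapshot.png")

This mirrors the remote-access model described above: the data and rendering stay on the cluster, and only images and analysis results return to the researcher's desktop.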

TACC expects the system to be fully operational by January 2010.

NSF's Office of Cyberinfrastructure (OCI) awarded the grant under its eXtremeDigital program, an ongoing effort to provide American researchers and educators with the ability to work with large data sets. "These eXtremeDigital program awards meet the end-to-end needs of the user science community spot-on," said Barry Schneider, program director for OCI. "They will enable visualizing the results from complex numerical simulations and experiments in real time."

This award is funded under the American Recovery and Reinvestment Act of 2009.

About the Author

Joab Jackson is the chief technology editor of Government Computer News (GCN.com).
