Lawrence Livermore National Lab Chooses Mellanox for Large-Scale Hyperion Cluster

End-to-End 40Gb/s InfiniBand Connectivity Boosts Cluster Performance and Helps LLNL Achieve Government & Industry Collaboration Goals

SC’10, NEW ORLEANS, LA. – Nov. 16, 2010 – Mellanox® Technologies, Ltd. (NASDAQ: MLNX; TASE: MLNX), a leading supplier of high-performance, end-to-end connectivity solutions for data center servers and storage systems, today announced that Lawrence Livermore National Laboratory (LLNL) is utilizing Mellanox’s full range of end-to-end 40Gb/s InfiniBand products, including ConnectX®-2 adapter cards, 648-port, 324-port and 36-port switches, and cables in its upgraded Hyperion cluster. The nearly 1,300-node Hyperion cluster will provide LLNL with optimized clustered application performance and will help enable its long-term industry collaboration objectives.

The Hyperion cluster utilizes technology from Dell, Intel, Mellanox and other vendors to create a large-scale testbed for critical national security applications, and to support the goal of making PetaFLOP/s (quadrillion floating-point operations per second) compute and storage more accessible for industry and research development.

Hyperion represents an innovative way to accelerate the development of next-generation high-performance cluster computing systems critical to the National Nuclear Security Administration’s (NNSA) effort to ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing. The LLNL Hyperion cluster also supports shared research activities and objectives of the Livermore Valley Open Campus (LVOC), which aims to ensure the nation’s economic competitiveness by facilitating collaboration between R&D labs, industry, and academia. 

“The Hyperion business model is being used as the prototype ‘innovative government, industry and academic’ partnership model,” said Dr. Mark Seager, Assistant Department Head for Advanced Technology at LLNL. “By working with industry leaders such as Mellanox, we can better fulfill our national security missions and enhance U.S. economic competitiveness in an ever-more competitive global marketplace by advancing the state-of-the-art in high performance computing.”

“We value our continuing collaborations with Lawrence Livermore National Laboratory that aim to enhance high-performance computing and storage accessibility for government and academic researchers,” said John Monson, vice president of marketing at Mellanox Technologies. “We are pleased to have our industry-leading 40Gb/s InfiniBand solution chosen by LLNL to be, once again, the high-performance networking backbone for their joint laboratory initiative.”

Mellanox’s end-to-end InfiniBand connectivity, consisting of the ConnectX®-2 line of I/O adapter products, gateways, cables and comprehensive IS5000 family of fixed and modular switches, delivers industry-leading bandwidth, efficiency, reliability and scalability. Mellanox provides its worldwide customers with the broadest, most advanced and highest performing end-to-end networking solutions for the world’s most compute-demanding applications.

About Mellanox
Mellanox Technologies is a leading supplier of end-to-end connectivity solutions for servers and storage that optimize data center performance. Mellanox products deliver market-leading bandwidth, performance, scalability, power conservation and cost-effectiveness while converging multiple legacy network technologies into one future-proof solution. For the best in performance and scalability, Mellanox is the choice for Fortune 500 data centers and the world’s most powerful supercomputers. Founded in 1999, Mellanox Technologies is headquartered in Sunnyvale, California and Yokneam, Israel. For more information, visit Mellanox at

Mellanox, BridgeX, ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, InfiniPCI, and Virtual Protocol Interconnect are registered trademarks of Mellanox Technologies, Ltd. CORE-Direct, FabricIT, and PhyX are trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.
