Mellanox InfiniBand Provides NASA with Leading Cluster Performance to Enable Advances in Weather and Climate Research

40Gb/s InfiniBand Provides NCCS Users with Faster Global Modeling and Data Analysis

Sunnyvale, Calif. and Yokneam, Israel – Aug. 9, 2010 – Mellanox® Technologies, Ltd. (NASDAQ: MLNX; TASE: MLNX), a leading supplier of high-performance, end-to-end connectivity solutions for data center servers and storage systems, today announced that its industry-leading end-to-end 40Gb/s InfiniBand connectivity products, including ConnectX®-2 adapter cards, IS5025 switches and cables, provide the high-performance server and storage networking for the new NASA Center for Climate Simulation (NCCS) 1,200-node cluster located at the Goddard Space Flight Center in Greenbelt, Maryland. With 14,400 processors, the new Dell-based cluster will double NCCS computational capabilities to more than 300 trillion floating-point operations per second. At these new performance levels, NCCS users will have the ability to fine-tune global model resolutions, capturing smaller-scale features in the atmosphere and oceans so that they can better understand and predict climate change.
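
For readers who want to sanity-check the stated throughput: aggregate peak performance for a cluster of this era is commonly estimated as processor count × floating-point operations per cycle × clock rate. The release specifies only the processor count and the 300-plus-teraflop total, so the clock rate and flops-per-cycle figures in the sketch below are illustrative assumptions, not published system specifications.

    # Back-of-the-envelope peak estimate for the new NCCS cluster.
    # Only the processor count comes from the release; the other
    # figures are assumptions typical of 2010-era x86 hardware.
    processors = 14_400       # stated in the release
    flops_per_cycle = 4       # assumed: common for x86 cores of the period
    clock_ghz = 2.8           # assumed nominal clock rate
    peak_tflops = processors * flops_per_cycle * clock_ghz / 1000
    print(f"Estimated peak: {peak_tflops:.1f} TFLOPS")  # ~161 TFLOPS

Under these assumed figures, the new system alone would contribute roughly half of the 300-plus teraflops cited, consistent with the release's statement that it doubles NCCS's overall computational capability.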

“Today’s climate science is very data intensive and requires scalable high-performance server clustering for faster data assimilation, analysis, modeling and visualization,” said Phil Webster, Chief, Goddard's Computational and Information Sciences and Technology Office. “The new 1,200-node cluster, connected with Mellanox 40Gb/s InfiniBand, will leverage Mellanox’s offloading technology and its advanced feature set to provide our scientists with significant improvements in performance and resolution to accelerate advancements in weather and climate research.”

“Atmospheric data analysis, weather simulations and climate forecasting consume and produce vast amounts of data,” said John Monson, vice president of marketing at Mellanox Technologies. “By incorporating Mellanox end-to-end 40Gb/s InfiniBand networking, the NASA Center for Climate Simulation can quickly handle these large data sets and provide their extensive team of Earth scientists with faster and more detailed atmospheric modeling and weather projections.”

Mellanox’s end-to-end InfiniBand connectivity, consisting of the ConnectX®-2 line of I/O adapter products, cables and the comprehensive IS5000 family of fixed and modular switches, delivers industry-leading performance, efficiency and economics, offering the best return on investment among performance interconnects. Mellanox provides its worldwide customers with the broadest, most advanced and highest performing end-to-end networking solutions for the world’s most compute-demanding applications.

About Mellanox
Mellanox Technologies is a leading supplier of end-to-end connectivity solutions for servers and storage that optimize data center performance. Mellanox products deliver market-leading bandwidth, performance, scalability, power conservation and cost-effectiveness while converging multiple legacy network technologies into one future-proof solution. For the best in performance and scalability, Mellanox connectivity solutions are a preferred choice for Fortune 500 data centers and the world’s most powerful supercomputers. Founded in 1999, Mellanox Technologies is headquartered in Sunnyvale, California, and Yokneam, Israel. For more information, visit Mellanox at www.mellanox.com.

Mellanox, BridgeX, ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, InfiniPCI, PhyX, and Virtual Protocol Interconnect are registered trademarks of Mellanox Technologies, Ltd. CORE-Direct and FabricIT are trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.
