Mellanox Provides Leading I/O Performance and Fabric Flexibility for Dell™ PowerEdge™ C6100

40Gb/s InfiniBand and 10GbE Connectivity Provides IT Administrators with Critical Bandwidth, Ultra Low-Latency, Ease-of-Use and Fabric Flexibility

SUNNYVALE, CA. and YOKNEAM, ISRAEL – Nov. 8, 2010 – Mellanox® Technologies, Ltd. (NASDAQ: MLNX; TASE: MLNX), a leading supplier of high-performance, end-to-end connectivity solutions for data center servers and storage systems, today announced that its ConnectX®-2 adapter card with Virtual Protocol Interconnect® (VPI) technology is now available on Dell PowerEdge C6100 ultra-dense rack servers. Mellanox’s ConnectX-2 VPI adapter card provides industry-best bandwidth and latency performance over 40Gb/s InfiniBand and 10 Gigabit Ethernet, delivering unmatched clustering, GPU-based computing, and I/O virtualization performance to high-performance computing, Web 2.0 and cloud users.

“Simplified I/O networking and performance are imperative to reducing application runtime and operational expenses, and to enabling greater resource utilization within the data center,” said John Monson, vice president of marketing at Mellanox Technologies. “The combination of Mellanox’s InfiniBand and Ethernet connectivity products and Dell PowerEdge C6100 rack servers reduces network complexity and provides end users with an ultra-dense, performance-optimized solution for a variety of performance-demanding business applications.”

“IT and cluster administrators are requiring greater bandwidth performance, better CPU utilization, and more efficient use of their data center space,” said Reuben Martinez, Engineering Director, Data Center Solutions (DCS), at Dell. “Our Dell PowerEdge C6100 provides just that. When coupled with Mellanox ConnectX-2 VPI adapters, this system provides customers with enhanced application performance and scalability, while reducing overall space, energy costs and networking complexity.”

Mellanox ConnectX-2 VPI adapter cards can connect to either 40Gb/s InfiniBand or 10GbE networks and are suited to a variety of business and clustering applications. ConnectX-2’s 40Gb/s InfiniBand connectivity, with its 1us application latency, enables high server productivity and supports application and communication offloads with CORE-Direct™ and GPU efficiency enhancements via NVIDIA GPUDirect™. ConnectX-2’s 10GbE connectivity, with support for industry-standard RDMA over Converged Ethernet (RoCE), delivers industry-leading Ethernet-based latency of 1.3us. FlexBoot™ enables servers to boot from remote InfiniBand or LAN storage targets and makes it easier for IT managers to deploy infrastructure that meets the challenges of a dynamic data center.

Availability
Mellanox ConnectX-2 VPI adapter cards are available now through Dell.

About Mellanox
Mellanox Technologies is a leading supplier of end-to-end connectivity solutions for servers and storage that optimize data center performance. Mellanox products deliver market-leading bandwidth, performance, scalability, power conservation and cost-effectiveness while converging multiple legacy network technologies into one future-proof solution. For the best in performance and scalability, Mellanox is the choice for Fortune 500 data centers and the world’s most powerful supercomputers. Founded in 1999, Mellanox Technologies is headquartered in Sunnyvale, California and Yokneam, Israel. For more information, visit Mellanox at www.mellanox.com.

Mellanox, BridgeX, ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, InfiniPCI, PhyX, and Virtual Protocol Interconnect are registered trademarks of Mellanox Technologies, Ltd. CORE-Direct and FabricIT are trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.
###