Mellanox(TM) Technologies, Ltd. (NASDAQ: MLNX) (TASE: MLNX), a leading supplier of semiconductor-based server and storage interconnect products, announced that later this year VMware is expected to provide enablement for Mellanox InfiniBand-based adapters in the newest release of VMware Infrastructure, which includes VMware ESX Server 3.5 and VMware VirtualCenter 2.5. The InfiniBand LAN networking and block storage drivers are based on the OpenFabrics Enterprise Distribution (OFED) version 1.2.5 and were developed by Mellanox, VMware and other participants of the VMware Community Source program. Mellanox’s industry-leading InfiniBand adapters seamlessly replace the multiple Fibre Channel and Gigabit Ethernet adapters typically deployed in virtualized environments, reducing data center power consumption, cooling requirements, capital expenditure and total cost of ownership.
“As one of the first members to join and actively collaborate in the VMware Community Source program, Mellanox was a key player in enabling InfiniBand I/O technologies for VMware Infrastructure,” said Brian Byun, vice president of global partners and solutions at VMware. “The enablement of InfiniBand-based adapters in VMware Infrastructure allows the entire InfiniBand ecosystem to leverage the market-leading VMware platform.”
“Higher I/O bandwidth and I/O consolidation in VMware environments are critical needs, further exacerbated by the deployment of multi-core CPUs and the I/O real estate constraints driven by green data center initiatives,” said Thad Omura, vice president of product marketing at Mellanox Technologies. “Mellanox is pleased to have worked with VMware to deliver a solution that marries the high price-performance capabilities of Mellanox I/O adapters to the robust, seamless and easy-to-use virtual infrastructure platform from VMware.”
When InfiniBand I/O adapters are used with VMware ESX Server 3.5, a single adapter can replace multiple Gigabit Ethernet NICs and Fibre Channel HBAs while maintaining or enhancing I/O throughput from virtual machines. For example, SAN throughput from a single virtual machine can reach up to 1500 Megabytes per second (MB/s), or it can be shared linearly across multiple virtual machines, e.g., about 400 MB/s per virtual machine across four virtual machines on the same VMware ESX Server host. This is equivalent to using four 4 Gb/s Fibre Channel HBAs, one dedicated to each of the four virtual machines. This I/O consolidation and scale-out, which results in significant cost and power savings, is achieved transparently, as operating systems and applications running in the virtual machines continue to run over the traditional virtual NIC and HBA interfaces available in VMware virtual machines. Network and storage I/O provisioning for virtual machines and features like VMware VMotion are also configured transparently using VMware VirtualCenter 2.5, which exposes only the familiar virtual NIC and virtual HBA interfaces over the unified InfiniBand I/O adapter. Similarly, features such as high availability and migration of virtual machines are preserved as if they were operating on Ethernet NICs and Fibre Channel HBAs.
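The consolidation arithmetic above can be sketched as a quick back-of-the-envelope calculation. This is an illustrative example only, not vendor benchmark code; the 1500 MB/s figure is taken from the release, and the 400 MB/s nominal throughput for a 4 Gb/s Fibre Channel HBA is the commonly cited usable rate for that link speed (an assumption, not a number from this announcement).

```python
# Back-of-the-envelope check of the throughput claims in the release.
# Figures: 1500 MB/s SAN throughput over one InfiniBand adapter (from the
# release); 400 MB/s is the commonly cited usable rate of a 4 Gb/s FC HBA.

IB_SAN_THROUGHPUT_MB_S = 1500   # peak SAN throughput over one InfiniBand adapter
NUM_VMS = 4                     # virtual machines sharing the adapter
FC_4G_NOMINAL_MB_S = 400        # nominal usable throughput of one 4 Gb/s FC HBA

# Linear sharing: each VM's slice of the single InfiniBand adapter.
per_vm_mb_s = IB_SAN_THROUGHPUT_MB_S / NUM_VMS

# Aggregate of four dedicated 4 Gb/s FC HBAs, one per VM.
four_hbas_mb_s = NUM_VMS * FC_4G_NOMINAL_MB_S

print(f"per-VM share of one InfiniBand adapter: {per_vm_mb_s:.0f} MB/s")
print(f"four dedicated 4 Gb/s FC HBAs total:    {four_hbas_mb_s} MB/s")
# One adapter (1500 MB/s) roughly matches four HBAs (1600 MB/s),
# which is the "about 400 MB/s per virtual machine" equivalence in the text.
```

The point of the comparison is that the single unified adapter delivers roughly the same aggregate bandwidth as four dedicated HBAs, while occupying one slot instead of four.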
Because the InfiniBand drivers for VMware Infrastructure are based on OFED 1.2.5, products from the entire ecosystem of InfiniBand infrastructure suppliers that support OFED will interoperate with them.
The new release of VMware Infrastructure is expected to be generally available later in 2007. The software drivers for Mellanox InfiniBand-based adapters for VMware Infrastructure will be generally available from Mellanox in the same timeframe. InfiniBand-based solutions are also expected to become available from some of Mellanox’s partner OEMs.
Mellanox Technologies is a leading supplier of semiconductor-based, high-performance, InfiniBand and Ethernet connectivity products that facilitate data transmission between servers, communications infrastructure equipment and storage systems. The company’s products are an integral part of a total solution focused on computing, storage and communication applications used in enterprise data centers, high-performance computing and embedded systems. Founded in 1999, Mellanox Technologies is headquartered in Santa Clara, California and Yokneam, Israel.
Mellanox, ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, and InfiniPCI are registered trademarks of Mellanox Technologies.