Blade Server from NextComputing Offers Simplified, Rapid Application Deployment with VMware vCenter Server and vSphere 5

Dense chassis features six (6) CPUs, 288GB total memory, and an independent management server in 2U of rack space

NextStream 2U high-density blade server

NextComputing, manufacturer of small form-factor and dense computing servers and workstations, announces support for VMware® vCenter™ Server and the vSphere™ hypervisor in its NextStream blade server. The compact, 2U-high NextStream is optimized for IT environments seeking to reduce hardware footprint while maintaining the performance and flexibility of traditional servers.

As the leader in virtual machine technology, VMware has allowed enterprises to drastically reduce hardware expenditures while easily ramping up cloud services for their users. With its focus on server consolidation, NextComputing provides dense computing platforms that enable these enterprises to deploy infrastructure with as small a hardware footprint as possible. That is why NextComputing has integrated the VMware vCenter™ Server virtualization management console and the vSphere™ hypervisor into the NextStream.

Today’s budget-conscious enterprises and government agencies are trying to rein in the cost of deploying their applications. IT managers and CTOs alike are turning to virtualization to minimize the amount of hardware required to roll out new services and to simplify the management of many physical and virtual servers. In particular, with the rapid move to the cloud, the requirements for back-end processing and storage have increased dramatically as IT departments hurry to satisfy demand for cloud-based applications.

The NextStream high-density blade server solves this problem by consolidating the hardware needed to run sophisticated deployments of multiple physical hosts and virtual machines into a space- and power-saving 2U chassis. NextStream features:

  •     Three server blades, each with dual (2) low-wattage Intel® Xeon® CPUs, capable of running up to six (6) separate vSphere licenses
  •     Six processing cores per CPU, for a total of 36 cores per chassis (72 with Intel HyperThreading enabled)
  •     96GB DDR3 ECC memory per blade, for a total of 288GB per chassis
  •     Embedded management PC server with Intel® Core™ 2 Duo processor and removable solid state disk for running vCenter Server virtualization management console
  •     One (1) or two (2) PCI Express 2.0 x8 full-height (3/4-length) expansion slots per blade, ideal for installing 10Gbit network interface cards
  •     Integrated Gigabit Ethernet switch with 8 rear-access RJ-45 ports (4 up-link, 4 down-link)
  •     Integrated Gigabit Ethernet connection between blades, eliminating the need for external switches and allowing full use of the (8) rear-access ports for other devices
  •     Dual (2) front-access removable 2.5” SATA or SAS hard drives, up to 1TB each
  •     Redundant 2+1 1140W power supply

Providing this amount of computing resources with traditional 1U servers and switches would require 4U of rack space or more. By combining the above features in a single 2U chassis, the NextStream reduces rack space requirements by 50% or more. This means a standard 48U datacenter rack can house twice as much computing power, requiring less floor space and less electricity to run and cool.
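The space-savings claim above can be checked with simple arithmetic. The sketch below assumes a traditional build of three 1U dual-CPU servers plus a 1U Ethernet switch (4U total); the exact make-up of the comparison system is an assumption, not vendor data.

```python
# Back-of-the-envelope rack-density comparison.
# Assumed traditional deployment: three 1U dual-CPU servers + one 1U switch.
traditional_u = 3 * 1 + 1   # 4U of rack space
nextstream_u = 2            # one 2U NextStream chassis

space_reduction = 1 - nextstream_u / traditional_u
print(f"Rack space reduction: {space_reduction:.0%}")  # → 50%

# How many deployments fit in a standard 48U rack?
rack_u = 48
print(rack_u // traditional_u)  # → 12 traditional deployments
print(rack_u // nextstream_u)   # → 24 NextStream chassis (twice as many)
```

With the stated assumptions, the 2U chassis halves the rack space per deployment, which is where the "twice as much computing power per rack" figure comes from.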

For information about NextStream with VMware virtualization solutions, please contact sales(at)nextcomputing(dot)com or call +1-603-886-3874.

About NextComputing

Based in Nashua, NH, NextComputing is a unique technology company specializing in extreme-performance portables and dense streaming rackmount computers. Its open-standards, modular systems are used throughout many industries for a range of professional applications, including real-time 3D visualization, high-throughput data streaming, and high-end application demonstration. Visit the NextComputing website for more information.

Legal disclaimer: All trademarks contained herein are the property of their respective holders.


Contact Author

Aaron Sherman