It’s still early, but there are some profound technological changes starting to take shape that will fundamentally alter not only the server market, but also how we define what a server actually does.
As the price of memory continues to fall and the number of processors on a chip increases, running more application code in memory is becoming far more feasible. Instead of relying on hard disks as primary storage, next-generation servers are going to rely on memory to service I/O requests that used to be handled by primary storage systems.
In fact, examples of these new in-memory computing servers, such as the High-Performance Analytics Appliance (HANA) that SAP developed in conjunction with Dell, IBM and Hewlett-Packard, are already coming to market. As customers become more aware of this trend, many of them are going to sharply curtail purchases of existing servers in anticipation of new server systems that could make many existing servers obsolete by this time next year.
In fact, SAP CIO Oliver Bussmann says that much of our existing IT infrastructure will flatten because there won’t be a need for separate systems to run data warehousing and analytic applications. Those applications will access memory directly on the same in-memory server that is running the production applications.
Andy Lark, head of global marketing for large enterprises for Dell, adds that next-generation servers are going to rely on in-memory processing to process workloads in parallel. When an IT organization wants to scale up an application, it will simply add more in-memory server capacity, scaling out the application using concepts typically associated with high-performance computing (HPC) environments.
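The scale-out model Lark describes can be sketched in a few lines: an in-memory dataset is carved into chunks, each chunk is processed by a separate worker, and adding workers (standing in here for added server capacity) increases parallelism. This is an illustrative sketch using Python threads, not a representation of any actual HANA or Dell architecture:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of HPC-style scale-out: partition an
# in-memory dataset and aggregate the chunks in parallel.
def partial_sum(chunk):
    return sum(chunk)

def parallel_total(data, workers=4):
    # one chunk per worker; more workers = more parallel capacity
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

Because each worker only ever touches its own in-memory chunk, the work divides cleanly, which is the same property that makes these workloads a natural fit for HPC-style clusters.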
In fact, when it comes to deploying systems for a cloud computing environment, Lark says there is little difference between what is required to build a cloud computing system and the way traditional HPC systems are managed.
Like most innovations, next-generation servers are a double-edged sword in the channel. On the one hand, they represent an opportunity to essentially make over the entire data center. At the same time, many solution providers depend on a steady stream of server-related revenue emanating from the data center, so anything with the potential to disrupt that flow has to be viewed with some concern. But as every solution provider knows, whoever is forewarned usually winds up being the most forearmed.

Tags: channel, HPC, high performance computing, Cloud Computing, solution providers, High Performance Analytics Appliance, HANA, SAP, Storage, Hewlett-Packard, IBM, Dell, in-memory computing