HPE Demos 'The Machine' Next-Gen Memory-Driven Computing
By Jef Cozza / CRM Daily
Published: November 29, 2016
Hewlett Packard Enterprise (HPE) is betting that a prototype for a new type of system architecture will lead to computers that are several orders of magnitude faster and more powerful than current systems. The new architecture, dubbed memory-driven computing, was developed as part of the company’s Machine research program, one of HPE's largest and most complex projects.

"With this prototype, we have demonstrated the potential of memory-driven computing and also opened the door to immediate innovation,” said Antonio Neri, executive vice president of HPE's enterprise group. “Our customers and the industry as a whole can expect to benefit from these advancements as we continue our pursuit of game-changing technologies."

Non-Volatile Memory

The company said it is "committed to rapidly commercializing the technologies developed under The Machine research project into new and existing products." The proof-of-concept prototype, first brought online in October, is built on four different fundamental technologies: non-volatile memory, fabric, ecosystem enablement, and security, according to the company.

Non-volatile memory is a type of computer memory that retains its contents even after power is removed; flash memory and hard disks are familiar examples. Traditionally, non-volatile memory has been useful for long-term storage but too slow for processing workloads that demand fast, byte-addressable memory such as RAM.

HPE is aiming to develop non-volatile memory that approaches the performance of DRAM while offering the capacity and persistence of traditional storage. The company said it expects to be able to bring such products to market as soon as 2018.
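To illustrate the idea of persistence combined with byte-addressable access, the sketch below uses an ordinary file-backed memory mapping as a stand-in for non-volatile memory. This is only an analogy, not HPE's technology: on real NVM hardware, writes through such a mapping would survive a power cycle rather than merely outliving the process.

```python
import mmap
import os
import tempfile

# Illustration only: a file-backed mmap stands in for byte-addressable
# persistent memory. The file path here is arbitrary.
path = os.path.join(tempfile.gettempdir(), "nvm_demo.bin")

# Create a 4 KiB "persistent region" and store a value in it.
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as region:
        region[0:5] = b"hello"   # byte-addressable store, no I/O call
        region.flush()           # make the write durable

# A later "restart": reopen the region and the value is still there.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as region:
        print(region[0:5])       # b'hello'
```

The point of the analogy is that the program stores and loads data with plain memory operations rather than explicit read/write system calls, which is the access model memory-driven computing is built around.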

The new architecture also makes use of HPE’s X1 photonics module, a relatively new technology, currently in early-stage testing, that uses optical rather than electrical connections to transmit data. Sending data as beams of light over an optical link, rather than as electrons through copper wire, could be a game-changer in data management. HPE has already demonstrated data transfers of up to 1.2 terabits per second using the X1 module.
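To put that 1.2 terabits-per-second figure in context, a quick conversion (assuming decimal units, as link speeds are conventionally quoted):

```python
# Convert the quoted X1 link rate from bits to bytes per second.
# Link speeds use decimal prefixes: 1 terabit = 1e12 bits.
link_rate_bits = 1.2e12            # 1.2 terabits per second
link_rate_bytes = link_rate_bits / 8
print(link_rate_bytes / 1e9)       # 150.0 (gigabytes per second)
```

That is, a single X1 link moves roughly 150 gigabytes of data per second.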

Building out the Ecosystem

In addition to developing faster and more powerful hardware, HPE said it is already working on new types of software specifically designed to take advantage of the new memory-driven architecture. The company has already released some code developed for the upcoming systems on GitHub, and plans to use it in new products coming to market next year. The company’s goal is to develop a rich ecosystem of applications able to exploit memory-driven systems to their fullest.

Finally, HPE said the new architecture will allow developers to build systems that are inherently more secure than existing systems. The company said it is working on new hardware security technologies alongside new security software features with the goal of releasing the new products by 2020.

HPE said that its memory-driven computing architecture scales from tiny Internet of Things devices up to exascale systems, making it an ideal foundation for a wide range of emerging high-performance computing and data-intensive workloads, including big data analytics.

© Copyright 2017 NewsFactor Network. All rights reserved.