Intel’s Bold Plan to Reinvent Computer Memory (and Keep It a Secret)
Wired, March 2017
We all read John von Neumann’s groundbreaking paper “First Draft of a Report on the EDVAC” back when it was published in 1945, did we not (just joking)?
You should. And read it well, because not many people have (I know of no one). OK, I admit that some of the mathematics is above me as well, but one thing stuck with me: there are two ways of building a programmable computer. One is how we started doing it, which has become known as the “von Neumann architecture”: a memory unit and a processing unit, with instructions and data shuttling between the two in an endless cycle over a bus.
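To make that cycle concrete, here is a minimal sketch in Python (the instruction set is made up purely for illustration, nothing more): a single memory holds both the program and its data, and the processor does nothing but fetch, decode, and execute, word by word, over that bus.

```python
# A toy stored-program machine, invented purely for illustration
# (this is not a real instruction set). Program and data share one
# memory; the processor fetches, decodes, and executes in a loop,
# and every instruction and operand crosses the memory/CPU boundary.

memory = [
    ("LOAD", 6),    # acc = memory[6]
    ("ADD", 7),     # acc = acc + memory[7]
    ("STORE", 8),   # memory[8] = acc
    ("HALT", 0),
    0, 0,           # unused padding
    40, 2, 0,       # data: operands at 6 and 7, result at 8
]

pc, acc = 0, 0      # program counter and accumulator
while True:
    op, addr = memory[pc]       # fetch: one trip over the bus
    pc += 1
    if op == "LOAD":
        acc = memory[addr]      # execute: another trip over the bus
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])                # -> 42
```

Notice that the program itself is just data in the same memory, and that every single step is a round trip between processor and memory. That round trip is the essence of the architecture, and of its bottleneck.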
We have all seen Alan Kay’s famous cardboard model of what a modern computer could look like, from 1968, haven’t we?
As we all did (OK, I’ll stop here, but I am not kidding: you should have read them!), Alan Kay had read the article by Gordon Moore (who was himself inspired by Doug Engelbart as early as 1959) about cramming more components onto an integrated circuit:
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.
Gordon Moore, 1965
It is useless, Alan Kay reasoned, to think about what computers can do now. We should be thinking about what they might be able to do ten years from now, because that is almost unfathomable! And the Dynabook was a concept born from that daydreaming: a machine with a radio inside, able to communicate with similar devices in a world-spanning peer-to-peer network. How about that!
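To get a feel for what Kay was extrapolating: a doubling every year, kept up for ten years, is not “ten times better”, it is a thousandfold. The arithmetic is one line:

```python
# Moore's extrapolation as bare arithmetic: a doubling every year,
# sustained for ten years, compounds to a factor of 2**10.
print(2 ** 10)  # -> 1024, roughly a thousandfold in a decade
```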
I fear that Alan Kay may be one of the very few who realised the consequence of Moore’s Law (as it was named in 1975). And now here we are, more than half a century later, and still we believe that von Neumann’s first proposal is the only way to build computers! And we still believe that the tools to tap all that power should be built around that same architecture (memory-processor = data-functions)!
Change is inevitable. Even though Alan Kay’s vision from 1968 has still not really been realised, it may be coming near now. Developments in hardware, in memory and processors alike, are starting to expose the shortcomings of the “first” von Neumann architecture. Massive parallelism, but above all memory that is so fast that the difference between memory and disk disappears, and that is persistent, is changing the game in a fundamental way. New computer architectures are desperately needed.
Intel’s 3D XPoint promises memory that is more than 1,000 times faster than NAND flash, with more than 10 times the density of DRAM. Memory and storage will no longer be different things; databases become a thing of the past (everything can be done in memory, just think about that!).
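If that sounds far-fetched, here is a sketch of the programming model it points to. I am using plain mmap on an ordinary file as a stand-in for persistent memory (the file name is my own invention); the point is the shape of the code, not the medium:

```python
# A sketch, not a real persistent-memory API: an ordinary file,
# mapped into the address space, stands in for persistent memory.
# The file name is made up. On a system that exposes persistent
# memory as a DAX-mounted filesystem (an assumption here), the same
# pattern gives byte-addressable, persistent data with no database
# in between.

import mmap
import os
import struct

PATH = "counter.bin"            # hypothetical backing "memory"

if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * 8)    # reserve one 64-bit counter

with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), 8) as buf:
        (count,) = struct.unpack_from("<Q", buf, 0)  # read in place
        struct.pack_into("<Q", buf, 0, count + 1)    # update in place
        buf.flush()             # push the store to the backing medium

print("runs so far:", count + 1)  # survives across program runs
```

Run it twice and the count keeps climbing: the data structure simply lives in (mapped) memory and outlives the process, with no serialisation step and no database in sight.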