As we explained earlier in the week, ULTRARAM looks like the messiah of memory technologies. Said to be at least as fast as RAM but with lower power demands, it also matches the non-volatile characteristics of flash, only this stuff lasts a thousand years and was seemingly branded by Douglas Adams’ dearly departed soul. Anyone for some Brockian Ultra-Cricket? Anyway, ULTRARAM, what’s not to like?
Well, the baffling and indeed equally Adams-esque technicalities of the nascent technology, such as triple-barrier resonant tunnelling structures presumably conceived by super-intelligent shades of the colour blue, do make ULTRARAM rather tricky to sense check as a mere lay observer. But more pertinently, there are some pretty obvious questions around real-world practicalities.
After all, Intel and Micron had a pretty similar narrative around the 3D XPoint technology which formed the basis of the Optane line of SSDs and persistent memory DIMMs. But that didn’t go well, did it?
So, if Intel couldn’t make Optane work, what hope does an independent start-up have? To find out, we spoke to someone who ought to have a decent idea. None other than Manus Hayne, physics professor at Lancaster University in the UK. Oh, and he’s also the inventor of ULTRARAM. Handy.
First up, that Intel Optane parallel, what does Hayne make of it?
“There is no doubt that there is a huge challenge ahead,” says Hayne. “But if we did not think that there are prospects for wide adoption we would not be pursuing it as a commercial prospect.
“Intel certainly got into problems with Optane, but it is not an isolated case. There are several emerging memories, with Optane being the most famous and important example in terms of market share. These memories struggle to compete with the performance of DRAM or the low cost of flash. Combining that with the challenge of climbing the capacity mountain means that they occupy <1% of the market, often in niche but stable applications.
“ULTRARAM is intrinsically fast and very efficient, so is technically competitive with DRAM at single bit level, but the scale-up challenge remains.”
In other words, Hayne is both well versed in the broader memory market and realistic about the challenges, which is certainly a promising start. He also recognises how long it takes for technologies to become fully established.
“Due to technical and commercial issues with volume production and the associated costs there won’t be a ‘big bang’ disruptive change. For example, it took (is still taking) a long time for flash, which was invented in the 1980s, to replace hard disk drives,” he says.
On the subject of flash, Hayne is well aware that money talks. “The trend in flash is more and more capacity and decreasing endurance, because lower cost/bit wins,” he says. But there are still use cases where ULTRARAM could replace both RAM and flash, “most likely in applications where little storage is needed, e.g. household appliances and IoT devices.”
But ULTRARAM is arguably a better bet as an alternative to RAM. And there are very good technical reasons why it could succeed. For instance, its use of compound semiconductors, as opposed to silicon, could actually be a manufacturing advantage.
“There is a whole different semiconductor world where compound semiconductors dominate: photonics,” Hayne explains. “Compound semiconductors are readily available, with multiple layers of different compound semiconductors (heterostructures) routinely grown with very high material quality in volume manufacturing processes. Indeed, because much of the complexity of ULTRARAM is implemented in this single technological process, the number of steps required to produce the memories in the semiconductor fab is substantially reduced, lowering the cost.”
Similarly, ULTRARAM is said to be at least as good as, if not better than, conventional RAM when it comes to capacity and density.
“We have an architecture for ULTRARAM which is at least competitive with DRAM, and given the appropriate tools expect it to be scalable below 10nm. So, again, at least competitive with DRAM. Another factor is the peripheral circuitry.
“All memory chips need logic to address the arrays, program/erase/readout functions and I/O circuitry. However, DRAM needs to compensate for refresh and destructive read, and flash needs charge pumps and wear-levelling to compensate for poor endurance. ULTRARAM needs none of these, leaving more of the chip for the actual memory,” Hayne says.
If all that holds true in the final release form of the technology, there will presumably still be some major technical barriers to supporting ULTRARAM in computing platforms. Are the likes of Intel or AMD really likely to add support for ULTRARAM to their CPUs, for instance?
Unsurprisingly, given the early stage of the ULTRARAM project—it’s only now being showcased for the first time—Hayne won’t be drawn on the details. But he can confirm that conversations with industry players are already underway.
All we can say is that the core component set of the PC has been relatively stagnant of late. So we, and our Hoovooloo overlords, absolutely welcome something new and innovative, especially a technology as promising as ULTRARAM.