Researchers at the University of Minnesota Twin Cities have developed a groundbreaking technology that could dramatically reduce the energy consumption of AI processing. The new hardware, called Computational Random-Access Memory (CRAM), could cut energy requirements by a factor of at least 1,000. The researchers published their findings in “npj Unconventional Computing,” a peer-reviewed journal published by Nature.
Traditional AI computations involve shuttling data between processing components and storage units, which can consume up to 200 times the energy used in the actual computation. CRAM addresses this inefficiency by integrating a high-density, reconfigurable spintronic in-memory compute substrate directly within the memory cells. Because data never leaves the memory, the system running the AI application becomes far more energy-efficient.
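The arithmetic behind that claim can be sketched in a few lines. This is an illustration built only from the "up to 200 times" figure cited above; the unit energy value is an arbitrary assumption, not a number from the paper.

```python
# Hedged back-of-the-envelope sketch (not from the paper): if moving data
# costs up to 200x the energy of the compute itself, total energy is
# dominated by transfers, and eliminating them bounds the possible savings.
compute_energy = 1.0        # arbitrary unit per operation (assumption)
transfer_overhead = 200.0   # "up to 200x" figure cited in the article

conventional = compute_energy + transfer_overhead * compute_energy
in_memory = compute_energy  # in CRAM, data never leaves the memory array

savings_factor = conventional / in_memory
print(f"Best-case savings from removing transfers: {savings_factor:.0f}x")
```

This upper bound covers only the data-movement term; the larger gains reported below also reflect the spintronic substrate itself.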
CRAM boosts AI energy efficiency
In a test on an MNIST handwritten-digit classification task, the CRAM technology proved to be 2,500 times more energy-efficient and 1,700 times faster than a near-memory processing system built on 16 nm technology. MNIST is a standard benchmark commonly used to train AI systems to recognize handwriting.
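For readers unfamiliar with the benchmark, a digit classifier of this kind is, in software terms, a small supervised-learning task. The sketch below runs on an ordinary CPU, not CRAM, and uses scikit-learn's bundled 8x8 digits dataset as a stand-in for the full MNIST set (an assumption made for brevity).

```python
# Conventional software sketch of a handwritten-digit classifier,
# illustrating the benchmark task (not the CRAM hardware itself).
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Small bundled digits set, standing in for MNIST.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train a simple linear classifier and report held-out accuracy.
clf = LogisticRegression(max_iter=2000)
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
```

On hardware like CRAM, the energy story of exactly this kind of workload is what the 2,500x figure refers to: the multiply-accumulate operations happen where the weights are stored.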
The importance of such advancements is significant: recent reports suggest AI workloads consume rapidly increasing amounts of electricity, with AI's total power draw reaching 4.3 GW in 2023 and expected to grow substantially in the coming years. Lead author Yang Lv, a postdoctoral researcher in the University of Minnesota's Department of Electrical and Computer Engineering, and the research team have already filed several patent applications related to the new technology.
They plan to collaborate with leaders in the semiconductor industry to provide large-scale demonstrations and bring the hardware to market, aiming to boost AI functionality while reducing energy consumption. This breakthrough could represent a significant step toward more sustainable AI technologies, mitigating the ever-growing energy demands of artificial intelligence processing.
Cameron is a highly regarded contributor in the rapidly evolving fields of artificial intelligence (AI) and machine learning. His articles delve into the theoretical underpinnings of AI, the practical applications of machine learning across industries, ethical considerations of autonomous systems, and the societal impacts of these disruptive technologies.