This article discusses memory, chip, and system design talks at the 2025 AI Infra Summit in Santa Clara, CA by Kove, Pliops and ...
TL;DR: SK hynix CEO Kwak Noh-Jung unveiled the "Full Stack AI Memory Creator" vision at the SK AI Summit 2025, emphasizing collaboration to overcome AI memory challenges. SK hynix aims to lead AI ...
As SK hynix leads and Samsung lags, Micron positions itself as a strong contender in the high-bandwidth memory market for generative AI. Micron Technology (Nasdaq:MU) has started shipping samples of ...
Samsung's new HBM3e memory, codenamed Shinebolt, features 12-Hi 36GB stacks built from 12 x 24Gb memory devices placed on a logic die with a 1024-bit memory interface. Samsung's new 36GB HBM3e ...
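As a quick sanity check on those figures, here is a minimal Python sketch of the capacity arithmetic, using only the numbers stated in the blurb (12 devices of 24Gb each per stack):

    # Capacity check for a 12-Hi HBM3e stack as described above:
    # 12 DRAM devices x 24 Gb each, converted from gigabits to gigabytes.
    dies_per_stack = 12
    gigabits_per_die = 24

    total_gigabits = dies_per_stack * gigabits_per_die   # 288 Gb
    total_gigabytes = total_gigabits / 8                 # 36 GB

    print(f"{dies_per_stack} x {gigabits_per_die} Gb = {total_gigabits} Gb "
          f"= {total_gigabytes:.0f} GB per stack")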
JEDEC’s HBM4 and the emerging SPHBM4 standard boost bandwidth and expand packaging options, helping AI and HPC systems push past the memory and I/O walls.
With doubled I/O interfaces and a refined low-voltage TSV design, HBM4 reshapes how memory stacks sustain throughput under data ...
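To make the "doubled I/O" point concrete, the sketch below estimates peak per-stack bandwidth as interface width times per-pin data rate. The 2048-bit HBM4 width, the ~8 Gb/s per-pin rate, and the HBM3e comparison figures are assumptions based on commonly cited JEDEC targets, not numbers taken from the articles above.

    # Rough peak-bandwidth estimate per stack: width (bits) x per-pin rate (Gb/s) / 8 -> GB/s.
    # Interface widths and per-pin rates are assumed, commonly cited values.
    def peak_bandwidth_gb_s(interface_bits: int, pin_rate_gbps: float) -> float:
        return interface_bits * pin_rate_gbps / 8

    hbm3e = peak_bandwidth_gb_s(1024, 9.6)   # ~1229 GB/s per stack
    hbm4  = peak_bandwidth_gb_s(2048, 8.0)   # ~2048 GB/s (~2 TB/s) per stack

    print(f"HBM3e: ~{hbm3e:.0f} GB/s per stack; HBM4: ~{hbm4:.0f} GB/s per stack")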
GridGain Systems today announced the availability of what is being touted as the industry's first end-to-end in-memory computing infrastructure stack. The GridGain Stack includes high-performance ...