Micron Technology supplies high-bandwidth memory for data centers, which Nvidia has integrated into its GPUs.
The number of AI inference chip startups in the world is gross – literally gross, as in a dozen dozens. But there is only one ...
Google’s artificial-intelligence chip, the tensor processing unit (TPU), is poised to challenge NVIDIA’s graphics processing units (GPUs) and is reshaping the high-bandwidth memory (HBM) market. The HBM ...
TL;DR: NVIDIA has asked Samsung to double GDDR7 production to support its new B40 AI GPU, shifting from expensive HBM to cost-effective GDDR7 memory. Samsung has completed facility expansions, ...
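The HBM-to-GDDR7 shift described above is, at bottom, a bandwidth-versus-cost calculation: a GDDR7 board delivers a fraction of an HBM part's aggregate bandwidth at a much lower bill of materials. The sketch below works through that arithmetic; the pin rates, bus widths, and stack counts are illustrative assumptions, not confirmed B40 specifications.

```python
# Illustrative bandwidth arithmetic only; the pin speeds, bus widths, and
# stack counts below are assumptions for a sketch, not B40 specifications.

def memory_bandwidth_gb_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s for a given per-pin rate and bus width."""
    return pin_rate_gbps * bus_width_bits / 8  # bits per second -> bytes per second

# Hypothetical GDDR7 configuration: 32 Gbps per pin on a 384-bit bus.
gddr7 = memory_bandwidth_gb_s(32, 384)            # ~1536 GB/s

# Hypothetical HBM3E configuration: 9.6 Gbps per pin, 1024-bit per stack, 8 stacks.
hbm3e = memory_bandwidth_gb_s(9.6, 1024) * 8      # ~9830 GB/s

print(f"GDDR7 (assumed 384-bit @ 32 Gbps):    {gddr7:.0f} GB/s")
print(f"HBM3E (assumed 8 stacks @ 9.6 Gbps): {hbm3e:.0f} GB/s")
```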
A total addressable market is a forecast of what will be sold – more precisely, what can be manufactured and sold. It is not ...
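A bottom-up estimate of the kind this definition points to is simple arithmetic: the units that can realistically be manufactured multiplied by an average selling price. Every figure in the sketch below is a hypothetical placeholder, included only to show the calculation.

```python
# Hypothetical bottom-up TAM estimate; every figure here is a placeholder.
units_manufacturable_per_year = 5_000_000   # assumed supply ceiling (units)
average_selling_price_usd = 30_000          # assumed ASP per unit (USD)

tam_usd = units_manufacturable_per_year * average_selling_price_usd
print(f"Illustrative TAM: ${tam_usd / 1e9:.0f}B per year")  # $150B
```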
Nvidia (NVDA) announced separate collaborations with SK Group, Samsung Electronics (OTCPK:SSNLF), and Hyundai Motor Group to help develop AI factories. The company also outlined plans with Samsung to build a ...
Nvidia has launched its latest Rubin CPX GPU, aimed at large-scale AI inference workloads. According to research firm SemiAnalysis, the launch represents a new direction in GPU ...
TL;DR: Samsung has secured a key HBM4 memory supply deal with NVIDIA, delivering up to 11Gbps chips for NVIDIA's 2026 Rubin AI GPUs. Utilizing advanced 10nm-class DRAM and 4nm logic, Samsung's ...
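The quoted 11Gbps pin speed translates into per-stack bandwidth through simple arithmetic. The sketch below assumes HBM4's wider 2048-bit per-stack interface, which the snippet does not state, so treat the result as an estimate rather than a quoted figure.

```python
# Per-stack bandwidth implied by the quoted 11 Gbps pin speed, assuming a
# 2048-bit HBM4 per-stack interface (the interface width is an assumption here).
pin_rate_gbps = 11
interface_width_bits = 2048

stack_bandwidth_gb_s = pin_rate_gbps * interface_width_bits / 8
print(f"~{stack_bandwidth_gb_s:.0f} GB/s per HBM4 stack")  # ~2816 GB/s
```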
The NVIDIA-Groq $20 billion deal announced on December 24, 2025, is a major strategic move in the AI hardware space. NVIDIA ...
Meanwhile, incessant demand for NVIDIA’s innovative Blackwell chips and cloud graphics processing units (GPUs) has boosted ...
Companies with impressive demand visibility, rapid innovation cycles, and robust supply chains can pleasantly surprise ...
Nvidia is working with Japanese company Kioxia to develop solid-state drives (SSDs) that connect directly to GPUs. The goal is to achieve major performance improvements for artificial intelligence ...
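The idea of SSDs that "connect directly to GPUs" is to move data from NVMe flash into GPU memory without staging it through a CPU bounce buffer. A minimal sketch of that access pattern is shown below using NVIDIA's GPUDirect Storage (cuFile) path via the kvikio Python bindings; the file path is hypothetical, and whether the Nvidia/Kioxia work uses this exact software path is an assumption, not something stated in the snippet.

```python
# Sketch of an SSD-to-GPU direct read using NVIDIA's GPUDirect Storage path
# via the kvikio Python bindings. The file name is hypothetical, and whether
# the Nvidia/Kioxia effort uses this exact software path is an assumption.
import cupy
import kvikio

# Allocate the destination buffer in GPU memory.
gpu_buf = cupy.empty(1 << 20, dtype=cupy.uint8)  # 1 MiB

# Read from the NVMe device into the GPU buffer, bypassing a CPU bounce
# buffer when the GPUDirect Storage driver stack is available.
with kvikio.CuFile("/mnt/nvme/model_shard.bin", "r") as f:
    nbytes = f.read(gpu_buf)

print(f"Read {nbytes} bytes straight into GPU memory")
```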