Over the past several years, the lion’s share of artificial intelligence (AI) investment has poured into training infrastructure—massive clusters designed to crunch through oceans of data, where speed ...
If the hyperscalers are masters of anything, it is driving scale up and costs down so that a new type of information technology becomes cheap enough to be widely deployed. The ...
The investment strengthens Groq's role in the American AI Stack, delivering fast, affordable compute worldwide. "Inference is defining this era of AI, and we're building the American infrastructure ...
Groq, a developer of hardware purpose-built for artificial intelligence inference, has raised $750M in its latest funding round, reaching a valuation of $6.9B. Groq builds custom hardware known as ...
It is beginning to look as though the period spanning the second half of 2026 through the first half of 2027 will be a local maximum in spending on XPU-accelerated systems for AI workloads ...
A new technical paper titled “Analog optical computer for AI inference and combinatorial optimization” was published by researchers at Microsoft Research, Barclays and the University of Cambridge.
Nvidia has brought out the Rubin CPX GPU, a specialised accelerator built for massive-context AI models. The chip delivers 30 petaFLOPS of NVFP4 compute performance on a monolithic die ...
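NVFP4 refers to Nvidia's 4-bit floating-point format for low-precision inference. As a rough illustration of how block-scaled 4-bit weight quantization works in general, here is a minimal NumPy sketch; the 16-element block size, the E2M1 grid values, and the function names are assumptions for illustration, not Nvidia's actual implementation.

```python
import numpy as np

# Non-negative magnitudes representable by an E2M1-style 4-bit float grid.
# These values are an assumption used for illustration only.
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0], dtype=np.float32)

def quantize_fp4_blockwise(weights: np.ndarray, block_size: int = 16):
    """Quantize a 1-D weight vector to 4-bit grid values with one scale per block."""
    w = weights.reshape(-1, block_size)
    # One scale per block maps the block's largest magnitude onto the top grid value.
    scales = np.abs(w).max(axis=1, keepdims=True) / FP4_GRID[-1]
    scales[scales == 0] = 1.0  # avoid dividing an all-zero block by zero
    scaled = w / scales
    # Snap each magnitude to the nearest representable grid point, keep the sign.
    idx = np.abs(np.abs(scaled)[..., None] - FP4_GRID).argmin(axis=-1)
    q = np.sign(scaled) * FP4_GRID[idx]
    return q.astype(np.float32), scales.astype(np.float32)

def dequantize_fp4_blockwise(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    """Undo the per-block scaling to recover approximate weights."""
    return (q * scales).reshape(-1)

# Quick check on random weights: reconstruction error should be small but nonzero.
w = np.random.randn(1024).astype(np.float32)
q, scales = quantize_fp4_blockwise(w)
w_hat = dequantize_fp4_blockwise(q, scales)
print("mean abs error:", float(np.abs(w - w_hat).mean()))
```

The point of the per-block scale is that a handful of 4-bit codes can still cover a wide dynamic range, which is what makes such low-precision formats attractive for inference-time weight storage and compute.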
Intel made some noise in the AI benchmark arena with the latest MLPerf Inference v5.1 results. Published by MLCommons, these tests compared a wide range of AI inference hardware, and Intel ...
Arm has launched Lumex – the first of its new Compute Sub-Systems (CSS) system-level platforms targeted at the mobile market. The premise of Lumex is that AI is the primary workload for mobile ...