XDA Developers on MSN
Nvidia's new VRAM compression trick just gave it a reason to keep selling 8GB GPUs
It works like magic, but won't renew your old 8GB card's lease on life ...
One concern following NVIDIA's introduction of its ultra-efficient Neural Texture Compression (NTC) was that it would remain exclusive to NVIDIA's GPUs. Intel has since introduced its own ...
Forward-looking: Intel is pitching a new way to pack game textures that leans heavily on neural networks but still nods to traditional block compression. The company's Texture Set Neural Compression, ...
TL;DR: Intel's Texture Set Neural Compression uses AI to drastically reduce texture memory and storage needs by up to 18 times with minimal visual quality loss. Available later this year as an SDK, it ...
Intel and Nvidia showed off their respective AI-powered texture-compression technologies over the weekend, demonstrating impressive reductions in VRAM use while maintaining texture quality, or even ...
NVIDIA has shown off its Neural Texture Compression (NTC) technology before, but it is still a work in progress. The technology and its efficiency benefits have since been more clearly detailed and ...
Morning Overview on MSN
Nvidia demos neural texture compression, claiming 85% less VRAM use
Nvidia researchers have proposed a neural compression method for material textures that, according to results reported in their preprint, can significantly reduce the texture memory footprint during ...
During GDC 2026, Nvidia presented new progress in its Neural Texture Compression (NTC) technology, showing that the future of gaming depends not only on raw power but also on better optimization, ...
A team of researchers led by California Institute of Technology computer scientist and mathematician Babak Hassibi says it has created a large language model that radically compresses its size without ...
Sony’s as yet-unannounced PS6 could feature a 1TB SSD with no disc drive, according to hardware insider Kepler_L2. Speaking in a post on NeoGAF, the insider was following up on a previous claim that ...
We have seen the future of AI via Large Language Models. And it's smaller than you think. That much was clear in 2025, when we first saw China's DeepSeek — a slimmer, lighter LLM that required way ...
Even if you don’t know much about the inner workings of generative AI models, you probably know they need a lot of memory. Hence, it is currently almost impossible to buy a measly stick of RAM without ...