techspot - Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to Redmond, Maia 200 can deliver "dramatic" improvements for AI applications and is already deployed in select US data centers on the Azur…
Tuesday, January 27, 2026, 2:21 pm / 4 stories in 5 wks
MatX nets $500M to build AI chips aimed at challenging Nvidia / 10 days
NVIDIA Rubin AI Platform Enters Full Production at CES 2026 / 2 months
Axelera AI raises over $250M to build inference chips aimed at Nvidia competition / 10 days
Meta expands NVIDIA partnership to deploy millions of GPUs and standalone CPUs / 17 days
C2i raises $15M Series A to tackle AI data center power bottlenecks / 18 days
Microsoft Unveils New Maia 200 AI Chip for Enhanced Cloud Services / 5 wks
Nvidia RTX GPUs Modified with Blower-Style Coolers for AI Data Centers in China / 7 wks
NorthFeed Inc.