Microsoft Unveils Maia 200: Custom AI Inference Chip to Cut Costs
Microsoft introduces Maia 200 AI chip with 140B transistors, delivering 10 petaFLOPS to reduce AI inference costs at cloud scale.
TSMC announces 3-nanometer AI semiconductor production in Japan and projects $52-56B in capital expenditure for 2026, following CEO C.C. Wei's meeting with Japanese PM Sanae Takaichi.
Taiwan Semiconductor Manufacturing Company (TSMC) announces plans to produce cutting-edge 3-nanometer semiconductors at its second factory in Kumamoto, Japan, marking the first domestic production of such advanced chips in the country. The decision, revealed during a meeting between TSMC CEO C.C. Wei and Japanese Prime Minister Sanae Takaichi, represents a major win for Japan's semiconductor ambitions and will supply chips for AI applications, robotics, and autonomous driving.
AMD's Q4 2025 results show $390M from China-specific MI308 AI chips as total revenue reaches $10.3B, up 34% year-over-year amid export uncertainties.
Intel CEO Lip-Bu Tan announced the company has hired a new chief architect to lead GPU development, positioning Intel to compete in the AI accelerator market dominated by Nvidia and AMD. Tan also warned that memory chip shortages will persist until 2028, creating challenges for AI infrastructure expansion.
Broadcom is emerging as a major competitor to Nvidia by providing custom AI accelerator chips to tech giants like Google, Meta, and ByteDance, signaling a shift in the AI hardware market.
Amazon's Trainium and Google's TPUs are gaining traction, generating billions in revenue and providing a viable alternative to Nvidia's chips for major AI players like Anthropic.
China has granted conditional approval for its top AI startup DeepSeek to purchase Nvidia's H200 artificial intelligence chips, with regulatory conditions still being finalized by the NDRC. DeepSeek is expected to launch its next-generation V4 model with strong coding capabilities in mid-February 2026.
China has approved its first batch of Nvidia's H200 AI chips for import, covering several hundred thousand units valued at approximately $10 billion.
Microsoft unveiled Maia 200, a custom AI chip built on TSMC's 3nm process, designed to reduce reliance on Nvidia and compete with Google's TPUs and Amazon's Trainium processors for large-scale AI workloads.
Samsung cleared final HBM4 qualification tests, achieving an industry-leading 11.7 Gb/s data rate. Mass production begins in February for Nvidia's Rubin AI accelerator, which launches at GTC 2026.
Anthropic CEO Dario Amodei criticizes the US administration's decision to allow Nvidia to sell advanced H200 AI chips to China, warning of critical national security implications and comparing the move to nuclear weapons proliferation.
While Nvidia dominates the AI chip market, Wall Street analyst Beth Kindig has named Micron Technology her top AI chip stock for 2026. The company's critical high-bandwidth memory (HBM) is becoming essential for handling the massive data demands of AI workloads.
Apple has officially partnered with Google, integrating the Gemini AI model to power a revamped Siri. This move comes as the US government imposes new 25% tariffs on AI chips sold to China, impacting major players like Nvidia and AMD.
Elon Musk announced that Tesla is aiming for a nine-month design cycle for its AI processors, a pace that would surpass the yearly cadence of industry leaders Nvidia and AMD.
In a significant escalation of the tech trade war, Chinese authorities have reportedly blocked imports of Nvidia's H200 AI chips, a move that comes despite the US government having cleared them for export.
OpenAI has announced a landmark $10 billion agreement with chipmaker Cerebras to deploy 750 megawatts of AI computing power by 2028, significantly expanding its hardware infrastructure and reducing its reliance on Nvidia.