Nvidia CEO Jensen Huang announced that the company expects at least $1 trillion in chip orders through 2027, driven by restarting sales in China, new AI inference processors, and a major push into robotics and autonomous vehicles. Speaking at Nvidia's GTC conference in San Jose, Huang revealed that the company has received purchase orders from Chinese customers and is ramping up manufacturing to fulfill them, a significant shift after U.S. export restrictions limited the company's access to that market.

Why Is China's Market So Critical to Nvidia's Growth?

China represents one of the world's largest markets for artificial intelligence infrastructure, and Nvidia's ability to sell there directly impacts its revenue projections. Huang stated: "We have received purchase orders from many customers and we're in the process of restarting our manufacturing. Our supply chain is getting fired up." The restart comes after U.S. restrictions on Nvidia's H200 processor prevented the company from selling to Chinese customers for an extended period. The reopening of this market could add billions to Nvidia's revenue stream, particularly as Chinese companies race to build their own AI data centers and compete globally in artificial intelligence applications.

However, Nvidia's reliance on Asia for manufacturing introduces geopolitical risk. The company expects to become the largest customer of Taiwan Semiconductor Manufacturing Company (TSMC) later this year, making it vulnerable to any disruption in the Taiwan Strait. Huang acknowledged the dependency: "My only hope is that we can all work together, stay in peace, look at the big picture, and stay calm. I am 100% certain that the world will depend on Taiwan for a very long time."

How Could Nvidia's New Chips Push Revenue Beyond $1 Trillion?

Huang suggested that the $1 trillion figure could grow significantly with the addition of new inference processors.
The company unveiled the Groq 3 language processing unit (LPU), a specialized chip designed to run AI models efficiently after they have been trained. Made by Samsung Electronics' foundry division in South Korea, the Groq 3 is paired with Nvidia's Vera Rubin NVL72 rack to maximize efficiency across power, memory, and computing resources. Huang indicated that Groq could represent 25% of Nvidia's processing and storage solutions, potentially adding $250 billion to the company's revenue forecast. "Theoretically, that $1 trillion could become $1.25 trillion," Huang told media at the conference.

This expansion reflects a broader industry shift from training large AI models to running them efficiently in production, a process called inference. Companies are increasingly focused on deploying AI models at scale rather than building new ones, and Nvidia is positioning itself to dominate both segments of the market.

What Role Will Robotics and Autonomous Vehicles Play?

While Nvidia's automotive segment contributed only 1% of total revenue in the past year, Huang expressed confidence that the business will eventually become a major revenue driver. The company announced an Open Physical AI Data Factory Blueprint designed to accelerate robotics, vision, and autonomous vehicle deployment by providing training data and simulation tools.

Huang noted that "most trillion-dollar businesses started at zero at one point," citing Nvidia's CUDA computing platform as an example. CUDA initially generated no revenue while costing the company heavily, but it eventually became foundational to Nvidia's dominance in AI computing.

Nvidia's autonomous vehicle strategy involves three separate computer systems that customers can purchase individually or as a complete package. This modular approach allows the company to capture revenue from different segments of the autonomous vehicle market, from perception systems to decision-making processors.
Steps to Understanding Nvidia's Growth Strategy

- China Market Restart: Nvidia is reactivating sales channels in China after U.S. export restrictions, with purchase orders already received and manufacturing ramping up to meet demand from Chinese AI infrastructure builders.
- Inference Chip Expansion: The Groq 3 language processing unit represents a new product category that could add 25% to Nvidia's revenue by handling the computationally intensive task of running trained AI models in production environments.
- Long-Term Robotics Bet: Although autonomous vehicles and robotics currently represent a tiny fraction of revenue, Huang believes these markets will eventually rival or exceed Nvidia's current data center business as AI systems become embedded in physical machines.
- Supply Chain Concentration Risk: Nvidia's dependence on Taiwan Semiconductor Manufacturing Company for chip production creates vulnerability to geopolitical disruption, a concern Huang acknowledged while expressing confidence it could be managed through international cooperation.

What Energy Challenge Does Nvidia's Growth Create?

The massive expansion of AI infrastructure that Huang is projecting comes with a significant energy problem. Data centers currently consume approximately 2% to 3% of global electricity, and projections suggest this share could double or even triple by 2030 as AI deployment scales globally. Nvidia's own growth depends on solving this energy challenge, which is why the company featured renewable energy technology in its GTC keynote. Eco Wave Power's onshore wave energy technology was highlighted as an example of how AI-driven digital twin modeling can improve the design and optimization of renewable energy systems. The connection between AI infrastructure and renewable energy is becoming increasingly critical.
Artificial intelligence data centers require significant, continuous power to operate graphics processing units (GPUs), cooling systems, and other computing infrastructure. As AI deployment accelerates, energy providers and technology companies are exploring diversified renewable energy solutions capable of supporting large-scale computing infrastructure while reducing carbon emissions. Wave energy offers particular advantages for coastal regions, where many data centers, ports, and population centers are located, making it relevant to Nvidia's growth plans.

Are There Concerns About Nvidia's AI Graphics Technology?

Not all of Nvidia's recent announcements have been well received. The company unveiled DLSS 5, a generative AI graphics technology designed to enhance visual fidelity in video games by using AI to "infuse" onscreen pixels with photoreal lighting and materials. However, the public reaction has been largely negative, with critics comparing the effect to "AI slop" and arguing that it homogenizes character appearances across different games.

When asked about the criticism, Huang defended the technology: "First of all, they're completely wrong. The reason for that is because, as I have explained very carefully, DLSS 5 fuses controllability of the geometry and textures and everything about the game with generative AI." Huang emphasized that the technology operates at the geometry level rather than as post-processing, and that it preserves artistic control. However, his technical explanation has not addressed the core concern raised by critics: that the sample clips look homogenous regardless of the underlying technical approach.

Prominent game developers including Bethesda and Capcom have expressed enthusiasm for the technology, but the gap between developer support and public perception suggests Nvidia may need to demonstrate more convincing results before the technology gains widespread acceptance.
Huang's broader vision for Nvidia remains ambitious. The company is positioning itself not just as a chip manufacturer but as the central infrastructure provider for the AI era, from data centers to robotics to renewable energy optimization. Whether the company can execute on all these fronts while managing geopolitical risks and energy constraints will determine whether the $1 trillion revenue projection becomes reality.