DeepSeek's Explosive Growth: How a Chinese AI Startup Went From 7,475 Daily Users to 22 Million in 5 Months

DeepSeek has transformed from a relatively unknown AI research division into a global powerhouse, reaching 22.15 million daily active users by January 2025, up from just 7,475 in August 2024. The Chinese startup's rapid ascent represents one of the most dramatic shifts in the artificial intelligence landscape, driven by the January 2025 launch of DeepSeek-R1, a reasoning-focused model that outperforms OpenAI's o1 on several coding benchmarks while costing a fraction as much to develop.

What Triggered DeepSeek's Meteoric Rise?

The turning point came in January 2025 when DeepSeek released its R1 model and mobile app. Website traffic surged 312 percent from December to January, with daily visitors jumping from roughly 17,825 to 73,493 in a single month. By February 2025, that number had climbed to 97,193 daily visitors. The app itself was downloaded more than 75 million times, with over 3 million downloads occurring during its launch week alone.
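The 312 percent figure follows directly from the visitor counts cited above. A quick sketch of the month-over-month growth arithmetic, using only numbers from this article:

```python
# Back-of-the-envelope check of the December-to-January traffic surge,
# using the daily-visitor figures cited in the article.
dec_visitors = 17_825
jan_visitors = 73_493

# Standard month-over-month growth: (new - old) / old, as a percentage.
growth_pct = (jan_visitors - dec_visitors) / dec_visitors * 100
print(f"December -> January growth: {growth_pct:.0f}%")  # prints ~312%
```

The same formula applied to the August-to-January daily-active-user jump (7,475 to 22.15 million) yields a growth rate of well over 290,000 percent, which is why the launch is described as an inflection point rather than a trend.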

What makes this growth particularly striking is the geographic distribution. While DeepSeek dominates in China, India, and Indonesia, which together account for 51.24 percent of its monthly active users, the platform has also captured significant traction outside its home market. Nearly half of all users are located outside China, and the United States alone represents 4.34 percent of the global monthly active user base.

How Does DeepSeek's Cost Advantage Change the AI Game?

The economics of DeepSeek's models represent a fundamental challenge to the prevailing assumption that cutting-edge AI requires massive capital expenditure. DeepSeek-V3, the company's flagship model with 671 billion parameters, cost approximately $5.5 million to develop, roughly one-eighteenth the cost of OpenAI's GPT-4. The company trained its models using just 2,000 Nvidia H800 graphics processing units (GPUs), specialized chips designed for artificial intelligence workloads, and achieved daily inference costs estimated at under $100,000.

This efficiency stems from optimized training and inference methods that reduce computational requirements by nearly 30 percent in some workloads compared to competing approaches. The models were trained on 14.8 trillion tokens, a measure of the amount of text data used to teach the AI system, demonstrating large-scale data processing capacity without proportional cost increases.

  • Training Investment: DeepSeek-R1 required only $5.6 million in training costs, compared to hundreds of millions for competing frontier models from Western AI labs
  • Hardware Efficiency: The company achieved top-tier performance using 2,000 Nvidia H800 chips, a fraction of the compute clusters deployed by OpenAI and Google
  • Inference Affordability: Daily operational costs remain under $100,000, making the service economically viable at massive scale
  • Open-Source Strategy: DeepSeek released its R1 model for free, removing financial barriers to adoption and accelerating developer experimentation
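Combining two of the figures above, the $5.5 million training budget and the 14.8 trillion training tokens, gives a rough amortized training cost per token. This is a sketch only: it ignores research, salaries, and infrastructure, so it is a lower bound rather than full accounting.

```python
# Rough amortized training cost per token for DeepSeek-V3, using the
# figures cited in this article ($5.5M training cost, 14.8T tokens).
# This is a lower-bound sketch: it excludes research, staff, and
# data-center costs.
training_cost_usd = 5.5e6
training_tokens = 14.8e12

cost_per_million_tokens = training_cost_usd / training_tokens * 1e6
print(f"~${cost_per_million_tokens:.2f} per million training tokens")  # ~$0.37
```

At roughly $0.37 per million training tokens, the calculation illustrates why the bullet points above frame DeepSeek's hardware efficiency, rather than raw compute scale, as the core of its cost advantage.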

How Are DeepSeek's Models Performing Against Global Competitors?

Performance benchmarks reveal that DeepSeek's efficiency gains have not come at the expense of capability. DeepSeek-R1 outperformed OpenAI's o1-1217 model on two of five coding benchmarks, achieving 65.9 percent on LiveCodeBench compared to OpenAI's 63.4 percent, and 49.2 percent on SWE-bench Verified versus 48.9 percent. The model ranks in the top 10 globally across mathematics, coding, and reasoning benchmarks.

DeepSeek's specialized models also demonstrate competitive strength. DeepSeek-Coder-V2, the company's dedicated coding model, achieved a HumanEval score of 85.6 percent, positioning it as the top open-source code model available. DeepSeek-VL, a vision and language model, scored 87.2 percent on VQAv2, a visual question-answering benchmark, with optical character recognition accuracy of 92.1 percent.

Who Is Using DeepSeek, and How?

The user base reveals patterns typical of early-stage technology adoption. The platform skews male, with 66.12 percent male users and 33.88 percent female users. The 18 to 24 age bracket represents the largest cohort at 40 percent of all users, followed by the 25 to 34 age group at 33.51 percent, indicating strong adoption among younger professionals and students. Users aged 35 to 44 make up 23.65 percent, while older demographics show declining but meaningful engagement.

Usage patterns indicate deep engagement. The average user opens DeepSeek 8 to 9 times per month, with an average session duration of six minutes. Mobile usage accounts for nearly 80 percent of all activity, reflecting the platform's accessibility through smartphones. Monthly user retention exceeds 40 percent, a strong indicator of sustained value delivery.

What Does DeepSeek's Timeline Reveal About Its Strategy?

DeepSeek's evolution from research division to consumer platform unfolded deliberately. The company was founded in May 2023 as the research division of High-Flyer, originally operating under the name Fire-Flyer. In November 2023, DeepSeek announced its Coder model, signaling entry into the open-source coding language model space. Early 2024 brought the launch of DeepSeek LLM with 67 billion parameters, explicitly positioned as a low-cost alternative to challenge larger Chinese AI firms.

May 2024 saw the release of DeepSeek-V2, which gained recognition for balancing quality with low training costs. Late 2024 introduced DeepSeek-Coder-V2 with 236 billion parameters and a context window of 128,000 tokens, enabling the model to process roughly 100,000 words at once. The January 2025 launch of both DeepSeek-V3 and DeepSeek-R1 marked the inflection point that transformed the company from a regional player into a global phenomenon.
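The "128,000 tokens is roughly 100,000 words" conversion above can be sanity-checked with a common rule of thumb: English prose averages around 0.75 words per token. The ratio is a general heuristic for tokenized English text, not a DeepSeek-published figure.

```python
# Sanity check on the "128,000 tokens ~ 100,000 words" claim.
# The 0.75 words-per-token ratio is a common rule of thumb for
# English prose (an assumption, not a DeepSeek-published figure).
context_tokens = 128_000
words_per_token = 0.75  # rough heuristic; varies by language and text

approx_words = context_tokens * words_per_token
print(f"~{approx_words:,.0f} words")  # prints ~96,000 words
```

At 0.75 words per token the window holds about 96,000 words, close enough to support the article's "roughly 100,000 words" framing.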

DeepSeek's trajectory challenges the assumption that AI leadership requires the largest capital pools. By prioritizing efficiency, embracing open-source distribution, and targeting underserved markets, the company has demonstrated that alternative paths to frontier AI capability exist. As the platform continues scaling, its impact on the competitive dynamics of the global AI industry will likely intensify.