Nvidia Defends Open Source AI LLM Turf with Nemotron Nano
By Reuters | 15 Dec, 2025
Nvidia releases its third-generation open-source large language model as adoption of open-source models from Chinese AI developers like DeepSeek, Moonshot AI and Alibaba's Qwen surges.
Nvidia on Monday unveiled a new family of open-source artificial intelligence models that it says will be faster, cheaper and smarter than its previous offerings, as open-source offerings from Chinese AI labs proliferate.
Nvidia is primarily known for providing the chips that firms such as OpenAI use to train the closed-source models they charge money for. But it also offers a slew of its own models for everything from physics simulations to self-driving vehicles as open-source software that can be used by researchers or by other companies, with firms such as Palantir Technologies weaving Nvidia's models into their products.
Nvidia on Monday revealed the third generation of its "Nemotron" large-language models aimed at writing, coding and other tasks. The smallest of the models, called Nemotron 3 Nano, was being released Monday, with two other, larger versions coming in the first half of 2026.
Nvidia, which has become the world's most valuable listed company, said that Nemotron 3 Nano was more efficient than its predecessor - meaning it would be cheaper to run - and would do better at long tasks with multiple steps.
Nvidia is releasing the models as open-source offerings from Chinese tech firms such as DeepSeek, Moonshot AI and Alibaba Group Holding become widely used in the tech industry, with companies such as Airbnb disclosing use of Alibaba's Qwen open-source model.
At the same time, CNBC and Bloomberg have reported that Meta Platforms is considering shifting toward closed-source models, leaving Nvidia as one of the most prominent U.S. providers of open-source offerings.
Many U.S. states and government entities have banned use of Chinese models over security concerns.
Kari Briski, vice president of generative AI software for enterprise at Nvidia, said the company aimed to provide a "model that people can depend on", and was also openly releasing its training data and other tools so that government and business users could test it for security and customize it.
"This is why we're treating it like a library," Briski told Reuters in an interview. "This is why we're committed to it from a software engineering perspective."
(Reporting by Stephen Nellis in San Francisco; editing by Mark Heinrich)