Ant Group Unveils Ling AI Model Family and Launches Trillion-Parameter Language Model Ling-1T

3:32 AM on Thursday, October 9
The Associated Press
HANGZHOU, China--(BUSINESS WIRE)--Oct 9, 2025--
Ant Group today announced the release and open-sourcing of Ling-1T, a trillion-parameter general-purpose large language model. This launch expands Ant Group’s Ling (also known as BaiLing) model family, which now comprises three main series: the Ling non-thinking models, the Ring thinking models, and the multimodal Ming series.
This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20251009240721/en/
As the flagship non-thinking model in the Ling family, Ling-1T achieves state-of-the-art (SOTA) performance on multiple complex reasoning benchmarks within constrained output-token limits, striking a strong balance between efficient inference and precise reasoning. It also delivers improved results across diverse use cases, including code generation, software development, competition-level mathematics problem solving, and logical reasoning.
For example, on the 2025 American Invitational Mathematics Examination (AIME) benchmark, Ling-1T achieves 70.42% accuracy while consuming an average of over 4,000 output tokens per problem, performing on par with best-in-class AI models.
This follows Ant Group’s September release of Ring-1T-preview, the world’s first open-source trillion-parameter thinking model.
He Zhengyu, Chief Technology Officer of Ant Group, stated: "At Ant Group, we believe Artificial General Intelligence (AGI) should be a public good—a shared milestone for humanity’s intelligent future. We are dedicated to building practical, inclusive AGI services that benefit everyone, which requires constantly pushing technology forward. The open-source release of Ling-1T and Ring-1T-preview represents a key step in fulfilling our commitment to open and collaborative advancement."
The Ling AI model family now comprises:
- Ling series: Mixture-of-Experts (MoE) non-thinking large language models
- Ring series: Thinking models derived from Ling
- Ming series: Multimodal models processing images, text, audio, and video
- Experimental model: LLaDA-MoE
Together, these models offer diverse sizes and technical capabilities tailored to various application scenarios.
About Ant Group
Ant Group aims to build the infrastructure and platforms to support the digital transformation of the service industry. Through continuous innovation, we strive to provide all consumers and small and micro businesses equal access to digital financial and other daily life services that are convenient, sustainable and inclusive.
For more information, please visit our website at www.antgroup.com or follow us on Twitter @AntGroup.
View source version on businesswire.com: https://www.businesswire.com/news/home/20251009240721/en/
CONTACT: Ant Group
Vick Li Wei
KEYWORD: CHINA ASIA PACIFIC
INDUSTRY KEYWORD: DATA MANAGEMENT TECHNOLOGY OTHER SCIENCE SOFTWARE ARTIFICIAL INTELLIGENCE SCIENCE
SOURCE: Ant Group
Copyright Business Wire 2025.