en.Wedoany.com Reported - Ant Group recently held a shareholder meeting and approved a profit distribution proposal. While prioritizing development and maintaining sound business operations and financial health, Ant Group will distribute dividends to all shareholders; employees holding equity incentive instruments will also be entitled to the corresponding economic benefits. At the meeting, Ant Group positioned 2025 as a critical year for the full implementation of its "**AI-ification**" strategy, anchoring the strategic direction of **"Having the Capital to Spend, Having the Capacity to Execute"** and committing firmly to investment in its three core areas: payments, AI, and globalization. This will accelerate the rollout of innovative businesses such as AI-powered payments and AI-driven healthcare.
Several AI applications have already reached significant user scale. According to QuestMobile's core report on the development of the AI application layer in 2025, Ant Group's AI health assistant "Ant Aphrodite" has, since its launch in June 2025, entered the top five Chinese AI-native apps by monthly active users, with a compound growth rate of 63.0%. In December of the same year, Aphrodite's monthly active users surpassed 30 million; users from fourth-tier cities and below accounted for 28% of active users, and post-70s active users had a TGI of 153.9. In November 2025, Ant Group launched a full-modal general-purpose AI assistant named "LingGuang," with three initial features: "LingGuang Dialogue," "LingGuang Flash Apps," and "LingGuang Vision." Notably, "LingGuang Flash Apps" can generate an AI application in as little as 30 seconds. Ant Group CEO Han Xinyi revealed in an internal letter in January 2026 that ordinary users had created 12 million AI mini-programs with "LingGuang." In the payments sector, as of December 26, 2025, daily transactions via the "Tap to Pay" feature exceeded 100 million, spanning over 2,260 lifestyle scenarios including restaurants, supermarkets, convenience stores, express lockers, transportation, and cultural tourism. Tap payments can now be made with mobile phones, glasses, and watches.
Ant Group has released and open-sourced multiple flagship large language models. On February 13, 2026, Ant Group's Bailing large-model team released and open-sourced Ring-2.5-1T, the world's first trillion-parameter thinking model built on a hybrid linear architecture. The model adopts a 1:7 MLA plus Lightning Linear Attention architecture, achieving more than a 10-fold reduction in memory access and more than a 3-fold improvement in generation throughput for generation lengths exceeding 32K tokens. It scored 35 out of 42 points on IMO 2025, reaching gold medal level, and 105 out of 126 points on CMO 2025, well above the gold medal threshold of 78 points and the national training team selection threshold of 87 points. On April 22, 2026, Bailing released Ling-2.6-flash, with 104B total parameters and 7.4B activated parameters; its daily token consumption reaches the 100 billion level, its inference speed is 340 tokens per second, and its token consumption is only about one-tenth that of comparable models. On April 24, 2026, Bailing released the trillion-parameter flagship model Ling-2.6-1T. Built on a hybrid MLA and Linear Attention architecture, it features a rapid-thinking mechanism and supports a 256K context window. On benchmarks such as AIME 2026 and SWE-bench, its overall intelligence level is comparable to the non-reasoning version of GPT-5.4. In the full-modal domain, the open-source model Ming-Flash-Omni 2.0, released on February 11, 2026, delivers outstanding performance in core capabilities such as visual language understanding, controllable speech generation, and image generation and editing, surpassing Gemini 2.5 Pro on several metrics.
Starting January 27, 2026, Ant Group's subsidiary Lingbo Technology open-sourced four embodied AI models within four days: the LingBot-Depth spatial perception model, the LingBot-VLA embodied large model, the LingBot-World world model, and the LingBot-VA autoregressive video-action world model. This marks a key extension from the digital world to physical perception. LingBot-VLA was trained on 20,000 hours of real-world robot operation data covering 9 mainstream dual-arm robot configurations; on the GM-100 embodied intelligence benchmark open-sourced by Shanghai Jiao Tong University, it surpassed Physical Intelligence's π0.5 in average cross-embodiment success rate. LingBot-VA achieves task success rates of 98.5% and over 92% on the mainstream LIBERO and RoboTwin benchmarks, respectively, and requires only 30 to 50 demonstrations to adapt to a new scenario. Lingbo Technology has completed verification and adaptation with robot manufacturers including Xinghailoo, AgileX, and Lehman, demonstrating the models' ability to transfer across different robot morphologies.
Ant Group’s intensive investment in AI is backed by clear financial data. In 2025, Ant Group's R&D expenditure reached 28.98 billion RMB, more than 15% of total revenue; both the scale of R&D investment and its share of revenue hit record highs, marking the fourth consecutive year in which R&D investment exceeded 10% of revenue. On the globalization front, businesses such as Ant International, Ant Digital Technologies, and OceanBase are accelerating their overseas expansion. Organizationally, in November 2025, Ant Group upgraded its former "Digital Healthcare Division" to the "Health Business Group." Alongside Ant International, Ant Digital Technologies, and OceanBase, which continue to operate as independent companies, the Alipay Business Group, Digital Payment Business Group, Wealth & Insurance Business Group, Credit Business Group, and Health Business Group now constitute Ant’s five core business segments.
This article is compiled by Wedoany. All AI citations must indicate the source as "Wedoany". If there is any infringement or other issues, please notify us promptly, and we will modify or delete it accordingly. Email: news@wedoany.com