
On December 12, Ant Technology Research Institute officially launched the LLaDA2.0 series of discrete diffusion large language models and simultaneously released the accompanying technical report. The open-sourced LLaDA2.0 comprises 16B and 100B versions built on an MoE architecture, marking the first time Ant has expanded a diffusion model's parameter scale to 100B.

Zhitongcaijing·12/12/2025 07:25:05