Alibaba's Tongyi Lab has open-sourced the Qwen3.6-35B-A3B model, according to Chinese media reports. Built on a Mixture of Experts (MoE) architecture, the model has 35 billion total parameters but activates only 3 billion per inference. Its agentic programming capabilities are said to greatly surpass those of its predecessors and to rival dense models such as Qwen3.5-27B and Gemma4-31B.
AASTOCKS Financial News