RedNote Releases First Open-Source Foundation Model 'dots.llm1'
China's social platform RedNote (Xiaohongshu) has released its first open-source foundation model, "dots.llm1", a 142-billion-parameter mixture-of-experts (MoE) model that activates only 14 billion parameters during inference, maintaining high performance while sharply reducing training and inference costs.
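The key idea behind that parameter gap is sparse routing: an MoE layer holds many expert sub-networks but sends each token to only a few of them, so only a fraction of the model's weights are touched per token. The sketch below illustrates top-k gating in general terms; the expert count, top-k value, and logits are illustrative assumptions, not dots.llm1's actual architecture.

```python
import math
import random

random.seed(0)

NUM_EXPERTS = 8  # hypothetical expert count, not dots.llm1's real figure
TOP_K = 2        # hypothetical: each token is routed to only 2 experts

def softmax(xs):
    """Numerically stable softmax over a list of gate logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(token_logits, top_k=TOP_K):
    """Pick the top-k experts for one token and renormalise their gate weights.

    Returns a list of (expert_index, weight) pairs; experts outside the
    top-k contribute nothing, which is why only a fraction of the total
    parameters is active for any given token.
    """
    probs = softmax(token_logits)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:top_k]
    total = sum(probs[i] for i in chosen)
    return [(i, probs[i] / total) for i in chosen]

# One token's (random) gate logits: only TOP_K of NUM_EXPERTS experts fire.
logits = [random.gauss(0, 1) for _ in range(NUM_EXPERTS)]
print(route(logits))
```

In a real model the selected experts' outputs are combined using these renormalised weights; the ratio of active to total expert parameters (here 2 of 8) is what lets a 142B-parameter model run with roughly 14B parameters active per token.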

The most notable feature of "dots.llm1" is its use of 11.2 trillion tokens of non-synthetic, high-quality training data, a practice that is rare among currently available open-source foundation models.

In Chinese-language benchmarks, "dots.llm1" performed exceptionally well, achieving an average score of 91.3 and surpassing DeepSeek's open-source V2 and V3 as well as Alibaba's open-source Qwen2.5 32B and 72B.

AASTOCKS Financial News
Website: www.aastocks.com

Copyright (C) AASTOCKS.com Limited 2000. All rights reserved.
Disclaimer: AASTOCKS.com Ltd, HKEx Information Services Limited, its holding companies and/or any subsidiaries of such holding companies endeavour to ensure the accuracy and reliability of the Information provided but do not guarantee its accuracy or reliability and accept no liability (whether in tort or contract or otherwise) for any loss or damage arising from any inaccuracies or omissions.