Total Parameters: 671B
Context Length: 131,072 tokens (128K)
Modality: Text
Architecture: Mixture of Experts (MoE)
License: Proprietary
Release Date: 12 Aug 2025
Training Data Cutoff: -
Active Parameters (per token): 37.0B
Number of Experts: 64
Active Experts: 6
Attention Structure: Multi-Head Attention
Hidden Dimension Size: 2048
Number of Layers: 61
Attention Heads: 128
Key-Value Heads: 128
Activation Function: -
Normalization: -
Position Embedding: Relative Position Embedding
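For illustration, the listed architecture values can be gathered into a single configuration object. The sketch below is a hypothetical Python dataclass (the name IlmuMoEConfig and its field names are assumptions, not an official API); every value is taken directly from the specification above.

from dataclasses import dataclass

@dataclass(frozen=True)
class IlmuMoEConfig:
    """Hypothetical container for the ILMU 1.0 specs listed above (illustrative only)."""
    total_params: str = "671B"           # total parameter count
    active_params: str = "37.0B"         # parameters activated per token
    context_length: int = 131_072        # maximum context window in tokens
    num_layers: int = 61
    hidden_size: int = 2048
    num_attention_heads: int = 128
    num_key_value_heads: int = 128
    num_experts: int = 64                # routed experts per MoE layer
    num_active_experts: int = 6          # experts selected per token
    attention: str = "Multi-Head Attention"
    position_embedding: str = "Relative Position Embedding"

config = IlmuMoEConfig()
# Fraction of experts consulted for each token: 6 / 64 ≈ 9.4%
print(f"{config.num_active_experts / config.num_experts:.1%} of experts active per token")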
ILMU is a Malaysian sovereign language model. Built on a fine-tuned DeepSeek-V3/R1 architecture, it is optimized for localized reasoning and the linguistic profile of Bahasa Malaysia. The model uses reinforcement learning to align its reasoning with local professional standards, specifically ensuring an inclusive approach to Malaysia's diverse religions and traditions, with strict respect for cultures, institutions, and royalty (3R). It is calibrated to recognize Malaysian sensitivities and profanity so as to maintain proper social etiquette.
Intelek Luhur Malaysia Untukmu (ILMU) is a language model developed by YTL AI Labs. Trained on YTL AI Cloud infrastructure, the model is designed around Malaysian social norms and linguistic nuances, with support for Bahasa Melayu and Chinese.
No evaluation benchmarks are available for ILMU 1.0.
Ranking: -
Coding Ranking: -