
ILMU 1.0

Total Parameters

671B

Context Length

131,072 tokens (128K)

Modality

Text

Architecture

Mixture of Experts (MoE)

License

Proprietary

Release Date

12 Aug 2025

Training Data Cutoff

-

Technical Specifications

Active Parameters (per Token)

37.0B

Number of Experts

64

Active Experts

6

Attention Structure

Multi-Head Attention

Hidden Dimension Size

2048

Number of Layers

61

Attention Heads

128

Key-Value Heads

128

Activation Function

-

Normalization

-

Positional Embedding

Relative Position Embedding
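The Mixture-of-Experts figures above (64 experts, 6 active per token) can be illustrated with a toy top-k routing layer. The sketch below is a generic top-k MoE gate, not ILMU's actual (unpublished) implementation; the expert count and top-k mirror the spec card, while the hidden size and expert FFN width are scaled down so the example runs quickly.

```python
import numpy as np

# Toy top-k MoE routing sketch. Expert count (64) and active experts (6)
# follow the spec card above; HIDDEN is deliberately small (the real model
# uses a much larger hidden dimension).
NUM_EXPERTS = 64
TOP_K = 6
HIDDEN = 64  # scaled down for the sketch

rng = np.random.default_rng(0)
router_w = rng.standard_normal((HIDDEN, NUM_EXPERTS)) / np.sqrt(HIDDEN)
# Each expert is a simple 2-layer ReLU FFN (hypothetical shapes).
experts = [
    (rng.standard_normal((HIDDEN, 4 * HIDDEN)) / np.sqrt(HIDDEN),
     rng.standard_normal((4 * HIDDEN, HIDDEN)) / np.sqrt(4 * HIDDEN))
    for _ in range(NUM_EXPERTS)
]

def moe_forward(x):
    """Route one token vector through the TOP_K highest-scoring experts."""
    logits = x @ router_w                 # router score per expert, (NUM_EXPERTS,)
    top = np.argsort(logits)[-TOP_K:]     # indices of the 6 active experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                  # softmax over the selected experts only
    out = np.zeros_like(x)
    for g, i in zip(gates, top):
        w1, w2 = experts[i]
        out += g * (np.maximum(x @ w1, 0.0) @ w2)  # gate-weighted expert output
    return out

token = rng.standard_normal(HIDDEN)
y = moe_forward(token)
print(y.shape)  # prints (64,)
```

Because only 6 of the 64 expert FFNs run per token, the per-token compute (the "active" parameters) is a small fraction of the total parameter count, which is the usual motivation for an MoE design.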

ILMU 1.0

ILMU is a Malaysian sovereign language model. Built on a fine-tuned DeepSeek-V3/R1 architecture, it is optimized for localized reasoning and the linguistic profile of Bahasa Malaysia. The model uses reinforcement learning to align its reasoning with local professional standards, ensuring an inclusive approach to Malaysia's diverse religions and traditions and strict respect for race, religion, and royalty (the "3R" sensitivities). It is also calibrated to recognize Malaysian sensitivities and profanity so that it maintains proper social etiquette.

About ILMU

Intelek Luhur Malaysia Untukmu (ILMU) is a language model developed by YTL AI Labs. Trained on YTL AI Cloud infrastructure, the model is designed for Malaysian social norms and linguistic nuances, including Bahasa Melayu and Chinese.


Other ILMU Models
  • No related models

Evaluation Benchmarks

No evaluation benchmarks are available for ILMU 1.0.

Rankings

Overall Ranking

-

Coding Ranking

-
