Total Parameters: 671B
Context Length: 131,072 tokens (128K)
Modality: Text
Architecture: Mixture of Experts (MoE)
License: Proprietary
Release Date: 12 Aug 2025
Knowledge Cutoff: -
Active Parameters: 37.0B
Number of Experts: 64
Active Experts per Token: 6
Attention Structure: Multi-head Latent Attention (MLA)
Hidden Dimension Size: 2048
Number of Layers: 61
Attention Heads: 128
Key-Value Heads: 128
Activation Function: -
Normalization: -
Position Embedding: Rotary Position Embedding (RoPE)
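
To illustrate how the routing figures above fit together (64 experts, 6 active per token), below is a minimal sketch of top-k mixture-of-experts routing in PyTorch. The layer widths, gating scheme, and class names are illustrative assumptions, not ILMU's published implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Minimal top-k MoE feed-forward sketch matching the card above:
# 64 experts, 6 active per token. The feed-forward width D_FF and the
# softmax-over-top-k gating are assumptions for illustration only.
NUM_EXPERTS = 64
TOP_K = 6
D_MODEL = 2048      # "Hidden Dimension Size" from the card
D_FF = 4 * D_MODEL  # assumed per-expert feed-forward width

class TopKMoE(nn.Module):
    def __init__(self):
        super().__init__()
        # The router scores every token against every expert.
        self.router = nn.Linear(D_MODEL, NUM_EXPERTS, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(D_MODEL, D_FF), nn.GELU(),
                          nn.Linear(D_FF, D_MODEL))
            for _ in range(NUM_EXPERTS)
        )

    def forward(self, x):                    # x: (num_tokens, D_MODEL)
        scores = self.router(x)              # (num_tokens, NUM_EXPERTS)
        top_vals, top_idx = scores.topk(TOP_K, dim=-1)
        gates = F.softmax(top_vals, dim=-1)  # weights over the 6 chosen experts
        out = torch.zeros_like(x)
        for e in range(NUM_EXPERTS):
            for k in range(TOP_K):
                sel = top_idx[:, k] == e     # tokens whose k-th choice is expert e
                if sel.any():
                    out[sel] += gates[sel, k].unsqueeze(-1) * self.experts[e](x[sel])
        return out

# Only 6 of the 64 expert MLPs run for each token, which is what keeps
# the active parameter count far below the total parameter count.
tokens = torch.randn(8, D_MODEL)
print(TopKMoE()(tokens).shape)  # torch.Size([8, 2048])
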
ILMU is a Malaysian sovereign language model. Built on a fine-tuned DeepSeek-V3/R1 architecture, it is optimized for localized reasoning and the linguistic profile of Bahasa Malaysia. The model uses reinforcement learning to align its reasoning with local professional standards, specifically ensuring an inclusive approach to Malaysia's diverse religions and traditions and strict respect for its cultures, institutions, and royalty, in line with the country's "3R" (race, religion, royalty) sensitivities. It is calibrated to recognize Malaysian sensitivities and profanity and to maintain proper social etiquette.
Intelek Luhur Malaysia Untukmu (ILMU) is a language model developed by YTL AI Labs. Trained on YTL AI Cloud infrastructure, the model is designed for Malaysian social norms and linguistic nuances, including Bahasa Melayu and Chinese.
No evaluation benchmarks are available for ILMU 1.0.
Overall Rank: -
Coding Rank: -