Parameters: 3B
Context Length: 4,096 tokens
Modality: Text
Architecture: Dense
License: Apache-2.0
Release Date: 15 Jan 2024
Knowledge Cutoff: -
Attention Structure: Multi-Head Attention
Hidden Dimension Size: -
Number of Layers: -
Attention Heads: -
Key-Value Heads: -
Activation Function: -
Normalization: -
Position Embedding: Absolute Position Embedding
MaLLaM-3B is a 3-billion-parameter Malaysian language model designed for edge deployment. It is bilingual in Bahasa Malaysia and English, trained on Malaysian digital content and literature, and handles local idioms and cultural references. It is released under the Apache-2.0 license.
Malaysian Large Language Model (MaLLaM) is an open-source language model family developed to support Bahasa Malaysia and English. The family is trained on Malaysian text data, including local news, literature, and digital content, and is designed to capture Malaysian linguistic nuances and cultural context. It is available in multiple parameter sizes to suit different hardware deployments.
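A minimal loading sketch for such a deployment, using Hugging Face transformers with 4-bit quantization to fit the 3B weights on small GPUs. The repository id mesolitica/mallam-3b-4096 and the example prompt are assumptions, not confirmed by this page.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Assumed Hugging Face repo id; substitute the actual published weights.
MODEL_ID = "mesolitica/mallam-3b-4096"

# 4-bit weights keep a 3B model within roughly 2 GB of VRAM,
# matching the edge-deployment target described above.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=quant_config,
    device_map="auto",
)

# Example bilingual prompt (a Malay idiom), illustrative only.
prompt = "Apakah maksud 'kais pagi makan pagi'?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```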
No evaluation benchmarks are available for MaLLaM-3B.
VRAM requirements for different quantization methods and context sizes
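As a rough guide, weight memory scales with the bits per parameter of the chosen quantization, while the KV cache grows linearly with context length. The sketch below estimates total VRAM under stated assumptions: MaLLaM-3B's layer count and hidden dimension are not listed in the spec above, so n_layers and hidden_dim are placeholders, as are the FP16 KV cache and the ~20% runtime overhead.

```python
def estimate_vram_gb(
    n_params: float = 3e9,       # 3B parameters (from the spec above)
    bits_per_weight: float = 4,  # quantization method: 4 = Q4, 8 = Q8, 16 = FP16
    context_tokens: int = 1024,  # chosen context size
    n_layers: int = 32,          # assumption: not listed in the spec
    hidden_dim: int = 2560,      # assumption: not listed in the spec
    kv_bytes: int = 2,           # assumption: FP16 keys/values
    overhead: float = 1.2,       # assumption: ~20% for activations/buffers
) -> float:
    """Back-of-the-envelope VRAM estimate in GB."""
    weights = n_params * bits_per_weight / 8
    # KV cache: 2 tensors (K and V) per layer, each context_tokens x hidden_dim.
    # With standard multi-head attention (per the spec), K/V span the full hidden_dim.
    kv_cache = 2 * n_layers * context_tokens * hidden_dim * kv_bytes
    return (weights + kv_cache) * overhead / 1024**3

# Print estimates for common quantization methods and context sizes.
for bits, label in ((4, "Q4"), (8, "Q8"), (16, "FP16")):
    for ctx in (1024, 4096):
        gb = estimate_vram_gb(bits_per_weight=bits, context_tokens=ctx)
        print(f"{label:>4} @ {ctx:>4} tokens: ~{gb:.2f} GB")
```

Under these assumptions, Q4 weights alone take about 1.5 GB, and a full 4,096-token context adds roughly 1.25 GB of KV cache; actual figures depend on the real architecture and runtime.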