
MaLLaM-3B

Parameters

3B

Context Length

4,096 tokens

Modality

Text

Architecture

Dense

License

Apache-2.0

Release Date

15 Jan 2024

Knowledge Cutoff

-

Technical Specifications

Attention Structure

Multi-Head Attention

Hidden Dimension Size

-

Number of Layers

-

Attention Heads

-

Key-Value Heads

-

Activation Function

-

Normalization

-

Position Embedding

Absolute Position Embedding

System Requirements

VRAM requirements for different quantization methods and context sizes

MaLLaM-3B

MaLLaM-3B is a 3-billion-parameter Malaysian language model designed for edge deployment. It is bilingual in Bahasa Malaysia and English and was trained on Malaysian digital content and literature, which gives it coverage of local idioms and cultural references. It is released under the Apache 2.0 license.
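
As a sketch of how a dense 3B causal language model like this is typically run with Hugging Face Transformers (the repository ID below is a hypothetical placeholder, not confirmed by this page):

```python
# Minimal generation sketch with Hugging Face Transformers.
# MODEL_ID is a placeholder; substitute the actual MaLLaM-3B repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-org/mallam-3b"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # half precision keeps 3B weights near ~6 GB
    device_map="auto",          # place weights on a GPU if one is available
)

# Bahasa Malaysia prompt; the model is bilingual, so English also works.
prompt = "Apakah maksud peribahasa 'bagai aur dengan tebing'?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # stays well within the 4,096-token context window
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```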

About MaLLaM

Malaysian Large Language Model (MaLLaM) is an open-source language model family developed to support Bahasa Malaysia and English. The models are trained on Malaysian text data including local news, literature, and digital content. The family is designed to capture Malaysian linguistic nuances and cultural context, and is available in multiple parameter sizes to suit different hardware deployments.



Evaluation Benchmarks

No evaluation benchmarks are available for MaLLaM-3B.

Rankings

Overall Rank

-

Coding Rank

-

GPU Requirements

VRAM needed to run MaLLaM-3B depends on the weight quantization method and the context size (1k, 2k, or 4k tokens).
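
A rough back-of-envelope estimate is sketched below. It is illustrative only: the 32-layer / 2560-hidden architecture, per-token KV-cache size, and 1.2x overhead factor are assumptions, since this page does not publish those figures.

```python
# Back-of-envelope VRAM estimate for a dense decoder-only model.
# Layer count, hidden size, and overhead factor are ASSUMED values,
# not taken from this spec sheet.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def estimate_vram_gb(
    n_params: float = 3e9,       # MaLLaM-3B parameter count
    quant: str = "fp16",
    context_tokens: int = 4096,  # maximum context from the spec sheet
    kv_bytes_per_token: int = 2 * 2 * 32 * 2560,  # assumed: (K+V) * fp16 * layers * hidden
    overhead: float = 1.2,       # assumed: activations, buffers, fragmentation
) -> float:
    """Weights plus KV cache, scaled by an overhead factor, in GiB."""
    weights = n_params * BYTES_PER_PARAM[quant]
    kv_cache = context_tokens * kv_bytes_per_token
    return (weights + kv_cache) * overhead / 1024**3

for q in BYTES_PER_PARAM:
    print(f"{q}: ~{estimate_vram_gb(quant=q):.1f} GiB at 4,096-token context")
```

Under these assumptions the estimate comes out to roughly 8 GiB at fp16, 5 GiB at int8, and 3 GiB at int4, consistent with a 3B model being deployable on consumer and edge GPUs.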
