
SEA-LION-7B

Parameters

7.1B

Context Length

2,048

Modality

Text

Architecture

Dense

License

MIT

Release Date

1 Dec 2023

Knowledge Cutoff

Sep 2023

Technical Specifications

Attention Structure

Multi-Head Attention

Hidden Dimension Size

4096

Number of Layers

32

Attention Heads

32

Key-Value Heads

32

Activation Function

GELU

Normalization

Layer Normalization

Position Embedding

Absolute Position Embedding

SEA-LION-7B

SEA-LION-7B (Southeast Asian Languages In One Network) is a 7.1 billion parameter decoder-only transformer model developed by AI Singapore to address the linguistic and cultural specificities of the Southeast Asian region. Built on the MosaicML Pretrained Transformer (MPT) architecture, the model is trained from scratch on a massive 980 billion token corpus. This training set is uniquely balanced, featuring significant representation for 11 regional languages including Indonesian, Malay, Thai, Vietnamese, Filipino, Tamil, Burmese, Khmer, and Lao, alongside English and Chinese, ensuring the model captures regional nuances often overlooked by Western-centric LLMs.

Technically, SEA-LION-7B diverges from standard MPT configurations by utilizing absolute learned positional embeddings rather than ALiBi, which provides a stable foundation for its 2,048-token context window. The architecture consists of 32 transformer layers with a hidden dimension of 4096 and 32 attention heads. It employs Low-Precision LayerNorm for normalization and uses the GeLU (Gaussian Error Linear Unit) activation function. A critical innovation is the SEABPETokenizer, a custom Byte-Pair Encoding tokenizer with a 256,000-token vocabulary specifically optimized to reduce the token-to-word ratio for Southeast Asian scripts, thereby improving inference efficiency and comprehension.
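The headline parameter count can be sanity-checked from these architectural figures. The sketch below assumes standard MPT-style decoder blocks with a 4x MLP expansion and tied input/output embeddings; those are assumptions, and details this sketch ignores (biases, norm parameters, the exact MLP ratio) account for the gap with the reported 7.1B.

```python
# Rough parameter-count check for SEA-LION-7B from its published specs.
# Assumes MPT-style blocks with 4x MLP expansion and tied embeddings.

VOCAB = 256_000      # SEABPETokenizer vocabulary size
D_MODEL = 4096       # hidden dimension
N_LAYERS = 32        # transformer layers
CONTEXT = 2048       # absolute learned positional embeddings

embeddings = VOCAB * D_MODEL + CONTEXT * D_MODEL  # token + position tables
attn_per_layer = 4 * D_MODEL * D_MODEL            # Q, K, V, output projections
mlp_per_layer = 2 * D_MODEL * (4 * D_MODEL)       # up- and down-projection
total = embeddings + N_LAYERS * (attn_per_layer + mlp_per_layer)

print(f"~{total / 1e9:.2f}B parameters")
```

The estimate lands within a few percent of the reported figure, which is as close as a back-of-envelope count of this kind can be expected to get.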

Designed for research and regional application deployment, SEA-LION-7B serves as a base for specialized natural language understanding and generation tasks. Its performance characteristics are tailored for multilingual translation, sentiment analysis, and culturally aware text generation within the ASEAN context. The model's open-weights release under the MIT license encourages community-driven fine-tuning and adaptation for specific regional industrial use cases while maintaining a transparent and accessible framework for researchers and developers.

About SEA-LION

Southeast Asian Languages In One Network (SEA-LION) is a family of language models developed by AI Singapore for Southeast Asian languages. The models support English, Indonesian, Malay, Thai, Vietnamese, Tagalog, Burmese, Khmer, Lao, Tamil, and Chinese. The family focuses on regional linguistic patterns and is available in base and instruction-tuned variants.



Evaluation Benchmarks

No evaluation benchmarks are available for SEA-LION-7B.

Rankings

Overall Rank

-

Coding Rank

-

GPU Requirements

The VRAM needed to run the model depends on the quantization chosen for the weights and on the context size (up to the model's 2,048-token window).
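As a rough guide, weight memory alone can be estimated from the parameter count and the bytes per parameter at each quantization level. This minimal sketch covers weights only; the KV cache, activations, and framework overhead add more on top.

```python
# Back-of-envelope VRAM estimate for SEA-LION-7B's weights at common
# quantization levels. Weights only: KV cache and activations are extra.

PARAMS = 7.1e9  # reported parameter count

BYTES_PER_PARAM = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

for name, bytes_per in BYTES_PER_PARAM.items():
    gb = PARAMS * bytes_per / 1e9
    print(f"{name:>9}: ~{gb:.1f} GB")
```

At half precision the weights come to roughly 14 GB, so a single 16 GB GPU is tight, while int8 or int4 quantization brings the model comfortably within consumer-card budgets.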