
SEA-LION-7B-Instruct

Parameters

7.1B

Context Length

2,048 tokens

Modality

Text

Architecture

Dense

License

Apache-2.0

Release Date

1 Feb 2024

Knowledge Cutoff

-

Technical Specifications

Attention Structure

Multi-Head Attention

Hidden Dimension Size

-

Number of Layers

-

Attention Heads

-

Key-Value Heads

-

Activation Function

-

Normalization

-

Position Embedding

Absolute Position Embedding

System Requirements

VRAM requirements for different quantization methods and context sizes

SEA-LION-7B-Instruct

SEA-LION-7B-Instruct is the instruction-tuned version of the SEA-LION 7B base model, optimized for task completion and question answering in Southeast Asian languages. The model is released under the Apache 2.0 license.
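
The sketch below shows one way to load the model for inference with Hugging Face Transformers. The repository ID, the trust_remote_code flag, and the Indonesian example prompt are assumptions for illustration; the official model card documents the exact repository name and the prompt template the instruction-tuned checkpoint expects.

```python
# Minimal sketch of running SEA-LION-7B-Instruct with Hugging Face Transformers.
# The model ID and trust_remote_code flag are assumptions based on the usual
# AI Singapore release layout; check the official model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aisingapore/sea-lion-7b-instruct"  # assumed Hugging Face repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # model code shipped with the repository
    device_map="auto",        # place weights on the available GPU(s)
)

# Plain-text prompt for illustration; the instruction-tuned checkpoint expects a
# specific prompt template documented on the model card.
prompt = "Apa ibu kota Indonesia?"  # Indonesian: "What is the capital of Indonesia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```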

About SEA-LION

Southeast Asian Languages In One Network (SEA-LION) is a family of language models developed by AI Singapore for Southeast Asian languages. The models support English, Indonesian, Malay, Thai, Vietnamese, Tagalog, Burmese, Khmer, Lao, Tamil, and Chinese. The family focuses on regional linguistic patterns and is available in base and instruction-tuned variants.



Evaluation Benchmarks

No evaluation benchmarks are available for SEA-LION-7B-Instruct.

Rankings

Overall Rank

-

Coding Rank

-

GPU Requirements

The interactive VRAM calculator lets you choose a weight quantization method and a context size between 1,024 and 2,048 tokens, then reports the estimated VRAM required and a list of recommended GPUs.
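
As a rough guide, the sketch below estimates VRAM from the reported 7.1B parameter count. The bytes-per-weight figures, the 20% overhead factor, and the layer count and hidden size used for the KV cache are assumptions (the spec table above does not list them), so treat the output as a ballpark rather than the calculator's figures.

```python
# Back-of-envelope VRAM estimate for SEA-LION-7B-Instruct under different weight
# quantizations. This is not the site's calculator: the bytes-per-weight values and
# the KV-cache dimensions below are assumptions for illustration.

PARAMS = 7.1e9  # reported parameter count

# Approximate bytes per weight for common formats (assumed, excludes format overhead).
BYTES_PER_WEIGHT = {
    "fp16": 2.0,
    "int8": 1.0,
    "q4 (4-bit)": 0.5,
}

# Assumed architecture values for the KV cache (typical 7B-class model);
# the spec table on this page does not list hidden size or layer count.
N_LAYERS = 32
HIDDEN_DIM = 4096
KV_BYTES = 2  # fp16 keys/values


def estimate_vram_gb(quant: str, context_tokens: int) -> float:
    """Return an approximate VRAM requirement in GiB for the given settings."""
    weights = PARAMS * BYTES_PER_WEIGHT[quant]
    # KV cache: two tensors (K and V) per layer, each context_tokens x hidden_dim.
    kv_cache = 2 * N_LAYERS * context_tokens * HIDDEN_DIM * KV_BYTES
    overhead = 1.2  # ~20% headroom for activations and framework buffers
    return (weights + kv_cache) * overhead / 1024**3


if __name__ == "__main__":
    for quant in BYTES_PER_WEIGHT:
        for ctx in (1024, 2048):
            print(f"{quant:>12} @ {ctx} tokens: ~{estimate_vram_gb(quant, ctx):.1f} GiB")
```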