Sahabat-AI-Llama3-8B-Instruct

Parameters

8B

Context Length

8,192 tokens

Modality

Text

Architecture

Dense

License

Llama-3.1-Community

Release Date

14 Nov 2024

Knowledge Cutoff

-

Technical Specifications

Attention Structure

Grouped-Query Attention

Hidden Dimension Size

-

Number of Layers

-

Attention Heads

-

Key-Value Heads

-

Activation Function

-

Normalization

-

Position Embedding

Rotary Position Embedding (RoPE)

System Requirements

VRAM requirements for different quantization methods and context sizes
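As a rough illustration of how such VRAM figures are derived, the footprint of an 8B model can be approximated as quantized weights plus the KV cache. The layer counts and head dimensions below are the standard Llama 3 8B values (32 layers, 8 KV heads, head dimension 128), and the 20% overhead factor is an assumption; treat this as a sketch, not the calculator's exact formula.

```python
def vram_estimate_gib(params_b=8.0, bits_per_weight=16, context_tokens=1024,
                      n_layers=32, n_kv_heads=8, head_dim=128,
                      kv_bytes=2, overhead=1.2):
    """Rough VRAM estimate in GiB: quantized weights + FP16 KV cache."""
    # Model weights: parameter count times bytes per (possibly quantized) weight.
    weight_bytes = params_b * 1e9 * bits_per_weight / 8
    # KV cache: 2 tensors (K and V) per layer, per KV head, per token.
    kv_cache_bytes = 2 * n_layers * n_kv_heads * head_dim * kv_bytes * context_tokens
    # Assumed 20% overhead for activations and runtime buffers.
    return (weight_bytes + kv_cache_bytes) * overhead / 1024**3

for bits in (16, 8, 4):
    print(f"{bits}-bit weights, 1k context: "
          f"{vram_estimate_gib(bits_per_weight=bits):.1f} GiB")
```

With these assumptions, FP16 lands around 18 GiB and 4-bit quantization around 5 GiB at a 1,024-token context, which is why quantization choice dominates the GPU recommendation.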

Sahabat-AI-Llama3-8B-Instruct

An 8-billion-parameter model based on Llama 3, specifically tuned for Indonesian linguistic nuances and regional cultural context.

About Sahabat-AI

Sahabat-AI is an Indonesian language model family co-initiated by GoTo and Indosat Ooredoo Hutchison. Developed with AI Singapore and NVIDIA, it is a collection of models (based on Gemma 2 and Llama 3) specifically optimized for Bahasa Indonesia and regional languages like Javanese and Sundanese.
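Since the model is an instruct tune of Llama 3, prompts presumably follow the standard Llama 3 chat template. The sketch below assembles that prompt string by hand for illustration; in practice, `tokenizer.apply_chat_template` from the `transformers` library does this for you, and the exact template shipped with Sahabat-AI may differ.

```python
def llama3_chat_prompt(system: str, user: str) -> str:
    """Build a Llama 3 style chat prompt (sketch; verify against the
    model tokenizer's own chat template before relying on it)."""
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = llama3_chat_prompt(
    "Anda adalah asisten AI yang membantu.",  # "You are a helpful AI assistant."
    "Apa ibu kota Indonesia?",                # "What is the capital of Indonesia?"
)
print(prompt)
```

The model's completion is then generated after the final `assistant` header, terminated by an `<|eot_id|>` token.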


Other Sahabat-AI Models

Evaluation Benchmarks

No evaluation benchmarks are available for Sahabat-AI-Llama3-8B-Instruct.

Rankings

Overall Rank

-

Coding Rank

-

GPU Requirements

Interactive VRAM calculator: choose a quantization method for the model weights and a context size (1k, 4k, or 8k tokens) to see the VRAM required and recommended GPUs.