Parameters: 8B
Context Length: 8,192 tokens
Modality: Text
Architecture: Dense
License: Llama-3.1-Community
Release Date: 14 Nov 2024
Knowledge Cutoff: -
Attention Structure: Grouped-Query Attention
Hidden Dimension Size: 4,096
Number of Layers: 32
Attention Heads: 32
Key-Value Heads: 8
Activation Function: SwiGLU (SiLU-gated)
Normalization: RMSNorm
Position Embedding: Rotary Position Embedding (RoPE)
Architecture details follow the Llama 3 8B base model configuration, which continued pre-training does not alter.
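With 32 attention heads sharing 8 key-value heads, each KV head serves a group of 4 query heads. The sketch below illustrates that grouping in plain PyTorch; it is a minimal illustration, and the head counts and head dimension are assumed from the Llama 3 8B base configuration rather than stated on this page.

```python
# Grouped-query attention: replicate each KV head across its query-head group.
import torch

n_heads, n_kv_heads, head_dim = 32, 8, 128   # assumed Llama 3 8B shapes
group = n_heads // n_kv_heads                 # 4 query heads per KV head

q = torch.randn(1, n_heads, 16, head_dim)     # (batch, heads, seq, dim)
k = torch.randn(1, n_kv_heads, 16, head_dim)  # K/V carry only 8 heads
v = torch.randn(1, n_kv_heads, 16, head_dim)

# Expand K/V so every query head attends over its group's shared KV head.
k = k.repeat_interleave(group, dim=1)
v = v.repeat_interleave(group, dim=1)

attn = torch.nn.functional.scaled_dot_product_attention(q, k, v)
print(attn.shape)  # torch.Size([1, 32, 16, 128])
```

Sharing KV heads this way shrinks the KV cache roughly 4x relative to full multi-head attention, which keeps the cache term in the VRAM estimate below small.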
VRAM requirements for different quantization methods and context sizes
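As a rule of thumb, weight memory is the parameter count times the bytes per weight for the chosen quantization, and the KV cache grows linearly with context length. The back-of-the-envelope sketch below stands in for the page's interactive calculator; it ignores activation memory and framework overhead, and the shape constants are again assumed from the Llama 3 8B base.

```python
# Back-of-the-envelope VRAM estimate for an 8B Llama-3-style model.
BYTES_PER_WEIGHT = {"fp16": 2.0, "int8": 1.0, "q4": 0.5}  # common quantizations

N_PARAMS = 8e9      # 8B parameters
N_LAYERS = 32       # assumed from the Llama 3 8B base config
N_KV_HEADS = 8      # grouped-query attention
HEAD_DIM = 128      # hidden size 4096 / 32 attention heads

def estimate_vram_gib(quant: str, context_tokens: int, kv_bytes: float = 2.0) -> float:
    """Weights plus KV cache, ignoring activations and runtime overhead."""
    weights = N_PARAMS * BYTES_PER_WEIGHT[quant]
    # KV cache: 2 tensors (K and V) per layer, each context x kv_heads x head_dim.
    kv_cache = 2 * N_LAYERS * context_tokens * N_KV_HEADS * HEAD_DIM * kv_bytes
    return (weights + kv_cache) / 1024**3

for quant in BYTES_PER_WEIGHT:
    for ctx in (1024, 8192):  # 1,024-token default up to the full 8,192 context
        print(f"{quant:>4} @ {ctx:>5} tokens: ~{estimate_vram_gib(quant, ctx):.1f} GiB")
```

Real-world usage runs higher than these figures because of activations, the CUDA context, and allocator overhead.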
An 8-billion-parameter model based on Llama 3, tuned for Indonesian linguistic nuances and regional cultural context.
Sahabat-AI is an Indonesian language model family co-initiated by GoTo and Indosat Ooredoo Hutchison. Developed with AI Singapore and NVIDIA, it is a collection of models (based on Gemma 2 and Llama 3) specifically optimized for Bahasa Indonesia and regional languages like Javanese and Sundanese.
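A minimal inference sketch with Hugging Face transformers follows; the repository id below is an assumption about where the checkpoint is published, not something confirmed here, so verify it before use.

```python
# Minimal chat example via Hugging Face transformers.
# NOTE: the repo id is assumed, not confirmed by this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Bahasa Indonesia prompt: "What food is Yogyakarta known for?"
messages = [{"role": "user", "content": "Apa makanan khas Yogyakarta?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```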
No evaluation benchmarks are available for Sahabat-AI-Llama3-8B-Instruct.
Overall Rank: -
Coding Rank: -