
Typhoon-2-70B

Parameters

70B

Context Length

128K

Modality

Text

Architecture

Dense

License

Llama 3.1 Community License

Release Date

1 Jun 2024

Knowledge Cutoff

Dec 2023

Technical Specifications

Attention Structure

Grouped-Query Attention

Hidden Dimension Size

8192

Number of Layers

80

Attention Heads

64

Key-Value Heads

8

Activation Function

SwiGLU

Normalization

RMS Normalization

Position Embedding

Rotary Position Embedding (RoPE)
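The specifications above imply the per-head dimension and the GQA grouping, which can be checked with simple arithmetic (a minimal sketch; all figures are taken directly from the table):

```python
# Derive per-head dimension and GQA group size from the listed specs.
hidden_dim = 8192    # Hidden Dimension Size
n_heads = 64         # Attention Heads (query heads)
n_kv_heads = 8       # Key-Value Heads

head_dim = hidden_dim // n_heads      # dimension of each attention head
group_size = n_heads // n_kv_heads    # query heads sharing each KV head

print(head_dim)    # → 128
print(group_size)  # → 8
```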

Typhoon-2-70B

Typhoon-2-70B is a high-capacity Thai-English large language model developed by SCB 10X, specifically architected to address the linguistic complexities of the Thai language. Built upon the Llama 3.1 70B backbone, this model undergoes extensive continual pre-training on a curated corpus of over 5 billion high-quality Thai tokens. This training process is designed to align the model with Thai cultural nuances and linguistic structures while preserving the original English reasoning capabilities of the underlying architecture. The resulting model serves as a foundation for enterprise-level applications requiring high precision in bilingual contexts.

The technical architecture employs a dense, decoder-only transformer with Grouped-Query Attention (GQA) to improve inference efficiency and reduce key-value cache memory. It supports a 128K-token context window, enabling the processing of lengthy legal documents, technical manuals, and long multi-turn conversation histories. The model integrates post-training techniques, including supervised fine-tuning (SFT) and Direct Preference Optimization (DPO), to improve instruction-following accuracy and function-calling capability. These optimizations allow the model to interact with external tools and APIs, supporting complex agentic workflows.
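GQA matters most at long contexts, where the key-value cache dominates GPU memory. A rough estimate from the card's figures illustrates the saving (a sketch assuming FP16 caches and the full 131,072-token window; real serving stacks with quantized or paged caches will differ):

```python
# Estimate KV-cache size at the full 128K context, FP16 (2 bytes/value).
n_layers, n_kv_heads, n_heads, head_dim = 80, 8, 64, 128
context, bytes_per_value = 128 * 1024, 2

# Keys and values are each cached per layer, per KV head, per token.
gqa_cache = 2 * n_layers * n_kv_heads * head_dim * bytes_per_value * context
mha_cache = 2 * n_layers * n_heads * head_dim * bytes_per_value * context

print(gqa_cache / 2**30)  # → 40.0  (GiB with 8 KV heads)
print(mha_cache / 2**30)  # → 320.0 (GiB if all 64 heads kept their own KV)
```

With 8 KV heads instead of 64, the cache shrinks by the group size (8×), which is what makes serving the 128K window practical.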

Released under the Llama 3.1 Community License, Typhoon-2-70B provides a transparent path for developers to integrate sovereign AI capabilities into production environments. Its design emphasizes performance in specialized Thai domains such as legal reasoning, cultural content generation, and sophisticated data analysis. By bridging the gap between English-centric foundation models and local language requirements, Typhoon-2-70B enables the development of localized AI solutions that maintain parity with global standards of reasoning and accuracy.

About Typhoon

Typhoon is a Thai language model family developed by SCB 10X. It is specifically optimized for the Thai language, addressing complexities such as the lack of word delimiters and tonal nuances. The models are trained on Thai-centric datasets including legal, cultural, and historical documents to ensure localized context and knowledge.

