ERNIE-4.5-0.3B

Parameters: 0.3B
Context Length: 131,072 tokens (128K)
Modality: Text
Architecture: Dense
License: Apache 2.0
Release Date: 30 Jun 2025
Knowledge Cutoff: -

Technical Specifications

Attention Structure: Grouped-Query Attention (16 query heads, 2 key-value heads)
Hidden Dimension Size: -
Number of Layers: 18
Attention Heads: 16
Key-Value Heads: 2
Activation Function: -
Normalization: -
Position Embedding: Rotary Position Embedding (RoPE)
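
These values can be verified directly from the model configuration with the Hugging Face transformers library. The repository id and the attribute names below follow common Hugging Face conventions and are assumptions; check the actual model card for the exact path.

    from transformers import AutoConfig

    # Repository id and attribute names are assumptions based on common
    # Hugging Face conventions; verify against the actual model card.
    config = AutoConfig.from_pretrained("baidu/ERNIE-4.5-0.3B-PT", trust_remote_code=True)

    print(config.num_hidden_layers)        # expected: 18
    print(config.num_attention_heads)      # expected: 16
    print(config.num_key_value_heads)      # expected: 2
    print(config.max_position_embeddings)  # expected: 131072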

System Requirements

VRAM requirements for different quantization methods and context sizes
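
Exact figures depend on the runtime, but a back-of-envelope estimate can be computed from the specifications above. The sketch below assumes a hidden size of 1024 (not listed in the specs) and an FP16 KV cache; real deployments add framework and activation overhead on top.

    # Back-of-envelope VRAM estimate for ERNIE-4.5-0.3B.
    # hidden_size = 1024 is an assumption; it is not listed in the specs above.
    params = 0.3e9            # nominal parameter count
    layers = 18
    kv_heads = 2
    hidden = 1024             # assumed
    head_dim = hidden // 16   # 16 attention heads -> 64 dims per head (assumed)
    context = 131_072         # maximum context length

    for name, bytes_per_weight in [("FP16", 2), ("INT8", 1), ("INT4", 0.5)]:
        weights_gb = params * bytes_per_weight / 1e9
        # KV cache: 2 tensors (K and V) x layers x kv_heads x head_dim x tokens,
        # kept in FP16 (2 bytes) regardless of weight quantization.
        kv_gb = 2 * layers * kv_heads * head_dim * context * 2 / 1e9
        print(f"{name}: weights ~{weights_gb:.2f} GB, "
              f"full-context KV cache ~{kv_gb:.2f} GB")

Under these assumptions the FP16 weights come to roughly 0.6 GB and the full 131,072-token KV cache to roughly 1.2 GB, which the 2 key-value heads keep small relative to a layout with one KV head per query head.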

ERNIE-4.5-0.3B

The ERNIE-4.5-0.3B model represents a foundational component within Baidu's ERNIE 4.5 model family. This model is designed as a compact and efficient language model, focusing on general-purpose natural language understanding and generation tasks. Unlike the larger Mixture-of-Experts (MoE) variants in the ERNIE 4.5 series, ERNIE-4.5-0.3B employs a dense transformer architecture, making it suitable for high-throughput applications and resource-constrained environments. Its engineering prioritizes robust language capabilities while maintaining a minimal operational footprint, enabling deployment across diverse scenarios where computational efficiency is paramount.
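
As a minimal starting point, the model can be loaded with the Hugging Face transformers library. The repository id below is an assumption; verify the exact hub path before use.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed hub path; verify before use
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,
        device_map="auto",      # requires the accelerate package
        trust_remote_code=True,
    )

    inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))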

Architecturally, ERNIE-4.5-0.3B is built on a dense transformer framework, featuring 18 layers and 16 attention heads paired with only 2 key-value heads, a grouped-query arrangement that shrinks the KV cache (see the sketch below). This configuration supports its text processing capabilities, facilitating the interpretation and generation of textual content. The model benefits from training methodologies shared across the broader ERNIE 4.5 family, which incorporate advanced techniques for optimized efficiency. While the larger, multimodal ERNIE 4.5 models integrate a heterogeneous Mixture-of-Experts design and modality-specific training, ERNIE-4.5-0.3B focuses on delivering high performance within its text-only, dense design.
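
With 16 query heads sharing 2 key-value heads, each key-value head serves a group of 8 query heads. A minimal PyTorch sketch of that sharing (illustrative only, not Baidu's implementation; the head dimension of 64 is an assumption):

    import torch

    batch, seq = 1, 8
    n_q_heads, n_kv_heads, head_dim = 16, 2, 64  # head_dim of 64 is assumed

    q = torch.randn(batch, n_q_heads, seq, head_dim)
    k = torch.randn(batch, n_kv_heads, seq, head_dim)
    v = torch.randn(batch, n_kv_heads, seq, head_dim)

    # Each KV head serves n_q_heads // n_kv_heads = 8 query heads: the 2 KV
    # heads are repeated so every query head has a matching key/value tensor.
    group = n_q_heads // n_kv_heads
    k = k.repeat_interleave(group, dim=1)  # (1, 16, seq, head_dim)
    v = v.repeat_interleave(group, dim=1)

    # Plain scaled dot-product attention (causal mask omitted for brevity).
    scores = (q @ k.transpose(-2, -1)) / head_dim ** 0.5
    attn = torch.softmax(scores, dim=-1) @ v
    print(attn.shape)  # torch.Size([1, 16, 8, 64])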

Designed for practical application, ERNIE-4.5-0.3B is particularly well-suited for high-throughput and edge computing deployments. Its use cases encompass tasks such as sentiment analysis, topic categorization, and spam detection at scale. The model's optimized design further enables direct execution on mobile devices, supporting real-time applications such as text completion or streamlined question-answering systems. Additionally, ERNIE-4.5-0.3B is capable of generating concise summaries of text documents, providing an efficient tool for rapid information review.
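
For classification-style workloads such as sentiment analysis, a small instruction-tuned model can simply be prompted. A sketch continuing from the loading example above (the exact prompt format the model expects is an assumption):

    prompt = (
        "Classify the sentiment of the following review as Positive or Negative.\n"
        "Review: The battery life is excellent and the screen is gorgeous.\n"
        "Sentiment:"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=4, do_sample=False)
    answer = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
    print(answer.strip())  # ideally "Positive"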

About ERNIE 4.5

The Baidu ERNIE 4.5 family consists of ten large-scale model variants. Its larger, multimodal members utilize a heterogeneous Mixture-of-Experts (MoE) architecture that shares parameters across modalities while also dedicating parameters to specific modalities, supporting efficient language and multimodal processing; ERNIE-4.5-0.3B is the family's compact, dense, text-only member.
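
As a rough illustration of the routing idea only (a toy top-1 router, not ERNIE 4.5's heterogeneous design, which additionally distinguishes modality-shared from modality-specific experts), each token is dispatched to one small expert network:

    import torch
    import torch.nn as nn

    # Toy MoE layer with top-1 routing over a pool of small expert MLPs.
    dim, n_experts = 32, 4
    experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))
    router = nn.Linear(dim, n_experts)

    x = torch.randn(10, dim)               # 10 token embeddings
    expert_idx = router(x).argmax(dim=-1)  # top-1 expert per token
    out = torch.stack([experts[int(i)](t) for t, i in zip(x, expert_idx)])
    print(out.shape)  # torch.Size([10, 32])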


Evaluation Benchmarks

No evaluation benchmarks are available for ERNIE-4.5-0.3B.
