
ERNIE-4.5-0.3B-Base

Parameters: 360M
Context Length: 131,072 tokens

Modality: Text
Architecture: Dense
License: Apache License 2.0
Release Date: 30 Jun 2025
Knowledge Cutoff: -

Technical Specifications

Attention Structure: Grouped-Query Attention (GQA)

Hidden Dimension: 1024
Layers: 18
Attention Heads: 16
Key-Value Heads: 2
Activation Function: Swish (SiLU)
Normalization: RMS Normalization
Position Embedding: Absolute Position Embedding

System Requirements

VRAM requirements for different quantization methods and context sizes.

ERNIE-4.5-0.3B-Base

The ERNIE-4.5-0.3B-Base model is part of Baidu's ERNIE 4.5 family of foundation models and is designed for general-purpose text understanding and generation. It is compact, with 360 million parameters and a dense architecture, making it suitable for deployment in environments with limited computational resources or for applications requiring a lightweight inference footprint. As an open-source offering under the Apache License 2.0, it provides a foundational language model for developers and researchers to build upon and integrate into text-centric systems.

From an architectural standpoint, ERNIE-4.5-0.3B-Base implements a transformer structure comprising 18 layers. It utilizes 16 attention heads for queries and 2 key-value heads, indicating a Grouped-Query Attention (GQA) mechanism for efficient processing. The model is trained to support a substantial context length of up to 131,072 tokens, enabling it to process and generate coherent text over extended sequences. Unlike some other variants within the ERNIE 4.5 series, this model employs a dense architecture rather than a Mixture-of-Experts (MoE) structure. The hidden dimension size is 1024, and it employs RMS Normalization and the Swish (SiLU) activation function. The model utilizes an absolute position embedding.
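Because each of the 2 key-value heads is shared by eight of the 16 query heads, GQA shrinks the KV cache substantially relative to full multi-head attention. A minimal back-of-the-envelope sketch using the specifications above (the fp16 cache entry size is our assumption; runtimes may allocate differently):

```python
# KV-cache estimate for ERNIE-4.5-0.3B-Base, from the specs on this page.
# Assumes an fp16 (2-byte) cache entry; actual runtimes may differ.
layers = 18
hidden = 1024
q_heads = 16
kv_heads = 2
head_dim = hidden // q_heads              # 64
bytes_per_entry = 2                       # fp16 assumption

# Per token: one K and one V tensor per layer, each kv_heads * head_dim wide.
kv_bytes_per_token = 2 * layers * kv_heads * head_dim * bytes_per_entry
print(kv_bytes_per_token)                 # 9216 bytes, ~9 KiB per token

full_context = 131_072
print(kv_bytes_per_token * full_context / 2**30)  # 1.125 GiB at full context
```

With 16 key-value heads (plain multi-head attention) the same cache would be eight times larger, roughly 9 GiB at the full 131,072-token context.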

This model is primarily optimized for text completion and can be fine-tuned for specialized applications through various methods, including Supervised Fine-tuning (SFT), Low-Rank Adaptation (LoRA), and Direct Preference Optimization (DPO). Its compatibility with widely adopted frameworks such as Hugging Face Transformers and Baidu's FastDeploy toolkit facilitates its integration into existing development workflows. The model is designed to support both English and Chinese languages.
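As a sketch of that integration path, loading the model for text completion with Hugging Face Transformers might look like the following. The model ID and the trust_remote_code flag are assumptions; check Baidu's Hugging Face organization for the published checkpoint name:

```python
# Minimal text-completion sketch with Hugging Face Transformers.
# "baidu/ERNIE-4.5-0.3B-Base-PT" is an assumed model ID; verify the
# published name before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-0.3B-Base-PT"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",        # use the checkpoint's native precision
    trust_remote_code=True,
)

inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```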

About ERNIE 4.5

The Baidu ERNIE 4.5 family consists of ten large-scale multimodal models. They utilize a heterogeneous Mixture-of-Experts (MoE) architecture, which enables parameter sharing across modalities while also employing dedicated parameters for specific modalities, supporting efficient language and multimodal processing.
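The 0.3B model on this page is dense, but the heterogeneous MoE idea behind the larger family members can be illustrated conceptually: shared experts are eligible for every token, while dedicated experts only receive tokens of their own modality. A purely illustrative sketch, with names and structure invented for exposition (not Baidu's implementation):

```python
# Conceptual sketch of heterogeneous MoE routing (illustrative only):
# shared experts serve all modalities; dedicated experts are eligible
# only for tokens of their own modality.
from dataclasses import dataclass

@dataclass
class Expert:
    name: str
    modality: str  # "shared", "text", or "vision"

EXPERTS = [Expert("s0", "shared"), Expert("s1", "shared"),
           Expert("t0", "text"), Expert("t1", "text"),
           Expert("v0", "vision"), Expert("v1", "vision")]

def eligible_experts(token_modality: str) -> list[Expert]:
    """Candidate pool a router would score for this token."""
    return [e for e in EXPERTS if e.modality in ("shared", token_modality)]

print([e.name for e in eligible_experts("text")])    # ['s0', 's1', 't0', 't1']
print([e.name for e in eligible_experts("vision")])  # ['s0', 's1', 'v0', 'v1']
```

A real router would additionally score the candidate pool and dispatch each token to its top-k experts; the point here is only the shared/dedicated partitioning of parameters across modalities.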



Evaluation Benchmarks

Rankings apply to local LLMs.

No evaluation benchmarks are available for ERNIE-4.5-0.3B-Base.

Rankings

Rank: -
Coding Rank: -

GPU Requirements

[Interactive VRAM calculator: choose a quantization method for the model weights and a context size (1K to 128K tokens) to see the required VRAM and recommended GPUs. A full calculator is available on the site.]
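In place of the interactive tool, required VRAM can be approximated as weight memory plus KV cache. The bytes-per-parameter figures below are common rules of thumb for each quantization method, not values taken from this page:

```python
# Rough VRAM estimate for ERNIE-4.5-0.3B-Base: weights + KV cache.
# Bytes-per-parameter values are rule-of-thumb assumptions; runtime
# overheads (activations, buffers) are ignored.
PARAMS = 360e6                     # 0.36B parameters
LAYERS, KV_HEADS, HEAD_DIM = 18, 2, 64
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def vram_gib(quant: str, context_tokens: int, kv_bytes: int = 2) -> float:
    weights = PARAMS * BYTES_PER_PARAM[quant]
    kv_cache = 2 * LAYERS * KV_HEADS * HEAD_DIM * kv_bytes * context_tokens
    return (weights + kv_cache) / 2**30

for quant in BYTES_PER_PARAM:
    print(f"{quant}: {vram_gib(quant, 131_072):.2f} GiB at 128K context")
```

Even at the full 131,072-token context, the estimate stays under 2 GiB, consistent with the model's lightweight positioning.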
