
ERNIE-4.5-21B-A3B-Base

Total Parameters

21B

Context Length

131,072 tokens

Modality

Text

Architecture

Mixture of Experts (MoE)

License

Apache 2.0

Release Date

30 Jun 2025

Knowledge Cutoff

-

Technical Specifications

Active Parameters

3.0B

Number of Experts

64

Active Experts

6

Attention Structure

Grouped-Query Attention (see the sketch after this table)

Hidden Dimension Size

-

Number of Layers

28

Attention Heads

20

Key-Value Heads

4

Activation Function

-

Normalization

-

Position Embedding

Rotary Position Embedding (RoPE)
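The attention figures above describe grouped-query attention in which the 4 key-value heads are each shared by 5 of the 20 query heads, shrinking the KV cache roughly fivefold versus standard multi-head attention. A minimal sketch follows; the head dimension is an illustrative assumption, since the hidden size is not listed here.

```python
# Grouped-Query Attention sketch: 20 query heads, 4 KV heads (from the
# table above); head_dim = 128 is an illustrative assumption.
import torch
import torch.nn.functional as F

n_q_heads, n_kv_heads, head_dim = 20, 4, 128
group = n_q_heads // n_kv_heads          # 5 query heads per KV head

batch, seq = 1, 16
q = torch.randn(batch, n_q_heads, seq, head_dim)
k = torch.randn(batch, n_kv_heads, seq, head_dim)   # only 4 KV heads are cached
v = torch.randn(batch, n_kv_heads, seq, head_dim)

# Broadcast each KV head to the 5 query heads in its group.
k = k.repeat_interleave(group, dim=1)    # (batch, 20, seq, head_dim)
v = v.repeat_interleave(group, dim=1)

scores = q @ k.transpose(-2, -1) / head_dim ** 0.5
out = F.softmax(scores, dim=-1) @ v      # (batch, 20, seq, head_dim)
print(out.shape)
```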

System Requirements

VRAM requirements for different quantization methods and context sizes

ERNIE-4.5-21B-A3B-Base

ERNIE-4.5-21B-A3B-Base is a text-focused Mixture-of-Experts (MoE) base model in Baidu's ERNIE 4.5 family. The broader family is trained jointly on textual and visual modalities; for this variant, the text-related parameters are extracted, optimizing it for natural language tasks. Architecturally, it inherits the family's heterogeneous MoE design, which combines modality-isolated routing, a router orthogonal loss, and a multimodal token-balanced loss so that each modality is represented and learned effectively, even though this particular variant targets text completion only.
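This page does not spell out the router orthogonal loss; a common formulation, given here only as a hedged sketch, penalizes cosine similarity between the gating vectors of different experts so that experts specialize rather than overlap.

```python
# Hedged sketch of a router orthogonality regularizer; the exact ERNIE 4.5
# formulation is not given here. The expert count (64) is from the spec;
# the hidden size (2560) is an illustrative assumption.
import torch
import torch.nn.functional as F

def router_orthogonal_loss(router_weight: torch.Tensor) -> torch.Tensor:
    """router_weight: (num_experts, hidden) matrix of gating vectors."""
    w = F.normalize(router_weight, dim=-1)          # unit-norm rows
    gram = w @ w.T                                  # pairwise cosine similarity
    eye = torch.eye(w.size(0), device=w.device)
    return ((gram - eye) ** 2).sum() / w.size(0)    # penalize off-diagonals

loss = router_orthogonal_loss(torch.randn(64, 2560))
print(loss)
```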

The model is engineered for computational efficiency and high performance, supporting a context length of up to 131,072 tokens. Its MoE architecture distributes the 21 billion total parameters across 64 experts, activating 6 experts per token, so only about 3 billion parameters are exercised at each step. Training efficiency comes from intra-node expert parallelism, memory-efficient pipeline scheduling, and FP8 mixed-precision training; the same emphasis carries over to inference, where multi-expert parallel collaboration and convolutional code quantization enable 4-bit/2-bit lossless quantization.
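To make the 64-expert/6-active arithmetic concrete, the sketch below routes each token to its top 6 experts; everything except the expert counts is an illustrative assumption.

```python
# Top-6-of-64 expert routing sketch. Expert counts come from the spec;
# the hidden size and token count are illustrative assumptions.
import torch
import torch.nn.functional as F

num_experts, top_k, hidden = 64, 6, 2560
tokens = torch.randn(8, hidden)                   # 8 tokens

router = torch.nn.Linear(hidden, num_experts, bias=False)
logits = router(tokens)                           # (8, 64) routing scores
weights, experts = logits.topk(top_k, dim=-1)     # pick 6 experts per token
weights = F.softmax(weights, dim=-1)              # renormalize over the chosen 6

# Each token's FFN output would be the weighted sum of its 6 experts'
# outputs, so only ~3B of the 21B parameters run for any single token.
print(experts[0], weights[0])
```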

The primary use case for ERNIE-4.5-21B-A3B-Base is general-purpose language understanding and generation. The model is optimized for Chinese and English text, making it well suited to applications that require robust text completion and comprehension. Its foundation on the PaddlePaddle deep learning framework supports high-performance inference and simplified deployment across a range of hardware platforms.
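A minimal generation sketch via Hugging Face transformers follows. The checkpoint id and the trust_remote_code flag are assumptions based on Baidu's published PyTorch-compatible weights; Paddle-native deployment through the PaddlePaddle stack is the officially emphasized path.

```python
# Hedged usage sketch; "baidu/ERNIE-4.5-21B-A3B-Base-PT" is an assumed
# repo id for the PyTorch-compatible weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-21B-A3B-Base-PT"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto", trust_remote_code=True
)

inputs = tokenizer("Large language models are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```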

About ERNIE 4.5

The Baidu ERNIE 4.5 family consists of ten large-scale multimodal models. They utilize a heterogeneous Mixture-of-Experts (MoE) architecture, which enables parameter sharing across modalities while also employing dedicated parameters for specific modalities, supporting efficient language and multimodal processing.



Evaluation Benchmarks

Rankings apply to local LLMs.

No evaluation benchmarks are available for ERNIE-4.5-21B-A3B-Base.

Rankings

Overall Rank

-

Coding Rank

-

GPU Requirements

Required VRAM depends on the chosen weight quantization and on the context size (the original page's calculator covered 1k to 128k tokens); a rough estimate can be computed as sketched below.
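This back-of-envelope estimator stands in for the interactive calculator: weights take roughly params × bytes-per-parameter, and the KV cache takes 2 × layers × KV heads × head dim × context × bytes. The head dimension and the omission of runtime overhead are simplifying assumptions.

```python
# Rough VRAM estimate: quantized weights plus a 16-bit KV cache.
# Layers (28) and KV heads (4) come from the spec; head_dim = 128 and
# zero runtime overhead are simplifying assumptions.
def estimate_vram_gib(total_params=21e9, weight_bits=4, layers=28,
                      kv_heads=4, head_dim=128, context=131_072, kv_bits=16):
    weights = total_params * weight_bits / 8                       # bytes
    kv_cache = 2 * layers * kv_heads * head_dim * context * kv_bits / 8
    return (weights + kv_cache) / 1024**3

for bits in (16, 8, 4):
    print(f"{bits}-bit weights, 131K context: "
          f"~{estimate_vram_gib(weight_bits=bits):.1f} GiB")
```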