
ERNIE-4.5-21B-A3B

Parameters: 21B (total)
Context Length: 131,072 tokens (128K)
Modality: Text
Architecture: Mixture of Experts (MoE)
License: Apache 2.0
Release Date: 30 Jun 2025
Knowledge Cutoff: -

Technical Specifications

Activated Parameters (per token): 3.0B
Number of Experts: 64
Active Experts (per token): 6
Attention Structure: Grouped-Query Attention
Hidden Dimension Size: -
Number of Layers: 28
Attention Heads: 20
Key-Value Heads: 4
Activation Function: -
Normalization: -
Position Embedding: Absolute Position Embedding
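The grouped-query attention layout above (20 query heads sharing 4 key-value heads across 28 layers) directly shrinks the KV cache at long contexts. Below is a minimal back-of-the-envelope sketch of that saving; the head dimension of 128 is an assumption, since the hidden size is not listed in the table.

```python
# Rough KV-cache size estimate for grouped-query attention (GQA).
# Known from the spec table: 28 layers, 20 query heads, 4 KV heads.
# ASSUMPTION: head_dim = 128 is a placeholder; the hidden size is not published above.

def kv_cache_bytes(seq_len, n_layers=28, n_kv_heads=4, head_dim=128, bytes_per_elem=2):
    # 2x for the K and V tensors; fp16/bf16 => 2 bytes per element.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

full_ctx = 131_072
gqa = kv_cache_bytes(full_ctx)                 # 4 KV heads (GQA, as specified)
mha = kv_cache_bytes(full_ctx, n_kv_heads=20)  # 20 KV heads (hypothetical full MHA)
print(f"GQA KV cache @131K ctx: {gqa / 2**30:.1f} GiB")
print(f"MHA KV cache @131K ctx: {mha / 2**30:.1f} GiB ({mha / gqa:.0f}x larger)")
```

Under these assumptions, GQA keeps the full-context KV cache around 7 GiB instead of roughly 35 GiB for an equivalent multi-head layout.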

System Requirements

VRAM requirements for different quantization methods and context sizes

ERNIE-4.5-21B-A3B

ERNIE 4.5 is a family of large-scale models developed by Baidu, designed to advance multimodal and language understanding. The ERNIE-4.5-21B-A3B variant is a Mixture-of-Experts (MoE) model with 21 billion total parameters and 3 billion activated parameters per token, specifically optimized for text understanding and generation tasks.
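As a quick sanity check on what the "A3B" suffix buys, here is a rule-of-thumb compute comparison. The "~2 FLOPs per active parameter per token" heuristic is a common decoder-only estimate, not a figure published for this model.

```python
# Rule-of-thumb compute comparison: activated vs. total parameters.
# The ~2 FLOPs/active-parameter/token heuristic is a generic decoder-only
# estimate, NOT an ERNIE-specific published number.

total_params  = 21e9   # 21B total parameters
active_params = 3e9    # 3B activated per token (the "A3B" in the name)

print(f"Active fraction per token: {active_params / total_params:.1%}")
print(f"~FLOPs/token (MoE, 3B active):  {2 * active_params:.2e}")
print(f"~FLOPs/token (dense 21B model): {2 * total_params:.2e}")
# The MoE forward pass costs roughly 1/7 of an equally sized dense model,
# while all 21B parameters must still fit in (or stream through) memory.
```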

The model incorporates a heterogeneous MoE structure that shares some parameters across modalities while also maintaining dedicated parameters for individual modalities, which is intended to enhance multimodal understanding without compromising performance on text tasks. The architecture employs a fine-grained MoE backbone that routes inputs from different modalities to distinct expert sets, mitigating cross-modal interference, while a subset of shared experts and all self-attention layers process every token to facilitate cross-modal knowledge integration. A modality-aware expert allocation strategy additionally sizes the visual experts for efficient visual information processing. The training infrastructure is built for scalability and efficiency, using heterogeneous hybrid parallelism, hierarchical load balancing, FP8 mixed-precision training, and fine-grained recomputation to achieve high pre-training throughput.
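A minimal sketch of the routing pattern described above: top-k gating over an expert set, plus always-on shared experts. All module names and sizes here are illustrative placeholders, not the actual ERNIE 4.5 implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative top-k MoE layer with shared experts, loosely mirroring the
# routing pattern described above. Sizes and names are placeholders,
# NOT the actual ERNIE 4.5 implementation.

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=1024, n_experts=64, top_k=6, n_shared=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        make_ffn = lambda: nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
        self.experts = nn.ModuleList(make_ffn() for _ in range(n_experts))
        # Shared experts run for every token (cross-modal knowledge integration).
        self.shared = nn.ModuleList(make_ffn() for _ in range(n_shared))

    def forward(self, x):  # x: (tokens, d_model)
        gates = F.softmax(self.router(x), dim=-1)
        weights, idx = gates.topk(self.top_k, dim=-1)  # each token picks top-k experts
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = sum(e(x) for e in self.shared)           # shared path: all tokens
        for t in range(x.size(0)):                     # routed path (naive loop for clarity)
            for k in range(self.top_k):
                out[t] += weights[t, k] * self.experts[idx[t, k]](x[t])
        return out

y = ToyMoELayer()(torch.randn(4, 512))  # 4 tokens through the layer
print(y.shape)                          # torch.Size([4, 512])
```

A production kernel would batch tokens per expert rather than loop, but the routing logic (top-k gating plus an unconditional shared path) is the point here.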

For inference, ERNIE 4.5 models incorporate multi-expert parallel collaboration and convolutional code quantization algorithms, supporting near-lossless 4-bit and 2-bit quantization. The model supports an extended context length of up to 131,072 tokens and is designed for efficient deployment across various hardware platforms. It integrates with tools such as ERNIEKit for fine-tuning methods including Supervised Fine-Tuning (SFT), Direct Preference Optimization (DPO), and Unified Preference Optimization (UPO). The models are released for the PaddlePaddle deep learning framework, with PyTorch-format weights also provided as the ERNIE-4.5-21B-A3B-PT variant.
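A hedged loading sketch for the PyTorch variant via Hugging Face Transformers. The repo id baidu/ERNIE-4.5-21B-A3B-PT and the trust_remote_code requirement are assumptions based on how such PT weights are typically published; verify against the official model card before use.

```python
# Sketch: loading the PyTorch-format variant with Hugging Face Transformers.
# ASSUMPTIONS: the repo id and trust_remote_code requirement follow the usual
# publishing pattern for these weights; check the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-21B-A3B-PT"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 21B params => ~42 GB of weights in bf16
    device_map="auto",           # shard across available GPUs
    trust_remote_code=True,
)

inputs = tokenizer("Briefly explain mixture-of-experts models.",
                   return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```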

About ERNIE 4.5

The Baidu ERNIE 4.5 family consists of ten large-scale multimodal models. They utilize a heterogeneous Mixture-of-Experts (MoE) architecture, which enables parameter sharing across modalities while also employing dedicated parameters for specific modalities, supporting efficient language and multimodal processing.



Evaluation Benchmarks

Rankings apply to local LLMs.

No evaluation benchmarks are available for ERNIE-4.5-21B-A3B.

Rankings

Overall Rank: -

Coding Rank: -

GPU Requirements

The page's full calculator estimates required VRAM and recommended GPUs for a chosen model-weight quantization method and context size (from 1K up to the full 128K tokens).
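In lieu of the interactive calculator, here is a minimal sketch of the usual estimation logic: weights plus KV cache plus runtime overhead. The 20% overhead factor and head_dim are assumptions, not the calculator's actual formula.

```python
# Back-of-the-envelope VRAM estimate: weights + KV cache + runtime overhead.
# The overhead factor and head_dim are assumptions, not the page calculator's
# actual formula; treat results as rough lower bounds.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5, "int2": 0.25}

def estimate_vram_gib(quant="int4", context=1024, total_params=21e9,
                      n_layers=28, n_kv_heads=4, head_dim=128, overhead=1.2):
    weights = total_params * BYTES_PER_PARAM[quant]
    # K and V caches in fp16 (2 bytes), per layer and per KV head.
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * context * 2
    return (weights + kv_cache) * overhead / 2**30

for quant in ("fp16", "int8", "int4"):
    for ctx in (1024, 65_536, 131_072):
        print(f"{quant:>5} @ {ctx:>7} tokens: ~{estimate_vram_gib(quant, ctx):.1f} GiB")
```

Note that a MoE model needs VRAM for all 21B parameters even though only 3B are activated per token; sparsity saves compute, not weight memory.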
