
ERNIE-4.5-300B-A47B

Total Parameters: 300B
Context Length: 131,072 tokens
Modality: Text
Architecture: Mixture of Experts (MoE)
License: Apache 2.0
Release Date: 30 Jun 2025
Training Data Cutoff: Mar 2025

Technical Specifications

Active Parameters: 47.0B
Number of Experts: 64
Active Experts per Token: 8
Attention Structure: Grouped-Query Attention
Hidden Dimension Size: -
Number of Layers: 54
Attention Heads: 64
Key-Value Heads: 8
Activation Function: -
Normalization: -
Position Embedding: Absolute Position Embedding


ERNIE-4.5-300B-A47B is a large-scale Mixture-of-Experts (MoE) foundation model developed by Baidu as a core component of the ERNIE 4.5 family. While the broader series encompasses multimodal capabilities, this variant is a text-only model optimized for advanced natural language understanding, complex reasoning, and high-performance text generation in both English and Chinese. It serves as a high-capacity solution for knowledge-intensive tasks, pairing the expansive knowledge base of a 300-billion-parameter system with the computational efficiency of sparse activation.

The technical architecture employs a novel heterogeneous MoE structure that facilitates parameter sharing while utilizing modality-isolated routing to prevent cross-modal interference during pre-training. It features 54 Transformer layers and 64 total experts, with 8 active experts per token, resulting in 47 billion active parameters during inference. The model utilizes Grouped Query Attention (GQA) with 64 query heads and 8 key-value heads to optimize memory bandwidth and throughput. Training was conducted using the PaddlePaddle deep learning framework, incorporating intra-node expert parallelism, memory-efficient pipeline scheduling, and FP8 mixed-precision training to achieve high hardware utilization.
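To make the sparse-activation mechanism concrete, below is a minimal top-k routing sketch in PyTorch. It mirrors the 64-expert/8-active configuration from the specs above, but every dimension, module name, and implementation detail is illustrative rather than ERNIE's actual code.

```python
# Minimal sketch of top-k expert routing, the mechanism behind sparse
# activation in MoE layers: a router scores all 64 experts for each token
# and only the top 8 are executed. Sizes and structure are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int = 1024, d_ff: int = 4096,
                 n_experts: int = 64, k: int = 8):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        scores = self.router(x)                     # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # pick top-8 experts per token
        weights = F.softmax(weights, dim=-1)        # normalize over the 8 picked
        out = torch.zeros_like(x)
        for slot in range(self.k):                  # dispatch tokens to experts
            for e in idx[:, slot].unique().tolist():
                mask = idx[:, slot] == e
                out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out
```

Because only 8 of the 64 expert MLPs execute per token, most of the 300B parameters sit idle on any given forward pass, which is how the model keeps its active footprint near 47B.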

Operational efficiency is enhanced through support for near-lossless 4-bit and 2-bit quantization, enabling deployment on a variety of hardware configurations including single-card and multi-GPU setups. The model maintains a substantial context window of 131,072 tokens, allowing for the processing of long-form documents and maintaining coherence across extended dialogues. For post-training, the model undergoes Supervised Fine-Tuning (SFT), Direct Preference Optimization (DPO), and Unified Preference Optimization (UPO) to align outputs with user instructions and ensure robust performance in production environments.
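As a rough illustration of how a 4-bit deployment path might look, the sketch below loads the model with weight-only 4-bit quantization through Hugging Face Transformers and bitsandbytes. The repo id and the choice of quantization backend are assumptions; Baidu also distributes official quantized weights for PaddlePaddle-based serving. Treat this as a generic recipe rather than the vendor's documented workflow.

```python
# Hypothetical deployment sketch: weight-only 4-bit loading via Hugging Face
# Transformers + bitsandbytes. The repo id is an assumption. Even at 4 bits,
# ~300B weights occupy on the order of 150 GB, so "auto" device mapping
# across several GPUs is still required.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "baidu/ERNIE-4.5-300B-A47B-PT"  # assumed Hugging Face repo id

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit weight quantization
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # dequantized matmuls run in bf16
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                      # shard layers across available GPUs
    trust_remote_code=True,
)

prompt = "Summarize the advantages of sparse Mixture-of-Experts models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```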

About ERNIE 4.5

The Baidu ERNIE 4.5 family consists of ten large-scale multimodal models. They utilize a heterogeneous Mixture-of-Experts (MoE) architecture, which enables parameter sharing across modalities while also employing dedicated parameters for specific modalities, supporting efficient language and multimodal processing.
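The following hypothetical sketch shows what modality-isolated routing can look like in code: text and vision tokens are dispatched to disjoint expert pools (each pool could be a top-k MoE layer like the one sketched earlier), so one modality's gradients never touch the other modality's dedicated experts. Class and argument names are invented for illustration; a real implementation would also include shared experts visible to all modalities.

```python
# Hypothetical sketch of modality-isolated routing in a heterogeneous MoE.
# Text and vision tokens go to disjoint expert pools; a shared pool (omitted
# here for brevity) would see every token. All names are illustrative.
import torch
import torch.nn as nn

class ModalityIsolatedMoE(nn.Module):
    def __init__(self, text_moe: nn.Module, vision_moe: nn.Module):
        super().__init__()
        self.text_moe = text_moe      # e.g. a TopKMoE over text-only experts
        self.vision_moe = vision_moe  # e.g. a TopKMoE over vision-only experts

    def forward(self, x: torch.Tensor, is_text: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model); is_text: (tokens,) bool mask, True = text token
        out = torch.empty_like(x)
        if is_text.any():
            out[is_text] = self.text_moe(x[is_text])
        if (~is_text).any():
            out[~is_text] = self.vision_moe(x[~is_text])
        return out
```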



Benchmarks

No evaluation benchmarks are available for ERNIE-4.5-300B-A47B.

