
Qwen3.5-35B-A3B

Total Parameters

35B

Context Length

262,144 tokens (262K)

Modality

Multimodal

Architecture

Mixture of Experts (MoE)

License

Apache 2.0

Release Date

24 Feb 2026

Training Data Cutoff

-

Technical Specifications

Activated Parameters

3.0B

Number of Experts

256

Active Experts

9

Attention Structure

Grouped-Query Attention

Hidden Dimension Size

2048

Number of Layers

40

Attention Heads

16

Key-Value Heads

2

Activation Function

SwiGLU

Normalization

RMS Normalization

Position Embedding

RoPE (Rotary Position Embedding)
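The grouped-query attention figures above (16 attention heads, 2 key-value heads, hidden size 2048, 40 layers) imply an 8x smaller KV cache than full multi-head attention. A minimal sketch of that arithmetic, assuming a head dimension of hidden_size / num_heads = 128 and fp16 cache entries:

```python
# KV-cache size estimate from the spec table above.
# fp16 cache (2 bytes/element) is an assumption, not stated in the spec.
hidden_size = 2048
num_layers = 40
num_attention_heads = 16
num_kv_heads = 2
head_dim = hidden_size // num_attention_heads  # 128
bytes_per_element = 2  # fp16

def kv_cache_bytes(context_tokens: int, kv_heads: int) -> int:
    # 2 tensors (K and V) per layer, each of shape [context, kv_heads, head_dim]
    return 2 * num_layers * context_tokens * kv_heads * head_dim * bytes_per_element

ctx = 262_144  # native context length
gqa = kv_cache_bytes(ctx, num_kv_heads)         # GQA: 2 KV heads
mha = kv_cache_bytes(ctx, num_attention_heads)  # hypothetical full MHA: 16 heads

print(f"GQA KV cache: {gqa / 2**30:.1f} GiB")   # 10.0 GiB
print(f"MHA KV cache: {mha / 2**30:.1f} GiB")   # 80.0 GiB, 8x larger
```

At the full 262K context this works out to 10 GiB of cache per sequence with GQA versus 80 GiB with standard multi-head attention, which is why the 2 KV heads matter for long-context serving.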

Qwen3.5-35B-A3B

Qwen3.5-35B-A3B is Alibaba Cloud's efficient multimodal foundation model, released in February 2026. With 35B total parameters and 3B activated per token through a Mixture-of-Experts architecture (256 experts), it delivers strong performance with minimal compute, scoring 85.3% on MMLU-Pro, 84.2% on GPQA Diamond, 69.2% on SWE-bench Verified, and 40.5% on Terminal-Bench 2.0. Qwen3.5-Flash is the hosted API version. The model features unified vision-language capabilities, a 262K native context window (extensible to 1M), and strong performance on multimodal reasoning, coding, and multilingual tasks.
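The sparse MoE design described above (256 experts, 9 active per token) rests on top-k routing: a router scores all experts for each token, keeps only the highest-scoring k, and mixes their outputs with softmax-normalized weights. A minimal NumPy sketch of the routing step, illustrative only and not the released Qwen code:

```python
import numpy as np

def topk_route(logits: np.ndarray, k: int):
    """Select the k highest-scoring experts per token; softmax their scores.

    logits: [tokens, num_experts] router scores.
    Returns (expert indices [tokens, k], mixing weights [tokens, k]).
    """
    idx = np.argpartition(logits, -k, axis=-1)[:, -k:]      # top-k expert ids
    picked = np.take_along_axis(logits, idx, axis=-1)       # their scores
    w = np.exp(picked - picked.max(axis=-1, keepdims=True)) # stable softmax
    return idx, w / w.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
tokens, num_experts, top_k = 4, 256, 9   # expert counts from the spec above
logits = rng.standard_normal((tokens, num_experts))
idx, weights = topk_route(logits, top_k)
# each token activates only 9 of 256 experts; weights sum to 1 per token
```

Because only 9 of 256 expert FFNs run per token, the per-token compute tracks the 3B activated parameters rather than the 35B total.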

About Qwen 3.5

Qwen 3.5 is Alibaba Cloud's latest-generation foundation model family, released in February 2026. It integrates breakthroughs in multimodal learning (a unified vision-language foundation), an efficient hybrid architecture (Gated Delta Networks with sparse Mixture-of-Experts), scalable reinforcement learning across million-agent environments, and global linguistic coverage spanning 201 languages. It is available under the Apache 2.0 license with open weights.


Other Qwen 3.5 Models

Evaluation Benchmarks

No evaluation benchmarks are available for Qwen3.5-35B-A3B.

Rankings

Overall Ranking

-

Coding Ranking

-

GPU Requirements

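A back-of-the-envelope estimate for the memory the 35B total parameters occupy under common quantization widths; this is a rough sketch covering weights only, and real deployments add KV cache, activations, and runtime overhead on top:

```python
def weight_vram_gib(total_params: float, bits_per_weight: float) -> float:
    """Approximate VRAM for model weights alone, in GiB."""
    return total_params * bits_per_weight / 8 / 2**30

params = 35e9  # total parameters from the spec above
for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: {weight_vram_gib(params, bits):.1f} GiB")
# fp16: 65.2 GiB, int8: 32.6 GiB, int4: 16.3 GiB
```

Since only 3B parameters are activated per token, compute requirements are modest, but all 35B parameters must still fit in memory, so quantization choice drives the GPU class needed.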