
Ministral 3 14B

Parameters: 14B
Context Length: 256K
Modality: Multimodal
Architecture: Dense
License: Apache 2.0
Release Date: 2 Dec 2025
Training Data Cutoff: -

Technical Specifications

Attention Structure: Multi-Head Attention
Hidden Dimension Size: -
Number of Layers: -
Attention Heads: -
Key-Value Heads: -
Activation Function: -
Normalization: -
Position Embedding: Absolute Position Embedding

System Requirements

VRAM requirements across quantization methods and context sizes

Ministral 3 14B

Ministral 3 14B is a dense model in the Ministral 3 family, designed to deliver advanced capabilities while respecting practical hardware constraints, which makes it suitable for a range of private AI deployments. It pairs a 13.5-billion-parameter language model with a 0.4-billion-parameter vision encoder, enabling multimodal understanding of both textual and visual inputs. The model is also multilingual, supporting over 40 languages, including major European and East Asian languages.
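As a quick orientation for local deployment, here is a minimal text-only loading sketch using Hugging Face transformers. The repository id is hypothetical, and the exact model and processor classes depend on how the vision encoder is packaged; treat this as a sketch under those assumptions rather than the official usage.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mistralai/Ministral-3-14B"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread layers across available GPUs/CPU
)

inputs = tokenizer(
    "Summarize the Apache 2.0 license in one line.",
    return_tensors="pt",
).to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0]))
```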

The architectural design of Ministral 3 14B focuses on efficient performance in edge and local computing environments. It is a dense model, in contrast to the Mixture-of-Experts (MoE) architectures found in larger models. The model supports a context window of 256,000 tokens, allowing it to process extensive inputs and sustain more comprehensive interactions. Its attention mechanism is Multi-Head Attention (MHA), a standard component of transformer models in which multiple heads attend to different parts of the input sequence in parallel, as sketched below.
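The MHA pattern can be illustrated in a few lines of PyTorch. This is a generic reference implementation, not the model's actual code; d_model and n_heads are placeholders, since the card above leaves the hidden size and head counts unpublished.

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Generic multi-head attention, as used in dense transformer decoders."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model, bias=False)
        self.out = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split the model dimension into independent heads that attend in parallel.
        q = q.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        # Causal attention over the sequence; each head sees its own subspace.
        y = nn.functional.scaled_dot_product_attention(q, k, v, is_causal=True)
        y = y.transpose(1, 2).contiguous().view(b, t, d)
        return self.out(y)
```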

In terms of functionality, Ministral 3 14B offers agentic features, including native function calling and structured JSON output, which enhance its utility in complex automation and conversational AI systems. The model is optimized for use cases such as private chat applications, local AI assistants, and fine-tuning for specialized tasks. Its design prioritizes performance at a smaller scale, keeping it deployable across varied hardware configurations, including resource-limited ones.
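Function calling is typically exercised by serving the model behind an OpenAI-compatible endpoint (for example, with vLLM) and passing a tool schema with each request. In the sketch below, the base URL, served model name, and the get_weather tool are all hypothetical illustrations, not part of the model card.

```python
from openai import OpenAI

# Hypothetical local endpoint; point this at wherever the model is served
# (e.g., a vLLM server exposing the OpenAI-compatible chat API).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# Hypothetical tool schema illustrating native function calling.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="ministral-3-14b",  # hypothetical served model name
    messages=[{"role": "user", "content": "What's the weather in Lyon?"}],
    tools=tools,
)

# If the model chose to call the tool, the arguments come back as structured JSON.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, call.function.arguments)
```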

About Ministral 3

Ministral 3 is a family of efficient edge models with vision capabilities, available in 3B, 8B, and 14B parameter sizes. The family is designed for edge deployment with multimodal and multilingual support, offering best-in-class performance in resource-constrained environments.


Other Ministral 3 Models

Evaluation Benchmarks

Rankings apply to local LLMs.

No evaluation benchmarks are available for Ministral 3 14B.

Rankings

Overall Rank: -
Coding Rank: -

GPU Requirements

The full calculator estimates the required VRAM and recommends GPUs based on the chosen weight quantization method and context size (from 1k up to 250k tokens).
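For a back-of-the-envelope view of what such a calculator computes, the sketch below adds quantized weight memory to KV-cache memory. The layer count, KV-head count, and head dimension are unpublished in the card above, so the values used are illustrative placeholders, not Ministral 3 14B's real configuration.

```python
def estimate_vram_gib(params_b: float, bits_per_weight: float,
                      n_layers: int, n_kv_heads: int, head_dim: int,
                      context_len: int, kv_bytes: int = 2) -> float:
    """Rough VRAM estimate: quantized weights plus KV cache (fp16 by default)."""
    weights = params_b * 1e9 * bits_per_weight / 8  # bytes for the weights
    # The KV cache stores one key and one value vector per layer per token.
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * context_len * kv_bytes
    return (weights + kv_cache) / 1024**3

# Illustrative placeholders: the card does not publish layer or head counts.
print(f"{estimate_vram_gib(13.9, 4, 48, 8, 128, 32_768):.1f} GiB")
```

Under these placeholder values, 4-bit weights account for roughly 6.5 GiB and a 32k-token fp16 KV cache for roughly 6 GiB more, which is the kind of trade-off the calculator's quantization and context-size controls expose.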