Parameters: 8B
Context Length: 256K
Modality: Multimodal
Architecture: Dense
License: Apache 2.0
Release Date: 2 Dec 2025
Training Data Cutoff: -
Attention Structure: Grouped Query Attention (GQA)
Hidden Dimension Size: 4096
Layers: 32
Attention Heads: 32
Key-Value Heads: 8
Activation Function: SwiGLU (SiLU)
Normalization: RMS Normalization
Position Embedding: Rotary Position Embedding (RoPE)
VRAM Requirements by Quantization Method and Context Size
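As a rough back-of-envelope sketch (not official vendor figures), the weight and KV-cache footprint can be estimated from the architecture numbers listed above. The quantization byte widths are the usual conventions, and activation memory and framework overhead are ignored, so treat the results as lower bounds.

```python
# Rough VRAM estimate for Ministral 3 8B, assuming the figures above:
# 8.4B language-model parameters, 32 layers, 32 heads / 8 KV heads,
# hidden size 4096 -> head_dim 128. Byte widths per weight are assumptions.

PARAMS = 8.4e9          # language-model parameters
LAYERS = 32
KV_HEADS = 8
HEAD_DIM = 4096 // 32   # hidden size / attention heads = 128

BYTES_PER_WEIGHT = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}

def vram_gib(quant: str, context_tokens: int, kv_bytes: float = 2.0) -> float:
    """Weights + KV cache in GiB (ignores activations and runtime overhead)."""
    weights = PARAMS * BYTES_PER_WEIGHT[quant]
    # KV cache: 2 (K and V) * layers * kv_heads * head_dim * bytes, per token
    kv_cache = 2 * LAYERS * KV_HEADS * HEAD_DIM * kv_bytes * context_tokens
    return (weights + kv_cache) / 1024**3

for quant in BYTES_PER_WEIGHT:
    for ctx in (8_192, 32_768, 262_144):
        print(f"{quant:>10} @ {ctx:>7} tokens ~ {vram_gib(quant, ctx):6.1f} GiB")
```

At full 256K context the fp16 KV cache alone is roughly 32 GiB, which is why long-context use on edge hardware typically relies on quantized weights and a quantized or truncated KV cache.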
The Ministral 3 8B model is a member of the Ministral 3 family, developed by Mistral AI and engineered to provide advanced multimodal and multilingual capabilities for edge and resource-constrained environments. The model pairs 8.4 billion language-model parameters with a 0.4 billion parameter vision encoder, for a total of 8.8 billion parameters, making it a balanced and efficient option for localized AI deployments. It is designed for versatility, supporting applications ranging from real-time chat interfaces to sophisticated agentic workflows.
Architecturally, Ministral 3 8B is a dense transformer with 32 hidden layers and a hidden dimension of 4096. Its attention mechanism uses 32 attention heads with 8 key-value heads, i.e. Grouped Query Attention (GQA), for efficient inference. The model employs Rotary Position Embeddings (RoPE) to encode token positions, a SwiGLU (SiLU) activation function in its feed-forward blocks, and RMS Normalization for stable training and inference. The architecture is optimized for scenarios where computational resources are limited while still supporting a context length of 256,000 tokens.
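A minimal sketch of these hyperparameters expressed as a config object is shown below. Field names follow common Hugging Face-style conventions and are assumptions; they may not match the official Ministral 3 configuration keys.

```python
from dataclasses import dataclass

@dataclass
class Ministral3_8B_Config:
    # Values taken from the spec above; field names are assumed conventions.
    hidden_size: int = 4096
    num_hidden_layers: int = 32
    num_attention_heads: int = 32
    num_key_value_heads: int = 8      # < num_attention_heads -> Grouped Query Attention
    max_position_embeddings: int = 256_000
    hidden_act: str = "silu"          # SwiGLU feed-forward
    rms_norm_eps: float = 1e-5        # assumed; a typical RMSNorm epsilon

cfg = Ministral3_8B_Config()
head_dim = cfg.hidden_size // cfg.num_attention_heads            # 128
kv_groups = cfg.num_attention_heads // cfg.num_key_value_heads   # 4 query heads share each KV head
print(f"head_dim={head_dim}, query heads per KV head={kv_groups}")
```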
Ministral 3 8B has native multimodal understanding, enabling it to process and interpret both text and visual inputs. It offers robust multilingual support, with proficiency in numerous languages including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, and Korean. The model also integrates native function calling and supports JSON output, which simplifies integration into agentic systems and automated workflows. These characteristics make it suitable for applications such as image and document description, local AI assistants, and specialized problem-solving in embedded systems.
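As a hypothetical illustration of the function-calling support, the sketch below builds a tool-calling request in the OpenAI-compatible chat format that local servers such as vLLM or llama.cpp commonly expose. The endpoint URL and model id are placeholders, not official values.

```python
import json
import urllib.request

payload = {
    "model": "ministral-3-8b",  # placeholder model id
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool exposed by the caller
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",  # assumed local server address
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    # A tool-capable model replies with either text or a structured tool call.
    print(json.loads(resp.read())["choices"][0]["message"])
```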
Ministral 3 is a family of efficient edge models with vision capabilities, available in 3B, 8B, and 14B parameter sizes. The family is designed for edge deployment with multimodal and multilingual support, offering best-in-class performance for resource-constrained environments.
Rankings apply to local LLMs.
No evaluation benchmarks are available for Ministral 3 8B.