
Qwen2.5-1.5B

Parameters

1.5B

Context Length

128K

Modality

Text

Architecture

Dense

License

Apache 2.0

Release Date

19 Sept 2024

Knowledge Cutoff

-

Technical Specifications

Attention Structure

Grouped-Query Attention

Hidden Dimension Size

1536

Number of Layers

28

Attention Heads

12

Key-Value Heads

2

Activation Function

SwiGLU

Normalization

RMS Normalization

Position Embedding

RoPE

System Requirements

VRAM requirements for different quantization methods and context sizes

Qwen2.5-1.5B

Qwen2.5-1.5B is a foundational large language model developed by Alibaba Cloud, forming part of the Qwen2.5 series. This model, with 1.54 billion parameters, is engineered for efficient processing and generation of human-like text across a diverse range of applications. It has undergone extensive pre-training on a large-scale dataset, encompassing up to 18 trillion tokens, and has been fine-tuned for specialized tasks such as instruction following, coding, and mathematical problem-solving. Its design emphasizes the ability to handle long contexts and generate coherent, accurate responses, making it suitable for various textual processing needs.
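
As an illustration of basic usage, here is a minimal text-generation sketch assuming the Hugging Face transformers library and the Qwen/Qwen2.5-1.5B-Instruct checkpoint; exact APIs and defaults may vary across transformers versions.

```python
# Minimal text-generation sketch for Qwen2.5-1.5B-Instruct, assuming the
# Hugging Face `transformers` library is installed with a recent version.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-1.5B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a one-sentence summary of grouped-query attention."},
]
# Qwen2.5 ships a chat template, so apply_chat_template builds the prompt.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Slice off the prompt tokens and decode only the newly generated text.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```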

The architectural foundation of Qwen2.5-1.5B is a dense, decoder-only Transformer. Key components include Rotary Position Embeddings (RoPE) for encoding positional information, SwiGLU as the activation function, and RMSNorm for normalization, which together contribute to stable training and improved performance. The model uses Grouped Query Attention (GQA) with 12 query heads and 2 key-value heads, which substantially shrinks the key-value cache relative to full multi-head attention. The model comprises 28 layers, with a hidden dimension size of 1536.
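
To make the GQA configuration concrete, the arithmetic below estimates the per-token key-value cache implied by these hyperparameters, assuming a 16-bit cache; actual memory use depends on the runtime.

```python
# Back-of-the-envelope KV-cache arithmetic from the hyperparameters above.
# Assumes a 16-bit (2-byte) cache; real runtimes add their own overhead.
hidden_size = 1536
num_layers = 28
num_q_heads = 12
num_kv_heads = 2
bytes_per_value = 2                          # fp16/bf16
head_dim = hidden_size // num_q_heads        # 128

# Keys and values are cached for every layer and every KV head.
kv_bytes_per_token = 2 * num_layers * num_kv_heads * head_dim * bytes_per_value
print(kv_bytes_per_token)                    # 28672 bytes, about 28 KiB per token

# Full multi-head attention would cache all 12 heads: a 6x larger cache.
mha_bytes_per_token = kv_bytes_per_token * (num_q_heads // num_kv_heads)
print(mha_bytes_per_token)                   # 172032 bytes, about 168 KiB per token

# At a 32,768-token context the GQA cache stays under 1 GiB.
print(kv_bytes_per_token * 32768 / 2**30)    # about 0.875 GiB
```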

Qwen2.5-1.5B lists a maximum context length of 128,000 tokens; the standard released configuration supports a 32,768-token context window and generation of up to 8,192 tokens. Its capabilities extend to multilingual understanding and generation across more than 29 languages, and the model demonstrates proficiency in processing structured data formats such as tables and JSON. Practical use cases include conversational agents, virtual assistants, automated code generation tools, mathematical problem-solving platforms, and applications requiring robust content creation and summarization.
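
Since the released configuration caps input context at 32,768 tokens and generation at 8,192 tokens, one practical pattern is to count prompt tokens before generating. A small sketch, again assuming the transformers tokenizer; the prompt string here is a hypothetical placeholder.

```python
# Count prompt tokens before generation to stay inside the 32,768-token window.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-1.5B-Instruct")

MAX_CONTEXT = 32768   # input window of the standard configuration
MAX_NEW = 8192        # documented generation cap

prompt = "Summarize the following table as JSON: ..."   # placeholder prompt
n_tokens = len(tokenizer(prompt)["input_ids"])

# Leave room for the generated tokens inside the context window.
fits = n_tokens + MAX_NEW <= MAX_CONTEXT
print(f"{n_tokens} prompt tokens; fits with generation headroom: {fits}")
```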

About Qwen2.5

Qwen2.5 by Alibaba is a family of decoder-only language models available in various sizes; most are dense, while some variants use Mixture-of-Experts. These models are pretrained on large-scale datasets, support extended context lengths, and handle multilingual communication. The family includes specialized models for coding, mathematics, and multimodal tasks such as vision and audio processing.



Evaluation Benchmarks

Rankings apply to local LLMs.

Overall rank: #44
Coding rank: #43
Benchmark scores: 0.32 (rank 17), 0.32 (rank 20)

GPU Requirements

Required VRAM depends on the quantization method chosen for the model weights and on the configured context size, which ranges from 1K up to 125K tokens.
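
As a rough substitute for the interactive calculator, VRAM for inference can be approximated as weight bytes plus KV-cache bytes plus runtime overhead. The sketch below reuses the architecture numbers from this page and an assumed 20% overhead factor, so treat the outputs as estimates only.

```python
# Rough VRAM estimate for Qwen2.5-1.5B inference: weights + KV cache + overhead.
# This is a simple approximation, not the calculator referenced above.
PARAMS = 1.54e9
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def estimate_vram_gib(quant: str, context_tokens: int, overhead: float = 1.2) -> float:
    weight_bytes = PARAMS * BYTES_PER_PARAM[quant]
    # Per-token KV cache from the architecture section (fp16 cache assumed):
    # 2 (K and V) * 28 layers * 2 KV heads * 128 head dim * 2 bytes = 28,672 B.
    kv_bytes = 28672 * context_tokens
    return (weight_bytes + kv_bytes) * overhead / 2**30

for quant in ("fp16", "int8", "int4"):
    print(quant, round(estimate_vram_gib(quant, 32768), 2), "GiB at 32K context")
```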