
Gemma 3 270M

Parameters: 270M
Context Length: 32K
Modality: Text
Architecture: Dense
License: Apache 2.0
Release Date: 14 Aug 2025
Training Data Cutoff: Aug 2024

Technical Specifications

Attention Structure: Multi-Head Attention
Hidden Dimension Size: 1024
Number of Layers: 12
Attention Heads: 16
Key-Value Heads: 16
Activation Function: GELU
Normalization: RMS Normalization
Position Embedding: Rotary Position Embedding (RoPE)

Gemma 3 270M

Gemma 3 270M is a compact, open-weights language model developed by Google, specifically engineered for hyper-efficient deployment on edge devices and resource-constrained environments. As the smallest member of the Gemma 3 family, it prioritizes task-specific specialization over general-purpose breadth. The model is uniquely structured with a high ratio of embedding parameters relative to its transformer blocks, facilitating a large 256k-token vocabulary that enables precise handling of rare tokens, multilingual text, and domain-specific terminology across 140+ languages.

Technically, the model utilizes a dense transformer-based architecture with 12 transformer layers and a hidden dimension size of 1024. It incorporates modern architectural improvements such as Rotary Positional Embeddings (RoPE) and RMSNorm to stabilize training and inference at scale. Unlike its larger multimodal siblings in the Gemma 3 series, the 270M variant is a text-only model optimized for low-latency execution. It features an interleaved attention structure that combines local sliding window attention with global self-attention to manage memory overhead effectively while supporting a context window of 32,768 tokens.
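The interleaved attention scheme described above can be illustrated with a small NumPy sketch. This is illustrative only: the window size and sequence length here are arbitrary example values, not the model's actual configuration, and real implementations build these masks inside fused attention kernels rather than as dense boolean matrices.

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Causal mask restricted to the last `window` tokens.

    Entry [i, j] is True when query position i may attend to key
    position j. Local layers use this to cap memory overhead.
    """
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

def global_causal_mask(seq_len: int) -> np.ndarray:
    """Standard causal mask: every token attends to all earlier tokens.

    Global layers use this to propagate long-range information.
    """
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return j <= i

# Example: with a window of 3, token 5 sees positions 3-5 on a local
# layer but positions 0-5 on a global layer.
local = sliding_window_mask(8, window=3)
glob = global_causal_mask(8)
print(local[5])  # attends only to the 3 most recent positions
print(glob[5])   # attends to all positions up to and including 5
```

Interleaving the two mask types keeps the KV cache for most layers bounded by the window size, while the occasional global layer preserves access to the full 32,768-token context.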

Designed primarily for fine-tuning, Gemma 3 270M serves as a foundation for specialized applications such as text classification, entity extraction, and intent routing. Its small memory footprint allows it to run entirely on-device, including mobile phones and IoT hardware, with minimal energy consumption. By training on a massive 6-trillion-token corpus, the model achieves high knowledge density and strong instruction-following capabilities for its size, making it a professional-grade choice for developers seeking to deploy private, local AI solutions without relying on cloud infrastructure.
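As a back-of-envelope check on that small footprint, the weight-only memory of a 270M-parameter model can be estimated from the parameter count and numeric precision. This sketch ignores KV cache and activation memory, so real usage will be somewhat higher:

```python
def approx_weight_memory_mb(n_params: float, bits_per_param: int) -> float:
    """Rough weight-only memory estimate in megabytes.

    Ignores KV cache, activations, and runtime overhead.
    """
    return n_params * bits_per_param / 8 / 1e6

N = 270e6  # parameter count from the spec above
for name, bits in [("bf16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: ~{approx_weight_memory_mb(N, bits):.0f} MB")
# bf16: ~540 MB, int8: ~270 MB, int4: ~135 MB
```

At 4-bit quantization the weights fit in well under 200 MB, which is why on-device deployment on phones and IoT hardware is practical.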

About Gemma 3

Gemma 3 is a family of open, lightweight models from Google. It introduces multimodal image and text processing, supports over 140 languages, and features extended context windows up to 128K tokens. Models are available in multiple parameter sizes for diverse applications.


Evaluation Benchmarks

No evaluation benchmarks are available for Gemma 3 270M.

