| Field | Value |
|---|---|
| Parameters | - |
| Context Length | 32K |
| Modality | Text |
| Architecture | Dense |
| License | Apache 2.0 |
| Release Date | 15 Jan 2025 |
| Knowledge Cutoff | - |
| Attention Structure | Multi-Head Attention |
| Hidden Dimension Size | - |
| Number of Layers | - |
| Attention Heads | - |
| Key-Value Heads | - |
| Activation Function | - |
| Normalization | - |
| Position Embedding | Absolute Position Embedding |
Codestral 25.01 is Mistral AI's specialized coding model, built with a deep understanding of software development. It offers enhanced capabilities for code generation, completion, debugging, and refactoring across multiple programming languages. Trained on diverse codebases with a focus on modern development practices, design patterns, and code quality, it excels at understanding developer intent and generating idiomatic, well-structured code. The January 2025 release brings improved accuracy and expanded language support.
Codestral is a Mistral AI model designed for code generation and comprehension. It supports over 80 programming languages. The model family includes a 22-billion-parameter variant.
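As a rough guide to hosting the 22-billion-parameter variant mentioned above, weight-only memory scales with parameter count and bytes per weight. A minimal sketch (the 22B figure comes from this page; the quantization bit-widths are standard values, not stated here, and KV cache and runtime overhead are ignored):

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight-only memory footprint in GB.

    Ignores KV cache, activations, and framework overhead, so real
    serving requirements will be higher.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# The 22B variant at common quantization levels
for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"{name}: ~{weight_memory_gb(22, bits):.0f} GB")
# → fp16: ~44 GB, int8: ~22 GB, int4: ~11 GB
```

Halving the bit-width halves the weight footprint, which is why int8 and int4 quantization are the usual route to fitting a model of this size on a single accelerator.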
| Benchmark | Score | Rank |
|---|---|---|
| Coding (Aider) | 0.11 | #31 |

Overall Rank: #123
Coding Rank: #111