What is Devstral2? A Complete Guide to Mistral AI's 123B Coding Model

Dec 10, 2025

Devstral2 is Mistral AI's most advanced open-source coding model, designed specifically for software engineering tasks. With 123 billion parameters and a 256K token context window, Devstral2 represents the frontier of AI-assisted software development.

What Makes Devstral2 Special?

Devstral2 is not just another large language model. It's an agentic coding model specifically designed to excel at:

  • Codebase Exploration: Devstral2 can navigate and understand complex codebases with remarkable accuracy
  • Multi-File Editing: Devstral2 orchestrates changes across multiple files while maintaining architectural context
  • Tool Integration: Devstral2 supports function calling and works seamlessly with development tools
  • Software Engineering Agents: Devstral2 powers autonomous coding agents that can complete complex tasks
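Tool integration works by describing each tool to the model as a JSON schema. As a hedged illustration (the `run_shell` tool below is a hypothetical example, not an official Devstral2 API), a shell-execution tool in the OpenAI-style function-calling format might look like this:

```python
import json

# Hypothetical tool definition in the OpenAI-style function-calling
# format consumed by servers run with --tool-call-parser mistral.
run_shell_tool = {
    "type": "function",
    "function": {
        "name": "run_shell",
        "description": "Execute a shell command in the project workspace.",
        "parameters": {
            "type": "object",
            "properties": {
                "command": {
                    "type": "string",
                    "description": "The command to execute, e.g. 'pytest -q'.",
                },
            },
            "required": ["command"],
        },
    },
}

# Tool definitions are passed alongside the chat messages in each request.
payload = json.dumps({"tools": [run_shell_tool]}, indent=2)
print(payload)
```

The model then responds with a structured tool call (the tool name plus arguments matching the schema) instead of free-form text, which the agent harness executes and feeds back.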

Devstral2 Technical Specifications

Specification            Value
----------------------   ----------------
Parameters               123 billion
Context window           256K tokens
Weight format            FP8 (quantized)
License                  Modified MIT
SWE-bench Verified       72.2%
SWE-bench Multilingual   61.3%
Terminal Bench 2         32.6%

Devstral2 Performance Benchmarks

Devstral2 achieves impressive results across industry-standard benchmarks:

SWE-bench Verified: 72.2%

This benchmark tests real-world software engineering capabilities. Devstral2's 72.2% score demonstrates its ability to solve actual GitHub issues and implement features in production codebases.

SWE-bench Multilingual: 61.3%

Devstral2 performs well across a wide range of programming languages, not just Python-heavy benchmarks. This makes Devstral2 suitable for polyglot codebases and international development teams.

Terminal Bench 2: 32.6%

This benchmark tests command-line and terminal operations. Devstral2 can effectively work with shell commands and system operations.

How to Get Started with Devstral2

Option 1: Mistral Vibe CLI

The easiest way to start using Devstral2 is through the Mistral Vibe CLI:

# Install via pip
pip install mistral-vibe

# Or install via curl
curl -LsSf https://mistral.ai/vibe/install.sh | sh

# Launch Devstral2
vibe

Option 2: vLLM

For production deployments, serve Devstral2 with vLLM:

# Pull the Docker image
docker pull mistralllm/vllm_devstral:latest

# Launch the server
vllm serve mistralai/Devstral-2-123B-Instruct-2512 \
  --tool-call-parser mistral \
  --enable-auto-tool-choice \
  --tensor-parallel-size 8
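Once the server is up, it exposes an OpenAI-compatible API. A minimal sketch of a chat-completions request, assuming vLLM's default host and port (adjust for your deployment):

```python
import json
import urllib.request

# Build an OpenAI-compatible chat-completions request for the local
# vLLM server (default port 8000).
payload = {
    "model": "mistralai/Devstral-2-123B-Instruct-2512",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    "temperature": 0.2,
}

request = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(request)  # uncomment with a running server
```

Because the API is OpenAI-compatible, existing OpenAI client libraries also work by pointing their base URL at the vLLM server.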

Option 3: Transformers

For integration with Hugging Face Transformers:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Devstral-2-123B-Instruct-2512"

# Load the tokenizer and model; device_map="auto" shards the 123B
# weights across all available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

Devstral2 Use Cases

1. AI Code Assistants

Build intelligent code assistants powered by Devstral2. With its deep understanding of code structure and context, Devstral2 provides accurate suggestions and completions.

2. Software Engineering Agents

Create autonomous coding agents with Devstral2. These agents can handle complex tasks like:

  • Bug fixing and debugging
  • Feature implementation
  • Code review and refactoring
  • Test generation

3. Legacy System Modernization

Use Devstral2 to modernize legacy codebases. Devstral2 can:

  • Analyze old code and understand its functionality
  • Suggest modern alternatives and patterns
  • Help migrate to new frameworks

4. Automated Bug Fixing

Leverage Devstral2 for automated bug detection and fixing. Devstral2 can:

  • Identify issues in code
  • Understand root causes
  • Implement corrections while maintaining code quality

Devstral2 Pricing

Devstral2 offers competitive API pricing:

Type            Price
--------------  ------------------
Input tokens    $0.40 per million
Output tokens   $2.00 per million

This pricing makes Devstral2 up to 7x more cost-effective than comparable models for real-world coding tasks.
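At these rates, per-request cost is simple arithmetic. A quick sketch (the token counts are illustrative, not measurements):

```python
# API rates from the pricing table above, in dollars per million tokens.
INPUT_RATE = 0.40
OUTPUT_RATE = 2.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in dollars for one request at Devstral2's listed rates."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000

# Example: a 50K-token codebase context with a 2K-token patch response.
cost = request_cost(50_000, 2_000)
print(f"${cost:.4f}")  # → $0.0240
```

Note that with a 256K context window, input tokens dominate the bill for large-codebase tasks, which is why the low input rate matters most in practice.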

System Requirements for Devstral2

Full Deployment (123B)

  • Minimum: 4 H100-class GPUs
  • Recommended: 8 GPUs with tensor parallelism

Devstral Small 2 (24B)

For smaller deployments, consider Devstral Small 2:

  • Can run on single-GPU systems
  • Supports CPU-only configurations
  • Achieves 68.0% on SWE-bench Verified

Supported Frameworks and Tools

Devstral2 is supported by:

  • vLLM (recommended for production)
  • Transformers
  • Cline
  • Claude Code
  • OpenHands
  • SWE Agent
  • Kilo Code

Coming soon: llama.cpp, Ollama, and LM Studio

Devstral2 License

Devstral2 uses a modified MIT license, providing flexibility for both personal and commercial use. This open-source approach allows developers to:

  • Deploy Devstral2 on their own infrastructure
  • Customize and fine-tune the model
  • Use Devstral2 in commercial applications

Conclusion

Devstral2 represents a significant advancement in AI-assisted software development. With its 123B parameters, 256K context window, and exceptional benchmark performance, Devstral2 is the ideal choice for developers and enterprises seeking powerful AI coding assistance.

Whether you're building AI code assistants, creating autonomous coding agents, or modernizing legacy systems, Devstral2 provides the capabilities you need to transform your development workflow.

Start using Devstral2 today and experience the future of AI-assisted coding.

Devstral2 Team
