Conversational AI
Mistral Large 2
The new generation of our flagship model.
Overview
123B parameters, offering state-of-the-art performance.
128k token context window for long-context applications.
Strong multilingual support, excelling in over a dozen languages.
Advanced reasoning and mathematical capabilities.
Native function calling and JSON output for agentic capabilities.
Pricing
Pay-as-you-go
Category
Conversational AI
Company
Mistral AI
Key Features
01
Supports dozens of languages including English, French, German, Spanish, Italian, Chinese, Japanese, Korean, Portuguese, Dutch and Polish.
02
Trained on over 80 programming languages such as Python, Java, C, C++, JavaScript, and Bash, as well as more specialized languages like Swift and Fortran.
03
Best-in-class agentic capabilities with native function calling and structured JSON output.
04
State-of-the-art mathematical and reasoning capabilities, with high scores on benchmarks like GSM8K and MATH.
05
A large 128k context window, allowing for the processing of extensive documents and complex conversations.
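The function-calling and JSON-output features above can be sketched as a request payload. This is a minimal sketch only: the model alias, tool name, and schema below are illustrative assumptions, not confirmed API details; consult the official Mistral API reference for the exact format.

```python
import json

MODEL = "mistral-large-latest"  # assumed model alias on la Plateforme


def build_tool_call_request(user_message: str) -> dict:
    """Build a chat request body that declares one callable tool
    and asks for structured JSON output (hypothetical schema)."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_exchange_rate",  # hypothetical example tool
                    "description": "Look up the exchange rate between two currencies.",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "base": {"type": "string"},
                            "quote": {"type": "string"},
                        },
                        "required": ["base", "quote"],
                    },
                },
            }
        ],
        # Ask the model to emit structured JSON rather than free text.
        "response_format": {"type": "json_object"},
    }


payload = build_tool_call_request("What is the EUR/USD rate today?")
print(json.dumps(payload, indent=2))
```

The model decides at inference time whether to answer directly or emit a tool call matching the declared schema; your application then executes the tool and feeds the result back.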
Real-World Use Cases
Complex Code Generation
A developer needs to generate a complex algorithm in Python for a data analysis task.
Multilingual Customer Support
A global company wants to build a chatbot to handle customer support queries in multiple languages.
Financial Data Analysis
A financial analyst needs to analyze a large dataset of financial reports to identify trends and anomalies.
Quick Start
1
Step 1
Visit the Mistral AI website to access the model via la Plateforme.
2
Step 2
For self-deployment, download the model weights from Hugging Face.
3
Step 3
Follow the provided documentation to set up the model with mistral_inference or transformers.
4
Step 4
Start building applications by leveraging the model's capabilities through the API.
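The API step above can be sketched with the standard library alone. The endpoint URL and model alias here are assumptions based on common REST conventions; check the official documentation for current values. The request is only constructed, not sent.

```python
import json
import os
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint


def make_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated chat-completions request (not sent here)."""
    body = json.dumps({
        "model": "mistral-large-latest",  # assumed model alias
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = make_request(
    "Summarize the 128k-token context window feature.",
    os.environ.get("MISTRAL_API_KEY", "demo"),
)
print(req.full_url)
```

To actually send the request, pass `req` to `urllib.request.urlopen` and parse the JSON response body.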